Home Forums HDP on Windows – Installation HDP2 Windows Add User to Hadoop

This topic contains 8 replies, has 7 voices, and was last updated by  Bill Carroll 7 months, 4 weeks ago.

  • Creator
    Topic
  • #47462

    Matt Workman
    Participant

    Hi,

I am installing HDP 2.0 for Windows as my first attempt at a stand-alone Hadoop environment! I have successfully gone through the install, and I am now running the Smoke Test command file.

The smoke test is running under my username “mworkman” and is failing because Hadoop is expecting the user “hadoop” :) So, my question is: how do I add my username as a valid user in the Hadoop ecosystem?

From the log files I can see these entries:

    hadoop-namenode.log

    2014-01-27 09:32:00,261 WARN org.apache.hadoop.security.UserGroupInformation: No groups available for user mworkman
    2014-01-27 09:32:00,294 INFO org.apache.hadoop.hdfs.StateChange: BLOCK* allocateBlock: /user/mworkman/hadoop-139081511681791._COPYING_. BP-265433645-10.192.100.185-1390429513502 blk_1073741845_1021{blockUCState=UNDER_CONSTRUCTION, primaryNodeIndex=-1, replicas=[ReplicaUnderConstruction[10.192.100.185:50010|RBW]]}

2014-01-27 09:32:12,925 ERROR org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException as:mworkman (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Permission denied: user=mworkman, access=EXECUTE, inode="/mapred":hadoop:hdfs:drwxrwx---
2014-01-27 09:32:12,925 INFO org.apache.hadoop.ipc.Server: IPC Server handler 34 on 8020, call org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo from 10.192.100.185:57860 Call#5 Retry#0: error: org.apache.hadoop.security.AccessControlException: Permission denied: user=mworkman, access=EXECUTE, inode="/mapred":hadoop:hdfs:drwxrwx---
2014-01-27 09:32:13,323 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 8020: readAndProcess from client 10.192.100.185 threw exception [java.io.IOException: An existing connection was forcibly closed by the remote host]
    java.io.IOException: An existing connection was forcibly closed by the remote host
    at sun.nio.ch.SocketDispatcher.read0(Native Method)
    at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:43)
    at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
    at sun.nio.ch.IOUtil.read(IOUtil.java:197)
    at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:379)
    at org.apache.hadoop.ipc.Server.channelRead(Server.java:2602)
    at org.apache.hadoop.ipc.Server.access$3200(Server.java:122)
    at org.apache.hadoop.ipc.Server$Connection.readAndProcess(Server.java:1505)
    at org.apache.hadoop.ipc.Server$Listener.doRead(Server.java:792)
    at org.apache.hadoop.ipc.Server$Listener$Reader.doRunLoop(Server.java:591)
    at org.apache.hadoop.ipc.Server$Listener$Reader.run(Server.java:566)

If you could point me in the right direction for adding another user with access to my Hadoop system, I would appreciate it!!

    Thanks in advance!

    Matt


  • Author
    Replies
  • #51230

    Bill Carroll
    Participant

I think I hit a problem similar to this exception, which was answered in http://hortonworks.com/community/forums/topic/how-to-set-execute-permissions-on-windows/

2014-01-27 09:32:12,925 ERROR org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException as:mworkman (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Permission denied: user=mworkman, access=EXECUTE, inode="/mapred":hadoop:hdfs:drwxrwx---

I created an hdfs local group and added the hadoop user and my account (mworkman) to it. After this I could run a MapReduce job successfully.
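
For reference, the setup from an elevated command prompt was roughly the following (the group name hdfs matches the group owner in the error above, but exact membership may vary by install):

net localgroup hdfs /add
net localgroup hdfs hadoop /add
net localgroup hdfs mworkman /add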
    HTH
    Bill

    #51206

    Hkils2001 Hkils2001
    Participant

Thanks for the syntax:
hadoop fs -chmod -R 755 /mapred
If that one doesn’t work and you are just testing, try:
hadoop fs -chmod -R 777 /

That cleared up all my permission problems and fixed my issue of the last few days. It also confirms that this is an HDFS permissions problem.

    #48317

    Matt Workman
    Participant

Thank you all for your comments!! I was able to get most of the Smoke Test working by applying the suggestions in the order listed below:

Seth, I removed all the read-only attributes on the folders “C:\hdp” and “C:\Hadoop” where my installation resides. Is this what you were referring to?

erwee, I added myself to the “Hadoop Users” Windows group. This did not help me get past the current failure, but I think it is the correct thing to do.

Toby, your suggestion got the Smoke Test working. I still need to figure out what is wrong with my permissions, but at least I can get the Smoke Test working and move on from here. I don’t have a file named “hdfs.xml”, or at least I didn’t find it when searching for it, but I do have the file “hdfs-site.xml” with the XML node you listed. I have set that to false:

<property>
  <name>dfs.permissions.enabled</name>
  <value>false</value>
</property>

My final error is a Hive smoke test error, but I think that is a different issue from the one in this thread. The error is below for reference.

    Thanks for your help!!

    Matt

    Connecting to jdbc:hive2://LEWBSG10:10001/?hive.server2.servermode=http
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/C:/hdp/hadoop-2.2.0.2.0.6.0-0009/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/C:/hdp/hive-0.12.0.2.0.6.0-0009/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
    Connected to: Hive (version 0.12.0.2.0.6.0-0009)
    Driver: Hive (version 0.12.0.2.0.6.0-0009)
    Transaction isolation: TRANSACTION_REPEATABLE_READ
Error: Error while processing statement: Authorization failed: java.security.AccessControlException: action WRITE not permitted on path hdfs://LEWBSG10:8020/hive/warehouse for user hadoop. Use show grant to get more details. (state=,code=403)
    Beeline version 0.12.0.2.0.6.0-0009 by Apache Hive
    Closing: org.apache.hive.jdbc.HiveConnection
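
If it does turn out to be the same class of problem, I suspect the chmod approach from earlier in this thread would also apply to the warehouse path named in the error, something like (untested on my side):

hadoop fs -chmod -R 777 /hive/warehouse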


    #48143

    Toby Evans
    Participant

    Hi Matt,

I had this same problem, and it’s really frustrating. I’ve kicked the can down the road by switching off permission checking in hdfs.xml:

<property>
  <name>dfs.permissions.enabled</name>
  <value>false</value>
</property>

remember to restart the Hadoop services for the change to take effect
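
On my install the services can be bounced with the scripts that HDP for Windows drops into the Hadoop install directory; the script names may differ slightly by version:

stop_local_hdp_services.cmd
start_local_hdp_services.cmd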

    you probably want to fix this in a production environment :-)

    #47851

    Dean nicholson
    Participant

I have HDP2 set up as a VM, and when I log on as myself directly on the server I get the same error. I am in the Hadoop_users group. When I run the smoke tests through Remote Desktop, logged in as hadoop, I also get the error on several directories.

Also, is there any way to turn off the deprecation messages when you run the smoke tests?

    #47754

    Seth Lyubich
    Keymaster

    Hi Matt,

Can you please try setting permissions to 755 on /mapred in HDFS? You can execute the following command as user hadoop:

hadoop fs -chmod -R 755 /mapred
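
You can then verify the resulting permissions with:

hadoop fs -ls /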

    Please let me know if this resolves your issue.

    Thanks,
    Seth

    #47559

    erwee
    Participant

Hi Matt. When Hadoop installs, it creates a “Hadoop Users” Windows group (at least it did on my install). Have you tried going into Computer Management in Windows and adding “mworkman” to the “Hadoop Users” group?
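
From an elevated command prompt that would be something like:

net localgroup "Hadoop Users" mworkman /add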
