HDP on Windows – Installation Forum

HDP2 Windows Add User to Hadoop

  • #47462
    Matt Workman


    I am installing HDP 2.0 for Windows as my first attempt at a standalone Hadoop environment! I have successfully gone through the install, and I am now running the Smoke Test command file.

    The smoke test is running under my username “mworkman” and is failing because Hadoop is expecting the user “hadoop” :) So, my question is: how do I add my username as a valid user in the Hadoop ecosystem?

    From the log files I can see these errors:


    2014-01-27 09:32:00,261 WARN org.apache.hadoop.security.UserGroupInformation: No groups available for user mworkman
    2014-01-27 09:32:00,294 INFO org.apache.hadoop.hdfs.StateChange: BLOCK* allocateBlock: /user/mworkman/hadoop-139081511681791._COPYING_. BP-265433645- blk_1073741845_1021{blockUCState=UNDER_CONSTRUCTION, primaryNodeIndex=-1, replicas=[ReplicaUnderConstruction[|RBW]]}

    2014-01-27 09:32:12,925 ERROR org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException as:mworkman (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Permission denied: user=mworkman, access=EXECUTE, inode="/mapred":hadoop:hdfs:drwxrwx---
    2014-01-27 09:32:12,925 INFO org.apache.hadoop.ipc.Server: IPC Server handler 34 on 8020, call org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo from Call#5 Retry#0: error: org.apache.hadoop.security.AccessControlException: Permission denied: user=mworkman, access=EXECUTE, inode="/mapred":hadoop:hdfs:drwxrwx---
    2014-01-27 09:32:13,323 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 8020: readAndProcess from client threw exception [java.io.IOException: An existing connection was forcibly closed by the remote host]
    java.io.IOException: An existing connection was forcibly closed by the remote host
    at sun.nio.ch.SocketDispatcher.read0(Native Method)
    at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:43)
    at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
    at sun.nio.ch.IOUtil.read(IOUtil.java:197)
    at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:379)
    at org.apache.hadoop.ipc.Server.channelRead(Server.java:2602)
    at org.apache.hadoop.ipc.Server.access$3200(Server.java:122)
    at org.apache.hadoop.ipc.Server$Connection.readAndProcess(Server.java:1505)
    at org.apache.hadoop.ipc.Server$Listener.doRead(Server.java:792)
    at org.apache.hadoop.ipc.Server$Listener$Reader.doRunLoop(Server.java:591)
    at org.apache.hadoop.ipc.Server$Listener$Reader.run(Server.java:566)

    If you could point me in the right direction for adding another user with access to my Hadoop system, I would appreciate it!!

    Thanks in advance!



  • #47559
    erwee

    Hi Matt. When Hadoop installs, it creates a “Hadoop Users” Windows group (at least it did on my install). Have you tried going into Computer Management in Windows and adding “mworkman” to the “Hadoop Users” group?
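
    From an elevated Command Prompt, that would be something like the following (the exact group name may differ on your install, so check Computer Management first):

    net localgroup "Hadoop Users" mworkman /add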

    Seth Lyubich

    Hi Matt,

    Can you please try setting permissions to 755 on /mapred in HDFS? You can execute the following command as user hadoop:

    hadoop fs -chmod -R 755 /mapred

    Please let me know if this resolves your issue.
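
    If the log also complains about paths under /user/mworkman, you may additionally need to create a home directory for your user in HDFS. Assuming the standard layout, something like this (again run as user hadoop) should work:

    hadoop fs -mkdir -p /user/mworkman
    hadoop fs -chown mworkman /user/mworkman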


    Dean nicholson

    I have HDP2 set up as a VM, and when I log on as myself directly on the server I get the same error, even though I am in the Hadoop_users group. When I run the smoke tests through Remote Desktop, logging in as hadoop, I get the error on several directories as well.

    Also, is there any way to turn off the deprecation messages when you run the smoke tests?

    Toby Evans

    Hi Matt,

    I had this same problem, and it’s really frustrating. I’ve kicked the can down the road by switching off permission checking in hdfs.xml.
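
    The node I changed looks like the following (I believe the property name on HDP 2.0 is dfs.permissions.enabled, but double-check it against your own config):

    <property>
      <name>dfs.permissions.enabled</name>
      <value>false</value>
    </property>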


    Remember to restart the server for the changes to take effect.

    You probably want to fix this properly in a production environment :-)

    Matt Workman

    Thank you all for your comments!! I was able to get most of the Smoke Test working, and I applied the suggestions in the order listed below:

    Seth, I removed all the read-only attributes on the folders “C:\hdp” and “C:\Hadoop” where my installation resides. Is this what you were referring to?

    erwee, I added myself to the Hadoop Users Windows group. This did not help me get past the current failure, but I think it is the correct thing to do.

    Toby, your suggestion got the Smoke Test working. I still need to figure out what is wrong with my permissions, but at least the Smoke Test runs and I can move on from here. I don’t have a file named “hdfs.xml” (at least I didn’t find it when searching for it), but I do have the file “hdfs-site.xml” with the XML node you listed. I have set that to false.


    My final error is a HIVE smoke test error, but I think that is a different issue than we have here in this thread. The error is below for reference.

    Thanks for your help!!


    Connecting to jdbc:hive2://LEWBSG10:10001/?hive.server2.servermode=http
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/C:/hdp/hadoop-!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/C:/hdp/hive-!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
    Connected to: Hive (version
    Driver: Hive (version
    Transaction isolation: TRANSACTION_REPEATABLE_READ
    Error: Error while processing statement: Authorization failed:java.security.AccessControlException: action WRITE not permitted on path hdfs://LEWBSG10:8020/hive/warehouse for user hadoop. Use show grant to get more details. (state=,code=403)
    Beeline version by Apache Hive
    Closing: org.apache.hive.jdbc.HiveConnection
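
    My guess is that the same chmod approach from earlier in this thread would apply to the warehouse path, something like the command below, but I have not tried it yet:

    hadoop fs -chmod -R 777 /hive/warehouse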


    Hkils2001

    Thanks for the syntax:
    hadoop fs -chmod -R 755 /mapred
    If that one doesn’t work and you are just testing, try:
    hadoop fs -chmod -R 777 /

    That made all of the permission problems go away and fixed my issue. It also confirms that this is an HDFS permissions problem.
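
    You can also check what the current permissions are, before and after, with:

    hadoop fs -ls /

    which lists the owner, group, and mode of each top-level directory.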

    Bill Carroll

    I think I hit a problem similar to this exception, which was answered in http://hortonworks.com/community/forums/topic/how-to-set-execute-permissions-on-windows/

    2014-01-27 09:32:12,925 ERROR org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException as:mworkman (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Permission denied: user=mworkman, access=EXECUTE, inode="/mapred":hadoop:hdfs:drwxrwx---

    I created an hdfs local group and added the hadoop user and my account (mworkman in your case). After this I could run a MapReduce job successfully.
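
    For reference, creating the local group and adding the accounts can be done from an elevated Command Prompt; these are the names as they were on my machine, so adjust them to yours:

    net localgroup hdfs /add
    net localgroup hdfs hadoop /add
    net localgroup hdfs mworkman /add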

