
The legacy Hortonworks Forum is now closed. You can view a read-only version of the former site by clicking here. The site will be taken offline on January 31, 2016.

HDP on Windows – Installation Forum

Smoke Test Fails After Installing HDP 2.0.6 on Windows Successfully

  • #48184
    Samy Alihamad

    I have installed the latest version of Hadoop using the multi-node installation option. The setup includes a master node on a single Server 2012 VM and three slave nodes on separate Server 2012 VMs. My cluster properties configuration file includes the following options:


    #Data directory


    #Database host

    #Hive properties

    #Oozie properties
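    For reference, a generic HDP 2.x clusterproperties.txt has roughly the following shape. All host names, directories, and passwords below are placeholders written from memory of the HDP docs, not values from this cluster; verify the field names against the sample file shipped with your installer.

    ```
    #Log directory
    HDP_LOG_DIR=c:\hadoop\logs

    #Data directory
    HDP_DATA_DIR=c:\hdp\data

    #Hosts (placeholder names)
    NAMENODE_HOST=master
    SECONDARY_NAMENODE_HOST=master
    RESOURCEMANAGER_HOST=master
    HIVE_SERVER_HOST=master
    OOZIE_SERVER_HOST=master
    SLAVE_HOSTS=slave1,slave2,slave3

    #Database host
    DB_FLAVOR=derby
    DB_HOSTNAME=master

    #Hive properties
    HIVE_DB_NAME=hive
    HIVE_DB_USERNAME=hive
    HIVE_DB_PASSWORD=hive

    #Oozie properties
    OOZIE_DB_NAME=oozie
    OOZIE_DB_USERNAME=oozie
    OOZIE_DB_PASSWORD=oozie
    ```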

    After the Hadoop installation completes successfully, I run the smoke tests with the following command: C:/hdp/Run-SmokeTests.cmd

    The first thing I notice is that the Run-SmokeTests.cmd script is deprecated. Does anyone know the steps to update the smoke test script to run the latest jobs?
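    In the meantime, one way to exercise MapReduce by hand is to submit a job from the Hadoop examples jar directly. The jar path and version below are assumptions (the exact file name varies by build), so adjust them to whatever actually ships under C:\hdp:

    ```
    :: Submit a small example job instead of the deprecated wrapper script.
    :: The jar name/location is a guess -- locate the real one under C:\hdp.
    hadoop jar C:\hdp\hadoop\share\hadoop\mapreduce\hadoop-mapreduce-examples-2.2.0.jar pi 4 100
    ```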

    The errors I receive while running the smoke tests after installing Hadoop 2.0.6 for Windows are as follows:

    14/02/05 10:13:36 INFO mapreduce.Job: Running job: job_1391616346011_0003
    14/02/05 10:39:23 INFO mapreduce.Job: Job job_1391616346011_0003 running in uber mode : false
    14/02/05 10:39:23 INFO mapreduce.Job: map 0% reduce 0%
    14/02/05 10:39:23 INFO mapreduce.Job: Job job_1391616346011_0003 failed with state FAILED due to: Application application_1391616346011_0003 failed 2 times due to AM Container for appattempt_1391616346011_0003_000002 exited with exitCode: -100 due to: Container expired since it was unused.Failing this attempt.. Failing the appl
    14/02/05 10:39:23 INFO mapreduce.Job: Counters: 0
    Run-HadoopSmokeTest : Hadoop Smoke Test: FAILED
    At line:1 char:1
    + Run-HadoopSmokeTest
    + ~~~~~~~~~~~~~~~~~~~
    + CategoryInfo : NotSpecified: (:) [Write-Error], WriteErrorException
    + FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,Run-HadoopSmokeTest

  • #48245
    Seth Lyubich

    Hi Samy,

    Can you please let us know which user you are running the smoke test with? Can you please try running smoke test as user ‘hadoop’?


    Samy Alihamad

    Hi Seth,
    Thank you for your quick response. I took your suggestion and tried running the smoke test as the hadoop user with the following commands:
    runas /user:hadoop cmd
    hadoop fs -chmod -R 755 /mapred

    I referenced this Hortonworks forum post: Smoke-Test-Failing.

    I also tried to create a new smoke test user by following the Hortonworks documentation, Chapter 8, Appendix A: Create a Smoke Test User.

    The four VMs that I am running are on our company's domain, but the hadoop user created on each VM during installation is a local account. As a result, I tried to run the smoke test as a domain user, but I received the following error: 'Error copying the input file for the Hadoop smoke test'.

    To recap:
    When running as the hadoop user on my master node, the following tests fail:

    • Hadoop Smoke Test
    • Pig Test
    • Zookeeper

    When running as the hadoop user on my slave nodes, the following tests fail:

    • Hadoop Smoke Test
    • Pig Test
    • Oozie

    Thank you,
    Samy Alihamad

    Samy Alihamad

    I disabled the firewall on my master VM and the three slave nodes and executed the Smoke Test under the hadoop user account. As a result, I only received two errors this time around.

    The first error states:
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/C:/hdp/hadoop-
    SLF4J: Found binding in [jar:file:/C:/hdp/hive-!/org/slf4j/impl
    SLF4J: See for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
    Authorization action WRITE not permitted on path hdfs://VMRDHWWS0 for user hadoop. Use show grant to get more details.
    Run-HCatalogSmokeTest : HCatalog Smoke Test: FAILED
    At line:1 char:1
    + Run-HCatalogSmokeTest
    + ~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo : NotSpecified: (:) [Write-Error], WriteErrorException
    + FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,Run-HCatalogSmokeTest
    Hive smoke test – drop table, create table and describe table
    Running hive command: drop table if exists hivesmoke

    The second error states:
    SLF4J: See for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
    Authorization action WRITE not permitted on path hdfs://VMRDHWWS0 for user hadoop. Use show grant to get more details.
    Run-HiveSmokeTest : Hive Smoke Test: FAILED
    At line:1 char:1
    + Run-HiveSmokeTest
    + ~~~~~~~~~~~~~~~~~
    + CategoryInfo : NotSpecified: (:) [Write-Error], WriteErrorException
    + FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,Run-HiveSmokeTest

    HiveServer2 smoke test – drop table, create table and describe table

    I know that disabling the firewall is not recommended, but if I can get the WRITE permissions error fixed, I can go back and open whatever ports are necessary to get the same results.

    However, I would like to point out that with the firewalls disabled, if I run this smoke test as a local Administrator, it passes the above tests but still fails the Hadoop Smoke Test, Pig Smoke Test, and Oozie.

    Thank you,
    Samy Alihamad

    Samy Alihamad

    I should have made this clear in my last post: I tried to change the permissions on both the hcatsmoke and hivesmoke folders, but the only way to get past the permissions error is to give full permissions to everyone (777) on both folders. Doing so allows me to run the smoke test successfully.
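    For reference, the permission changes described above amount to commands like these. The warehouse path is an assumption (check where your Hive warehouse actually lives in HDFS):

    ```
    :: 755/775 were not enough for the smoke-test tables; only 777 worked.
    :: /hive/warehouse is a guessed warehouse location -- adjust as needed.
    hadoop fs -chmod -R 777 /hive/warehouse/hcatsmoke
    hadoop fs -chmod -R 777 /hive/warehouse/hivesmoke
    ```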

    However, taking it a step further, I have now installed the additional components that come with Hadoop, including ZooKeeper. Running the smoke test as the hadoop user with the modified permissions, I now get a Java error (details listed below), but only on the master node.

    ZooKeeper smoke test – create root znode and verify data consistency across the quorum
    Delete /zk_smoketest znode if exists
    ‘java’ is not recognized as an internal or external command,
    operable program or batch file.
    Run-ZooKeeperSmokeTest : ZooKeeper Smoke Test: FAILED failed to run command delete /zk_smoketest
    At line:1 char:1
    + Run-ZooKeeperSmokeTest
    + ~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo : NotSpecified: (:) [Write-Error], WriteErrorException
    + FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,Run-ZooKeeperSmokeTest
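    The "'java' is not recognized" message usually just means the JDK is not on the PATH of the shell running the smoke test. A minimal sketch of a fix, assuming a JDK location (substitute the path used during your HDP install):

    ```
    :: The JDK path below is an assumption -- point it at your actual install.
    set JAVA_HOME=C:\java\jdk1.6.0_31
    set PATH=%PATH%;%JAVA_HOME%\bin
    java -version
    ```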







    Hello Samy,

    I have the same problem, did you solve it?

    Samy Alihamad


    I was able to resolve my issues. However, note that I am installing Hadoop on Windows as a proof of concept, not in a production environment, and I would not recommend these changes for production.

    That being said I took the following steps to resolve my issues:

    1. Disable the Firewall on the master node and all slave nodes
    2. Be sure to run the smoke test as a hadoop user using the command: runas /user:hadoop cmd
    3. Run Smoke Test
    4. If you run into any issues with the hadoop user not having permission to read or write a particular folder, you have to change the permissions on that folder. Simply running hadoop fs -chmod -R 755 /path-of-file-here or hadoop fs -chmod -R 775 /path-of-file-here did not work for me; I had to grant permissions to all users with hadoop fs -chmod -R 777 /path-of-file-here. I really didn't want to give access to all users, but that was the only way I could move forward.
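    The steps above can be sketched as a command sequence (POC only, as noted; run it on every node, and treat the flagged path as an example, not a real location):

    ```
    :: Step 1: disable the Windows firewall (not recommended outside a POC).
    netsh advfirewall set allprofiles state off
    :: Step 2: open a shell as the local hadoop user.
    runas /user:hadoop cmd
    :: Step 3: run the smoke tests.
    C:/hdp/Run-SmokeTests.cmd
    :: Step 4: if a test reports a read/write permission error on a path,
    :: loosen it (777 was the only mode that worked for me).
    hadoop fs -chmod -R 777 /path-of-file-here
    ```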

    If you have other issues, let me know and I'll do my best to answer them for you.

    Samy Alihamad


    Thanks Samy for helping me, I followed your notes and all tests passed successfully.

    Seth Lyubich

    Hi all,

    Here are additional instructions for adding a user to the cluster. The reason the hadoop user does not hit this issue is that hadoop is a superuser in the cluster.
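    A less drastic alternative to chmod 777, along the lines of the "Create a Smoke Test User" appendix, is to give the test user its own HDFS home directory so its writes land somewhere it owns. The user name 'smoketestuser' below is a placeholder:

    ```
    :: Run these as the hadoop superuser; 'smoketestuser' is hypothetical.
    hadoop fs -mkdir /user/smoketestuser
    hadoop fs -chown -R smoketestuser /user/smoketestuser
    ```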


The forum ‘HDP on Windows – Installation’ is closed to new topics and replies.
