Home Forums HDP on Windows – Installation Smoke Test Fails After Installing HDP 2.0.6 on Windows Successfully

This topic contains 9 replies, has 4 voices, and was last updated by  Seth Lyubich 9 months, 1 week ago.

  • Creator
    Topic
  • #48184

    Samy Alihamad
    Participant

    I have installed the latest version of Hadoop using the multi-node installation option. The setup includes a master node on a single Server 2012 VM and three slave nodes on separate Server 2012 VMs. My cluster properties configuration file includes the following options:

    HDP_LOG_DIR=c:\hadoop\logs

    #Data directory
    HDP_DATA_DIR=c:\hdpdata

    #hosts
    NAMENODE_HOST=VMRDHWWS01.domain.com
    SECONDARY_NAMENODE_HOST=VMRDHWWS01.domain.com
    RESOURCEMANAGER_HOST=VMRDHWWS01.domain.com
    HIVE_SERVER_HOST=VMRDHWWS01.domain.com
    OOZIE_SERVER_HOST=VMRDHWWS01.domain.com
    WEBHCAT_HOST=VMRDHWWS01.domain.com
    SLAVE_HOSTS=VMRDHWWS03.domain.com,VMRDHWWS02.domain.com,VMRDHWWS04.domain.com
    HBASE_MASTER=VMRDHWWS01.domain.com
    HBASE_REGIONSERVERS=VMRDHWWS03.domain.com,VMRDHWWS02.domain.com,VMRDHWWS04.domain.com
    ZOOKEEPER_HOSTS=VMRDHWWS01.domain.com,VMRDHWWS02.domain.com,VMRDHWWS03.domain.com,VMRDHWWS04.domain.com
    FLUME_HOSTS=VMRDHWWS01.domain.com,VMRDHWWS02.domain.com,VMRDHWWS03.domain.com,VMRDHWWS04.domain.com

    #Database host
    DB_FLAVOR=derby
    DB_HOSTNAME=VMRDHWWS01.domain.com
    DB_PORT=1527

    #Hive properties
    HIVE_DB_NAME=hive
    HIVE_DB_USERNAME=hive
    HIVE_DB_PASSWORD=******

    #Oozie properties
    OOZIE_DB_NAME=oozie
    OOZIE_DB_USERNAME=oozie
    OOZIE_DB_PASSWORD=******

    After the Hadoop installation completes successfully, I try to run a smoke test with the following command: C:\hdp\Run-SmokeTests.cmd

    The first thing I notice is that the Run-SmokeTests.cmd script is deprecated. Does anyone know the steps to update the smoke test script to run the latest jobs?

    The errors I receive during the smoke test after installing Hadoop 2.0.6 for Windows are as follows:

    14/02/05 10:13:36 INFO mapreduce.Job: Running job: job_1391616346011_0003
    14/02/05 10:39:23 INFO mapreduce.Job: Job job_1391616346011_0003 running in uber mode : false
    14/02/05 10:39:23 INFO mapreduce.Job: map 0% reduce 0%
    14/02/05 10:39:23 INFO mapreduce.Job: Job job_1391616346011_0003 failed with state FAILED due to: Application application_1391616346011_0003 failed 2 times due to AM Container for appattempt_1391616346011_0003_000002 exited with exitCode: -100 due to: Container expired since it was unused.Failing this attempt.. Failing the application.
    14/02/05 10:39:23 INFO mapreduce.Job: Counters: 0
    Run-HadoopSmokeTest : Hadoop Smoke Test: FAILED
    At line:1 char:1
    + Run-HadoopSmokeTest
    + ~~~~~~~~~~~~~~~~~~~
    + CategoryInfo : NotSpecified: (:) [Write-Error], WriteErrorException
    + FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,Run-HadoopSmokeTest

Viewing 9 replies - 1 through 9 (of 9 total)


  • Author
    Replies
  • #48877

    Seth Lyubich
    Keymaster

    Hi all,

    Here are additional instructions for adding a user to the cluster: http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.0.6.0-Win/bk_installing_hdp_for_windows/content/ch_appendix-add-user-smoketestuser.html . The reason the hadoop user does not hit this issue is that hadoop is a superuser in the cluster.

    Thanks,
    Seth

    #48763

    Thanks, Samy, for helping me. I followed your notes and all tests passed successfully.

    #48684

    Samy Alihamad
    Participant

    RetailGreen,

    I was able to resolve my issues. However, note that I am installing Hadoop on Windows as a proof of concept, not in a production environment, and I would not recommend these changes for production.

    That said, I took the following steps to resolve my issues:

    1. Disable the firewall on the master node and all slave nodes
    2. Be sure to run the smoke test as the hadoop user, using the command: runas /user:hadoop cmd
    3. Run the smoke test
    4. If you run into issues about the hadoop user lacking permission to read or write a particular folder, you have to change that folder's permissions. However, simply running hadoop fs -chmod -R 755 /path-of-file-here or hadoop fs -chmod -R 775 /path-of-file-here did not work for me; I had to give permissions to all users with hadoop fs -chmod -R 777 /path-of-file-here. I really didn't want to give access to all users, but that was the only way I could move forward.
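    For anyone unsure what those octal modes actually grant, here is a minimal POSIX-shell sketch using a local file; HDFS permissions follow the same owner/group/other rwx model, so the mode strings read the same way:

    ```shell
    # Show what octal modes 755 vs 777 grant, using a local file as a stand-in.
    # HDFS uses the same owner/group/other rwx model as POSIX filesystems.
    tmpdir=$(mktemp -d)
    touch "$tmpdir/demo"

    chmod 755 "$tmpdir/demo"          # owner: rwx, group: r-x, other: r-x
    ls -l "$tmpdir/demo" | cut -c1-10 # prints -rwxr-xr-x

    chmod 777 "$tmpdir/demo"          # owner, group, AND other: rwx
    ls -l "$tmpdir/demo" | cut -c1-10 # prints -rwxrwxrwx

    rm -rf "$tmpdir"
    ```

    The difference is the write bit for group and other: 755/775 still deny write to "other", which is why only 777 unblocked a user who is neither the owner nor in the owning group. A narrower alternative to opening write access to everyone would be changing the folder's owner with hadoop fs -chown, though I did not verify that on this cluster.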

    If you have other issues, let me know and I'll do my best to answer them for you.

    Thanks,
    Samy Alihamad

    #48633

    Hello Samy,

    I have the same problem, did you solve it?

    #48502

    Zbw
    Participant

    cmd

    runas /user:hadoop cmd

    runTest….

    #48412

    Samy Alihamad
    Participant

    I should have made this clear in my last post: I tried to change the permissions on both the hcatsmoke and hivesmoke folders, but the only way to get past the permissions error was to give full permissions to all (777) on both folders. Doing so allows me to run the smoke test successfully.

    However, taking it a step further, I have now installed the additional components that come with Hadoop, including ZooKeeper. Running the smoke test as the hadoop user with the modified permissions, I now get a Java error (details below), but only when running it on the master node.

    ZooKeeper smoke test – create root znode and verify data consistency across the quorum
    Delete /zk_smoketest znode if exists
    ‘java’ is not recognized as an internal or external command,
    operable program or batch file.
    Run-ZooKeeperSmokeTest : ZooKeeper Smoke Test: FAILED failed to run command delete /zk_smoketest
    At line:1 char:1
    + Run-ZooKeeperSmokeTest
    + ~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo : NotSpecified: (:) [Write-Error], WriteErrorException
    + FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,Run-ZooKeeperSmokeTest

    9009
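    The trailing 9009 is cmd.exe's errorlevel for "command not found", which matches the 'java' is not recognized message: the hadoop user's session apparently has no JDK bin directory on its PATH. As a hedged illustration (the POSIX analogue of errorlevel 9009 is exit status 127), a quick diagnostic sketch:

    ```shell
    # A missing command yields a distinctive exit status: 9009 under cmd.exe,
    # 127 under POSIX shells. Checking up front turns "is not recognized"
    # into an actionable PATH diagnosis.
    require_cmd() {
      if command -v "$1" >/dev/null 2>&1; then
        echo "found: $1"
      else
        echo "missing from PATH: $1"
        return 127
      fi
    }

    require_cmd sh                                   # present on any POSIX system
    require_cmd no_such_cmd_xyz || echo "status: $?" # prints status: 127
    ```

    On Windows the equivalent check is `where java`; if it fails, setting JAVA_HOME and appending %JAVA_HOME%\bin to PATH for the hadoop user would be my guess at the fix, but verify the paths against your own install.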

    #48307

    Samy Alihamad
    Participant

    UPDATE
    I disabled the firewall on my master VM and the three slave nodes and executed the Smoke Test under the hadoop user account. As a result, I only received two errors this time around.

    The first error states:
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/C:/hdp/hadoop-2.2.0.2.0.6.0-0009/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/C:/hdp/hive-0.12.0.2.0.6.0-0009/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
    Authorization failed:java.security.AccessControlException: action WRITE not permitted on path hdfs://VMRDHWWS01.askcts.com:8020/hive/warehouse/hcatsmoke for user hadoop. Use show grant to get more details.
    Run-HCatalogSmokeTest : HCatalog Smoke Test: FAILED
    At line:1 char:1
    + Run-HCatalogSmokeTest
    + ~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo : NotSpecified: (:) [Write-Error], WriteErrorException
    + FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,Run-HCatalogSmokeTest
    403
    Hive smoke test – drop table, create table and describe table
    Running hive command: drop table if exists hivesmoke

    The second error states:
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
    Authorization failed:java.security.AccessControlException: action WRITE not permitted on path hdfs://VMRDHWWS01.askcts.com:8020/hive/warehouse/hivesmoke for user hadoop. Use show grant to get more details.
    Run-HiveSmokeTest : Hive Smoke Test: FAILED
    At line:1 char:1
    + Run-HiveSmokeTest
    + ~~~~~~~~~~~~~~~~~
    + CategoryInfo : NotSpecified: (:) [Write-Error], WriteErrorException
    + FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,Run-HiveSmokeTest

    403
    HiveServer2 smoke test – drop table, create table and describe table

    I know that disabling the firewall is not recommended, but if I can get the WRITE permissions error fixed, I can go back and open only the necessary ports to get the same results.

    However, I would like to point out that with the firewalls disabled, running this smoke test as a local Administrator passes the above tests but still fails the Hadoop, Pig, and Oozie smoke tests.

    Thank you,
    Samy Alihamad

    #48268

    Samy Alihamad
    Participant

    Hi Seth,
    Thank you for your quick response. I took your suggestion and I tried running the smoke test as the hadoop user by using the following commands:
    runas /user:hadoop cmd
    hadoop fs -chmod -R 755 /mapred

    I referenced the Hortonworks forum post "Smoke-Test-Failing".

    I also tried to create a new smoke test user by following the Hortonworks guide's "Chapter 8. Appendix A: Create a Smoke Test User".

    The four VMs that I am running are on our company's domain, but the hadoop user created on each VM during installation is a local account. As a result, I tried to run the smoke test as a domain user, but I received the following error: ‘Error copying the input file for the Hadoop smoke test’.

    To recap:
    When running as the hadoop user on my master node, the following tests fail:

    • Hadoop Smoke Test
    • Pig Test
    • Zookeeper

    When running as the hadoop user on my slave node, the following tests fail:

    • Hadoop Smoke Test
    • Pig Test
    • Oozie

    Thank you,
    Samy Alihamad

    #48245

    Seth Lyubich
    Keymaster

    Hi Samy,

    Can you please let us know which user you are running the smoke test as? Can you please try running the smoke test as the user ‘hadoop’?

    Thanks,
    Seth
