HDP on Windows – Installation: HDP 2.0 install issues on Windows 7, JDK 1.7.0_02

This topic contains 6 replies, has 4 voices, and was last updated by Swooshx Wu 4 months, 3 weeks ago.

  • Creator
    Topic
  • #48345

    David Goyal
    Participant

    Need help. I have tried installing and uninstalling HDP 2.0 more than 20 times during the last 5 days. It installs successfully each time, but the SmokeTests fail. I used administrative access for the installation.

    1. I am on Windows 7, 80 GB free disk, 8 GB RAM, JDK 1.7.0_02 and Python 2.7.
    2. hdp-2.0.6.0.winpkg.install.log shows a successful installation and does not report any errors.
    3. The firewall is off for everything.
    4. I am on a private network.
    5. Only two smoke tests passed: Sqoop and Oozie.
    6. Extracts of the SmokeTest results (a quick port check is sketched after this excerpt):
    14/02/07 10:57:01 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
    copyFromLocal: Call From MALT-7539/192.168.2.7 to MALT-7539:8020 failed on connection exception: java.net.ConnectException: Connection refused: no further information; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
    Run-HadoopSmokeTest : Error copying the input file for the Hadoop smoke test
    At line:1 char:20
    + Run-HadoopSmokeTest <<<<
    + CategoryInfo : NotSpecified: (:) [Write-Error], WriteErrorException
    + FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,Run-HadoopSmokeTest

    14/02/07 10:57:06 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
    14/02/07 10:57:06 INFO client.RMProxy: Connecting to ResourceManager at MALT-7539/192.168.2.7:8032
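
    For reference, a quick way to confirm whether anything is actually listening on the NameNode port (assuming 8020 is the NameNode RPC port configured in core-site.xml on this host) is to run the following in PowerShell on the same machine; this is only a minimal check, not an official diagnostic:

    # List running Java processes; a NameNode entry should appear if HDFS started (jps ships with the JDK)
    jps
    # Show listening sockets on port 8020; -a is needed to include listeners, -n alone lists only active connections
    netstat -ano | findstr ":8020"

    If no LISTENING line shows up for 8020, the NameNode service is probably not running, which would explain the "Connection refused" in the copyFromLocal step above.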

Viewing 6 replies - 1 through 6 (of 6 total)


  • Author
    Replies
  • #51500

    Swooshx Wu
    Participant

    Hi David,

    I am running into the same issue you described earlier: “FATAL org.apache.hadoop.hdfs.server.namenode.NameNode: Exception in namenode join
    java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z”

    Would you please guide me on how to resolve it? Thanks a lot.

    Regards,
    Swooshx

    #48867

    David Goyal
    Participant

    Dave,
    Thanks. I was able to install and run the smoke tests successfully on a single-node install. I believe that my suggestions on URI format, etc., will be equally applicable to Microsoft server platforms.

    Soon I will be embarking on a multi-node install and will post my feedback.

    Thanks

    #48862

    Dave
    Moderator

    Hi David,

    HDP 2.0 is not supported on Windows 7.
    There are no plans to support HDP on desktop operating systems.
    You would be better off sourcing a Windows Server 2008/2012 64-bit VM or OS to install on.

    Thanks

    Dave

    #48384

    David Goyal
    Participant

    I made some more progress but still need help.
    1. runSmokeTests hadoop is stuck in a never-ending loop trying to establish a connection to the ResourceManager on port 8032:

    14/02/08 19:09:44 INFO ipc.Client: Retrying connect to server: MALT-7539/192.168.2.7:8032. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1 SECONDS)
    14/02/08 19:09:46 INFO ipc.Client: Retrying connect to server: MALT-7539/192.168.2.7:8032. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1 SECONDS)

    2. I do not see this port using [netstat -n | find "8032"].
    3. The YARN ResourceManager log shows the following (a possible address check is sketched after this list):
    2014-02-08 18:55:18,610 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
    2014-02-08 18:55:18,622 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 8032: starting
    2014-02-08 18:55:18,771 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 8032
    2014-02-08 18:55:19,364 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 8033
    2014-02-08 18:55:19,367 INFO org.apache.hadoop.yarn.factories.impl.pb.RpcServerFactoryPBImpl: Adding protocol
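
    The log above shows the 8032 listener starting, so one possibility (only a guess, not confirmed in this thread) is that netstat was run without -a (netstat -n lists only active connections, while netstat -an also shows LISTENING sockets), or that the client and the ResourceManager disagree on the address to use. Below is a sketch of pinning the address explicitly in yarn-site.xml; the property names are standard YARN settings, but the host value is simply this machine's name used as an example:

    <property>
      <name>yarn.resourcemanager.hostname</name>
      <value>MALT-7539</value>
    </property>
    <property>
      <name>yarn.resourcemanager.address</name>
      <value>MALT-7539:8032</value>
    </property>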

    A few pointers for others (example snippets follow this list):
    1. Ensure that you are using 64-bit Java.
    2. The HDP 2.0 standard distribution has a few issues:
    a. The following parameters in hdfs-site.xml need to be in correct URI format: dfs.namenode.name.dir, dfs.datanode.data.dir, dfs.namenode.checkpoint.dir, dfs.namenode.checkpoint.edits.dir.
    b. You need to manually create blank dfs.include and dfs.exclude files.
    c. You may need to manually format the Hadoop file system (hadoop namenode -format).
    d. core-site.xml – value of fs.default.name should be file:///
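
    To illustrate points 2a–2c above, here is a rough sketch; the c:/hdpdata paths and the Hadoop conf directory are placeholders, not the actual values from this particular install.

    hdfs-site.xml entries in URI format (file:/// scheme with forward slashes):

    <property>
      <name>dfs.namenode.name.dir</name>
      <value>file:///c:/hdpdata/hdfs/nn</value>
    </property>
    <property>
      <name>dfs.datanode.data.dir</name>
      <value>file:///c:/hdpdata/hdfs/dn</value>
    </property>

    dfs.namenode.checkpoint.dir and dfs.namenode.checkpoint.edits.dir follow the same file:/// pattern. Creating the blank include/exclude files and formatting HDFS can then be done from PowerShell:

    # Create empty dfs.include and dfs.exclude in the Hadoop conf directory (path is a placeholder)
    New-Item -ItemType File "C:\hdp\hadoop\etc\hadoop\dfs.include", "C:\hdp\hadoop\etc\hadoop\dfs.exclude"
    # Format the NameNode metadata directory (run once, before the first start of HDFS)
    hdfs namenode -format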

    #48376

    David Goyal
    Participant

    I made some progress but still see issues.
    1. I corrected an issue in the configuration files where Windows file paths were not in URI format. I converted the file names to URI format in core-site.xml, hdfs-site.xml, mapred-site.xml and hive-site.xml.
    2. Now I am trying to resolve the following error in the NameNode logs. It appears to be a compile error in the HDP 2.0 distribution. Any guidance will be appreciated.

    2014-02-07 21:17:35,852 FATAL org.apache.hadoop.hdfs.server.namenode.NameNode: Exception in namenode join
    java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
    at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
    at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:435)
    at org.apache.hadoop.fs.FileUtil.canWrite(FileUtil.java:996)
    at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.analyzeStorage(Storage.java:451)
    at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverStorageDirs(FSImage.java:282)
    at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:200)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFSImage(FSNamesystem.java:794)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFromDisk(FSNamesystem.java:575)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.loadNamesystem(NameNode.java:443)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:491)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:684)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:669)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1254)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1320)
    2014-02-07 21:17:35,854 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1

    #48347

    iandr413
    Moderator

    Hi David,
    I have moved your forum post to our HDP on Windows – Installation section. This may allow others to weigh in on it better. Thank you.

    Ian
