
HDP on Windows – Installation Forum

Hadoop 2 Errors during install

  • #48274
    Mike Kogan
    Participant

    I was trying to perform a multi-node install with the following settings: http://tinypic.com/view.php?pic=262wdad&s=8#.UvO3M_ldUnR

    It installed successfully, but I see the following errors in the log:

    HADOOP-CMD FAILURE: e-client-hs-plugins-2.2.0.2.0.6.0-0009.jar;c:\hdp\hadoop-2.2.0.2.0.6.0-0009\share\hadoop\mapreduce\hadoop-mapreduce-client-jobclient-2.2.0.2.0.6.0-0009-tests.jar;c:\hdp\hadoop-2.2.0.2.0.6.0-0009\share\hadoop\mapreduce\hadoop-mapreduce-client-jobclient-2.2.0.2.0.6.0-0009.jar;c:\hdp\hadoop-2.2.0.2.0.6.0-0009\share\hadoop\mapreduce\hadoop-mapreduce-client-shuffle-2.2.0.2.0.6.0-0009.jar;c:\hdp\hadoop-2.2.0.2.0.6.0-0009\share\hadoop\mapreduce\hadoop-mapreduce-examples-2.2.0.2.0.6.0-0009.jar
    STARTUP_MSG: build = git@github.com:hortonworks/hadoop-monarch.git -r b845729d6990bc11889a5bdefaf6d2221ef9e6d1; compiled by ‘jenkins’ on 2013-12-21T02:17Z
    STARTUP_MSG: java = 1.6.0_31
    ************************************************************/
    14/02/05 18:04:21 ERROR common.Util: Syntax error in URI c:\hdpdata\hdfs\nn. Please check hdfs configuration.
    java.net.URISyntaxException: Illegal character in opaque part at index 2: c:\hdpdata\hdfs\nn
    at java.net.URI$Parser.fail(URI.java:2810)
    at java.net.URI$Parser.checkChars(URI.java:2983)
    at java.net.URI$Parser.parse(URI.java:3020)
    at java.net.URI.<init>(URI.java:577)
    at org.apache.hadoop.hdfs.server.common.Util.stringAsURI(Util.java:48)
    at org.apache.hadoop.hdfs.server.common.Util.stringCollectionAsURIs(Util.java:98)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getStorageDirs(FSNamesystem.java:1119)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getNamespaceDirs(FSNamesystem.java:1074)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:813)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1213)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1320)
    HADOOP-CMD FAILURE: 14/02/05 18:04:21 WARN common.Util: Path c:\hdpdata\hdfs\nn should be specified as a URI in configuration files. Please update hdfs configuration.
    14/02/05 18:04:21 ERROR common.Util: Syntax error in URI c:\hdpdata\hdfs\nn. Please check hdfs configuration.
    java.net.URISyntaxException: Illegal character in opaque part at index 2: c:\hdpdata\hdfs\nn
    at java.net.URI$Parser.fail(URI.java:2810)
    at java.net.URI$Parser.checkChars(URI.java:2983)
    at java.net.URI$Parser.parse(URI.java:3020)
    at java.net.URI.<init>(URI.java:577)
    at org.apache.hadoop.hdfs.server.common.Util.stringAsURI(Util.java:48)
    at org.apache.hadoop.hdfs.server.common.Util.stringCollectionAsURIs(Util.java:98)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getStorageDirs(FSNamesystem.java:1119)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getNamespaceEditsDirs(FSNamesystem.java:1164)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getNamespaceEditsDirs(FSNamesystem.java:1133)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.
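
    From what I can tell, the exception is just java.net.URI rejecting the backslash-style Windows path. A quick standalone test (plain JDK, nothing HDP-specific; the UriCheck class name is only for illustration, and the paths are taken from the log above) reproduces it:

    import java.net.URI;
    import java.net.URISyntaxException;

    public class UriCheck {
        public static void main(String[] args) {
            try {
                // Bare Windows path: "c:" is parsed as a URI scheme, and the
                // backslash that follows is an illegal character in the opaque part.
                new URI("c:\\hdpdata\\hdfs\\nn");
            } catch (URISyntaxException e) {
                System.out.println("Rejected: " + e.getMessage());
            }
            // The file:/// form with forward slashes parses cleanly, which is
            // what the WARN line in the log is asking for.
            System.out.println("Parsed: " + URI.create("file:///c:/hdpdata/hdfs/nn"));
        }
    }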

  • #48276
    Robert Molina
    Keymaster

    Hi Mike,
    Can you post your configuration file?

    Regards,
    Robert

  • #48281
    Mike Kogan
    Participant

    I did not use a configuration file. You can look at the picture to see which properties I set: http://tinypic.com/view.php?pic=262wdad&s=8#.UvO3M_ldUnR

    But here is a configuration file with all the values that I used in the GUI:
    #Log directory
    HDP_LOG_DIR=c:\hadoop\logs

    #Data directory
    HDP_DATA_DIR=c:\hdpdata

    #hosts
    NAMENODE_HOST=hadoop3
    SECONDARY_NAMENODE_HOST=hadoop3
    RESOURCEMANAGER_HOST=hadoop3
    HIVE_SERVER_HOST=hadoop3
    OOZIE_SERVER_HOST=hadoop3
    WEBHCAT_HOST=hadoop3
    SLAVE_HOSTS=hadoop1,hadoop2
    CLIENT_HOSTS=
    HBASE_MASTER=
    HBASE_REGIONSERVERS=
    ZOOKEEPER_HOSTS=
    FLUME_HOSTS=

    #Database host
    DB_FLAVOR=MSSQL
    DB_HOSTNAME=p-dv-dsk-mkog.kcura.corp
    DB_PORT=1433

    #Hive properties
    HIVE_DB_NAME=hive
    HIVE_DB_USERNAME=hive
    HIVE_DB_PASSWORD=xxxx

    #Oozie properties
    OOZIE_DB_NAME=oozie
    OOZIE_DB_USERNAME=oozie
    OOZIE_DB_PASSWORD=xxxx

  • #48438
    Ricardo Colon
    Participant

    I’m getting the same error.
    Does anyone know how to fix this?

    I attempted to install HDP 2.0 on Windows 2008 R2 64-bit.
    Following the instructions in the documentation, I installed Python, Java, the C++ redistributable, and .NET 4.0.

  • #48531
    Wael Elimam
    Participant

    Any solutions for this problem?

  • #48858
    Seth Lyubich
    Moderator

    Hi Mike,

    Can you please provide your NameNode log and your hdfs-site.xml? It looks like the system cannot parse some of the configuration. FTP details are below:

    https://swft.exavault.com/login
    username: dropoff
    password: horton

    Thanks,
    Seth

  • #49375
    David Vance
    Participant

    I had the same error with HDP 2.0 on Windows 2008 R2 64-bit.

    Try adjusting the hdfs-site.xml config file. It contains four properties (dfs.namenode.name.dir, dfs.datanode.data.dir, dfs.namenode.checkpoint.dir, dfs.namenode.checkpoint.edits.dir) that are likely set to a plain Windows path instead of a file:// URI. For example,

    <name>dfs.namenode.name.dir</name>
    <value>c:/hdpdata/hdfs/nn</value>

    should be changed to:

    <name>dfs.namenode.name.dir</name>
    <value>file:///c:/hdpdata/hdfs/nn</value>
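
    If it helps, here is a sketch of all four entries together. Keep whatever paths are already in your hdfs-site.xml and just add the file:/// prefix with forward slashes; the dn and snn directory names below are placeholders, not necessarily the installer defaults:

    <property>
      <name>dfs.namenode.name.dir</name>
      <value>file:///c:/hdpdata/hdfs/nn</value>
    </property>
    <property>
      <name>dfs.datanode.data.dir</name>
      <value>file:///c:/hdpdata/hdfs/dn</value>
    </property>
    <property>
      <name>dfs.namenode.checkpoint.dir</name>
      <value>file:///c:/hdpdata/hdfs/snn</value>
    </property>
    <property>
      <name>dfs.namenode.checkpoint.edits.dir</name>
      <value>file:///c:/hdpdata/hdfs/snn</value>
    </property>

    After the change, restart the HDFS services on the affected nodes so the new values are picked up.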

    Good luck!

  • #52744
    IngEnsi
    Participant

    Hi,
    I had the same problem with HDP 2.0. I used the solution proposed by David: I changed the path from “c:/hdp/hdfs/nn” to “file:///c:/hdp/hdfs/nn” and it works perfectly.
    Thanks, David!

  • #65258
    Kushagra Srivastava
    Participant

    Thanks, it worked.

    <value>file:///home/kushagra/hadoophdfs/hdfs/data</value>

    I gave it like that, but I still cannot see the datanode in my web console. I can see the other datanodes, and when I try to stop it, it says “stopping datanode”, so I guess it is running but just not showing in the web console.

