HDP on Windows – Installation Forum

HDP2 on Windows 2012 R2 Failed_start

  • #51098
    Peter B
    Participant

    Hello,
I started to deploy HDP 2.0.6.0 on my Windows Server 2012 R2 machine, with Python 2.7.6 and jdk-7u51-windows-x64. The installation completes; however, when I start the services, the start script reports a failure.

    c:\hdp>start_local_hdp_services.cmd
    starting datanode
    starting derbyserver
    starting historyserver
    starting hiveserver2
    starting hwi
    starting master
    starting metastore
    starting namenode
    starting nodemanager
    starting oozieservice
    starting regionserver
    starting resourcemanager
    starting secondarynamenode
    starting templeton
    starting zkServer
    Sent all start commands.
total services: 15
running services: 15
not yet running services: 0
Failed_Start

    hadoop-datanode.log:
    2014-04-02 17:48:00,749 ERROR org.apache.hadoop.hdfs.server.common.Util: Syntax error in URI c:\hdpdata\hdfs\dn. Please check hdfs configuration.
    java.net.URISyntaxException: Illegal character in opaque part at index 2: c:\hdpdata\hdfs\dn
    at java.net.URI$Parser.fail(URI.java:2829)
    at java.net.URI$Parser.checkChars(URI.java:3002)
    at java.net.URI$Parser.parse(URI.java:3039)
    at java.net.URI.<init>(URI.java:595)
    at org.apache.hadoop.hdfs.server.common.Util.stringAsURI(Util.java:48)
    at org.apache.hadoop.hdfs.server.common.Util.stringCollectionAsURIs(Util.java:98)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getStorageDirs(DataNode.java:1648)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1638)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1665)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1837)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1858)
    2014-04-02 17:48:00,906 WARN org.apache.hadoop.hdfs.server.common.Util: Path c:\hdpdata\hdfs\dn should be specified as a URI in configuration files. Please update hdfs configuration.
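
    The WARN line points at the likely cause: on Windows, a bare path like c:\hdpdata\hdfs\dn is parsed as a URI with scheme "c" and an opaque part containing backslashes. Assuming the directory comes from the dfs.datanode.data.dir property in hdfs-site.xml (the standard Hadoop 2.x property name), a sketch of the URI form the warning asks for would be:

    <!-- Hypothetical fix: express the datanode storage directory as a
         file:// URI with forward slashes instead of a bare Windows path. -->
    <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:///c:/hdpdata/hdfs/dn</value>
    </property>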

    Cheers,
    p


  • Author
    Replies
  • #51105
    Rohit Bakhshi
    Moderator

    Peter,

The “Failed_Start” count is 0 and ‘running services’ is 15/15, which means all services started successfully.

You can validate this in the ‘Services’ pane in Windows by checking that all the Apache Hadoop services are running.

    #51225
    Peter B
    Participant

Thank you. I’m still worried that not everything is configured correctly, because I got a FAILED message from the smoke test below, and when I put a file into HDFS (hdfs dfs -put xxx)
it is not visible from http://localhost:50075/. However, webhdfs is enabled by default in hdfs-site.xml:
<property>
  <name>dfs.webhdfs.enabled</name>
  <value>true</value>
  <description>to enable webhdfs</description>
  <final>true</final>
</property>

    smoketest:
c:\hdpdata\hadoop\local\usercache\Administrator\appcache\application_1396613555326_0001\container_1396613555326_0001_02_000001>if 0 NEQ 0 exit /b 0

    .Failing this attempt.. Failing the application.
    14/04/04 13:16:29 INFO mapreduce.Job: Counters: 0
    Run-HadoopSmokeTest : Hadoop Smoke Test: FAILED
    At line:1 char:1
    + Run-HadoopSmokeTest
    + ~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [Write-Error], WriteErrorException
+ FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,Run-HadoopSmokeTest

    Pig smoke test – wordcount using hadoop.cmd file
    DEPRECATED: Use of this script to execute hdfs command is deprecated.
    Instead use the hdfs command for it.

    Regards,
    P

    #51610
    Andrea D’Orio
    Participant

    Hello,

I think Peter was right saying that there’s something wrong, because the Apache Hadoop datanode service starts and then suddenly stops, then starts again and stops, and goes on this way forever!

The smoke tests fail for me in the same way they fail for Peter, and if you look into the hadoop-datanode-xxxxx log file you’ll find these lines:

    2014-04-16 11:27:27,539 ERROR org.apache.hadoop.hdfs.server.common.Util: Syntax error in URI c:\hdpdata\hdfs\dn. Please check hdfs configuration.
    java.net.URISyntaxException: Illegal character in opaque part at index 2: c:\hdpdata\hdfs\dn
    2014-04-16 11:27:27,539 WARN org.apache.hadoop.hdfs.server.common.Util: Path c:\hdpdata\hdfs\dn should be specified as a URI in configuration files. Please update hdfs configuration.
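
    The exception itself is easy to reproduce outside Hadoop: java.net.URI rejects the backslashes in a bare Windows path, while the file:/// form parses cleanly. A minimal sketch (the class name and helper are mine, not Hadoop’s):

    ```java
    import java.net.URI;
    import java.net.URISyntaxException;

    public class UriCheck {
        // Returns true if java.net.URI accepts the string, false otherwise.
        static boolean parses(String s) {
            try {
                new URI(s);
                return true;
            } catch (URISyntaxException e) {
                return false;
            }
        }

        public static void main(String[] args) {
            // "c:" is taken as the URI scheme, and the backslash at index 2
            // is an illegal character in the opaque part, hence the
            // "Illegal character in opaque part at index 2" in the log.
            System.out.println(parses("c:\\hdpdata\\hdfs\\dn"));
            // An explicit file scheme with forward slashes parses fine.
            System.out.println(parses("file:///c:/hdpdata/hdfs/dn"));
        }
    }
    ```

    So the value really does have to be written as a URI, as the WARN line suggests.
    
    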

    Do you have some suggestions?

    Thanks,

    Andrea

