HDP2 on Windows 2012 R2 Failed_start

This topic contains 3 replies, has 3 voices, and was last updated by  Andrea D’Orio 3 months, 1 week ago.

  • Creator
    Topic
  • #51098

    Peter B
    Participant

    Hello,
    I started to deploy HDP 2.0.6.0 on my Windows Server 2012 R2 machine, with Python 2.7.6 and jdk-7u51-windows-x64. The installation completes; however, when I start the services, the start script reports Failed_Start.

    c:\hdp>start_local_hdp_services.cmd
    starting datanode
    starting derbyserver
    starting historyserver
    starting hiveserver2
    starting hwi
    starting master
    starting metastore
    starting namenode
    starting nodemanager
    starting oozieservice
    starting regionserver
    starting resourcemanager
    starting secondarynamenode
    starting templeton
    starting zkServer
    Sent all start commands.
    total services
    15
    running services
    15
    not yet running services
    0
    Failed_Start

    hadoop-datanode.log:
    2014-04-02 17:48:00,749 ERROR org.apache.hadoop.hdfs.server.common.Util: Syntax error in URI c:\hdpdata\hdfs\dn. Please check hdfs configuration.
    java.net.URISyntaxException: Illegal character in opaque part at index 2: c:\hdpdata\hdfs\dn
    at java.net.URI$Parser.fail(URI.java:2829)
    at java.net.URI$Parser.checkChars(URI.java:3002)
    at java.net.URI$Parser.parse(URI.java:3039)
    at java.net.URI.<init>(URI.java:595)
    at org.apache.hadoop.hdfs.server.common.Util.stringAsURI(Util.java:48)
    at org.apache.hadoop.hdfs.server.common.Util.stringCollectionAsURIs(Util.java:98)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getStorageDirs(DataNode.java:1648)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1638)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1665)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1837)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1858)
    2014-04-02 17:48:00,906 WARN org.apache.hadoop.hdfs.server.common.Util: Path c:\hdpdata\hdfs\dn should be specified as a URI in configuration files. Please update hdfs configuration.
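
    For what it's worth, that last warning suggests the datanode directory needs to be written as a URI rather than a bare Windows path. A minimal sketch of what that could look like in hdfs-site.xml, assuming the directory is configured under dfs.datanode.data.dir (the usual Hadoop 2.x property name; the exact property in this install may differ):

    <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:///c:/hdpdata/hdfs/dn</value>
    <description>datanode storage directory given as a file: URI instead of a bare Windows path</description>
    </property>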

    Cheers,
    p


  • Author
    Replies
  • #51610

    Andrea D’Orio
    Participant

    Hello,

    I think Peter was right that something is wrong, because the Apache Hadoop datanode service starts and then suddenly stops, then starts again and stops, and goes on this way forever!

    The smoke tests fail for me in the same way they do for Peter, and if you look into the hadoop-datanode-xxxxx log file you'll find these lines:

    2014-04-16 11:27:27,539 ERROR org.apache.hadoop.hdfs.server.common.Util: Syntax error in URI c:\hdpdata\hdfs\dn. Please check hdfs configuration.
    java.net.URISyntaxException: Illegal character in opaque part at index 2: c:\hdpdata\hdfs\dn
    2014-04-16 11:27:27,539 WARN org.apache.hadoop.hdfs.server.common.Util: Path c:\hdpdata\hdfs\dn should be specified as a URI in configuration files. Please update hdfs configuration.

    Do you have some suggestions?

    Thanks,

    Andrea

  • #51225

    Peter B
    Participant

    Thank you. I still worry whether everything is configured successfully, since I get a Failed message from the smoke test below, and when I put a file into HDFS (hdfs dfs -put xxx)
    it is not visible from http://localhost:50075/. However, webhdfs is enabled by default in hdfs-site.xml:
    <property>
    <name>dfs.webhdfs.enabled</name>
    <value>true</value>
    <description>to enable webhdfs</description>
    <final>true</final>
    </property>
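
    As a quick sanity check on whether an uploaded file actually landed in HDFS (just a sketch, assuming the default namenode web port 50070 and a hypothetical /user/Administrator target path; the real path depends on where the file was put):

    hdfs dfs -ls /user/Administrator
    powershell -Command "Invoke-WebRequest 'http://localhost:50070/webhdfs/v1/user/Administrator?op=LISTSTATUS'"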

    smoketest:
    c:\hdpdata\hadoop\local\usercache\Administrator\appcache\application_1396613555326_0001\container_1396613555326_0001_02_000001>if 0 NEQ 0 exit /b 0

    .Failing this attempt.. Failing the application.
    14/04/04 13:16:29 INFO mapreduce.Job: Counters: 0
    Run-HadoopSmokeTest : Hadoop Smoke Test: FAILED
    At line:1 char:1
    + Run-HadoopSmokeTest
    + ~~~~~~~~~~~~~~~~~~~
    + CategoryInfo : NotSpecified: (:) [Write-Error], WriteErrorException
    + FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,Run-HadoopSmokeTest

    Pig smoke test – wordcount using hadoop.cmd file
    DEPRECATED: Use of this script to execute hdfs command is deprecated.
    Instead use the hdfs command for it.

    Regards,
    P

  • #51105

    Rohit Bakhshi
    Moderator

    Peter,

    The “Failed_start” count is 0 and ‘running services’ is 15 out of 15, which means all services have started successfully.

    You can validate that by looking at the ‘Services’ Pane in Windows to see that all the Apache Hadoop services are running.
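
    A quick way to do that same check from PowerShell (only a sketch; it assumes the service display names contain 'hadoop', which may differ between installs):

    Get-Service | Where-Object { $_.DisplayName -like '*hadoop*' } | Format-Table Name, Status, DisplayName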
