HDFS Forum

Unable to start the datanode

  • #24405

    I am trying to install HDP2.0 Alpha by following the instructions in http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-

    I am getting the following error while starting the datanode.

    [root@yellowstone-6878 sbin]# ./hadoop-daemon.sh --config /etc/hadoop start datanode
    starting datanode, logging to /usr/lib/hadoop/logs/hadoop-root-datanode-yellowstone-6878.novalocal.out
    /usr/lib/hadoop-hdfs/bin/hdfs: line 24: /usr/lib/hadoop-hdfs/bin/../libexec/hdfs-config.sh: No such file or directory
    /usr/lib/hadoop-hdfs/bin/hdfs: line 132: cygpath: command not found
    /usr/lib/hadoop-hdfs/bin/hdfs: line 164: exec: : not found

    Please help me on this issue.



  • Author
  • #24557

    Hi Shyamala,

    Thanks for using Hortonworks Data Platform.

    The error trace you are getting indicates that the configuration files could not be found in the directory pointed to by the --config option. Are the configuration files directly in /etc/hadoop? If they are in a subdirectory of that path, then --config needs to point at that subdirectory instead. Also, were you able to start the NameNode?
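
    For reference, a minimal sketch of what the daemon script expects (the paths are illustrative and assume the configs live in /etc/hadoop/config, as discussed below):

    # The directory passed to --config should directly contain the config files
    ls /etc/hadoop/config        # core-site.xml  hdfs-site.xml  hadoop-env.sh  ...
    ./hadoop-daemon.sh --config /etc/hadoop/config start datanode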



    Hi Ted,

    Thanks for looking into this issue.

    All the conf files are under /etc/hadoop/config; that’s the location given in the instructions. Should I remove the config directory and put all the conf files directly under /etc/hadoop?

    Yes, my NameNode and Secondary NameNode started fine.


    Peter Rudenko

    Try to run export HADOOP_LIBEXEC_DIR=/usr/lib/hadoop/libexec before starting.
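
    For example, something along these lines (the libexec path is the one Peter suggests; verify it actually exists on your install before relying on it):

    # Confirm hdfs-config.sh is where we expect it (the path may differ on your layout)
    ls /usr/lib/hadoop/libexec/hdfs-config.sh
    # Tell the launcher scripts where to find it, then start the datanode
    export HADOOP_LIBEXEC_DIR=/usr/lib/hadoop/libexec
    ./hadoop-daemon.sh --config /etc/hadoop/config start datanode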

    Grant Liu

    Folks, I ran into this too; it turns out I had misread the instructions. When copying the configuration files, you must copy ALL of the files, not just the ones you changed. It’s possible you’re seeing the same thing.
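
    In other words, something like this (the source path is a placeholder for wherever the HDP instructions had you stage the template configs, and the hdfs:hadoop ownership is an assumption, so adjust it to your setup):

    # Copy the ENTIRE template config directory, not just the files you edited
    cp -a /path/to/hdp-template-conf/* /etc/hadoop/config/
    chown -R hdfs:hadoop /etc/hadoop/config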

    Chris Bennight

    I just did a fresh install on CentOS 6.4 and am getting this issue. From what I can tell, I’ve covered all the bases listed below.

    [root@master conf]# ls -al
    total 116
    drwxr-xr-x. 2 hdfs hadoop 4096 Jun 15 22:56 .
    drwxr-xr-x. 3 hdfs hadoop 4096 Jun 15 22:55 ..
    -rwxr-xr-x. 1 hdfs hadoop 3478 Jun 15 22:35 capacity-scheduler.xml
    -rwxr-xr-x. 1 hdfs hadoop 1020 Jun 15 22:35 commons-logging.properties
    -rwxr-xr-x. 1 hdfs hadoop 171 Jun 15 22:35 container-executor.cfg
    -rwxr-xr-x. 1 hdfs hadoop 7823 Jun 15 22:35 core-site.xml
    -rwxr-xr-x. 1 hdfs hadoop 4417 Jun 15 22:35 hadoop-env.sh
    -rwxr-xr-x. 1 hdfs hadoop 1515 Jun 15 22:35 hadoop-metrics2.properties
    -rwxr-xr-x. 1 hdfs hadoop 1421 Jun 15 22:35 hadoop-metrics2.properties-GANGLIA
    -rwxr-xr-x. 1 hdfs hadoop 2490 Mar 15 05:12 hadoop-metrics.properties
    -rwxr-xr-x. 1 hdfs hadoop 5007 Jun 15 22:35 hadoop-policy.xml
    -rwxr-xr-x. 1 hdfs hadoop 11636 Jun 15 22:35 hdfs-site.xml
    -rwxr-xr-x. 1 hdfs hadoop 3039 Jun 15 22:35 health_check
    -rwxr-xr-x. 1 hdfs hadoop 5361 Jun 15 22:35 log4j.properties
    -rwxr-xr-x. 1 hdfs hadoop 7855 Jun 15 22:35 mapred-site.xml
    -rwxr-xr-x. 1 hdfs hadoop 7 Jun 15 22:56 master
    -rwxr-xr-x. 1 hdfs hadoop 12 Jun 15 22:56 slaves
    -rwxr-xr-x. 1 hdfs hadoop 2316 Mar 15 05:12 ssl-client.xml.example
    -rwxr-xr-x. 1 hdfs hadoop 2251 Mar 15 05:12 ssl-server.xml.example
    -rwxr-xr-x. 1 hdfs hadoop 3208 Jun 15 22:35 yarn-env.sh
    -rwxr-xr-x. 1 hdfs hadoop 4285 Jun 15 22:35 yarn-site.xml
    [root@master conf]# pwd
    [root@master conf]# echo $HADOOP_LIBEXEC_DIR
    [root@master conf]# echo $HADOOP_CONF_DIR
    [root@master conf]# echo $PATH
    [root@master conf]# service hadoop-hdfs-namenode start
    Starting Hadoop namenode: [ OK ]
    starting namenode, logging to /var/log/hadoop/hdfs/hadoop-hdfs-namenode-master.out
    /usr/lib/hadoop-hdfs/bin/hdfs: line 24: /usr/lib/hadoop/lib/hadoop/libexec/hdfs-config.sh: No such file or directory
    /usr/lib/hadoop-hdfs/bin/hdfs: line 132: cygpath: command not found
    /usr/lib/hadoop-hdfs/bin/hdfs: line 164: exec: : not found
    [root@master conf]#

    I’m attempting to swap over from Cloudera (I have a working CDH4 cluster on a different set of VMs), but I’m not sure what exactly is going on here.


    Hi Shyamala,

    Can you try launching with ‘hadoop-daemon.sh --config /etc/hadoop/config start datanode’?


    Chris Bennight

    Not Shyamala here (I hijacked the thread as I’m having the same issue), but I get the exact same response when I run “hadoop-daemon.sh --config /etc/hadoop/config start datanode”.


    Hi Chris,

    Did you copy all of the config files as Grant mentioned?
    It looks like your system is partway through the conversion from CDH to HDP. HDP does not use the scripts in /etc/init.d to launch the daemons, and that may be causing some of your problems.
    Also, when you launched it with the hadoop-daemon.sh command, did you su to the hdfs user?
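
    For example, something like this (a sketch only; the sbin path is an assumption, so adjust it to wherever hadoop-daemon.sh lives on your install):

    # Run the daemon as the hdfs user rather than root
    su - hdfs -c "/usr/lib/hadoop/sbin/hadoop-daemon.sh --config /etc/hadoop/config start datanode"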

