Namenode and SNamenode not starting


This topic contains 3 replies, has 2 voices, and was last updated by Kevin 1 year ago.

  • Creator
    Topic
  • #50424

    Kevin
    Participant

    Hello,

    I just reinstalled Ambari with the latest version, and everything went fine, with no warnings or other issues… until starting the services, specifically the NameNode and SNameNode.
    The NameNode gives me this log:

    
    notice: /Stage[2]/Hdp-hadoop::Namenode::Format/Exec[/tmp/checkForFormat.sh]/returns: NameNode Dirname = /data/hadoop/hdfs/namenode
    notice: /Stage[2]/Hdp-hadoop::Namenode::Format/Exec[/tmp/checkForFormat.sh]/returns: ERROR: Namenode directory(s) is non empty. Will not format the namenode. List of non-empty namenode dirs  /data/hadoop/hdfs/namenode
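
    For reference, the format step refuses to run whenever the target directory already has contents. A quick way to see what is in the way (a minimal sketch, using the path from the log above):

    # List everything, including dotfiles, in the NameNode metadata dir;
    # checkForFormat.sh will not format while this is non-empty.
    ls -A /data/hadoop/hdfs/namenode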
    

    The SNameNode gives me this log:

    
    notice: /Stage[2]/Hdp-hadoop::Snamenode/Hdp-hadoop::Service[secondarynamenode]/Hdp::Exec[su - hdfs -c  'export HADOOP_LIBEXEC_DIR=/usr/lib/hadoop/libexec && /usr/lib/hadoop/sbin/hadoop-daemon.sh --config /etc/hadoop/conf start secondarynamenode']/Exec[su - hdfs -c  'export HADOOP_LIBEXEC_DIR=/usr/lib/hadoop/libexec && /usr/lib/hadoop/sbin/hadoop-daemon.sh --config /etc/hadoop/conf start secondarynamenode']/returns: starting secondarynamenode, logging to /var/log/hadoop/hdfs/hadoop-hdfs-secondarynamenode-evl2400471.out
    err: /Stage[2]/Hdp-hadoop::Snamenode/Hdp-hadoop::Service[secondarynamenode]/Hdp::Exec[su - hdfs -c  'export HADOOP_LIBEXEC_DIR=/usr/lib/hadoop/libexec && /usr/lib/hadoop/sbin/hadoop-daemon.sh --config /etc/hadoop/conf start secondarynamenode']/Exec[su - hdfs -c  'export HADOOP_LIBEXEC_DIR=/usr/lib/hadoop/libexec && /usr/lib/hadoop/sbin/hadoop-daemon.sh --config /etc/hadoop/conf start secondarynamenode']/returns: change from notrun to 0 failed: su - hdfs -c  'export HADOOP_LIBEXEC_DIR=/usr/lib/hadoop/libexec && /usr/lib/hadoop/sbin/hadoop-daemon.sh --config /etc/hadoop/conf start secondarynamenode' returned 1 instead of one of [0] at /var/lib/ambari-agent/puppet/modules/hdp/manifests/init.pp:487
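
    The log line above points at the daemon's .out file; checking it (and the matching .log file, which follows the same naming convention) usually reveals the real error. A minimal sketch, assuming the paths from the log:

    # Inspect the SecondaryNameNode startup output and log for the underlying error
    tail -n 50 /var/log/hadoop/hdfs/hadoop-hdfs-secondarynamenode-evl2400471.out
    tail -n 50 /var/log/hadoop/hdfs/hadoop-hdfs-secondarynamenode-evl2400471.log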
    

    The DataNodes start correctly, but they shut down about 10 seconds later, presumably because there is no NameNode for them to register with?
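
    One way to confirm whether a NameNode is actually up (a rough sketch; 8020 is only the default RPC port and may differ in your configuration):

    # Is a NameNode JVM running, and is anything listening on the RPC port?
    su - hdfs -c 'jps' | grep -i namenode
    netstat -tlnp 2>/dev/null | grep 8020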

    If you can help me, it would be great.

    Regards,
    Kevin



  • Author
    Replies
  • #50427

    Kevin
    Participant

    The /data/… directory is the only one I didn’t rm during the reinstallation. I think I will start over again.

    I’ll keep you up to date.

    Regards,

    Kevin

    #50426

    Kevin
    Participant

    Hello,
    Yes, there were things in it:

    # cd /data/hadoop/hdfs/namenode/
    # ls
    current  in_use.lock

    I tried to use the hadoop command to format the NameNode. That didn’t work, so I used the rm command instead.
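
    For anyone following along, the sequence was roughly this (a sketch; the format command must run as the hdfs user, and the rm is destructive, so only use it on a fresh install with nothing to keep):

    # Ask HDFS to format the NameNode metadata directory
    su - hdfs -c 'hadoop namenode -format'
    # The format refused because the directory was non-empty, so it was cleared by hand
    # (destructive: this throws away any existing filesystem metadata)
    rm -rf /data/hadoop/hdfs/namenode/*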

    I tried to start the NameNode again, and it got one step further. But this time I don’t understand what is wrong with the config files:


    err: /Stage[2]/Hdp-hadoop::Namenode/Hdp-hadoop::Service[namenode]/Hdp::Exec[su - hdfs -c 'export HADOOP_LIBEXEC_DIR=/usr/lib/hadoop/libexec && /usr/lib/hadoop/sbin/hadoop-daemon.sh --config /etc/hadoop/conf start namenode']/Exec[su - hdfs -c 'export HADOOP_LIBEXEC_DIR=/usr/lib/hadoop/libexec && /usr/lib/hadoop/sbin/hadoop-daemon.sh --config /etc/hadoop/conf start namenode']/returns: change from notrun to 0 failed: su - hdfs -c 'export HADOOP_LIBEXEC_DIR=/usr/lib/hadoop/libexec && /usr/lib/hadoop/sbin/hadoop-daemon.sh --config /etc/hadoop/conf start namenode' returned 1 instead of one of [0] at /var/lib/ambari-agent/puppet/modules/hdp/manifests/init.pp:487
    notice: /Stage[2]/Hdp-hadoop::Namenode/Hdp-hadoop::Service[namenode]/Hdp::Exec[su - hdfs -c 'export HADOOP_LIBEXEC_DIR=/usr/lib/hadoop/libexec && /usr/lib/hadoop/sbin/hadoop-daemon.sh --config /etc/hadoop/conf start namenode']/Anchor[hdp::exec::su - hdfs -c 'export HADOOP_LIBEXEC_DIR=/usr/lib/hadoop/libexec && /usr/lib/hadoop/sbin/hadoop-daemon.sh --config /etc/hadoop/conf start namenode'::end]: Dependency Exec[su - hdfs -c 'export HADOOP_LIBEXEC_DIR=/usr/lib/hadoop/libexec && /usr/lib/hadoop/sbin/hadoop-daemon.sh --config /etc/hadoop/conf start namenode'] has failures: true
    notice: /Stage[2]/Hdp-hadoop::Namenode/Hdp-hadoop::Service[namenode]/Hdp::Exec[sleep 5; ls /var/run/hadoop/hdfs/hadoop-hdfs-namenode.pid >/dev/null 2>&1 && ps `cat /var/run/hadoop/hdfs/hadoop-hdfs-namenode.pid` >/dev/null 2>&1]/Anchor[hdp::exec::sleep 5; ls /var/run/hadoop/hdfs/hadoop-hdfs-namenode.pid >/dev/null 2>&1 && ps `cat /var/run/hadoop/hdfs/hadoop-hdfs-namenode.pid` >/dev/null 2>&1::begin]: Dependency Exec[su - hdfs -c 'export HADOOP_LIBEXEC_DIR=/usr/lib/hadoop/libexec && /usr/lib/hadoop/sbin/hadoop-daemon.sh --config /etc/hadoop/conf start namenode'] has failures: true
    notice: /Stage[2]/Hdp-hadoop::Namenode/Hdp-hadoop::Service[namenode]/Hdp::Exec[sleep 5; ls /var/run/hadoop/hdfs/hadoop-hdfs-namenode.pid >/dev/null 2>&1 && ps `cat /var/run/hadoop/hdfs/hadoop-hdfs-namenode.pid` >/dev/null 2>&1]/Exec[sleep 5; ls /var/run/hadoop/hdfs/hadoop-hdfs-namenode.pid >/dev/null 2>&1 && ps `cat /var/run/hadoop/hdfs/hadoop-hdfs-namenode.pid` >/dev/null 2>&1]: Dependency Exec[su - hdfs -c 'export HADOOP_LIBEXEC_DIR=/usr/lib/hadoop/libexec && /usr/lib/hadoop/sbin/hadoop-daemon.sh --config /etc/hadoop/conf start namenode'] has failures: true
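
    The Puppet output only says the start command returned 1; the actual reason will be in the NameNode's own log. A minimal sketch (daemon logs follow the hadoop-<user>-<daemon>-<host> naming convention, so adjust the host part):

    # The Puppet failure only reports the exit code; the stack trace lives here
    tail -n 100 /var/log/hadoop/hdfs/hadoop-hdfs-namenode-*.log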

    #50425

    Dave
    Moderator

    Hi Kevin,

    You must ensure this directory is empty:

    /data/hadoop/hdfs/namenode

    It looks like it has files left over from your failed install.
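
    Something along these lines will clear it while keeping a backup, in case anything in there still matters (a sketch; adjust the backup path and ownership to your layout):

    # Move the old NameNode metadata aside rather than deleting it outright
    mv /data/hadoop/hdfs/namenode /data/hadoop/hdfs/namenode.bak.$(date +%F)
    # Recreate an empty directory owned by the hdfs user
    mkdir -p /data/hadoop/hdfs/namenode
    chown hdfs:hadoop /data/hadoop/hdfs/namenode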

    Thanks

    Dave
