HDP on Linux – Installation Forum

Single Node HMC Service Start Failure After Reboot – CentOS VM

  • #8181

    I installed a single-node cluster using HMC on a CentOS 6.3 VMware VM (the hostname is just localhost). Installation and testing of HDFS, MapReduce, and ZooKeeper were successful, but the install failed when it got to testing HBase. After rebooting, HDFS fails to start. The operation log is below.

    The operation log is:

    "2": {
        "nodeReport": [],
        "nodeLogs": []
    },
    "3": {
        "nodeReport": [],
        "nodeLogs": []
    },
    "4": {
        "nodeReport": [],
        "nodeLogs": []
    }
    Thank you in advance for any assistance you can provide.


  • #8185
    Sasha J

    Your cluster did not complete installation due to the HBase issue.
    After the reboot, HMC assumed a NEW installation, tried to restart everything from scratch and format the namenode (which was already formatted), and failed on this:
    hadoop::Namenode::Format/Exec[/tmp/checkForFormat.sh]/returns (notice): ERROR: Namenode directory(s) is non empty. Will not format the namenode. List of non-empty namenode dirs /usr/local/hadoop/hdfs/namenode
    Mon Aug 13 08:16:57 -0700 2012 /Stage[2]/Hdp-hadoop::Namenode::Format/Exec[/tmp/checkForFormat.sh]/returns (err): change from notrun to 0 failed: sh /tmp/checkForFormat.sh hdfs /etc/hadoop/conf /var/run/hadoop/hdfs/namenode-formatted /usr/local/hadoop/hdfs/namenode returned 1 instead of one of [0] at /etc/puppet/agent/modules/hdp-hadoop/manifests/namenode/format.pp:49
    Mon Aug 13 08:16:57 -0700 2012 /Stage[2]/Hdp-hadoop::Namenode::Format/Hdp::Exec[set namenode mark]/Anchor[hdp::exec::set namenode mark::begin] (notice): Dependency Exec[/tmp/checkForFormat.sh] has failures: true
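
    For reference, here is a minimal sketch of the kind of check the failing script performs. The argument order and exact contents of /tmp/checkForFormat.sh are assumed from the log line above, not taken from the real script:

        #!/bin/sh
        # Sketch only: approximates the pre-format check implied by the log.
        # Assumed arguments: $1 = HDFS user, $2 = Hadoop conf dir,
        # $3 = "already formatted" marker file, remaining args = namenode dirs.
        HDFS_USER=$1
        CONF_DIR=$2
        MARK_FILE=$3
        shift 3

        # If the marker exists, the namenode was formatted on a previous run.
        if [ -f "$MARK_FILE" ]; then
          exit 0
        fi

        for dir in "$@"; do
          # Refuse to format if any namenode directory already holds data.
          if [ -d "$dir" ] && [ -n "$(ls -A "$dir" 2>/dev/null)" ]; then
            echo "ERROR: Namenode directory(s) is non empty. Will not format the namenode. $dir"
            exit 1   # the non-zero exit code is what puppet reports as the failure
          fi
        done

        # Only a genuinely empty namenode gets formatted.
        su - "$HDFS_USER" -c "hadoop --config $CONF_DIR namenode -format"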

    Please uninstall HMC and puppet, then reinstall HMC and start the installation again.
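
    A rough sketch of that cleanup, assuming the package and service names hmc and puppet and using the directories from your log (adjust paths to your layout):

        # Stop HMC and remove the packages (names assumed).
        service hmc stop
        yum remove -y hmc puppet

        # Clear the old namenode data and the "formatted" marker so the next
        # install can format the namenode cleanly (paths taken from the log above).
        rm -rf /usr/local/hadoop/hdfs/namenode/*
        rm -f /var/run/hadoop/hdfs/namenode-formatted

        # Reinstall HMC and run the installation wizard again.
        yum install -y hmc
        service hmc start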

    Make sure that on the service configuration page you have the heap size for the HBase RegionServer set to at least 1024 MB (the region server cannot start if the heap size is less than 1 GB)…
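
    If you want to verify that value outside the HMC UI, the setting ends up as HBASE_HEAPSIZE in hbase-env.sh (conf path assumed for an HDP layout):

        # /etc/hbase/conf/hbase-env.sh (location assumed)
        # Heap size in MB for HBase daemons; at least 1024 for the region server.
        export HBASE_HEAPSIZE=1024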

    Thank you!

