HDP on Linux – Installation Forum

HMC Add Node Fail ==> Node Discovery and Preparation Failed

  • #13123

    HMC Add Node Fail

    During the “Node Discovery and Preparation” stage in the HMC tool,
    “Preparing discovered nodes” failed as follows:
    (1) Finding reachable nodes: all 4 nodes succeeded
    (2) Obtaining information about reachable nodes: all 4 nodes succeeded
    (3) Verifying and updating node information: all 4 nodes succeeded
    (4) Preparing discovered nodes: 1 / 4 in progress, 3 succeeded

    The “Node Discovery and Preparation” stage seems very unstable;
    I have tried as many as 5 times to get 1 successful run.
    It fails for no obvious reason,
    and it succeeds for no obvious reason.
    It seems to come down to good luck.
    I just wonder when I can expect a cluster deployment to reliably succeed or fail.

    (1) OS: CentOS 6.3
    (2) Hadoop: HDP 1.1.1.16 + HDP-UTILS-
    (3) Cluster: 4 nodes

    Confirmed Items:
    FQDN ok
    /etc/hosts ok
    iptables stop ok
    SSH keyless ok
    # yum repolist failed

    host001.dmo.com.err 0 byte
    host003.dmo.com.err 0 byte
    host004.dmo.com.err 0 byte
    host005.dmo.com.err 0 byte
    host002.dmo.com.err 47K bytes

    Existing lock /var/run/yum.pid: another copy is running as pid 21564.
    Another app is currently holding the yum lock; waiting for it to exit…
    The other application is: PackageKit
    Memory : 39 M RSS (343 MB VSZ)
    Started: Fri Dec 28 12:50:01 2012 – 2:45:48 ago
    State : Sleeping, pid: 21564
    Another app is currently holding the yum lock; waiting for it to exit…
    The other application is: PackageKit
    Memory : 39 M RSS (343 MB VSZ)
    Started: Fri Dec 28 12:50:01 2012 – 2:45:50 ago
    State : Sleeping, pid: 21564

    I wonder why the yum lock happened after the other 3 nodes had already passed the check. Does it mean that puppet or yum is too busy installing the HDP packages on all 4 nodes at once? Has anyone had the same problem? Any help will be GREATLY appreciated.



  • #13125

    Hi Jeff,

    Thanks for using HDP.

    Actually this is not a bug. The yum lock means that HMC (Ambari) was trying to use yum on that particular host and could not, because that host thinks another process is already using it. You can check whether yum is running with the command “ps aux | grep yum”. If there are any processes in the list other than “grep yum”, find the process ID in the list, stop it with “kill -9 <pid>”, and then retry the HDP installer.
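
    Concretely, the sequence above looks roughly like this (a sketch only; the PID 21564 is the one from your log, so substitute whatever ps actually reports):

        ps aux | grep yum    # list yum processes; ignore the "grep yum" line itself
        kill -9 21564        # replace 21564 with the PID reported by ps
                             # then retry the add-node step in HMC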

    I hope this helps,


    Hi tedr,

    Thanks for your info.

    After “rm -rf /var/run/yum.pid”, it works now.
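
    In case it helps anyone hitting the same state, the recovery boils down to something like the following (a sketch; check first that no yum process is still running, since the lock file should only be removed when it is stale):

        ps aux | grep yum        # confirm no yum process is still running
        rm -f /var/run/yum.pid   # remove the stale lock file
        yum repolist             # confirm yum works again before retrying HMC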



    Larry Liu

    Hi, Jeff

    Thanks for letting us know.

    Have a great 2013.



