HDP on Linux – Installation Forum

Adding A Node to a Single Node Cluster

  • #9946

    Hello,
    I have set up HMC as a single node cluster first.
    I am now trying to add a second node to the cluster…
    I have gone through the steps of setting up password-less SSH on the slave and have verified from the master that it works.
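    Roughly what I ran on the master was something like this (assuming the root account and the default key path; the name slave-node below is just a placeholder for the slave's hostname):

        # Generate an RSA key pair on the master (skip this if ~/.ssh/id_rsa already exists)
        ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa
        # Copy the public key to the slave so root can log in without a password
        ssh-copy-id root@slave-node
        # Verify: this should print the slave's hostname without prompting for a password
        ssh root@slave-node hostname
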
    When I try to “add node” in HMC, it gets stuck at “Node Discovery and Preparation”, on the first step, “Finalizing bootstrapped node”.

    Help?

  • #9952
    Sasha J
    Moderator

    Anand,

    please run the check script mentioned in the following post:
    http://hortonworks.com/community/forums/topic/hmc-installation-support-help-us-help-you/

    Thank you!
    Sasha

    #9965

    Sasha,

    The file has been uploaded.
    The filename is abc.np3-centos5-computer-0.localdomain.127.0.1.1.out

    #9985
    Sasha J
    Moderator

    Your second node (the one you are trying to add) is not registered in the cluster configuration.
    Most likely this is because of incorrect IP address usage:
    you are using 127.0.x.x, which is the loopback network and should not be used for anything else.
    Please change the IP addresses of your nodes to something else (192.168.0.x is a good candidate for this) and then run the node addition again.
    Make sure passwordless SSH is working and all prerequisites are met.
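    For example, on each node check what the hostname resolves to and make sure /etc/hosts maps the real interface IP to the node name (the 192.168.0.11 address and node2 name below are only placeholders):

        # If this prints 127.0.1.1, the hostname is mapped to the loopback range
        hostname -i
        # /etc/hosts should keep loopback and the real address on separate lines, e.g.:
        #   127.0.0.1      localhost.localdomain   localhost
        #   192.168.0.11   node2.localdomain       node2
        # After editing /etc/hosts, re-check hostname -i and retry the node addition

    Once hostname -i returns the real 192.168.0.x address on both nodes, the bootstrap step should be able to register the new node.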

    Sasha
