
HDP on Linux – Installation Forum

Problem on the Add Nodes step

  • #8309

    I need your help, please.
    I'm trying to install HDP on a cluster for testing, and in HMC I can't add nodes.
    I have set up passwordless ssh with an id_dsa key, and hostnames via the hosts file in the /etc directory.
    This method worked fine with Cloudera,
    but when I press the "add nodes" button it fails with:
    "Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password)"
    Any ideas?
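    (A minimal sketch of the setup being described, assuming root access and placeholder hostnames that are not from this thread:)

    # generate a key pair on the HMC host (the poster used id_dsa; id_rsa works the same way)
    ssh-keygen -t rsa -f /root/.ssh/id_rsa -N ""
    # append the public key to root's authorized_keys on each target node
    cat /root/.ssh/id_rsa.pub | ssh root@datanode1 "mkdir -p /root/.ssh && cat >> /root/.ssh/authorized_keys"
    # confirm passwordless login from the HMC host
    ssh root@datanode1 hostname
    # each node's /etc/hosts should map every hostname, e.g.
    # 192.168.1.11   datanode1.example.com   datanode1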

  • #8310
    Sasha J

    It seems like the HMC node has an incorrect key for accessing the newly added nodes.
    You have to use the same key for all nodes; HMC should be able to communicate with every node using that one key.
    Please copy your authorized_keys file to all the nodes you are planning to add.
    Also, make sure that the sshd configuration on the new nodes allows root access (it is disabled by default on RHEL).

    Let us know if there are more questions.
    Thank you!
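    (A minimal sketch of the two suggestions above, assuming the key pair already exists on the HMC host; the node names are placeholders:)

    # push the HMC host's authorized_keys to every node to be added
    for h in namenode datanode1 datanode2 datanode3 datanode4; do
        scp /root/.ssh/authorized_keys root@"$h":/root/.ssh/authorized_keys
    done
    # on each node, allow root logins over ssh:
    # set "PermitRootLogin yes" in /etc/ssh/sshd_config, then
    service sshd restart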


    In reply to Sasha J:
    Thank you, but my problem is not solved.
    I have 6 nodes:
    1 for HMC only, 1 for the namenode and HBase, and 4 hosts for datanodes.
    HMC installed correctly with no problem. I have now created a new key (an rsa key this time, not dsa) and copied it to the /root/.ssh/ directory on all nodes.
    Here is my authorized_keys file:
    ssh-rsa AAAAB….blablabla…eQrQ== root@hmc
    From the hmc host I can now ssh to the other hosts without a password.
    I also use hostdetail.txt to specify the hosts in HMC.
    In the sshd configuration: PermitRootLogin yes
    But I get the same problem: "Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password)"
    I can't understand what I'm doing wrong :(
    Thanks, and sorry for my English :)
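    (One thing worth checking at this point is that every host listed in hostdetail.txt resolves and accepts the key HMC was given. A rough sketch, assuming hostdetail.txt holds one fully qualified hostname per line; the key path is a placeholder:)

    while read h; do
        ssh -i /root/.ssh/id_rsa -o BatchMode=yes root@"$h" true && echo "$h OK" || echo "$h FAILED"
    done < hostdetail.txt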

    Sasha J

    Let us take this offline.
    What will be the best e-mail to communicate with you?

    Thank you!


    In reply to Sasha J:
    OK, I solved the ssh problem; it was my fault.
    But I still can't add nodes:
    Finalizing bootstrapped nodes

    Entry Id : 104
    Final result : FAILED
    Progress at the end : : Of 6 nodes, 1 succeeded and 5 failed
    Additional information :
    Here is a tail of hmc.log:
    [2012:08:17 06:45:24][INFO][PuppetFinalize:txnId=39:subTxnId=104][finalizeNodes.php:390][]: Puppet finalize, succeeded for 1 and failed for 5 of total 6 hosts
    [2012:08:17 06:45:24][INFO][PuppetFinalize:txnId=39:subTxnId=104][finalizeNodes.php:399][]: Completed signing of certs for puppet agents, opStatus=FAILED
    my mail is
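    (The failed "signing of certs for puppet agents" step above usually points at stale or mismatched puppet certificates. A hedged diagnostic sketch for the Puppet 2.x bundled with HMC; treat the paths and commands as assumptions:)

    # on the HMC/puppet master: list signed and pending agent certificates
    puppetca --list --all
    # on an agent node that failed: wipe its ssl state so it can request a fresh certificate
    rm -rf /var/lib/puppet/ssl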

    Sasha J

    It looks like you still have some problems with the keys and certificates…
    Please do the following:
    1. Make sure that the HMC node can access all other nodes by ssh with no password (it looks like this works for you, but let us make sure).
    2. Run
    yum -y erase puppet
    on all nodes and
    yum -y erase hmc puppet
    on the hmc node.
    3. Run
    yum -y install hmc
    on the hmc node.
    4. Run
    service hmc start
    on the hmc node.
    5. Connect to the UI and start the installation.

    If it still fails, please run the script from the post
    and upload the results for us.

    Thank you!
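    (A short sketch of steps 2–4 above, run from the hmc node; the host names are placeholders:)

    # step 2: remove puppet on every other node, plus hmc and puppet on this node
    for h in namenode datanode1 datanode2 datanode3 datanode4; do
        ssh root@"$h" "yum -y erase puppet"
    done
    yum -y erase hmc puppet
    # steps 3 and 4: reinstall and restart hmc on this node
    yum -y install hmc
    service hmc start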

The forum ‘HDP on Linux – Installation’ is closed to new topics and replies.
