
HDP on Linux – Installation Forum

TOTALFAILURE at Preparing discovered nodes

  • #11168
    Jonas Kemper

    Hi, when trying to start a cluster I get this error:

    Preparing discovered nodes
    Entry Id : 103
    Final result : TOTALFAILURE
    Progress at the end : 1 / 1 in progress

    output from the

    There might be something wrong with the DNS, but I can ssh with the server name without a password…

  • #11178
    Sasha J

    SELinux is enabled; it MUST be disabled.
    The error is:
    [2012:10:17 15:23:58][ERROR][BootStrap][bootstrap.php:187][]: Error when trying to download gpg key using curl --connect-timeout 30 --fail -s -o /tmp/hmcDownloads-1350487408//RPM-GPG-KEY-Jenkins, output=Array

    Looks like connectivity to the outside world does not work.

    Seth Lyubich

    Please note that your /etc/hosts file does not have an entry for your ethernet IP, so it is possible that your host is not discoverable over ethernet. Please add it and see if you can ping your host using the hostname.

    Also, please make sure that you can ssh without a password using both your IP and your hostname.
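The checks above can be sketched as a short session; the IP address and hostname below are hypothetical placeholders, not values from this thread:

```shell
# Add an /etc/hosts entry mapping the ethernet IP to the FQDN and short name
# (192.168.1.10 and master1.example.com are placeholders)
echo "192.168.1.10  master1.example.com  master1" >> /etc/hosts

# Verify name resolution and passwordless ssh by both hostname and IP
ping -c 1 master1.example.com
ssh master1.example.com hostname -f
ssh 192.168.1.10 hostname -f
```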


    Jonas Kemper

    I disabled SELinux; it is still not working.

    “Looks like connectivity to outside world does not work”

    It is working: I can run the curl command on the console and get the GPG key….
    I can ssh without a password using both IP and hostname,
    and I can ping both.

    Jonas Kemper

    $retVal = 0;
    $output = array();
    exec("curl --connect-timeout 30 --fail -s -o /tmp/hmcDownloads-1350557315//RPM-GPG-KEY-Jenkins", $output, $retVal);
    echo $output;
    echo $retVal;

    If I run this PHP script it returns 0 (in $retVal), which is correct and the same as in bootstrap.php…
    but bootstrap.php still fails…
    I use a proxy server, but I set the variable in .bash_profile, so it should work!?

    Jonas Kemper

    Okay, it is return code 6 from curl:

    Couldn't resolve host. The given remote host was not resolved.

    But if I run it from the console it works… does the PHP exec not use the proxy or DNS settings?
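Exit code 6 from curl does mean a DNS failure. One way to confirm what that code looks like is to run a similar command against a host that can never resolve (the `.invalid` TLD is reserved by RFC 2606, so the name below is a guaranteed-unresolvable placeholder):

```shell
# Same style of command the bootstrap uses, against an unresolvable host
curl --connect-timeout 30 --fail -s -o /dev/null http://no-such-host.invalid/
rc=$?

# Exit code 6 from curl means "Couldn't resolve host"
echo "curl exit code: $rc"
```

If the console returns 0 for the real URL but the script's exec() sees 6, the two are running with different environments (proxy or resolver settings).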

    Sasha J

    Name resolution is incorrect.
    hostname -f should return something meaningful.
    The best way is to place the host information in the /etc/hosts file on all nodes.

    Jonas Kemper

    Why should name resolution be incorrect?
    I currently have only one host; I can ping it with hostname and IP.
    hostname -f returns
    I can get the GPG key with the same command used in the PHP script, but the script itself does not work…

    Jonas Kemper

    Here is another result from my current setup:

    Sasha J

    This looks pretty much like a timeout on accessing the HDP repository…
    Do you have a real server or a VM?
    Also, I noticed that you are trying to install HDP 2.0; before it was HDP 1.1.
    Note that HDP 2.0 is currently in the alpha test stage. If you want to try it, please post your questions to the relevant forum thread. This one is intended for HDP v1.x, which is the production release.


    i use a proxy server, but i set the variable in the .bash_profile so it should work!?

    This is your problem. Ambari runs as root and so cannot load your proxy setting. You should try putting the proxy settings in root's bash profile. And try the same PHP script as root before you do this.
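The suggestion above can be sketched as follows; the proxy address is a placeholder, and the script path assumes the earlier PHP test snippet was saved to a file:

```shell
# Make the proxy visible in root's login environment (proxy URL is hypothetical)
cat >> /root/.bash_profile <<'EOF'
export http_proxy=http://proxy.example.com:3128
export https_proxy=http://proxy.example.com:3128
EOF

# Start a fresh login shell as root so .bash_profile is sourced,
# then re-run the PHP test script (path is a placeholder)
su - root -c 'php /root/curl-test.php'
```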

    Jonas Kemper

    I have now changed the proxy settings to store them in /etc/profile.d/ so they are available for ALL users!

    I ran the PHP script as root (at the moment I am doing everything as root) and it works flawlessly as before (I can download the GPG key), but the whole process of adding nodes still fails with the same error (timeout while running curl…).

    I think ssh is working, because I have some input in …

    I changed the repository to the new 2.0 version just to test, but with both versions it is the same failure…

    I am not giving up 😀 and I can get you all the info you need. Thanks so far, JK
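One plausible reason the failure persists even after the /etc/profile.d/ change (a guess, not confirmed in the thread): /etc/profile.d/ is sourced only by login shells, while a daemon started from an init script inherits init's environment, so the bootstrap process may still run without the proxy variables. The environment a running daemon actually sees can be inspected via /proc; the process name below is an assumption (the HMC bootstrap ran under the web server):

```shell
# Inspect the live environment of the web server process (name is an assumption)
pid=$(pgrep -o httpd)
tr '\0' '\n' < "/proc/$pid/environ" | grep -i proxy

# No output here means the daemon has no proxy variables set, even though
# login shells pick them up from /etc/profile.d/
```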

    Jonas Kemper

    Hey, did you answer my post yesterday? I got an email with “test” or something, but I see nothing here. I am trying to get a VPN connection to my company's network, so we could meet on IRC or something if possible, and I can access the server. I am from Germany, so it is evening when you answer my posts 😀

    Sasha J

    Yes, I sent you e-mail yesterday.
    Here it is again:

    What do you think about WebEx?
    You are in Germany, right?
    It is 9 hours ahead of us…
    What would be a good time to schedule this?

    Please respond to it directly, not through the forums.
    Let us take this issue offline.

    Jonas Kemper

    So, we were not able to solve the problem; it is likely a problem with the proxy…

    I have now installed everything with the gsInstaller method. Many thanks to the Hortonworks Support, who helped me again!

