HDP on Linux – Installation Forum

ports used for puppet

  • #7701

    I am installing HDP on the Amazon cloud (EC2) and the installation is hanging. I was able to get past one issue by opening port 8139, which puppet seems to use. Are there any other ports that need to be opened (a complete list) that could be causing this issue?

    The details in hmc.log show it waiting for results forever without ever erroring out:

    [2012:07:27 01:26:17][INFO][PuppetInvoker][PuppetInvoker.php:314][waitForResults]: 0 out of 3 nodes have reported for txn 3-2-0
    [2012:07:27 01:26:22][INFO][PuppetInvoker][PuppetInvoker.php:314][waitForResults]: 0 out of 3 nodes have reported for txn 3-2-0
    [2012:07:27 01:26:27][INFO][PuppetInvoker][PuppetInvoker.php:314][waitForResults]: 0 out of 3 nodes have reported for txn 3-2-0
    [2012:07:27 01:26:32][INFO][PuppetInvoker][PuppetInvoker.php:314][waitForResults]: 0 out of 3 nodes have reported for txn 3-2-0
    [2012:07:27 01:26:37][INFO][PuppetInvoker][PuppetInvoker.php:314][waitForResults]: 0 out of 3 nodes have reported for txn 3-2-0

    Any help is appreciated!
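    A quick way to narrow this down is to check which ports the HMC/puppet processes are actually listening on, and whether iptables and the EC2 security group allow them between nodes. A minimal sketch (the process names grepped for here are assumptions; adjust to whatever shows up on your master):

    # On the HMC/puppet master node: list listening TCP ports and the owning process
    netstat -tlnp | egrep 'puppet|ruby|httpd'
    # Current iptables rules (should be empty/ACCEPT if the firewall is stopped)
    iptables -L -n
    # From an agent node: verify it can reach the master on the puppet port seen above, e.g. 8139
    telnet <master-internal-fqdn> 8139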


  • #7704
    Sasha J


    Are you sure you used the INTERNAL FQDN in the hosts file that you uploaded?



    Yes, definitely using the internal FQDNs, such as ip-xx-xx-xx-xx.ec2.internal.

    Is there a list of ports that need to be open?

    Sasha J

    Those ports should not be closed by default. Please post your Linux version info, your iptables status, and your SELinux status.
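    For reference, that information can be gathered on each node with something like the following (a minimal sketch using standard CentOS commands):

    cat /etc/redhat-release   # distribution and version
    uname -a                  # kernel and architecture
    service iptables status   # firewall status
    getenforce                # SELinux status (or: sestatus)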


    I am using CentOS v5.4 HVMx64
    Linux ip-10-17-132-110 2.6.18-274.12.1.el5 #1 SMP Tue Nov 29 13:37:46 EST 2011 x86_64 x86_64 x86_64 GNU/Linux

    iptables (Firewall) is stopped on all servers.

    selinux is disabled. (confirmed in /etc/selinux/config)
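    For anyone following along, the state Stephen describes can be set, and made persistent across reboots, roughly like this on CentOS 5. A sketch, run as root on every node:

    service iptables stop            # stop the firewall now
    chkconfig iptables off           # keep it off after reboot
    setenforce 0                     # put SELinux in permissive mode immediately (no-op if already disabled)
    sed -i 's/^SELINUX=.*/SELINUX=disabled/' /etc/selinux/config   # disabled after reboot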

    Sasha J

    Hi Stephen,

    Is it possible for you to try CentOS 5.8?



    Unfortunately trying CentOS 5.8 is not an option at this point. Are there specific problems with the Hortonworks installation with CentOS 5.4 that are fixed in 5.8?

    Any other ideas or advice?


    Stephen, I just had this same issue yesterday; the hang occurred on the HDFS test step, and again today on the ZooKeeper step. I resolved it both times, and here is what worked for me:

    First, associate your fully qualified domain name with your IP in your /etc/hosts file,
    and make sure it is the same on all of your nodes, e.g.:
    10.190.111.104   ip-10-190-111-104.ec2.internal   Deploy

    Secondly, and specifically for the hang issue, try uninstalling hmc (on all nodes), reinstalling it (only on your deployment node), and pre-installing all the required packages as mentioned here.
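    A sketch of what those two steps look like in practice; the hmc/puppet package and service names are taken from later in this thread, and the grep pattern is an assumption to adapt to your own hostnames:

    # On every node: confirm the internal FQDN resolves consistently
    hostname -f
    grep ec2.internal /etc/hosts
    # On every node: remove the old HMC/puppet state
    yum -y erase hmc puppet
    # On the deployment node only: reinstall and restart HMC
    yum -y install hmc
    service hmc restart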



    Thanks for the advice, Miguel, but I tried this and it still does not work.

    It’s stuck on the “Cluster Install” step and never gets past this point. Very frustrating.



    I tried CentOS 5.x for quite some time with only one success. However, the process has been considerably easier with 6.2 (I think 😛), following this guide from a user on another thread, if you’re curious: http://www.linuxdict.com/2012-06-auto-deploy-hadoop-cluster-with-hdp/

    Here is someone else who is trying it, and I posted a recommendation for an AMI on this thread.


    Also, I noticed you are using EC2; what size instance are you using? And did you preinstall the packages on all of your nodes?


    I am using a 4xlarge instance.

    Yes, I tried preinstalling the packages on the nodes and it still gets stuck in the same place. :-(

    I am now trying CentOS 6.2, but had issues with the AMI suggested earlier in that it had no storage space.


    Does your security group not allow a particular port? I used for my development cluster.


    I also opened up all TCP ports, but that didn’t help. Are there any UDP ports needed?


    I’m not sure, but I opened all TCP, UDP, and ICMP. I think I will attempt 5.x again; let me know how 6.x goes.


    Stephen, I reproduced this issue on CentOS 5.7; going to try 5.8…


    Interestingly enough, it fails on the big yum command, which I had executed before deployment…

    [root@domU-12-31-39-05-68-41 log]# cat puppet_apply.log | grep err
    Wed Aug 01 00:54:30 -0400 2012 /Stage[1]/Hdp::Pre_install_pkgs/Hdp::Exec[yum install $pre_installed_pkgs]/Exec[yum install $pre_installed_pkgs]/returns (err): change from notrun to 0 failed: yum install -y hadoop hadoop-libhdfs.x86_64 hadoop-native.x86_64 hadoop-pipes.x86_64 hadoop-sbin.x86_64 hadoop-lzo hadoop hadoop-libhdfs.i386 hadoop-native.i386 hadoop-pipes.i386 hadoop-sbin.i386 hadoop-lzo zookeeper hbase mysql-server hive mysql-connector-java-5.0.8-1 hive hcatalog oozie.noarch extjs-2.2-1 oozie-client.noarch pig.noarch sqoop mysql-connector-java-5.0.8-1 templeton templeton-tar-pig-0.0.1-1 templeton-tar-hive-0.0.1-1 templeton hdp_mon_dashboard hdp_mon_nagios_addons nagios-3.2.3 nagios-plugins-1.4.9 fping net-snmp-utils ganglia-gmetad-3.2.0 ganglia-gmond-3.2.0 gweb hdp_mon_ganglia_addons ganglia-gmond-3.2.0 gweb hdp_mon_ganglia_addons snappy snappy-devel returned 1 instead of one of [0] at /etc/puppet/agent/modules/hdp/manifests/init.pp:222


    This yum command contains more packages than the command listed in the help files. Furthermore, executing it in the shell yields:
    Package mysql-server is obsoleted by MySQL-server-community, trying to install MySQL-server-community-5.1.55-1.rhel5.x86_64 instead

    mysql50-5.0.96-2.ius.el5.x86_64 from rightscale has depsolving problems
    --> mysql50 conflicts with MySQL-server-community

    I think this is an issue with conflicting repos, similar to a php / php54 problem I experienced earlier.
    Executing: yum --disablerepo=* --enablerepo=HDP- install -y … …
    leads to a successful install. I will let you know what happens next.
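    If you want to confirm which repos are in play before fighting the dependency solver, something like this helps (a sketch using standard yum commands):

    yum repolist enabled          # list every repo yum will consult, with its repo id
    yum info mysql-server         # shows which repo the candidate package comes from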


    No package mysql-server available. Hehe, so you have to get this from outside the HDP repo.


    Looks like both mysql50 and MySQL-server-community are provided by the rightscale repo.

    Decided to use RepoForge (RPMforge):

    wget http://packages.sw.be/rpmforge-release/rpmforge-release-0.5.2-2.el5.rf.x86_64.rpm
    rpm --import http://apt.sw.be/RPM-GPG-KEY.dag.txt
    rpm -K rpmforge-release-0.5.2-2.el5.rf.*.rpm
    rpm -i rpmforge-release-0.5.2-2.el5.rf.*.rpm

    yum --disablerepo=rightscale install mysql-server
    yum --disablerepo=* --enablerepo=HDP- install

    already installed and latest version
    Nothing to do

    Cool, but I still get the same error. Hehe, now what?


    Oh yeah, just because I installed everything doesn’t mean HDP stops executing that yum install command. I guess there are 2 options: 1) disable my other repos, or 2) shut off the yum install command in HDP. I’ll try option 1.


    vim /etc/yum.repos.d/rightscale.repo

    yum -y erase hmc puppet
    yum -y install hmc
    service hmc restart
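    Presumably the vim step above sets enabled=0 for the conflicting repo; a non-interactive equivalent, as a sketch:

    # Disable every section in the rightscale repo file so HMC's yum run never sees it
    sed -i 's/^enabled=1/enabled=0/' /etc/yum.repos.d/rightscale.repo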

    Cool, that should get you past the cluster install step :) and I am going to sleep.


    Hmm, looking at this a bit more carefully this morning: mysql-server is provided by the updates repo from CentOS-Base.repo. The real issue is conflicting repos, so if you can execute the install command without errors or dependency conflicts, it will be smooth sailing when you get to the deployment phase.
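    To verify where a package would come from, yum can report the providing repo directly; a quick sketch:

    yum provides mysql-server     # lists candidate packages and the repo each one comes from
    yum list mysql-server         # shows the version that would be installed and its repo id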


    Pre-Deploy Installations:
    yum erase rrdtool
    yum install rrdtool-1.2.27-3.el5.x86_64
    yum erase php*
    yum install mysql-server net-snmp-utils php-pecl-json

    Timeout / repo conflict: run the big yum install -y … command up front and make sure it completes without errors.
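    Put together as a single pre-deploy script to run on every node before starting the HMC wizard; a sketch built only from the commands above, with the trailing install being the long package list from the failing puppet step:

    #!/bin/sh
    # Pre-deploy fixes collected in this thread; run as root on every node
    yum -y erase rrdtool
    yum -y install rrdtool-1.2.27-3.el5.x86_64
    yum -y erase 'php*'
    yum -y install mysql-server net-snmp-utils php-pecl-json
    # Then run the big 'yum install -y ...' package list by hand and make sure
    # it finishes without errors or dependency conflicts before deploying.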
