HDP on Linux – Installation Forum

Ambari Install Failure

  • #20134

    I’m using Ambari on 5 servers in EC2. I set up passwordless SSH between the Ambari server and the rest of the nodes, but every node except the Ambari server fails the host check during setup (each goes from Registering -> Failed). I think DNS is set up properly; hostname returns the Private DNS name. Adding each node to the /etc/hosts file doesn’t help.

    sed: can't read /etc/ambari-agent/conf/ambari-agent.ini: No such file or directory
    Verifying Python version compatibility…
    Using python /usr/bin/python2.6
    Checking for previously running Ambari Agent…
    Starting ambari-agent
    Verifying ambari-agent process status…
    ERROR: ambari-agent start failed for unknown reason
    ('hostname: ok ip-10-0-0-119
    ip: ok
    cpu: ok Intel(R) Xeon(R) CPU E5645 @ 2.40GHz
    memory: ok 1.61684 GB
    disks: ok
    Filesystem Size Used Avail Use% Mounted on
    /dev/xvde 7.9G 711M 6.8G 10% /
    tmpfs 828M 0 828M 0% /dev/shm
    os: ok CentOS release 6.3 (Final)
    iptables: ok
    Chain INPUT (policy ACCEPT 0 packets, 0 bytes)
    pkts bytes target prot opt in out source destination
    522 54682 ACCEPT all -- * * state RELATED,ESTABLISHED
    0 0 ACCEPT icmp -- * *
    0 0 ACCEPT all -- lo *
    3 180 ACCEPT tcp -- * * state NEW tcp dpt:22
    0 0 REJECT all -- * * reject-with icmp-host-prohibited

    Chain FORWARD (policy ACCEPT 0 packets, 0 bytes)
    pkts bytes target prot opt in out source destination
    0 0 REJECT all -- * * reject-with icmp-host-prohibited

    Chain OUTPUT (policy ACCEPT 730 packets, 47815 bytes)
    pkts bytes target prot opt in out source destination
    selinux: ok SELINUX=enforcing
    yum: ok yum-3.2.29-30.el6.centos.noarch
    rpm: ok rpm-4.8.0-27.el6.x86_64
    openssl: ok openssl-1.0.0-25.el6_3.1.x86_64
    curl: ok curl-7.19.7-26.el6_2.4.x86_64
    net-snmp: UNAVAILABLE
    net-snmp-utils: UNAVAILABLE
    puppet: UNAVAILABLE
    nagios: UNAVAILABLE
    ganglia: UNAVAILABLE
    passenger: UNAVAILABLE
    hadoop: UNAVAILABLE
    yum_repos: ok
    HDP-UTILS- Hortonworks Data Platform Utils Version - HDP-UTILS-1. 52
    zypper_repos: UNAVAILABLE
    ', None)
    ("INFO 2013-04-02 22:07:51,137 NetUtil.py:45 - DEBUG:: Connecting to the following url https://localhost.localdomain.localdomain.localdomain.localdomain.localdomain:8440/cert/ca
    INFO 2013-04-02 22:07:51,139 NetUtil.py:59 - Failed to connect to https://localhost.localdomain.localdomain.localdomain.localdomain.localdomain:8440/cert/ca due to [Errno -2] Name or service not known
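    For reference, a correctly formed /etc/hosts for a cluster like this would look something like the sketch below; the IPs and hostnames are placeholders, not my actual nodes. Each line maps the private IP to the fully-qualified name first, then the short alias:

    ```shell
    # Hypothetical /etc/hosts entries for a 5-node EC2 cluster; the IPs and
    # names below are placeholders, not the actual nodes from this post.
    cat > /tmp/hosts.example <<'EOF'
    10.0.0.119 ip-10-0-0-119.ec2.internal ip-10-0-0-119
    10.0.0.120 ip-10-0-0-120.ec2.internal ip-10-0-0-120
    10.0.0.121 ip-10-0-0-121.ec2.internal ip-10-0-0-121
    10.0.0.122 ip-10-0-0-122.ec2.internal ip-10-0-0-122
    10.0.0.123 ip-10-0-0-123.ec2.internal ip-10-0-0-123
    EOF
    # Each line: private IP, fully-qualified name, short alias.
    grep -c '^10\.' /tmp/hosts.example   # prints 5 (one entry per node)
    ```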


  • Author
  • #20135

    Hi Geoffrey,

    Thanks for trying out Hortonworks Data Platform.

    In the error log you posted, there look to be two possibilities:

    1. Make sure that iptables is turned off, using the command "service iptables stop" on every node.
    2. DNS still looks a little messed up. Could you post the contents of your /etc/hosts file and the output of the command "hostname -f" on all of the nodes in your cluster?
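    The two checks above can be run on each node roughly like this (a sketch; the getent line is an addition of mine for confirming what the FQDN actually resolves to):

    ```shell
    # Run on every node. The firewall commands need root, so they are
    # shown commented out here.
    # service iptables stop     # stop the firewall for the install
    # chkconfig iptables off    # keep it off across reboots (CentOS 6)

    # DNS sanity check: hostname -f should print the node's fully-qualified
    # name, and that name should resolve to the node's private IP rather
    # than to localhost.localdomain as in the log above.
    fqdn=$(hostname -f 2>/dev/null || hostname)
    echo "FQDN: $fqdn"
    getent hosts "$fqdn" || echo "WARNING: $fqdn does not resolve"
    ```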


