
The legacy Hortonworks Forum is now closed. A read-only version of the former site is still available. The site will be taken offline on January 31, 2016.

HDP on Linux – Installation Forum

Problems with installation and deploying using Ambari

  • #28653

    We are using HDP 1.3. During the process of setting up the cluster we are facing a couple of issues.
    (We have three machines, all on CentOS 5.4; one of them also runs the Ambari server.)

    a.1) We chose auto-registration of ambari-agent. Registration succeeded for server3,
    but failed for the other two (because of SSH).
    a.2) We fixed SSH on all machines, ran ambari-server reset, and tried again.
    This time server3 failed with the exception below:
    Connecting to the following url https://localhost.localdomain:8440/cert/ca
    Failed to connect to https://localhost.localdomain:8440/cert/ca due to [Errno 111] Connection refused

    The agent was using localhost instead of the Ambari server, even though /etc/ambari-agent/conf/ambari-agent.ini was pointing to the proper server.
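    For reference, the agent reads the server hostname from that file's [server] section. A quick sanity check can be sketched in Python (the sample file contents below are hypothetical, but follow the standard ambari-agent.ini layout):

    ```python
    import configparser

    # Hypothetical ambari-agent.ini contents, [server] section as in the
    # standard agent config layout.
    SAMPLE_INI = """\
    [server]
    hostname=ambari-server.example.com
    url_port=8440
    secured_url_port=8441
    """

    def agent_server_hostname(ini_text):
        """Return the hostname the agent will contact, rejecting localhost."""
        cfg = configparser.ConfigParser()
        cfg.read_string(ini_text)
        host = cfg.get("server", "hostname", fallback="localhost")
        if host in ("localhost", "localhost.localdomain", "127.0.0.1"):
            raise ValueError("agent is pointed at localhost, not the Ambari server")
        return host

    print(agent_server_hostname(SAMPLE_INI))  # → ambari-server.example.com
    ```

    If this check passes but the agent still contacts localhost.localdomain, the stale value may be cached elsewhere under /etc/ambari-agent or /usr/lib/ambari*, which would explain why the full delete-and-reinstall below worked.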

    So we tried the following:
    * Removed the rpm and ran ambari-server reset – failed
    * Removed the rpm, deleted /etc/ambari-agent and /usr/lib/ambari*, and retried – it worked

    We would like to know where we went wrong.

    b) After this, installing and starting the services failed with the error below:

    ERROR ServiceComponentHostImpl:721 – Can't handle ServiceComponentHostEvent event at current state, serviceComponentName=GANGLIA_SERVER, hostName=server233.xxxxxx, currentState=INSTALL_FAILED, eventType=HOST_SVCCOMP_OP_
    15:17:12,934 WARN HeartBeatHandler:233 – State machine exception
    org.apache.ambari.server.state.fsm.InvalidStateTransitionException: Invalid event: HOST_SVCCOMP_OP_SUCCEEDED at INSTALL_FAILED

    We retried again; on the fourth attempt it passed. Is there any extra configuration I have to take care of in this case?

    c) After successfully completing everything, we were able to see the dashboard.
    But the next time we use the same URL, it redirects us to the installation wizard.

    All the processes are running and the cluster is healthy. We soon discovered that only this URL redirects us.

    If this is a known problem, is there a proper workaround?


  • Author
  • #28685

    Hi Vivek,

    Did you change the port on which Ambari is running? It usually runs on 8080, not 5858. Anyway, the workaround for the redirection problem is to:
    * close the Ambari page in your browser
    * clear the browser's cache
    * then reload the Ambari main page; the redirection should be gone.
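    To rule out a server-side problem before blaming the browser, you can also ask the Ambari REST API whether a cluster is registered. A minimal sketch (the host, port, and admin credentials below are placeholders; adjust to your setup):

    ```python
    import base64
    import json
    import urllib.request
    import urllib.error

    def list_clusters(base_url, user="admin", password="admin", timeout=5):
        """Return cluster names from /api/v1/clusters, or None if unreachable."""
        req = urllib.request.Request(base_url.rstrip("/") + "/api/v1/clusters")
        token = base64.b64encode(f"{user}:{password}".encode()).decode()
        req.add_header("Authorization", "Basic " + token)
        try:
            with urllib.request.urlopen(req, timeout=timeout) as resp:
                data = json.load(resp)
        except (urllib.error.URLError, OSError):
            return None  # server not reachable
        return [item["Clusters"]["cluster_name"] for item in data.get("items", [])]

    # Example (hypothetical host; port 5858 as used in this thread):
    # print(list_clusters("http://ambari-server:5858"))
    ```

    If this returns your cluster's name, the server knows the cluster is installed and the wizard redirect is purely a client-side issue.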



    Hi Ted,
    Thanks for the response.
    Yes, we changed the port to 5858.
    We tried after clearing the browser cache, and even from different machines where the URL had never been opened.
    Still the same behavior.



    Hi Vivek,

    I am assuming that if you enter only http://xxx:5858, it still redirects you to installer step 1, am I correct? I am researching whether there is a postgres or other command that will get the server to recognize that it is already installed. I will update you when I find something.
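    In the meantime, one place to look is the Ambari server's backing Postgres database, since that is where cluster state is persisted. Assuming the default schema (which I have not verified against 1.3), a registered cluster should show up in the clusters table:

    ```sql
    -- Hypothetical check against the default Ambari database schema:
    -- a row here means the server already knows about an installed cluster.
    SELECT cluster_id, cluster_name FROM clusters;
    ```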


The forum ‘HDP on Linux – Installation’ is closed to new topics and replies.
