
HDP on Linux – Installation Forum

Ambari (1.2) HDP restart failure

  • #14152

    I finally had a mostly working installation, but after restarting my VM I ran into severe trouble getting things running again.

    I saw Sasha’s recent post referencing ‘service hmc start’, but I have no hmc service; this seems to be a consequence of the Ambari 1.2 install. The only obvious HDP-related services are hdp-gmetad, hdp-gmond, and nagios, to which I issued service start commands.

    I found a similar reference in the documentation to ‘ambari-server start’ and ‘ambari-agent start’, which I ran, but still nothing appeared to start, although at least the Ambari portal now knew that I already had a cluster installed.

    I finally resorted to running the start scripts in the hadoop/bin directory, and everything seems mostly up, although the dashboard is still complaining about an inaccessible Ganglia service and the whole monitoring infrastructure seems flaky, reporting that services are down even though they are up.

    Can someone shed some light on the expected restart procedure for 1.2?
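
    For reference, the restart attempts above amount to the following commands (a sketch; the hdp-* and nagios names are the services listed above, and the Ambari commands assume the stock 1.2 packages):

    service hdp-gmetad start
    service hdp-gmond start
    service nagios start
    ambari-server start
    ambari-agent start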

  • #14162
    Sasha J

    No HDP services are started automatically on the cluster; this is done on purpose and by design. The script in hadoop/bin does not start everything either and should be considered obsolete.
    Your guess about ambari-server and ambari-agent was absolutely right: start both, then log in to the UI (http://name:8080). You will be presented with the actual cluster status and will be able to start the services.

    Hope this helps!
    Thank you!
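
    As a quick sanity check before opening the UI, the standard Ambari init scripts also accept a status argument, so something like this should confirm both daemons are actually up:

    ambari-server status
    ambari-agent status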


    Hi Sasha,
    I am just getting back to this. I have a 5-node HDP 1.2 cluster. On the head-node I did:
    ambari-server start
    ambari-agent start
    On each data-node I did:
    ambari-agent start

    At this point the web UI was active and it knew the services were not running, but when I went to HDFS, for example, the only active button was ‘Stop’. I went ahead and clicked Stop; after all of its subcommands completed, the Start button became active and I was able to use it to bring up the service.

    The stop/start behavior is obviously strange – is there any known workaround? Also, is there any way to start more than one service at a time?


    Sasha J

    You did the right thing and it all works for you now, right?
    The reason it only shows “Stop” initially is that service status is stored in an internal database; if the server dies or is rebooted, the remembered status can be incorrect.
    As for starting multiple services at the same time, that is not currently supported but will be in a future release. For today, you can click Start, confirm, close the pop-up window, then click Start on the next service, and so on.
    In general, a Hadoop cluster should never be stopped (in real life). In a development setup, when you need to stop your VM or the like, stop the cluster from the UI first; on the next boot it will then report correct statuses for the services.

    Hope this helps!
    Thank you!
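
    Spelled out, a clean shutdown sequence for a development VM looks something like this (a sketch; ambari-agent stop and ambari-server stop are the counterparts of the start commands above):

    1. Stop all services from the Ambari web UI.
    2. On every node: ambari-agent stop
    3. On the head node: ambari-server stop
    4. Shut down the VM. On the next boot, run ambari-server start and ambari-agent start again, and the UI will report correct service statuses.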

