Ambari Forum

Ambari Services not in safe mode!

  • #23401
    Santosh R
    Member

    Hi, I had to stop the Ambari services and then restart them. In the Ambari web UI I stopped them all successfully (I stopped them simultaneously while background processes were still running). While restarting them I again clicked the “Start” button of different services without waiting for each service to start completely. When all the background processes had finished, none of the services had started. They all display a red blinking signal on the Services page, and none of the “Start” buttons are enabled. Please help me start the services back up.
    Awaiting your reply,
    Santosh


  • #23404
    tedr
    Moderator

    Hi Santosh,

    Thanks for using the Hortonworks Data Platform.

    The manner in which you started the services should not have caused a problem.

    A few things you can try here:

    One – refresh the page in your browser – sometimes the browser just doesn’t pick up that the status has changed.

    Two – look in the ambari-agent logs to see if there is a reason why the services did not start successfully. The logs are located at /var/log/ambari-agent/ambari-agent.log.
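
    For example, a quick way to scan the agent log for recent errors (a sketch – adjust the path or pattern to your install):

    tail -n 200 /var/log/ambari-agent/ambari-agent.log
    grep -i error /var/log/ambari-agent/ambari-agent.log | tail -n 20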

    Three – you can try the following commands to clean up the service status:

    Stop Services
    In a shell, set the following variables to the correct values:
    PASSWORD – the password you entered for the Ambari admin user; if you left it at the default it is ‘admin’
    HOST – the fully qualified host name of your ambari-server machine
    CLUSTER_NAME – the name you gave this cluster when installing it
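
    For example (a sketch with placeholder values – substitute your own password, host, and cluster name):

    export PASSWORD='admin'
    export HOST='ambari-server.example.com'
    export CLUSTER_NAME='MyCluster'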

    curl --user admin:$PASSWORD -i -X PUT -d '{"ServiceInfo": {"state" : "INSTALLED"}}' http://$HOST:8080/api/v1/clusters/$CLUSTER_NAME/services/NAGIOS
    curl --user admin:$PASSWORD -i -X PUT -d '{"ServiceInfo": {"state" : "INSTALLED"}}' http://$HOST:8080/api/v1/clusters/$CLUSTER_NAME/services/GANGLIA
    curl --user admin:$PASSWORD -i -X PUT -d '{"ServiceInfo": {"state" : "INSTALLED"}}' http://$HOST:8080/api/v1/clusters/$CLUSTER_NAME/services/HIVE
    curl --user admin:$PASSWORD -i -X PUT -d '{"ServiceInfo": {"state" : "INSTALLED"}}' http://$HOST:8080/api/v1/clusters/$CLUSTER_NAME/services/HCATALOG
    curl --user admin:$PASSWORD -i -X PUT -d '{"ServiceInfo": {"state" : "INSTALLED"}}' http://$HOST:8080/api/v1/clusters/$CLUSTER_NAME/services/HBASE
    curl --user admin:$PASSWORD -i -X PUT -d '{"ServiceInfo": {"state" : "INSTALLED"}}' http://$HOST:8080/api/v1/clusters/$CLUSTER_NAME/services/OOZIE
    curl --user admin:$PASSWORD -i -X PUT -d '{"ServiceInfo": {"state" : "INSTALLED"}}' http://$HOST:8080/api/v1/clusters/$CLUSTER_NAME/services/ZOOKEEPER
    curl --user admin:$PASSWORD -i -X PUT -d '{"ServiceInfo": {"state" : "INSTALLED"}}' http://$HOST:8080/api/v1/clusters/$CLUSTER_NAME/services/WEBHCAT
    curl --user admin:$PASSWORD -i -X PUT -d '{"ServiceInfo": {"state" : "INSTALLED"}}' http://$HOST:8080/api/v1/clusters/$CLUSTER_NAME/services/MAPREDUCE
    curl --user admin:$PASSWORD -i -X PUT -d '{"ServiceInfo": {"state" : "INSTALLED"}}' http://$HOST:8080/api/v1/clusters/$CLUSTER_NAME/services/HDFS
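
    If you prefer, the same calls can be run in a loop (a sketch using the variables and service names above):

    for SERVICE in NAGIOS GANGLIA HIVE HCATALOG HBASE OOZIE ZOOKEEPER WEBHCAT MAPREDUCE HDFS; do
        curl --user admin:$PASSWORD -i -X PUT -d '{"ServiceInfo": {"state" : "INSTALLED"}}' http://$HOST:8080/api/v1/clusters/$CLUSTER_NAME/services/$SERVICE
    done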

    Thanks,
    Ted.

