
HDP on Linux – Installation Forum

Automatically starting services

  • #27050
    finance turd
    Member

    I am running Hortonworks/Ambari on a single VM for development purposes. We have an automated install for our multi-node production environment.

    I have added to /etc/rc.local:
    ambari-server restart
    ambari-agent restart

    And they both start up after a system reboot… however…

    None of the Hadoop services (HDFS, Namenode, Oozie, etc.) are started.

    I have to go into Ambari and click on each one, and that’s really a pain.

    I tried to find the command to start these services myself, but it is not self-evident, at least not to me.

    Could you please tell me what commands to run at system start-up to bring up not only Ambari but also the Hadoop services that Ambari manages?

    Thank you.

    FT

  • #27104
    tedr
    Moderator

    Hi Finance,

    You can add these commands to your start/stop scripts:

    Stop Services

    curl --user admin:$PASSWORD -i -X PUT -d '{"ServiceInfo": {"state" : "INSTALLED"}}' http://$HOST:8080/api/v1/clusters/$CLUSTER_NAME/services/$SERVICE

    Start Services

    curl --user admin:$PASSWORD -i -X PUT -d '{"ServiceInfo": {"state" : "STARTED"}}' http://$HOST:8080/api/v1/clusters/$CLUSTER_NAME/services/$SERVICE

    You will need to set the following environment variables in your script:
    HOST = the hostname of the Ambari server
    PASSWORD = the password for the Ambari admin user
    CLUSTER_NAME = the name you gave to your cluster
    SERVICE = one of (HDFS, MAPREDUCE, WEBHCAT, ZOOKEEPER, OOZIE, HBASE, HCATALOG, HIVE, GANGLIA, NAGIOS) – this is the best order in which to start the services; reverse it for stopping them. A sketch of a complete start-up script follows below.
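
    For example, a start-up script along these lines would loop over the services in order and wait for each one to come up before starting the next. This is only a sketch: HOST, PASSWORD and CLUSTER_NAME below are placeholder values you will need to change for your environment.

    #!/bin/bash
    # Sketch: start all Ambari-managed services in dependency order,
    # using the REST calls shown above. Placeholder values follow.
    HOST=localhost
    PASSWORD=admin
    CLUSTER_NAME=MyCluster

    # Recommended start order (reverse it for a stop script):
    SERVICES="HDFS MAPREDUCE WEBHCAT ZOOKEEPER OOZIE HBASE HCATALOG HIVE GANGLIA NAGIOS"

    for SERVICE in $SERVICES; do
        echo "Starting $SERVICE ..."
        curl --user admin:$PASSWORD -i -X PUT \
            -d '{"ServiceInfo": {"state" : "STARTED"}}' \
            http://$HOST:8080/api/v1/clusters/$CLUSTER_NAME/services/$SERVICE

        # The PUT only queues an asynchronous request, so poll the service
        # state and wait for STARTED before moving on to the next service.
        until curl -s --user admin:$PASSWORD \
            "http://$HOST:8080/api/v1/clusters/$CLUSTER_NAME/services/$SERVICE?fields=ServiceInfo/state" \
            | grep -q '"STARTED"'; do
            sleep 10
        done
    done

    A matching stop script would reverse the service list and PUT "INSTALLED" instead of "STARTED". Also note that newer Ambari releases reject modifying requests that lack an X-Requested-By header, so you may need to add -H 'X-Requested-By: ambari' to the curl calls.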

    Thanks,
    Ted.
