
The legacy Hortonworks Forum is now closed. A read-only version of the former site remains available. The site will be taken offline on January 31, 2016.

HDP on Linux – Installation Forum

What is the correct procedure to shut down a cluster (HMC, name node, data nodes)?

  • #10658

    Hi Sasha:

    I wonder what the correct procedure is to shut down a cluster (HMC, name node, data nodes) and then power it back on.
    My Hadoop cluster consists of 5 nodes: 1 name node (with HMC) and 4 data nodes.
    I once shut down the name node (with HMC) and then powered it back up,
    and after that I never managed to get my cluster working again.
    The error message required me to uninstall the cluster, but the uninstall procedure never succeeded, so I had to remove HMC and then reinstall HMC to deploy the cluster again.
    It made no difference; the situation never improved. Deploying the cluster always fails at the HDFS installation stage or somewhere nearby.
    After the deployment failure, I was required to uninstall the cluster again.
    However, as you can expect, the uninstall procedure fails and never succeeds,
    so I have to remove HMC, reinstall HMC, and start HMC to deploy the cluster.
    Eventually, this deployment fails again at the HDFS installation stage.

    My questions are:
    1. What is the correct procedure to shut down the cluster and then power it back on?
    My Hadoop cluster consists of 5 nodes: 1 name node (with HMC) and 4 data nodes.
    I need to power down these 5 nodes at 9:00 PM every night and power them
    back up at 9:00 AM the next morning.
    2. Why can the cluster uninstall procedure not be completed when something goes wrong during cluster deployment? Is anything wrong with the uninstall procedure or the deployment steps?
    It's a nightmare to uninstall the cluster and then redeploy it;
    it's an endless loop, and the only way I've found to break this failure loop is to format the hard drives,
    reinstall the CentOS Linux environment, set up HMC again, and finally deploy the cluster successfully.


  • Author
  • #10675
    Sasha J


    The correct procedure for shutting down an HDP cluster is to go to the HMC Manage Services tab and select the “Stop All” button. Wait for everything to stop, and then you can power off the machines in any order you wish.

    To power the system back up, you basically reverse the process. Power on the computers in the cluster; then, on the HMC server in the cluster, start HMC with ‘service hmc start’ and ‘service hmc-agent start’. Finally, point your browser at http://&lt;FQDN&gt;/hmc/html (replace &lt;FQDN&gt; with the FQDN of the HMC server in your cluster).

    The above instructions assume that the machines in the cluster are using static IP addresses. If they are using dynamic IPs, you’ll need to update the ‘/etc/hosts’ file on every machine in the cluster before starting HMC, so that every hostname resolves to its current address.
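    For the dynamic-IP case, the /etc/hosts update can be sketched as below. All IP addresses and hostnames here are hypothetical placeholders (not taken from this thread); substitute the addresses your machines actually received, and note that a real run would merge these entries into /etc/hosts on every node, not write a temp file:

    ```shell
    # Sketch of an /etc/hosts block listing every cluster member.
    # Hostnames and IPs are illustrative placeholders only.
    cat > /tmp/cluster-hosts <<'EOF'
    192.168.1.10  hmc-namenode.example.com  hmc-namenode
    192.168.1.11  datanode1.example.com     datanode1
    192.168.1.12  datanode2.example.com     datanode2
    192.168.1.13  datanode3.example.com     datanode3
    192.168.1.14  datanode4.example.com     datanode4
    EOF
    # On a real cluster, append or merge these lines into /etc/hosts on
    # every node, then start HMC on the HMC server:
    #   service hmc start
    #   service hmc-agent start
    wc -l < /tmp/cluster-hosts
    ```

    Keeping one identical hosts block on all five machines avoids the partial-resolution failures that can otherwise surface during HDFS installation.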



    Dear Sasha:

    Thanks a lot for this information.



The forum ‘HDP on Linux – Installation’ is closed to new topics and replies.
