
The legacy Hortonworks Forum is now closed. A read-only version of the former site remains available. The site will be taken offline on January 31, 2016.

HDP on Linux – Installation Forum

custom configuration and remove services with ambari

  • #26700
    Julien Naour

    I have installed a 4-node cluster with Ambari.
    First question: is it possible to remove services through the Ambari Web UI? I see how to add a service but not how to remove one. Same for hosts.
    I am also trying to customize the configuration of one of my hosts: I want different DataNode and MapReduce directories on that host. Is that possible?

  • #26702

    Hi Julien,

    Thanks for trying out Hortonworks Data Platform. Currently there is no way in the Web UI to remove services or hosts.


    Julien Naour

    OK, thanks. Is it the same for customizing the configuration of a specific node?

    Julien Naour

    I got my answer from the Ambari mailing list.

    Thanks Ted,


    Sasha J

    The possibility of changing or adding directories in the HDFS and MapReduce configuration depends on the Ambari version you use. The latest release has this ability, but earlier ones do not.
    So if you download HDP 1.3 and install it, you will have this functionality.
    If you already have data in your cluster and cannot reinstall it from scratch, you can upgrade Ambari itself to the latest version. This will also give you the latest functionality (modifying
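    For reference, in the HDP 1.x era these settings are ordinary Hadoop properties, so on a host with a different disk layout they could also be set in that host's local configuration files. A sketch with placeholder paths (note that Ambari may overwrite manual edits on its next configuration push):

```xml
<!-- hdfs-site.xml on the one host whose disks differ; paths are
     placeholders, and Ambari may overwrite hand edits when it
     pushes configuration -->
<property>
  <name>dfs.data.dir</name>
  <value>/data1/hadoop/hdfs/data,/data2/hadoop/hdfs/data</value>
</property>
<property>
  <name>mapred.local.dir</name>
  <value>/data1/hadoop/mapred,/data2/hadoop/mapred</value>
</property>
```

    The DataNode and TaskTracker on that host would need a restart to pick up the new directories.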

    Thank you!


    How can I manually remove a service from Ambari?
    There is an EXECUTE HBASE_SERVICE_CHECK on a node that is no longer available.
    I’d like to carve this out of the Ambari configuration. I’ve searched the web but can’t find anything :(


    Hi Suaroman,
    Removing services is not available in the current Ambari release, but what you can do is decommission a node.
    I hope that helps.
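    For what it's worth, later Ambari releases expose service removal through the REST API even though the Web UI has no remove button. A hedged sketch (host, cluster name, service and credentials are placeholders; the curl commands are echoed rather than executed so they can be reviewed before being pointed at a real cluster):

```shell
# Sketch: deleting a service via the Ambari REST API (later releases).
# Host, cluster, service and credentials below are placeholders.
AMBARI=http://ambari-host:8080
CLUSTER=mycluster
SERVICE=HBASE
URL="$AMBARI/api/v1/clusters/$CLUSTER/services/$SERVICE"

# 1. Stop the service first by setting its desired state to INSTALLED
#    (a running service cannot be deleted):
echo curl -u admin:admin -H "X-Requested-By: ambari" -X PUT \
  -d '{"ServiceInfo":{"state":"INSTALLED"}}' "$URL"

# 2. Then delete the stopped service:
echo curl -u admin:admin -H "X-Requested-By: ambari" -X DELETE "$URL"
```

    The `X-Requested-By` header is required by Ambari's CSRF protection; without it the server rejects the request.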


    I have a question: I want to connect the Ambari instance running on my local CentOS virtual machine to Ganglia gmond daemons running on nodes in a different cluster. At the moment I am running the Ganglia that ships with Ambari on a single-node cluster, but I need to integrate my local Ambari Ganglia with the remote gmonds running on those other nodes.
    How do I do this? What are the steps?

    Sasha J

    It is not clear what you want to accomplish.
    Could you please clarify your question?

    Thank you!


    My goal is to monitor a remote dev cluster from my local CentOS machine running Ambari. I have 6 hosts in the dev cluster: 5 run gmond and Storm services, and 1 runs gmetad, which monitors those 5 Storm nodes through Ganglia.
    So my requirement is to monitor those 5 gmond hosts from my local CentOS machine running Ambari. Please help me with the steps so that I can monitor those gmonds through my Ambari’s Ganglia.
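    Outside of Ambari (which, as noted below, may not support a cross-cluster setup), plain Ganglia can do this by pointing the local gmetad at the remote gmonds as an extra data source. A sketch, assuming the default gmond port and placeholder hostnames:

```
# /etc/ganglia/gmetad.conf on the local machine
# Poll the remote cluster's gmonds directly (default port 8649);
# hostnames are placeholders.
data_source "storm_dev_cluster" storm-node1.example.com:8649 storm-node2.example.com:8649
```

    After restarting gmetad, the remote cluster appears as a separate grid/cluster in the Ganglia web front end; Ambari itself would still only show its own cluster.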

    Sasha J

    I do not think this is a supported configuration…
    Why not just connect to your remote Ambari server with a browser and do your checks there?


    David Nguyen

    Sorry to dredge this ticket up, but how does one remove a node permanently in a way that Ambari will reflect the change? Is there a way to cleanly delete a node from the Ambari DB?


    Hi David,

    You must ensure you decommission the node as per the documentation, and then you can remove it from the database (if that is what you want to accomplish).

    Is this a production environment or a test environment?
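    The HDFS side of that decommission usually follows a standard sequence. A sketch (the hostname is a placeholder, and the excludes file is written locally here; on a real cluster it is the file named by `dfs.hosts.exclude` in hdfs-site.xml, e.g. /etc/hadoop/conf/dfs.exclude):

```shell
# Sketch: decommissioning a DataNode before removing the host.
# EXCLUDES points at a local file for illustration; on a cluster,
# use the path configured in dfs.hosts.exclude.
EXCLUDES=./dfs.exclude
NODE=worker4.example.com

# 1. List the host to be retired in the excludes file:
echo "$NODE" >> "$EXCLUDES"

# 2. Tell the NameNode to re-read the file and begin decommissioning
#    (commented out so the sketch runs without a cluster):
# hadoop dfsadmin -refreshNodes

# 3. Wait until the NameNode UI shows the node as "Decommissioned";
#    only then is it safe to stop its daemons and remove the host.
```

    Decommissioning first matters because it lets HDFS re-replicate the node's blocks elsewhere before the node disappears.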



The forum ‘HDP on Linux – Installation’ is closed to new topics and replies.
