HDP on Linux – Installation Forum

HMaster HA via an API call

  • #15987
    Francois BORIE

    Hi !

    I know that, for the moment, it is not possible to deploy additional HMaster instances through the Ambari Web UI.

    I was wondering whether there is any way to do it via an API call. For example, if I want to deploy another HBASE_MASTER on a node that is only running a ZooKeeper instance, I try something like the command below, but it fails with a 404 error:

    [root@oben**** ambari-server]# curl --user admin:admin -d '{"HostRoles":{"cluster_name":"hadoop_poc","component_name":"HBASE_MASTER","host_name":"mytestnode.priv"}}' -X PUT http://oben****:8001/api/v1/clusters/hadoop_poc/hosts/mytestnode.priv/host_components/HBASE_MASTER
    {
      "status" : 404,
      "message" : "org.apache.ambari.server.controller.spi.NoSuchResourceException: The requested resource doesn't exist: ServiceComponentHost not found, clusterName=hadoop_poc, serviceName=HBASE, serviceComponentName=HBASE_MASTER, hostName=mytestnode.priv"
    }
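    For what it's worth, the 404 suggests the host-component resource does not exist yet: in the Ambari REST API a PUT updates an existing resource, while a POST creates one. Below is a minimal sketch of the create-then-install sequence as it works on later Ambari releases (the host, cluster name, and port are taken from the example above; whether the release discussed in this thread accepts it is exactly the open question):

    ```shell
    # 1. Create the HBASE_MASTER host-component resource on the target host (POST, not PUT).
    curl --user admin:admin -X POST \
      http://oben****:8001/api/v1/clusters/hadoop_poc/hosts/mytestnode.priv/host_components/HBASE_MASTER

    # 2. Ask Ambari to install the newly created component (state transition via PUT).
    curl --user admin:admin -X PUT \
      -d '{"HostRoles": {"state": "INSTALLED"}}' \
      http://oben****:8001/api/v1/clusters/hadoop_poc/hosts/mytestnode.priv/host_components/HBASE_MASTER

    # 3. Start it once the install request has completed.
    curl --user admin:admin -X PUT \
      -d '{"HostRoles": {"state": "STARTED"}}' \
      http://oben****:8001/api/v1/clusters/hadoop_poc/hosts/mytestnode.priv/host_components/HBASE_MASTER
    ```

    Note that later Ambari releases also require an `X-Requested-By` header on modifying requests; these commands run against a live cluster, so treat them as an illustration rather than a verified recipe for this release.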

    So I took an alternative approach: replaying the Puppet manifest of the main HBase Master (visible in Ambari) on the test node where I want to deploy my second HMaster.

    (I do something like:
    export RUBYLIB='/usr/lib/ambari-agent/lib/puppet-2.7.9/lib/:/usr/lib/ambari-agent/lib/facter-1.6.10/lib'
    /usr/lib/ambari-agent/lib/ruby-1.8.7-p370/bin/ruby /usr/lib/ambari-agent/lib/puppet-2.7.9/bin/puppet apply --confdir=/var/lib/ambari-agent/puppet --detailed-exitcodes /var/lib/ambari-agent/data/mymanifestwhichdeployshmaster.pp )

    It works, and I get a new HMaster automatically configured on another node, running in the cluster alongside the other HMaster, but it feels like a quick-and-dirty way to achieve HMaster HA for the moment.
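    As a quick sanity check (not from the thread, and assuming the default `zookeeper.znode.parent` of `/hbase`), you can confirm that the second HMaster registered as a standby by inspecting the ZooKeeper znodes from any node with the HBase client installed:

    ```shell
    # The active master holds the /hbase/master znode; standby masters register
    # ephemeral nodes under /hbase/backup-masters (default znode parent assumed).
    hbase zkcli ls /hbase/backup-masters

    # The HBase shell's status command also reports the number of backup masters.
    echo "status 'simple'" | hbase shell
    ```

    If the second HMaster shows up under `/hbase/backup-masters`, the manual deployment at least produced a functioning active/standby pair, whatever Ambari thinks of it.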

    Many thanks for your answer,




  • #16008
    Sasha J

    Ambari, in its current release, does not support adding service components or moving them to a different node.
    Your manual trick may work, but it is not supported.

    Thank you!

