
Ambari Forum

Can't Re-add Host; Deleting Is Not Working Properly

  • #57363

    So I messed up the NodeManager on one host and decided to format the whole thing and re-add it. That was a big mistake, I think now.
    I just can't get rid of the host. Unfortunately I can't change the name of the host, since it's a company network. So no matter how often I delete the host, and even restart the Ambari server, I end up with the host still in the host list, or at least with its metadata somewhere in the system. So when I try to add it, I get the message that the host already exists. When I delete it and then re-add it right away, I get a few steps further and then hit a binding exception:

    Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.4.0.v20120608-r11652): org.eclipse.persistence.exceptions.DatabaseException Internal Exception: java.sql.BatchUpdateException: Batch entry 0 INSERT INTO ClusterHostMapping (cluster_id, host_name) VALUES (2, '****') was aborted. Call getNextException to see the cause. Error Code: 0 Call: INSERT INTO ClusterHostMapping (cluster_id, host_name) VALUES (?, ?) bind => [2 parameters bound]

    So I really don't get it: why is it so hard to delete a host properly from the Ambari server? After this message it is of course in the host list again, with no services installed, and I can't add services. If I delete it and restart, it will be back in the list after the restart and I can't add it again. If I delete it and try to add it without a restart, it fails at the last step.

    So what should I do?

  • Author
  • #57364
    Jeff Sposetti

    Hi, can you confirm which version of Ambari you are using?


    Yes, I'm using version 1.6.0.


    Hi Johannes, can you elaborate with more detail how you first deleted the host and attempted to re-add it?
    I’m looking to reproduce the issue.



    I attempted to repro and was able to delete and re-add a host without any errors. I suspect that the clusterhostmapping table still has a reference to the host and may have gotten into an inconsistent state, which is causing the INSERT statement to fail.

    Johannes, do you have permission to log in to the name node and access the database directly? If you do, which database are you using: MySQL, Postgres, or Oracle? I'd be curious to see the records in the clusterhostmapping table.
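    The suspected failure mode above can be sketched in a few lines. This is a hedged illustration only: SQLite stands in for Ambari's actual backing database (Postgres by default), and it assumes the table enforces uniqueness on (cluster_id, host_name), matching the INSERT in the stack trace. The host name is a placeholder.

    ```python
    # Sketch: a stale row left behind in ClusterHostMapping makes the
    # re-add INSERT abort, analogous to the BatchUpdateException above.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE ClusterHostMapping (
            cluster_id INTEGER NOT NULL,
            host_name  TEXT    NOT NULL,
            PRIMARY KEY (cluster_id, host_name)
        )
    """)

    # The "deleted" host left its mapping row behind.
    conn.execute("INSERT INTO ClusterHostMapping VALUES (2, 'node1.example.com')")

    # Re-adding the host issues the same INSERT, which aborts.
    try:
        conn.execute("INSERT INTO ClusterHostMapping VALUES (2, 'node1.example.com')")
    except sqlite3.IntegrityError as e:
        print("re-add failed:", e)

    # Removing the stale row lets the re-add go through.
    conn.execute(
        "DELETE FROM ClusterHostMapping "
        "WHERE cluster_id = 2 AND host_name = 'node1.example.com'"
    )
    conn.execute("INSERT INTO ClusterHostMapping VALUES (2, 'node1.example.com')")
    print("re-add succeeded after cleanup")
    ```

    If this is indeed the state of the table, a SELECT on clusterhostmapping should show the supposedly deleted host still mapped to the cluster.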


    I do, but I won't be in the office till Tuesday, so I can't tell you until then, I'm sorry…
    I'll continue working on this then.

    I'll check the NameNode (?) database on Tuesday and let you know.


    So I'm back at the office, and I have the default database installed, which is JDBC as far as I can see in the configs.
    But I can't get a connection with SQL Developer; maybe I have the wrong port, and I can't find the config for that.
    But is that the right database you mean? The Ambari JDBC one?


    Hi Johannes, I’m actually able to reproduce the issue with a specific sequence of steps and am working on identifying a fix.


    Well, good, so it wasn't my stupidity. Let me know how you fixed it.


