HDFS Forum

Hostname Change HDP

  • #29770
    Ray Roberts

    I noticed someone asked this before, however it doesn’t look like he received an answer.

    I have installed HDP via Ambari and I now need to change the FQDN of this box. What is the best way to do this without breaking my HDP instance?



  • #29784
    Sasha J

    Are you talking about the Ambari host that you need to change?
    Or some other host?

    It is not recommended to change host names when you use Ambari…

    Thank you!

    Ray Roberts


    Sorry for the delay.

    So, I’m actually running Ambari and Hadoop on a single workstation. I know this isn’t recommended, but it is for testing purposes.

    I installed and set up Ambari on this workstation while it was connected to the internet. It no longer has internet connectivity and now lives on a closed network. This closed network has changed the workstation’s FQDN.

    ‘ambari-server reset’ doesn’t seem to get me anywhere either. I noticed in the ambari-server logs that it is failing because it no longer sees the original hostname. Is there somewhere I can change the hostname manually?
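    A quick way to confirm which hostname Ambari is still expecting is to search its log for the old FQDN (a diagnostic sketch; the FQDN below is a placeholder and the log path is the usual default, not a value confirmed in this thread):

    # look for references to the original hostname in the Ambari server log
    grep -i "old-host.example.com" /var/log/ambari-server/ambari-server.log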


    Sasha J

    After “ambari-server reset” you have already lost all of the cluster metadata from Ambari, so Ambari is no longer useful; you will need to reinstall the cluster from scratch.
    As for changing the name back to the original one:
    just add the needed hostname line to /etc/hosts (see the sketch below).

    The hostname should correspond to the host name expected in the Hadoop configuration.

    Then start all services manually.

    This is it.

    Thank you!
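    For reference, a minimal sketch of that fix (the IP address and FQDN below are placeholders, not values from this thread; use whatever FQDN your Hadoop and Ambari configuration expects):

    # /etc/hosts — map the expected FQDN back to this machine's address
    192.168.1.10   node1.example.com   node1

    # set the running hostname to match, then verify
    hostname node1.example.com
    hostname -f    # should print node1.example.com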


    hostname -f is not returning anything

    [root@STLSabc1 ~]# hostname -f
    hostname: Unknown host
    Any idea? The network name of the system is STLSabc1 and the /etc/hosts entries are:
    localhost.localdomain localhost4 localhost4.localdomain4
    ::1 localhost localhost.localdomain localhost6 localhost6.localdomain6

    Shall I start my installation with this setting?

    Thanks Chandra
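    For what it’s worth, a sketch of the kind of /etc/hosts entry that would let hostname -f resolve (the IP address and domain are placeholders; STLSabc1 is taken from the post above):

    # /etc/hosts — hostname -f needs the FQDN to be resolvable, via DNS or a line like this
    192.168.1.20   STLSabc1.example.com   STLSabc1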


    I see this error now

    Starting HMC Installer [ OK ]
    Starting httpd: Syntax error on line 35 of /etc/httpd/conf.d/puppetmaster.conf:
    SSLCertificateFile: file ‘/var/lib/puppet/ssl/certs/.pem’ does not exist or is empty
    Failed to start HMC
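    One possible reading of that error (an assumption, not confirmed in this thread): Puppet names its certificate files after the node's FQDN, so with no resolvable hostname the expected file degenerates to just ".pem". Two quick checks:

    hostname -f                      # should print the FQDN; "Unknown host" means it cannot be resolved
    ls /var/lib/puppet/ssl/certs/    # certificate files are normally named <fqdn>.pem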

    Koelli Mungee

    Hi Chandra,

    Can you give us some background on this? What version of HDP/Ambari are you using? Is this related to changing the hostname? What OS is being used, and do you have a /etc/sysconfig/network file where the hostname is set?
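    For context, on CentOS/RHEL 6 the persistent hostname is typically set in /etc/sysconfig/network; a minimal sketch (the FQDN is a placeholder):

    # /etc/sysconfig/network
    NETWORKING=yes
    HOSTNAME=STLSabc1.example.com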



    Hi Koelli,

    I have a single-node CentOS 6.0 VM and want to install Hortonworks on it. I followed https://blog.codecentric.de/en/2012/12/tutorial-installing-a-apache-hadoop-single-node-cluster-with-hortonworks-data-platform/ because http://hortonworks.com has no documentation about installing on a single-node Linux environment.

    Please advise on the fastest way to get started with MapReduce, Hive and analytics on the Hortonworks Data Platform. I need to build a prototype first and will set up a cluster later, by January next year.

    One option I see is to download Oracle VM on my laptop (spec: 64-bit Windows 7) and then start the Hortonworks Sandbox, but I wanted to do a little bit more.

    Thanks Chandra

    Koelli Mungee


    Please follow our documentation to install HDP using Ambari. Please refer to the following Hortonworks documentation; my advice would be to start fresh,


    Hope this helps,
