HDP on Linux – Installation Forum

hmc start fails

  • #6500

    I am following the instructions here:

    1. From a shell on the main installation host, enter:
       service hmc start
    2. Agree to the Oracle JDK license when asked. You must accept this license to be able to download the necessary JDK from Oracle. The JDK is installed during the deploy phase.
    NOTE: If you already have a local copy of the Oracle JDK v1.6 update 31 (32- and 64-bit binaries) accessible from the install host, you can skip this and the next step. Use the Miscellaneous section on the Customize Settings page in the deployment wizard to provide the path to your binaries.
    3. Agree to let the installer download the JDK binaries.

    I did let the JDK binaries download.

    Then I see:

    Starting HMC Installer [ OK ]
    Starting httpd: Syntax error on line 18 of /etc/httpd/conf.d/puppetmaster.conf:
    SSLCertificateFile: file '/var/lib/puppet/ssl/certs/jjscentos64.pem' does not exist or is empty
    Failed to start HMC

    What do I do to correct this error?



  • #6501
    Sasha J

    Looks like your SSL certificate was not created correctly on the HMC host…
    Is this the only host you are trying to set up?
    Please do the following:
    yum erase hmc
    yum erase puppet

    Reboot the node, then:
    yum install hmc
    This should also install puppet as a prerequisite, and the SSL certificate will be created when puppet is installed.


    service hmc start

    You should accept the license again, and then HMC should work.

    Please try it and get back to us.
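
    Putting those steps together, a minimal sketch of the whole sequence (assuming a single-node install, as described above):

    yum erase hmc          # remove the broken HMC install
    yum erase puppet       # remove puppet so its SSL certificates get regenerated
    reboot
    # after the node comes back up:
    yum install hmc        # pulls in puppet as a dependency; puppet creates the SSL certificate on install
    service hmc start      # accept the JDK license again when prompted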


    I did some searching and found a thread in another forum. The certificates were created with the name jjscentos64.localdomain, but the puppetmaster.conf file had the host names without .localdomain.
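
    For anyone checking for the same mismatch, a quick sketch (assuming the default paths shown in the error above):

    ls /var/lib/puppet/ssl/certs/                             # name the certificate was actually issued under
    grep SSLCertificate /etc/httpd/conf.d/puppetmaster.conf   # names the config expects for both entries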

    So I edited the puppetmaster.conf file to use the .localdomain suffix for both entries. The start command now works, but I see this message:

    service hmc start
    Starting HMC Installer
    Starting httpd: httpd: Could not reliably determine the server's fully qualified domain name, using for ServerName
    [ OK ]
    Starting HMC

    Can I ignore the warning about FQDN?


    OK, I fixed the error by adjusting the hosts file to look like this at the beginning: localhost.localdomain localhost jjscentos64.localdomain jjscentos64

    Now I do not get the FQDN warning any more.

    Is this the preferred way to correct the issue? What name should I use in hostdetail.txt for this host?


    Side note: I have CentOS updates turned on, and I am told there is an update to pdsh. I did not have pdsh installed initially, so I added a reference to the Atomic repo and installed pdsh from there. Now the attempt to update pdsh fails with this message:

    file /usr/lib64/pdsh/sshcmd.so from install of pdsh-2.27-1.el5.rf.x86_64 conflicts with file from package pdsh-rcmd-ssh-2.17-1.el5.art.x86_64

    The latter package is apparently the one currently installed from Atomic.

    So do I need to remove the old pdsh and get the right one? Which repo should I use for pdsh?

    Like I said in another thread, my CentOS skills are rusty.

    Sasha J

    First, an FQDN is mandatory for Hadoop functionality.
    If you are unable to set up real DNS, please use your /etc/hosts file(s).
    You should not have more than one line per host in /etc/hosts.
    Give some “real” IP to your box and edit your /etc/hosts file to be something like: localhost.localdomain localhost jjscentos64

    Put the same name (jjscentos64) in /etc/sysconfig/network.

    This way your FQDN will be jjscentos64, and this will work fine.
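
    A minimal sketch of the two files, with 192.168.1.10 standing in as a placeholder for whatever real IP your box has:

    # /etc/hosts
    127.0.0.1      localhost.localdomain localhost
    192.168.1.10   jjscentos64            # placeholder IP; use your box's real address

    # /etc/sysconfig/network
    NETWORKING=yes
    HOSTNAME=jjscentos64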

    As for the pdsh update: why do you want to use it? Please do not update anything.
    Just ignore all updates, and restart HMC after you make the changes in /etc/hosts.
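
    If the failed update keeps nagging, one way to skip it (a sketch, assuming stock yum on CentOS 5) is to exclude the package:

    yum --exclude=pdsh* update      # skip pdsh for this run only
    # or add "exclude=pdsh*" to the [main] section of /etc/yum.conf to make it permanent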

