HDP on Linux – Installation Forum

How to start over?

  • #6572

    I seem to be hitting a brick wall with my attempt to install.

    Can I bring my CentOS VM back to a “square one” condition to start the process over again?

    Or do I need to get a clean CentOS VM? The latter is not really attractive — hoping some yum magic will be sufficient.



  • #6573
    Sasha J

    OK, let us do it again.
    First of all, in your case the hostname does not really matter, but as you can imagine, a local domain is just something that is not real. Your choice whether to use it or not… I usually don’t.
    So, let us say the hostname is jjscentos64, and the same name is returned by both the “hostname” and “hostname -f” commands.
    So our /etc/hosts file contains the following line: jjscentos64
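    The hostname setup above can be checked with a couple of commands (jjscentos64 is the example name from this thread; the guard on getent just keeps the check from aborting on a machine that does not have the entry yet):

    ```shell
    # Both commands should print the same name, matching the /etc/hosts entry:
    hostname
    hostname -f
    # Confirm the /etc/hosts entry actually resolves:
    getent hosts jjscentos64 || echo "jjscentos64 is not in /etc/hosts yet"
    ```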

    Now, do the following (in order):
    !!! Make sure you have net-snmp and net-snmp-utils installed. Without them, HMC will fail.

    service hmc stop
    yum erase hmc puppet   (puppet must be removed so that a fresh certificate, tied to the correct node name, is generated in its place)
    yum install hmc   (this also installs puppet as a dependency and creates all the needed certificates)
    service hmc start
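    The four steps above can be sketched as one small script. This is a hedged sketch, not an official procedure: it must run as root on the CentOS VM, and by default (DRY_RUN=1) it only prints the commands instead of executing them.

    ```shell
    #!/bin/sh
    # Sketch of the HMC reinstall sequence from this thread (run as root).
    # DRY_RUN=1 (the default) only echoes each command; set DRY_RUN=0 to execute.
    DRY_RUN=${DRY_RUN:-1}
    run() {
        if [ "$DRY_RUN" = "1" ]; then
            echo "+ $*"
        else
            "$@"
        fi
    }

    # HMC needs net-snmp; make sure it is in place before the restart.
    run yum -y install net-snmp net-snmp-utils
    run service hmc stop
    # Erasing puppet forces fresh certificates for the correct node name.
    run yum -y erase hmc puppet
    # Reinstalling hmc pulls puppet back in and recreates the certificates.
    run yum -y install hmc
    run service hmc start
    ```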

    Once it has started (it will print “Could not reliably determine the server’s fully qualified domain name, using for ServerName”; just ignore this warning), point your browser to

    If you use a “remote” browser and have set up the Windows hosts file to resolve the node name (jjscentos64), you can also use http://jjscentos64/hmc/html.
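    For the “remote browser” case, the Windows hosts file (typically C:\Windows\System32\drivers\etc\hosts) needs a line mapping the VM’s address to its name. The IP below is purely illustrative; substitute the VM’s real address:

    ```
    192.168.56.101    jjscentos64
    ```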

    Please, try and let me know of results.

    Thank you!


    Do the versions of net-snmp and net-snmp-utils matter? I suspect mine may be incompatible as well; perhaps I obtained them from the atomic repository that caused my PHP problems.

    So will the yum install of hmc bring in the right versions of these, or do I need to get them separately?

    Sasha J

    Get rid of the atomic repository and use the CentOS repositories.
    In the current release you have to install the SNMP packages manually.
    Just remove the atomic repository, make sure the CentOS repositories are configured (check with “yum repolist”),
    and run “yum install net-snmp net-snmp-utils”.

    You can do this before or after the HMC installation; HMC does not check for this dependency…
    but definitely do it BEFORE starting HMC.
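    A sketch of that repo cleanup, with the caveat that the filename atomic.repo is a guess on my part; list /etc/yum.repos.d/ to find the actual file name:

    ```shell
    #!/bin/sh
    # Sketch: disable the atomic repo so yum only sees the CentOS repositories.
    # NOTE: the file name atomic.repo is an assumption; check the directory first.
    REPO_DIR=/etc/yum.repos.d
    ATOMIC_REPO="$REPO_DIR/atomic.repo"

    if [ -f "$ATOMIC_REPO" ]; then
        # Renaming with a non-.repo suffix is enough to disable the repository.
        mv "$ATOMIC_REPO" "$ATOMIC_REPO.disabled"
    else
        echo "no atomic repo found at $ATOMIC_REPO"
    fi

    # With only CentOS repositories left, confirm and install the SNMP packages:
    #   yum repolist
    #   yum install net-snmp net-snmp-utils
    ```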



    I noticed that puppet did not regenerate any certs. The existing certificates were left behind and were not removed by the yum erase, so the reinstall of puppet simply left them in place.

    I tried the puppet kick command and I see:

    puppet kick jjscentos64
    Triggering jjscentos64
    Host jjscentos64 failed: Connection refused – connect(2)
    jjscentos64 finished with exit code 2
    Failed: jjscentos64

    This happens regardless of whether hmc is started or not. Maybe this is normal.
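    Since yum erase left the old certificates behind, they can be removed by hand so a reinstall regenerates them; on CentOS, puppet of this era keeps them under /var/lib/puppet/ssl (confirm with “puppet config print ssldir” before deleting anything). The “Connection refused” from puppet kick is a separate matter: kick needs a puppet agent listening on the node, so it is expected to fail when none is running. A hedged sketch of the certificate cleanup:

    ```shell
    #!/bin/sh
    # Sketch: clear stale puppet certificates so a reinstall recreates them.
    # /var/lib/puppet/ssl is puppet's default ssldir on CentOS at this time;
    # verify with `puppet config print ssldir` before deleting.
    SSL_DIR=/var/lib/puppet/ssl
    if [ -d "$SSL_DIR" ]; then
        rm -rf "$SSL_DIR"
        echo "removed $SSL_DIR; reinstalling hmc/puppet will recreate it"
    else
        echo "no stale certificates at $SSL_DIR"
    fi
    ```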

    The HMC web page DOES come up and the process to start creating a cluster seems available to me.

    I am leaving my work location now for the weekend. Not sure if I will have a chance to continue this in the near future but I will keep you posted.

    Sasha J

    OK, let me know when you are able to work on this again.
    I suggest reinstalling the VM with all the prerequisites (SNMP) in place and starting fresh, without any atomic repositories.


