HDP on Linux – Installation Forum

Puppet Failed : Pre Deploy

  • #7546
    Sasha J
    Moderator

    Q: Failed to start HMC : SSLCertificateFile: file ‘/var/lib/puppet/ssl/certs/.pem’ does not exist or is empty

    A: probably a bad hostname, or the hostname has changed since install. Puppet names its SSL certificate after the node's FQDN, so an empty or unresolvable hostname produces the empty ‘.pem’ file name above.

    S: the easiest fix is to:

    1) first verify your hostname:

    # hostname -f

    > if this is invalid, fix it before continuing
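    If `hostname -f` prints nothing or the wrong name, a rough, read-only sketch like this can help pin down the problem (`getent` assumed available; the /etc paths are the usual RHEL/CentOS locations, adjust for your distro):

    ```shell
    # Sketch: sanity-check the FQDN before reinstalling (read-only checks).
    fqdn="$(hostname -f 2>/dev/null)"

    # An empty FQDN is exactly what produces the broken cert path
    # '/var/lib/puppet/ssl/certs/.pem' in the error above.
    if [ -z "$fqdn" ]; then
        echo "FQDN is empty -- set it in /etc/sysconfig/network and /etc/hosts first"
    else
        echo "FQDN: $fqdn"
    fi

    # The name should also resolve; missing resolution causes
    # further puppet failures later in the deploy.
    getent hosts "$fqdn" || echo "FQDN does not resolve -- add an entry to /etc/hosts"
    ```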

    2) reinstall hmc

    # yum -y remove hmc
    # yum -y remove puppet
    # yum -y install hmc

    3) restart hmc

    # service hmc start
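    After the reinstall it is worth confirming that puppet now has a cert named after the real FQDN. A minimal check, assuming the default ssldir from the error message above:

    ```shell
    # Sketch: verify the puppet cert file now carries the FQDN in its name.
    # Path taken from the original error message; adjust if your ssldir differs.
    cert="/var/lib/puppet/ssl/certs/$(hostname -f).pem"
    if [ -s "$cert" ]; then
        echo "OK: $cert exists and is non-empty"
    else
        echo "problem: $cert is missing or empty -- recheck the hostname"
    fi
    ```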


  • #7563
    Sasha J
    Moderator

    Q: Puppet Kick Failed : Timeout

    A: this is most likely a timeout while downloading and installing all of the packages

    S: manually pre-install the packages:

    1) yum erase hmc puppet
    2) yum install hmc
    3) yum install -y hadoop hadoop-libhdfs hadoop-native hadoop-pipes hadoop-sbin hadoop-lzo \
           zookeeper hbase mysql-server hive mysql-connector-java hcatalog \
           oozie.noarch oozie-client.noarch extjs-2.2-1 pig.noarch sqoop \
           templeton templeton-tar-pig-0.0.1.14-1 templeton-tar-hive-0.0.1.14-1 \
           hdp_mon_dashboard hdp_mon_nagios_addons nagios-3.2.3 nagios-plugins-1.4.9 \
           fping net-snmp-utils ganglia-gmetad-3.2.0 ganglia-gmond-3.2.0 gweb hdp_mon_ganglia_addons \
           snappy snappy-devel lzo lzo-devel

    4) service hmc start
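    Before restarting the deploy, a quick spot-check that the big install actually landed can save another timeout round-trip. A rough sketch using `rpm -q` (package names taken from the install list above; rpm assumed present on RHEL/CentOS):

    ```shell
    # Sketch: spot-check a few of the core packages from the install list.
    # Missing packages are reported, not fatal, so the loop always completes.
    for pkg in hadoop zookeeper hbase hive oozie-client pig sqoop; do
        if rpm -q "$pkg" >/dev/null 2>&1; then
            echo "installed: $pkg"
        else
            echo "MISSING:   $pkg"
        fi
    done
    ```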

The topic ‘Puppet Failed : Pre Deploy’ is closed to new replies.
