HDP on Linux – Installation Forum

Installing HDP 1.2

  • #13687
    sean mikha

    Having some trouble installing HDP 1.2 on CentOS 5 and CentOS 6.

    Everything works up until deploying through Ambari; however, after installation I get multiple failures:
    Oozie check execute, Hive check execute, and WebHCat check execute fail with no log information in stdout or stderr.

    HBase check execute fails as well, and includes the output posted below.
    (Please note I have previously installed HDP on the same Linux distro with the same exact node prep, so I am not sure whether this is something introduced with 1.2 or whether a new prep requirement has been added.)

    notice: /Stage[1]/Hdp::Snappy::Package/Hdp::Snappy::Package::Ln[32]/Hdp::Exec[hdp::snappy::package::ln 32]/Exec[hdp::snappy::package::ln 32]/returns: executed successfully
    notice: /Stage[2]/Hdp-hbase::Hbase::Service_check/File[/tmp/hbaseSmoke.sh]/ensure: defined content as '{md5}a4e08d5388577f1767eb5f8ea8c4a267'
    err: /Stage[2]/Hdp-hbase::Hbase::Service_check/Exec[/tmp/hbaseSmoke.sh]/returns: change from notrun to 0 failed: Command exceeded timeout at /var/lib/ambari-agent/puppet/modules/hdp-hbase/manifests/hbase/service_check.pp:46
    notice: /Stage[2]/Hdp-hbase::Hbase::Service_check/Hdp-hadoop::Exec-hadoop[hbase::service_check::test]/Hdp::Exec[hadoop --config /etc/hadoop/conf fs -test -e /apps/hbase/data/usertable]/Anchor[hdp::exec::hadoop --config /etc/hadoop/conf fs -test -e /apps/hbase/data/usertable::begin]: Dependency Exec[/tmp/hbaseSmoke.sh] has failures: true
    warning: /Stage[2]/Hdp-hbase::Hbase::Service_check/Hdp-hadoop::Exec-hadoop[hbase::service_check::test]/Hdp::Exec[hadoop --config /etc/hadoop/conf fs -test -e /apps/hbase/data/usertable]/Anchor[hdp::exec::hadoop --config /etc/hadoop/conf fs -test -e /apps/hbase/data/usertable::begin]: Skipping because of failed dependencies
    notice: /Stage[2]/Hdp-hbase::Hbase::Service_check/Hdp-hadoop::Exec-hadoop[hbase::service_check::test]/Hdp::Exec[hadoop --config /etc/hadoop/conf fs -test -e /apps/hbase/data/usertable]/Exec[hadoop --config /etc/hadoop/conf fs -test -e /apps/hbase/data/usertable]: Dependency Exec[/tmp/hbaseSmoke.sh] has failures: true
    warning: /Stage[2]/Hdp-hbase::Hbase::Service_check/Hdp-hadoop::Exec-hadoop[hbase::service_check::test]/Hdp::Exec[hadoop --config /etc/hadoop/conf fs -test -e /apps/hbase/data/usertable]/Exec[hadoop --config /etc/hadoop/conf fs -test -e /apps/hbase/data/usertable]: Skipping because of failed dependencies
    notice: /Stage[2]/Hdp-hbase::Hbase::Service_check/Hdp-hadoop::Exec-hadoop[hbase::service_check::test]/Hdp::Exec[hadoop --config /etc/hadoop/conf fs -test -e /apps/hbase/data/usertable]/Anchor[hdp::exec::hadoop --config /etc/hadoop/conf fs -test -e /apps/hbase/data/usertable::end]: Dependency Exec[/tmp/hbaseSmoke.sh] has failures: true
    warning: /Stage[2]/Hdp-hbase::Hbase::Service_check/Hdp-hadoop::Exec-hadoop[hbase::service_check::test]/Hdp::Exec[hadoop --config /etc/hadoop/conf fs -test -e /apps/hbase/data/usertable]/Anchor[hdp::exec::hadoop --config /etc/hadoop/conf fs -test -e /apps/hbase/data/usertable::end]: Skipping because of failed dependencies
    notice: /Stage[2]/Hdp-hbase::Hbase::Service_check/Anchor[hdp-hbase::hbase::service_check::end]: Dependency Exec[/tmp/hbaseSmoke.sh] has failures: true
    warning: /Stage[2]/Hdp-hbase::Hbase::Service_check/Anchor[hdp-hbase::hbase::service_check::end]: Skipping because of failed dependencies
    notice: Finished catalog run in 314.56 seconds
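
    For what it is worth, the failing step is Ambari's HBase smoke test, which loads a small table and then checks that its directory exists in HDFS. A rough manual reproduction (a sketch only; the exact contents of /tmp/hbaseSmoke.sh may differ) helps distinguish a genuinely broken HBase from one that is merely too slow for the timeout:

    # Run as the smoke-test user on a node with the HBase client installed.
    # Create and load a small table, mirroring what the smoke script does:
    echo "create 'usertable', 'family'" | hbase shell
    echo "put 'usertable', 'row01', 'family:c1', 'value1'" | hbase shell
    # Then run the same HDFS existence check the service_check manifest runs:
    hadoop --config /etc/hadoop/conf fs -test -e /apps/hbase/data/usertable && echo "usertable exists in HDFS"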


  • #13688
    sean mikha

    Well… looks like I may have solved my own problem. I had a feeling a lot of the issues may have been around timeouts and the performance of the nodes. I started out with 2 nodes on M1.Small Amazon EC2 instances.

    I changed this to 4 M1.Large nodes and was able to install HDP 1.2.0 on CentOS 6.2 in about 15 minutes.

    I used the RightScale image: ami-043f9c6d
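
    Launching the equivalent cluster with today's AWS CLI would look roughly like the sketch below (the CLI postdates this install, and the key pair and security group names are placeholders):

    # Launch 4 m1.large instances from the RightScale AMI (key/group names are placeholders)
    aws ec2 run-instances --image-id ami-043f9c6d --instance-type m1.large \
        --count 4 --key-name my-keypair --security-groups hdp-cluster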

    The cluster consists of 4 hosts
    Installed and started services successfully on 4 new hosts
    Master services installed
    NameNode installed on ip-10-85-122-86.ec2.internal
    SecondaryNameNode installed on ip-10-116-243-188.ec2.internal
    JobTracker installed on ip-10-116-243-188.ec2.internal
    Nagios Server installed on ip-10-85-122-86.ec2.internal
    Ganglia Server installed on ip-10-85-122-86.ec2.internal
    Hive Metastore installed on ip-10-116-243-188.ec2.internal
    HBase Master installed on ip-10-85-122-86.ec2.internal
    Oozie Server installed on ip-10-116-243-188.ec2.internal
    All services started
    All tests passed
    Install and start completed in 14 minutes and 47 seconds


    Hi Sean,

    Yup, the size of the instance makes a big difference. The small instance doesn’t have enough memory or drive space to run HDP very well.


    rajeev kaul

    I tried with 3 medium-sized nodes and ran into installation failures. Then I switched to a 4-node cluster (1 small for Ambari alone, and 3 large for HDP). I still ran into issues, but was able to figure out from the logs that the Postgres port 5432 was not open. Once I fixed that, I was able to start all the failing services like Hive, Oozie, NameNode, JobTracker, etc. Ambari is a much improved installation program over HMC: you do not have to reinstall from scratch if you run into issues with a particular service. Kudos to Hortonworks for making this much needed change.
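
    (In case it helps anyone else, opening that port on CentOS 5/6 with the stock iptables service is quick; this is a minimal sketch and assumes no custom firewall management on the host:)

    # On the host running the PostgreSQL database, allow inbound 5432
    iptables -I INPUT -p tcp --dport 5432 -j ACCEPT
    # Persist the rule across reboots
    service iptables save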

    I have all the services running fine, except the metrics only show for one of the 3 hosts. The Ganglia and Nagios services seem to be running fine, so I am not quite sure why metrics are not being reported for 2 of the 3 hosts I have configured.
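
    (A first thing to check on the hosts that are not reporting is whether the Ganglia monitoring daemon is up and reachable. HDP wraps gmond/gmetad in its own init scripts, so service names and ports may differ from the stock defaults sketched here:)

    # On each silent host: is the monitoring daemon running?
    ps -ef | grep '[g]mond'
    # From the Ganglia server: is the default gmond TCP port reachable?
    telnet <silent-host> 8649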

