Installing HDP 1.2


This topic contains 3 replies, has 3 voices, and was last updated by rajeev kaul 2 years, 2 months ago.

  • Creator
  • #13687

    sean mikha

    Having some trouble installing HDP 1.2 on CentOS 5 and CentOS 6.

    Everything works up until deploying through Ambari; however, after installation I get multiple failures:
    oozie check execute, hive check execute, and webhcat check execute fail with no log information in stdout or stderr.

    HBase check execute fails as well and includes the output posted below.
    (Please note I have installed HDP before on the same Linux distro with the same exact node prep, so I'm not sure if this is something introduced with 1.2 or whether a new prep requirement has been added.)

    notice: /Stage[1]/Hdp::Snappy::Package/Hdp::Snappy::Package::Ln[32]/Hdp::Exec[hdp::snappy::package::ln 32]/Exec[hdp::snappy::package::ln 32]/returns: executed successfully
    notice: /Stage[2]/Hdp-hbase::Hbase::Service_check/File[/tmp/]/ensure: defined content as '{md5}a4e08d5388577f1767eb5f8ea8c4a267'
    err: /Stage[2]/Hdp-hbase::Hbase::Service_check/Exec[/tmp/]/returns: change from notrun to 0 failed: Command exceeded timeout at /var/lib/ambari-agent/puppet/modules/hdp-hbase/manifests/hbase/service_check.pp:46
    notice: /Stage[2]/Hdp-hbase::Hbase::Service_check/Hdp-hadoop::Exec-hadoop[hbase::service_check::test]/Hdp::Exec[hadoop --config /etc/hadoop/conf fs -test -e /apps/hbase/data/usertable]/Anchor[hdp::exec::hadoop --config /etc/hadoop/conf fs -test -e /apps/hbase/data/usertable::begin]: Dependency Exec[/tmp/] has failures: true
    warning: /Stage[2]/Hdp-hbase::Hbase::Service_check/Hdp-hadoop::Exec-hadoop[hbase::service_check::test]/Hdp::Exec[hadoop --config /etc/hadoop/conf fs -test -e /apps/hbase/data/usertable]/Anchor[hdp::exec::hadoop --config /etc/hadoop/conf fs -test -e /apps/hbase/data/usertable::begin]: Skipping because of failed dependencies
    notice: /Stage[2]/Hdp-hbase::Hbase::Service_check/Hdp-hadoop::Exec-hadoop[hbase::service_check::test]/Hdp::Exec[hadoop --config /etc/hadoop/conf fs -test -e /apps/hbase/data/usertable]/Exec[hadoop --config /etc/hadoop/conf fs -test -e /apps/hbase/data/usertable]: Dependency Exec[/tmp/] has failures: true
    warning: /Stage[2]/Hdp-hbase::Hbase::Service_check/Hdp-hadoop::Exec-hadoop[hbase::service_check::test]/Hdp::Exec[hadoop --config /etc/hadoop/conf fs -test -e /apps/hbase/data/usertable]/Exec[hadoop --config /etc/hadoop/conf fs -test -e /apps/hbase/data/usertable]: Skipping because of failed dependencies
    notice: /Stage[2]/Hdp-hbase::Hbase::Service_check/Hdp-hadoop::Exec-hadoop[hbase::service_check::test]/Hdp::Exec[hadoop --config /etc/hadoop/conf fs -test -e /apps/hbase/data/usertable]/Anchor[hdp::exec::hadoop --config /etc/hadoop/conf fs -test -e /apps/hbase/data/usertable::end]: Dependency Exec[/tmp/] has failures: true
    warning: /Stage[2]/Hdp-hbase::Hbase::Service_check/Hdp-hadoop::Exec-hadoop[hbase::service_check::test]/Hdp::Exec[hadoop --config /etc/hadoop/conf fs -test -e /apps/hbase/data/usertable]/Anchor[hdp::exec::hadoop --config /etc/hadoop/conf fs -test -e /apps/hbase/data/usertable::end]: Skipping because of failed dependencies
    notice: /Stage[2]/Hdp-hbase::Hbase::Service_check/Anchor[hdp-hbase::hbase::service_check::end]: Dependency Exec[/tmp/] has failures: true
    warning: /Stage[2]/Hdp-hbase::Hbase::Service_check/Anchor[hdp-hbase::hbase::service_check::end]: Skipping because of failed dependencies
    notice: Finished catalog run in 314.56 seconds
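    The root failure in that log is the single timed-out Exec at service_check.pp:46; everything after it is merely skipped as a failed dependency. One way to triage, re-using the command that appears verbatim in the log (nothing here is an official procedure), is to run the HDFS half of the smoke test by hand on a cluster node:

    ```shell
    # HDFS check copied from the Puppet log; a hang or failure here points
    # at a slow or undersized cluster rather than at HBase itself.
    check="hadoop --config /etc/hadoop/conf fs -test -e /apps/hbase/data/usertable"
    if $check 2>/dev/null; then
      echo "usertable exists"
    else
      echo "usertable missing or cluster unreachable"
    fi
    ```

    If this command itself takes longer than the service-check timeout, the failures are a symptom of node performance, not of a broken install.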

Viewing 3 replies - 1 through 3 (of 3 total)


  • Author
  • #13882

    rajeev kaul

    I tried with 3 medium-sized nodes and ran into installation failures. I then switched to a 4-node cluster (1 small for Ambari alone, and 3 large for HDP). I still ran into issues, but was able to figure out from the logs that the Postgres port 5432 was not open. Once I fixed that, I was able to start all the failing services like Hive, Oozie, NameNode, JobTracker, etc. Ambari is a much-improved installation program over HMC; you don't have to install from scratch if you run into issues with a particular service. Kudos to Hortonworks for making this much-needed change.
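    For anyone hitting the same wall: a quick way to confirm whether 5432 is reachable from another node. The host name below is a placeholder, and the commented iptables lines are the usual CentOS 5/6 approach, not an official HDP step:

    ```shell
    # Placeholder host; replace with the node running the Ambari Postgres DB.
    DB_HOST=ambari-master.example.com

    # Reachability check using bash's built-in /dev/tcp (no nc required);
    # the file descriptor is opened in a subshell so it closes on exit.
    if (exec 3<>"/dev/tcp/${DB_HOST}/5432") 2>/dev/null; then
      echo "5432 reachable"
    else
      echo "5432 blocked"
      # On the DB host, open the port and persist the rule (CentOS 5/6):
      #   iptables -I INPUT -p tcp --dport 5432 -j ACCEPT
      #   service iptables save
    fi
    ```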

    I have all the services running fine, except that metrics only show for one of the 3 hosts. The Ganglia and Nagios services seem to be running fine, so I'm not quite sure why it is not reporting metrics for 2 of the 3 hosts I have configured.



    Hi Sean,

    Yup, the size of the instance makes a big difference. The small instance doesn’t have enough memory or drive space to run HDP very well.
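    A rough pre-flight check along those lines (the threshold is illustrative: an m1.large has 7.5 GB of RAM while an m1.small has only 1.7 GB; this is not an official HDP minimum):

    ```shell
    # Illustrative pre-flight: verify a node has workable RAM before
    # deploying HDP on it. 7000 MB roughly matches an m1.large (7.5 GB);
    # an m1.small (1.7 GB) fails this check.
    MIN_RAM_MB=7000
    mem_mb=$(awk '/MemTotal/ {print int($2/1024)}' /proc/meminfo)
    if [ "$mem_mb" -ge "$MIN_RAM_MB" ]; then
      echo "RAM ok: ${mem_mb} MB"
    else
      echo "RAM low: ${mem_mb} MB (< ${MIN_RAM_MB} MB)"
    fi
    ```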



    sean mikha

    Well… looks like I may have solved my own problem. I had a feeling a lot of the issues were around timeouts and the performance of the nodes. I started out on an m1.small Amazon EC2 instance for 2 nodes.

    I changed this to a 4-node cluster of m1.large instances and was able to install HDP 1.2.0 on CentOS 6.2 in about 15 minutes.

    I used the RightScale image: ami-043f9c6d

    The cluster consists of 4 hosts:
    Installed and started services successfully on 4 new hosts
    Master services installed
    NameNode installed on ip-10-85-122-86.ec2.internal
    SecondaryNameNode installed on ip-10-116-243-188.ec2.internal
    JobTracker installed on ip-10-116-243-188.ec2.internal
    Nagios Server installed on ip-10-85-122-86.ec2.internal
    Ganglia Server installed on ip-10-85-122-86.ec2.internal
    Hive Metastore installed on ip-10-116-243-188.ec2.internal
    HBase Master installed on ip-10-85-122-86.ec2.internal
    Oozie Server installed on ip-10-116-243-188.ec2.internal
    All services started
    All tests passed
    Install and start completed in 14 minutes and 47 seconds
