
The legacy Hortonworks Forum is now closed. A read-only version of the former site remains available; it will be taken offline on January 31, 2016.

Hive / HCatalog Forum

Hive Smoke Test

  • #39057

    Hi,
    I am trying to run the Hive smoke test, and it fails with:

    warning: Unrecognised escape sequence '\;' in file /var/lib/ambari-agent/puppet/modules/hdp-hive/manifests/hive/service_check.pp at line 32
    warning: Dynamic lookup of $configuration is deprecated. Support will be removed in Puppet 2.8. Use a fully-qualified variable name (e.g., $classname::variable) or parameterized classes.
    notice: /Stage[1]/Hdp::Snappy::Package/Hdp::Snappy::Package::Ln[32]/Hdp::Exec[hdp::snappy::package::ln 32]/Exec[hdp::snappy::package::ln 32]/returns: executed successfully
    err: /Stage[2]/Hdp-hcat::Hcat::Service_check/Exec[hcatSmoke.sh prepare]/returns: change from notrun to 0 failed: /tmp/hcatSmoke.sh: line 26: hcat: command not found
    /tmp/hcatSmoke.sh: line 27: hcat: command not found
    /tmp/hcatSmoke.sh: line 28: hcat: command not found

    There is also an alert:
    Hive Metastore status check CRIT for a day
    CRITICAL: Error accessing hive-metaserver status [/usr/lib64/nagios/plugins/check_hive_metastore_status.sh: line 39: hcat: command not found]

    Any idea?

    Thank you!
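As a hedged starting point for diagnosing this (not confirmed by the thread itself): exit code 127 and "command not found" mean the shell running hcatSmoke.sh cannot see an `hcat` launcher on its PATH. A small generic helper like the following, which is illustrative and not part of the Ambari smoke test, can confirm whether a given command is visible in the environment the agent runs with:

```shell
#!/bin/sh
# Minimal sketch: report whether a command is visible on the current PATH.
# "hcat" is the command the smoke test fails to find; check_cmd itself is
# a hypothetical helper for diagnosis only.
check_cmd() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "found"
  else
    echo "missing"
  fi
}

check_cmd hcat   # on the failing node this would print "missing"
```

Running this as the same user the ambari-agent uses matters, since the agent's PATH can differ from an interactive login shell's.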

  • Author
    Replies
  • #52611
    Umesh
    Participant

    Hi,

    I upgraded HDP from 2.0.6 to 2.1 and ran the Hive smoke test, but it fails with the following error:

    2014-04-30 04:42:36,508 - Error while executing command 'service_check':
    Traceback (most recent call last):
    File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 106, in execute
    method(env)
    File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HIVE/package/scripts/service_check.py", line 44, in service_check
    hcat_service_check()
    File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HIVE/package/scripts/hcat_service_check.py", line 48, in hcat_service_check
    logoutput=True)
    File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
    self.env.run()
    File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
    self.run_action(resource, action)
    File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
    provider_action()
    File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 239, in action_run
    raise ex
    Fail: Execution of 'env JAVA_HOME=/usr/jdk64/jdk1.6.0_31 sh /tmp/hcatSmoke.sh hcatsmokeid090a4da0_date423014 prepare' returned 127. /tmp/hcatSmoke.sh: line 26: hcat: command not found
    /tmp/hcatSmoke.sh: line 27: hcat: command not found
    /tmp/hcatSmoke.sh: line 28: hcat: command not found

    Any ideas?

    Thanks,
    Umesh
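One common cause after an upgrade (an assumption, not confirmed in this thread) is that the HCatalog client is installed but its bin directory is not on the PATH the smoke test runs with. On HDP 2.x layouts the launcher typically lives under /usr/lib/hive-hcatalog/bin; a sketch of prepending that directory before rerunning the check, with the path itself treated as an assumption to verify on your own nodes:

```shell
#!/bin/sh
# Hypothetical fix sketch: prepend the assumed HCatalog bin directory to PATH
# before rerunning the smoke test. /usr/lib/hive-hcatalog/bin is an assumed
# location based on typical HDP 2.x installs; verify it exists on your node.
HCAT_BIN=/usr/lib/hive-hcatalog/bin
PATH="$HCAT_BIN:$PATH"
export PATH

# The first PATH entry should now be the HCatalog bin directory.
echo "$PATH" | cut -d: -f1
```

If `hcat` still cannot be found after this, checking whether the client package was actually installed during the upgrade (for example with the distribution's package manager) would be the next step.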

The forum ‘Hive / HCatalog’ is closed to new topics and replies.
