
The legacy Hortonworks Forum is now closed. You can view a read-only version of the former site by clicking here. The site will be taken offline on January 31, 2016.

Hive / HCatalog Forum

Stinger Phase 3 – error

  • #53012
    Pramod Sharma


I followed the steps as mentioned in the document – and at the last step, when launching Hive with the command below, I get the following error.

    -bash-4.1$ hive -hiveconf hive.optimize.tez=true
    2014-05-06 12:30:39,248 INFO [main] Configuration.deprecation ( – mapred.input.dir.recursive is deprecated. Instead, use mapreduce.input.fileinputformat.input.dir.recursive
    2014-05-06 12:30:39,267 INFO [main] Configuration.deprecation ( – mapred.max.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.maxsize
    2014-05-06 12:30:39,270 INFO [main] Configuration.deprecation ( – mapred.min.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize
    2014-05-06 12:30:39,271 INFO [main] Configuration.deprecation ( – mapred.min.split.size.per.rack is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.rack
    2014-05-06 12:30:39,271 INFO [main] Configuration.deprecation ( – mapred.min.split.size.per.node is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.node
    2014-05-06 12:30:39,272 INFO [main] Configuration.deprecation ( – mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces
    2014-05-06 12:30:39,272 INFO [main] Configuration.deprecation ( – mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative

    Logging initialized using configuration in jar:file:/opt/apache-hive-!/
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/opt/apache-hive-!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
    Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.yarn.util.Apps.addToEnvironment(Ljava/util/Map;Ljava/lang/String;Ljava/lang/String;)V
    at org.apache.tez.client.TezClientUtils.getFrameworkClasspath(
    at org.apache.tez.client.TezClientUtils.createApplicationSubmissionContext(
    at org.apache.tez.client.TezSession.start(
    at org.apache.hadoop.hive.ql.session.SessionState.start(
    at org.
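For later readers: a `NoSuchMethodError` thrown at runtime like the one above usually means the classpath mixes library versions – here, Tez preview jars compiled against a different Hadoop/YARN release than the one installed. A quick sanity check on the node (a sketch; the `/usr/lib/hadoop` path is taken from the SLF4J lines above, adjust for your layout):

```shell
# Show the Hadoop version actually installed on this node
hadoop version

# List the YARN and SLF4J jars Hive will pick up; version numbers here
# must match the Hadoop release the Tez preview was built against
ls /usr/lib/hadoop/lib | grep -Ei 'yarn|slf4j'
```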

  • Author
  • #53023
    Carter Shanklin


    The Stinger Phase 3 tech preview is old at this point; you should use HDP 2.1 directly, which includes this software out of the box.
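For reference: in the Hive 0.13 that ships with HDP 2.1, the tech-preview flag `hive.optimize.tez` was superseded by `hive.execution.engine`. A minimal way to run on Tez there (a sketch; verify the property name against the release notes for your version):

```xml
<!-- hive-site.xml fragment: make Tez the default execution engine -->
<property>
  <name>hive.execution.engine</name>
  <value>tez</value>
</property>
```

The same setting can also be toggled per session with `set hive.execution.engine=tez;` (or `=mr` to fall back to MapReduce).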


    Pramod Sharma

    Thanks for the reply.

    Currently I am using HDP 2.0.6, and I use Ambari to manage the cluster. Please let me know how to upgrade from HDP 2.0.6 to HDP 2.1 using Ambari.


The forum ‘Hive / HCatalog’ is closed to new topics and replies.
