Problems using HCatalog with Pig: java.lang.NoSuchMethodError: org.apache.hadoop.hive.ql.plan.TableDesc.<init>

This topic contains 4 replies, has 2 voices, and was last updated by Antonio González Artime 8 months ago.

  • Creator
  • #52466


    I’m having some problems accessing HCatalog from Pig. I’m using:
    Hue 2.3.1-385
    HDP 2.1.1
    Hadoop 2.4.0
    Pig 0.12.1
    Hive-Hcatalog 0.13.0

When I try to get a table description from HCatalog, I get the following exception (it occurs in both grunt and Hue):

I think it may be a problem with the hive-exec jar that is being used (hive-exec-), because it looks as if it can't find a specific constructor.

    PIG SCRIPT (-useHCatalog):

records = LOAD 'tfrinspa' USING org.apache.hcatalog.pig.HCatLoader();
records_lim = LIMIT records 2;
DUMP records_lim;

    2014-04-29 11:02:38,448 [main] ERROR – ERROR 2998: Unhandled internal error. org.apache.hadoop.hive.ql.plan.TableDesc.<init>(Ljava/lang/Class;Ljava/lang/Class;Ljava/lang/Class;Ljava/util/Properties;)V
    Details at logfile: /hadoop/yarn/local/usercache/antonio/appcache/application_1398432203922_0060/container_1398432203922_0060_01_000002/pig_1398762151126.log

    ERROR 2998: Unhandled internal error. org.apache.hadoop.hive.ql.plan.TableDesc.<init>(Ljava/lang/Class;Ljava/lang/Class;Ljava/lang/Class;Ljava/util/Properties;)V

    java.lang.NoSuchMethodError: org.apache.hadoop.hive.ql.plan.TableDesc.<init>(Ljava/lang/Class;Ljava/lang/Class;Ljava/lang/Class;Ljava/util/Properties;)V
    at org.apache.hcatalog.common.HCatUtil.getInputJobProperties(
    at org.apache.hcatalog.mapreduce.InitializeInput.extractPartInfo(
    at org.apache.hcatalog.mapreduce.InitializeInput.getInputJobInfo(
    at org.apache.hcatalog.mapreduce.InitializeInput.setInput(
    at org.apache.hcatalog.mapreduce.HCatInputFormat.setInput(
    at org.apache.hcatalog.mapreduce.HCatInputFormat.setInput(
    at org.apache.hcatalog.pig.HCatLoader.setLocation(
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.getJob(
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.compile(
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.launchPig(
    at org.apache.pig.PigServer.launchPlan(
    at org.apache.pig.PigServer.executeCompiledLogicalPlan(
    at org.apache.pig.PigServer.storeEx(
    at org.apache.pig.PigServer.openIterator(
    Thanks a lot.

    Antonio G. Artime

Viewing 4 replies - 1 through 4 (of 4 total)

The forum ‘HDP 2.1 Technical Preview’ is closed to new topics and replies.

  • Author
  • #54654

    Hi again.

I’ve tried it and it’s working. It may cause some side effects down the road, but for now it’s OK for me.

    What I did:
    – Download hive-exec-
    – Put it into /usr/lib/hive/lib/
    – Link /usr/lib/hive/lib/hive-exec.jar to /usr/lib/hive/lib/hive-exec-
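The steps above can be sketched as shell commands. Note the 0.12.0 version number is an assumption for illustration (the poster's jar names are truncated), and the demo below uses a temp directory as a stand-in for /usr/lib/hive/lib so it is safe to run; substitute your actual jar and path.

```shell
# Sketch of the jar swap, demonstrated in a temp dir (stand-in for
# /usr/lib/hive/lib); the 0.12.0 version is an assumed example.
LIB=$(mktemp -d)

# 1. "Download" the 0.12 hive-exec jar (empty stand-in file here).
touch "$LIB/hive-exec-0.12.0.jar"

# 2./3. Repoint the hive-exec.jar symlink at the downloaded jar, so anything
# that loads hive-exec.jar picks up the older TableDesc constructor.
ln -sfn "$LIB/hive-exec-0.12.0.jar" "$LIB/hive-exec.jar"

# Confirm where the link points.
readlink "$LIB/hive-exec.jar"
```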

    Good luck!

    Antonio G. Artime


    Garrett Barton

I can give that a go. I wonder whether Hive 12 is compatible with HCat/Metastore running 13?


I think the problem occurs with the hive-exec- library: there is no constructor there that takes 4 arguments. In a previous version (0.12.*) the constructor exists, so I think it should work if we change hive-exec-*.jar to v0.12.*. I haven't tested it yet.



    Garrett Barton

I am seeing this as well: same stack, same example case.
It also breaks when trying to write to a Hive table from Pig.
DESCRIBE works fine, though.

    Any known solutions out there?
