HDP 2.1 Technical Preview: Problems using HCatalog with Pig (java.lang.NoSuchMethodError: org.apache.hadoop.hive.ql.plan.TableDesc.<init>)

This topic contains 4 replies, has 2 voices, and was last updated by Antonio González Artime 6 months ago.

  • Creator
    Topic
  • #52466

    Hi:

    I’m having some problems accessing HCatalog from Pig. I’m using:
    Hue 2.3.1-385
    HDP 2.1.1
    Hadoop 2.4.0
    Pig 0.12.1
    Hive-Hcatalog 0.13.0

    When I try to read a table registered in HCatalog, I get the following exception (it happens both in grunt and in Hue):

    I think the problem may be in the hive-exec jar that is being used (hive-exec-0.13.0.2.1.1.0-385.jar), because it looks like a specific constructor cannot be found.

    PIG SCRIPT (-useHCatalog):

    records = LOAD 'tfrinspa' USING org.apache.hcatalog.pig.HCatLoader();
    limited = LIMIT records 2;
    DUMP limited;

    OUTPUT:
    2014-04-29 11:02:38,448 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 2998: Unhandled internal error. org.apache.hadoop.hive.ql.plan.TableDesc.<init>(Ljava/lang/Class;Ljava/lang/Class;Ljava/lang/Class;Ljava/util/Properties;)V
    Details at logfile: /hadoop/yarn/local/usercache/antonio/appcache/application_1398432203922_0060/container_1398432203922_0060_01_000002/pig_1398762151126.log

    PIG STACKTRACE:
    ERROR 2998: Unhandled internal error. org.apache.hadoop.hive.ql.plan.TableDesc.<init>(Ljava/lang/Class;Ljava/lang/Class;Ljava/lang/Class;Ljava/util/Properties;)V

    java.lang.NoSuchMethodError: org.apache.hadoop.hive.ql.plan.TableDesc.<init>(Ljava/lang/Class;Ljava/lang/Class;Ljava/lang/Class;Ljava/util/Properties;)V
    at org.apache.hcatalog.common.HCatUtil.getInputJobProperties(HCatUtil.java:452)
    at org.apache.hcatalog.mapreduce.InitializeInput.extractPartInfo(InitializeInput.java:161)
    at org.apache.hcatalog.mapreduce.InitializeInput.getInputJobInfo(InitializeInput.java:137)
    at org.apache.hcatalog.mapreduce.InitializeInput.setInput(InitializeInput.java:86)
    at org.apache.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:87)
    at org.apache.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:65)
    at org.apache.hcatalog.pig.HCatLoader.setLocation(HCatLoader.java:120)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.getJob(JobControlCompiler.java:477)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.compile(JobControlCompiler.java:298)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.launchPig(MapReduceLauncher.java:191)
    at org.apache.pig.PigServer.launchPlan(PigServer.java:1324)
    at org.apache.pig.PigServer.executeCompiledLogicalPlan(PigServer.java:1309)
    at org.apache.pig.PigServer.storeEx(PigServer.java:980)
    at org.apache.pig.PigServer.store(PigServer.java:944)
    at org.apache.pig.PigServer.openIterator(PigServer.java:857)
    at org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:774)
    at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:372)
    at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:198)
    [...]
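
    In case it helps with debugging, here is a minimal reflection probe (my own sketch; the class name is a throwaway and it only assumes the hive-exec jar in question is on the classpath) that checks whether the 4-argument TableDesc constructor named in the error above is really there:

    import java.util.Properties;

    // Probes for TableDesc.<init>(Class, Class, Class, Properties), the
    // constructor that HCatUtil.getInputJobProperties calls in the stack trace above.
    public class TableDescProbe {
        public static void main(String[] args) throws Exception {
            Class<?> tableDesc = Class.forName("org.apache.hadoop.hive.ql.plan.TableDesc");
            try {
                tableDesc.getConstructor(Class.class, Class.class, Class.class, Properties.class);
                System.out.println("4-arg constructor found: this hive-exec jar matches what HCatLoader expects");
            } catch (NoSuchMethodException e) {
                System.out.println("4-arg constructor missing: this jar will give the NoSuchMethodError above");
            }
        }
    }

    Compiling it needs only the JDK; to run it, put the jar on the classpath, e.g. java -cp .:/usr/lib/hive/lib/hive-exec-0.13.0.2.1.1.0-385.jar TableDescProbe (that path is just where the jar lives on my nodes).
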
    Thanks a lot.

    Antonio G. Artime
    @agartime

Viewing 4 replies - 1 through 4 (of 4 total)

The forum ‘HDP 2.1 Technical Preview’ is closed to new topics and replies.

  • Author
    Replies
  • #54654

    Hi again.

    I’ve tried it and it’s working. It may cause some side effects down the road, but for now it’s OK for me.

    What I did:
    – Downloaded hive-exec-0.12.0.2.0.6.1-102.jar
    – Put it into /usr/lib/hive/lib/
    – Pointed the /usr/lib/hive/lib/hive-exec.jar link at /usr/lib/hive/lib/hive-exec-0.12.0.2.0.6.1-102.jar
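
    To double-check that the relinked jar is the one Pig/HCatalog actually picks up, here is a quick sketch (again just a throwaway class; run it with the same classpath your jobs end up using) that prints where TableDesc is loaded from:

    // Prints the jar that org.apache.hadoop.hive.ql.plan.TableDesc comes from,
    // so you can confirm the hive-exec.jar symlink swap took effect.
    public class TableDescSource {
        public static void main(String[] args) throws Exception {
            Class<?> tableDesc = Class.forName("org.apache.hadoop.hive.ql.plan.TableDesc");
            System.out.println(tableDesc.getProtectionDomain().getCodeSource().getLocation());
        }
    }

    For example: java -cp .:/usr/lib/hive/lib/hive-exec.jar TableDescSource (that is the link from the step above).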

    Good luck!

    Regards,
    Antonio G. Artime

    #54627

    Garrett Barton
    Participant

    I can give that a go. I wonder, though, if the Hive 0.12 jar is compatible with HCat/Metastore running 0.13?

    #54565

    I think the problem is in the library hive-exec-0.13.0.2.1.1.0-385.jar: there is no constructor there that receives 4 arguments. In a previous version (0.12.*) the constructor exists, so I think it should work if we change hive-exec-*.jar to v0.12.*. I haven’t tested it yet.
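
    If you want to compare the two jars directly, here is a minimal sketch (just a throwaway class; run it once with the 0.13 hive-exec jar on the classpath and once with the 0.12 one) that lists the public constructors TableDesc exposes:

    import java.lang.reflect.Constructor;

    // Lists every public constructor of TableDesc so the 0.12 and 0.13
    // hive-exec jars can be compared side by side.
    public class TableDescCtors {
        public static void main(String[] args) throws Exception {
            Class<?> tableDesc = Class.forName("org.apache.hadoop.hive.ql.plan.TableDesc");
            for (Constructor<?> ctor : tableDesc.getConstructors()) {
                System.out.println(ctor);
            }
        }
    }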

    Regards,

    #54563

    Garrett Barton
    Participant

    I am seeing this as well: same stack, same example case.
    It also breaks when trying to write to a Hive table from Pig.
    Describe works fine, though.

    Any known solutions out there?
