
Pig Forum

pig -useHCatalog

  • #45308

    Hi,

    I am getting the following error when I run “pig -useHCatalog”.

    Exception in thread “main” java.lang.NoClassDefFoundError: /usr/lib/hive/lib/libthrift-0/9/1/jar:/usr/lib/hive/lib/hive-exec-0/12/0/2/0/6/0-76/jar:/usr/lib/hive/lib/libfb303-0/9/0/jar:/usr/lib/hive/lib/jdo-api-3/0/1/jar:/usr/lib/hive/lib/slf4j-api-1/7/2/jar
    Caused by: java.lang.ClassNotFoundException: .usr.lib.hive.lib.libthrift-0.9.1.jar:.usr.lib.hive.lib.hive-exec-0.12.0.2.0.6.0-76.jar:.usr.lib.hive.lib.libfb303-0.9.0.jar:.usr.lib.hive.lib.jdo-api-3.0.1.jar:.usr.lib.hive.lib.slf4j-api-1.7.2.jar
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
    Could not find the main class: /usr/lib/hive/lib/libthrift-0.9.1.jar:/usr/lib/hive/lib/hive-exec-0.12.0.2.0.6.0-76.jar:/usr/lib/hive/lib/libfb303-0.9.0.jar:/usr/lib/hive/lib/jdo-api-3.0.1.jar:/usr/lib/hive/lib/slf4j-api-1.7.2.jar. Program will exit

    I have defined all the environment variables (including PIG_CLASSPATH) as below, and I have also defined pig.additional.jars in the pig.properties file.

    export HADOOP_GROUP=hadoop
    export HADOOP_HOME=/usr/lib/hadoop
    export HCAT_HOME=/usr/lib/hcatalog
    export HIVE_HOME=/usr/lib/hive
    export templeton_host=$TEMPLETON_HOST
    export user_name=hcat
    export PIG_CLASSPATH=$HCAT_HOME/share/hcatalog/hcatalog-*.jar:\
    $HIVE_HOME/lib/hive-metastore-*.jar:$HIVE_HOME/lib/libthrift-*.jar:\
    $HIVE_HOME/lib/hive-exec-*.jar:$HIVE_HOME/lib/libfb303-*.jar:\
    $HIVE_HOME/lib/jdo2-api-*-ec.jar:$HIVE_HOME/conf:$HADOOP_CONF_DIR:\
    $HIVE_HOME/lib/slf4j-api-*.jar:/usr/lib/hive/lib/libthrift-0.9.1.jar:/usr/lib/hive/lib/hive-exec-0.12.0.2.0.6.0-76.jar:/usr/lib/hive/lib/libfb303-0.9.0.jar:/usr/lib/hive/lib/jdo-api-3.0.1.jar:/usr/lib/hive/lib/slf4j-api-1.7.2.jar

    export PIG_OPTS=-Dhive.metastore.uris=thrift://$HIVE_HOST:$HIVE_PORT

    In pig.properties file,

    pig.additional.jars=/usr/lib/hive/lib/libthrift-0.9.1.jar:/usr/lib/hive/lib/hive-exec-0.12.0.2.0.6.0-76.jar:/usr/lib/hive/lib/libfb303-0.9.0.jar:/usr/lib/hive/lib/jdo-api-3.0.1.jar:/usr/lib/hive/lib/slf4j-api-1.7.2.jar

    Please let me know how this could be resolved. Thanks!
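
    The “Could not find the main class” line suggests the JVM is being handed the jar list where a class name was expected, so one quick sanity check is to confirm that each listed jar actually exists on disk. A rough sketch (the jar list below is copied from the pig.properties entry above; everything else is illustrative):

    # Sanity check: confirm every jar named in pig.additional.jars exists on disk.
    JARS=/usr/lib/hive/lib/libthrift-0.9.1.jar:/usr/lib/hive/lib/hive-exec-0.12.0.2.0.6.0-76.jar:/usr/lib/hive/lib/libfb303-0.9.0.jar:/usr/lib/hive/lib/jdo-api-3.0.1.jar:/usr/lib/hive/lib/slf4j-api-1.7.2.jar
    (IFS=':'; for jar in $JARS; do
        [ -f "$jar" ] && echo "OK      $jar" || echo "MISSING $jar"
    done)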

  • #45318
    abdelrahman
    Moderator

    Hi Arun,

    Most likely the issue is with some of the jars. Have you installed HCatalog? From the command line, please run:
    $ hcat

    Try reverting all of the environment variable changes and add only the following:
    In pig.properties add:
    hcat.bin=/usr/bin/hcat

    In pig-env.sh add (JAVA_HOME should already be set there):
    HADOOP_HOME=${HADOOP_HOME:-/usr}
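
    Taken together, this amounts to the following sketch (the /etc/pig/conf paths are an assumption based on a standard HDP layout; adjust them to your install):

    # Assumes Pig's config lives in /etc/pig/conf (standard HDP layout; adjust as needed).
    unset PIG_CLASSPATH PIG_OPTS                              # revert the custom exports
    echo 'hcat.bin=/usr/bin/hcat' >> /etc/pig/conf/pig.properties
    echo 'HADOOP_HOME=${HADOOP_HOME:-/usr}' >> /etc/pig/conf/pig-env.sh
    hcat                                                      # confirm HCatalog itself runs
    pig -useHCatalog                                          # then retry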

    Hope this helps.

    Thanks
    -Rahman

    #45394

    Hi Rahman,

    Thanks for your reply. I tried this, but it did not work.

    The HDP version I was trying was HDP 2.0.2. I had another installation, HDP 2.0.6, where the command pig -useHCatalog worked fine. I noticed that one of the jars, /usr/lib/hive/lib/libthrift-0.9.1.jar in HDP 2.0.2, was an older version in HDP 2.0.6 (/usr/lib/hive/lib/libthrift-0.9.0.jar). So I copied all the jars from the hive/lib directory in 2.0.6 to 2.0.2, and the command worked fine.

    So I believe this is a jar issue in 2.0.2. Can you take a look and confirm?

    Thanks,
    Arun
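
    For anyone hitting the same mismatch, the comparison and copy described above might look roughly like this (the host names and the use of ssh/scp are illustrative assumptions; the post does not say how the jars were transferred):

    # Compare the Hive client jars between the two installs (host names are hypothetical).
    ssh hdp202 'ls /usr/lib/hive/lib/libthrift-*.jar'         # -> libthrift-0.9.1.jar
    ssh hdp206 'ls /usr/lib/hive/lib/libthrift-*.jar'         # -> libthrift-0.9.0.jar
    # Copy the jars from the working 2.0.6 install over the 2.0.2 set, as described above:
    scp 'hdp206:/usr/lib/hive/lib/*.jar' hdp202:/usr/lib/hive/lib/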
