Pig Forum

Cannot run any Pig scripts from Hue

  • #56788
    Jens Rabe
    Participant

    I just installed a small sample cluster with the current HDP and Ambari 1.6. It works well so far, but I cannot run any Pig scripts, even something as simple as:

    a = 3 + 3;
    dump a;

    I get the following log:

    ls: cannot access /hadoop/yarn/local/usercache/jra/appcache/application_1404231648965_0017/container_1404231648965_0017_01_000002/hive.tar.gz/hive/lib/slf4j-api-*.jar: No such file or directory
    Error: Could not find or load main class hive.metastore.uris=thrift:..localhost:9933

    I think there is a misconfiguration somewhere. Can you guide me on what is wrong?

  • #57037
    Christian González
    Participant

    I have the same problem. I have installed HDP 2.1.2.1 and the following Hue packages:
    hue-common-2.3.1.2.1.2.0-402.el6.x86_64
    hue-plugins-2.3.1.2.1.2.0-402.el6.x86_64
    hue-pig-2.3.1.2.1.2.0-402.el6.x86_64
    hue-hcatalog-2.3.1.2.1.2.0-402.el6.x86_64
    hue-oozie-2.3.1.2.1.2.0-402.el6.x86_64
    hue-beeswax-2.3.1.2.1.2.0-402.el6.x86_64
    hue-server-2.3.1.2.1.2.0-402.el6.x86_64
    hue-shell-2.3.1.2.1.2.0-402.el6.x86_64
    hue-2.3.1.2.1.2.0-402.el6.x86_64
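
    For comparison, the version list above can be reproduced with something like this on the Hue host (assuming an RPM-based install, as the .el6 suffix suggests):

    rpm -qa | grep -i hue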

    #57555
    Owen Taylor
    Participant

    Hi Jens,

    I’m not sure if this will be satisfactory, but some issues related to Pig in HDP are discussed and resolved in this thread:

    http://hortonworks.com/community/forums/topic/sandbox-pig-basic-tutorial-example-is-nbot-working/page/2/#post-56174

    #57601
    Praveen Kumar
    Participant

    I also have the same error. As Owen Taylor suggested above, I edited /usr/lib/pig/bin/pig and added HCAT_HOME=/usr/lib/hive-hcatalog. After this I restarted all the services. But the same error still persists.
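
    For reference, the edit was a single line near the top of /usr/lib/pig/bin/pig (I’m not sure whether it also needs an export on other HDP versions):

    # point Pig at the HCatalog install so it can find the HCatalog jars
    HCAT_HOME=/usr/lib/hive-hcatalog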

    #57654
    Learner
    Participant

    Hi,
    I installed Hue, and when running a Pig script I get the same error:

    ls: cannot access /hadoop/yarn/local/usercache/hdfs/appcache/application_1405964500282_0014/container_1405964500282_0014_01_000002/hive.tar.gz/hive/lib/slf4j-api-*.jar: No such file or directory
    Error: Could not find or load main class hive.metastore.uris=thrift:..inf-misc-snv-05:9083

    Can anyone help me fix this problem? Thanks in advance.

    #57655
    Learner
    Participant

    I also tried editing /usr/lib/pig/bin/pig and adding HCAT_HOME=/usr/lib/hive-hcatalog, then restarted all the services, but the same error as above still persists.

    #57712
    Artur Markiewicz
    Participant

    Hi!
    For me, the following did the trick:

    1. Go to Ambari/WebHCat and change the property templeton.hive.properties to “hive.metastore.local=false,hive.metastore.uris=thrift://<YOUR_METASTORE_HOST>:9083,hive.metastore.sasl.enabled=false”. Be sure to remove all spaces from the property’s value! I suspect the spaces make a command-line invocation fail: everything after a space is interpreted as the next argument, in this case as a class name (see the sketch after these steps). Restart WebHCat and try to execute your Pig script.

    2. Add the slf4j-api jar to hdfs:///apps/webhcat/hive.tar.gz to get rid of the first error. I suspect this one isn’t critical for getting Pig scripts to work, though.
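
    To illustrate the space problem, here is a sketch of the broken vs. fixed value (the host is a placeholder):

    # broken – the space after the first comma splits the value, and the
    # stray piece is handed to the JVM, which takes it for a main class name:
    hive.metastore.local=false, hive.metastore.uris=thrift://<YOUR_METASTORE_HOST>:9083,hive.metastore.sasl.enabled=false

    # fixed – no spaces anywhere in the value:
    hive.metastore.local=false,hive.metastore.uris=thrift://<YOUR_METASTORE_HOST>:9083,hive.metastore.sasl.enabled=false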

    Cheers!

    #60800
    Hadoop Admin
    Participant

    The steps listed by Artur Markiewicz worked for me as well.

    #61828
    Kyle
    Participant

    Hi Artur,

    How do you go about adding the slf4j-api jar to hdfs:///apps/webhcat/hive.tar.gz?

    Thank you,

    #62086
    Artur Markiewicz
    Participant

    Hi Kyle,
    as far as I remember, it was something like this:

    1. hadoop fs -copyToLocal /apps/webhcat/hive.tar.gz .
    2. tar zxvf hive.tar.gz
    3. cp /usr/lib/hadoop/lib/slf4j-api-1.7.5.jar hive/lib
    4. Set the file’s owner/permissions on slf4j-api-1.7.5.jar like on the other jars in hive/lib
    5. Tar and gzip the hive directory again and upload the new hive.tar.gz to hdfs:///apps/webhcat/hive.tar.gz

    Don’t forget to back up first 😉
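
    Putting the steps together as shell commands, it was roughly this (the slf4j version, paths, and permission bits are from my cluster and may differ on yours):

    # back up the original archive in HDFS first
    hadoop fs -cp /apps/webhcat/hive.tar.gz /apps/webhcat/hive.tar.gz.bak

    # fetch, unpack, and drop in the missing jar
    hadoop fs -copyToLocal /apps/webhcat/hive.tar.gz .
    tar zxvf hive.tar.gz
    cp /usr/lib/hadoop/lib/slf4j-api-1.7.5.jar hive/lib/

    # match the owner/permissions of the other jars in hive/lib
    chmod 644 hive/lib/slf4j-api-1.7.5.jar

    # repackage and overwrite the copy in HDFS
    tar zcvf hive.tar.gz hive
    hadoop fs -put -f hive.tar.gz /apps/webhcat/hive.tar.gz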

    #63945
    Izan Izan
    Participant

    Hi Artur, can you please tell me how to get to those directories and how to do that exactly? I’m facing the same problem!

    Are those directories on your local PC? On your VM (in my case the Hortonworks Sandbox)? Or in the file browser on the Hortonworks web UI? :p

    Please bear with me, as I’m a beginner.

    Thanks in advance.

    #64487
    Sudhindra Vedanthi
    Participant

    I need help resolving this.
    I get the same error executing the first “Hello World” tutorial.

    ls: cannot access /hadoop/yarn/local/usercache/hue/appcache/application_1417767693477_0001/container_1417767693477_0001_01_000002/hive.tar.gz/hive/lib/slf4j-api-*.jar: No such file or directory
