HBase Forum

HBase shell exception: Exception in thread "main" java.lang.NoClassDefFoundError:

  • #14825
    shi xinpo
    Participant

Exception in thread "main" java.lang.NoClassDefFoundError: is
    Caused by: java.lang.ClassNotFoundException: is
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
    Could not find the main class: is. Program will exit.
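
    This is the trace the Java 6 launcher prints when the word "is" ends up where it expects a main class name. A minimal reproduction sketch, assuming a Java 6 JRE (the stray "is" stands in for whatever token leaked into the HBase launch command, for example a mis-quoted option in hbase-env.sh):

    java is    # produces the same NoClassDefFoundError / ClassNotFoundException trace as above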

  • #14891
    Larry Liu
    Moderator

Hi Shi Xinpo,

    What version of HDP are you using? Can you please provide more detail about the issue you had?

    Thanks

    Larry

    #19883
    Bajeesh TB
    Member

    Hello,

I have the same issue when running:
    hadoop jar wordcount.jar g.myorg.WordCount /input /output
Exception in thread "main" java.lang.ClassNotFoundException: g.myorg.WordCount
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:247)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
    ————————————————————————————————
javac -classpath /usr/lib/hadoop/hadoop-core-1.1.2.21.jar -d wordcount_classes WordCount.java

The compile step works fine and doesn't show any errors. I am using HDP 1.2.1.

Could you please help?

    #19969
    Robert
    Participant

    Hi Bajeesh,
    I believe your question would be best answered in this forum:
    http://hortonworks.com/community/forums/forum/core-hadoop/

I understand the error looks similar because of the class-not-found exception, but your error comes from a MapReduce job rather than the HBase shell. A couple of things you should add so the rest of the community can get a better understanding:

Did you use a manual RPM installation or Ambari?
    Are you able to run the default wordcount program like this:
    NOTE: you will need to make sure your input folder has a file to process
    hadoop jar /usr/lib/hadoop/hadoop-examples.jar wordcount wordcount/input output

If the above works fine, then the issue might be specific to your wordcount program.

    Regards,
    Robert
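
    Putting the commands from this thread together, a hedged end-to-end sketch; it assumes the WordCount source begins with a "package org.myorg;" declaration, so adjust the class name to whatever package your file actually declares:

    mkdir -p wordcount_classes
    javac -classpath /usr/lib/hadoop/hadoop-core-1.1.2.21.jar -d wordcount_classes WordCount.java
    jar -cvf wordcount.jar -C wordcount_classes/ .    # bundle the compiled classes into a jar
    hadoop fs -mkdir /input                           # the input folder must exist and contain at least one file
    hadoop fs -put somefile.txt /input/               # somefile.txt is any local text file
    hadoop jar wordcount.jar org.myorg.WordCount /input /output

    The class name passed to hadoop jar has to be the fully qualified name, i.e. it must match the package declaration in the source file.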

    #20008
    Bajeesh TB
    Member

    Hi Robert,

It's working fine now; it was a package name issue. I only know the basics of Java, which is why the issue happened.

    Thanks for your help.

    Thanks,
    Bajeesh T.B
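
    For anyone hitting the same package-name mismatch, a quick sanity check (the names below are illustrative and assume the "org.myorg" package used in the sketch above):

    grep '^package' WordCount.java            # e.g. prints: package org.myorg;
    jar -tf wordcount.jar | grep WordCount    # the path inside the jar should mirror it: org/myorg/WordCount.class
    hadoop jar wordcount.jar org.myorg.WordCount /input /output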

    #20142
    tedr
    Member

    Hi Bajeesh,

    Thanks for letting us know that it is working for you now.

    Thanks again,
    Ted.

