HDFS Forum

Hadoop Core Issue

  • #25682

    Hello guys

I'm trying to run a simple Hadoop program in which I have one search function, one Mapper class,
and one Reducer class. But when I run the program, the class files for both classes (Map.class and Reduce.class) cannot be found at runtime. The job is finally executed inside job.waitForCompletion(true), and internally that method throws a ClassNotFoundException (test.hadoop.map). I have no idea how to solve this issue. I have also added all the jars to the classpath, but I am still getting the same error. I hope one of you can guide me.

    Kuldeep dayma


  • Author
  • #25689

tedr

Hi Kuldeep,

    Can you post the stacktrace you are getting here? That will help us get to the bottom of your issue.



Kuldeep dayma

Hello tedr, this is the main method of my Hadoop class:

Configuration conf = new Configuration();
conf.set("mapred.job.tracker", "localhost:9001");
Job job = new Job(conf, "wordcount");
FileInputFormat.addInputPath(job, new Path("/home/administrator/Desktop/test wordcount"));
FileOutputFormat.setOutputPath(job, new Path("/home/administrator/Desktop/test wordcount/output123h"));
job.waitForCompletion(true);

That last method throws the exception internally. I have set the Map and Reduce classes, but at runtime, when this main method is called, the following exception occurs:

    13/05/17 10:08:41 INFO mapred.JobClient: Task Id : attempt_201305171006_0002_m_000000_2, Status : FAILED
    java.lang.RuntimeException: java.lang.ClassNotFoundException: test.hadoop.WordCount$Map
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:859)
    at org.apache.hadoop.mapreduce.JobContext.getMapperClass(JobContext.java:199)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:719)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:416)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1136)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)
    Caused by: java.lang.ClassNotFoundException: test.hadoop.WordCount$Map
    at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:264)
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:812)
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:857)
    … 8 more
This is the exception I am getting at runtime. Can you give me a solution for this?
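For reference, a ClassNotFoundException on a nested class such as test.hadoop.WordCount$Map at task launch usually means the jar containing the job classes was never registered with the job, so the task JVMs cannot load the mapper; the snippet above never calls job.setJarByClass. Below is a minimal self-contained sketch of the usual driver shape, assuming the Map and Reduce classes are nested in test.hadoop.WordCount (as the stack trace suggests) and that the classic word-count bodies apply:

package test.hadoop;

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    // Nested mapper, matching the class name in the stack trace.
    public static class Map extends Mapper<LongWritable, Text, Text, IntWritable> {
        private final static IntWritable one = new IntWritable(1);
        private final Text word = new Text();

        @Override
        public void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, one);
            }
        }
    }

    // Nested reducer, summing the counts for each word.
    public static class Reduce extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("mapred.job.tracker", "localhost:9001");
        Job job = new Job(conf, "wordcount");

        // Without this call, the jar containing WordCount (and its nested
        // Map/Reduce classes) is never shipped to the task nodes, and the
        // tasks die with exactly this ClassNotFoundException.
        job.setJarByClass(WordCount.class);

        job.setMapperClass(Map.class);
        job.setReducerClass(Reduce.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        FileInputFormat.addInputPath(job, new Path("/home/administrator/Desktop/test wordcount"));
        FileOutputFormat.setOutputPath(job, new Path("/home/administrator/Desktop/test wordcount/output123h"));

        job.waitForCompletion(true);
    }
}

Packaging the program as a jar and submitting it with hadoop jar (for example, hadoop jar wordcount.jar test.hadoop.WordCount) matters as well; launching the driver straight from an IDE without a jar on a real cluster fails with this same exception.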

    Seth Lyubich

    Hi Kuldeep,

I think this issue could be related to http://hortonworks.com/community/forums/topic/jobtracker-security-exception/. I think you should try to resolve the Safe mode issue first. You can also run the MapReduce smoke test and see whether it succeeds or fails with a similar error.
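For example (the commands below assume a Hadoop 1.x install; the examples jar location varies by distribution), you can check Safe mode and submit one of the bundled example jobs as a quick smoke test:

# Check whether the NameNode is in Safe mode; leave it if it is stuck there.
hadoop dfsadmin -safemode get
hadoop dfsadmin -safemode leave

# Submit a bundled example job as a MapReduce smoke test.
hadoop jar /usr/lib/hadoop/hadoop-examples.jar pi 10 100

If the example job runs cleanly, the cluster itself is healthy, and the problem is more likely in how your own job is packaged and submitted.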

    Hope this helps,


The topic ‘Hadoop Core Issue’ is closed to new replies.
