Hortonworks Sandbox: Running a simple Java job

This topic contains 1 reply, has 2 voices, and was last updated by  Xiandong Su 4 months, 1 week ago.

  • Topic #28538

    How do I run a simple Hadoop Java job in the Sandbox?

    I’ve tried using the Java type Job Design, but I get an error saying ClassNotFound: HelloWorld. The myjob.jar file has the HelloWorld class in it.

    My settings are:
    Jar path: /user/sample.myjob.jar
    Main class: HelloWorld
    And my class is a trivial one that just sets the input file and output directory and uses the default mapper/reducer:

    import java.io.IOException;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class HelloWorld {

        public static void main(String[] args) throws IOException, ClassNotFoundException, InterruptedException {
            // Trivial job: default mapper/reducer, just wire up the input file and output directory and run.
            Job job = new Job();
            FileInputFormat.addInputPath(job, new Path("/user/sample/trivialdata.txt"));
            FileOutputFormat.setOutputPath(job, new Path("/user/sample/otest1"));
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(Text.class);
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

    Thanks,
    Dan



  • Reply #44978

    Xiandong Su
    Member

    I do not know if you have resolved this issue. If not, try putting the class in a package. I copied your code without any changes except adding package information, ran it on the Hortonworks platform, and it went through without any problems. When specifying the main class, you need to include the package in it, for example: org.something.HelloWorld.
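
    For reference, here is a minimal sketch of what that looks like, using the hypothetical package name org.something from the example above; the body is the original code, with only the package declaration added:

    package org.something;

    import java.io.IOException;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    // Same trivial job as above, only moved into a package.
    public class HelloWorld {

        public static void main(String[] args) throws IOException, ClassNotFoundException, InterruptedException {
            Job job = new Job();
            FileInputFormat.addInputPath(job, new Path("/user/sample/trivialdata.txt"));
            FileOutputFormat.setOutputPath(job, new Path("/user/sample/otest1"));
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(Text.class);
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

    After rebuilding the jar, the Main class field in the job design would then be org.something.HelloWorld rather than just HelloWorld.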
