Oozie Forum

Getting exception in rerun

  • #11773

    I am getting the following error while rerunning my job:

    2012-11-02 15:06:47,386 ERROR ReRunXCommand:536 – USER[?] GROUP[users] TOKEN[-] APP[-] JOB[-] ACTION[-] XException,
    org.apache.oozie.command.CommandException: E0712: Could not create lib paths list for application [hdfs://IMPETUS-N157ubuntu:9000/user/impadmin/examples/apps/logProcessing], null
    at org.apache.oozie.command.wf.ReRunXCommand.execute(ReRunXCommand.java:151)
    at org.apache.oozie.command.wf.ReRunXCommand.execute(ReRunXCommand.java:70)
    at org.apache.oozie.command.XCommand.call(XCommand.java:260)
    at org.apache.oozie.DagEngine.reRun(DagEngine.java:314)
    at org.apache.oozie.servlet.V1JobServlet.reRunWorkflowJob(V1JobServlet.java:551)
    at org.apache.oozie.servlet.V1JobServlet.reRunJob(V1JobServlet.java:192)
    at org.apache.oozie.servlet.BaseJobServlet.doPut(BaseJobServlet.java:115)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:640)
    at org.apache.oozie.servlet.JsonRestServlet.service(JsonRestServlet.java:284)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:298)
    at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:859)
    at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:588)
    at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
    at java.lang.Thread.run(Thread.java:662)
    Caused by: org.apache.oozie.workflow.WorkflowException: E0712: Could not create lib paths list for application [hdfs://IMPETUS-N157ubuntu:9000/user/impadmin/examples/apps/logProcessing], null
    at org.apache.oozie.service.WorkflowAppService.createProtoActionConf(WorkflowAppService.java:224)
    at org.apache.oozie.command.wf.ReRunXCommand.execute(ReRunXCommand.java:117)
    … 21 more
    Caused by: java.lang.NullPointerException
    at java.util.Hashtable.put(Hashtable.java:394)
    at java.util.Properties.setProperty(Properties.java:143)
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:404)
    at org.apache.oozie.service.WorkflowAppService.createProtoActionConf(WorkflowAppService.java:156)
    … 22 more

    Job.properties:
    nameNode=hdfs://IMPETUS-N157ubuntu:9000
    jobTracker=IMPETUS-N157ubuntu:9001
    oozie.wf.rerun.skip.nodes=copyToWorkingDir
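
    For reference, the deepest cause in the trace above is a NullPointerException thrown inside org.apache.hadoop.conf.Configuration.set, which in the Hadoop releases of that era happens when the value being set is null. The following is a minimal sketch (not from this thread; the property name is made up) that reproduces the same failure mode, suggesting one of the values Oozie copies into the proto action configuration during the rerun resolves to null:

    import org.apache.hadoop.conf.Configuration;

    public class NullConfDemo {
        public static void main(String[] args) {
            Configuration conf = new Configuration(false);
            // Simulate a property that was never defined and therefore resolves to null.
            String value = System.getProperty("some.undefined.property"); // null
            // In older Hadoop releases this throws java.lang.NullPointerException
            // at java.util.Hashtable.put (newer releases throw IllegalArgumentException).
            conf.set("oozie.example.key", value);
        }
    }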

  • #11815

    Job.properties:

    nameNode=hdfs://IMPETUS-N157ubuntu:9000
    jobTracker=IMPETUS-N157ubuntu:9001
    oozie.wf.rerun.skip.nodes=copToWorkingDir
    queueName=default
    examplesRoot=examples
    remotIp=192.168.213.34
    userId=abccc
    #password=ab
    password=abcccc
    PartitionShellScript=/home/impadmin/sws/oozie_doc/examples/apps/logProcessing/addPartition.sh
    CleanUpShellScript=/home/impadmin/sws/oozie_doc/examples/apps/logProcessing/cleanup.sh
    #oozie.libpath={nameNode}/user/${user.name}/${examplesRoot}/apps/ssh/ssh/lib
    oozie.wf.application.path=${nameNode}/user/${user.name}/${examplesRoot}/apps/logProcessing

    #11836
    Sasha J
    Moderator

    Vimlesh,
    it looks like you are using Ubuntu on your cluster (based on the hostnames).
    Note that HDP does not support Ubuntu at this time (only RedHat and CentOS are supported).

    Thank you!
    Sasha

    #11863

    Thank you for the reply.
    I am getting the same exception on CentOS as well. I am using the following command for the rerun. Please help me.
    "oozie job -oozie http://192.168.213.78:11000/oozie -config /usr/share/doc/oozie-3.1.3.15/examples/apps/test/job.properties -rerun 0000029-121031145015634-oozie-oozi-W"

    Error: E0712 : E0712: Could not create lib paths list for application
    [hdfs://impetus-n163.impetus.co.in:8020/user/oozie/oozie/test], null

    As per the error above, it could not create the lib paths list for the application. I went through the code of org.apache.oozie.service.WorkflowAppService and found that it expects a lib directory under the Oozie application in order to create the lib paths list. But in my application the lib directory exists and the jars are there too. Below is the code snippet where the exception is coming from.

    if (isWorkflowJob) {
        // app path could be a directory
        Path path = new Path(uri.getPath());
        if (!fs.isFile(path)) {
            filePaths = getLibFiles(fs, new Path(appPath + "/lib"));
        } else {
            filePaths = getLibFiles(fs, new Path(appPath.getParent(), "lib"));
        }
    } else {
        filePaths = new ArrayList();
    }
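
    In case it helps with debugging, here is a small standalone sketch (my own code, not Oozie's) that mirrors what getLibFiles() looks for, so the lib directory can be checked at the path Oozie resolves, running as the same user Oozie uses. The application path is copied from the error message above; adjust it for your cluster.

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class CheckLibDir {
        public static void main(String[] args) throws Exception {
            // Application path taken from the E0712 message above.
            String appPath = "hdfs://impetus-n163.impetus.co.in:8020/user/oozie/oozie/test";
            FileSystem fs = FileSystem.get(URI.create(appPath), new Configuration());
            Path lib = new Path(appPath, "lib");
            if (!fs.exists(lib)) {
                System.out.println("lib directory not found: " + lib);
                return;
            }
            // List the jars Oozie would pick up from <appPath>/lib.
            for (FileStatus st : fs.listStatus(lib)) {
                System.out.println(st.getPath());
            }
        }
    }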

    #21395
    tedr
    Moderator

    Hi Vimlesh,

    It looks like all you need is the ‘/’ at the start of the second parameter in the second getLibFiles call.

    Thanks,
    Ted.
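
    For anyone reading later, a small sketch (not from the thread) of how org.apache.hadoop.fs.Path resolves a child against a parent may help weigh that suggestion: a child that starts with '/' resolves against the filesystem root rather than under the parent path.

    import org.apache.hadoop.fs.Path;

    public class PathResolution {
        public static void main(String[] args) {
            // Parent path borrowed from the error message earlier in the thread.
            Path appPath = new Path("hdfs://impetus-n163.impetus.co.in:8020/user/oozie/oozie/test");
            // Relative child: resolves under the parent directory.
            System.out.println(new Path(appPath, "lib"));
            // -> hdfs://impetus-n163.impetus.co.in:8020/user/oozie/oozie/test/lib
            // Child with a leading '/': resolves against the filesystem root.
            System.out.println(new Path(appPath, "/lib"));
            // -> hdfs://impetus-n163.impetus.co.in:8020/lib
        }
    }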
