Oozie Forum

JA018 Cannot run program

  • #52193
    Gwenael Le Barzic

Hello!

    I am contacting you because we currently have a problem on our HDP 2.0 cluster with Kerberos installed.
    We have the Oozie server installed on one machine and we have six data nodes.

    We tried to launch a workflow that needs to run a bash shell script, but we keep encountering the following error message:
    0000014-140423121102029-oozie-oozi-W@myAction http://<NAMENODE_FQDN>:<PORT>/proxy/application_1398179473939_0031/ JA018 Cannot run program "myShell.sh" (in directory "/var/opt/data/flat/data003/hadoop/yarn/local/usercache/my_user/appcache/application_1398179473939_0031/container_1398179473939_0031_01_000001"): error=13, Permission denied job_1398179473939_0031 FAILED/KILLED myAction 0 <NAMENODE_FQDN>:<PORT> shell 2014-04-23 12:53:17 GMT ERROR 2014-04-23 12:53:29 GMT

    Here is the command line to launch the workflow:
    oozie job -oozie=http://<HOSTNAME_OOZIE_SERVER>:<PORT>/oozie -debug -verbose -config /home/my_user/job.properties -nocleanup -run

    Here is the content of job.properties:
    #nameNode address
    #jobTracker address
    #YARN queue
    #address where the workflow is deployed
    #address of the Oozie server
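
    The values themselves did not survive the paste, so here is a minimal sketch using the standard Oozie property names; hosts and ports are placeholders, the workflow path matches /user/my_user/ where workflow.xml is deployed, and the Oozie server URL is passed with -oozie on the command line:

    nameNode=hdfs://<NAMENODE_FQDN>:<PORT>
    jobTracker=<RESOURCEMANAGER_FQDN>:<PORT>
    queueName=default
    oozie.wf.application.path=${nameNode}/user/my_user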

    Here is the workflow.xml:
    <workflow-app xmlns="uri:oozie:workflow:0.5" name="myWorkflow">
        <start to="myAction" />
        <action name="myAction">
            <shell xmlns="uri:oozie:shell-action:0.2">
                <!-- shell action body lost in the paste; typical elements reconstructed
                     from the error log and the HDFS paths given in the next post -->
                <job-tracker>${jobTracker}</job-tracker>
                <name-node>${nameNode}</name-node>
                <exec>myShell.sh</exec>
                <file>/user/myuser/oozie/workflow/shell/myshell.sh#myShell.sh</file>
                <capture-output />
            </shell>
            <ok to="end" />
            <error to="fail" />
        </action>
        <kill name="fail">
            <message>Workflow failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
        </kill>
        <end name="end" />
    </workflow-app>

    I will continue my description in my next post.

    Best regards.

    Gwenael Le Barzic


  • Author
  • #52194
    Gwenael Le Barzic

    Here is what I already tried:
    1. Checked on each data node the permissions on the folder “/var/opt/data/flat/data<XXX>/hadoop/yarn/local/usercache/my_user/appcache”: everything is OK (a loop to repeat this check is sketched after this list).
    2. Checked the content of my file myShell.sh for anything wrong inside: OK.
    3. The workflow.xml file is located in HDFS in /user/my_user/ with the following permissions:
    -rw-rw-r-- 3 my_user my_group 957 2014-04-23 15:22 /user/my_user/workflow.xml

    4. The shell script myshell.sh is located in HDFS in /user/myuser/oozie/workflow/shell/ with the following permissions:
    -rw-rw-r-- 3 my_user my_group 3936 2014-04-22 11:34 /user/myuser/oozie/workflow/shell/myshell.sh
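
    For reference, a quick way to repeat check 1 from an edge node; the hostnames dn1 through dn6 are placeholders for our six data nodes:

    # Hypothetical hostnames; replace with the real data node FQDNs.
    for h in dn1 dn2 dn3 dn4 dn5 dn6; do
        ssh "$h" 'ls -ld /var/opt/data/flat/data*/hadoop/yarn/local/usercache/my_user/appcache'
    done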

    Best regards.

    Gwenael Le Barzic

    Amudhan K

    Hello Gwenael Le Barzic,
    The problem is not with your workflow or job.properties. The JA018 error shows that permission is denied when launching the program. You need to add the proxy-user properties for the Oozie user to the Hadoop core-site.xml. The properties you need to add are:
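
    These are the standard Oozie proxy-user entries for core-site.xml; * is a permissive placeholder that you can narrow to the Oozie server host and the relevant user groups:

    <property>
        <name>hadoop.proxyuser.oozie.hosts</name>
        <value>*</value>
    </property>
    <property>
        <name>hadoop.proxyuser.oozie.groups</name>
        <value>*</value>
    </property>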


    After adding these properties to your cluster, execute the refresh command or restart your cluster.
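
    Assuming a standard Hadoop 2 installation, the refresh without a full restart would be:

    hdfs dfsadmin -refreshSuperUserGroupsConfiguration
    yarn rmadmin -refreshSuperUserGroupsConfiguration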

    Amudhan K

    Also check whether the user you are running the Oozie job as has all the required access on HDFS.
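
    For example, from a shell on the cluster as the submitting user (kinit first, since the cluster is kerberized; user and paths taken from the earlier posts):

    kinit my_user
    hdfs dfs -ls /user/my_user
    hdfs dfs -ls /user/myuser/oozie/workflow/shell
    hdfs dfs -cat /user/myuser/oozie/workflow/shell/myshell.sh > /dev/null && echo "read OK"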

    hiral matalia

    Hi Amudhan,

    I am pretty new to the Hortonworks stack and its configuration.

    In the above post you mentioned checking whether the user running the Oozie job has access to HDFS.

    What is the best way to check?




