
Oozie Forum

Oozie Workflow Shell Action Permission Denied on yarn.nodemanager.local-dirs

  • #55784
    Baris ERGUN
    Participant

    I am trying to run an Oozie Workflow Shell Action from Hue. I had some HDFS permission problems, which I could fix easily by adding properties to the Oozie XML, but I have been stuck for six hours trying to get around a local-filesystem permission issue. I am using HDP 2.0.6 installed with Ambari, plus Hue installed as described in the documentation. When I submit a workflow containing only a Shell Action as the hue user (a member of the hadoop group), I get the error log below (a minimal sketch of the workflow follows the log):


    ACTION[0000010-140526124040148-oozie-oozi-W@ChargingVariables] Launcher exception: Cannot run program "charging_related_calculations" (in directory "/space/hadoop/yarn/local/usercache/hue/appcache/application_1402668961478_0005/container_1402668961478_0005_01_000002"): error=13, Permission denied
    java.io.IOException: Cannot run program "charging_related_calculations" (in directory "/space/hadoop/yarn/local/usercache/hue/appcache/application_1402668961478_0005/container_1402668961478_0005_01_000002"): error=13, Permission denied
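
    For reference, the workflow is roughly of the following shape; the structure and schema versions here are an illustrative sketch, not my exact file:

    <workflow-app name="shell-wf" xmlns="uri:oozie:workflow:0.4">
        <start to="ChargingVariables"/>
        <action name="ChargingVariables">
            <shell xmlns="uri:oozie:shell-action:0.1">
                <job-tracker>${jobTracker}</job-tracker>
                <name-node>${nameNode}</name-node>
                <!-- the command the NodeManager tries to execute inside its container dir -->
                <exec>charging_related_calculations</exec>
            </shell>
            <ok to="end"/>
            <error to="fail"/>
        </action>
        <kill name="fail">
            <message>Shell action failed</message>
        </kill>
        <end name="end"/>
    </workflow-app>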

    Every time I submit the job, the permissions on the directory /space/hadoop/yarn/local/usercache/hue/appcache are automatically set to 710, with yarn:hadoop as the owner, so the hadoop group has only execute permission on the appcache directory that gets created. I am sure about this because I watched all three nodes and saw the shell script being copied under the above directory on a random node.
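
    For context, that path sits under the NodeManager local dirs, which on this cluster would be configured in yarn-site.xml roughly as follows (only this one property shown):

    <property>
        <!-- root under which YARN creates the per-user usercache/appcache directories -->
        <name>yarn.nodemanager.local-dirs</name>
        <value>/space/hadoop/yarn/local</value>
    </property>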

    I am running the Oozie workflow as the hue user, and hue is a member of the hadoop Linux group. I observed all the local folders and files being copied to the temporary appcache; I even made a copy of the folders with cp -R the instant the temp folders were created, and I can share them. When the job executed I also got the odd error below. Any ideas?

    2014-06-13 22:13:23,106 INFO [main] org.apache.hadoop.ipc.Client: Retrying connect to server: tex655.tnhdpdemo/10.35.36.55:42020. Already tried 49 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1 SECONDS)
    2014-06-13 22:13:23,110 WARN [main] org.apache.hadoop.mapred.YarnChild: Exception running child : java.net.ConnectException: Call From tex655.tnhdpdemo/10.35.36.55 to tex655.tnhdpdemo:42020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:783)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:730)
    at org.apache.hadoop.ipc.Client.call(Client.java:1351)
    at org.apache.hadoop.ipc.Client.call(Client.java:1300)
    at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:231)

  • #55785
    Baris ERGUN
    Participant

    In more detail:


    2014-06-13 22:12:32,191 WARN [main] org.apache.hadoop.conf.Configuration: job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval; Ignoring.
    2014-06-13 22:12:32,240 WARN [main] org.apache.hadoop.conf.Configuration: job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts; Ignoring.
    2014-06-13 22:12:32,567 INFO [main] org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
    2014-06-13 22:12:32,674 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSinkAdapter: Sink ganglia started
    2014-06-13 22:12:32,841 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
    2014-06-13 22:12:32,841 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MapTask metrics system started
    2014-06-13 22:12:32,865 INFO [main] org.apache.hadoop.mapred.YarnChild: Executing with tokens:
    2014-06-13 22:12:32,865 INFO [main] org.apache.hadoop.mapred.YarnChild: Kind: mapreduce.job, Service: job_1402668961478_0015, Ident: (org.apache.hadoop.mapreduce.security.token.JobTokenIdentifier@1aaa8aa5)
    2014-06-13 22:12:32,906 INFO [main] org.apache.hadoop.mapred.YarnChild: Kind: RM_DELEGATION_TOKEN, Service: 10.35.36.55:8050, Ident: (owner=hue, renewer=oozie mr token, realUser=oozie, issueDate=1402685678055, maxDate=1403290478055, sequenceNumber=58, masterKeyId=2)

    ....

    Retries=50, sleepTime=1 SECONDS)
    2014-06-13 22:13:23,110 WARN [main] org.apache.hadoop.mapred.YarnChild: Exception running child : java.net.ConnectException: Call From tex655.tnhdpdemo/10.35.36.55 to tex655.tnhdpdemo:42020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:783)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:730)
    at org.apache.hadoop.ipc.Client.call(Client.java:1351)
    at org.apache.hadoop.ipc.Client.call(Client.java:1300)
    at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:231)
    at com.sun.proxy.$Proxy6.getTask(Unknown Source)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:133)
    Caused by: java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:735)

    #55844
    Baris ERGUN
    Participant

    I don't know why or how the launcher script was using port 42020 for the YarnChild IPC call, but I changed it manually to 10020, which is mapreduce.jobhistory.address (or should I set it to a different port?). I now get the failure message below; the property itself is sketched after the log.


    2014-06-15 00:13:57,262 INFO [main] org.apache.hadoop.mapred.YarnChild: Executing with tokens:
    2014-06-15 00:13:57,263 INFO [main] org.apache.hadoop.mapred.YarnChild: Kind: mapreduce.job, Service: job_1402668961478_0015, Ident: (org.apache.hadoop.mapreduce.security.token.JobTokenIdentifier@1aaa8aa5)
    2014-06-15 00:13:57,355 INFO [main] org.apache.hadoop.mapred.YarnChild: Kind: RM_DELEGATION_TOKEN, Service: 10.35.36.55:8050, Ident: (owner=hue, renewer=oozie mr token, realUser=oozie, issueDate=1402685678055, maxDate=1403290478055, sequenceNumber=58, masterKeyId=2)
    2014-06-15 00:13:57,475 INFO [main] org.apache.hadoop.mapred.YarnChild: Sleeping for 0ms before retrying again. Got null now.
    2014-06-15 00:13:57,879 ERROR [main] org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException as:job_1402668961478_0015 (auth:SIMPLE) cause:org.apache.hadoop.ipc.RemoteException(java.io.IOException): Unknown version of delegation token 22
    2014-06-15 00:13:57,880 WARN [main] org.apache.hadoop.ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.ipc.RemoteException(java.io.IOException): Unknown version of delegation token 22
    2014-06-15 00:13:57,880 ERROR [main] org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException as:job_1402668961478_0015 (auth:SIMPLE) cause:org.apache.hadoop.ipc.RemoteException(java.io.IOException): Unknown version of delegation token 22
    2014-06-15 00:13:57,882 WARN [main] org.apache.hadoop.mapred.YarnChild: Exception running child : org.apache.hadoop.ipc.RemoteException(java.io.IOException): Unknown version of delegation token 22
    at org.apache.hadoop.ipc.Client.call(Client.java:1347)
    at org.apache.hadoop.ipc.Client.call(Client.java:1300)
    at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:231)
    at com.sun.proxy.$Proxy6.getTask(Unknown Source)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:133)

    2014-06-15 00:13:57,882 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Stopping MapTask metrics system...
    2014-06-15 00:13:57,883 INFO [ganglia] org.apache.hadoop.metrics2.impl.MetricsSinkAdapter: ganglia thread interrupted.
    2014-06-15 00:13:57,883 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MapTask metrics system stopped.
    2014-06-15 00:13:57,883 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MapTask metrics system shutdown complete.
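
    For reference, the property I pointed the port at lives in mapred-site.xml and, on this cluster, would look roughly like this (host name taken from the logs above, value otherwise an assumption):

    <property>
        <!-- IPC address of the MapReduce JobHistory server -->
        <name>mapreduce.jobhistory.address</name>
        <value>tex655.tnhdpdemo:10020</value>
    </property>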

    #55912
    Baris ERGUN
    Participant

    Okay, I found out what was wrong: it is partly a lack of documentation and partly a misleading user interface. In Hue, you should add your script under Files, and in the Shell Command field enter only the name of your script, e.g. bloody_run.sh. A rough sketch of the resulting action is below.
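
    A minimal sketch of the working shell action, with illustrative paths; the key point is that <exec> is just the script name, while the <file> element ships the script into the container:

    <action name="ChargingVariables">
        <shell xmlns="uri:oozie:shell-action:0.1">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <!-- only the bare script name goes in Hue's Shell Command field -->
            <exec>bloody_run.sh</exec>
            <!-- adding the script under Files produces a <file> entry like this
                 (HDFS path here is illustrative) -->
            <file>/user/hue/oozie/workspaces/bloody_run.sh#bloody_run.sh</file>
        </shell>
        <ok to="end"/>
        <error to="fail"/>
    </action>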
