Oozie Forum

Oozie Streaming Job Fails when using "Add File" section

  • #45776

    I have installed HDP 2.x and Hue (2 hosts).

    I need to solve this issue, as I’m new to Oozie workflows.
    I followed the example here: http://www.oraclealchemist.com/news/hadoop-streaming-hue-oozie-workflows-hive/
    It works perfectly only when the mapper and reducer scripts are present on every host at the same local path. The ambari-qa smoke test for Oozie passes, but then I submitted a streaming job from the Hue UI for Oozie. The mapper and reducer are shell scripts that perform a word count (term frequency), and I wanted to use the “Add File” section of the Oozie workflow editor (the equivalent of -file on the hadoop command line), so I uploaded the mapper and reducer to HDFS. After this change, submitting the job produces an error. I get the following error for all attempts of the map tasks:

    2013-12-16 19:21:24,278 ERROR [main] org.apache.hadoop.streaming.PipeMapRed: configuration exception
    java.io.IOException: Cannot run program "/hadoop/yarn/local/usercache/root/appcache/application_1387201627160_0006/container_1387201627160_0006_01_000002/./maptf.sh": java.io.IOException: error=2, No such file or directory

    This suggests it cannot find the mapper and reducer at the path where oozie/mapred/yarn creates the files on the fly. Do I have an Oozie configuration or workflow issue? (logs by email: sandeepboda91083@gmail.com)

    In HDFS, I have all paths and files set up correctly under the root user.

    Note: I can run streaming jobs without Oozie, like this:
    cd /root/mrtest/
    -rwxrwxrwx 1 root root 235 Dec 11 11:37 maptf.sh
    -rwxrwxrwx 1 root root 273 Dec 11 11:37 redtf.sh

    hadoop jar /usr/lib/hadoop-mapreduce/hadoop-streaming- -D stream.num.map.output.key.fields=1 -input crane_in1 -output crane_out2 -file ./maptf.sh -mapper maptf.sh -file ./redtf.sh -reducer redtf.sh

    It seems I can’t attach logs here. Please email me for the files and logs.
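For context, a minimal sketch of the kind of workflow.xml that the Hue editor generates for a streaming action with “Add File” entries. The script names come from the post above; the app name, `${appPath}` parameter, and input/output property values are illustrative assumptions, not the poster’s actual workflow:

```xml
<workflow-app xmlns="uri:oozie:workflow:0.4" name="streaming-wordcount-wf">
  <start to="streaming-node"/>
  <action name="streaming-node">
    <map-reduce>
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <streaming>
        <!-- The mapper/reducer are referenced by their symlink names
             in the container's working directory. -->
        <mapper>maptf.sh</mapper>
        <reducer>redtf.sh</reducer>
      </streaming>
      <configuration>
        <property>
          <name>mapred.input.dir</name>
          <value>${inputDir}</value>
        </property>
        <property>
          <name>mapred.output.dir</name>
          <value>${outputDir}</value>
        </property>
      </configuration>
      <!-- These <file> elements are what the "Add File" section produces:
           the HDFS file is shipped and symlinked into the working dir. -->
      <file>${appPath}/maptf.sh#maptf.sh</file>
      <file>${appPath}/redtf.sh#redtf.sh</file>
    </map-reduce>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail">
    <message>Streaming job failed</message>
  </kill>
  <end name="end"/>
</workflow-app>
```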


  • Author
  • #45890

    I solved this problem by referring to:

    I had edited the shell scripts in Windows Notepad/WordPad and uploaded them via the “Add File” upload method. That means every line ending had \r\n appended, and this is what causes the error. I had to do a dos2unix conversion, and that made it work. But why is the error message so misleading? “No such file or directory”
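The message is misleading because the exec fails on the shebang line: the kernel treats the carriage return as part of the interpreter path and tries to run `/bin/bash\r`, so the “No such file or directory” refers to that non-existent interpreter, not to the script. A quick way to detect and fix the endings, as a sketch assuming a Linux node with GNU sed (dos2unix may not be installed everywhere):

```shell
# Create a sample script the way Notepad would save it (CRLF line endings):
printf '#!/bin/bash\r\necho hello\r\n' > demo.sh

# 'file' reports "with CRLF line terminators" for affected scripts:
file demo.sh

# Strip the trailing carriage returns in place (same effect as dos2unix):
sed -i 's/\r$//' demo.sh

# Verify the shebang line is clean now:
head -n 1 demo.sh
```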
