HDP on Windows – Installation: Launch From Windows To Linux Cluster

This topic contains 2 replies, has 2 voices, and was last updated by  Ian Cadieu 4 months, 2 weeks ago.

  • Creator Topic #53780

    Ian Cadieu
    Participant

    I’m trying to get some examples running. I have HDP 2.2 installed and can submit jobs, but they all fail on container startup with bash exit code 127 (command not found). Inspecting the generated launch_container.sh script, it looks like {{JAVA_HOME}} isn’t getting replaced properly. I’ve tried everything I can think of. Any thoughts? What do I need to do to make it set this variable?

    Hadoop command (with a simplified map script, but even this reproduces the error):
    hadoop jar hadoop-streaming-2.3.0-cdh5.0.0.jar -input "/data/test.csv" -output "/output.txt" -mapper "cat launch_container.sh" -numReduceTasks 0 -verbose
    Snippet from the generated launch_container.sh:
    exec /bin/bash -c "{{JAVA_HOME}}/bin/java -Dlog4…
    Error from the log file:
    /bin/bash: {{JAVA_HOME}}/bin/java: No such file or directory
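
    For context, the checks I know to try on the Linux side look something like this (a rough sketch, assuming the standard /etc/hadoop/conf layout; adjust paths for your install):

    # Run on each Linux node that hosts a NodeManager
    grep -n "JAVA_HOME" /etc/hadoop/conf/hadoop-env.sh /etc/hadoop/conf/yarn-env.sh

    # If it is missing or commented out, export the JDK the cluster actually uses, e.g.:
    #   export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk

    # Restart the NodeManagers afterwards and re-submit the job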


  • Author Replies #53992

    Ian Cadieu
    Participant

    So I got this working and figured it out. The issue is that you need YARN 2.4 running on the server side for it to work properly (with pretty much any distribution, I think).

    I had been trying the Cloudera streaming jar and then the Hortonworks streaming jar, thinking it was a client-side problem, but it actually required a server-side upgrade to be compatible. Debugging container startup issues is a bit of a nightmare, though.
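
    If anyone else hits this, it is worth confirming what the cluster side is actually running before swapping client jars. A rough sketch, run from a cluster node or a configured gateway (the streaming jar path below is only illustrative):

    # Confirm the Hadoop / YARN build the cluster is running
    hadoop version
    yarn version

    # Once the cluster is on 2.4+, submit with the streaming jar that ships with
    # that release rather than a mismatched client jar, e.g.:
    hadoop jar /usr/lib/hadoop-mapreduce/hadoop-streaming.jar -input "/data/test.csv" -output "/output.txt" -mapper "cat launch_container.sh" -numReduceTasks 0 -verbose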

    #53946

    Dave
    Moderator

    Hi Ian,

    Do you mean HDP 2.1.2?
    Have you set JAVA_HOME correctly to C:\java in your system environment variables and added Java to the PATH?
    Also, hadoop-streaming-2.3.0-cdh5.0.0.jar is a Cloudera jar.
    Did you install HDP or CDH?
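
    If JAVA_HOME is not set yet, something along these lines from an elevated (Administrator) command prompt should do it. This is only a sketch: C:\java is the example path from above, so point it at wherever your JDK actually lives, and keep the path free of spaces:

    REM Set JAVA_HOME machine-wide and put the JDK's bin directory on the PATH
    setx JAVA_HOME "C:\java" /M
    setx PATH "%PATH%;C:\java\bin" /M

    REM Open a new command prompt afterwards and confirm
    echo %JAVA_HOME%
    java -version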

    Thanks

    Dave
