HDP on Linux – Installation Forum

Oozie test failed

  • #11744
    elena diez
    Member

    I am trying to install HMC on a single-node cluster with all services. During the deploy, the Oozie test fails.
    Executing: grep fail /var/log/puppet_apply.log
    I get the following: Thu Nov 01 11:31:54 +0000 2012 /Stage[2]/Hdp-oozie::Oozie::Service_check/Hdp-oozie::Smoke_shell_file[oozieSmoke.sh]/Exec[/tmp/oozieSmoke.sh]/returns (err): change from notrun to 0 failed: sh /tmp/oozieSmoke.sh /etc/oozie/conf /etc/hadoop/conf ambari_qa returned 1 instead of one of [0] at /etc/puppet/agent/modules/hdp-oozie/manifests/oozie/service_check.pp:52

    The cluster installs and deploys when I select only the basic services.

    Can anybody help me?
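
    The failing step can usually be reproduced by hand to get more detail than the Puppet exit status. A sketch, using the script path and arguments taken from the log line above (the Oozie log location is an assumption for a default HDP layout):

    ```shell
    # Re-run the smoke test manually with the same arguments Puppet used,
    # so the full output is visible instead of just the return code:
    sh /tmp/oozieSmoke.sh /etc/oozie/conf /etc/hadoop/conf ambari_qa
    echo "exit code: $?"

    # The Oozie server log often shows the underlying error
    # (assumed default location; adjust if your install differs):
    tail -n 50 /var/log/oozie/oozie.log
    ```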


  • #11745
    elena diez
    Member

    I’ve also tried installing all services except Oozie, and that works.

    #11747
    tedr
    Member

    Elena,

    It is hard to tell exactly what is going on from just that small snippet. Can you follow the instructions here?
    http://hortonworks.com/community/forums/topic/hmc-installation-support-help-us-help-you/

    Thanks,
    Ted.

    #11760
    elena diez
    Member

    Hi,
    I am trying to connect to the ftp site to upload the output file, but it says:
    Error: Failed to retrieve directory listing.

    thanks.

    #11761
    tedr
    Member

    Elena,

    This is strange; when I try, I get in with no problem. Maybe the server was having an issue when you were trying. Do you still get the same error? Are you using command-line ftp or another FTP client?

    Ted

    #11762
    elena diez
    Member

    Hi, it is still not working for me.
    I am trying with FileZilla.

    #11766
    Sasha J
    Moderator

    Elena,
    that ftp site is a dropbox: it allows writes only, so you cannot get a file listing.
    This is why FileZilla fails, since it always tries to read the directory contents first.
    Just use a command-line ftp client: log in to the server and execute the command “put file”.
    It will be uploaded to the right place.
    Then let us know the file name.

    Thank you!
    Sasha
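
    As a concrete session, Sasha’s suggestion might look like the following sketch. The hostname and file name here are placeholders, not the actual dropbox address or Elena’s log file:

    ```shell
    # Plain command-line ftp issues no directory listing, so the
    # write-only dropbox accepts the upload where FileZilla cannot.
    ftp ftp.example.com            # placeholder hostname
    # ...log in when prompted, then at the ftp> prompt:
    #   binary                     # binary mode, avoids corrupting archives
    #   put hmc-logs.tar.gz        # placeholder file name
    #   bye
    ```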

    #11944
    elena diez
    Member

    Hi there!
    In the end I decided to rebuild the 3 servers with CentOS 6, and then, following exactly the same steps I did before, it worked.

    Thank you anyway :)

    #11948
    tedr
    Member

    Thanks for letting us know.

    #11954
    Sasha J
    Moderator

    Elena,
    Just wondering, what OS was it built on before you “decided to rebuild the 3 servers with CentOS 6”?
    Also, at the beginning of the thread you said “I am trying to install hmc in a single node cluster”, but at the end it is 3 servers…
    Could you please give us a deeper explanation of what you are trying to do?

    Thank you!
    Sasha
