HDP on Linux – Installation Forum

Oozie smoke test is getting failed

  • #38230
    Tanzir
    Participant

    notice: /Stage[1]/Hdp::Snappy::Package/Hdp::Snappy::Package::Ln[32]/Hdp::Exec[hdp::snappy::package::ln 32]/Exec[hdp::snappy::package::ln 32]/returns: executed successfully
    notice: /Stage[2]/Hdp-oozie::Oozie::Service_check/Hdp-oozie::Smoke_shell_file[oozieSmoke.sh]/Exec[/tmp/oozieSmoke.sh]/returns: Moved to trash: hdfs://ip-10-0-0-75:8020/user/ambari-qa/examples
    notice: /Stage[2]/Hdp-oozie::Oozie::Service_check/Hdp-oozie::Smoke_shell_file[oozieSmoke.sh]/Exec[/tmp/oozieSmoke.sh]/returns: Moved to trash: hdfs://ip-10-0-0-75:8020/user/ambari-qa/input-data
    notice: /Stage[2]/Hdp-oozie::Oozie::Service_check/Hdp-oozie::Smoke_shell_file[oozieSmoke.sh]/Exec[/tmp/oozieSmoke.sh]/returns: Error: IO_ERROR : java.net.ConnectException: Connection refused
    notice: /Stage[2]/Hdp-oozie::Oozie::Service_check/Hdp-oozie::Smoke_shell_file[oozieSmoke.sh]/Exec[/tmp/oozieSmoke.sh]/returns: Invalid sub-command: Missing argument for option: info
    notice: /Stage[2]/Hdp-oozie::Oozie::Service_check/Hdp-oozie::Smoke_shell_file[oozieSmoke.sh]/Exec[/tmp/oozieSmoke.sh]/returns:
    notice: /Stage[2]/Hdp-oozie::Oozie::Service_check/Hdp-oozie::Smoke_shell_file[oozieSmoke.sh]/Exec[/tmp/oozieSmoke.sh]/returns: use 'help [sub-command]' for help details
    notice: /Stage[2]/Hdp-oozie::Oozie::Service_check/Hdp-oozie::Smoke_shell_file[oozieSmoke.sh]/Exec[/tmp/oozieSmoke.sh]/returns: Invalid sub-command: Missing argument for option: info
    notice: /Stage[2]/Hdp-oozie::Oozie::Service_check/Hdp-oozie::Smoke_shell_file[oozieSmoke.sh]/Exec[/tmp/oozieSmoke.sh]/returns:
    notice: /Stage[2]/Hdp-oozie::Oozie::Service_check/Hdp-oozie::Smoke_shell_file[oozieSmoke.sh]/Exec[/tmp/oozieSmoke.sh]/returns: use 'help [sub-command]' for help details
    notice: /Stage[2]/Hdp-oozie::Oozie::Service_check/Hdp-oozie::Smoke_shell_file[oozieSmoke.sh]/Exec[/tmp/oozieSmoke.sh]/returns:
    notice: /Stage[2]/Hdp-oozie::Oozie::Service_check/Hdp-oozie::Smoke_shell_file[oozieSmoke.sh]/Exec[/tmp/oozieSmoke.sh]/returns: workflow_status=
    err: /Stage[2]/Hdp-oozie::Oozie::Service_check/Hdp-oozie::Smoke_shell_file[oozieSmoke.sh]/Exec[/tmp/oozieSmoke.sh]/returns: change from notrun to 0 failed: sh /tmp/oozieSmoke.sh /etc/oozie/conf /etc/hadoop/conf ambari-qa false /etc/security/keytabs/smokeuser.headless.keytab EXAMPLE.COM jt/ip-10-0-0-76@EXAMPLE.COM nn/ip-10-0-0-75@EXAMPLE.COM returned 1 instead of one of [0] at /var/lib/ambari-agent/puppet/modules/hdp-oozie/manifests/oozie/service_check.pp:63
    notice: Finished catalog run in 46.68 seconds
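
    One reading of the output above: the Connection refused error means the job submission to the Oozie server failed, so the job id captured by oozieSmoke.sh was presumably empty, and the later oozie job -info calls then had no argument, which is what produces "Invalid sub-command: Missing argument for option: info". A minimal sketch of that failure mode (the variable name is hypothetical):

    ```shell
    #!/bin/sh
    # Sketch: when submission fails, the captured job id is empty, and
    # "oozie job -info $job_id" would be invoked with no argument.
    job_id=""   # on a real cluster, something like: $(oozie job -run ... | sed 's/^job: //')
    if [ -z "$job_id" ]; then
        echo "submission failed; nothing to poll with 'oozie job -info'"
    else
        echo "would poll status with: oozie job -info $job_id"
    fi
    ```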


  • #38231
    Tanzir
    Participant

    I have applied the patch from https://issues.apache.org/jira/browse/AMBARI-2879. I do not think the document http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-1.3.2/bk_releasenotes_hdp_1.x/content/ch_relnotes-hdp1.3.2_5_oozie.html is 100% correct. In step 2 it says “Replace /var/lib/ambari-agent/puppet/modules/hdp-nagios/files/check_oozie_status.sh with the downloaded file,” but after a fresh install there was no check_oozie_status.sh file at all; the directory contained only two files, oozieSmoke.sh and wrap_ooziedb.sh.

    Thanks in advance.

    #38232
    Tanzir
    Participant

    Sorry, I missed the “hdp-nagios” part in the middle of the path; my apologies. After applying the patch, I’m still getting the same error.

    Thanks in advance.

    #38239
    Dave
    Moderator

    Hi Tanzir,

    Are you running in a secured (Kerberos) cluster?
    This is HDP 1.3.2 ?
    Did you use Ambari to install the cluster?

    Thanks

    Dave

    #38245
    Tanzir
    Participant

    Hi Dave,
    Thanks for your response. Yes, I’m running HDP 1.3.2 and I used Ambari to install the cluster on Amazon EC2 (inside a VPC).

    #38247
    Tanzir
    Participant

    From the oozie.log:

    2013-09-26 22:08:00,659 FATAL Services:533 - Runtime Exception during Services Load. Check your list of 'oozie.services' or 'oozie.services.ext'
    2013-09-26 22:08:00,665 FATAL Services:533 - USER[-] GROUP[-] TOKEN[-] APP[-] JOB[-] ACTION[-] E0103: Could not load service classes, Schema 'OOZIE' does not exist {SELECT t0.bean_type, t0.conf, t0.console_url, t0.cred, t0.data, t0.error_code, t0.error_message, t0.external_child_ids, t0.external_id, t0.external_status, t0.name, t0.retries, t0.stats, t0.tracker_uri, t0.transition, t0.type, t0.user_retry_count, t0.user_retry_interval, t0.user_retry_max, t0.end_time, t0.execution_path, t0.last_check_time, t0.log_token, t0.pending, t0.pending_age, t0.signal_value, t0.sla_xml, t0.start_time, t0.status, t0.wf_id FROM WF_ACTIONS t0 WHERE t0.bean_type = ? AND t0.id = ?} [code=30000, state=42Y07]
    org.apache.oozie.service.ServiceException: E0103: Could not load service classes, Schema 'OOZIE' does not exist {SELECT t0.bean_type, t0.conf, t0.console_url, t0.cred, t0.data, t0.error_code, t0.error_message, t0.external_child_ids, t0.external_id, t0.external_status, t0.name, t0.retries, t0.stats, t0.tracker_uri, t0.transition, t0.type, t0.user_retry_count, t0.user_retry_interval, t0.user_retry_max, t0.end_time, t0.execution_path, t0.last_check_time, t0.log_token, t0.pending, t0.pending_age, t0.signal_value, t0.sla_xml, t0.start_time, t0.status, t0.wf_id FROM WF_ACTIONS t0 WHERE t0.bean_type = ? AND t0.id = ?} [code=30000, state=42Y07]
    at org.apache.oozie.service.Services.loadServices(Services.java:291)
    at org.apache.oozie.service.Services.init(Services.java:208)
    at org.apache.oozie.servlet.ServicesLoader.contextInitialized(ServicesLoader.java:39)
    at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4206)
    at org.apache.catalina.core.StandardContext.start(StandardContext.java:4705)
    at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:799)
    at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
    at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:601)
    at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:675)
    at org.apache.catalina.startup.HostConfig.deployDescriptors(HostConfig.java:601)
    at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:502)
    at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317)
    at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
    at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
    at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1065)
    at org.apache.catalina.core.StandardHost.start(StandardHost.java:840)
    at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057)
    at org.apache.catalina.core.StandardEngine.start(

    #38252
    Koelli Mungee
    Moderator

    Hi Tanzir,

    Let’s take a step back here. At this point, is the Oozie server up and showing green in Ambari, or can you grep for its pid? If so, can you browse to the Oozie server?

    http://{oozie.full.hostname}:11000/oozie

    Thanks,
    Koelli

    #38253
    Tanzir
    Participant

    Hi Dave,
    It’s showing red in Ambari, and the URL:

    http://{oozie.full.hostname}:11000/oozie

    (with {oozie.full.hostname} replaced by the public IP address of the Oozie server) is not working.

    #38256
    Tanzir
    Participant

    I had to manually create the schema for oozie:

    [root@ip-10-0-0-149 db]# su -l oozie
    [oozie@ip-10-0-0-149 ~]$ /usr/lib/oozie/bin/ooziedb.sh create -run

    Now it seems to be working and the smoke test passes. Any idea why I needed to create the schema for Oozie manually?

    Thanks again.

    #38260
    Dave
    Moderator

    Hi Tanzir,

    Did you select “New database” in Ambari, with MySQL? Did you choose an existing database for Hive?

    It should be created as part of the install (unless ;create=true is missing from the JDBC connection string for Oozie).

    Thanks

    Dave

    #38352
    Tanzir
    Participant

    I chose “New Derby Database”, and I saw create=true, but it still didn’t create the schema.

    jdbc:derby:${oozie.data.dir}/${oozie.db.schema.name}-db;create=true

    Tanzir
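
    For context on that connection string: in oozie-site.xml it belongs to the oozie.service.JPAService.jdbc.url property, and Derby's ;create=true flag only creates an empty database directory on first connect; the Oozie tables themselves are created by ooziedb.sh. One plausible reading of this thread is that when the database directory is wiped, Derby silently recreates an empty database without the OOZIE schema, which matches the "Schema 'OOZIE' does not exist" error above. With the values reported elsewhere in this thread (oozie.data.dir=/hadoop/oozie/data, schema name oozie), the property looks like:

    ```xml
    <!-- oozie-site.xml; the values shown are this cluster's defaults as
         reported in the thread, so verify against your own file -->
    <property>
      <name>oozie.service.JPAService.jdbc.url</name>
      <value>jdbc:derby:${oozie.data.dir}/${oozie.db.schema.name}-db;create=true</value>
    </property>
    ```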

    #38355
    Tanzir
    Participant

    I found the issue. If I shut down the instance (Amazon EC2) and start it again, the schema is erased. Every time I start the instance I have to recreate the schema manually. Any idea how to prevent this from happening?

    Thanks in advance.
    – Tanzir
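
    Until the root cause is clear, a boot-time guard is one workaround: recreate the schema only when the Derby database directory is missing. This is a sketch, not a tested init script; the data-directory path and schema name are taken from this thread, and Derby stores the whole database as a directory named <schema>-db under oozie.data.dir:

    ```shell
    #!/bin/sh
    # Sketch of a boot-time guard (paths and schema name assumed from this
    # thread). Derby keeps the database in a "<schema>-db" directory, so its
    # absence means the schema must be recreated.
    needs_oozie_schema() {
        # usage: needs_oozie_schema DATA_DIR SCHEMA_NAME; succeeds if the
        # database directory is missing
        [ ! -d "$1/$2-db" ]
    }

    # Demonstrated against a scratch directory rather than a live install:
    demo=$(mktemp -d)
    needs_oozie_schema "$demo" oozie && echo "schema missing: would run ooziedb.sh create -run"
    mkdir -p "$demo/oozie-db"
    needs_oozie_schema "$demo" oozie || echo "schema present: nothing to do"
    rm -rf "$demo"
    ```

    On a real node, the first echo would be replaced by the same command used earlier in this thread: su -l oozie -c "/usr/lib/oozie/bin/ooziedb.sh create -run".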

    #38401
    Dave
    Moderator

    Hi Tanzir,

    Unfortunately I don’t know what would be causing this, apart from an environmental issue with EC2.
    I’ll dig around and let you know if I find anything.

    Thanks

    Dave

    #38403
    Tanzir
    Participant

    Hi Dave,
    Thanks a lot for your help. I have found the issue. When I created those instances for HDP, I forgot to remove the ephemeral storage from them, so during installation the Oozie data directory was pointed at /mnt/hadoop/oozie/data by default. I had assumed that, unless I specified it during installation, the /mnt mount point would not be used for Oozie or the other services.

    As a result, after I stopped and started the instance, all data on that mount point (ephemeral storage) was lost, and Oozie could not find its schema. This is also the cause of the NameNode formatting issue (my other thread, related to HDFS).

    To confirm, I installed HDP 1.3.2 again on another cluster, this time with the ephemeral storage removed from the instances, so the Oozie data path points to /hadoop/oozie/data. Now everything seems to be working, and even the “Start All” and “Stop All” buttons work as expected. The NameNode issue is also resolved.

    Thanks again,

    – Tanzir
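
    A quick check along these lines can catch this class of problem before data is lost. This is only a sketch; the one assumption, taken from this thread, is that EC2 ephemeral storage is mounted under /mnt:

    ```shell
    #!/bin/sh
    # Sketch: flag service data directories that live on EC2 ephemeral
    # storage (assumed mounted under /mnt, as in this thread), since their
    # contents are lost on every instance stop/start.
    on_ephemeral() {
        case "$1" in
            /mnt/*) return 0 ;;
            *)      return 1 ;;
        esac
    }

    for dir in /mnt/hadoop/oozie/data /hadoop/oozie/data; do
        if on_ephemeral "$dir"; then
            echo "WARNING: $dir is on ephemeral storage"
        else
            echo "OK: $dir persists across stop/start"
        fi
    done
    ```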

    #38405
    Dave
    Moderator

    Tanzir,

    I’m glad you got it all resolved.
    I was scratching my head wondering what would cause the loss of the schema – but that explains it!

    Thanks

    Dave

The topic ‘Oozie smoke test is getting failed’ is closed to new replies.
