Oozie Forum

Oozie is not allowing user ‘oozie’ to impersonate ‘oozie’ while submitting a job

  • #10518

Hi,

    I am facing the issue below while attempting to submit an Oozie workflow as the ‘oozie’ user:

    [oozie@localhost tmp]$ oozie job -oozie http://localhost:11000/oozie -config oozietest/job.properties -submit
    Error: E0902 : E0902: Exception occured: [org.apache.hadoop.ipc.RemoteException: User: oozie is not allowed to impersonate oozie]

When I searched the error on Google, it appears that core-site.xml needs to be changed to add the Oozie proxy-user properties:

<property>
      <name>hadoop.proxyuser.oozie.groups</name>
      <value>*</value>
      <description>Proxy group for Hadoop.</description>
    </property>
    <property>
      <name>hadoop.proxyuser.oozie.hosts</name>
      <value>*</value>
    </property>

I don’t have privileges to change the core-site.xml file content, so I would like to override core-site.xml through the command line. Please help me resolve the issue.

    Thanks,


  • #12026
    Robert
    Participant

    Hi Sandeep,
Unfortunately, the only way you will be able to edit the file is to have read/write permissions on it. Once you have those, you should be able to easily edit the XML file and make the changes you suggested.

    Regards,
    Rob
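If it helps, here is a quick way to check whether your account can actually write the file before attempting the edit (the path is illustrative; the config location varies by installation):

```shell
# Path to the Hadoop config file (illustrative; varies by installation)
CONF_FILE=/etc/hadoop/conf/core-site.xml

# Show current ownership and permission bits
ls -l "$CONF_FILE"

# Report whether the current user can write to it
if [ -w "$CONF_FILE" ]; then
    echo "writable: you can edit $CONF_FILE directly"
else
    echo "not writable: ask your administrator to apply the proxyuser changes"
fi
```

If it reports "not writable", the practical route is to send your administrator the exact property blocks you need added.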

    #29096

Hey, were you able to fix this issue after changing the mentioned properties to *?

    #29165
    tedr
    Moderator

    Hi Faisal,

We don’t know if he was able to make the changes; as of his last post he didn’t have write permissions on the files.

    Thanks,
    Ted.

    #29241

Thanks Ted, but are we sure Oozie will start working once we change these properties?
    I just learned that Oozie seems to require a Kerberos setup even though the following property is not specified in oozie-site.xml:

<property>
      <name>oozie.service.HadoopAccessorService.kerberos.enabled</name>
      <value>false</value>
    </property>

My constraint is that I also don’t have write access to the Hadoop environment, so I have to communicate to my administrator the changes that would allow me to run the Oozie examples.

    #29243
    tedr
    Moderator

    Hi Faisal,

Oozie does not need Kerberos enabled to work; what it needs is a proper setup of the accounts (hosts and users) it is allowed to impersonate. These are configured in core-site.xml as mentioned in the initial post of this thread. If you are on an Ambari-installed cluster, you should not make the changes directly in the files; they need to be made in the Ambari UI.

    Thanks,
    Ted.

    #29247

Thanks Ted, can you please point me to where in Ambari I can set these properties?
    I downloaded the sandbox and logged in to Ambari but don’t see any such option:
    http://127.0.0.1:8080/#/main/hosts/sandbox/summary
    If this option is only available on an Ambari cluster, then please direct me to the place where I can make these configurations.

    Thanks,

    #29383

I got it; somehow I was overlooking the config option :)
    Let me change these configurations on my sandbox.

    #29428
    tedr
    Moderator

    Hi Faisal,

    Let us know how it goes once the config changes are done.

    Thanks,
    Ted.

    #29433

    Hi Ted,
I have downloaded a sandbox, am running Ambari on it, and changed these two properties as mentioned,
    but from the CLI I am now getting this exception:
    ————–
    Error: E0405 : E0405: Submission request doesn’t have any application or lib path
    ————–
So here is what I am doing:
    1- I logged in to the sandbox as root.
    2- Downloaded the Oozie examples and placed them under root; here is the location of the job.properties I am trying to run:
    /oozie-3.0.2/examples/apps/map-reduce/job.properties
    3- Here is my job.properties:
    nameNode=hdfs://localhost:8020
    #nameNode=${hadoop.name.node}
    jobTracker=localhost:8021
    #jobTracker=${hadoop.job.tracker}
    queueName=default
    examplesRoot=examples
    oozie.wfI.application.path=${nameNode}/user/hue/examples/apps/map-reduce
    outputDir=map-reduce

4- Here is my workflow.xml:

<workflow-app xmlns="uri:oozie:workflow:0.1" name="map-reduce-wf">
        <start to="mr-node"/>
        <action name="mr-node">
            <map-reduce>
                <job-tracker>${jobTracker}</job-tracker>
                <name-node>${nameNode}</name-node>
                <!-- -->
                <configuration>
                    <property>
                        <name>mapred.job.queue.name</name>
                        <value>${queueName}</value>
                    </property>
                    <property>
                        <name>mapred.mapper.class</name>
                        <value>org.apache.oozie.example.SampleMapper</value>
                    </property>
                    <property>
                        <name>mapred.reducer.class</name>
                        <value>org.apache.oozie.example.SampleReducer</value>
                    </property>
                    <property>
                        <name>mapred.map.tasks</name>
                        <value>1</value>
                    </property>
                    <property>
                        <name>mapred.input.dir</name>
                        <!-- <value>/user/${wf:user()}/${examplesRoot}/input-data/text</value> -->
                        <value>/user/hue/examples/intput-data/text</value>
                    </property>
                    <property>
                        <name>mapred.output.dir</name>
                        <!-- <value>/user/${wf:user()}/${examplesRoot}/output-data/${outputDir}</value> -->
                        <value>/user/hue/examples/apps/map-reduce/output-data</value>
                    </property>
                </configuration>
            </map-reduce>
            <ok to="end"/>
            <error to="fail"/>
        </action>
        <kill name="fail">
            <message>Map/Reduce failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
        </kill>
        <end name="end"/>
    </workflow-app>

5- I ran the following command as root as well as hue, but got the same error:
    oozie job -oozie http://127.0.0.1:11000/oozie -config /oozie-3.0.2/examples/apps/map-redu
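As a side note, E0405 means the submit request contained no workflow/coordinator/bundle application path. The job.properties above spells the key `oozie.wfI.application.path` (with a stray ‘I’) instead of `oozie.wf.application.path`, which would produce exactly this error. A small, hypothetical pre-submit check (not part of Oozie) could catch that before submission:

```python
# Illustrative pre-submit sanity check for an Oozie job.properties file.
def check_app_path(path):
    """Return True if the file defines oozie.wf.application.path (or the
    coordinator/bundle equivalents), which Oozie requires at submission."""
    required = (
        "oozie.wf.application.path",
        "oozie.coord.application.path",
        "oozie.bundle.application.path",
    )
    with open(path) as f:
        for line in f:
            line = line.strip()
            # Skip comments and lines without a key=value pair
            if line.startswith("#") or "=" not in line:
                continue
            key = line.split("=", 1)[0].strip()
            if key in required:
                return True
    return False
```

Running this against the job.properties above would return False because of the misspelled key, flagging the problem before the E0405 round-trip to the server.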

    #29434

In addition, I also placed the examples folder on HDFS at /user/hue/examples.

    #29614
    tedr
    Moderator

    Hi Faisal,

    Does it make any difference if you change ‘localhost’ to ‘sandbox’ in your job.properties?

    thanks,
    Ted.

    #29624

    Thanks Ted, here is my latest job.properties and it worked :)

    job.properties
    ==========
    nameNode=hdfs://sandbox:8020
    jobTracker=sandbox:50300
    queueName=default
    examplesRoot=examples

    oozie.wf.application.path=${nameNode}/user/${user.name}/${examplesRoot}/apps/map-reduce
    outputDir=map-reduce

    #29643
    tedr
    Moderator

    Hi Faisal,

    Thanks for letting us know that the change worked! :)

    Thanks,
    Ted

    #29647

    My Pleasure Ted :)
    Can you please take a look at my problem i posted on “Hive run via oozie” thread?

    Thanks
    Faisal

    #29658
    tedr
    Moderator

    Hi Faisal,

    I have posted to that thread a link where the information looks like it will help you.

    Thanks,
    Ted.

