Oozie Forum

Oozie via Hue: Sqoop Eval Failing

  • #43366
    Nick Martin

    I can successfully run Sqoop imports and exports via an Oozie workflow, but I'm having trouble with the Sqoop eval statement in Hue (Oozie). When I attempt to run a Sqoop eval task I get "error parsing arguments for eval". I've tried both the single command-text version and entering the individual parameters, and I get errors with both. I should note that when I execute the same Sqoop eval command from the shell, it runs perfectly fine.


    Sqoop command arguments:
    --connect jdbc:oracle:thin:@xxx-xxx.xxx.xxx.xxx:0000/SCHEMA
    --username xxx
    --password xxx

    >>> Invoking Sqoop command line now >>>

    575 [main] WARN org.apache.sqoop.tool.SqoopTool - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
    608 [main] ERROR org.apache.sqoop.tool.BaseSqoopTool - Error parsing arguments for eval:
    608 [main] ERROR org.apache.sqoop.tool.BaseSqoopTool - Unrecognized argument: --connect jdbc:oracle:thin:@xxx-xxx.xxx.xxx.xxx:0000/SCHEMA
    608 [main] ERROR org.apache.sqoop.tool.BaseSqoopTool - Unrecognized argument: --username xxx
    608 [main] ERROR org.apache.sqoop.tool.BaseSqoopTool - Unrecognized argument: --password xxx
    608 [main] ERROR org.apache.sqoop.tool.BaseSqoopTool - Unrecognized argument: -e 'TRUNCATE TABLE SCHEMA.TABLE_INFO'
    usage: sqoop eval [GENERIC-ARGS] [TOOL-ARGS]

    Common arguments:
    --connect <jdbc-uri>                         Specify JDBC connect string
    --connection-manager <class-name>            Specify connection manager class name
    --connection-param-file <properties-file>    Specify connection parameters file
    --driver <class-name>                        Manually specify JDBC driver class to use
    --hadoop-home <hdir>                         Override $HADOOP_HOME
    --hadoop-mapred-home <dir>                   Override $HADOOP_MAPRED_HOME
    --help                                       Print usage instructions
    -P                                           Read password from console
    --password <password>                        Set authentication password
    --password-file <password-file>              Set authentication password file path
    --username <username>                        Set authentication username
    --verbose                                    Print more information while working

    SQL evaluation arguments:
    -e,--query <statement>    Execute 'statement' in SQL and exit

    Generic Hadoop command-line arguments:
    (must preceed any tool-specific arguments)
    Generic options supported are
    -conf <configuration file>                specify an application configuration file
    -D <property=value>                       use value for given property
    -fs <local|namenode:port>                 specify a namenode
    -jt <local|jobtracker:port>               specify a job tracker
    -files <comma separated list of files>    specify comma separated files to be copied to the map reduce cluster
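    A likely cause of this symptom: the Oozie Sqoop action's <command> element is split on whitespace, so a quoted statement such as -e 'TRUNCATE TABLE ...' gets broken into separate tokens and Sqoop sees them as unrecognized arguments. The documented workaround is to pass each argument as its own <arg> element instead (which is what entering individual parameters in the Hue form should generate). A sketch of such an action, with the action name, ${jobTracker}/${nameNode} properties, and transition targets as illustrative placeholders:

    ```xml
    <action name="sqoop-eval">
        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <!-- Each token, including the SQL statement, is its own <arg>.
                 Oozie passes <arg> values to Sqoop verbatim, so the query
                 needs no shell quoting here. -->
            <arg>eval</arg>
            <arg>--connect</arg>
            <arg>jdbc:oracle:thin:@xxx-xxx.xxx.xxx.xxx:0000/SCHEMA</arg>
            <arg>--username</arg>
            <arg>xxx</arg>
            <arg>--password</arg>
            <arg>xxx</arg>
            <arg>-e</arg>
            <arg>TRUNCATE TABLE SCHEMA.TABLE_INFO</arg>
        </sqoop>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    ```

    Note also that the "--" prefixes in a web-pasted command often arrive as en-dashes ("–"), which Sqoop likewise reports as unrecognized arguments; retyping the double hyphens in Hue rules that out.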
