Trouble with Teradata Connector

This topic contains 2 replies, has 3 voices, and was last updated by Robert Molina 10 months, 1 week ago.

  • Creator
    Topic
  • #34119

    I’m trying to test out the Teradata connector in my sandbox environment by doing a simple extract of dbc.tables to a file. The problem I’m running into seems to be related to Sqoop or the TD adaptor trying to do an “rm” instead of an “rm -r” on the staging directory it creates on HDFS.

    I’ve gotten it to connect to Teradata and can display tables, run queries, etc. The problem only shows up when I try to extract data to a file.
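
    For reference, this is the distinction I mean at the shell level (a minimal illustration; on a Hadoop 1.x sandbox a plain rm refuses to delete a directory):

    # plain rm fails when the target is a directory
    hadoop fs -rm /user/chris
    # the recursive form removes the directory and everything under it
    hadoop fs -rmr /user/chris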

    Command I am using:


    sqoop import \
    -libjars $LIB_JARS \
    -Dteradata.db.input.job.type=hdfs \
    -Dteradata.db.input.source.table=dbc.tables \
    -Dteradata.db.input.target.paths=/users/chris \
    --connect jdbc:***MyConnectString*** \
    --connection-manager org.apache.sqoop.teradata.TeradataConnManager \
    --table=dbc.tables \
    --username ***Myuser*** \
    --password ***mypass*** \
    --target-dir /user/chris \
    --split-by databasename
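
    (For context, $LIB_JARS above is assumed to be a comma-separated list of the connector jars only; a minimal sketch of how it might be set, with illustrative paths:)

    # build LIB_JARS from jar files only, comma-separated
    # (paths are examples, adjust to your install)
    export LIB_JARS=$(echo /usr/lib/sqoop/lib/teradata-connector-*.jar /usr/lib/sqoop/lib/terajdbc4.jar | tr ' ' ',')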

    Output I am getting:


    13/09/03 10:17:39 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
    13/09/03 10:17:39 INFO manager.SqlManager: Using default fetchSize of 1000
    13/09/03 10:17:39 INFO tool.CodeGenTool: Beginning code generation
    13/09/03 10:17:43 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM dbc.tables AS t WHERE 1=0
    13/09/03 10:17:44 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop
    13/09/03 10:17:44 INFO orm.CompilationManager: Found hadoop core jar at: /usr/lib/hadoop/hadoop-core.jar
    Note: /tmp/sqoop-hdfs/compile/2133daa3bc737b3d7257cfea97a9fc47/dbc_tables.java uses or overrides a deprecated API.
    Note: Recompile with -Xlint:deprecation for details.
    ....

    13/09/03 10:18:02 INFO mapred.JobClient: Cleaning up the staging area hdfs://sandbox:8020/user/hdfs/.staging/job_201308280904_0005
    13/09/03 10:18:02 ERROR security.UserGroupInformation: PriviledgedActionException as:hdfs cause:java.io.IOException: Target hdfs://sandbox:8020/user/hdfs/.staging/job_201308280904_0005/libjars/conf/conf is a directory
    13/09/03 10:18:02 INFO mapreduce.TeradataInputProcessor: job cleanup starts at 1378228682948
    13/09/03 10:18:04 INFO mapreduce.TeradataInputProcessor: job cleanup ends at 1378228684426
    13/09/03 10:18:04 INFO mapreduce.TeradataInputProcessor: job cleanup time is 1s
    13/09/03 10:18:04 ERROR teradata.TeradataSqoopImportJob: Exception running Teradata import job
    com.teradata.hadoop.exception.TeradataHadoopException: java.io.IOException: Target hdfs://sandbox:8020/user/hdfs/.staging/job_201308280904_0005/libjars/conf/conf is a directory
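
    In case it helps narrow this down, the exception seems to point at something under -libjars resolving to a directory rather than a jar; these are the checks I can run on my side (paths illustrative):

    # list what actually went into -libjars; a directory entry here would
    # explain the "Target ... is a directory" failure
    echo "$LIB_JARS" | tr ',' '\n'

    # inspect the job staging area the cleanup was working on
    hadoop fs -ls /user/hdfs/.staging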


The topic ‘Trouble with Teradata Connector’ is closed to new replies.

  • Author
    Replies
  • #40808

    Robert Molina
    Moderator

    Hi Chris,
    What user are you running the Sqoop job as? Have you tried running it as the hdfs user?
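
    For example (a minimal sketch, assuming sudo is available on the sandbox):

    # check which user is submitting the job right now
    whoami
    # re-run the same import as the hdfs user; remaining arguments as in your
    # original command
    sudo -u hdfs sqoop import -libjars $LIB_JARS --connect jdbc:***MyConnectString*** ...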

    Regards,
    Robert

    #37764

    Jack Liu
    Member

    Hi Chris Schrader,
    I wonder what your HDP version is?
    I failed to find the class org.apache.sqoop.teradata.TeradataConnManager; could you please share where the jar comes from?

    I am using HDP 2.1.x.

    Thanks so much for your help.
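
    In case it helps, this is how I have been looking for the class on my install (path is illustrative):

    # search the Sqoop lib directory for a jar that contains TeradataConnManager
    for j in /usr/lib/sqoop/lib/*.jar; do
      jar tf "$j" | grep -q 'org/apache/sqoop/teradata/TeradataConnManager' && echo "$j"
    done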
