Sqoop Error while exporting Hive table to SQL Server


This topic contains 5 replies, has 2 voices, and was last updated by  Robert Molina 1 year, 3 months ago.

  • Creator
  • #41711

    Vikas Madaan

I am trying to export a table from Hive to SQL Server and am getting the error below.
Here is how the table data looks in Hive:

377910599125962752 377910599125962752 Wed Sep 11 21:44:42 +0000 2013 Wed Sep 11 2013 2013 09 11 21:44:42 null RT @History_Pics: The first Harley-Davidson factory; a tiny wooden shed in the Davidson family backyard in Milwaukee. 1903 http://t.co/HzHX… null false false null web 0 NULL [null,null,null,null,null] ["history_pics",null,null,null,null] NULL history_pics HeavenScentBlis Annie Day 242 6 193 en Staffordshire London http://a0.twimg.com/profile_imag

Here is the error I get:

C:\Hadoop\sqoop-1.4.2\bin\sqoop.cmd export --connect "jdbc:sqlserver://MKE123858N02:1433;database=CompuwareMKE;user=sa;password=Mnbv1234$$" --table hd_twitter_details_data --export-dir /apps/hive/warehouse/hd_twitter_details_data --input-fields-terminated-by "\t"
    Warning: HBASE_HOME and HBASE_VERSION not set.
    Warning: HBASE_HOME does not exist HBase imports will fail.
    Please set HBASE_HOME to the root of your HBase installation.
13/10/24 10:42:24 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM [hd_twitter_details_data] AS t WHERE 1=0
13/10/24 10:42:24 INFO orm.CompilationManager: HADOOP_HOME is c:\Hadoop\hadoop-1
13/10/24 10:42:24 INFO orm.CompilationManager: Found hadoop core jar at: c:\Hado
Note: \tmp\sqoop-BMWVXM0\compile\1e115f5b0e55be2c056af3bf17f0c085\hd_twitter_details_data.java uses or overrides a deprecated API.
    Note: Recompile with -Xlint:deprecation for details.
    13/10/24 10:42:26 INFO orm.CompilationManager: Writing jar file: \tmp\sqoop-BMWV
    13/10/24 10:42:26 INFO mapreduce.ExportJobBase: Beginning export of hd_twitter_d
    13/10/24 10:43:08 INFO input.FileInputFormat: Total input paths to process : 1
    13/10/24 10:43:08 INFO input.FileInputFormat: Total input paths to process : 1
    13/10/24 10:43:08 INFO mapred.JobClient: Running job: job_201310220941_0015
    13/10/24 10:43:09 INFO mapred.JobClient: map 0% reduce 0%
13/10/24 10:44:50 INFO mapred.JobClient: Task Id : attempt_201310220941_0015_m_000000_0, Status : FAILED
java.lang.NumberFormatException: For input string: "377910599125962752☺377910599125962752☺Wed Sep 11 21:44:42 +0000 2013☺Wed Sep 11 2013☺2013☺09☺11☺21:44:42☺null☺RT @History_Pics: The first Harley-Davidson factory; a tiny wooden shed in the Davidson family backyard in Milwaukee. 1903 http://t.co/HzHX…☺null☺false☺fals
entBlis☺Annie Day☺242☺6☺193☺en☺Staffordshire☺London☺http://a0.twimg.com/profile_

It looks like everything gets concatenated into a single field.
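For context, the ☺ glyphs in the exception are how the Windows console renders \u0001 (Ctrl-A), which is Hive's default field delimiter, while the sqoop command asks for fields terminated by "\t". A minimal standalone sketch (hypothetical record values, plain Python rather than Sqoop) of why the whole row then parses as one string:

```python
# Hive's default field delimiter is \x01 (Ctrl-A); a Windows console
# renders it as the smiley glyph seen in the NumberFormatException.
HIVE_DELIM = "\x01"

# A tiny stand-in for one exported Hive record (hypothetical values).
record = "377910599125962752\x01377910599125962752\x01Wed Sep 11 21:44:42 +0000 2013"

# Splitting on \t -- what --input-fields-terminated-by "\t" asked for --
# leaves the entire record as a single "field", so parsing it as a
# number fails, just like the NumberFormatException in the log.
fields_by_tab = record.split("\t")
print(len(fields_by_tab))        # -> 1 (whole record is one field)

# Splitting on \x01 recovers the individual columns.
fields_by_ctrl_a = record.split(HIVE_DELIM)
print(len(fields_by_ctrl_a))     # -> 3
print(fields_by_ctrl_a[0])       # -> 377910599125962752
```

If this is indeed the cause, the likely fix (assuming the table was created with Hive's default delimiters) would be to tell Sqoop to split on Ctrl-A instead of tab, e.g. --input-fields-terminated-by '\001', since Sqoop accepts octal escape sequences for delimiter options.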

Viewing 5 replies - 1 through 5 (of 5 total)

The topic ‘Sqoop Error while exporting Hive table to SQL Server’ is closed to new replies.

  • Author
  • #44778

    Robert Molina

    Hi Vikas,
Do the task logs for attempt_201310220941_0015_m_000000_0 show any further information regarding the error?



    Vikas Madaan

They are bigint and string on both the Hive and SQL Server sides.


    Robert Molina

    Hi Vikas,
    What are the data types of your source and the target?



    Vikas Madaan

Yes, I am using HDP 1.1. I am trying this for one table: a two-field table that I am having an issue with.

When I try to export one field at a time, it goes fine, but when I export both fields together it creates an issue for me.


    Robert Molina

Hi Vikas,
Are you using HDP 1.1 for Windows? Does this happen for all tables? You might want to see if you can isolate whether it's specific to a field.

