
Sqoop Forum

Sqoop Error while exporting Hive table to SQL Server

  • #41711
    Vikas Madaan

    I am trying to export a table from Hive to SQL Server and am getting the error below.
    Here is how the table data looks in Hive:
    377910599125962752 377910599125962752 Wed Sep 11 21:44:42 +0000 2013 Wed Sep 11 2013 2013 09 11 21:44:42 null RT @History_Pics: The first Harley-Davidson factory; a tiny wooden shed in the Davidson family backyard in Milwaukee. 1903… null false false null web 0 NULL [null,null,null,null,null] ["history_pics",null,null,null,null] NULL history_pics HeavenScentBlis Annie Day 242 6 193 en Staffordshire London

    Here is the error I get:

    C:\Hadoop\sqoop-1.4.2\bin\sqoop.cmd export --connect "jdbc:sqlserver://MKE123858N02:1433;database=CompuwareMKE;user=sa;password=Mnbv1234$$" --table hd_twitter_details_data --export-dir /apps/hive/warehouse/hd_twitter_details_data --input-fields-terminated-by "\t"
    Warning: HBASE_HOME and HBASE_VERSION not set.
    Warning: HBASE_HOME does not exist HBase imports will fail.
    Please set HBASE_HOME to the root of your HBase installation.
    13/10/24 10:42:24 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM [hd_twitter_details_data] AS t WHERE 1=0
    13/10/24 10:42:24 INFO orm.CompilationManager: HADOOP_HOME is c:\Hadoop\hadoop-1
    13/10/24 10:42:24 INFO orm.CompilationManager: Found hadoop core jar at: c:\Hado
    Note: \tmp\sqoop-BMWVXM0\compile\1e115f5b0e55be2c056af3bf17f0c085\hd_twitter_det uses or overrides a deprecated API.
    Note: Recompile with -Xlint:deprecation for details.
    13/10/24 10:42:26 INFO orm.CompilationManager: Writing jar file: \tmp\sqoop-BMWV
    13/10/24 10:42:26 INFO mapreduce.ExportJobBase: Beginning export of hd_twitter_d
    13/10/24 10:43:08 INFO input.FileInputFormat: Total input paths to process : 1
    13/10/24 10:43:08 INFO input.FileInputFormat: Total input paths to process : 1
    13/10/24 10:43:08 INFO mapred.JobClient: Running job: job_201310220941_0015
    13/10/24 10:43:09 INFO mapred.JobClient: map 0% reduce 0%
    13/10/24 10:44:50 INFO mapred.JobClient: Task Id : attempt_201310220941_0015_m_000000_0, Status : FAILED
    java.lang.NumberFormatException: For input string: "377910599125962752☺377910599125962752☺Wed Sep 11 21:44:42 +0000 2013☺Wed Sep 11 2013☺2013☺09☺11☺21:44:42☺null☺RT @History_Pics: The first Harley-Davidson factory; a tiny wooden shed in the Davidson family backyard in Milwaukee. 1903…☺null☺false☺fals
    entBlis☺Annie Day☺242☺6☺193☺en☺Staffordshire☺London☺

    It looks like it concatenates everything.
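The ☺ characters in the stack trace are a clue: that is how the Windows console renders byte 0x01 (Ctrl-A), which is Hive's default field delimiter. If the table was stored with that default, then telling Sqoop to split on "\t" leaves the entire record in a single field, which Sqoop then tries to parse as the first bigint column. A minimal sketch (with a shortened, hypothetical sample row) illustrating the mismatch:

```python
# Hive's default field delimiter is Ctrl-A (byte 0x01); the Windows
# console renders that byte as the smiley glyph seen in the exception.
row = "377910599125962752\x01377910599125962752\x01Wed Sep 11 21:44:42 +0000 2013"

# Splitting on tab, as --input-fields-terminated-by "\t" instructs Sqoop
# to do, leaves the whole record in one field...
print(len(row.split("\t")))    # 1

# ...so the bigint column receives the entire concatenated record, which
# matches the NumberFormatException above. Splitting on \x01 recovers
# the individual fields.
print(len(row.split("\x01")))  # 3
```

If this is the cause, re-running the export with `--input-fields-terminated-by "\001"` (and with real `--` double dashes rather than the `–` en-dashes the forum substituted into the pasted command) would be the likely fix.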

  • #41757
    Robert Molina

    Hi Vikas,
    Are you using HDP 1.1 for Windows? Does this happen for all tables? You might want to see if you can isolate whether it's specific to a field.


    Vikas Madaan

    Yes, I am using HDP 1.1. I am trying this for one table; it is a two-field table that I am having the issue with.

    When I try to export one field at a time, it goes fine, but when I export both fields together it creates an issue for me.

    Robert Molina

    Hi Vikas,
    What are the data types of your source and the target?


    Vikas Madaan

    bigint and string on both the Hive and SQL Server sides.

    Robert Molina

    Hi Vikas,
    Do the task logs for attempt_201310220941_0015_m_000000_0 (Status: FAILED) show any further information regarding the error?


The topic ‘Sqoop Error while exporting Hive table to SQL Server’ is closed to new replies.
