Sqoop Forum

Sqoop 1.4.2 for SQL Server fails for tables with non-default schema

  • #28051


    When I try to use Sqoop 1.4.2 for SQL Server to import tables from the AdventureWorks database, it fails for tables under a non-default schema, e.g. Person.Contact. In the same database, however, the sqoop import command succeeds for tables with the default dbo schema, e.g. dbo.ErrorLog. For any table that is NOT in the dbo schema, the import command fails with ‘Invalid object name’. What am I missing here? How can I import tables that are not in the dbo schema?

    Is there a switch in Sqoop for SQL Server where I can set the schema (like there is in Sqoop for MySQL)?
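    One option worth trying, assuming a Sqoop release whose SQL Server connector supports the `--schema` extended argument (passed after the bare `--` separator; this was added in later 1.4.x releases, so it may not be in 1.4.2), is to name the schema separately from the table. Host and credentials below are placeholders:

    ```shell
    # Sketch: --schema is a connector-specific extended argument,
    # so it must appear after the standalone "--" separator.
    # <host>, <user>, <pass> are placeholders for your environment.
    sqoop import \
      --connect "jdbc:sqlserver://<host>;database=AdventureWorks" \
      --username <user> --password <pass> \
      --table Contact \
      --target-dir /data/PersonInfo \
      -- --schema Person
    ```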




  • Author
  • #28542
    Seth Lyubich

    Hi Debarchan,

    Can you please provide the command and the error that you are getting? You might find some useful information here:



    Carter Shanklin

    Can you try fully qualifying the desired schema in --hive-table, for example --hive-table Person.Contact?

    I believe you will need to “create database Person” within Hive ahead of time.


    Hi Carter,

    Thanks, I’ve tried fully qualifying the table names as you indicated, without luck.

    I believe you will need to “create database Person” within Hive ahead of time. – Do you mean that I need to run the create database command in Hive before the import? Could you please provide me with a sample command?




    I use the command from Sqoop – SQL Server connector reference manual:

    sqoop import --connect "jdbc:sqlserver://;username=;password=;database=AdventureWorks2008R2" --table Person.Contact --target-dir /data/PersonInfo


    Carter Shanklin

    Debarchan, I’m not actually sure it’s necessary, but to do it, all you need to do is log in using the Hive shell (or the similar beeline utility, which requires connecting) and type:

    hive> create database Person;

    You can also run

    hive> show databases;

    to list the available databases.



    I will give it a try, though I’m not sure it will help: from the error, it looks like Sqoop cannot see the SQL Server table in the first place when it is not in the default dbo schema. If I import a table from the same database that is under the dbo schema, the import works. Also, Person is the schema and AdventureWorks is the database.
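    A workaround that sidesteps table-name quoting entirely is a free-form query import, where the schema-qualified name is embedded verbatim in the SQL text that Sqoop sends to the server. A sketch, with hypothetical connection details, following the standard `--query` form (the literal `\$CONDITIONS` token is required, and either `--split-by` or `-m 1` must be supplied; ContactID is assumed to be a suitable split column):

    ```shell
    # Free-form query import: Person.Contact appears inside the SQL,
    # so Sqoop applies no [bracket] quoting to the schema-qualified name.
    # <host>, <user>, <pass> are placeholders for your environment.
    sqoop import \
      --connect "jdbc:sqlserver://<host>;database=AdventureWorks" \
      --username <user> --password <pass> \
      --query "SELECT * FROM Person.Contact WHERE \$CONDITIONS" \
      --split-by ContactID \
      --target-dir /data/PersonInfo
    ```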



    Hi Debarchan,

    Have you had a chance to try Carter’s suggestion?


    Jonathan Gao

    I ran into the same problem with Sqoop 1.4.3 on Windows Azure HDInsight.

    Koelli Mungee

    Hi Jonathan,

    Can you share the full command you are entering?

    Kind Regards

    Jonathan Gao

    c:\apps\dist\sqoop-\bin>sqoop import --connect "jdbc:sqlserver://xxxxxxxxxx.database.windows.net;user=xxx@xxxxxxxxxx;password=Pass@word1;database=AdventureWorks2012" --table Sales.SalesOrderDetail --target-dir /lineitemData -m 1

    Here is the error:

    Warning: HBASE_HOME and HBASE_VERSION not set.
    Warning: HBASE_HOME does not exist HBase imports will fail.
    Please set HBASE_HOME to the root of your HBase installation.
    13/12/03 23:35:15 INFO manager.SqlManager: Using default fetchSize of 1000
    13/12/03 23:35:15 INFO tool.CodeGenTool: Beginning code generation
    13/12/03 23:35:16 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM [Sales.SalesOrderDetail] AS t WHERE 1=0
    13/12/03 23:35:16 ERROR manager.SqlManager: Error executing statement: com.microsoft.sqlserver.jdbc.SQLServerException: Invalid object name 'Sales.SalesOrderDetail'.
    com.microsoft.sqlserver.jdbc.SQLServerException: Invalid object name 'Sales.SalesOrderDetail'.
    at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
    at com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493)
    at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390)
    at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340)
    at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4615)
    at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
    at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179)
    at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154)
    at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.executeQuery(SQLServerPreparedStatement.java:283)
    at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:619)
    at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:628)
    at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:235)
    at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:219)
    at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:347)
    at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1264)
    at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1081)
    at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:82)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:390)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:476)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)

    Koelli Mungee

    Hi Jonathan,

    Since you are using HDInsight, it would probably be best to post your issue on the HDInsight forums. Based on the error, “Invalid object name ‘Sales.SalesOrderDetail’”, Sqoop is treating the whole schema-qualified name as a single object name: notice that the generated statement in your log wraps it as [Sales.SalesOrderDetail], which SQL Server reads as one identifier rather than schema.table.

    However, here is an example of a working command:
    sqoop-import --connect "jdbc:sqlserver://;databaseName=dave" --driver com.microsoft.sqlserver.jdbc.SQLServerDriver --username hive --password hive --query "select test.dave as NUMBER, test.test# as DATA from test where \$CONDITIONS" --split-by test.dave --as-avrodatafile -m 1 --target-dir /data/sub/cca/fpcondtl/stg5hash/

The topic ‘Sqoop 1.4.2 for SQL Server fails for tables with non-default schema’ is closed to new replies.
