
Sqoop Forum

Error in running sqoop from a Java code

  • #47768
    Aysan Rasooli

    I have Java code which runs Sqoop to import data from PostgreSQL. When I run it in Eclipse I get the following error. I am using CentOS and have already installed the JDK, but I still get this error. Could you please let me know how I can solve the problem?

    Here is the error:

    2014-01-29 16:07:40,943 WARN [main] sqoop.ConnFactory ( - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
    2014-01-29 16:07:40,984 INFO [main] manager.SqlManager ( - Using default fetchSize of 1000
    2014-01-29 16:07:40,985 INFO [main] tool.CodeGenTool ( - Beginning code generation
    2014-01-29 16:07:41,118 INFO [main] manager.SqlManager ( - Executing SQL statement: SELECT t.* FROM "two" AS t LIMIT 1
    2014-01-29 16:07:41,150 INFO [main] orm.CompilationManager ( - HADOOP_MAPRED_HOME is /usr/lib/hadoop/etc/hadoop
    2014-01-29 16:07:41,153 ERROR [main] orm.CompilationManager ( - It seems as though you are running sqoop with a JRE.
    2014-01-29 16:07:41,153 ERROR [main] orm.CompilationManager ( - Sqoop requires a JDK that can compile Java code.
    2014-01-29 16:07:41,153 ERROR [main] orm.CompilationManager ( - Please install a JDK and set $JAVA_HOME to use it.
    2014-01-29 16:07:41,154 ERROR [main] tool.ImportTool ( - Encountered IOException running import job: Could not start Java compiler.
    at org.apache.sqoop.orm.CompilationManager.compile(
    at org.apache.sqoop.tool.CodeGenTool.generateORM(
    at org.apache.sqoop.tool.ImportTool.importTable(
    at sqoopJob.main(


  • #47799
    Robert Molina

    Hi Aysan,
    It seems to state you don't have the JDK installed. Can you verify that your JAVA_HOME is pointing to the JDK?


    Aysan Rasooli

    Hi Robert,

    I set JAVA_HOME as an environment variable in Eclipse when I run the code. I also set it in ~/.bash_profile, but I am still getting this error. Should I set it anywhere else?



    Dave

    Hi Aysan,

    If you run the command:


    does it provide an output?



    Aysan Rasooli

    Hi Dave,

    Thanks for the help! It worked. The problem was that Eclipse was using the old setting for the Java home, which was pointing to the jar location.
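[The error above comes from Sqoop's code-generation step, which needs the system Java compiler that ships only with a JDK, not a plain JRE. A minimal standalone sketch of that check (this is an illustrative diagnostic, not Sqoop's actual code) can confirm whether the JVM Eclipse launches can compile Java:]

```java
import javax.tools.JavaCompiler;
import javax.tools.ToolProvider;

public class JdkCheck {
    public static void main(String[] args) {
        // A JRE returns null here; a JDK returns a working compiler.
        // Sqoop's CompilationManager fails with "Could not start Java
        // compiler" in the former case.
        JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
        if (compiler == null) {
            System.out.println("JRE only: no compiler; point JAVA_HOME at a JDK");
        } else {
            System.out.println("JDK detected: compiler available");
        }
    }
}
```

Running this class from the same Eclipse launch configuration as the Sqoop job shows immediately whether JAVA_HOME (or the Eclipse JRE setting) resolves to a JDK.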


    Sourabh Potnis

    I am also trying to run Sqoop from a Java program using Sqoop.runTool(str), to export data from HDFS to Teradata.

    While compiling I added the following jars:
    /etc/hadoop/conf:/usr/lib/sqoop/sqoop-, all jars in /usr/lib/sqoop/lib, /usr/lib/hadoop/hadoop-common-, all jars in /usr/lib/hadoop/lib, /usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-, all jars in /usr/lib/hadoop-mapreduce, all jars in /usr/lib/hadoop-mapreduce/lib

    But when running the jar, I get the following error:
    Exception in thread "main" java.lang.NoSuchFieldError: IBM_JAVA
    at org.apache.hadoop.mapreduce.JobContext.<init>(
    at org.apache.hadoop.mapreduce.Job.<init>(
    at org.apache.sqoop.mapreduce.ExportJobBase.runExport(
    at org.apache.sqoop.teradata.TeradataConnManager.exportTable(
    at org.apache.sqoop.tool.ExportTool.exportTable(
    at org.apache.sqoop.Sqoop.runSqoop(
    at org.apache.sqoop.Sqoop.runTool(
    at org.apache.sqoop.Sqoop.runTool(
    at SqoopTest.exportHDFSToSQL(
    at SqoopTest.main(

    (From a command line Sqoop export is working fine.)
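[A NoSuchFieldError such as IBM_JAVA typically means two incompatible versions of a Hadoop jar ended up on the classpath, so a class is loaded from an older jar than the one the rest of the code was compiled against. A small, hypothetical diagnostic sketch is to print where the JVM actually loads a suspect class from; the default class name below is only a stand-in so the sketch runs standalone — pass the Hadoop class you suspect instead:]

```java
public class WhichJar {
    public static void main(String[] args) throws Exception {
        // Usage (hypothetical): java WhichJar org.apache.hadoop.util.PlatformName
        // With no argument, a JDK class is used so the sketch is self-contained.
        String name = args.length > 0 ? args[0] : "javax.tools.ToolProvider";
        Class<?> cls = Class.forName(name);
        Object source = cls.getProtectionDomain().getCodeSource();
        // A null code source means the class came from the bootstrap or
        // platform class loader rather than an application jar.
        System.out.println(name + " loaded from: "
                + (source == null ? "bootstrap/platform loader" : source));
    }
}
```

If the printed location is an unexpected or older Hadoop jar, removing the duplicate from the classpath usually resolves this class of error.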



    Hi Sourabh, I see that you have posted the same question on a different thread. I will close this thread, and you can continue on the other one.


The topic ‘Error in running sqoop from a Java code’ is closed to new replies.
