Sqoop Forum

java.lang.AbstractMethodError: org.netezza.sql.NzPreparedStatament.isClosed()Z

  • #59524
    Carsten Piepel

    I am using HDP for Windows. When attempting to import data from a Netezza appliance using Sqoop, the MapReduce job fails with the Netezza JDBC error shown below:

    2014-08-29 14:24:00,467 INFO [main] org.apache.hadoop.mapred.Task: Using ResourceCalculatorProcessTree : org.apache.hadoop.yarn.util.WindowsBasedProcessTree@3d989dea
    2014-08-29 14:24:00,807 INFO [main] org.apache.hadoop.mapred.MapTask: Processing split: "SITE_ID" >= 1 AND "SITE_ID" < 100413
    2014-08-29 14:24:00,860 INFO [main] org.apache.sqoop.mapreduce.db.DBRecordReader: Working on split: "SITE_ID" >= 1 AND "SITE_ID" < 100413
    2014-08-29 14:24:00,923 INFO [main] org.apache.sqoop.mapreduce.db.DBRecordReader: Executing query: SELECT "VP_ID", "BR_ID", "ACCOUNT_ID", "SITE_ID", "X", "Y" FROM "BR_SITE_LOCATION_V" AS "BR_SITE_LOCATION_V" WHERE ( "SITE_ID" >= 1 ) AND ( "SITE_ID" < 100413 )
    2014-08-29 14:24:19,104 INFO [Thread-11] org.apache.sqoop.mapreduce.AutoProgressMapper: Auto-progress thread is finished. keepGoing=false
    2014-08-29 14:24:19,155 FATAL [main] org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.AbstractMethodError: org.netezza.sql.NzPreparedStatament.isClosed()Z
    at org.apache.sqoop.mapreduce.db.DBRecordReader.close(DBRecordReader.java:163)
    at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.close(MapTask.java:499)
    at org.apache.hadoop.mapred.MapTask.closeQuietly(MapTask.java:1982)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:772)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:339)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)

    I use the following import command:

    sqoop import --username map -P --connect jdbc:netezza://nzhost:5480/nzdb --table BR_SITE_LOCATION_V --split-by SITE_ID --target-dir /user/coz323/br-site --verbose
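    For context on the --split-by flag above: Sqoop queries the MIN and MAX of the split column and divides that range into one WHERE clause per mapper, which is where the `"SITE_ID" >= 1 AND "SITE_ID" < 100413` split in the log comes from. A rough sketch of that splitting arithmetic (a hypothetical helper, not Sqoop's actual DataDrivenDBInputFormat code):

    ```java
    import java.util.ArrayList;
    import java.util.List;

    public class SplitSketch {
        // Divide [min, max] on an integer split column into numMappers
        // contiguous half-open ranges, one WHERE clause per mapper.
        public static List<String> integerSplits(String col, long min,
                                                 long max, int numMappers) {
            List<String> splits = new ArrayList<>();
            long size = (max - min) / numMappers + 1; // chunk per mapper
            for (long lo = min; lo <= max; lo += size) {
                long hi = Math.min(lo + size, max + 1);
                splits.add("\"" + col + "\" >= " + lo
                         + " AND \"" + col + "\" < " + hi);
            }
            return splits;
        }
    }
    ```

    With a single mapper this yields exactly one range covering the whole table, which matches the split logged above.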

    I’ve tried Netezza JDBC driver versions 5.0 and 7.0, but the error is the same for both. Using the --direct option does not make a difference either. I am confident Sqoop is using the correct connection manager, as I see the following lines logged to the console:

    manager.DefaultManagerFactory: Trying with scheme: jdbc:netezza:
    sqoop.ConnFactory: Instantiated ConnManager org.apache.sqoop.manager.NetezzaManager@3595b750

    Any ideas or suggestions would be highly appreciated. Thanks.
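    For anyone hitting the same thing: java.sql.Statement.isClosed() was added in JDBC 4.0 (Java 6), so a driver compiled against the older interface has no implementation of it, and the JVM raises AbstractMethodError when DBRecordReader.close() calls it at runtime. A minimal sketch of the kind of defensive close that tolerates such drivers (a hypothetical SafeClose helper, not Sqoop's actual fix):

    ```java
    import java.sql.PreparedStatement;
    import java.sql.SQLException;

    public class SafeClose {
        // Defensive close: drivers compiled against pre-JDBC-4 interfaces
        // have no isClosed() implementation, so calling it fails at runtime
        // with AbstractMethodError rather than at compile time.
        public static void close(PreparedStatement stmt) {
            if (stmt == null) {
                return;
            }
            try {
                if (!stmt.isClosed()) {
                    stmt.close();
                }
            } catch (AbstractMethodError e) {
                // Old driver: skip the isClosed() check and just close.
                try {
                    stmt.close();
                } catch (SQLException ignored) {
                }
            } catch (SQLException ignored) {
            }
        }
    }
    ```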


  • #59539
    Venkat

    The root cause is described in SQOOP-1279. I think the product does not ship the Sqoop version with that fix. Can you get the later version of HDP and try it?


    Carsten Piepel

    Thanks, Venkat. I will give that a try and report back.

    Carsten Piepel

    Thanks, that did it: I installed the later Hortonworks Data Platform for Windows and the Sqoop import from Netezza is working now.

The topic ‘java.lang.AbstractMethodError: org.netezza.sql.NzPreparedStatament.isClosed()Z’ is closed to new replies.
