Sqoop Forum

sqoop-export from HCatalog to MySQL: java.lang.NullPointerException

  • #53901
    Norton Le

I am using the HDP 2.1 sandbox. I created a database "hose" in HCatalog containing a table price_analysis, and a matching database hose with a table price_analysis in MySQL. I then try to execute the command below:
sqoop export --connect jdbc:mysql://localhost:3306/hose --username root --table price_analysis --hcatalog-database 'hose' --hcatalog-table 'price_analysis'

and it fails with a java.lang.NullPointerException:

    14/05/10 11:36:43 INFO Configuration.deprecation: mapred.output.dir is deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
    14/05/10 11:36:47 INFO hcat.SqoopHCatUtilities: HCatalog table partitioning key fields = []
    14/05/10 11:36:47 ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.NullPointerException
    at org.apache.hcatalog.data.schema.HCatSchema.get(HCatSchema.java:99)
    at org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureHCat(SqoopHCatUtilities.java:343)
    at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java:394)
    at org.apache.sqoop.manager.SqlManager.exportTable(SqlManager.java:853)
    at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:81)
    at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:100)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:222)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:231)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:240)
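For reference, an NPE thrown from HCatSchema.get inside SqoopHCatUtilities.configureHCat usually indicates that a column of the MySQL table has no matching column in the HCatalog table; since HCatalog stores column names in lowercase, even a case mismatch can trigger it. A sketch for comparing the two column lists (database and table names are taken from the command above; the root login without a password is an assumption from the sandbox setup):

```shell
# List the HCatalog/Hive column names (these are stored lowercase)
hive -e 'DESCRIBE hose.price_analysis;'

# List the MySQL column names for the same table
mysql -u root -e 'DESCRIBE price_analysis;' hose
```

If the two lists differ in spelling or case, renaming the MySQL columns to match the lowercase HCatalog names (or recreating one table to match the other) is worth trying before re-running the export.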

Has anyone experienced this before?

Thanks and regards

