Hive / HCatalog Forum

Hive + Parquet = Exception

  • #37112

    When querying a Hive table backed by a Parquet file, I get the exception below. The strange thing to me is that Hive calls into Parquet, Parquet then calls back into Hive, and at that point a NoSuchMethodError (not a ClassNotFoundException) is thrown — which is odd, since the method is right there in the source code. Any ideas?

    Error: java.lang.reflect.InvocationTargetException
    at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.initNextRecordReader(
    at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.(
    at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileInputFormatShim.getRecordReader(
    at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.(
    at org.apache.hadoop.mapred.MapTask.runOldMapper(
    at org.apache.hadoop.mapred.YarnChild$
    at Method)
    at org.apache.hadoop.mapred.YarnChild.main(
    Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(
    at java.lang.reflect.Constructor.newInstance(
    at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.initNextRecordReader(
    … 11 more
    Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.hive.ql.plan.MapredWork.getPathToPartitionInfo()Ljava/util/LinkedHashMap;
    at parquet.hive.ManageJobConfig.init(
    at parquet.hive.ManageJobConfig.cloneJobAndInit(
    at parquet.hive.DeprecatedParquetInputFormat$RecordReaderWrapper.getSplit(
    at parquet.hive.DeprecatedParquetInputFormat$RecordReaderWrapper.(
    at parquet.hive.DeprecatedParquetInputFormat.getRecordReader(
    … 16 more
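    The key clue is the descriptor in the error: `getPathToPartitionInfo()Ljava/util/LinkedHashMap;`. The JVM links method calls by the full descriptor, including the return type, so this usually means the `parquet.hive` bindings were compiled against a Hive version whose `MapredWork.getPathToPartitionInfo()` declared a `LinkedHashMap` return type, while the Hive jar on the cluster classpath declares a different one. The method "exists in the source" but not with the exact signature the caller was compiled against. A minimal, self-contained sketch of the idea (class and return types here are hypothetical stand-ins, not Hive's actual code):

    ```java
    import java.lang.reflect.Method;
    import java.util.LinkedHashMap;
    import java.util.Map;

    public class DescriptorDemo {
        // Stand-in for the newer MapredWork: the getter now declares Map,
        // even though the object it returns is still a LinkedHashMap.
        static class MapredWorkNew {
            public Map<String, String> getPathToPartitionInfo() {
                return new LinkedHashMap<>();
            }
        }

        public static void main(String[] args) throws Exception {
            Method m = MapredWorkNew.class.getMethod("getPathToPartitionInfo");
            // The JVM resolves invocations by the full descriptor, return type
            // included. A caller compiled against
            //     ()Ljava/util/LinkedHashMap;
            // fails with NoSuchMethodError at runtime when the class on the
            // classpath declares
            //     ()Ljava/util/Map;
            System.out.println("runtime return type: " + m.getReturnType().getName());
        }
    }
    ```

    If this is the cause, the fix is to use a parquet-hive jar built against the same Hive version that is deployed on the cluster, rather than mixing versions.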

