Hive / HCatalog Forum

HBase integration with Hive: registering an HBase table in Hive

  • #50700
    Afaque KHAN

    I am using Sandbox 2.0
    I am trying to register my HBase table in Hive using the following query:

    CREATE TABLE IF NOT EXISTS Document_Table_Hive (key STRING, author STRING, category STRING)
    STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,metadata:author,categories:category')
    TBLPROPERTIES ('hbase.table.name' = 'Document');

    This does not work; I get the following exception:

    2014-03-26 09:14:57,341 ERROR exec.DDLTask ( - java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
    at org.apache.hadoop.hive.hbase.HBaseStorageHandler.setConf(
    at org.apache.hadoop.util.ReflectionUtils.setConf(
    at org.apache.hadoop.util.ReflectionUtils.newInstance(

    2014-03-26 09:14:57,368 ERROR ql.Driver ( - FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. org/apache/hadoop/hbase/HBaseConfiguration

    The HBase table "Document" already exists, and the describe command gives the following description:

    {NAME => 'categories', ...},
    {NAME => 'comments', ...},
    {NAME => 'metadata', ...}

    I have tried the following things:

    1) Set hive.aux.jars.path in hive-site.xml (a sketch of that property is shown after this list)


    2) Added the jars using the Hive add jar command:
    add jar /usr/lib/hbase/lib/hbase-common-;
    add jar /usr/lib/hive/lib/hive-hbase-handler-;
    add jar /usr/lib/hbase/lib/hbase-client-;
    add jar /usr/lib/zookeeper/zookeeper-;
    add file /etc/hbase/conf/hbase-site.xml

    3) As you have mentioned
    export HADOOP_CLASSPATH=/etc/hbase/conf:/usr/lib/hbase/lib/hbase-common-
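
    For item 1, the property I edited in hive-site.xml looks roughly like the sketch below; the file:// paths match the jars from item 2, and the X.Y.Z versions are placeholders rather than the exact versions on my sandbox:

    <!-- hive.aux.jars.path: comma-separated list of auxiliary jars made available to Hive;
         paths and X.Y.Z versions below are illustrative placeholders -->
    <property>
      <name>hive.aux.jars.path</name>
      <value>file:///usr/lib/hive/lib/hive-hbase-handler-X.Y.Z.jar,file:///usr/lib/hbase/lib/hbase-common-X.Y.Z.jar,file:///usr/lib/hbase/lib/hbase-client-X.Y.Z.jar,file:///usr/lib/zookeeper/zookeeper-X.Y.Z.jar</value>
    </property>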

    And it is still not working!!

    Could you please help me figure out how to add the jars to the Hive classpath so that it finds the HBaseConfiguration class?

    Or let me know if it is an entirely different issue.

    Thanks in advance


