HBase Forum

Hive External Table Pointing to HBASE

  • #45119
    Zin Zin

    I have an HBase table called

    Client_Txn with column families
    Txn (Txn_no)
    Dt (Txn_dt)
    Txn_Type (Type)
    Cf1 (val1, val2)
    Cf2 (val3, val4)

    Can I create a Hive table pointing to this structure?

    I basically want to create a Hive table that refers to this external HBase table and bring in only

    Cf1_Val (from Cf1:val1)
    Cf2_Val (from Cf2:val3)

    Is it possible?


  • #45125
    Nick Dimiduk

    Yes, you can select which HBase column qualifiers are made available to Hive, and under what names. This is the hbase.columns.mapping SerDe property. See http://hortonworks.com/blog/using-hive-to-interact-with-hbase-part-2/ for a working (less complex) example.
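    For the table described in the question, a minimal sketch of such a DDL might look like the following (table, family, and qualifier names are taken from the question; the Hive column types are assumptions, adjust as needed):

    ```sql
    -- Hedged sketch: expose only Cf1:val1 and Cf2:val3 as Hive columns.
    -- Assumes the Hive HBase handler jars are already on the classpath.
    CREATE EXTERNAL TABLE client_txn_hive (
      rowkey  STRING,
      cf1_val STRING,  -- mapped from Cf1:val1
      cf2_val STRING   -- mapped from Cf2:val3
    )
    STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    WITH SERDEPROPERTIES (
      "hbase.columns.mapping" = ":key,Cf1:val1,Cf2:val3"
    )
    TBLPROPERTIES ("hbase.table.name" = "Client_Txn");
    ```

    Qualifiers not listed in the mapping (val2, val4, and the other families) are simply not visible through the Hive table.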

    Zin Zin

    Thanks. This is what I tried:

    create external table hive_txn (
    id int,
    pol_no string,
    pol_amt int,
    bene_amt int)
    stored by 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    I'm getting the following error:

    FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. org/apache/hadoop/hbase/HBaseConfiguration

    Where could I be wrong?
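    For context, an error mentioning org/apache/hadoop/hbase/HBaseConfiguration is a missing-class failure: the HBase client jars and the Hive HBase handler jar are not on Hive's classpath when the DDL runs. One hedged sketch of a workaround is to add them in the session before the CREATE statement (the paths and jar names below are illustrative and vary by install and version):

    ```sql
    -- Hedged sketch: put the HBase/handler jars on the session classpath.
    -- Adjust paths and versions to match your installation.
    add jar /usr/lib/hive/lib/hive-hbase-handler.jar;
    add jar /usr/lib/hbase/hbase-client.jar;
    add jar /usr/lib/hbase/hbase-common.jar;
    add jar /usr/lib/zookeeper/zookeeper.jar;
    ```

    The auxlib approach described below achieves the same thing permanently, without per-session "add jar" statements.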


    Hi Paulie,

    Which HBase/Hive jars were added to the Hive session, and which set values were used? Here are some steps to help resolve the issue:

    In the HQL, do the following:
    -- Change this to your ZooKeeper servers
    set hbase.zookeeper.quorum=ZKHOST1,ZKHOST2,ZKHOST3;
    set hive.zookeeper.client.port=2181;
    -- The following may vary
    set zookeeper.znode.parent=/hbase-unsecure;

    To make all the jars available without using the “add jar” operator, follow these steps (change the jar versions to match your install):
    – Create a directory called auxlib:
    # mkdir /usr/lib/hive/auxlib
    – Copy all of the following jars into that directory:
    /usr/lib/zookeeper/zookeeper- ;

    Hope this helps


