Hive / HCatalog Forum

HBase Hive integration

  • #52305
    Nitin Gupta

I am trying to create an external HBase table using Hive.

Here is my Hive script:
    ADD jar /usr/lib/hive/lib/hive-hbase-handler-;

    ADD jar /usr/lib/hbase/lib/hbase-client-;
    ADD jar /usr/lib/hbase/lib/hbase-common-;
    ADD jar /usr/lib/hbase/lib/hbase-protocol-;
    ADD jar /usr/lib/hbase/lib/hbase-server-;
    ADD jar /usr/lib/hbase/lib/htrace-core-2.01.jar;
    ADD jar /usr/lib/hcatalog/share/hcatalog/hcatalog-core.jar;
    ADD jar /usr/lib/zookeeper/zookeeper.jar;
    ADD jar /usr/lib/hive/lib/guava-12.0.1.jar;

    SET hive.zookeeper.client.port=2181;

    DROP table gw_summary_hbase;

    CREATE EXTERNAL TABLE gw_summary_hbase (
      metric_ts timestamp,
      country string,
      bids BIGINT,
      calls BIGINT,
      mou BIGINT,
      seizures BIGINT
    )
    COMMENT 'GW Calls – in hbase'
    STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,summary:timestamp,summary:country,summary:bids,summary:calls,summary:mou,summary:seizures");
    But it is giving the exception: "FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. org/apache/hadoop/hbase/HBaseConfiguration"

    What may be wrong here?
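    One thing worth checking: a `DDLTask` failure naming `org/apache/hadoop/hbase/HBaseConfiguration` typically means the HBase client jars never made it onto Hive's classpath. A sketch of one way to rule that out, passing the jars explicitly via `--auxpath` when launching the CLI (the glob patterns and the `gw_summary_hbase.hql` filename below are assumptions; substitute the jar versions actually installed under /usr/lib/hbase/lib and /usr/lib/hive/lib):

    ```shell
    # Collect the HBase jars the storage handler depends on.
    # The version globs are placeholders for whatever is installed locally.
    HBASE_JARS=$(ls /usr/lib/hbase/lib/hbase-client-*.jar \
                    /usr/lib/hbase/lib/hbase-common-*.jar \
                    /usr/lib/hbase/lib/hbase-protocol-*.jar \
                    /usr/lib/hbase/lib/hbase-server-*.jar \
                    /usr/lib/hbase/lib/htrace-core-*.jar | tr '\n' ',')
    HANDLER_JAR=$(ls /usr/lib/hive/lib/hive-hbase-handler-*.jar)

    # --auxpath puts the jars on Hive's classpath before the session starts,
    # so the DDL can load org.apache.hadoop.hbase.HBaseConfiguration.
    hive --auxpath "${HANDLER_JAR},${HBASE_JARS%,}" -f gw_summary_hbase.hql
    ```

    Unlike `ADD JAR` statements inside the script (which silently fail if a path is wrong, e.g. a truncated jar name), `--auxpath` makes missing jars visible at launch.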

