
HBase Forum

HBase Importtsv command not working

  • #42736
    Danish Alam

    The same command that worked in HDP 1.3 is not working in HDP 2.0.
    Here are the steps I followed:

    1. create table 'hbase_test', 'd' >> where hbase_test is the table name and d is the column family.
    2. Created a tab-separated data file – hbase_test.txt
    3. Copied it to HDFS.
    4. export JAVA_HOME=/usr/java/jdk1.7.0_21
       export HBASE_HOME=/usr/lib/hbase
       export HADOOP_HOME=/usr/lib/hadoop
    5. HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath`:/usr/lib/hbase/lib/zookeeper.jar:/usr/lib/hbase/lib/protobuf-java-2.4.jar ${HADOOP_HOME}/bin/hadoop jar ${HBASE_HOME}/hbase- importtsv -Dimporttsv.columns=HBASE_ROW_KEY,d:c1,d:c2 hbase_test hdfs://IP:8020/user/danish/hbase_test.txt

    where c1 and c2 are the column names; the first column of the file is always the HBASE_ROW_KEY, and hbase_test is the table created in HBase. This throws an error because there is no security.jar in the new HBase version. Could anybody tell me the corresponding jar for the new version? Thanks in advance.
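    Steps 2 and 3 above (building the tab-separated file and copying it to HDFS) can be sketched roughly as follows; the row values and HDFS path are examples, not the poster's actual data:

    ```shell
    # Build a small tab-separated sample file (step 2); values are examples.
    printf '1\tfoo\tbar\n2\tbaz\tqux\n' > hbase_test.txt

    # Copy it to HDFS (step 3); this needs a running cluster, so it is
    # commented out here. The path matches the importtsv command above.
    # hdfs dfs -put hbase_test.txt /user/danish/hbase_test.txt
    ```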

  • #42824
    Nick Dimiduk

    The hbase jar name has changed in HDP-2.0 (it's no longer hbase-0.94). Have you tried letting the hbase script build the classpath for you? I.e., just `hbase importtsv …`
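    A minimal sketch of that approach, reusing the table and column names from the post above (the tool class is the standard ImportTsv driver; exact behavior depends on the HDP 2.0 install):

    ```shell
    # Let the hbase launcher script compute the classpath itself instead of
    # hard-coding jar names that change between HDP releases.
    export HADOOP_CLASSPATH=$(hbase classpath)

    # Run the ImportTsv tool via the hbase script; no jar path is needed.
    hbase org.apache.hadoop.hbase.mapreduce.ImportTsv \
      -Dimporttsv.columns=HBASE_ROW_KEY,d:c1,d:c2 \
      hbase_test hdfs://IP:8020/user/danish/hbase_test.txt
    ```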

    Danish Alam

    Hi Nick, thanks for your reply. I tried it, and it throws “java.lang.NoClassDefFoundError”.

    I also tried with HBaseStorage, as follows:

    1. HBase: create 'sample_names', 'info'
    2. Sample File:

    1, John, Smith
    2, Jane, Doe
    3, George, Washington
    4, Ben, Franklin

    3. Copied it into HDFS.
    4. Pig script to load the data: pig_hbase.pig

    raw_data = LOAD 'sample_data.csv' USING PigStorage(',') AS (
        listing_id: chararray,
        fname: chararray,
        lname: chararray );

    STORE raw_data INTO 'hbase://sample_names' USING
        org.apache.pig.backend.hadoop.hbase.HBaseStorage(
        'info:fname info:lname');

    This throws: ERROR 2998: Unhandled internal error. (class: org/apache/pig/backend/hadoop/hbase/HBaseStorage, method: addRowFilter signature: (Lorg/apache/hadoop/hbase/filter/CompareFilter$CompareOp;[B)V) Incompatible argument to function.
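    A signature error like “Incompatible argument to function” usually means Pig was compiled against a different HBase API than the jars it finds at runtime. A hedged workaround sketch (PIG_CLASSPATH is Pig's standard environment variable; whether this resolves the mismatch depends on the versions installed):

    ```shell
    # Put the cluster's actual HBase jars on Pig's classpath so that
    # HBaseStorage binds against the same HBase version at runtime.
    export PIG_CLASSPATH=$(hbase classpath)

    # Re-run the script from step 4.
    pig -f pig_hbase.pig
    ```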

    PS: I want to load data into HBase through one of:
    1. importtsv
    2. Hive to HBase
    3. Pig to HBase
    4. completebulkload

    but none of these has worked on HDP 2.0, even though they all worked on HDP 1.3. Could I get some guidance so that I can load data into HBase?
    Please suggest.

    Nick Dimiduk

    Can you paste the exception you received when you tried `hbase importtsv …`? Let’s stick to debugging a single issue at a time 😉

    Jianyong Dai

    Not sure what is happening on the Pig side. Can you post the Pig log with the complete stack trace?

    Brian Brownlow

    Did changing the jar work? What was the new jar? I am also on HDP 2.

    Brian Brownlow

    Is there a better way to export from 1.3 and import to 2.0 hbase?

The forum ‘HBase’ is closed to new topics and replies.
