HBase importtsv command not working



This topic contains 6 replies, has 4 voices, and was last updated by  Brian Brownlow 1 year, 8 months ago.

  • Creator
  • #42736

    Danish Alam

    The same command that was working in HDP 1.3 is not working in HDP 2.0.
    Here are the steps that were followed:

    1. create 'hbase_test', 'd' >> where hbase_test is the table name and d is the column family.
    2. Created a tab-separated data file - hbase_test.txt
    3. Copied it to HDFS.
    4. export JAVA_HOME=/usr/java/jdk1.7.0_21
       export HBASE_HOME=/usr/lib/hbase
       export HADOOP_HOME=/usr/lib/hadoop
    5. HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath`:/usr/lib/hbase/lib/zookeeper.jar:/usr/lib/hbase/lib/protobuf-java-2.4.jar ${HADOOP_HOME}/bin/hadoop jar ${HBASE_HOME}/hbase- importtsv -Dimporttsv.columns=HBASE_ROW_KEY,d:c1,d:c2 hbase_test hdfs://IP:8020/user/danish/hbase_test.txt

    where c1 and c2 are the column names; the first column of the file will always be the HBASE_ROW_KEY, and hbase_test is the table created in HBase. This throws an error because there is no security.jar in the new HBase version. Could anybody provide the details of the mapped jar for the new version? Thanks in advance.
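    For what it's worth, the tab-separated input from step 2 can be sanity-checked before the load. A minimal sketch (the row keys and values here are made up, not from the original file):

    ```shell
    # Build a small tab-separated input file for importtsv (sample values
    # are made up). The first field maps to HBASE_ROW_KEY, the rest to
    # d:c1 and d:c2.
    printf 'row1\tvalue1a\tvalue1b\n'  > hbase_test.txt
    printf 'row2\tvalue2a\tvalue2b\n' >> hbase_test.txt

    # Every line must have exactly 3 tab-separated fields; malformed
    # lines are counted as bad lines by ImportTsv instead of loaded.
    awk -F'\t' 'NF != 3 { bad++ } END { exit bad ? 1 : 0 }' hbase_test.txt \
      && echo "OK: every line has 3 fields"
    ```

    Checking the field count up front rules out one common cause of silently empty tables (spaces instead of real tabs in the input).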

Viewing 6 replies - 1 through 6 (of 6 total)


  • Author
  • #43661

    Brian Brownlow

    Is there a better way to export from 1.3 and import to 2.0 hbase?


    Brian Brownlow

    Did changing the jar work? What was the new jar? I am also on HDP 2.


    Jianyong Dai

    Not sure what happened on the Pig side. Can you post the Pig log with the complete stack trace?


    Nick Dimiduk

    Can you paste the exception you received when you tried `hbase importtsv …` ? Let’s stick to debugging a single issue at a time 😉


    Danish Alam

    Hi Nick, thanks for your reply. I tried it, and it is throwing "java.lang.NoClassDefFoundError".

    I even tried with HBaseStorage, as follows:

    1. HBase: create 'sample_names', 'info'
    2. Sample File:

    1, John, Smith
    2, Jane, Doe
    3, George, Washington
    4, Ben, Franklin

    3. copied into hdfs.
    4. Pig script to load data. pig_hbase.pig

    raw_data = LOAD 'sample_data.csv' USING PigStorage(',') AS (
    listing_id: chararray,
    fname: chararray,
    lname: chararray );

    STORE raw_data INTO 'hbase://sample_names' USING
    org.apache.pig.backend.hadoop.hbase.HBaseStorage(
    'info:fname info:lname');

    This throws the error: ERROR 2998: Unhandled internal error. (class: org/apache/pig/backend/hadoop/hbase/HBaseStorage, method: addRowFilter signature: (Lorg/apache/hadoop/hbase/filter/CompareFilter$CompareOp;[B)V) Incompatible argument to function.
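    An "Incompatible argument to function" error like this usually means Pig linked against a different HBase version than the one running on the cluster. One hedged workaround sketch (the paths are assumptions for an HDP-style layout under /usr/lib) is to let the cluster's own hbase script supply Pig's classpath:

    ```shell
    # Sketch, assuming an HDP-style layout: have Pig pick up the
    # cluster's own HBase jars so HBaseStorage matches the running
    # HBase version, instead of an older bundled hbase jar.
    export HBASE_HOME=/usr/lib/hbase
    export PIG_CLASSPATH="$("${HBASE_HOME}/bin/hbase" classpath)"

    # Then run the script as before:
    pig pig_hbase.pig
    ```

    This is environment configuration, not a guaranteed fix; if the error persists, checking which hbase jar actually ends up on Pig's classpath is the next step.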

    PS: I want to load data into HBase through any of:
    1. importtsv
    2. Hive to HBase
    3. Pig to HBase
    4. completebulkload

    but none of these work in HDP 2.0, even though all of them worked on HDP 1.3. Could I get some guidance so that I can load data into HBase?
    Please suggest.


    Nick Dimiduk

    The hbase jar name has changed in HDP-2.0 (it’s no longer hbase-0.94). Have you tried letting the hbase script build classpath for you? Ie, just `hbase importtsv …`
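    Concretely, Nick's suggestion might look like the following sketch. The table name, column mapping, and HDFS path are copied from the original post; the fully qualified class name is the one HBase's MapReduce tools ship under, and the `hbase` wrapper assembles the classpath itself:

    ```shell
    # Sketch: let the `hbase` wrapper script build the classpath instead
    # of exporting HADOOP_CLASSPATH by hand. Run on a cluster node where
    # `hbase` is on the PATH.
    TABLE=hbase_test
    COLUMNS=HBASE_ROW_KEY,d:c1,d:c2
    INPUT=hdfs://IP:8020/user/danish/hbase_test.txt

    # Review the command first, then re-run it without the leading echo.
    echo hbase org.apache.hadoop.hbase.mapreduce.ImportTsv \
        "-Dimporttsv.columns=${COLUMNS}" "${TABLE}" "${INPUT}"
    ```

    Because the wrapper resolves the renamed jars for you, this avoids hard-coding any hbase jar name at all.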
