HBase Forum

HBASE java.lang.ClassNotFoundException: org.cloudera.htrace.Trace

  • #44834
    Rupert Bailey

    Hello team,

    System: Sandbox 1.3 HBase

    I'm running the following code on Sandbox 1.3, copied from "Hadoop: The Definitive Guide". To get it to compile, I needed to add the import statements shown below.

    The classpath also needed to include a set of jars, of which the first three I created and the last three I downloaded; I added them to -libjars as well. (A compile-and-run sketch follows the code listing.)

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.HColumnDescriptor;
    import org.apache.hadoop.hbase.HTableDescriptor;
    import org.apache.hadoop.hbase.client.Get;
    import org.apache.hadoop.hbase.client.HBaseAdmin;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.client.ResultScanner;
    import org.apache.hadoop.hbase.client.Scan;
    import org.apache.hadoop.hbase.util.Bytes;

    public class ExampleClient {
        public static void main(String[] args) throws IOException {
            Configuration config = HBaseConfiguration.create();

            // Create the table with a single column family.
            HBaseAdmin admin = new HBaseAdmin(config);
            HTableDescriptor htd = new HTableDescriptor("test");
            HColumnDescriptor hcd = new HColumnDescriptor("data");
            htd.addFamily(hcd);
            admin.createTable(htd);
            byte[] tablename = htd.getName();
            HTableDescriptor[] tables = admin.listTables();
            if (tables.length != 1 && Bytes.equals(tablename, tables[0].getName())) {
                throw new IOException("Failed create of table");
            }

            // Run some operations -- a put, a get, and a scan -- against the table.
            HTable table = new HTable(config, tablename);
            byte[] row1 = Bytes.toBytes("row1");
            Put p1 = new Put(row1);
            byte[] databytes = Bytes.toBytes("data");
            p1.add(databytes, Bytes.toBytes("1"), Bytes.toBytes("value1"));
            table.put(p1);

            Get g = new Get(row1);
            Result result = table.get(g);
            System.out.println("Get: " + result);

            Scan scan = new Scan();
            ResultScanner scanner = table.getScanner(scan);
            try {
                for (Result scannerResult : scanner) {
                    System.out.println("Scan: " + scannerResult);
                }
            } finally {
                scanner.close();
            }

            // Drop the table.
            admin.disableTable(tablename);
            admin.deleteTable(tablename);
        }
    }

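    For reference, a minimal compile-and-run sketch along the lines described above, assuming a stock HDP-1.3 Sandbox layout; the jar paths are assumptions, not the exact set used in this post:

        # Locate the HBase client jar shipped with the Sandbox (path is an assumption).
        HBASE_JAR=$(ls /usr/lib/hbase/hbase-*.jar | head -1)

        # Compile against the Hadoop classpath plus that jar, then package the class.
        javac -cp "$(hadoop classpath):$HBASE_JAR" ExampleClient.java
        jar cf ExampleClient.jar ExampleClient*.class

        # Make the same jars visible at run time and launch the client.
        export HADOOP_CLASSPATH="./:$HBASE_JAR:$(hadoop classpath)"
        hadoop jar ExampleClient.jar ExampleClient

    Note that -libjars is only parsed when the driver goes through ToolRunner/GenericOptionsParser; a plain main() like this one has to rely on HADOOP_CLASSPATH instead.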

  • #44851
    Enis Soztutar


    It seems that you are trying to compile your code against HBase trunk (there is no 0.99 version) but link it with HDP-1.3. HDP-1.3 ships HBase 0.94 (http://hortonworks.com/products/hdp-1-3/), which is not binary compatible with trunk or the 0.96 line. You can either use HDP-2.0 or build your application against HBase 0.94.
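    If it helps to confirm the mismatch, the installed version can be checked directly on the Sandbox; the jar path below is an assumption about the HDP-1.3 layout:

        # Print the HBase version the Sandbox actually runs (expected: 0.94.x on HDP-1.3).
        hbase version

        # List the client jars that ship with the install; compile and link against these.
        ls /usr/lib/hbase/hbase-*.jar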

    Rupert Bailey

    Thank you. Interestingly, the `hadoop classpath` output does not include /usr/lib/hbase/hbase-

    Including that on the classpath allowed compilation, and running with the following got me past that error:
    export HADOOP_CLASSPATH="./:/usr/lib/hbase/hbase-`hadoop classpath`";

    but I hit another snag:
    hadoop jar ExampleClient.jar
    Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/zookeeper/KeeperException
    at org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:185)
    at org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:116)
    at ExampleClient.main(ExampleClient.java:19)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
    Caused by: java.lang.ClassNotFoundException: org.apache.zookeeper.KeeperException
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
    ... 8 more
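    org.apache.zookeeper.KeeperException lives in the ZooKeeper client jar, so that jar needs the same classpath treatment as the HBase one. A sketch, assuming the usual HDP-1.3 locations (adjust the paths to whatever ls actually shows):

        # Pick up the ZooKeeper and HBase jars shipped with the install (paths are assumptions).
        ZK_JAR=$(ls /usr/lib/zookeeper/zookeeper-*.jar | head -1)
        HBASE_JAR=$(ls /usr/lib/hbase/hbase-*.jar | head -1)

        # Put both on the runtime classpath and re-run the client.
        export HADOOP_CLASSPATH="./:$HBASE_JAR:$ZK_JAR:$(hadoop classpath)"
        hadoop jar ExampleClient.jar ExampleClient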

