
HBase Forum

HBASE java.lang.ClassNotFoundException: org.cloudera.htrace.Trace

  • #44834
    Rupert Bailey
    Participant

    Hello team,

    System: Sandbox 1.3 HBase

    I'm running the following code on Sandbox 1.3. I copied it from "Hadoop: The Definitive Guide" and added the import statements to get it to compile. To enable it to compile, I needed to add:
    /home/rupert/jars/hbase-client-0.99.0-SNAPSHOT.jar
    /home/rupert/jars/hbase-server-0.99.0-SNAPSHOT.jar
    /home/rupert/jars/hbase-common-0.99.0-SNAPSHOT.jar

    The classpath also needed to include the following jars (the first three I created, the last three I downloaded), which I added to -libjars as well.
    /home/rupert/jars/hbase-server-0.99.0-SNAPSHOT.jar
    /home/rupert/jars/hbase-client-0.99.0-SNAPSHOT.jar
    /home/rupert/jars/hbase-common-0.99.0-SNAPSHOT.jar
    /usr/lib/hbase/lib/protobuf-java-2.4.0a.jar
    /usr/lib/zookeeper/zookeeper.jar
    /home/rupert/jars/hbase-hadoop-compat-0.95.1-hadoop1-sources.jar
    /home/rupert/jars/hbase-protocol-0.95.0.jar
    /home/rupert/jars/htrace-1.44-sources.jar
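    For reference, a sketch of wiring those jars into a single classpath argument (the paths are the ones listed above, so adjust them for your own layout; the `javac`/`hadoop jar` lines at the end are illustrative, not something this snippet runs):

    ```shell
    # Join the jars listed above into one colon-separated classpath.
    # Paths are taken from this post; adjust for your environment.
    CP=""
    for j in \
      /home/rupert/jars/hbase-server-0.99.0-SNAPSHOT.jar \
      /home/rupert/jars/hbase-client-0.99.0-SNAPSHOT.jar \
      /home/rupert/jars/hbase-common-0.99.0-SNAPSHOT.jar \
      /usr/lib/hbase/lib/protobuf-java-2.4.0a.jar \
      /usr/lib/zookeeper/zookeeper.jar \
      /home/rupert/jars/hbase-hadoop-compat-0.95.1-hadoop1-sources.jar \
      /home/rupert/jars/hbase-protocol-0.95.0.jar \
      /home/rupert/jars/htrace-1.44-sources.jar
    do
      CP="${CP:+$CP:}$j"
    done
    echo "$CP"

    # Then compile and run against it, e.g.:
    #   javac -cp "$CP" ExampleClient.java
    #   hadoop jar ExampleClient.jar -libjars "$CP"
    ```

    One caveat worth noting: `*-sources.jar` files contain Java source, not compiled classes, so putting them on a runtime classpath will not resolve a ClassNotFoundException.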

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.HColumnDescriptor;
    import org.apache.hadoop.hbase.HTableDescriptor;
    import org.apache.hadoop.hbase.client.Get;
    import org.apache.hadoop.hbase.client.HBaseAdmin;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.client.ResultScanner;
    import org.apache.hadoop.hbase.client.Scan;
    import org.apache.hadoop.hbase.util.Bytes;

    public class ExampleClient {
      public static void main(String[] args) throws IOException {
        Configuration config = HBaseConfiguration.create();
        // Create table
        HBaseAdmin admin = new HBaseAdmin(config);

        @SuppressWarnings("deprecation")
        HTableDescriptor htd = new HTableDescriptor("test");
        HColumnDescriptor hcd = new HColumnDescriptor("data");
        htd.addFamily(hcd);
        admin.createTable(htd);
        byte[] tablename = htd.getName();
        HTableDescriptor[] tables = admin.listTables();
        // Fail unless the table we just created is the only one listed
        // (the book's "&&" version of this check can never trigger)
        if (tables.length != 1 || !Bytes.equals(tablename, tables[0].getName())) {
          admin.close();
          throw new IOException("Failed create of table");
        }
        // Run some operations (a put, a get, and a scan) against the table.
        HTable table = new HTable(config, tablename);
        byte[] row1 = Bytes.toBytes("row1");
        Put p1 = new Put(row1);
        byte[] databytes = Bytes.toBytes("data");
        p1.add(databytes, Bytes.toBytes("1"), Bytes.toBytes("value1"));
        table.put(p1);
        Get g = new Get(row1);
        Result result = table.get(g);
        System.out.println("Get: " + result);
        Scan scan = new Scan();
        ResultScanner scanner = table.getScanner(scan);
        try {
          for (Result scannerResult : scanner) {
            System.out.println("Scan: " + scannerResult);
          }
        } finally {
          scanner.close();
        }
        // Drop the table
        table.close();
        admin.disableTable(tablename);
        admin.deleteTable(tablename);
        admin.close();
      }
    }

  • #44851
    Enis Soztutar
    Moderator

    Hi,

    It seems that you are trying to compile your code against HBase trunk (there is no 0.99 release), but link it against HDP-1.3. HDP-1.3 comes with HBase 0.94 (http://hortonworks.com/products/hdp-1-3/), which is not binary compatible with trunk or the 0.96 line. You can either use HDP-2.0 or build your application against HBase 0.94.
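    One quick way to see which build of a class actually wins on the classpath (useful for diagnosing mismatches like the 0.99-vs-0.94 mix above) is to ask the class where it was loaded from. A minimal sketch, shown with a JDK class since the HBase jars may not be on your compile path; the `WhichJar` name is made up here, and you would substitute `org.apache.hadoop.hbase.HBaseConfiguration.class` once the jars are available:

    ```java
    // Sketch: report the jar (CodeSource) a class was loaded from.
    // Classes from the bootstrap classpath report no CodeSource.
    public class WhichJar {
        static String locate(Class<?> c) {
            java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
            return (src == null) ? "(bootstrap classpath)" : src.getLocation().toString();
        }

        public static void main(String[] args) {
            // With the HBase jars on the classpath, try e.g.:
            //   locate(org.apache.hadoop.hbase.HBaseConfiguration.class)
            System.out.println(String.class.getName() + " -> " + locate(String.class));
        }
    }
    ```

    If two versions of the same jar are on the classpath, this shows which one the JVM actually resolved.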

    #45060
    Rupert Bailey
    Participant

    Thank you. Interestingly, the `hadoop classpath` output does not include /usr/lib/hbase/hbase-0.94.6.1.3.0.0-107-security.jar.

    Adding that jar to the classpath allows compilation, and running the following command got me past that error:
    export HADOOP_CLASSPATH="./:/usr/lib/hbase/hbase-0.94.6.1.3.0.0-107-security.jar:`hadoop classpath`"

    but I hit another snag:
    hadoop jar ExampleClient.jar
    Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/zookeeper/KeeperException
    at org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:185)
    at org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:116)
    at ExampleClient.main(ExampleClient.java:19)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
    Caused by: java.lang.ClassNotFoundException: org.apache.zookeeper.KeeperException
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
    ... 8 more
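    The missing org.apache.zookeeper.KeeperException suggests zookeeper.jar also has to be on the runtime classpath, the same way the HBase jar was added. A sketch, assuming the paths used earlier in this thread (`hadoop classpath` is appended when the command is available):

    ```shell
    # Extend HADOOP_CLASSPATH with the ZooKeeper jar as well.
    # Paths assumed from earlier in the thread; adjust for your install.
    export HADOOP_CLASSPATH="./:/usr/lib/hbase/hbase-0.94.6.1.3.0.0-107-security.jar:/usr/lib/zookeeper/zookeeper.jar:$(hadoop classpath 2>/dev/null || true)"
    echo "$HADOOP_CLASSPATH"
    # then rerun:  hadoop jar ExampleClient.jar
    ```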
