HBase – Sandbox 1.3: Create HBase table from Beeswax?

This topic contains 16 replies, has 3 voices, and was last updated by abdelrahman 8 months ago.

  • Creator
    Topic
  • #29592

    Brian Brownlow
    Participant

    I am on Sandbox 1.3. Through Hue, can Beeswax be used to create and load an HBase table? I have tried:

    CREATE TABLE my_test(key int, value string)
    STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,cf1:val')
    TBLPROPERTIES ('hbase.table.name' = 'xyz');

    FAILED: Execution Error, return code -101 from org.apache.hadoop.hive.ql.exec.DDLTask


  • Author
    Replies
  • #32967

    abdelrahman
    Moderator

    Hi Brian,

    Please add the needed jar files to Hive, restart the VM, and use the Hive shell to perform the POC (a consolidated sketch of the whole sequence follows below):

    - Create a directory called auxlib:
      # mkdir /usr/lib/hive/auxlib
    - Copy all of the following jars into that directory:
      /usr/lib/hive/lib/hive-hbase-handler-0.11.0.1.3.0.0-107.jar
      /usr/lib/zookeeper/zookeeper-3.4.5.1.3.0.0-107.jar
      /usr/lib/hbase/hbase-0.94.6.1.3.0.0-107-security.jar
      /usr/lib/hbase/lib/guava-11.0.2.jar

    Also, add the proper HBase configuration in the Hive session:

    set hbase.zookeeper.quorum=zk1,zk2,zk3;
    set hive.zookeeper.client.port=2181;
    set zookeeper.znode.parent=/hbase-unsecure;
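
    Putting those steps together, it looks roughly like this. The jar versions are the Sandbox 1.3 ones listed above; replace zk1,zk2,zk3 with your actual ZooKeeper quorum, and adjust paths if your layout differs.

    # Stage the extra jars where Hive can pick them up
    mkdir -p /usr/lib/hive/auxlib
    cp /usr/lib/hive/lib/hive-hbase-handler-0.11.0.1.3.0.0-107.jar \
       /usr/lib/zookeeper/zookeeper-3.4.5.1.3.0.0-107.jar \
       /usr/lib/hbase/hbase-0.94.6.1.3.0.0-107-security.jar \
       /usr/lib/hbase/lib/guava-11.0.2.jar \
       /usr/lib/hive/auxlib/
    # Restart the VM (or at least Hive) so the auxlib jars are picked up, then in the Hive shell:
    hive> set hbase.zookeeper.quorum=zk1,zk2,zk3;
    hive> set hive.zookeeper.client.port=2181;
    hive> set zookeeper.znode.parent=/hbase-unsecure;
    hive> CREATE TABLE my_test(key int, value string)
        > STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
        > WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,cf1:val')
        > TBLPROPERTIES ('hbase.table.name' = 'xyz');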

    Thanks
    -Abdelrahman

    #30051

    Brian Brownlow
    Participant

    Ted,

    Could you do it in the bash shell in Hue? I have been using the HBase, Hive, and bash shells. The only thing that seems a little goofy to me is that grunt is different from what you get if you start /usr/bin/hive… Why was grunt chosen over /usr/bin/hive, or are they the same?

    Thank you.

    #30050

    tedr
    Moderator

    Hi Brian,

    Yes, that's what I am on. Note that locating the jars and then making the links was not done in Hue; it was done by ssh'ing into the sandbox and nosing around from there.

    Thanks,
    Ted.

    #30040

    Brian Brownlow
    Participant

    Ted, are you on the HDP 1.3 Sandbox?

    #29991

    tedr
    Moderator

    Hi Brian,

    I found the jars that were needed by making a shrewd guess based on the path shown in the exception and then verifying it with 'jar tf | grep'. I did not have to write a Java program to create the links; I just used the Linux command 'ln -s'. NOTE: all of this hunting and linking was done by logging into the sandbox's shell via ssh. The specific jar I can remember now was the HBase jar found in the /usr/lib/hbase directory.
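
    In case it helps anyone else, the kind of search I mean looks roughly like this; the class name here is the one from the stack trace Brian posted, and the paths are the stock sandbox ones:

    # Find which jar contains the missing class, e.g. org.apache.hadoop.hbase.util.Bytes
    for j in /usr/lib/hbase/*.jar /usr/lib/hbase/lib/*.jar; do
        jar tf "$j" 2>/dev/null | grep -q 'org/apache/hadoop/hbase/util/Bytes.class' && echo "$j"
    done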

    Thanks,
    Ted.

    #29990

    Brian Brownlow
    Participant

    Ted,

    Thank you. How did you find the jar? What was it? Did you have to write a Java program to add the link and run the Pig script? Did you do something that adds the link each time you start grunt? Did you use grunt out of Hue? Are you on Sandbox 1.3?

    Thank you.

    #29881

    tedr
    Moderator

    Hi Brian,

    I did find the jars I needed to add to the classpath to get rid of the exception I was seeing, and I was able to create the table from the command line. The table showed up in the Sandbox GUI. I have not yet tested whether I can add rows via Pig; I will get back to you when I know the results. By the way, the method I used to get the jars onto the Hive classpath is rather easy: once you find out which jar contains the class, just add a symlink to that jar in the /usr/lib/hive/lib directory (quick sketch below).
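
    The symlink step looks roughly like this; the jar name is the Sandbox 1.3 one mentioned elsewhere in this thread, so link whichever jars your own errors point at:

    # Make the HBase jar visible on Hive's classpath without copying it
    ln -s /usr/lib/hbase/hbase-0.94.6.1.3.0.0-107-security.jar /usr/lib/hive/lib/
    # Repeat for any other jar whose classes show up as ClassNotFound,
    # then restart the Hive shell so it picks up the new classpath.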

    Thanks,
    Ted.

    #29727

    Brian Brownlow
    Participant

    I can create a table in HBase via the command line and get the table to show up in HCat, but I cannot add any rows from the command line via a Pig script or from Pig within Hue (roughly what I am attempting is sketched below). I've been up and down the config and stared at the error logs. I think you are correct in your diagnosis. I wish my Java engineering skills were better. Thanks for the help.
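
    For reference, the sort of thing I have been attempting from the sandbox shell looks roughly like this; the table 'xyz' and column family 'cf1' come from the DDL at the top of the thread, and the input path and jar list are placeholders:

    # Put the HBase-side jars on Pig's classpath, then store rows via HBaseStorage
    export PIG_CLASSPATH=/usr/lib/hbase/hbase-0.94.6.1.3.0.0-107-security.jar:/usr/lib/hbase/lib/guava-11.0.2.jar:/usr/lib/zookeeper/zookeeper-3.4.5.1.3.0.0-107.jar
    pig <<'EOF'
    rows = LOAD '/user/hue/my_rows.tsv' AS (key:int, val:chararray);
    STORE rows INTO 'hbase://xyz'
        USING org.apache.pig.backend.hadoop.hbase.HBaseStorage('cf1:val');
    EOF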

    #29726

    Brian Brownlow
    Participant

    Thanks, Ted. I have been trying to get entries into HBase all day. Did you get something like this?

    Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.util.Bytes
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)

    #29719

    tedr
    Moderator

    Hi Brian,

    In trying to find a solution for your issue, which I have duplicated, I am having trouble running that same command on the Hive command line, though there I am getting a NoClassDefFound error for MasterNotRunningException. I am trying to locate which jar that class is in so I can make sure it is on the classpath for Hive and then try again.

    Thanks,
    Ted.

    #29674

    Brian Brownlow
    Participant

    I can create tables from the HBase command line in the sandbox, but they are not visible via any interface in Hue that I am aware of (example below).
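
    For example, roughly what I am running from the sandbox's ssh shell; the table and column family names simply mirror the Hive DDL at the top of the thread:

    # Create a table in the HBase shell, then list tables to confirm it exists
    echo "create 'xyz', 'cf1'" | hbase shell
    echo "list" | hbase shell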

    #29673

    Brian Brownlow
    Participant

    Hive was running. I was doing other things with Beeswax in Hue using tables in HCatalog.

    #29672

    tedr
    Moderator

    Hi Brian,

    Was Hive running when you ran this query?

    Ted.

    #29666

    Brian Brownlow
    Participant

    13/07/18 06:48:05 WARN conf.HiveConf: DEPRECATED: Configuration property hive.metastore.local no longer has any effect. Make sure to provide a valid value for hive.metastore.uris if you are connecting to a remote metastore.
    13/07/18 06:48:05 WARN conf.HiveConf: DEPRECATED: Configuration property hive.metastore.local no longer has any effect. Make sure to provide a valid value for hive.metastore.uris if you are connecting to a remote metastore.
    13/07/18 06:48:05 WARN conf.HiveConf: DEPRECATED: Configuration property hive.metastore.local no longer has any effect. Make sure to provide a valid value for hive.metastore.uris if you are connecting to a remote metastore.
    13/07/18 06:48:05 INFO ql.Driver:
    13/07/18 06:48:05 INFO ql.Driver:
    13/07/18 06:48:05 INFO ql.Driver:
    13/07/18 06:48:05 INFO parse.ParseDriver: Parsing command: use default
    13/07/18 06:48:05 INFO parse.ParseDriver: Parse Completed
    13/07/18 06:48:05 INFO ql.Driver: Semantic Analysis Completed
    13/07/18 06:48:05 INFO ql.Driver: Returning Hive schema: Schema(fieldSchemas:null, properties:null)
    13/07/18 06:48:05 INFO ql.Driver:
    13/07/18 06:48:05 INFO ql.Driver:
    13/07/18 06:48:05 INFO ql.Driver: Starting command: use default
    13/07/18 06:48:05 INFO ql.Driver:
    13/07/18 06:48:05 INFO ql.Driver:
    OK
    13/07/18 06:48:05 INFO ql.Driver: OK
    13/07/18 06:48:05 INFO ql.Driver:
    13/07/18 06:48:05 INFO ql.Driver:
    13/07/18 06:48:05 INFO ql.Driver:
    13/07/18 06:48:05 INFO ql.Driver:
    13/07/18 06:48:05 INFO parse.ParseDriver: Parsing command: CREATE TABLE my_test(key int, value string)
    STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,cf1:val')
    TBLPROPERTIES ('hbase.table.name' = 'xyz')
    13/07/18 06:48:05 INFO parse.ParseDriver: Parse Completed
    13/07/18 06:48:05 INFO parse.SemanticAnalyzer: Starting Semantic Analysis
    13/07/18 06:48:05 INFO parse.SemanticAnalyzer: Creating table my_test position=13
    13/07/18 06:48:05 INFO ql.Driver: Semantic Analysis Completed
    13/07/18 06:48:05 INFO ql.Driver: Returning Hive schema: Schema(fieldSchemas:null, properties:null)
    13/07/18 06:48:05 INFO ql.Driver:
    13/07/18 06:48:05 WARN conf.HiveConf: DEPRECATED: Configuration property hive.metastore.local no longer has any effect. Make sure to provide a valid value for hive.metastore.uris if you are connecting to a remote metastore.
    13/07/18 06:48:05 INFO ql.Driver:
    13/07/18 06:48:05 INFO ql.Driver: Starting command: CREATE TABLE my_test(key int, value string)
    STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,cf1:val')
    TBLPROPERTIES ('hbase.table.name' = 'xyz')
    13/07/18 06:48:05 INFO ql.Driver:
    FAILED: Execution Error, return code -101 from org.apache.hadoop.hive.ql.exec.DDLTask
    13/07/18 06:48:05 ERROR ql.Driver: FAILED: Execution Error, return code -101 from org.apache.hadoop.hive.ql.exec.DDLTask
    13/07/18 06:48:05 INFO ql.Driver:
    13/07/18 06:48:05 ERROR beeswax.BeeswaxServiceImpl: Exception while processing query
    BeeswaxException(message:Driver returned: -101. Errors: OK
    FAILED: Execution Error, return code -101 from org.apache.hadoop.hive.ql.exec.DDLTask
    , log_context:7f5c0ceb-9930-4ce1-ba59-9bf80c28ce1e, handle:QueryHandle(id:7f5c0ceb-9930-4ce1-ba59-9bf80c28ce1e, log_context:7f5c0ceb-9930-4ce1-ba59-9bf80c28ce1e), SQLState: )
    at com.cloudera.beeswax.BeeswaxServiceImpl$RunningQueryState.execute(BeeswaxServiceImpl.java:351)
    at com.cloudera.beeswax.BeeswaxServiceImpl$RunningQueryState$1$1.run(BeeswaxServiceImpl.java:609)
    at com.cloudera.beeswax.BeeswaxServiceImpl$RunningQueryState$1$1.run(BeeswaxServiceImpl.java:598)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:337)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1214)
    at com.cloudera.beeswax.BeeswaxServiceImpl$RunningQueryState$1.run(BeeswaxServiceImpl.java:598)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
    at java.util.concurrent.FutureTask.run(FutureTask.java:138)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:662)
    13/07/18 06:48:06 WARN conf.HiveConf: DEPRECATED: Configuration property hive.metastore.local no longer has any effect. Make sure to provide a valid value for hive.metastore.uris if you are connecting to a remote metastore.
    13/07/18 06:48:06 ERROR security.UserGroupInformation: PriviledgedActionException as:hue cause:BeeswaxException(message:Driver returned: -101. Errors: OK
    FAILED: Execution Error, return code -101 from org.apache.hadoop.hive.ql.exec.DDLTask
    , log_context:7f5c0ceb-9930-4ce1-ba59-9bf80c28ce1e, handle:QueryHandle(id:7f5c0ceb-9930-4ce1-ba59-9bf80c28ce1e, log_context:7f5c0ceb-9930-4ce1-ba59-9bf80c28ce1e), SQLState: )
    13/07/18 06:48:06 ERROR beeswax.BeeswaxServiceImpl: Caught BeeswaxException
    BeeswaxException(message:Driver returned: -101. Errors: OK
    FAILED: Execution Error, return code -101 from org.apache.hadoop.hive.ql.exec.DDLTask
    , log_context:7f5c0ceb-9930-4ce1-ba59-9bf80c28ce1e, handle:QueryHandle(id:7f5c0ceb-9930-4ce1-ba59-9bf80c28ce1e, log_context:7f5c0ceb-9930-4ce1-ba59-9bf80c28ce1e), SQLState: )
    at com.cloudera.beeswax.BeeswaxServiceImpl$RunningQueryState.execute(BeeswaxServiceImpl.java:351)
    at com.cloudera.beeswax.BeeswaxServiceImpl$RunningQueryState$1$1.run(BeeswaxServiceImpl.java:609)
    at com.cloudera.beeswax.BeeswaxServiceImpl$RunningQueryState$1$1.run(BeeswaxServiceImpl.java:598)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:337)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1214)
    at com.cloudera.beeswax.BeeswaxServiceImpl$RunningQueryState$1.run(BeeswaxServiceImpl.java:598)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
    at java.util.concurrent.FutureTask.run(FutureTask.java:138)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:662)

    #29616

    tedr
    Moderator

    Hi Brian,

    Can you post the full stack trace? It can probably be found in the Hive logs. Also, did you start Hive yourself, or was it started by the Sandbox on startup?

    Thanks,
    Ted.

    #29594

    Brian Brownlow
    Participant

    I started HBase via Ambari on the Sandbox.
