HDP on Linux – Installation Forum

hbase client fails

  • #8343
    Ben Cuthbert
    Member

    All

    We have been using StumbleUpon's asynchbase 1.2.0 with a stock download of Hadoop/HBase. We just downloaded Hortonworks and are trying to run our application on the latest platform. We upgraded our jars to hadoop-core 1.0.3 and hbase 0.92.1.14, as found in the lib_dir of the Hortonworks installation.

    When connecting from our client application, all we see in the client logs is:

    13:34:47.311 [main-SendThread()] INFO org.apache.zookeeper.ClientCnxn - Opening socket connection to server /10.211.55.4:2181
    13:34:47.334 [main-SendThread(devhortonworks.localdomain:2181)] WARN o.a.z.client.ZooKeeperSaslClient - SecurityException: java.lang.SecurityException: Unable to locate a login configuration occurred when trying to find JAAS configuration.
    13:34:47.334 [main-SendThread(devhortonworks.localdomain:2181)] INFO o.a.z.client.ZooKeeperSaslClient - Client will not SASL-authenticate because the default JAAS configuration section 'Client' could not be found. If you are not using SASL, you may ignore this. On the other hand, if you expected SASL to work, please fix your JAAS configuration.
    13:34:47.344 [main-SendThread(devhortonworks.localdomain:2181)] INFO org.apache.zookeeper.ClientCnxn - Socket connection established to devhortonworks.localdomain/10.211.55.4:2181, initiating session
    13:34:47.345 [main-SendThread(devhortonworks.localdomain:2181)] DEBUG org.apache.zookeeper.ClientCnxn - Session establishment request sent on devhortonworks.localdomain/10.211.55.4:2181
    13:34:47.351 [main-SendThread(devhortonworks.localdomain:2181)] INFO org.apache.zookeeper.ClientCnxn - Session establishment complete on server devhortonworks.localdomain/10.211.55.4:2181, sessionid = 0x1392f53ee4e001e, negotiated timeout = 40000
    13:34:47.356 [main-SendThread(devhortonworks.localdomain:2181)] DEBUG org.apache.zookeeper.ClientCnxn - Reading reply sessionid:0x1392f53ee4e001e, packet:: clientPath:null serverPath:null finished:false header:: 1,3 replyHeader:: 1,243,0 request:: '/hbase/master,T response:: s{148,148,1345118948304,1345118948304,0,0,0,88153705237315597,85,0,148}
    13:34:47.359 [main-SendThread(devhortonworks.localdomain:2181)] DEBUG org.apache.zookeeper.ClientCnxn - Reading reply sessionid:0x1392f53ee4e001e, packet:: clientPath:null serverPath:null finished:false header:: 2,4 replyHeader:: 2,243,0 request:: '/hbase/master,T response:: #ffffffff00020313439313740646576686f72746f6e776f726b732e6c6f63616c646f6d61696e00646576686f72746f6e776f726b732e6c6f63616c646f6d61696e2c36303030302c31333435313138393437393130,s{148,148,1345118948304,1345118948304,0,0,0,88153705237315597,85,0,148}
    13:34:47.362 [main-SendThread(devhortonworks.localdomain:2181)] DEBUG org.apache.zookeeper.ClientCnxn - Reading reply sessionid:0x1392f53ee4e001e, packet:: clientPath:null serverPath:null finished:false header:: 3,3 replyHeader:: 3,243,0 request:: '/hbase/root-region-server,T response:: s{165,165,1345119087641,1345119087641,0,0,0,0,83,0,165}
    13:34:47.364 [main-SendThread(devhortonworks.localdomain:2181)] DEBUG org.apache.zookeeper.ClientCnxn - Reading reply sessionid:0x1392f53ee4e001e, packet:: clientPath:null serverPath:null finished:false header:: 4,4 replyHeader:: 4,243,0 request:: '/hbase/root-region-server,T response:: #ffffffff00020313634343340646576686f72746f6e776f726b732e6c6f63616c646f6d61696e646576686f72746f6e776f726b732e6c6f63616c646f6d61696e2c36303032302c31333435313139303833363734,s{165,165,1345119087641,1345119087641,0,0,0,0,83,0,165}
    13:34:47.365 [main-SendThread(devhortonworks.localdomain:2181)] DEBUG org.apache.zookeeper.ClientCnxn - Reading reply sessionid:0x1392f53ee4e001e, packet:: clientPath:null serverPath:null finished:false header:: 5,3 replyHeader:: 5,243,0 request:: '/hbase,F response:: s{34,34,1345112586806,1345112586806,0,14,0,0,0,10,165}
    13:34:47.366 [main-SendThread(devhortonworks.localdomain:2181)] DEBUG org.apache.zookeeper.ClientCnxn - Reading reply sessionid:0x1392f53ee4e001e, packet:: clientPath:null serverPath:null finished:false header:: 6,3 replyHeader:: 6,243,0 request:: '/hbase/hbaseid,F response:: s{43,150,1345112587802,1345118948846,2,0,0,0,73,0,43}
    13:34:47.367 [main-SendThread(devhortonworks.localdomain:2181)] DEBUG org.apache.zookeeper.ClientCnxn - Reading reply sessionid:0x1392f53ee4e001e, packet:: clientPath:null serverPath:null finished:false header:: 7,4 replyHeader:: 7,243,0 request:: '/hbase/hbaseid,F response:: #ffffffff00020313439313740646576686f72746f6e776f726b732e6c6f63616c646f6d61696e63646436373261312d653463372d343630642d383133362d366636386337666466633964,s{43,150,1345112587802,1345118948846,2,0,0,0,73,0,43}
    13:34:47.420 [main] INFO org.apache.zookeeper.ZooKeeper - Initiating client connection, connectString=10.211.55.4:2181 sessionTimeout=60000 watcher=catalogtracker-on-org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@9dd1752
    13:34:47.420 [main-SendThread()] INFO org.apache.zookeeper.ClientCnxn - Opening socket connection to server /10.211.55.4:2181
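(For reference: the SecurityException/SASL lines above are harmless when security is not enabled; ZooKeeper simply falls back to an unauthenticated connection. Only if SASL authentication were actually intended would the client need a JAAS file with a "Client" section. A hedged sketch, assuming Kerberos; the keytab path and principal below are placeholders, not values from this installation:

```
/* jaas.conf -- hypothetical; only needed if SASL is in use */
Client {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  keyTab="/etc/security/keytabs/hbase.client.keytab"
  principal="hbase-client@EXAMPLE.COM";
};
```

The file would be passed to the client JVM via -Djava.security.auth.login.config=/path/to/jaas.conf. Since this cluster is not using SASL, the warning can be ignored.)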

    And ZooKeeper just shows:

    2012-08-16 13:36:59,242 - INFO [NIOServerCxn.Factory:0.0.0.0/0.0.0.0:2181:NIOServerCnxnFactory@213] - Accepted socket connection from /10.211.55.2:62637
    2012-08-16 13:36:59,243 - INFO [NIOServerCxn.Factory:0.0.0.0/0.0.0.0:2181:ZooKeeperServer@838] - Client attempting to establish new session at /10.211.55.2:62637
    2012-08-16 13:36:59,244 - INFO [SyncThread:0:ZooKeeperServer@604] - Established session 0x1392f53ee4e0023 with negotiated timeout 40000 for client /10.211.55.2:62637
    2012-08-16 13:36:59,249 - INFO [ProcessThread(sid:0 cport:-1)::PrepRequestProcessor@466] - Processed session termination for sessionid: 0x1392f53ee4e0023
    2012-08-16 13:36:59,250 - INFO [NIOServerCxn.Factory:0.0.0.0/0.0.0.0:2181:NIOServerCnxn@1000] - Closed socket connection for client /10.211.55.2:62637 which had sessionid 0x1392f53ee4e0023
    2012-08-16 13:37:00,275 - INFO [NIOServerCxn.Factory:0.0.0.0/0.0.0.0:2181:NIOServerCnxnFactory@213] - Accepted socket connection from /10.211.55.2:62638
    2012-08-16 13:37:00,276 - INFO [NIOServerCxn.Factory:0.0.0.0/0.0.0.0:2181:ZooKeeperServer@838] - Client attempting to establish new session at /10.211.55.2:62638
    2012-08-16 13:37:00,277 - INFO [SyncThread:0:ZooKeeperServer@604] - Established session 0x1392f53ee4e0024 with negotiated timeout 40000 for client /10.211.55.2:62638
    2012-08-16 13:37:00,286 - INFO [ProcessThread(sid:0 cport:-1)::PrepRequestProcessor@466] - Processed session termination for sessionid: 0x1392f53ee4e0024

    Is there a Hortonworks API we should be using?


  • Author
    Replies
  • #8357
    Sasha J
    Moderator

    Ben,
    this may be related to your incorrect binding to the loopback interface…
    We are investigating this.

    Thank you!
    Sasha
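
    (Editor's note: on single-node installs, "binding to loopback" usually means the host name resolves to 127.0.0.1, so ZooKeeper hands clients an address they cannot reach. A hedged sketch of a working /etc/hosts on the VM; the host name and address below come from the logs in this thread, so verify them against your own setup:

```
# /etc/hosts -- the cluster host name must map to the routable
# address, not sit on the 127.0.0.1 loopback line
127.0.0.1       localhost
10.211.55.4     devhortonworks.localdomain devhortonworks
```

    After editing, restart the HBase and ZooKeeper daemons so they re-bind to the routable address.)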

    #8401
    Ben Cuthbert
    Member

    Yup, could have been :) I am now connected. I do see this error in the HBase region server log, hbase-hbase-regionserver-devhortonworks.localdomain.log:

    2012-08-16 20:50:00,140 ERROR org.apache.hadoop.hbase.io.HbaseObjectWritable: Error in readFields
    java.lang.NegativeArraySizeException: -1
    at org.apache.hadoop.hbase.util.Bytes.readByteArray(Bytes.java:147)
    at org.apache.hadoop.hbase.client.OperationWithAttributes.readAttributes(OperationWithAttributes.java:102)
    at org.apache.hadoop.hbase.client.Put.readFields(Put.java:399)
    at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:652)
    at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:565)
    at org.apache.hadoop.hbase.client.Action.readFields(Action.java:101)
    at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:652)
    at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:565)
    at org.apache.hadoop.hbase.client.MultiAction.readFields(MultiAction.java:116)
    at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:652)
    at org.apache.hadoop.hbase.ipc.Invocation.readFields(Invocation.java:125)
    at org.apache.hadoop.hbase.ipc.HBaseServer$Connection.processData(HBaseServer.java:1248)
    at org.apache.hadoop.hbase.ipc.HBaseServer$Connection.readAndProcess(HBaseServer.java:1177)
    at org.apache.hadoop.hbase.ipc.HBaseServer$Listener.doRead(HBaseServer.java:715)
    at org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.doRunLoop(HBaseServer.java:507)
    at org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.run(HBaseServer.java:482)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:662)
    2012-08-16 20:50:00,140 ERROR org.apache.hadoop.hbase.io.HbaseObjectWritable: Error in readFields
    java.io.IOException: Error in readFields
    at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:655)
    at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:565)
    at org.apache.hadoop.hbase.client.Action.readFields(Action.java:101)
    at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:652)
    at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:565)
    at org.apache.hadoop.hbase.client.MultiAction.readFields(MultiAction.java:116)
    at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:652)
    at org.apache.hadoop.hbase.ipc.Invocation.readFields(Invocation.java:125)
    at org.apache.hadoop.hbase.ipc.HBaseServer$Connection.processData(HBaseServer.java:1248)
    at org.apache.hadoop.hbase.ipc.HBaseServer$Connection.readAndProcess(HBaseServer.java:1177)
    at org.apache.hadoop.hbase.ipc.HBaseServer$Listener.doRead(HBaseServer.java:715)
    at org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.doRunLoop(HBaseServer.java:507)
    at org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.run(HBaseServer.java:482)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:662)
    Caused by: java.lang.NegativeArraySizeException: -1
    at org.apache.hadoop.hbase.util.Bytes.readByteArray(Bytes.java:147)
    at org.apache.hadoop.hbase.client.OperationWithAttributes.readAttributes(OperationWithAttributes.java:102)
    at org.apache.hadoop.hbase.client.Put.readFields(Put.java:399)
    at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:652)
    … 15 more
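
(Editor's note: a NegativeArraySizeException inside Put.readFields is the classic symptom of a wire-format mismatch: the server deserializes fields, such as Put attributes, that the older asynchbase client never wrote, so the next length it reads from the stream is -1. A minimal sketch of the failure mode; this is not HBase's actual code, and the class and method names are illustrative only:

```java
import java.io.ByteArrayInputStream;
import java.io.DataInput;
import java.io.DataInputStream;
import java.io.IOException;

// Mimics the length-prefixed read that fails in Bytes.readByteArray
// when client and server serialize a Put differently.
public class WireMismatchDemo {

    // Read a 4-byte length, then that many bytes. (HBase uses a vint,
    // but the failure mode is the same.)
    static byte[] readByteArray(DataInput in) throws IOException {
        int len = in.readInt();
        byte[] buf = new byte[len]; // len == -1 -> NegativeArraySizeException
        in.readFully(buf);
        return buf;
    }

    // Returns the name of the exception a desynchronized stream produces.
    static String tryRead(byte[] wire) {
        try {
            readByteArray(new DataInputStream(new ByteArrayInputStream(wire)));
            return "ok";
        } catch (NegativeArraySizeException e) {
            return "NegativeArraySizeException";
        } catch (IOException e) {
            return "IOException";
        }
    }

    public static void main(String[] args) {
        // Bytes 0xFFFFFFFF decode to the length -1 seen in the log above.
        byte[] wire = {(byte) 0xFF, (byte) 0xFF, (byte) 0xFF, (byte) 0xFF};
        System.out.println(tryRead(wire));
    }
}
```

The practical fix is to build the client against jars that match the server's RPC version, rather than to change anything on the server side.)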
