
Hortonworks Sandbox Forum

Beeswax and Hcat "Timed Out" Errors

  • #55575
    Horton Works
    Participant

    All of the sandbox features work except Beeswax and HCat.
    When I click on these I get a “Timed Out” error message.

    Here are the log file entries for the HCat “Timed Out” error:

    HCat Timed Out

    [11/Jun/2014 06:48:00] middleware DEBUG No desktop_app known for request.
    [11/Jun/2014 06:48:00] access INFO 192.168.106.134 frank - "GET /hcatalog/ HTTP/1.0"
    [11/Jun/2014 06:48:00] views DEBUG Getting database name from cookies
    [11/Jun/2014 06:48:00] thrift_util DEBUG Thrift call: <class 'hive_metastore.ThriftHiveMetastore.Client'>.get_all_databases(args=(), kwargs={})
    [11/Jun/2014 06:48:10] thrift_util WARNING Not retrying thrift call get_all_databases due to socket timeout
    [11/Jun/2014 06:48:10] thrift_util INFO Thrift saw a socket error: timed out
    [11/Jun/2014 06:48:10] middleware INFO Processing exception: timed out (code THRIFTSOCKET): None: Traceback (most recent call last):
    File "/usr/lib/hue/build/env/lib/python2.6/site-packages/Django-1.2.3-py2.6.egg/django/core/handlers/base.py", line 100, in get_response
    response = callback(request, *callback_args, **callback_kwargs)
    File "/usr/lib/hue/apps/hcatalog/src/hcatalog/views.py", line 53, in index
    return show_tables(request, database=database)
    File "/usr/lib/hue/apps/hcatalog/src/hcatalog/views.py", line 92, in show_tables
    databases = db.get_databases()
    File "/usr/lib/hue/apps/beeswax/src/beeswax/server/dbms.py", line 92, in get_databases
    return self.client.get_databases()
    File "/usr/lib/hue/apps/beeswax/src/beeswax/server/beeswax_lib.py", line 124, in get_databases
    return self.meta_client.get_all_databases()
    File "/usr/lib/hue/desktop/core/src/desktop/lib/thrift_util.py", line 302, in wrapper
    raise StructuredException('THRIFTSOCKET', str(e), data=None, error_code=502)
    StructuredException: timed out (code THRIFTSOCKET): None

    [11/Jun/2014 06:48:13] middleware DEBUG No desktop_app known for request.
    [11/Jun/2014 06:48:13] access INFO 192.168.106.134 frank - "GET /about/ HTTP/1.0"
    [11/Jun/2014 06:48:16] access WARNING 192.168.106.134 frank - "GET /logs HTTP/1.0"
    [11/Jun/2014 06:48:17] access WARNING 192.168.106.134 frank - "POST /logs HTTP/1.0"
    [11/Jun/2014 06:48:21] access WARNING 192.168.106.134 frank - "GET /download_logs HTTP/1.0"
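
    The traceback shows Hue's Thrift client giving up after about ten seconds while calling get_all_databases() on the Hive metastore (code THRIFTSOCKET, error_code=502 in the raised StructuredException). A quick first check is whether the metastore port is reachable at all from the machine running Hue. Below is a minimal sketch, assuming the Sandbox's usual setup of a local metastore on its default Thrift port 9083; adjust the host and port if yours differ.

    import socket

    # Assumed defaults for the Sandbox: the Hive metastore normally listens on
    # Thrift port 9083 on the same VM. Adjust to your environment.
    METASTORE_HOST = "localhost"
    METASTORE_PORT = 9083

    def metastore_reachable(host, port, timeout_secs=10):
        """Return True if a plain TCP connection to host:port succeeds."""
        try:
            conn = socket.create_connection((host, port), timeout=timeout_secs)
            conn.close()
            return True
        except socket.error:
            return False

    if __name__ == "__main__":
        if metastore_reachable(METASTORE_HOST, METASTORE_PORT):
            print("Metastore port is open; the timeout likely comes from a slow or "
                  "overloaded metastore rather than a stopped service.")
        else:
            print("Cannot reach the metastore; check that the Hive Metastore "
                  "service is running and listening on this host and port.")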

  • #56814
    iandr413
    Moderator

    Hi,
    Have you ensured that HCatalog and Hive are up and running on the sandbox? Can you use these services successfully from the command line?

    Ian
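
    A rough way to script that command-line check, as a sketch only (it assumes the hive CLI is on the PATH inside the Sandbox VM):

    import subprocess

    # Sketch of the command-line check suggested above: run a trivial query
    # through the hive CLI so the metastore is exercised outside of Hue. If
    # this also hangs or times out, the problem is in Hive/HCatalog itself
    # rather than in the Hue Beeswax/HCat apps.
    def check_hive_cli():
        return_code = subprocess.call(["hive", "-e", "SHOW DATABASES;"])
        if return_code == 0:
            print("hive CLI reached the metastore successfully.")
        else:
            print("hive CLI failed with exit code %d; check the Hive and "
                  "HCatalog services before debugging Hue." % return_code)

    if __name__ == "__main__":
        check_hive_cli()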

    #56816
    Horton Works
    Participant

    I could not get it to work using the VMware Player version, so I tried the Oracle VirtualBox version and that worked fine.

    #57606
    PACHONG Donald
    Participant

    Hi,
    Do you have a solution for Hue 2.3 on a three-node cluster?

    Here is my log file
    [22/Jul/2014 04:46:46 +0000] access WARNING XXXX admin - "GET /logs HTTP/1.1"

    [22/Jul/2014 04:45:00 +0000] access WARNING XXXX admin - "POST /logs HTTP/1.1"

    [22/Jul/2014 04:44:46 +0000] middleware INFO Processing exception: Could not read table: Traceback (most recent call last):
    File "/usr/lib/hue/build/env/lib/python2.6/site-packages/Django-1.2.3-py2.6.egg/django/core/handlers/base.py", line 100, in get_response
    response = callback(request, *callback_args, **callback_kwargs)
    File "/usr/lib/hue/apps/hcatalog/src/hcatalog/views.py", line 408, in read_table
    raise PopupException(_('Could not read table'), detail=e)
    PopupException: Could not read table

    [22/Jul/2014 04:44:41 +0000] thrift_util DEBUG Thrift call <class 'hive_metastore.ThriftHiveMetastore.Client'>.get_table returned in 31ms: Table(partitionKeys=[], parameters={'transient_lastDdlTime': '1406028155'}, privileges=None, tableName='logs', tableType='MANAGED_TABLE', createTime=1406028155, lastAccessTime=0, viewOriginalText=None, owner='admin', viewExpandedText=None, sd=StorageDescriptor(outputFormat='org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat', sortCols=[], inputFormat='org.apache.hadoop.mapred.TextInputFormat', cols=[FieldSchema(comment=None, type='string', name='name'), FieldSchema(comment=None, type='string', name='surname')], compressed=False, bucketCols=[], numBuckets=-1, parameters={}, serdeInfo=SerDeInfo(serializationLib='org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe', name=None, parameters={'field.delim': '\x01', 'serialization.format': '\x01', 'mapkey.delim': '\x03', 'colelction.delim': '\x02'}), location='hdfs://hdp-cluster-dfossouo-1.novalocal:8020/apps/hive/warehouse/logs'), dbName='default', retention=0)

    [22/Jul/2014 04:44:41 +0000] thrift_util DEBUG Thrift call: <class 'hive_metastore.ThriftHiveMetastore.Client'>.get_table(args=(u'default', u'logs'), kwargs={})

    [22/Jul/2014 04:44:41 +0000] views DEBUG Getting database name from argument

    [22/Jul/2014 04:44:41 +0000] access INFO XXXX admin - "GET /hcatalog/table/default/logs/read HTTP/1.1"

    [22/Jul/2014 04:44:02 +0000] access WARNING XXXX admin - "POST /logs HTTP/1.1"

    [22/Jul/2014 04:42:00 +0000] access WARNING XXXX admin - "POST /logs HTTP/1.1"

    [22/Jul/2014 04:41:46 +0000] access WARNING XXXX admin - "POST /logs HTTP/1.1"

    [22/Jul/2014 04:41:41 +0000] thrift_util DEBUG Thrift call: <class 'beeswaxd.BeeswaxService.Client'>.query(args=(Query(query='SELECT * FROM default.logs', configuration=['use default'], hadoop_user=u'admin'),), kwargs={})

    [22/Jul/2014 04:41:41 +0000] dbms DEBUG Made new QueryHistory id 1 user admin query: SELECT * FROM `default.lo…
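
    In this case the metastore itself answers quickly (get_table returned in 31 ms), yet reading the table from Hue fails. One low-level check on a multi-node cluster, sketched below under the assumption that the hadoop CLI is installed on the node running Hue, is whether that node can list the table's HDFS location reported in the get_table output above; if it cannot, look at HDFS permissions and hostname resolution before digging into Hue itself.

    import subprocess

    # The path below is copied from the get_table log entry above; the hadoop
    # CLI is assumed to be available on the Hue node.
    TABLE_LOCATION = "hdfs://hdp-cluster-dfossouo-1.novalocal:8020/apps/hive/warehouse/logs"

    def list_table_location(path):
        return_code = subprocess.call(["hadoop", "fs", "-ls", path])
        if return_code != 0:
            print("Listing %s failed with exit code %d; check HDFS permissions "
                  "and that the cluster hostname resolves from the Hue node."
                  % (path, return_code))

    if __name__ == "__main__":
        list_table_location(TABLE_LOCATION)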

