Beeswax and HCat "Timed Out" Errors



This topic contains 3 replies, has 3 voices, and was last updated by PACHONG Donald 8 months, 1 week ago.

  • Creator
  • #55575

    Horton Works

    All of the sandbox features work except Beeswax and HCat.
    When I click on these I get a “Timed Out” error message.

    Here are the log file entries for the HCat “Timed Out” error:

    HCat Timed Out

    [11/Jun/2014 06:48:00] middleware DEBUG No desktop_app known for request.
    [11/Jun/2014 06:48:00] access INFO frank - "GET /hcatalog/ HTTP/1.0"
    [11/Jun/2014 06:48:00] views DEBUG Getting database name from cookies
    [11/Jun/2014 06:48:00] thrift_util DEBUG Thrift call: <class 'hive_metastore.ThriftHiveMetastore.Client'>.get_all_databases(args=(), kwargs={})
    [11/Jun/2014 06:48:10] thrift_util WARNING Not retrying thrift call get_all_databases due to socket timeout
    [11/Jun/2014 06:48:10] thrift_util INFO Thrift saw a socket error: timed out
    [11/Jun/2014 06:48:10] middleware INFO Processing exception: timed out (code THRIFTSOCKET): None: Traceback (most recent call last):
    File "/usr/lib/hue/build/env/lib/python2.6/site-packages/Django-1.2.3-py2.6.egg/django/core/handlers/", line 100, in get_response
    response = callback(request, *callback_args, **callback_kwargs)
    File "/usr/lib/hue/apps/hcatalog/src/hcatalog/", line 53, in index
    return show_tables(request, database=database)
    File "/usr/lib/hue/apps/hcatalog/src/hcatalog/", line 92, in show_tables
    databases = db.get_databases()
    File "/usr/lib/hue/apps/beeswax/src/beeswax/server/", line 92, in get_databases
    return self.client.get_databases()
    File "/usr/lib/hue/apps/beeswax/src/beeswax/server/", line 124, in get_databases
    return self.meta_client.get_all_databases()
    File "/usr/lib/hue/desktop/core/src/desktop/lib/", line 302, in wrapper
    raise StructuredException('THRIFTSOCKET', str(e), data=None, error_code=502)
    StructuredException: timed out (code THRIFTSOCKET): None

    [11/Jun/2014 06:48:13] middleware DEBUG No desktop_app known for request.
    [11/Jun/2014 06:48:13] access INFO frank - "GET /about/ HTTP/1.0"
    [11/Jun/2014 06:48:16] access WARNING frank - "GET /logs HTTP/1.0"
    [11/Jun/2014 06:48:17] access WARNING frank - "POST /logs HTTP/1.0"
    [11/Jun/2014 06:48:21] access WARNING frank - "GET /download_logs HTTP/1.0"
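    The traceback shows Hue's Thrift call to the Hive metastore timing out after about ten seconds. A quick first check is whether the metastore's Thrift port accepts connections at all. A minimal sketch (the port 9083 is the usual metastore default but is an assumption here; confirm it against hive.metastore.uris in hive-site.xml):

    ```python
    import socket

    def probe_metastore(host, port, timeout_s=10.0):
        """Return True if a TCP connection to the metastore Thrift port
        succeeds within timeout_s, False on timeout or refusal -- the same
        failure modes behind Hue's THRIFTSOCKET 'timed out' exception."""
        try:
            with socket.create_connection((host, port), timeout=timeout_s):
                return True
        except OSError:  # covers socket timeouts and connection refusals
            return False

    # Example (hypothetical address; adjust to your sandbox):
    # probe_metastore("127.0.0.1", 9083)
    ```

    If this returns False, Hue cannot be expected to list databases either, and the problem is the metastore service rather than Hue itself.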

Viewing 3 replies - 1 through 3 (of 3 total)


  • Author
  • #57606

    PACHONG Donald

    Do you have a solution for Hue 2.3 with a 3-node cluster?

    Here is my log file:
    [22/Jul/2014 04:46:46 +0000] access WARNING XXXX admin - "GET /logs HTTP/1.1"

    [22/Jul/2014 04:45:00 +0000] access WARNING XXXX admin - "POST /logs HTTP/1.1"

    [22/Jul/2014 04:44:46 +0000] middleware INFO Processing exception: Could not read table: Traceback (most recent call last):
    File "/usr/lib/hue/build/env/lib/python2.6/site-packages/Django-1.2.3-py2.6.egg/django/core/handlers/", line 100, in get_response
    response = callback(request, *callback_args, **callback_kwargs)
    File "/usr/lib/hue/apps/hcatalog/src/hcatalog/", line 408, in read_table
    raise PopupException(_('Could not read table'), detail=e)
    PopupException: Could not read table

    [22/Jul/2014 04:44:41 +0000] thrift_util DEBUG Thrift call <class 'hive_metastore.ThriftHiveMetastore.Client'>.get_table returned in 31ms: Table(partitionKeys=[], parameters={'transient_lastDdlTime': '1406028155'}, privileges=None, tableName='logs', tableType='MANAGED_TABLE', createTime=1406028155, lastAccessTime=0, viewOriginalText=None, owner='admin', viewExpandedText=None, sd=StorageDescriptor(outputFormat='', sortCols=[], inputFormat='org.apache.hadoop.mapred.TextInputFormat', cols=[FieldSchema(comment=None, type='string', name='name'), FieldSchema(comment=None, type='string', name='surname')], compressed=False, bucketCols=[], numBuckets=-1, parameters={}, serdeInfo=SerDeInfo(serializationLib='org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe', name=None, parameters={'field.delim': '\x01', 'serialization.format': '\x01', 'mapkey.delim': '\x03', 'colelction.delim': '\x02'}), location='hdfs://hdp-cluster-dfossouo-1.novalocal:8020/apps/hive/warehouse/logs'), dbName='default', retention=0)

    [22/Jul/2014 04:44:41 +0000] thrift_util DEBUG Thrift call: <class 'hive_metastore.ThriftHiveMetastore.Client'>.get_table(args=(u'default', u'logs'), kwargs={})

    [22/Jul/2014 04:44:41 +0000] views DEBUG Getting database name from argument

    [22/Jul/2014 04:44:41 +0000] access INFO XXXX admin - "GET /hcatalog/table/default/logs/read HTTP/1.1"

    [22/Jul/2014 04:44:02 +0000] access WARNING XXXX admin - "POST /logs HTTP/1.1"

    [22/Jul/2014 04:42:00 +0000] access WARNING XXXX admin - "POST /logs HTTP/1.1"

    [22/Jul/2014 04:41:46 +0000] access WARNING XXXX admin - "POST /logs HTTP/1.1"

    [22/Jul/2014 04:41:41 +0000] thrift_util DEBUG Thrift call: <class 'beeswaxd.BeeswaxService.Client'>.query(args=(Query(query='SELECT * FROM default.logs', configuration=['use default'], hadoop_user=u'admin'),), kwargs={})

    [22/Jul/2014 04:41:41 +0000] dbms DEBUG Made new QueryHistory id 1 user admin query: SELECT * FROM `default.lo…
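    Note that in this log the get_table call succeeds (it returns the table metadata in 31ms), so the metastore itself is answering; the "Could not read table" failure happens afterwards, when reading the table's data. A couple of illustrative checks, using the table location from the get_table line above (adjust paths to your cluster):

    ```
    # Does the table's warehouse directory exist and hold readable files?
    hdfs dfs -ls /apps/hive/warehouse/logs

    # Can Hive itself read the rows that Hue fails to fetch?
    hive -e "SELECT * FROM default.logs LIMIT 5;"
    ```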


    Horton Works

    I could not get it to work using the VMware Player version, so I tried the Oracle VirtualBox version and that worked fine.



    Have you ensured your HCatalog and Hive services are up and running on the sandbox? Can you use these services successfully from the command line?
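    For reference, a few commands that exercise these services from the sandbox shell (illustrative only; port 9083 for the metastore is an assumption — check hive.metastore.uris in hive-site.xml):

    ```
    # Hive CLI goes through the metastore, so a hang here mirrors Hue's timeout
    hive -e "show databases;"

    # HCatalog CLI
    hcat -e "show tables;"

    # Is the metastore Thrift port accepting connections?
    netstat -tlnp | grep 9083
    ```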

