Hortonworks Sandbox Tutorial 1 – error building nyse_stocks table (HCatalog)

This topic contains 18 replies, has 14 voices, and was last updated by  Muhamad Pajar 2 days, 8 hours ago.

  • Creator
    Topic
  • #29722

    John Fisk
    Member

    I am receiving this error:

    The following error(s) occurred:

    HCatClient error on create table: {"statement":"use default; create table nyse_stocks3(`exchange` string, `stock_symbol` string, `date` string, `stock_price_open` float, `stock_price_high` float, `stock_price_low` float, `stock_price_close` double, `stock_volume` bigint, `stock_price_adj_close` float) row format delimited fields terminated by '\\t';","error":"unable to create table: nyse_stocks3","exec":{"stdout":"","stderr":"13/07/19 08:37:04 WARN conf.HiveConf: DEPRECATED: Configuration property hive.metastore.local no longer has any effect. Make sure to provide a valid value for hive.metastore.uris if you are connecting to a remote metastore.\nOK\nTime taken: 11.202 seconds\n","exitcode":143}} (error 500)
    Any help appreciated!

Viewing 18 replies - 1 through 18 (of 18 total)


  • Author
    Replies
  • #60559

    Muhamad Pajar
    Participant

    I think one factor is the computer's specifications.

    #60558

    Muhamad Pajar
    Participant

    Hi, I've been trying for 4 days with HDP Sandbox 2.1 and 2.0, and I still get error 500.

    I've tried editing webhcat-site.xml on Linux, but it didn't help.
    I've tried adding memory to the HDP Sandbox 2.1 and 2.0 VMs in VirtualBox, from 1024 MB to 1536 MB, but I still got the error.

    In the end I tried HDP Sandbox 1.3, and it worked successfully.

    #52860

    Louise Glynn
    Participant

    Sorry, I can't delete the post below – I'm still having an issue with the HCatalog create table….

    #52847

    Louise Glynn
    Participant

    I got this error, rebooted my machine, and the error went away. Before the reboot I couldn't create a database or a table, or view the databases, etc. using HCat.

    #49569

    Hi,

    Below is the full error text I get when importing the NYSE files (NYSE-2000-2001.tsv.gz)

    HCatClient error on create table: {"statement":"use default; create table nyse_stocks_3(exchange string, stock_symbol string, date string, stock_price_open double, stock_price_high double, stock_price_low double, stock_price_close double, stock_volume bigint, stock_price_adj_close double) row format delimited fields terminated by '\\t';","error":"unable to create table: nyse_stocks_3","exec":{"stdout":"","stderr":"which: no /usr/lib/hadoop/bin/hadoop in ((null))\ndirname: missing operand\nTry `dirname --help' for more information.\n14/03/05 02:43:39 INFO Configuration.deprecation: mapred.input.dir.recursive is deprecated. Instead, use mapreduce.input.fileinputformat.input.dir.recursive\n14/03/05 02:43:39 INFO Configuration.deprecation: mapred.max.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.maxsize\n14/03/05 02:43:39 INFO Configuration.deprecation: mapred.min.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize\n14/03/05 02:43:39 INFO Configuration.deprecation: mapred.min.split.size.per.rack is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.rack\n14/03/05 02:43:39 INFO Configuration.deprecation: mapred.min.split.size.per.node is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.node\n14/03/05 02:43:39 INFO Configuration.deprecation: mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces\n14/03/05 02:43:39 INFO Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative\nSLF4J: Class path contains multiple SLF4J bindings.\nSLF4J: Found binding in [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]\nSLF4J: Found binding in [jar:file:/usr/lib/hive/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]\nSLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.\nSLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]\nOK\nTime taken: 8.358 seconds\nOK\nTime taken: 40.204 seconds\n Command was terminated due to timeout(60000ms). See templeton.exec.timeout property","exitcode":143}} (error 500)

    #46176

    I am using Sandbox 2.0 on VMware Player. I got error 500 while trying to create a table using HCatalog in Tutorial 1.
    I tried to SSH to the sandbox at 127.0.0.1 on port 2222, but the connection was refused.
    How do I overcome this problem?
    I tried modifying the iptables firewall rules for SSH to include port 2222, but again the connection was refused.
    Please help; I have stopped the tutorial since I can't proceed.
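    A quick way to check whether anything is even listening on the forwarded port, before digging into iptables (my own sketch, not from the thread; it uses bash's /dev/tcp pseudo-device so no extra tools are needed):

    ```shell
    # port_open HOST PORT: succeeds if a TCP connection is accepted.
    # Uses bash's built-in /dev/tcp redirection, so nc/telnet are not required.
    port_open() {
      timeout 3 bash -c "exec 3<>/dev/tcp/$1/$2" 2>/dev/null
    }

    # Probe the sandbox's forwarded SSH port:
    if port_open 127.0.0.1 2222; then
      echo "port 2222 reachable; try: ssh -p 2222 root@127.0.0.1"
    else
      echo "port 2222 refused; check the VM's NAT port-forwarding rules first"
    fi
    ```

    If the port is refused here, the problem is most likely the VM's port-forwarding configuration (or the guest sshd not running), not the host firewall.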

    #43121

    Shahab Khan
    Member

    Had the same error:
    So I did what the earlier post said and removed hive.metastore.local from the file /etc/hcatalog/conf/webhcat-site.xml. However, I was still getting the same error. Doing a unix find | xargs grep, I found some files in /tmp/hadoop-hcat/* that contained this value, so I removed the property from them also. Now the error is similar but not exactly the same:
    [06/Nov/2013 10:34:37 +0000] thrift_util DEBUG Thrift call .get_tables returned in 689ms: ['sample_07', 'sample_08']

    [06/Nov/2013 10:34:36 +0000] thrift_util DEBUG Thrift call: .get_tables(args=('default', '.*'), kwargs={})

    [06/Nov/2013 10:34:36 +0000] access INFO 10.0.2.2 hue - "POST /hcatalog/get_tables HTTP/1.0"

    [06/Nov/2013 10:34:34 +0000] access INFO 10.0.2.2 hue - "POST /shell/retrieve_output HTTP/1.0"

    [06/Nov/2013 10:34:31 +0000] hcat_client ERROR {"statement":"use default; create table test(`exchange` string, `stock_symbol` string, `date` string, `stock_price_open` float, `stock_price_high` float, `stock_price_low` float, `stock_price_close` float, `stock_volume` bigint, `stock_price_adj_close` float) comment 'nyse' row format delimited fields terminated by '\\t';","error":"unable to create table: test","exec":{"stdout":"","stderr":"","exitcode":143}} (error 500)
    Traceback (most recent call last):
    File "/usr/lib/hue/apps/hcatalog/src/hcatalog/hcat_client.py", line 169, in create_table
    resp = self.put('ddl/database/%s/table/%s' % (dbname, table), data=query)
    File "/usr/lib/hue/apps/pig/src/pig/templeton.py", line 54, in put
    raise error
    RestException: {"statement":"use default; create table test(`exchange` string, `stock_symbol` string, `date` string, `stock_price_open` float, `stock_price_high` float, `stock_price_low` float, `stock_price_close` float, `stock_volume` bigint, `stock_price_adj_close` float) comment 'nyse' row format delimited fields terminated by '\\t';","error":"unable to create table: test","exec":{"stdout":"","stderr":"","exitcode":143}} (error 500)

    [06/Nov/2013 10:33:27 +0000] http_client DEBUG PUT http://sandbox:50111/templeton/v1/ddl/database/default/table/test?user.name=hue

    [06/Nov/2013 10:33:27 +0000] hcat_client INFO HCatalog client, create table query:
    {
    "columns": [{"name": "`exchange`", "type": "string"},{"name": "`stock_symbol`", "type": "string"},{"name": "`date`", "type": "string"},{"name": "`stock_price_open`", "type": "float"},{"name": "`stock_price_high`", "type": "float"},{"name": "`stock_price_low`", "type": "float"},{"name": "`stock_price_close`", "type": "float"},{"name": "`stock_volume`", "type": "bigint"},{"name": "`stock_price_adj_close`", "type": "float"}],
    "comment": "nyse",
    "format": {
    "rowFormat": {
    "fieldsTerminatedBy": "\\t"
    }
    },
    "permissions": "rwxrwxrwx",
    "external": "false"
    }
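    For anyone who wants to repeat the find | xargs grep step described above, a small sketch (the helper name is mine; the paths are the ones mentioned in the post):

    ```shell
    # find_stale_configs DIR...: list files under the given directories that
    # still mention the deprecated hive.metastore.local property.
    find_stale_configs() {
      find "$@" -type f 2>/dev/null | xargs -r grep -l 'hive\.metastore\.local' 2>/dev/null
    }

    # On the sandbox, the post checked (at least):
    #   find_stale_configs /etc/hcatalog/conf /tmp/hadoop-hcat
    ```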

    #35626

    Mark None
    Member

    The deprecation warning is actually coming through standard out as an error and stopping the import routine.

    Follow these steps to fix:
    1) You need to log in to the Sandbox over SSH at 127.0.0.1, port 2222, with the login/password root/hadoop.
    2) Type this command: nano /etc/hcatalog/conf/webhcat-site.xml
    3) Scroll down with the down arrow and look for the line that contains the following: hive.metastore.local=false,hive.metastore.uris=thrift://sandbox:9083,hive.metastore.sasl.enabled=yes,hive.metastore.execute.setugi=true
    4) Remove hive.metastore.local=false, from this line
    5) CTRL+X to exit the Nano editor. Remember to press Y to save the change.
    6) Restart your Sandbox. I did a VirtualBox reboot.
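    Steps 2–5 can also be scripted instead of using Nano; a minimal sketch (the helper name is hypothetical, and it assumes the comma-separated property list sits on one line, as shown in step 3):

    ```shell
    # remove_metastore_local FILE: strip the deprecated hive.metastore.local
    # entry from a webhcat-site.xml, keeping a .bak copy of the original.
    remove_metastore_local() {
      cp "$1" "$1.bak"
      sed -i 's/hive\.metastore\.local=false,//' "$1"
    }

    # On the sandbox (as root):
    #   remove_metastore_local /etc/hcatalog/conf/webhcat-site.xml
    # then restart the VM, as in step 6.
    ```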

    #33520

    Looks like the same error OP is getting. I don’t see a log tab either, but here is the error (pretty sure it’s exactly the same, minus the datestamp).


    The following error(s) occurred:

    HCatClient error on create table: {"statement":"use default; create table nyse_stocks(`exchange` string, `stock_symbol` string, `date` string, `stock_price_open` float, `stock_price_high` float, `stock_price_low` float, `stock_price_close` float, `stock_volume` bigint, `stock_price_adj_close` float) row format delimited fields terminated by '\\t';","error":"unable to create table: nyse_stocks","exec":{"stdout":"","stderr":"13/08/28 04:01:37 WARN conf.HiveConf: DEPRECATED: Configuration property hive.metastore.local no longer has any effect. Make sure to provide a valid value for hive.metastore.uris if you are connecting to a remote metastore.\nOK\nTime taken: 6.316 seconds\n","exitcode":143}} (error 500)

    #32823


    Member

    Same error for me also. I am using Sandbox 2.0 Community Preview Oracle VM.
    Error log as below:
    HCatClient error on create table: {"statement":"use default; create table nyse_stocks(`exchange` string, `stock_symbol` string, `date` string, `stock_price_open` double, `stock_price_high` double, `stock_price_low` double, `stock_price_close` double, `stock_volume` bigint, `stock_price_adj_close` double) row format delimited fields terminated by '1';","error":"unable to create table: nyse_stocks","exec":{"stdout":"","stderr":"which: no /usr/lib/hadoop/bin/hadoop in ((null))\ndirname: missing operand\nTry `dirname --help' for more information.\n13/08/21 08:45:45 WARN conf.Configuration: mapred.max.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.maxsize\n13/08/21 08:45:45 WARN conf.Configuration: mapred.min.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize\n13/08/21 08:45:45 WARN conf.Configuration: mapred.min.split.size.per.rack is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.rack\n13/08/21 08:45:45 WARN conf.Configuration: mapred.min.split.size.per.node is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.node\n13/08/21 08:45:45 WARN conf.Configuration: mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces\n13/08/21 08:45:45 WARN conf.Configuration: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative\n13/08/21 08:45:48 WARN conf.Configuration: org.apache.hadoop.hive.conf.LoopingByteArrayInputStream@5954864a:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval; Ignoring.\n13/08/21 08:45:48 WARN conf.Configuration: org.apache.hadoop.hive.conf.LoopingByteArrayInputStream@5954864a:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts; Ignoring.\n13/08/21 08:45:48 WARN conf.HiveConf: DEPRECATED: Configuration property hive.metastore.local no longer has any effect. Make sure to provide a valid value for hive.metastore.uris if you are connecting to a remote metastore.\nSLF4J: Class path contains multiple SLF4J bindings.\nSLF4J: Found binding in [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]\nSLF4J: Found binding in [jar:file:/usr/lib/hive/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]\nSLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.\nOK\nTime taken: 6.879 seconds\nOK\n","exitcode":143}} (error 500)

    #30526

    Cheryle Custer
    Moderator

    You may want to try removing the Virtual Machine and re-importing it.

    Cheryle

    #30069

    Hi,

    All the tutorials are working fine for me.
    Could you please share the logs and the steps you have tried? I just want to know whether any of the steps mentioned got bypassed.
    Also, let me know which Hadoop sandbox has been used.
    Regards

    #29926

    Hi Ted,

    I am facing the same issue as well. Could you please help me resolve it?

    Thanks,
    Suthan

    #29772

    Sasha J
    Moderator

    Works perfectly fine for me in Sandbox 1.3.
    Works perfectly fine for me in Sandbox 2.0.

    Please double-check that you have set it up correctly.

    Thank you!
    Sasha

    #29771

    Sasha J
    Moderator

    John,
    by the way, are we talking about Sandbox 1.3 or Sandbox 2.0?

    Thank you!
    Sasha

    #29738

    John Fisk
    Member

    Hi Ted – I tried again this morning – same error, and no "log" tab was created…?

    John

    #29731

    John Fisk
    Member

    Hi Ted –

    Thank you for the reply – yes, I will paste the log contents here soon – probably tomorrow morning EST.

    John

    #29725

    tedr
    Moderator

    Hi John,

    Just after the error is shown on the screen when you attempt to create the table, there should be a tab named 'log' shown on the page. Could you go to that tab, copy the entire contents, and paste them here?

    Thanks,
    Ted.
