How to Configure Hive Server 2 for ODBC Connection on HDP 1.3 on Windows

This topic contains 5 replies, has 4 voices, and was last updated by  Gold User 1 month ago.

  • Creator
    Topic
  • #33385

    Seth Lyubich
    Keymaster

By default, the Hive Server 2 service on Windows starts in ‘HTTP’ mode. To enable the Hortonworks ODBC driver, the Hive Server 2 service needs to run in ‘Thrift’ mode. The steps below show how to configure Hive Server 2 to run in ‘Thrift’ mode.

1. Stop any running Hive services:

    – hiveserver2
    – hiveserver1
    – hwi
    – metastore

2. Configure the HiveServer2 service to run in ‘Thrift’ mode

    – open hive-site.xml, located in %HIVE_CONF_DIR%.

    3. Locate the hive.server2.servermode property and set it to thrift. See example below:

    <property>
      <name>hive.server2.servermode</name>
      <value>thrift</value>
      <description>HiveServer server type: thrift or http</description>
    </property>

4. Restart the Hive services from step 1:

    – hiveserver2
    – hiveserver1
    – hwi
    – metastore
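For anyone applying this change to several nodes, steps 2–3 can be scripted. Here is a minimal Python sketch; the file path and the helper name `set_servermode` are illustrative, so point it at your own %HIVE_CONF_DIR%\hive-site.xml:

```python
# Sketch: set hive.server2.servermode to "thrift" in a hive-site.xml file.
# The path passed in is an example; substitute %HIVE_CONF_DIR%\hive-site.xml.
import xml.etree.ElementTree as ET

def set_servermode(path, mode="thrift"):
    tree = ET.parse(path)
    root = tree.getroot()
    for prop in root.findall("property"):
        if prop.findtext("name") == "hive.server2.servermode":
            # Property exists: overwrite its value.
            prop.find("value").text = mode
            break
    else:
        # Property not present yet: append it to <configuration>.
        prop = ET.SubElement(root, "property")
        ET.SubElement(prop, "name").text = "hive.server2.servermode"
        ET.SubElement(prop, "value").text = mode
    tree.write(path)
```

Run it once per node before restarting the services in step 4.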

    Thanks,
    Seth

Viewing 5 replies - 1 through 5 (of 5 total)


  • Author
    Replies
  • #64927

    Gold User
    Participant

To augment my post above, here are the last lines from hive.log:

2014-12-25 07:35:16,394 INFO metastore.ObjectStore (ObjectStore.java:setConf(247)) - Initialized ObjectStore
    2014-12-25 07:35:16,401 INFO metadata.HiveUtils (HiveUtils.java:getMetaStoreAuthorizeProviderManagers(353)) - Adding metastore authorization provider: org.apache.hadoop.hive.ql.security.authorization.StorageBasedAuthorizationProvider
    2014-12-25 07:35:16,401 INFO metadata.HiveUtils (HiveUtils.java:getMetaStoreAuthorizeProviderManagers(353)) - Adding metastore authorization provider: org.apache.hadoop.hive.ql.security.authorization.MetaStoreAuthzAPIAuthorizerEmbedOnly
    2014-12-25 07:35:16,428 INFO common.FileUtils (FileUtils.java:mkdir(504)) - Creating directory if it doesn't exist: hdfs://USAMZAPD2038:8020/hive/warehouse/xyz
    2014-12-25 07:35:16,726 INFO DataNucleus.Datastore (Log4JLogger.java:info(77)) - The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
    2014-12-25 07:35:16,727 INFO DataNucleus.Datastore (Log4JLogger.java:info(77)) - The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.

My successful command runs end with:

2014-12-25 07:34:56,198 INFO metastore.HiveMetaStore (HiveMetaStore.java:logInfo(713)) - 1: get_tables: db=default pat=.*
    2014-12-25 07:34:56,199 INFO HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(339)) - ugi=hadoop ip=unknown-ip-addr cmd=get_tables: db=default pat=.*
    2014-12-25 07:34:56,221 INFO log.PerfLogger (PerfLogger.java:PerfLogEnd(135)) - </PERFLOG method=runTasks start=1419510896098 end=1419510896221 duration=123 from=org.apache.hadoop.hive.ql.Driver>
    2014-12-25 07:34:56,222 INFO hooks.ATSHook (ATSHook.java:<init>(87)) - Created ATS Hook
    2014-12-25 07:34:56,222 INFO log.PerfLogger (PerfLogger.java:PerfLogBegin(108)) - <PERFLOG method=PostHook.org.apache.hadoop.hive.ql.hooks.ATSHook from=org.apache.hadoop.hive.ql.Driver>
    2014-12-25 07:34:56,222 INFO log.PerfLogger (PerfLogger.java:PerfLogEnd(135)) - </PERFLOG method=PostHook.org.apache.hadoop.hive.ql.hooks.ATSHook start=1419510896222 end=1419510896222 duration=0 from=org.apache.hadoop.hive.ql.Driver>
    2014-12-25 07:34:56,222 INFO log.PerfLogger (PerfLogger.java:PerfLogEnd(135)) - </PERFLOG method=Driver.execute start=1419510895516 end=1419510896222 duration=706 from=org.apache.hadoop.hive.ql.Driver>
    2014-12-25 07:34:56,224 INFO ql.Driver (SessionState.java:printInfo(824)) - OK
    2014-12-25 07:34:56,224 INFO log.PerfLogger (PerfLogger.java:PerfLogBegin(108)) - <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
    2014-12-25 07:34:56,224 INFO log.PerfLogger (PerfLogger.java:PerfLogEnd(135)) - </PERFLOG method=releaseLocks start=1419510896224 end=1419510896224 duration=0 from=org.apache.hadoop.hive.ql.Driver>
But this one doesn't.

    #64926

    Gold User
    Participant

    Seth,
In my configuration, the final SQL command that I issued below never returns.

    11/19/2014 11:49 AM 850 metatool
    11/19/2014 11:49 AM 905 schematool
    11/19/2014 11:49 AM 1,634 start_daemons.cmd
    11/19/2014 11:49 AM 1,190 stop_daemons.cmd
    18 File(s) 685,997 bytes
    3 Dir(s) 2,094,387,179,520 bytes free

    D:\apache\hdp\hive-0.14.0.2.2.0.0-2041\bin>beeline
    Beeline version 0.14.0.2.2.0.0-2041 by Apache Hive
    beeline> !connect jdbc:hive2://localhost:10001/default hadoop Hadoop.2013 org.apache.hive.jdbc.HiveDriver
    Connecting to jdbc:hive2://localhost:10001/default
    14/12/25 07:34:51 INFO jdbc.Utils: Supplied authorities: localhost:10001
    14/12/25 07:34:51 INFO jdbc.Utils: Resolved authority: localhost:10001
    SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/D:/apache/hdp/hadoop-2.6.0.2.2.0.0-2041/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/D:/apache/hdp/hive-0.14.0.2.2.0.0-2041/lib/hive-jdbc-0.14.0.2.2.0.0-2041-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.

    SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
    14/12/25 07:34:51 INFO jdbc.HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://localhost:10001/default
    Connected to: Apache Hive (version 0.14.0.2.2.0.0-2041)
    Driver: Hive JDBC (version 0.14.0.2.2.0.0-2041)
    Transaction isolation: TRANSACTION_REPEATABLE_READ
    0: jdbc:hive2://localhost:10001/default> show tables;
+-----------+--+
    | tab_name  |
    +-----------+--+
    +-----------+--+
    No rows selected (1.785 seconds)
    0: jdbc:hive2://localhost:10001/default> !sql create table xyz (zyx1 int);

I have the settings for HTTP mode; with or without them the results are the same. None of the other suggested changes had any impact on this issue. The same happens with external JDBC access: simple commands like list and show work, but the first valid SQL statement freezes the connection, and the log reports no ERRORs to point at.
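One way to confirm the session is genuinely hung (rather than working slowly) is to put a timeout on a raw client socket, so a server that accepts connections but never answers raises an error instead of blocking forever. This is a diagnostic sketch only, not part of any Hive tooling; the helper name `probe_reply` and the payload are illustrative:

```python
# Sketch: detect a server that accepts TCP connections but never replies.
# Substitute your HiveServer2 host and port when calling probe_reply.
import socket

def probe_reply(host, port, payload=b"\n", timeout=2.0):
    """Return True if the server sends any bytes back within `timeout`,
    False if it accepts the connection but then stays silent."""
    with socket.create_connection((host, port), timeout=timeout) as s:
        s.settimeout(timeout)
        s.sendall(payload)
        try:
            return len(s.recv(1)) > 0
        except socket.timeout:
            return False
```

If this returns False against port 10001, the server side is accepting the connection but not servicing it, which matches the freeze described above.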

    My config – Win 2008, JDK 1.7_45, Hortonworks 2.2, SQL Server 2008 for Hive access.

    #60404

    Trevor Philipps
    Participant

I encountered another problem. If I start my Hive Server 2 in thrift mode, I get the following errors in my log files:


    2014-09-15 09:30:52,856 ERROR [pool-5-thread-1]: server.TThreadPoolServer (TThreadPoolServer.java:run(215)) - Error occurred during processing of message.
    java.lang.RuntimeException: org.apache.thrift.transport.TTransportException
    at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:219)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:189)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:744)
    Caused by: org.apache.thrift.transport.TTransportException
    at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
    at org.apache.thrift.transport.TTransport.readAll(TTransport.java:84)
    at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:178)
    at org.apache.thrift.transport.TSaslServerTransport.handleSaslStartMessage(TSaslServerTransport.java:125)
    at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:253)
    at org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
    at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)
    ... 4 more

    Anyone know how to solve this problem?

    #44375

    Field Wachi
    Participant

Problem has been resolved.
    Setting UserName to Hadoop and opening the firewall fixed it.

    #44374

    Field Wachi
    Participant

    I encountered a problem.

    My hive-site.xml
    hive.server2.servermode : thrift
    hive.server2.thrift.port : 10001
    javax.jdo.option.ConnectionUserName : hive
    javax.jdo.option.ConnectionPassword : hive

My Hortonworks Hive ODBC Driver Configuration
    host : HDPSERVER
    port : 10001
    database : default
    hive server type : hive server 2
    mechanism : username
    username : hive

The test results show:
    Driver Version : v1.2.0.1005
    Running connectivity tests…
    Attempting connection
    Failed to establish connection
    SQLSTATE: HY000[Hortonworks][Hardy] (34) Error from Hive: connect() failed: errno = 10060.
    TESTS COMPLETED WITH ERROR.
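Error 10060 is the Winsock connection-timeout code, which usually means the port is unreachable from the client (firewall, wrong host/port, or the service not listening) rather than a Hive-level fault. A quick TCP reachability probe from the ODBC client machine can separate the two cases; this sketch is independent of the ODBC driver, and the host/port mirror the configuration above:

```python
# Sketch: check whether a host:port is reachable over TCP before
# re-running the ODBC connectivity test. Host/port are examples.
import socket

def port_open(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

If `port_open("HDPSERVER", 10001)` returns False from the client machine, the 10060 error is a network or firewall issue, which is consistent with the firewall fix reported above.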
