
HDP on Windows – Installation Forum

How to Configure Hive Server 2 for ODBC Connection on HDP 1.3 on Windows

  • #33385
    Seth Lyubich

    By default, the Hive Server 2 service on Windows starts in ‘HTTP’ mode. To use the Hortonworks ODBC driver, the Hive Server 2 service needs to run in ‘Thrift’ mode. The steps below show how to configure Hive Server 2 to run in ‘Thrift’ mode.

    1. Stop any running Hive services:

    – hiveserver2
    – hiveserver1
    – hwi
    – metastore

    2. Configure the HiveServer2 service to run in ‘Thrift’ mode:

    – locate hive-site.xml in %HIVE_CONF_DIR%.

    3. Find the hive.server2.servermode property and set its value to thrift. For example:

    <property>
      <name>hive.server2.servermode</name>
      <value>thrift</value>
      <description>HiveServer server type: thrift or http</description>
    </property>

    4. Restart Hive Services from step 1:

    – hiveserver2
    – hiveserver1
    – hwi
    – metastore
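The edit in step 3 can also be checked or applied with a short script. This is only a sketch: it assumes the standard <configuration>/<property> layout of hive-site.xml, and the helper name and any file path you pass in are illustrative, not part of HDP.

```python
import xml.etree.ElementTree as ET

def set_server_mode(path, mode="thrift"):
    """Set hive.server2.servermode in a hive-site.xml file.

    Adds the property if it is missing. Returns the previous value,
    or None if the property was absent.
    """
    tree = ET.parse(path)
    root = tree.getroot()  # the <configuration> element
    for prop in root.findall("property"):
        if prop.findtext("name") == "hive.server2.servermode":
            value = prop.find("value")
            old = value.text
            value.text = mode
            tree.write(path)
            return old
    # Property not present: append a new <property> block.
    prop = ET.SubElement(root, "property")
    ET.SubElement(prop, "name").text = "hive.server2.servermode"
    ET.SubElement(prop, "value").text = mode
    tree.write(path)
    return None
```

Usage would be something like `set_server_mode(r"C:\hdp\hive\conf\hive-site.xml")` (path hypothetical); run it before restarting the services in step 4.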


  • Author
  • #44374
    Field Wachi

    I encountered a problem.

    My hive-site.xml
    hive.server2.servermode : thrift
    hive.server2.thrift.port : 10001
    javax.jdo.option.ConnectionUserName : hive
    javax.jdo.option.ConnectionPassword : hive

    My Hortonworks Hive ODBC Driver Configuration
    host : HDPSERVER
    port : 10001
    database : default
    hive server type : hive server 2
    mechanism : username
    username : hive

    Test results show:
    Driver Version : v1.2.0.1005
    Running connectivity tests…
    Attempting connection
    Failed to establish connection
    SQLSTATE: HY000[Hortonworks][Hardy] (34) Error from Hive: connect() failed: errno = 10060.

    Field Wachi

    Problem has been resolved.
    I set the UserName to Hadoop and opened the firewall.
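errno 10060 is the Windows "connection timed out" code, which usually means a firewall is dropping the traffic or nothing is listening on the expected port. Before changing driver settings, it can help to confirm the Thrift port is reachable at all. A minimal sketch (the host and port below are the placeholders from the post, not fixed values):

```python
import socket

def port_is_open(host, port, timeout=5.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers refused, unreachable, and timed-out connections
        return False

# Example (placeholder host/port from the configuration above):
# port_is_open("HDPSERVER", 10001)
```

If this returns False from the client machine but True on the server itself, the firewall is the likely culprit, as it was here.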

    Trevor Philipps

    I encountered another problem. If I start my Hive Server 2 in Thrift mode, I get the following errors in my log files:

    2014-09-15 09:30:52,856 ERROR [pool-5-thread-1]: server.TThreadPoolServer ( - Error occurred during processing of message.
    java.lang.RuntimeException: org.apache.thrift.transport.TTransportException
    at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(
    at org.apache.thrift.server.TThreadPoolServer$
    at java.util.concurrent.ThreadPoolExecutor.runWorker(
    at java.util.concurrent.ThreadPoolExecutor$
    Caused by: org.apache.thrift.transport.TTransportException
    at org.apache.thrift.transport.TTransport.readAll(
    at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(
    at org.apache.thrift.transport.TSaslServerTransport.handleSaslStartMessage(
    at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(
    ... 4 more

    Anyone know how to solve this problem?

    Gold User

    In my configuration, the final SQL command that I issue below never returns.

    11/19/2014 11:49 AM 850 metatool
    11/19/2014 11:49 AM 905 schematool
    11/19/2014 11:49 AM 1,634 start_daemons.cmd
    11/19/2014 11:49 AM 1,190 stop_daemons.cmd
    18 File(s) 685,997 bytes
    3 Dir(s) 2,094,387,179,520 bytes free

    Beeline version by Apache Hive
    beeline> !connect jdbc:hive2://localhost:10001/default hadoop Hadoop.2013 org.apache.hive.jdbc.HiveDriver
    Connecting to jdbc:hive2://localhost:10001/default
    14/12/25 07:34:51 INFO jdbc.Utils: Supplied authorities: localhost:10001
    14/12/25 07:34:51 INFO jdbc.Utils: Resolved authority: localhost:10001
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/D:/apache/hdp/hadoop-!/org/slf4j/impl/
    SLF4J: Found binding in [jar:file:/D:/apache/hdp/hive-!/org/slf4j/impl/
    SLF4J: See for an explanation.

    SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
    14/12/25 07:34:51 INFO jdbc.HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://localhost:10001/default
    Connected to: Apache Hive (version
    Driver: Hive JDBC (version
    Transaction isolation: TRANSACTION_REPEATABLE_READ
    0: jdbc:hive2://localhost:10001/default> show tables;
    | tab_name |
    No rows selected (1.785 seconds)
    0: jdbc:hive2://localhost:10001/default> !sql create table xyz (zyx1 int);

    I have the settings for HTTP mode; with or without them the results are the same. No other suggested change had any impact on this issue, and the same happens with external JDBC access. Simple commands like list and show work, but the first valid SQL statement freezes the connection. No errors are reported in the log.

    My config – Win 2008, JDK 1.7_45, Hortonworks 2.2, SQL Server 2008 for Hive access.

    Gold User

    To augment my post above, here are the last entries from hive.log:

    2014-12-25 07:35:16,394 INFO metastore.ObjectStore ( – Initialized ObjectStore
    2014-12-25 07:35:16,401 INFO metadata.HiveUtils ( – Adding metastore authorization provider:
    2014-12-25 07:35:16,401 INFO metadata.HiveUtils ( – Adding metastore authorization provider:
    2014-12-25 07:35:16,428 INFO common.FileUtils ( – Creating directory if it doesn’t exist: hdfs://USAMZAPD2038:8020/hive/warehouse/xyz
    2014-12-25 07:35:16,726 INFO DataNucleus.Datastore ( – The class “org.apache.hadoop.hive.metastore.model.MFieldSchema” is tagged as “embedded-only” so does not have its own datastore table.
    2014-12-25 07:35:16,727 INFO DataNucleus.Datastore ( – The class “org.apache.hadoop.hive.metastore.model.MOrder” is tagged as “embedded-only” so does not have its own datastore table.

    My successful command runs end with:

    2014-12-25 07:34:56,198 INFO metastore.HiveMetaStore ( – 1: get_tables: db=default pat=.*
    2014-12-25 07:34:56,199 INFO HiveMetaStore.audit ( – ugi=hadoop ip=unknown-ip-addr cmd=get_tables: db=default pat=.*
    2014-12-25 07:34:56,221 INFO log.PerfLogger ( – </PERFLOG method=runTasks start=1419510896098 end=1419510896221 duration=123 from=org.apache.hadoop.hive.ql.Driver>
    2014-12-25 07:34:56,222 INFO hooks.ATSHook (<init>(87)) – Created ATS Hook
    2014-12-25 07:34:56,222 INFO log.PerfLogger ( – <PERFLOG from=org.apache.hadoop.hive.ql.Driver>
    2014-12-25 07:34:56,222 INFO log.PerfLogger ( – </PERFLOG start=1419510896222 end=1419510896222 duration=0 from=org.apache.hadoop.hive.ql.Driver>
    2014-12-25 07:34:56,222 INFO log.PerfLogger ( – </PERFLOG method=Driver.execute start=1419510895516 end=1419510896222 duration=706 from=org.apache.hadoop.hive.ql.Driver>
    2014-12-25 07:34:56,224 INFO ql.Driver ( – OK
    2014-12-25 07:34:56,224 INFO log.PerfLogger ( – <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
    2014-12-25 07:34:56,224 INFO log.PerfLogger ( – </PERFLOG method=releaseLocks start=1419510896224 end=1419510896224 duration=0 from=org.apache.hadoop.hive.ql.Driver>
    But this one doesn't.

    xin wang

    I got the following error when I don’t use the “default” database schema with the Windows 32-bit ODBC driver:

    Driver Version: V1.4.14.1014

    Running connectivity tests…

    Attempting connection
    Failed to establish connection
    SQLSTATE: HY000[Hortonworks][HiveODBC] (68) Error returned trying to set hive as the initial database: Error while compiling statement: FAILED: SemanticException [Error 10072]: Database does not exist: hive; Also tried quoting the database name hive but the query failed with the following error: Error while compiling statement: FAILED: SemanticException [Error 10072]: Database does not exist: hive

    Any ideas?
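The (68) error above says the driver runs a USE on the database named in the DSN's Database field ("hive" here), and that schema does not exist on the server. Two hedged options: point the DSN's Database field back at default (which the earlier configurations in this thread use), or create the missing schema first, e.g. from Beeline (a sketch using only the names from the post):

```sql
-- See which schemas actually exist on the server.
SHOW DATABASES;
-- Create the schema the DSN refers to, if it is really wanted.
CREATE DATABASE IF NOT EXISTS hive;
```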

The forum ‘HDP on Windows – Installation’ is closed to new topics and replies.
