Home Forums Hive / HCatalog Some Hive commands when using Hive ODBC driver from QlikView


This topic contains 5 replies, has 2 voices, and was last updated by  Göran Sander 8 months, 4 weeks ago.

  • Creator
    Topic
  • #44574

    Göran Sander
    Participant

    I am running Hive queries from QlikView (version 11.20 SR4 (=latest), 64 bit, Windows Server 2008, latest services packs etc), using the Hive ODBC driver to connect to a Hive Server 2.
    That works fine for queries such as

    describe mytable;

    or

    ADD FILE /my/file/name.dat;

    or

    select * from mytable limit 10;

    But fails for anything involving a “set”, for example:

    SET mapred.job.queue.name=default;

    Specifically, running this VERY basic QlikView reload script fails:

    ODBC CONNECT TO Hive64;
    SQL SET mapred.job.queue.name=default;
    DISCONNECT;

    With an error message as follows:
    Problem signature:
    Problem Event Name: APPCRASH
    Application Name: QvConnect64.EXE
    Application Version: 11.20.12129.0
    Application Timestamp: 526e3c90
    Fault Module Name: ntdll.dll
    Fault Module Version: 6.0.6002.18881
    Fault Module Timestamp: 51da3d16
    Exception Code: c0000005
    Exception Offset: 000000000002574a
    OS Version: 6.0.6002.2.2.0.274.10
    Locale ID: 1053
    Additional Information 1: 26a8
    Additional Information 2: c965ed09d6a44175f02fda08aa2f151d
    Additional Information 3: b049
    Additional Information 4: b6d6acd7334e3a17a113029635b23526

    Read our privacy statement:

    http://go.microsoft.com/fwlink/?linkid=50163&clcid=0x0409

    Granted – this might be a problem in QlikView rather than the Hive ODBC driver. Still posting here though in case others have the same problem, or in case someone has good ideas on how to solve or work around it.

    Thanks!
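In case it helps anyone else hitting this: one possible workaround I am looking at, assuming it applies to this driver version, is the server-side-property mechanism that the Simba-based Hive ODBC drivers (which the Hortonworks driver is built on) document. Keys prefixed with SSP_ in the DSN or connection string are passed to Hive as configuration properties, which would bypass the SQL SET statement entirely. A sketch, using the DSN name and queue from my script above (verify the SSP_ mechanism against your driver's documentation):

```
; Hive64 DSN, Advanced Options / "Server Side Properties"
; (Hortonworks/Simba Hive ODBC driver: the SSP_ prefix tells the
; driver to send the key to Hive as a configuration property)
SSP_mapred.job.queue.name=default
```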

Viewing 5 replies - 1 through 5 (of 5 total)


  • Author
    Replies
  • #47556

    Göran Sander
    Participant

    Oh, by the way:
    I have verified that /my/file/name.dat does exist. In fact it exists both on the Windows server and on the server where HiveServer2 is running.

    #47555

    Göran Sander
    Participant

    So I am still struggling with this problem, even after installing the latest HortonWorks ODBC driver (1.03.19.13).

    The Hive code that works just fine when executed from the CLI with “hive -f code.hql” fails when executed through the ODBC connector and HiveServer2.
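To rule out HiveServer2 itself, the same statement can also be sent through Beeline, which talks to HiveServer2 over JDBC rather than ODBC. Something like this (hostname and port are placeholders for your setup):

```shell
# Send the ADD FILE directly to HiveServer2 via JDBC/Beeline,
# bypassing the ODBC driver; LIST FILES confirms the file was added.
beeline -u jdbc:hive2://hiveserver2-host:10000 \
  -e "ADD FILE /my/file/name.dat; LIST FILES;"
```

If this succeeds, the problem sits somewhere in the ODBC path rather than in HiveServer2.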

    Looking more closely at the logs actually gives some clues:

    With ODBC tracing turned on, after sending the
    ADD FILE /my/file/name.dat;
    statement to the HiveServer2, the following appears in the SQL.log file:


    X ODBC 2bbc-2268 ENTER SQLExecDirectW
    HSTMT 0x00000000004BB430
    WCHAR * 0x0000000001DACE58 [ -3] "ADD FILE d:\GeoIPCity.dat\ 0"
    SDWORD -3

    X ODBC 2bbc-2268 EXIT SQLExecDirectW with return code 0 (SQL_SUCCESS)
    HSTMT 0x00000000004BB430
    WCHAR * 0x0000000001DACE58 [ -3] "ADD FILE d:\GeoIPCity.dat\ 0"
    SDWORD -3

    X ODBC 2bbc-2268 ENTER SQLNumResultCols
    HSTMT 0x00000000004BB430
    SWORD * 0x000000000013EAF4

    X ODBC 2bbc-2268 EXIT SQLNumResultCols with return code 0 (SQL_SUCCESS)
    HSTMT 0x00000000004BB430
    SWORD * 0x000000000013EAF4 (0)

    X ODBC 2bbc-2268 ENTER SQLFetch
    HSTMT 0x00000000004BB430

    X ODBC 2bbc-2268 EXIT SQLFetch with return code -1 (SQL_ERROR)
    HSTMT 0x00000000004BB430

    DIAG [24000] [Hortonworks][ODBC] (10510) Invalid cursor state. (10510)

    X ODBC 2bbc-2268 ENTER SQLErrorW
    HENV 0x00000000004B8F20
    HDBC 0x00000000004B9000
    HSTMT 0x00000000004BB430
    WCHAR * 0x0000000001DAA458
    SDWORD * 0x000000000013FA68
    WCHAR * 0x0000000002240098
    SWORD 2047
    SWORD * 0x000000000013FA00

    X ODBC 2bbc-2268 EXIT SQLErrorW with return code 0 (SQL_SUCCESS)
    HENV 0x00000000004B8F20
    HDBC 0x00000000004B9000
    HSTMT 0x00000000004BB430
    WCHAR * 0x0000000001DAA458 [ 5] "24000"
    SDWORD * 0x000000000013FA68 (10510)
    WCHAR * 0x0000000002240098 [ 49] "[Hortonworks][ODBC] (10510) Invalid cursor state."
    SWORD 2047
    SWORD * 0x000000000013FA00 (49)

    X ODBC 2bbc-2268 ENTER SQLFreeHandle
    SQLSMALLINT 3 <SQL_HANDLE_STMT>
    SQLHANDLE 0x00000000004BB430

    X ODBC 2bbc-2268 EXIT SQLFreeHandle with return code 0 (SQL_SUCCESS)
    SQLSMALLINT 3 <SQL_HANDLE_STMT>
    SQLHANDLE 0x00000000004BB430

    X ODBC 2bbc-2268 ENTER SQLDisconnect
    HDBC 0x00000000004B9000

    Does this mean that the ADD FILE statement never reached the HiveServer2? Judging from the trace, it did at least reach the driver: SQLExecDirectW returns SQL_SUCCESS and SQLNumResultCols reports zero columns, so the statement executed but produced no result set. The “Invalid cursor state” (SQLSTATE 24000) then comes from SQLFetch being called on a statement that has no result set to fetch from. Even so, if the ADD FILE was dropped somewhere along the way, that could explain the subsequent failure when the query actually runs.
    Would this be a bug in the ODBC driver?

    #44929

    Göran Sander
    Participant

    By the way, when you mention setting properties in the cluster directly, what exactly do you mean?
    Where would I do that? In some config file, I guess, but which one?

    #44928

    Göran Sander
    Participant

    @Yi Zhang, thanks for responding.
    Sounds like I might have stumbled upon this bug.
    Setting properties in the cluster directly might be possible, but it is not really practical.

    Any insights into when a fixed ODBC driver might be available?

    #44804

    Yi Zhang
    Moderator

    Hi Goran,

    There is a bug in the ODBC driver that shows up with certain applications; other applications are not affected. Essentially, the property names passed through the driver are converted to upper case, and since Hadoop configuration keys are case-sensitive, they are not recognized. Alternatively, you can set these properties in the cluster directly if that is acceptable.

    Thanks,
    Yi
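For reference, setting a mapred.* property in the cluster directly would typically mean adding it to mapred-site.xml on the relevant nodes, for example like below. The exact file location and which services need restarting depend on your distribution and version, so treat this as a sketch:

```xml
<!-- mapred-site.xml on the cluster nodes (exact location and
     restart requirements depend on the Hadoop distribution) -->
<property>
  <name>mapred.job.queue.name</name>
  <value>default</value>
</property>
```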
