Error from Hive: '403' Message: 'Error while processing statement: null'



This topic contains 0 replies, has 1 voice, and was last updated by Steen Paulsen 1 year, 10 months ago.

  • #37526

    Steen Paulsen
    Participant

    I have created a table in Hive and loaded it with data.
    Whenever I try to get data from the table via ODBC I get the following error:

    "Open Database Connectivity (ODBC) error occurred. state: 'HY000'. Native Error Code: 35. [Hortonworks][Hardy] (35) Error from Hive: error code: '403' error message: 'Error while processing statement: null'."
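
    For reference, the access on the ODBC side is essentially the following; this is only a minimal pyodbc sketch, and the DSN name "HiveDSN" is a placeholder for however the Hortonworks Hive ODBC data source is actually named:

        # Minimal sketch of reading the table through the ODBC connector.
        # "HiveDSN" is a placeholder DSN for the Hortonworks Hive ODBC driver;
        # "tablename" stands in for the real table, as in the queries below.
        import pyodbc

        conn = pyodbc.connect("DSN=HiveDSN", autocommit=True)  # Hive has no transactions
        cursor = conn.cursor()

        # This is the call that returns the "(35) Error from Hive: error code: '403'"
        # message instead of rows.
        cursor.execute("SELECT * FROM tablename")
        for row in cursor.fetchmany(10):
            print(row)

        conn.close()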

    The table has four columns:
    type – string
    date – string
    entries – int
    value – string

    I have also checked for NULL values directly in Hive:
    – SELECT COUNT(*) FROM tablename WHERE type IS NULL;
    – SELECT COUNT(*) FROM tablename WHERE date IS NULL;
    – SELECT COUNT(*) FROM tablename WHERE entries IS NULL;
    – SELECT COUNT(*) FROM tablename WHERE value IS NULL;
    All four queries return 0, so there are no NULL values in the table.

    I can run SELECT statements directly in Hive from the command line without any problems (SELECT * FROM tablename;).
    But as soon as I try to access the data via the ODBC connector, the error above occurs.
    The error comes back almost instantly, which leads me to believe it is a driver issue: the table holds 17 GB of data, so actually scanning it would naturally take some time.
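
    Roughly the kind of timing check behind that observation (again just a pyodbc sketch with placeholder DSN and table names):

        # Time the ODBC call and catch the error. If the driver were actually
        # scanning the 17 GB table, this would take far longer than the
        # near-instant failure described above.
        import time
        import pyodbc

        start = time.time()
        try:
            conn = pyodbc.connect("DSN=HiveDSN", autocommit=True)
            cursor = conn.cursor()
            cursor.execute("SELECT * FROM tablename")
            rows = cursor.fetchmany(10)
            print("got %d rows in %.2f s" % (len(rows), time.time() - start))
        except pyodbc.Error as exc:
            print("failed after %.2f s: %s" % (time.time() - start, exc))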

    My setup is a cluster of three CentOS 6.4 nodes with HDP 1.3.2 in Hyper-V on Windows 8 Pro.
    Any ideas what could be happening?

