Hive / HCatalog Forum

using user-supplied jars

  • #28479
    Andrew Hume
    Participant

    we are running a jdbc query that uses a .jar file we have.
    we installed the jar file in the same path on all nodes in our cluster,
    and added a property to the hive-site.xml file:

    hive.aux.jars.path
    file:///usr/lib/hive/auxlib/SA_UDF.jar
    available for one and all

    we pushed this file out to all nodes, and then stopped/restarted hive.

    after all that, the query still can’t find the jar file (or more exactly, a class
    defined in that jar file).

    what did i do wrong?
    (and is this documented anywhere?)

  • #28480
    Andrew Hume
    Participant

    somehow the wretched editor corrupted the property.
    here it is again:

    hive.aux.jars.path
    file:///usr/lib/hive/auxlib/SA_UDF.jar
    available for one and all

    #28481
    Andrew Hume
    Participant

    bollocks! i don’t know how to escape the xml so i’ll set off the field names

    name=hive.aux.jars.path
    value=file:///usr/lib/hive/auxlib/SA_UDF.jar
    description=available for one and all
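
    for reference, those fields correspond to a standard hive-site.xml property block:

    <property>
      <name>hive.aux.jars.path</name>
      <value>file:///usr/lib/hive/auxlib/SA_UDF.jar</value>
      <description>available for one and all</description>
    </property>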

    #28495
    Seth Lyubich
    Moderator

    Hi Andrew,

    I think you can try a few things:

    1. From the Hive shell, add your jar with the command ‘add jar /path/to/jar’ (see the short session sketch below).
    2. Run the Hive command line specifying the aux jar, with something like:
    hive --auxpath /path-to-/hive-examples.jar
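
    For example, a quick check from the Hive shell might look like this (just a sketch; the jar path is the one mentioned earlier in this thread):

    hive> ADD JAR /usr/lib/hive/auxlib/SA_UDF.jar;
    hive> LIST JARS;   -- should now show SA_UDF.jar among the session resources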

    Hope this helps,

    Thanks,
    Seth

    #28496
    Andrew Hume
    Participant

    the hive query works; it’s just the JDBC query that doesn’t find it.
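
    (aside, for anyone hitting the same wall: the session-level ADD JAR can also be issued over the JDBC connection itself before the query runs. below is a minimal sketch, assuming HiveServer2 and the standard Hive JDBC driver; the host, port, UDF class name, and table are placeholders, and only the jar path comes from this thread.)

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class HiveJdbcUdfSketch {
        public static void main(String[] args) throws Exception {
            // assumption: HiveServer2 with the org.apache.hive.jdbc driver;
            // the older HiveServer1 uses a different driver class and URL scheme.
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            Connection conn = DriverManager.getConnection(
                    "jdbc:hive2://hiveserver-host:10000/default", "hive", "");
            Statement stmt = conn.createStatement();

            // make the jar (path as seen on the server side) visible to this session,
            // then register the class it contains as a temporary function.
            stmt.execute("ADD JAR /usr/lib/hive/auxlib/SA_UDF.jar");
            stmt.execute("CREATE TEMPORARY FUNCTION sa_udf AS 'com.example.SaUdf'"); // class name is a placeholder

            ResultSet rs = stmt.executeQuery("SELECT sa_udf(col1) FROM some_table LIMIT 10");
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
            rs.close();
            stmt.close();
            conn.close();
        }
    }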

    #28591
    Andrew Hume
    Participant

    after putting in some debugging and being reminded of how moronic the java jar path handling is,
    i decided to brute force it by adding
    HIVE_AUX_JARS_PATH=/usr/lib/hive/auxlib
    near the top of hive-config.sh.

    problem solved.
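
    in case it helps anyone else, that edit is just (a sketch; whether the explicit export is needed depends on your version of hive-config.sh):

    # near the top of hive-config.sh: point HIVE_AUX_JARS_PATH at the directory
    # holding the user-supplied jars, then stop/restart hive so it is picked up
    export HIVE_AUX_JARS_PATH=/usr/lib/hive/auxlib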

    #28609
    tedr
    Moderator

    Hi Andrew,

    Thanks for letting us know that you solved the problem and how you did it.

    Thanks,
    Ted.
