Hive / HCatalog Forum

jdbc test

  • #33771

    I’m working with SAS support to make SAS Connect work with HDP 1.1 on Windows.
    They gave me a test Java client to check the JDBC connection, but we can’t run it :(
    It can’t load org.apache.hadoop.hive.jdbc.HiveDriver; it’s as if the class isn’t
    there (a quick classpath check is sketched after the client below).

    Test client:


    import java.sql.*;
    import java.util.Properties;

    public class HiveJdbcClient3 {
        public static void main(String[] args)
                throws SQLException, ClassNotFoundException, InstantiationException, IllegalAccessException {

            int hiveEdition = 0;
            Driver hiveDriver;
            Connection con;
            String connectionString;
            Properties props = new Properties();

            if (args.length == 2) {
                System.out.println("Performing a Hive1 connect to: " + args[0] + ":" + args[1]);
                hiveEdition = 1;
            } else if (args.length == 3) {
                System.out.println("Performing a KERBEROS Hive2 connect to: " + args[0] + ":" + args[1]
                        + " with Kerberos principal " + args[2]);
                hiveEdition = 2;
            } else {
                System.out.println("Invalid invocation, #args must be 2 or 3, this invocation gave: " + args.length);
                System.out.println("");
                System.out.println("A valid 2 arg invocation supplies server and port, and assumes a Hive1 server");
                System.out.println("");
                System.out.println("A valid 3 arg invocation supplies server, port, and Kerberos JDBC principal, and assumes a Hive2 server");
                System.exit(1);
            }

            if (hiveEdition == 1) {
                // HiveServer1: old driver class and jdbc:hive:// URL scheme.
                connectionString = "jdbc:hive://" + args[0] + ":" + args[1] + "/default";
                System.out.println("Full connection string is: " + connectionString);
                hiveDriver = (Driver) Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver").newInstance();
                con = hiveDriver.connect(connectionString, props);
            } else {
                // HiveServer2: new driver class, jdbc:hive2:// URL scheme, Kerberos principal appended.
                connectionString = "jdbc:hive2://" + args[0] + ":" + args[1] + "/default;principal=" + args[2];
                System.out.println("Full connection string is: " + connectionString);
                hiveDriver = (Driver) Class.forName("org.apache.hive.jdbc.HiveDriver").newInstance();
                con = hiveDriver.connect(connectionString, props);
            }

            Statement stmt = con.createStatement();

            // See if "show tables" works; stop after three table names.
            String sql = "show tables";
            System.out.println("Running: " + sql);
            ResultSet res = stmt.executeQuery(sql);
            int tableCount = 1;
            while (res.next()) {
                System.out.println(res.getString(1));
                if (tableCount++ == 3) {
                    System.out.println("");
                    System.out.println("Connectivity SUCCESSFULLY demonstrated, exiting after 3 table names displayed");
                    System.exit(0); // success
                }
            }
        }
    }
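
    A quick way to confirm whether the driver class is actually visible is a check like
    the minimal sketch below. It only tries to load the two driver class names used in
    the test client above (nothing else is assumed) and reports which are missing from
    the classpath.

    public class HiveDriverCheck {
        public static void main(String[] args) {
            // Both class names come from the test client above:
            // the HiveServer1 driver first, the HiveServer2 driver second.
            String[] drivers = {
                "org.apache.hadoop.hive.jdbc.HiveDriver",
                "org.apache.hive.jdbc.HiveDriver"
            };
            for (String name : drivers) {
                try {
                    Class.forName(name);
                    System.out.println("FOUND:   " + name);
                } catch (ClassNotFoundException e) {
                    System.out.println("MISSING: " + name + " (its JAR is not on the classpath)");
                }
            }
        }
    }

    If both come back MISSING, the problem is the client classpath rather than the server.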


  • #35656
    David Schorow
    Moderator

    First and foremost, I’d suggest upgrading to HDP 1.3 for Windows. That is a far more up-to-date release with better documentation.

    I’m not completely clear on what applies to HDP 1.1 for Windows, but you’ll first need to get all of the relevant JARs onto your client machine and into your Java classpath. See the section “Running the JDBC Sample Code” in the HiveServer2 Clients documentation (a minimal smoke test is sketched after the JAR list below).

    https://cwiki.apache.org/confluence/display/Hive/HiveServer2+Clients#HiveServer2Clients-RunningtheJDBCSampleCode

    Replicated here:
    # To run the program using remote hiveserver in non-kerberos mode, we need the following jars in the classpath
    # from hive/build/dist/lib
    # hive-jdbc*.jar
    # hive-service*.jar
    # libfb303-0.9.0.jar
    # libthrift-0.9.0.jar
    # log4j-1.2.16.jar
    # slf4j-api-1.6.1.jar
    # slf4j-log4j12-1.6.1.jar
    # commons-logging-1.0.4.jar
    #
    # Following additional jars are needed for the kerberos secure mode:
    # hive-exec*.jar
    # commons-configuration-1.6.jar
    # and from hadoop: hadoop-*core.jar
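
    Once those JARs are in place, the standard DriverManager pattern should work without
    instantiating the driver by hand. A minimal smoke-test sketch, assuming a non-Kerberos
    HiveServer2 endpoint (the host name "myserver" and port 10000 are placeholders; the
    class and URL scheme are the HiveServer2 ones from the test client above):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class HiveJdbcSmokeTest {
        public static void main(String[] args) throws Exception {
            // Fails fast with ClassNotFoundException if the hive-jdbc JAR is missing.
            // (JDBC 4.0+ drivers self-register, so this line is otherwise optional.)
            Class.forName("org.apache.hive.jdbc.HiveDriver");

            // Placeholder host and port; substitute your HiveServer2 instance.
            String url = "jdbc:hive2://myserver:10000/default";
            try (Connection con = DriverManager.getConnection(url);
                 Statement stmt = con.createStatement();
                 ResultSet res = stmt.executeQuery("show tables")) {
                while (res.next()) {
                    System.out.println(res.getString(1));
                }
            }
        }
    }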

    I hope that helps,
    David

