Hive / HCatalog forum: jdbc test

This topic contains 1 reply, has 2 voices, and was last updated by David Schorow 11 months, 3 weeks ago.

  • Topic #33771 (Creator)

    I’m working with SAS support to make SAS connect work with HDP 1.1 on Windows.
    They gave me a Java test client to exercise the JDBC connection, but we can’t run it :(
    It fails to load org.apache.hadoop.hive.jdbc.HiveDriver; it looks like that class just isn’t there on the classpath.

    Test client:


import java.io.*;
import java.util.*;
import java.sql.*;

public class HiveJdbcClient3 {

    public static void main(String[] args)
            throws SQLException, ClassNotFoundException, InstantiationException, IllegalAccessException {

        int Hive_edition = 0;
        Driver hiveDriver;
        Connection con;
        String connection_string;
        Properties props = new Properties();

        // Two args: host and port of a Hive1 (original HiveServer) instance.
        // Three args: host, port, and Kerberos principal of a Hive2 (HiveServer2) instance.
        if (args.length == 2) {
            System.out.println("Performing a Hive1 connect to: " + args[0] + ":" + args[1]);
            Hive_edition = 1;
        } else if (args.length == 3) {
            System.out.println("Performing a KERBEROS Hive2 connect to: " + args[0] + ":" + args[1]
                    + " with Kerberos principal " + args[2]);
            Hive_edition = 2;
        } else {
            System.out.println("Invalid invocation, #args must be 2 or 3, this invocation gave: " + args.length);
            System.out.println("");
            System.out.println("A valid 2 arg invocation supplies server and port, and assumes a Hive1 server");
            System.out.println("");
            System.out.println("A valid 3 arg invocation supplies server, port, and Kerberos JDBC principal, and assumes a Hive2 server");
            System.exit(1);
        }

        if (Hive_edition == 1) {
            // Hive1: jdbc:hive:// URL, driver class org.apache.hadoop.hive.jdbc.HiveDriver
            connection_string = "jdbc:hive://" + args[0] + ":" + args[1] + "/default";
            System.out.println("Full connection string is: " + connection_string);
            hiveDriver = (Driver) Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver").newInstance();
            con = hiveDriver.connect(connection_string, props);
        } else {
            // Hive2 (HiveServer2): jdbc:hive2:// URL, driver class org.apache.hive.jdbc.HiveDriver
            connection_string = "jdbc:hive2://" + args[0] + ":" + args[1] + "/default;principal=" + args[2];
            System.out.println("Full connection string is: " + connection_string);
            hiveDriver = (Driver) Class.forName("org.apache.hive.jdbc.HiveDriver").newInstance();
            con = hiveDriver.connect(connection_string, props);
        }

        Statement stmt = con.createStatement();

        // See if "show tables" works; print at most three table names.
        ResultSet res;
        String sql = "show tables";
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        int table_count = 1;
        while (res.next()) {
            System.out.println(res.getString(1));
            if (table_count++ == 3) {
                System.out.println("");
                System.out.println("Connectivity SUCCESSFULLY demonstrated, exiting after 3 table names displayed");
                System.exit(0); // exit 0 to signal success
            }
        }
    }
}
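
    For reference, a minimal standalone check of whether either Hive driver class is even visible on the classpath might look like the sketch below (illustrative only, not part of the SAS-supplied client):

import java.security.CodeSource;

// Illustrative only: attempt to load both Hive JDBC driver class names and
// report which ones are actually visible on the current classpath.
public class HiveDriverCheck {
    public static void main(String[] args) {
        String[] driverClasses = {
            "org.apache.hadoop.hive.jdbc.HiveDriver", // Hive1 driver
            "org.apache.hive.jdbc.HiveDriver"         // Hive2 (HiveServer2) driver
        };
        for (String name : driverClasses) {
            try {
                Class<?> cls = Class.forName(name);
                CodeSource src = cls.getProtectionDomain().getCodeSource();
                System.out.println("FOUND:   " + name + (src != null ? " in " + src.getLocation() : ""));
            } catch (ClassNotFoundException e) {
                System.out.println("MISSING: " + name + " (hive-jdbc*.jar and its dependencies are not on the classpath)");
            }
        }
    }
}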

  • Reply #35656 (David Schorow)

    First and foremost, I’d suggest upgrading to HDP 1.3 for Windows. That is a far more up-to-date release with better documentation.

    I’m not completely clear on what applies to HDP 1.1 Windows, but you’ll first need to get all of the relevant JARs onto your client machine and into your Java classpath. See the section “Running the JDBC Sample Code” in the HiveServer2 Clients documentation:

    https://cwiki.apache.org/confluence/display/Hive/HiveServer2+Clients#HiveServer2Clients-RunningtheJDBCSampleCode

    Replicated here:
    # To run the program using remote hiveserver in non-kerberos mode, we need the following jars in the classpath
    # from hive/build/dist/lib
    #   hive-jdbc*.jar
    #   hive-service*.jar
    #   libfb303-0.9.0.jar
    #   libthrift-0.9.0.jar
    #   log4j-1.2.16.jar
    #   slf4j-api-1.6.1.jar
    #   slf4j-log4j12-1.6.1.jar
    #   commons-logging-1.0.4.jar
    #
    # Following additional jars are needed for the kerberos secure mode -
    #   hive-exec*.jar
    #   commons-configuration-1.6.jar
    # and from hadoop - hadoop-*core.jar
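
    With those jars in place, the connection itself follows the standard pattern from that sample-code page. Roughly something like the sketch below (placeholder host name, the default HiveServer2 port 10000, and non-Kerberos mode assumed):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Sketch of the standard HiveServer2 JDBC pattern; "your-hiveserver2-host" and
// port 10000 are placeholders, and non-Kerberos mode is assumed.
public class HiveServer2Smoke {
    public static void main(String[] args) throws Exception {
        // Must resolve once the jars listed above are on the classpath.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        Connection con = DriverManager.getConnection(
                "jdbc:hive2://your-hiveserver2-host:10000/default", "", "");
        Statement stmt = con.createStatement();
        ResultSet res = stmt.executeQuery("show tables");
        while (res.next()) {
            System.out.println(res.getString(1));
        }
        con.close();
    }
}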

    I hope that helps,
    David
