HBase Forum

Java client not able to connect to secure hbase

  • #53542
    Gaurav Thakur
    Participant

    Hi,

    I have written the following program to connect to a secure HBase instance from a Java client.
    Since the connection to HBase goes through ZooKeeper, I'm using GSSAPI. My code connects to ZooKeeper, which means the Kerberos authentication is working, but the HBase connection still fails.

    Please see the code below:
    System.setProperty(CommonConstants.KRB_REALM, ConfigUtil.getProperty(CommonConstants.HADOOP_CONF, "krb.realm"));
    System.setProperty(CommonConstants.KRB_KDC, ConfigUtil.getProperty(CommonConstants.HADOOP_CONF, "krb.kdc"));
    System.setProperty(CommonConstants.KRB_DEBUG, "true");
    System.setProperty("java.security.auth.login.config", "src/main/resources/login.conf");
    System.setProperty("javax.security.auth.useSubjectCredsOnly", "true");
    final Configuration config = HBaseConfiguration.create();
    /*config.set(CommonConfigurationKeysPublic.HADOOP_SECURITY_AUTHENTICATION, AUTH_KRB);
    config.set(CommonConfigurationKeysPublic.HADOOP_SECURITY_AUTHORIZATION, AUTHORIZATION);
    config.set(CommonConfigurationKeysPublic.FS_AUTOMATIC_CLOSE_KEY, AUTO_CLOSE);
    config.set(CommonConfigurationKeysPublic.FS_DEFAULT_NAME_KEY, defaultFS);*/
    config.set("hbase.zookeeper.quorum", ConfigUtil.getProperty(CommonConstants.HBASE_CONF, "hbase.host"));
    config.set("hbase.zookeeper.property.clientPort", ConfigUtil.getProperty(CommonConstants.HBASE_CONF, "hbase.port"));
    config.set("hbase.client.retries.number", Integer.toString(0));
    config.set("zookeeper.session.timeout", Integer.toString(6000));
    config.set("zookeeper.recovery.retry", Integer.toString(0));
    config.set("hbase.master", "gauravt-namenode.pbi.global.pvt:60000");
    config.set("zookeeper.znode.parent", "/hbase-secure");
    config.set("hbase.rpc.engine", "org.apache.hadoop.hbase.ipc.SecureRpcEngine");
    config.set("hbase.security.authentication", AUTH_KRB);
    config.set("hbase.security.authorization", AUTHORIZATION);
    config.set("hbase.master.kerberos.principal", "hbase/gauravt-namenode.pbi.global.pvt@pbi.global.pvt");
    //config.set("hbase.master.keytab.file", "D:/var/lib/bda/secure/keytabs/hbase.service.keytab");
    config.set("hbase.regionserver.kerberos.principal", "hbase/gauravt-datanode2.pbi.global.pvt@pbi.global.pvt");
    //config.set("hbase.regionserver.keytab.file", "D:/var/lib/bda/secure/keytabs/hbase.service.keytab");
    LoginContext loginCtx = null;
    // "Client" refers to an entry in the JAAS configuration file (login.conf above).
    Subject subject = null;
    loginCtx = new LoginContext("Client");
    loginCtx.login();
    subject = loginCtx.getSubject();
    HBaseAdmin admins = new HBaseAdmin(config);
    if (admins.isTableAvailable("ambarismoketest")) {
        System.out.println("Table is available");
    }
    The exception is:
    Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
    at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java
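    One thing I'm aware of: the Subject returned by loginCtx.getSubject() is only visible to code that actually runs inside Subject.doAs, so the HBaseAdmin calls above may not be picking up the TGT at all. A minimal JDK-only sketch of that pattern (the class name and principal here are placeholders, not my real config):

    ```java
    import java.security.PrivilegedAction;
    import java.util.Collections;
    import javax.security.auth.Subject;
    import javax.security.auth.kerberos.KerberosPrincipal;

    public class DoAsSketch {
        // Demonstration only: build a Subject holding a Kerberos principal and
        // run work inside Subject.doAs, the way the HBase calls would need to be.
        // In the real client the Subject would come from loginCtx.getSubject().
        static String runAsPrincipal(String principalName) {
            KerberosPrincipal principal = new KerberosPrincipal(principalName);
            Subject subject = new Subject(true,
                    Collections.singleton(principal),
                    Collections.emptySet(), Collections.emptySet());
            // Work placed inside doAs executes with this Subject's identity;
            // here we just read the principal name back out.
            return Subject.doAs(subject, (PrivilegedAction<String>) () ->
                    subject.getPrincipals().iterator().next().getName());
        }

        public static void main(String[] args) {
            System.out.println(runAsPrincipal("hbase/host@EXAMPLE.COM"));
        }
    }
    ```

    In my code, the HBaseAdmin construction and the isTableAvailable call would go inside the doAs block instead of running directly after login().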

