Security Forum

Kerberos security in HDP, "GSS initiate failed" for the "hdfs" user

  • #28100
    Dmitry Ochnev
    Participant

    I’m trying to enable security in HDP 2.0, deployed using Ambari 1.4.0 (from the developers’ repository), on a virtual machine, in a single-node cluster.
I have a problem with the Kerberos TGT.
I tried to execute the following two commands (taken from the Puppet error messages):

    [root@dev01 ~]# /usr/bin/kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs
[root@dev01 ~]# su hdfs -c "hadoop --config /etc/hadoop/conf fs -mkdir -p /mapred"
    13/06/25 10:14:04 ERROR security.UserGroupInformation: PriviledgedActionException as:hdfs (auth:KERBEROS) cause:javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    13/06/25 10:14:04 WARN ipc.Client: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    13/06/25 10:14:04 ERROR security.UserGroupInformation: PriviledgedActionException as:hdfs (auth:KERBEROS) cause:java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
mkdir: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "dev01.hortonworks.com/192.168.56.101"; destination host is: "dev01.hortonworks.com":8020;
    [root@dev01 ~]#

The keytab file (/etc/security/keytabs/hdfs.headless.keytab) is in place and the first command finished OK, but the second command did not work.
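
Here is the check I intend to run next: kinit as the hdfs user itself, so the ticket ends up in the credential cache that user actually reads (the klist output below shows the ticket sitting in root's cache, /tmp/krb5cc_0). The fully qualified principal name is my assumption based on the default realm:

# obtain the ticket inside the hdfs user's own session, not root's
su - hdfs -c "kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs@EXAMPLE.COM"

# confirm the ticket is visible to that user
su - hdfs -c "klist"

# retry the failing command in the same context
su - hdfs -c "hadoop --config /etc/hadoop/conf fs -mkdir -p /mapred"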

    Then I tried:
    [root@dev01 ~]# kinit -R
    kinit: Ticket expired while renewing credentials

It looks like the ticket expired immediately after kinit.
Then I checked:

    [root@dev01 ~]# klist
    Ticket cache: FILE:/tmp/krb5cc_0
    Default principal: hdfs@EXAMPLE.COM

    Valid starting Expires Service principal
    06/25/13 10:13:46 06/26/13 10:13:46 krbtgt/EXAMPLE.COM@EXAMPLE.COM
    renew until 06/25/13 10:13:46
    [root@dev01 ~]#

But the ticket looks valid, as far as I understand.
Now I don't understand what is going on with the Kerberos TGT here.
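
One thing I notice in that output is that "renew until" equals the starting time, which would explain why kinit -R says the ticket has expired for renewal. A rough way to check whether the ticket is renewable at all (assuming the standard MIT Kerberos client tools):

# show ticket flags; an 'R' in the flags column means the ticket is renewable
klist -f

# explicitly ask for a renewable ticket; if the KDC caps the renewable life
# at zero, "renew until" will still come back equal to the start time
kinit -r 7d -kt /etc/security/keytabs/hdfs.headless.keytab hdfs@EXAMPLE.COM
klist -f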

    Here is the Kerberos config (/etc/krb5.conf):
    ——–
    [logging]
    default = FILE:/var/log/krb5libs.log
    kdc = FILE:/var/log/krb5kdc.log
    admin_server = FILE:/var/log/kadmind.log

    [libdefaults]
    default_realm = EXAMPLE.COM
    dns_lookup_realm = false
    dns_lookup_kdc = false
    ticket_lifetime = 24h
    renew_lifetime = 7d
    forwardable = true

    [realms]
    EXAMPLE.COM = {
    kdc = dev01.hortonworks.com
    admin_server = dev01.hortonworks.com
    }

    [domain_realm]
    .hortonworks.com = EXAMPLE.COM
    dev01.hortonworks.com = EXAMPLE.COM
    ——–
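
The renew_lifetime above is only what the client asks for; the KDC-side limits (max_renewable_life in kdc.conf and the maxrenewlife attribute of the principals) also have to allow it. A sketch of how that could be checked, assuming the KDC runs on dev01 and kadmin.local is usable there:

# inspect the KDC-side limits for the TGT principal and the hdfs principal
kadmin.local -q "getprinc krbtgt/EXAMPLE.COM@EXAMPLE.COM"
kadmin.local -q "getprinc hdfs@EXAMPLE.COM"

# if "Maximum renewable life" comes back as 0, it can be raised, for example:
kadmin.local -q "modprinc -maxrenewlife 7days krbtgt/EXAMPLE.COM@EXAMPLE.COM"
kadmin.local -q "modprinc -maxrenewlife 7days hdfs@EXAMPLE.COM"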

    Can somebody help me?


  • #28102
    Dmitry Ochnev
    Participant

    Checking the keytab itself:

    [root@dev01 ~]# klist -e -k -t /etc/security/keytabs/hdfs.headless.keytab
    Keytab name: FILE:/etc/security/keytabs/hdfs.headless.keytab
    KVNO Timestamp Principal
---- ----------------- --------------------------------------------------------
    2 06/20/13 03:47:55 hdfs@EXAMPLE.COM (aes256-cts-hmac-sha1-96)
    2 06/20/13 03:47:55 hdfs@EXAMPLE.COM (aes128-cts-hmac-sha1-96)
    2 06/20/13 03:47:55 hdfs@EXAMPLE.COM (des3-cbc-sha1)
    2 06/20/13 03:47:55 hdfs@EXAMPLE.COM (arcfour-hmac)
    2 06/20/13 03:47:55 hdfs@EXAMPLE.COM (des-hmac-sha1)
    2 06/20/13 03:47:55 hdfs@EXAMPLE.COM (des-cbc-md5)
    [root@dev01 ~]#

    Is everything OK with the keytab?
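
If the keytab itself is fine, the next step I can try is tracing the authentication, both at the library level and inside the JVM that runs the hadoop command. This is a sketch; it assumes MIT krb5 1.9+ for KRB5_TRACE and that the hadoop wrapper script passes HADOOP_OPTS through to the JVM (the stock scripts normally do):

# trace the library side of kinit
KRB5_TRACE=/dev/stdout kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs@EXAMPLE.COM

# re-run a simple HDFS operation with JVM-level Kerberos debugging enabled
su hdfs -c 'HADOOP_OPTS="-Dsun.security.krb5.debug=true" hadoop --config /etc/hadoop/conf fs -ls /'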

    #28193
    Sasha J
    Moderator

    Dmitry,
Have you put all the needed properties into core-site.xml, hdfs-site.xml and captured-site.xml?
All processes have to be restarted.
Also, this is the wrong thread for HDP 2.0 questions.
Please use the correct one: http://hortonworks.com/community/forums/forum/hdp-2-0-alpha-feedback-2/

    Thank you!
    Sasha

    #28373
    Dmitry Ochnev
    Participant

    Sasha,
    Thank you for your answer.

I checked against this document: http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/ClusterSetup.html#Running_Hadoop_in_Secure_Mode

    core-site.xml has these properties:

    hadoop.security.authentication
    kerberos

    hadoop.security.authorization
    true

    hdfs-site.xml has these properties:

    dfs.secondary.namenode.kerberos.principal
    nn/_HOST@EXAMPLE.COM

    dfs.datanode.kerberos.principal
    dn/_HOST@EXAMPLE.COM

    dfs.namenode.kerberos.principal
    nn/_HOST@EXAMPLE.COM

    These properties are absent:
    dfs.namenode.kerberos.https.principal
    dfs.namenode.secondary.kerberos.https.principal
    dfs.datanode.kerberos.https.principal

Possibly this is the cause of the problem.

    The file named “captured-site.xml” doesn’t exist and I haven’t seen any reference to it before.
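
To double-check what the client configuration actually resolves to, the values can be read back with hdfs getconf; this is a sketch and assumes the hdfs CLI in this HDP 2.0 build supports the -confKey option:

# print the effective values of the security-related keys
hdfs getconf -confKey hadoop.security.authentication
hdfs getconf -confKey hadoop.security.authorization
hdfs getconf -confKey dfs.namenode.kerberos.principal
hdfs getconf -confKey dfs.datanode.kerberos.principal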

    By the way, is this sufficient for the “hdfs” user keytab?
    xst -k /etc/security/keytabs/hdfs.headless.keytab hdfs

I just thought that maybe I should also try this:
xst -k /etc/security/keytabs/hdfs.headless.keytab host/dev01.hortonworks.com
(I haven't done that yet.)
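
For completeness, the service principals referenced in hdfs-site.xml (nn/_HOST and dn/_HOST) would also need keytabs of their own. A sketch of those exports, where the keytab file names are only examples; note that xst re-randomizes the keys by default, so with kadmin.local the -norandkey option is safer for principals that are already in use:

# run inside kadmin.local on dev01; the principals must already exist
# (e.g. created with addprinc -randkey), and the keytab names are just examples
xst -norandkey -k /etc/security/keytabs/nn.service.keytab nn/dev01.hortonworks.com@EXAMPLE.COM
xst -norandkey -k /etc/security/keytabs/dn.service.keytab dn/dev01.hortonworks.com@EXAMPLE.COM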

    #28381
    Sasha J
    Moderator

    Dmitry,
take a look at this doc:
http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-1.2.3.1/bk_installing_manually_book/content/rpm-chap14.html
It is for HDP 1.x, but I believe it should work the same way on HDP 2.x.
All the properties should be set.

I have set up Kerberos a number of times using the document above, and it works fine for me.
You may also try the setupKerberos.sh script (it comes with the gsInstaller package).
It always works fine.
http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-1.2.4/bk_gsInstaller/content/ch_gsInstaller-chp1.html

    Thank you!
    Sasha

    #46106
    Yeyun Lu
    Member

Have you solved the problem? I ran into a similar problem. Everything seems OK, but when I do an HDFS operation I get the same error: Failed to find any Kerberos tgt.

    #55569
    Peter Bulman
    Participant

Did this get resolved? I'm facing a similar issue.

    #55631
    Dmitry Ochnev
    Participant

No, it just didn't work.
However, someone from the Ambari development team told me that they had solved this problem in one of the later versions.

