HDFS Cross Realm Authentication

This topic contains 1 reply, has 1 voice, and was last updated by Chok S 4 months, 1 week ago.

  • Creator
    Topic
  • #58751

    Chok S
    Participant

    Hello everyone,
    I’m having issues setting up cross-realm authentication between two secured clusters (HDP 1.3.3). All services are working fine in their respective clusters.

    Here is the krb5.conf file in one realm (EXAMPLE.COM)
    —–
    [logging]
    default = FILE:/var/log/krb5libs.log
    kdc = FILE:/var/log/krb5kdc.log
    admin_server = FILE:/var/log/kadmind.log

    [appdefaults]
    validate=false

    [libdefaults]
    default_realm = EXAMPLE.COM
    dns_lookup_realm = false
    dns_lookup_kdc = false
    ticket_lifetime = 24h
    renew_lifetime = 7d
    forwardable = true

    [realms]
    EXAMPLE.COM = {
    kdc = ip-10-0-0-60
    admin_server = ip-10-0-0-60
    }

    TEST.DATALAKE.COM = {
    kdc = ip-10-0-0-240
    admin_server = ip-10-0-0-240
    }

    [domain_realm]
    .example.com = EXAMPLE.COM
    example.com = EXAMPLE.COM
    .test.datalake.com = TEST.DATALAKE.COM
    test.datalake.com = TEST.DATALAKE.COM
    —–
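For reference, libkrb5 consults [domain_realm] by exact host entry first, then by successively shorter domain suffixes; a bare hostname such as the ip-10-0-0-* names above matches none of the dotted entries and falls back to the default realm (since dns_lookup_realm is false). A rough Python sketch of that lookup (the function is my approximation, not libkrb5 itself):

```python
def realm_for_host(hostname, domain_realm, default_realm):
    """Approximate [domain_realm] resolution: exact host entry first,
    then successively shorter domain suffixes, else the default realm."""
    host = hostname.lower()
    if host in domain_realm:
        return domain_realm[host]
    parts = host.split(".")
    for i in range(1, len(parts)):
        suffix = "." + ".".join(parts[i:])
        if suffix in domain_realm:
            return domain_realm[suffix]
    return default_realm

# Entries taken from the config above:
mapping = {
    ".example.com": "EXAMPLE.COM",
    "example.com": "EXAMPLE.COM",
    ".test.datalake.com": "TEST.DATALAKE.COM",
    "test.datalake.com": "TEST.DATALAKE.COM",
}
# A fully qualified remote host maps to the remote realm:
print(realm_for_host("nn1.test.datalake.com", mapping, "EXAMPLE.COM"))
# A bare hostname matches no suffix and silently falls back to the
# local default realm:
print(realm_for_host("ip-10-0-0-42", mapping, "EXAMPLE.COM"))
```

This is why cross-realm setups typically reference remote hosts by fully qualified names that the [domain_realm] suffixes can match.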

    And here is the krb5.conf file in my second realm (TEST.DATALAKE.COM)
    —–
    [logging]
    default = FILE:/var/log/krb5libs.log
    kdc = FILE:/var/log/krb5kdc.log
    admin_server = FILE:/var/log/kadmind.log

    [appdefaults]
    validate=false

    [libdefaults]
    default_realm = TEST.DATALAKE.COM
    dns_lookup_realm = false
    dns_lookup_kdc = false
    ticket_lifetime = 24h
    renew_lifetime = 7d
    forwardable = true

    [realms]
    TEST.DATALAKE.COM = {
    kdc = ip-10-0-0-240
    admin_server = ip-10-0-0-240
    }

    EXAMPLE.COM = {
    kdc = ip-10-0-0-60
    admin_server = ip-10-0-0-60
    }

    [domain_realm]
    .test.datalake.com = TEST.DATALAKE.COM
    test.datalake.com = TEST.DATALAKE.COM
    .example.com = EXAMPLE.COM
    example.com = EXAMPLE.COM
    —–

    I have added the following two principals to both Kerberos databases
    krbtgt/TEST.DATALAKE.COM@EXAMPLE.COM
    krbtgt/EXAMPLE.COM@TEST.DATALAKE.COM
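For context, each principal encodes one direction of trust: krbtgt/&lt;server realm&gt;@&lt;client realm&gt; is what a client asks its own KDC for when it needs tickets in the other realm, and the entry typically must exist in both KDCs with an identical password, kvno, and enctypes. A tiny sketch of the naming (the helper function is mine):

```python
def cross_realm_tgt(client_realm, server_realm):
    # The ticket-granting principal a client in client_realm requests
    # from its own KDC to reach services in server_realm.
    return f"krbtgt/{server_realm}@{client_realm}"

# EXAMPLE.COM clients reaching into TEST.DATALAKE.COM:
print(cross_realm_tgt("EXAMPLE.COM", "TEST.DATALAKE.COM"))
# → krbtgt/TEST.DATALAKE.COM@EXAMPLE.COM
# And the reverse direction:
print(cross_realm_tgt("TEST.DATALAKE.COM", "EXAMPLE.COM"))
# → krbtgt/EXAMPLE.COM@TEST.DATALAKE.COM
```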

    This is my hadoop.security.auth_to_local setting in cluster EXAMPLE.COM –
    ——
    RULE:[1:$1@$0](^.*@TEST\.DATALAKE\.COM$)s/^(.*)@TEST\.DATALAKE\.COM$/$1/g
    RULE:[2:$1@$0](^.*@TEST\.DATALAKE\.COM$)s/^(.*)@TEST\.DATALAKE\.COM$/$1/g
    RULE:[2:$1@$0](jt@.*EXAMPLE.COM)s/.*/mapred/
    RULE:[2:$1@$0](tt@.*EXAMPLE.COM)s/.*/mapred/
    RULE:[2:$1@$0](nn@.*EXAMPLE.COM)s/.*/hdfs/
    RULE:[2:$1@$0](dn@.*EXAMPLE.COM)s/.*/hdfs/
    RULE:[2:$1@$0](hbase@.*EXAMPLE.COM)s/.*/hbase/
    RULE:[2:$1@$0](hbase@.*EXAMPLE.COM)s/.*/hbase/
    RULE:[2:$1@$0](oozie@.*EXAMPLE.COM)s/.*/oozie/
    DEFAULT
    ——-
    And this is my hadoop.security.auth_to_local setting in cluster TEST.DATALAKE.COM –
    RULE:[1:$1@$0](^.*@EXAMPLE\.COM$)s/^(.*)@EXAMPLE\.COM$/$1/g
    RULE:[2:$1@$0](^.*@EXAMPLE\.COM$)s/^(.*)@EXAMPLE\.COM$/$1/g
    RULE:[2:$1@$0](jt@.*TEST.DATALAKE.COM)s/.*/mapred/
    RULE:[2:$1@$0](tt@.*TEST.DATALAKE.COM)s/.*/mapred/
    RULE:[2:$1@$0](nn@.*TEST.DATALAKE.COM)s/.*/hdfs/
    RULE:[2:$1@$0](dn@.*TEST.DATALAKE.COM)s/.*/hdfs/
    RULE:[2:$1@$0](hbase@.*TEST.DATALAKE.COM)s/.*/hbase/
    RULE:[2:$1@$0](hbase@.*TEST.DATALAKE.COM)s/.*/hbase/
    RULE:[2:$1@$0](oozie@.*TEST.DATALAKE.COM)s/.*/oozie/
    DEFAULT
    ——
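Since the s/…/…/ tail of each RULE is sed-style substitution, an individual rule can be sanity-checked offline. A rough Python approximation of one rule (the function is mine, not the actual Hadoop evaluator, and it takes the principal already formatted as the rule's $1@$0 string):

```python
import re

def apply_rule(formatted, match_regex, sub_pattern, replacement):
    """Rough sketch of one auth_to_local RULE: test the $1@$0-formatted
    principal against the rule's match regex, then apply the sed-style
    substitution. Returns None when the rule does not match."""
    if not re.match(match_regex, formatted):
        return None
    return re.sub(sub_pattern, replacement, formatted)

# RULE:[1:$1@$0](^.*@EXAMPLE\.COM$)s/^(.*)@EXAMPLE\.COM$/$1/g
# applied to a one-component principal from the remote realm:
print(apply_rule("datalake@EXAMPLE.COM",
                 r"^.*@EXAMPLE\.COM$",
                 r"^(.*)@EXAMPLE\.COM$", r"\1"))
# → datalake
```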

Viewing 1 reply (of 1 total)


  • Author
    Replies
  • #58752

    Chok S
    Participant

    datalake is a user in the EXAMPLE.COM cluster and has a file in HDFS at:
    hdfs://ip-10-0-0-62:8020/user/datalake/file.txt

    When I run the following command from the EXAMPLE.COM cluster:
    hadoop distcp hdfs://ip-10-0-0-62:8020/user/datalake/file.txt hdfs://ip-10-0-0-42:8020/user/bond007/distcp-receiver

    I’m getting the following error:
    14/08/14 20:16:53 ERROR security.UserGroupInformation: PriviledgedActionException as:datalake@EXAMPLE.COM cause:javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Server not found in Kerberos database (7) - UNKNOWN_SERVER)]
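For what it's worth, UNKNOWN_SERVER usually means a KDC was asked for a service principal it has no entry for. One common cause in cross-realm setups: the client builds the NameNode principal as nn/&lt;host&gt;@&lt;realm&gt;, and when the bare target hostname matches no [domain_realm] suffix, the local realm is assumed. A hypothetical sketch of that failure mode (function and hosts are illustrative):

```python
# Suffix-only [domain_realm] entries, as in the configs above:
domain_realm = {
    ".test.datalake.com": "TEST.DATALAKE.COM",
    ".example.com": "EXAMPLE.COM",
}

def service_principal(service, host, default_realm="EXAMPLE.COM"):
    # Map the host's domain suffix to a realm; a bare hostname has no
    # suffix and falls back to the client's default realm.
    suffix = host[host.find("."):] if "." in host else ""
    realm = domain_realm.get(suffix, default_realm)
    return f"{service}/{host}@{realm}"

print(service_principal("nn", "ip-10-0-0-42"))
# → nn/ip-10-0-0-42@EXAMPLE.COM
# The EXAMPLE.COM KDC has no such entry (the real NameNode principal
# lives in TEST.DATALAKE.COM), hence UNKNOWN_SERVER.
```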

    What am I missing here? Any help would be highly appreciated. Are there any documents or tutorials available that cover this exact topic? I found a document covering HDP & AD, but not one covering two HDP clusters.

    Thanks in advance.
