HDFS Forum

Cross Realm Authentication

  • #58751
    Chok S
    Participant

    Hello everyone,
I’m having issues while setting up cross-realm authentication between two secured clusters (HDP 1.3.3). All services are working fine in their respective clusters.

    Here is the krb5.conf file in one realm (EXAMPLE.COM)
    —–
    [logging]
    default = FILE:/var/log/krb5libs.log
    kdc = FILE:/var/log/krb5kdc.log
    admin_server = FILE:/var/log/kadmind.log

    [appdefaults]
    validate=false

    [libdefaults]
    default_realm = EXAMPLE.COM
    dns_lookup_realm = false
    dns_lookup_kdc = false
    ticket_lifetime = 24h
    renew_lifetime = 7d
    forwardable = true

    [realms]
    EXAMPLE.COM = {
    kdc = ip-10-0-0-60
    admin_server = ip-10-0-0-60
    }

    TEST.DATALAKE.COM = {
    kdc = ip-10-0-0-240
    admin_server = ip-10-0-0-240
    }

    [domain_realm]
    .example.com = EXAMPLE.COM
example.com = EXAMPLE.COM
    .test.datalake.com = TEST.DATALAKE.COM
test.datalake.com = TEST.DATALAKE.COM
    —–

And here is the krb5.conf file in my second realm (TEST.DATALAKE.COM)
    —–
    [logging]
    default = FILE:/var/log/krb5libs.log
    kdc = FILE:/var/log/krb5kdc.log
    admin_server = FILE:/var/log/kadmind.log

    [appdefaults]
    validate=false

    [libdefaults]
    default_realm = TEST.DATALAKE.COM
    dns_lookup_realm = false
    dns_lookup_kdc = false
    ticket_lifetime = 24h
    renew_lifetime = 7d
    forwardable = true

    [realms]
    TEST.DATALAKE.COM = {
    kdc = ip-10-0-0-240
    admin_server = ip-10-0-0-240
    }

    EXAMPLE.COM = {
    kdc = ip-10-0-0-60
    admin_server = ip-10-0-0-60
    }

    [domain_realm]
    .test.datalake.com = TEST.DATALAKE.COM
    test.datalake.com = TEST.DATALAKE.COM
    .example.com = EXAMPLE.COM
    example.com = EXAMPLE.COM
    —–
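
As a basic sanity check with this krb5.conf, it may be worth confirming that each side can actually reach the other realm's KDC. A rough sketch (host names come from the [realms] sections above; the user principal is only a placeholder):
—–
# From a node in the TEST.DATALAKE.COM cluster: is the EXAMPLE.COM KDC reachable
# on the standard Kerberos port, and does it answer an AS request?
nc -vz ip-10-0-0-60 88
kinit -V someuser@EXAMPLE.COM
klist
—–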

I have added the following two principals to both Kerberos databases:
    krbtgt/TEST.DATALAKE.COM@EXAMPLE.COM
    krbtgt/EXAMPLE.COM@TEST.DATALAKE.COM
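
For reference, this is roughly how they were added on each KDC (the password is only a placeholder); the important part is that both KDCs end up with the same password and encryption types for each cross-realm principal:
—–
# Run on the KDC of EXAMPLE.COM and again on the KDC of TEST.DATALAKE.COM,
# using the same password on both sides so the cross-realm keys match.
kadmin.local -q "addprinc -pw <shared-password> krbtgt/TEST.DATALAKE.COM@EXAMPLE.COM"
kadmin.local -q "addprinc -pw <shared-password> krbtgt/EXAMPLE.COM@TEST.DATALAKE.COM"
—–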

This is my hadoop.security.auth_to_local setting in the EXAMPLE.COM cluster –
    ——
RULE:[1:$1@$0](^.*@TEST\.DATALAKE\.COM$)s/^(.*)@TEST\.DATALAKE\.COM$/$1/g
RULE:[2:$1@$0](^.*@TEST\.DATALAKE\.COM$)s/^(.*)@TEST\.DATALAKE\.COM$/$1/g
    RULE:[2:$1@$0](jt@.*EXAMPLE.COM)s/.*/mapred/
    RULE:[2:$1@$0](tt@.*EXAMPLE.COM)s/.*/mapred/
    RULE:[2:$1@$0](nn@.*EXAMPLE.COM)s/.*/hdfs/
    RULE:[2:$1@$0](dn@.*EXAMPLE.COM)s/.*/hdfs/
    RULE:[2:$1@$0](hbase@.*EXAMPLE.COM)s/.*/hbase/
    RULE:[2:$1@$0](hbase@.*EXAMPLE.COM)s/.*/hbase/
    RULE:[2:$1@$0](oozie@.*EXAMPLE.COM)s/.*/oozie/
    DEFAULT
    ——-
And this is my hadoop.security.auth_to_local setting in the TEST.DATALAKE.COM cluster –
——
RULE:[1:$1@$0](^.*@EXAMPLE\.COM$)s/^(.*)@EXAMPLE\.COM$/$1/g
RULE:[2:$1@$0](^.*@EXAMPLE\.COM$)s/^(.*)@EXAMPLE\.COM$/$1/g
    RULE:[2:$1@$0](jt@.*TEST.DATALAKE.COM)s/.*/mapred/
    RULE:[2:$1@$0](tt@.*TEST.DATALAKE.COM)s/.*/mapred/
    RULE:[2:$1@$0](nn@.*TEST.DATALAKE.COM)s/.*/hdfs/
    RULE:[2:$1@$0](dn@.*TEST.DATALAKE.COM)s/.*/hdfs/
    RULE:[2:$1@$0](hbase@.*TEST.DATALAKE.COM)s/.*/hbase/
    RULE:[2:$1@$0](hbase@.*TEST.DATALAKE.COM)s/.*/hbase/
    RULE:[2:$1@$0](oozie@.*TEST.DATALAKE.COM)s/.*/oozie/
    DEFAULT
    ——
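
If it helps, the mapping rules can be checked without running a job. A rough sketch (on HDP 1.3 / Hadoop 1.x the helper class should be org.apache.hadoop.security.KerberosName; newer releases call it org.apache.hadoop.security.HadoopKerberosName; the principals below are only examples):
—–
# Run on a cluster node so the auth_to_local rules from core-site.xml are used;
# it prints the short name each principal maps to.
hadoop org.apache.hadoop.security.KerberosName datalake@EXAMPLE.COM bond007@TEST.DATALAKE.COM
—–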


  • #58752
    Chok S
    Participant

datalake is a user in the EXAMPLE.COM cluster and has a file in HDFS at:
    hdfs://ip-10-0-0-62:8020/user/datalake/file.txt

When I run the following command from the EXAMPLE.COM cluster:
    hadoop distcp hdfs://ip-10-0-0-62:8020/user/datalake/file.txt hdfs://ip-10-0-0-42:8020/user/bond007/distcp-receiver

    I’m getting the following error:
    14/08/14 20:16:53 ERROR security.UserGroupInformation: PriviledgedActionException as:datalake@EXAMPLE.COM cause:javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Server not found in Kerberos database (7) – UNKNOWN_SERVER)]
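
As far as I understand, UNKNOWN_SERVER means the service principal the client asked for does not exist in the KDC it ended up talking to. A rough way to narrow down where the request goes wrong (the remote NameNode principal and host below are assumptions; substitute whatever nn principal actually exists in TEST.DATALAKE.COM):
—–
# From the EXAMPLE.COM node running distcp, as the datalake user:
kinit datalake@EXAMPLE.COM
# Can a cross-realm TGT be obtained at all?
kvno krbtgt/TEST.DATALAKE.COM@EXAMPLE.COM
# Can a service ticket for the remote NameNode be obtained?
kvno nn/ip-10-0-0-42@TEST.DATALAKE.COM
—–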

What am I missing here? Any help would be highly appreciated. Are there any documents or tutorials available that cover this exact topic? I found documentation on integrating HDP with AD, but not on cross-realm trust between two HDP clusters.

    Thanks in advance.

