Home Forums HDP on Linux – Installation Deploy secure HDP

This topic contains 26 replies, has 5 voices, and was last updated by Rishab Malhotra 1 year, 9 months ago.

  • Creator
    Topic
  • #12291

    Munjeong Kim
    Member

Hi, I’m Mun.

I want to deploy a secure Hadoop cluster, so I referred to the documentation (http://docs.hortonworks.com/CURRENT/index.htm#Deploying_Hortonworks_Data_Platform/Using_gsInstaller/Deploying_Secure_Hadoop_Cluster/Deploying_Secure_Hadoop_Clusters.htm) and tried to install HDP with a KDC.

But there was a problem like this at Step 8: Start installation -> sh gsInstaller.sh

    cat: gateway: No such file or directory
    cat: gateway: No such file or directory
    ===============================================================================
    Grid Stack Installer
    ===============================================================================

    ===============================================================================
    Installation Details
    ===============================================================================
    ===============================================================================
    yum -y install pdsh
    ===============================================================================
    ===============================================================================
    hadoop packages were not found. Command -> yum search hadoop on host did not find hadoop.

What can I check to successfully install HDP with a KDC?

Also, can I deploy HDP with HMC first and apply the KDC later?

That is, how can I use Kerberos on an already existing Hadoop cluster (HDP)?

    Best regards,
    Mun

Viewing 26 replies - 1 through 26 (of 26 total)


  • Author
    Replies
  • #13085

    Larry – apologies, i mistyped – email is rmalhotra@aptmail.com

    #13084

    Larry – email is malhotrar@aptmail.com

    #13078

    Larry Liu
    Moderator

    Hi, Rishab,

Can you please elaborate on your issue? We can discuss it over email as well if you can provide me your email address.

    Thanks

    Larry

    #13076

    I am encountering the same issue – were you guys able to figure out a solution?

    #12840

    Hi Larry,

    you can contact me at g at my_surname dot nl (where my_surname is lanzani)

    #12837

    Larry Liu
    Moderator

    Hi, Giovanni

    Can we move the investigation offline?

    I will send you an email regarding this issue with my findings. Can you please provide your email address?

    Thanks

    Regards

    Larry

    #12815

    Hi Larry,

BTW, I’ve found another problem with the scripts. To install Ganglia, we need to create the file gangliaweb (identical, in my case, to gangliaserver); otherwise Ganglia cannot start.
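In case it helps others, a minimal sketch of that workaround (the hostname below is a placeholder; in a real install the gangliaserver file already exists and you only need the cp):

```shell
# Mirror gangliaserver into the missing gangliaweb file. A sample
# gangliaserver is fabricated here so the snippet is self-contained;
# on a real install, skip the echo and just copy the existing file.
echo "ganglia-host.example.com" > gangliaserver
cp gangliaserver gangliaweb
cat gangliaweb
```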

    Cheers,

    Giovanni

    #12788

    Hi Larry,

    any news on this front? We are really looking forward to a secure cluster with Hortonworks HDP.

    Best

    Giovanni

    #12664

    Larry Liu
    Moderator

Hi, Giovanni

Thanks for trying the HDP secure cluster.

After trying it myself, the issue might be specific to non-single-node clusters. I will get back to you once I have a solution.

    Regards

    Larry

    #12658

Thanks Larry. A correction regarding the setup (four -> three):

- the setup is 4 Amazon machines: 1 Kerberos, 1 HDP manager (with Nagios, Ganglia, etc.), 1 namenode (with hbasemaster, etc.), 1 snamenode. General nodes are provided by these last three machines.

    #12600

    Larry Liu
    Moderator

Hi, Giovanni

    Let me look into your log and configurations.

    I will get back to you once I have an idea.

    Thanks and have a good weekend.

    Larry

    #12595

Hi Larry, I’ve posted the files on GitHub; is that OK?

    gsInstaller here https://gist.github.com/4215013
[Note: I changed the realm before posting; I have something like HADOOP.**** in my real file. Also note: no Hive installation.]

    log here https://gist.github.com/4231334
    [Note: all the IP addresses were changed to MYSUPERSECRETIP]

    General notes:

    - setup is 4 amazon machines: 1 kerberos, 1 hdp manager (with nagios, ganglia, etc), 1 namenode (with hbasemaster, etc), 1 snamenode. General nodes are provided by these last four machines.

- I’ve changed line 142 of gsInstaller.properties from `homes` to `home`, as specified in a previous post (the same error is present in the single-node install folder)

- I’ve changed line 811 of gsLib.sh: I think ssh_cmd=”” makes no sense there, so I’ve put ssh_cmd=”ssh”. Let me know if that is completely wrong.

    Thanks for helping me out!

    #12551

    Larry Liu
    Moderator

    Hi, Giovanni

    Thanks for trying HDP.

The possible reason is that the namenode is not started. Can you please check whether all the services are running as expected?

If everything looks fine, can you please provide the full log and gsInstaller.properties, uploaded to the following FTP server?

    Connect to http://ftp.hortonworks.com
    login as: dropoff
    password: horton

    Thanks

    Regards,
    Larry

    #12547

    Hi Larry,

since you seem to know way more than I do: I’m stuck at the gsInstaller phase, at “Waiting for 600 seconds for namenode to come out of safe mode”. The logs say (over and over):


    on ip-10-32-62-82.eu-west-1.compute.internal
    on ip-10-33-156-29.eu-west-1.compute.internal running ssh -o ConnectTimeOut=3 -q -t -i /root/.ssh/id_rsa root@**** "su - hdfs -c '/usr/bin/kinit -kt /tmp/hdfs.headless.keytab hdfs; hadoop --config /etc/hadoop/conf dfsadmin -safemode get'"
    Output from gwhost safe mode command: 12/12/05 15:55:27 ERROR security.UserGroupInformation: PriviledgedActionException as:hdfs@HADOOP.KPMG cause:org.apache.hadoop.ipc.RemoteException: GSS initiate failed
    12/12/05 15:55:27 INFO security.UserGroupInformation: Initiating logout for hdfs@HADOOP.***
    12/12/05 15:55:27 INFO security.UserGroupInformation: Initiating re-login for hdfs@HADOOP.***
    12/12/05 15:55:32 ERROR security.UserGroupInformation: PriviledgedActionException as:hdfs@HADOOP.*** cause:org.apache.hadoop.ipc.RemoteException: GSS initiate failed
    12/12/05 15:55:32 WARN security.UserGroupInformation: Not attempting to re-login since the last re-login was attempted less than 600 seconds before.

    Kerberos seems to have been installed without problems.
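A quick way to sanity-check the keytab itself, assuming the krb5 client tools are installed (the path is the one from the log above):

```shell
# Environment-specific check: list the principals stored in the hdfs keytab
# that the installer uses. Falls back to a message if the keytab or the
# krb5 tools are missing.
klist -kt /tmp/hdfs.headless.keytab 2> /dev/null \
    || echo "could not read /tmp/hdfs.headless.keytab (missing keytab or krb5 tools?)"
```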

    #12540

    Larry Liu
    Moderator

    Hi, Mun

How’s it going with the installation?

The smoke_test_user_keytab is created in the /root folder and copied to /tmp. Can you please check whether you can find hdptestuser.headless.keytab in /root and /tmp? If you find it in /tmp, please change smoke_test_user_keytab=”/tmp/${smoke_test_user}.headless.keytab”.
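A quick way to check both locations at once (file names assume the default hdptestuser smoke-test user):

```shell
# Look for the smoke-test keytab in both candidate locations; ls prints
# whichever copies exist, with a note if neither is found.
ls -l /root/hdptestuser.headless.keytab /tmp/hdptestuser.headless.keytab 2> /dev/null \
    || echo "keytab not found in /root or /tmp (or only one copy exists)"
```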

    Let me know if this helps.

    Thanks

    Regards

    Larry

    #12538

    When downloading http://public-repo-1.hortonworks.com/HDP-1.1.1.16/tools/HDP-gsInstaller-1.1.1.16-1.tar.gz
the file gsInstaller.properties contains the following wrong line:

    smoke_test_user_keytab=”/homes/${smoke_test_user}/${smoke_test_user}.headless.keytab”

    It should be

    smoke_test_user_keytab=”/home/${smoke_test_user}/${smoke_test_user}.headless.keytab”

Once this is corrected, you have to rerun setupKerberos.sh and gsInstaller.sh.
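The edit can be scripted. Here is a sketch that demonstrates the `homes` -> `home` fix on a fabricated one-line sample so it is safe to try anywhere; on a real install, apply the sed to the actual gsInstaller.properties in the gsInstaller directory instead:

```shell
# Demonstrate the fix on a scratch copy of the offending line.
sample=/tmp/gsInstaller.properties.sample
printf 'smoke_test_user_keytab="/homes/${smoke_test_user}/${smoke_test_user}.headless.keytab"\n' > "$sample"
sed -i 's|/homes/|/home/|' "$sample"
grep smoke_test_user_keytab "$sample"
```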

    Can someone correct it upstream?

    #12445

    Larry Liu
    Moderator

    Hi, Mun

    Can you please provide the following value from gsInstaller.properties?

    smoke_test_user_keytab

Please make sure that the file at the test-user keytab location actually exists.

    Hope this helps

    Thanks

    Larry

    #12338

    Munjeong Kim
    Member

    Hi Sef,

When I execute cat /etc/passwd,

    there is

    hdptestuser:x:2000:2000::/home/hdptestuser:/bin/bash

    I think hdptestuser was created on my server.

    Thanks
    Mun

    #12337

    Seth Lyubich
    Keymaster

    Hi Mun,

    Can you please check if hdptestuser was created on your servers?
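One way to check on each host (hdptestuser is the default smoke-test user from gsInstaller.properties):

```shell
# Look the user up in the passwd database; getent also covers LDAP/NIS setups.
if getent passwd hdptestuser > /dev/null 2>&1; then
    echo "hdptestuser exists"
else
    echo "hdptestuser missing"
fi
```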

    Thanks,
    Seth

    #12336

    Munjeong Kim
    Member

    12/11/29 00:56:10 ERROR security.UserGroupInformation: PriviledgedActionException as:hdptestuser cause:java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    java.io.IOException: Call to ip-10-162-15-182.ap-northeast-1.compute.internal/10.162.15.182:8020 failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    at org.apache.hadoop.ipc.Client.wrapException(Client.java:1129)
    at org.apache.hadoop.ipc.Client.call(Client.java:1097)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229)
    at $Proxy6.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:411)
    at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:120)

    Caused by: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:543)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1122)
    at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:488)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:590)
    at org.apache.hadoop.ipc.Client$Connection.access$2100(Client.java:187)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1228)

    Caused by: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:194)
    at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:134)
    at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:385)
    at org.apache.hadoop.ipc.Client$Connection.access$1200(Client.java:187)
    at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:583)
    at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:580)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)

    Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
    at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:130)
    at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:106)

    Thanks,
    Mun

    #12335

    Munjeong Kim
    Member

    Hi Seth,

I got this error.
(It is over 3,000 characters, so I split it into two posts.)

    kinit: No such file or directory while getting initial credentials
    12/11/29 00:56:07 ERROR security.UserGroupInformation: PriviledgedActionException as:hdptestuser cause:javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    12/11/29 00:56:07 WARN ipc.Client: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    12/11/29 00:56:07 ERROR security.UserGroupInformation: PriviledgedActionException as:hdptestuser cause:java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    Bad connection to FS. command aborted. exception: Call to ip-10-162-15-182.ap-northeast-1.compute.internal/10.162.15.182:8020 failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    12/11/29 00:56:08 ERROR security.UserGroupInformation: PriviledgedActionException as:hdptestuser cause:javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    12/11/29 00:56:08 WARN ipc.Client: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    12/11/29 00:56:08 ERROR security.UserGroupInformation: PriviledgedActionException as:hdptestuser cause:java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    Bad connection to FS. command aborted. exception: Call to ip-10-162-15-182.ap-northeast-1.compute.internal/10.162.15.182:8020 failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    12/11/29 00:56:10 ERROR security.UserGroupInformation: PriviledgedActionException as:hdptestuser cause:javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    12/11/29 00:56:10 WARN ipc.Client: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]

    #12334

    Seth Lyubich
    Keymaster

    Hi Mun,

It is a good idea to make sure that the smoke tests pass; this ensures that the components were installed and are functioning correctly in the cluster. Do you remember what failed in the smoke test? You should be able to find some logs in /tmp.
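A starting point for digging those logs out (the exact file names vary by component, so this just surfaces the most recent candidates):

```shell
# List the most recently modified entries under /tmp; smoke-test output
# from the installer usually lands there.
ls -lt /tmp | head -n 10
```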

    Thanks,
    Seth

    #12333

    Munjeong Kim
    Member

    Hi Seth,

As you said, I set up the gateway file and then it worked.

Now I have installed HDP, but there was an ERROR during the Hadoop smoke test.

Anyway, HDP with security (KDC) installed successfully, right?

Do I have to pass the smoke test, too?

    #12332

    Seth Lyubich
    Keymaster

    Hi Mun,

Can you please check if you created the gateway file per the installation instructions?

Based on what you pasted, there might be an issue where the installer cannot find the gateway file with the correct hostname defined.

    cat: gateway: No such file or directory
    cat: gateway: No such file or directory
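For reference, the gateway file follows the same convention as the kdcserver file created earlier in this thread: one line with the gateway host's FQDN. A sketch, assuming you run it in the gsInstaller directory and the current host is the gateway:

```shell
# Write this host's fully qualified name into the gateway file, then verify.
hostname -f > gateway
cat gateway
```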

    Thanks,
    Seth

    #12302

    Munjeong Kim
    Member

    Hi, Larry

    This is my answer,

    1. OS
    I use CentOS6.x and EC2

    2. hardware spec
    8 ECUs(CPU Units), 4 Cores, 15G(memory)

    3. is it single node deployment?
No, I will deploy 1 master and 3 slaves.

    4. step you have performed
    1) install wget, vim
    2) install jdk
    3) follow documents
    wget http://public-repo-1.hortonworks.com/HDP-1.1.1.16/tools/HDP-gsInstaller-1.1.1.16-1.tar.gz
    tar xvzf HDP-gsInstaller-1.1.1.16-1.tar.gz
    cd HDP-gsInstaller-1.1.1.16/gsInstaller
    echo `hostname -f` > kdcserver
    sh setupKerberos.sh
    cp kdcserver namenode
    cp kdcserver jobtracker
    cp kdcserver gangliaserver
    cp kdcserver daashboardhost
    (and set configuration)
    sh gsPreRequisites.sh
    sh createUsers.sh
sh gsInstaller.sh -> this is where I got stuck

    5. configuration you have made.

    deployuser=`whoami`
    package=rpm
    security=yes
    hdfs_user_keytab=”/root/hdfs.headless.keytab”
    enableappend=true
    enablewebhdfs=false
    datanode_dir=/hdp/1/hadoop/hdfs/data
    namenode_dir=/hdp/1/hadoop/hdfs/namenode
    snamenode_dir=/hdp/1/hadoop/hdfs/snamenode
    mapred_dir=/hdp/1/hadoop/mapred
    log_dir=/var/log/hadoop
    pid_dir=/var/run/hadoop
    taskscheduler=”org.apache.hadoop.mapred.CapacityTaskScheduler”
    keytabdir=/etc/security/keytabs
    realm=EXAMPLE.COM
    enablelzo=no
    installpig=yes
    installhbase=no
    enableshortcircuit=true
    installhive=no
    installhcat=no
    smoke_test_user=”hdptestuser”
    smoke_test_user_keytab=”/home/${smoke_test_user}/${smoke_test_user}.headless.keytab”
    installtempleton=no
    installsqoop=yes
    installoozie=no
    enablemon=no
    localyumrepo=yes

Also, I added the master’s key to each slave’s authorized_keys file.
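A way to sanity-check the key setup per slave, mirroring how gsInstaller invokes ssh (slave1 is a placeholder hostname):

```shell
# BatchMode makes ssh fail fast instead of prompting for a password when
# the key is not accepted; replace slave1 with each slave's hostname.
ssh -o ConnectTimeout=3 -o BatchMode=yes root@slave1 'hostname -f' \
    || echo "passwordless ssh to slave1 failed"
```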

    Thanks,
    Mun

    #12292

    Larry Liu
    Moderator

    Hi, Mun,

To answer your question “can I deploy HDP by HMC first, and apply KDC later”: I think you can do that, but there is no documentation for it, and I believe it will require a lot of manual work.

    Can you please provide the following information for the issues you are seeing while deploying secure cluster from gs installer?

    1. OS
    2. hardware spec
    3. is it single node deployment?
    4. steps you have performed
    5. configuration you have made.

    Let’s start from there.

    Thanks

    Larry
