HDP installation on Amazon EC2

This topic contains 41 replies, has 4 voices, and was last updated by Sasha J 2 years ago.

Viewing 11 replies - 31 through 41 (of 41 total)


  • #7800

    Kalyan,
What worked for me was the root key generated on my HMC deployment node with ssh-keygen. You can perform a get operation after connecting to the node with sftp.
    ex.

    blue@Stallion:~$ sftp -i blue.pem root@ec2-50-17-141-9.compute-1.amazonaws.com
    Connected to ec2-50-17-141-9.compute-1.amazonaws.com.
    sftp> get .ssh/id_rsa
    Fetching /root/.ssh/id_rsa to id_rsa
    /root/.ssh/id_rsa 100% 1675 1.6KB/s 00:00
    sftp>
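( for reference, the key itself can be generated on the deployment node roughly like this; -N "" just skips the passphrase, and appending the public key to authorized_keys is what lets HMC log in as root. On a multi-node cluster the .pub would need to go into authorized_keys on every node: )

    ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa
    cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys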

I have recently tried providing the Amazon key, in my case blue.pem, and since Amazon sets things up so that you need this private key to access your nodes, it should work. I'll get back to you on whether it's successful or not…

    #7799

    kalyan reddy
    Member

Very vital information for beginners.
Sorry to raise the doubt again…
"You should have a separate file with your root private key": is this file going to be the .ppk file which was created while creating the Amazon instance, or the public/private keypair, i.e. id_rsa and id_rsa.pub?
If it is neither of those two, then where can I download it from using sftp?
I ask because I need to select the SSH Private Key File for root.

    Thanks

    #7797

Also, see http://hortonworks.com/community/forums/topic/common-issues/
and make sure your EC2 security groups allow access on the ports you need.
Since I don't care about security on my development cluster I opened everything ( 0.0.0.0/0 ), but this is obviously not good for production clusters. If anyone is good with security / networking, it would be nice of you to post a minimal set. A sketch of opening a single port is below.
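( for what it's worth, a rule like the one below opens one port using the AWS CLI; the group name is just a placeholder, and the older ec2-authorize tool has an equivalent form. Port 80 is what the HMC web UI needs: )

    aws ec2 authorize-security-group-ingress --group-name my-hdp-group --protocol tcp --port 80 --cidr 0.0.0.0/0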

    #7795

Kalyan,
You should have a separate file with your root private key. If you are accessing the HMC GUI from your local machine, you can get both files with sftp. If you are doing a multi-node install, just add the FQDN of each node to your hosts file and make sure each node has the same /etc/hosts file (see the sketch below). If you run into issues, there is a good chance someone has already run into the same one, so check the forums.
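( a sketch of what the shared /etc/hosts could look like for a three-node cluster; the IPs and hostnames below are placeholders, use the hostname -i / hostname -f values from your own instances: )

    10.190.111.104 ip-10-190-111-104.ec2.internal deploy
    10.190.111.105 ip-10-190-111-105.ec2.internal node1
    10.190.111.106 ip-10-190-111-106.ec2.internal node2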

    #7794

    kalyan reddy
    Member

Hi Miguel,
Many thanks for the information.
Finally I am seeing the HMC GUI.
As part of the basic cluster setup I created the hosts file with a single FQDN (internal DNS).
Which file should I use as the SSH private key file?
Does it differ between single-node and multi-node clusters?
Also, are there any other issues I need to take care of?
Please let me know.
Thanks

    #7738

    Kalyan,

    1) Yes, ex. 10.190.111.104 ip-10-190-111-104.ec2.internal Deploy
    ( hostname -i, hostname -f, name )

2) I used X11 forwarding for 2 months… it's murder, until I found out each EC2 instance has a public DNS. You should be able to point your local browser at it directly.
ex. http://ec2-67-202-36-162.compute-1.amazonaws.com/hmc/html

3) Puppet is heavily used to install all the HMC components; honestly, you shouldn't have to know much about it. Do not be misled by "puppet kick failed" and the like. If you have an error in your deploy log, you can get to the root cause by doing Ctrl+F for "err"; you can also take a look at the logs in /var/log/hmc (see the grep sketch after this list). One common issue is a timeout, in which case you can follow this: http://hortonworks.com/community/forums/topic/puppet-failed-no-cert/
( this link also shows you how to uninstall and reinstall, as the default method usually results in failures )

    4) I used HMC, although the other installer is supposed to work with RHEL / CentOS 6.x
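( the grep mentioned in point 3, run on the HMC node; the exact file names under /var/log/hmc may vary: )

    grep -rn "err" /var/log/hmc/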

    Good luck

    #7736

    kalyan reddy
    Member

Hi Miguel,
I am almost able to install successfully after following the doc: http://www.linuxdict.com/2012-06-auto-deploy-hadoop-cluster-with-hdp/
Since I have a single-node cluster:
1) Do I need to change the hosts file to an FQDN? (right now it is: 127.0.0.1 localhost.localdomain localhost ::1 localhost6.localdomain6 localhost6)
2) Do I have to have a browser on my EC2 instance, or is there a way I can reach HMC from Windows? If EC2 requires the browser, please let me know the process.
3) I am following your posts, but I am not clear on the Puppet concept.
4) Which installation is recommended: 1) HMC or 2) gsInstaller?

    Thanks in advance.

    #7728

    After you enable this repo:
    rpm -Uvh http://public-repo-1.hortonworks.com/HDP-1.0.0.12/repos/centos5/hdp-release-1.0.0.12-1.el5.noarch.rpm

and install hmc with yum, you should be able to start HMC by executing: service hmc start.
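( i.e., roughly the following; the package and the service are both named hmc: )

    yum install hmc
    service hmc start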

    #7727

    Kalyan, RHEL 6.x / CentOS 6.x are not officially supported yet.

I followed this guide and successfully deployed HDP on CentOS 6.2. I used this RightScale community AMI: ami-cf18b6a6

    http://www.linuxdict.com/2012-06-auto-deploy-hadoop-cluster-with-hdp/

Tip: try the basic services first ( hdfs / mapreduce / ganglia / nagios )

    If you hit the Nagios libperl.so issue I had:

    http://hortonworks.com/community/forums/topic/nagios/

    #7726

    kalyan reddy
    Member

Hi Miguel,
Thanks for the reply.
I changed the instance type to xLarge, and the OS is Red Hat Enterprise Linux 6.3.

And I still have the same issue.
Am I missing any steps? I followed the document provided by Hortonworks.

My plan is that if it works well for a single-node cluster, then I can go for a multi-node cluster.
If possible, please throw some light.

    #7724

Use an xLarge instance for a single-node install. Micro just doesn't have enough RAM.
    The spot instances are really cheap.

    What operating system are you using?

    example:
    [root@domU-12-31-39-05-51-51 ~]# uname -a
    Linux domU-12-31-39-05-51-51 2.6.32-220.23.1.el6.centos.plus.x86_64 #1 SMP Tue Jun 19 04:14:37 BST 2012 x86_64 x86_64 x86_64 GNU/Linux
