Ambari failing when using local repositories

This topic contains 10 replies, has 4 voices, and was last updated by  Bob Russell 6 months, 2 weeks ago.

  • Creator
    Topic
  • #30909

    Bart Buter
    Participant

    We are trying to install Hortonworks HDP 1.3 using Ambari on CentOS 6 hosts.
    The nodes have no access to the internet, so we have set up local repositories. Here are the names of the repo files and the repositories they contain:
    CentOS-Base.repo: base, updates, extras
    ambari.repo: ambari-1.x, HDP-UTILS-1.1.0.15, Updates-ambari-1.2.4.9
    epel.repo: epel
    HDP-epel.repo: HDP-epel
    HDP.repo

    Installation of ambari-server and the ambari-agents goes well; the “Confirm Hosts” page shows “Success” for all nodes.
    Installing the services is the problem: even when only HDFS and MapReduce are selected, the installation fails.
    The problem seems to be that during installation the files /etc/yum.repos.d/HDP-epel.repo and /etc/yum.repos.d/HDP.repo get overwritten with versions that point to the internet repositories.

    My next course of action will be to pre-install some packages on the nodes before trying the Ambari install route again.
    Any help is appreciated.
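
    For illustration, each local repo file on the nodes looks roughly like the following (the hostname and path below are placeholders, not our actual mirror):

        [HDP-1.3.0]
        name=HDP 1.3 (local mirror)
        baseurl=http://local-mirror/repos/HDP-1.3.0/
        enabled=1
        gpgcheck=0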



  • Author
    Replies
  • #51207

    Bob Russell
    Participant

    Sasha, you mentioned:

    you missed one small step when setting up the local repos….

    http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-1.3.1/bk_using_Ambari_book/content/ambari-chap1-6.html

    I updated as appropriate, but there is still a problem. I needed to have “proxy=_none_” in the HDP.repo file on the agents, as I have a mixed situation: some traffic goes outside and needs the proxy, while other content (i.e. HDP) is obtained locally. In this scenario, the only way I was able to make progress was to declare the proxy as _none_ in the repo file. Unfortunately, it looks like the server pushes a new HDP.repo to the agents for every service. Is there any way to avoid this push, or to specify the proxy in the repoinfo.xml file (which I assume is the source that HDP.repo is generated from)?
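
    To be concrete, the workaround is just a per-repo override of the global proxy, along these lines (repo id and baseurl are placeholders for our local mirror):

        [HDP-1.3.0]
        baseurl=http://local-mirror/repos/HDP-1.3.0/
        proxy=_none_

    while /etc/yum.conf keeps the global setting, e.g. proxy=http://proxy.example.com:3128. The trouble is that this hand-edited file is exactly what gets replaced on every push.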

    #32116

    Ray Roberts
    Participant

    Here is what I’m seeing during the Ambari services installation:

    stderr:
    none
    none

    stdout:
    warning: Could not retrieve fact fqdn
    warning: Host is missing hostname and/or domain: boxie
    notice: Finished catalog run in 0.06 seconds
    warning: Could not retrieve fact fqdn
    warning: Host is missing hostname and/or domain: boxie
    notice: Finished catalog run in 0.07 seconds
    warning: Could not retrieve fact fqdn
    warning: Host is missing hostname and/or domain: boxie
    warning: Dynamic lookup of $conf_dir at /var/lib/ambari-agent/puppet/modules/hdp-hadoop/manifests/init.pp:108 is deprecated. Support will be removed in Puppet 2.8. Use a fully-qualified variable name (e.g., $classname::variable) or parameterized classes.
    warning: Dynamic lookup of $service_state at /var/lib/ambari-agent/puppet/modules/hdp-hadoop/manifests/init.pp:221 is deprecated. Support will be removed in Puppet 2.8. Use a fully-qualified variable name (e.g., $classname::variable) or parameterized classes.
    warning: Dynamic lookup of $service_state at /var/lib/ambari-agent/puppet/modules/hdp-hadoop/manifests/service.pp:76 is deprecated. Support will be removed in Puppet 2.8. Use a fully-qualified variable name (e.g., $classname::variable) or parameterized classes.
    warning: Dynamic lookup of $service_state at /var/lib/ambari-agent/puppet/modules/hdp-hadoop/manifests/service.pp:85 is deprecated. Support will be removed in Puppet 2.8. Use a fully-qualified variable name (e.g., $classname::variable) or parameterized classes.
    warning: Dynamic lookup of $configuration is deprecated. Support will be removed in Puppet 2.8. Use a fully-qualified variable name (e.g., $classname::variable) or parameterized classes.
    warning: Dynamic lookup of $mapred-site is deprecated. Support will be removed in Puppet 2.8. Use a fully-qualified variable name (e.g., $classname::variable) or parameterized classes.
    warning: Dynamic lookup of $tasktracker_port is deprecated. Support will be removed in Puppet 2.8. Use a fully-qualified variable name (e.g., $classname::variable) or parameterized classes.
    warning: Dynamic lookup of $ambari_db_rca_url is deprecated. Support will be removed in Puppet 2.8. Use a fully-qualified variable name (e.g., $classname::variable) or parameterized classes.
    warning: Dynamic lookup of $ambari_db_rca_driver is deprecated. Support will be removed in Puppet 2.8. Use a fully-qualified variable name (e.g., $classname::variable) or parameterized classes.
    warning: Dynamic lookup of $ambari_db_rca_username is deprecated. Support will be removed in Puppet 2.8. Use a fully-qualified variable name (e.g., $classname::variable) or parameterized classes.
    warning: Dynamic lookup of $ambari_db_rca_password is deprecated. Support will be removed in Puppet 2.8. Use a fully-qualified variable name (e.g., $classname::variable) or parameterized classes.
    err: /Stage[1]/Hdp::Snmp/Hdp::Package[snmp]/Hdp::Package::Process_pkg[snmp]/Package[net-snmp-utils]/ensure: change from absent to pres
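
    The last line suggests yum failed to install net-snmp-utils on that host, so I’ll double-check on the agent itself whether the package resolves from the local repos, something like:

        yum clean all
        yum repolist enabled          # should list only the local repos
        yum install net-snmp-utils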

    #32079

    Ray Roberts
    Participant

    Do you know if the HDP-epel repo is required?

    I can’t find any tarballs of it.

    #32060

    Bart Buter
    Participant

    I’m afraid I cannot help you with your second problem.
    No. 1 is part of the Ambari/Puppet installation process. The server that HDP.repo and HDP-epel.repo point to is set in /var/lib/ambari-server/resources… You will also need to set up local repositories for these. By the way, epel.repo and HDP-epel.repo are the same as far as I can tell.

    I hope this helps a bit
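
    To be a bit more concrete: the URLs end up in a repoinfo.xml under that resources directory (the exact location depends on the stack version, so treat the snippet below as illustrative only), and the baseurl is what needs to point at your local mirror, roughly:

        <os type="centos6">
          <repo>
            <baseurl>http://local-mirror/repos/HDP-1.3.0/</baseurl>
            <repoid>HDP-1.3.0</repoid>
            <reponame>HDP</reponame>
          </repo>
        </os>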

    #31974

    Ray Roberts
    Participant

    I’m trying the same thing on a test server. I have no internet connection and I’ve set up the hdp.repo and hdp-utils.repo per the instructions. I’ve also modified the repos in /var/lib/ambari-server/resources/……

    I’m seeing a couple of things that seem weird.

    1. Two repo files (HDP.repo and HDP-epel.repo) magically appear in /etc/yum.repos.d/. I know one seems to get its info from the repo file in /var/lib/ambari-server/resources, but I’m not sure where HDP-epel.repo is coming from.
    2. I’m getting errors during the services install in Ambari saying “Could not retrieve fact fqdn” and “Host is missing hostname and/or domain”.

    I’m not sure whether either of these is why the installation of the services fails, but I assume so.

    As for DNS, it’s a standalone machine and I put an entry in /etc/hosts just to be safe.

    Any ideas?

    -Ray
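
    For the fqdn warnings, the next thing I’ll verify on the box is that it actually resolves to a fully qualified name (the domain below is just an example, not what I have configured):

        hostname -f              # should print a fully qualified name, e.g. boxie.example.com
        grep boxie /etc/hosts    # e.g. 192.168.1.10   boxie.example.com   boxie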

    #31259

    Sasha J
    Moderator

    Bart,
    yes, time synchronization is critical for HBase to function correctly, as stated in the prerequisites in the documentation.
    Thank you!
    Sasha

    #31254

    Bart Buter
    Participant

    It all works now. The problem was that the clocks on the nodes weren’t synced properly. Now that they are synced to an NTP server, everything works fine.
    Thanks for the help.
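
    For anyone who hits the same thing: what we did amounts to roughly the standard CentOS 6 NTP setup, something like the following (the NTP server name is a placeholder for our internal time server):

        yum install ntp
        ntpdate ntp.internal.example    # one-off sync before starting the daemon
        chkconfig ntpd on
        service ntpd start
        ntpq -p                         # verify the peer is reachable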

    #31182

    Sasha J
    Moderator

    The smoke test warning can be safely ignored.
    Go ahead and use the cluster, and check whether there are any other alerts/errors from HBase.
    It may just be that there is not enough memory on the boxes (how much do they have?).
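
    To check how much memory each box has, something like:

        free -m                       # shows total and used memory in MB
        grep MemTotal /proc/meminfo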

    Thank you!
    Sasha

    #30994

    Bart Buter
    Participant

    Sasha,

    Thanks for the tip; this was indeed what was missing. With your help we are able to install HDFS + MapReduce.
    The full Hortonworks 1.3 suite still gives a warning at the end when performing the HBase smoke test. This was using all the default settings, only adding some passwords. The steps described in the forum topic “hbase test failed when deployment” don’t work, since yum can’t find hmc or puppet to remove. What is the quickest way to reset the Ambari configuration so I can re-install using a different configuration? (One idea I’m considering is sketched after the error output below.)

    HBase Check errors:
    warning: Dynamic lookup of $configuration is deprecated. Support will be removed in Puppet 2.8. Use a fully-qualified variable name (e.g., $classname::variable) or parameterized classes.
    notice: /Stage[1]/Hdp::Snappy::Package/Hdp::Snappy::Package::Ln[32]/Hdp::Exec[hdp::snappy::package::ln 32]/Exec[hdp::snappy::package::ln 32]/returns: executed successfully
    notice: /Stage[2]/Hdp-hbase::Hbase::Service_check/File[/tmp/hbaseSmoke.sh]/ensure: defined content as ‘{md5}a8119189dfd341c06bc94171fc4a992a’
    err: /Stage[2]/Hdp-hbase::Hbase::Service_check/Exec[/tmp/hbaseSmoke.sh]/returns: change from notrun to 0 failed: Command exceeded timeout at /var/lib/ambari-agent/puppet/modules/hdp-hbase/manifests/hbase/service_check.pp:46
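
    One thing I’m considering, though I’m not sure it is supported in this Ambari version, is the ambari-server reset command on the server node, roughly:

        ambari-server stop
        ambari-server reset    # resets the Ambari database; existing cluster configuration is lost
        ambari-server start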

    #30924

    Sasha J
    Moderator

    Bart,
    you missed one small step when setting up the local repos….

    http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-1.3.1/bk_using_Ambari_book/content/ambari-chap1-6.html

    Make the suggested change and everything should work fine for you.
    This needs to be done on the Ambari node.

    Thank you!
    Sasha
