This topic contains 4 replies, has 3 voices, and was last updated by Wim Derweduwe 1 year, 1 month ago.

  • Creator
    Topic
  • #36608

    I tried to install a Hadoop cluster of 4 nodes.
    The first problem I ran into was that I could not use the server running Ambari as part of the cluster: it failed in the Confirm Hosts step.

    A second problem: when the installation started, it got stuck at 4% and then failed with the message “Puppet has been killed due to timeout”.

    Part of output:

    Anchor[hdp::package::glibc::end]: Dependency Package[glibc.i686] has failures: true
    warning: /Stage[1]/Hdp/Hdp::Package[glibc]/Hdp::Package::Process_pkg[glibc]/Anchor[hdp::package::glibc::end]: Skipping because of failed dependencies
    notice: /Stage[2]/Hdp-hadoop::Initialize/Anchor[hdp-hadoop::initialize::begin]: Dependency Package[glibc.i686] has failures: true
    warning: /Stage[2]/Hdp-hadoop::Initialize/Anchor[hdp-hadoop::initialize::begin]: Skipping because of failed dependencies
    notice: /Stage[2]/Hdp-hadoop::Initialize/Hdp-hadoop::Common[common]/Anchor[hdp-hadoop::common::begin]: Dependency Package[glibc.i686] has failures: true
    warning: /Stage[2]/Hdp-hadoop::Initialize/Hdp-hadoop::Common[common]/Anchor[hdp-hadoop::common::begin]: Skipping because of failed dependencies
    notice: /Stage[main]/Hdp-hadoop/Anchor[hdp-hadoop::begin]: Dependency Package[glibc.i686] has failures: true
    warning: /Stage[main]/Hdp-hadoop/Anchor[hdp-hadoop::begin]: Skipping because of failed dependencies
    notice: /Stage[main]/Hdp-hadoop/Hdp-hadoop::Package[hadoop]/Anchor[hdp-hadoop::package::helper::begin]: Dependency Package[glibc.i686] has failures: true
    warning: /Stage[main]/Hdp-hadoop/Hdp-hadoop::Package[hadoop]/Anchor[hdp-hadoop::package::helper::begin]: Skipping because of failed dependencies
    notice: /Stage[main]/Hdp-hadoop/Hdp-hadoop::Package[hadoop]/Hdp::Package[hadoop 64]/Hdp::Package::Process_pkg[hadoop 64]/Anchor[hdp::package::hadoop 64::begin]: Dependency Package[glibc.i686] has failures: true
    warning: /Stage[main]/Hdp-hadoop/Hdp-hadoop::Package[hadoop]/Hdp::Package[hadoop 64]/Hdp::Package::Process_pkg[hadoop 64]/Anchor[hdp::package::hadoop 64::begin]: Skipping because of failed dependencies
    notice: /Stage[main]/Hdp-hadoop/Hdp-hadoop::Package[hadoop]/Hdp::Package[hadoop 64]/Hdp::Package::Process_pkg[hadoop 64]/Package[hadoop-sbin]: Dependency Package[glibc.i686] has failures: true
    warning: /Stage[main]/Hdp-hadoop/Hdp-hadoop::Package[hadoop]/Hdp::Package[hadoop 64]/Hdp::Package::Process_pkg[hadoop 64]/Package[hadoop-sbin]: Skipping because of failed dependencies
    notice: /Stage[main]/Hdp-hadoop/Hdp-hadoop::Package[hadoop]/Hdp::Package[hadoop 64]/Hdp::Package::Process_pkg[hadoop 64]/Package[hadoop-libhdfs]: Dependency Package[glibc.i686] has failures: true

    Any help is welcome.

    W

Viewing 4 replies - 1 through 4 (of 4 total)


  • Author
    Replies
  • #37024

    Started all over again.

    I managed now to get all hosts in the cluster. All hosts files are correct.
    The install failed again.
    The host with ambari-server on it had only warnings.
    For the HBase Master install:
    warning: Scope(Hdp::Configfile[/etc/hbase/conf/hbase-env.sh]): Could not look up qualified variable ‘::hdp-hadoop::params::conf_dir'; class ::hdp-hadoop::params has not been evaluated
    notice: /Stage[1]/Hdp::Snappy::Package/Hdp::Snappy::Package::Ln[32]/Hdp::Exec[hdp::snappy::package::ln 32]/Exec[hdp::snappy::package::ln 32]/returns: executed successfully

    For the other hosts it failed with errors:
    host02: Ganglia Monitor install
    Puppet has been killed due to timeout
    host03: HDFS Client install
    Puppet has been killed due to timeout
    host04: HBase Client install
    Puppet has been killed due to timeout
    Strangely enough, the Ganglia Monitor is installed correctly on host03 and host04.

    So it seems to be due to some kind of timeout? Any suggestions on what I should try? Any information I should pass on?
    How do I check whether the repo is 32- or 64-bit?
    Thanks

    Wim
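
    [Editor's note] The 32- vs 64-bit question can be answered with a few standard commands. A minimal sketch, assuming a Linux node with yum as the package manager (the glibc package names below are taken from the failure in the quoted log):

    ```shell
    # machine architecture: x86_64 means a 64-bit OS, i386/i686 means 32-bit
    uname -m

    # word size of the running userland (prints 64 or 32)
    getconf LONG_BIT

    # on a yum-based node, see which glibc arches the configured repos offer;
    # if glibc.i686 is missing here, the Puppet install step cannot succeed
    if command -v yum >/dev/null 2>&1; then
        yum list available glibc.i686 glibc.x86_64 || true
    fi
    ```

    If `uname -m` prints `x86_64` but the repo lists no `glibc.i686`, the 32-bit dependency in the log has nowhere to come from.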

    #36730

    Yi Zhang
    Moderator

    Hi Wim,

    Can you try installing glibc.i686 manually? Does your repo have that 32-bit version, besides the 64-bit version?

    Thanks,
    Yi

    #36650

    I followed the installation guide.
    I struggled a bit with the SSH configuration but finally managed to get the server with Ambari on it to connect to all the other hosts without a password.
    In the meantime I found what I did wrong and why it would not accept the server with Ambari as a node in the cluster: I forgot to add the public key to authorized_keys on that server, so it could not connect to itself without a password.
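
    [Editor's note] The missing self-login described above can be fixed by authorizing the host's own key. A sketch, assuming the default OpenSSH key paths (`~/.ssh/id_rsa`):

    ```shell
    # on the Ambari server itself: make sure an SSH key pair exists,
    # then authorize its own public key so passwordless ssh to itself works
    SSH_DIR="$HOME/.ssh"
    mkdir -p "$SSH_DIR" && chmod 700 "$SSH_DIR"

    # generate a key pair only if one does not exist yet
    [ -f "$SSH_DIR/id_rsa" ] || ssh-keygen -t rsa -N "" -f "$SSH_DIR/id_rsa" -q

    # append the host's own public key to its authorized_keys
    cat "$SSH_DIR/id_rsa.pub" >> "$SSH_DIR/authorized_keys"
    chmod 600 "$SSH_DIR/authorized_keys"
    ```

    Afterwards, `ssh localhost` from that machine should not prompt for a password, which is what Ambari's Confirm Hosts step needs.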

    I will provide the info about the servers later. I can’t access them from home.

    Thanks,

    Wim

    #36624

    Dave
    Moderator

    Hi Wim,

    What is the OS version of all nodes (including Ambari) in your cluster, and are they all 64-bit? It looks as if your Ambari machine is 32-bit.
    I would expect the installation failure is more than likely network related. Is each node in every other node’s hosts file? Can you reverse-lookup each node, and do the hostnames resolve to the correct IPs?
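
    [Editor's note] The name-resolution checks above can be run on each node with standard tools. A sketch; the peer IP shown is a placeholder, not one of the poster's actual addresses:

    ```shell
    # the name this node believes it has (must match the name Ambari uses)
    hostname -f || hostname

    # sanity-check the resolver itself via /etc/hosts
    getent hosts localhost

    # forward lookup of this node's own name; a failure here means
    # /etc/hosts (or DNS) is missing an entry for it
    getent hosts "$(hostname)" || echo "hostname does not resolve - fix /etc/hosts"

    # reverse lookup of a peer's IP; the returned name must match the
    # hosts-file entry on every node (the IP below is a placeholder)
    getent hosts 192.168.1.12 || echo "no reverse mapping for this IP"
    ```

    `getent hosts` consults `/etc/hosts` before DNS, so it reflects the same resolution order the cluster daemons see.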

    Thanks

    Dave
