HDP 1.2.4 Installation Failing


This topic contains 3 replies, has 2 voices, and was last updated by  Dave 1 year, 9 months ago.

  • Creator
    Topic
  • #40281

    Liran Badiri
    Participant

Hi,
The HDP installation is ending with an error (on 3 Linux servers):
    err: /Stage[2]/Hdp-hadoop::Datanode/Hdp-hadoop::Service[datanode]/Hdp::
    Exec[sleep 5; ls /var/run/hadoop/hdfs/hadoop-hdfs-datanode.pid >/dev/null 2>&1 &&
    ps `cat /var/run/hadoop/hdfs/hadoop-hdfs-datanode.pid` >/dev/null 2>&1]
    /Exec[sleep 5;
    ls /var/run/hadoop/hdfs/hadoop-hdfs-datanode.pid >/dev/null 2>&1 &&
    ps `cat /var/run/hadoop/hdfs/hadoop-hdfs-datanode.pid` >/dev/null 2>&1]/returns: change from notrun to 0 failed:
    sleep 5; ls /var/run/hadoop/hdfs/hadoop-hdfs-datanode.pid >/dev/null 2>&1 &&
    ps `cat /var/run/hadoop/hdfs/hadoop-hdfs-datanode.pid` >/dev/null 2>&1 returned 2 instead of one of [0] at /var/lib/ambari-agent/puppet/modules/hdp/manifests/init.pp:438

It looks like the process is starting as the wrong user, and the process .pid file is created under the wrong directory.
The Ambari server shows the service as down (e.g. HBase) because the HBase pid file was created/owned by a different Linux user.
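
A quick way to confirm this on a node (the datanode pid path below is the one from the error above; the same check applies to the other services):

PIDFILE=/var/run/hadoop/hdfs/hadoop-hdfs-datanode.pid
ls -l "$PIDFILE"                           # the pid file should be owned by hdfs
ps -o user,pid,cmd -p "$(cat "$PIDFILE")"  # the running process should also belong to hdfs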

Output from my /etc/passwd and /etc/group files:

    cat /etc/passwd ->
    postgres:x:26:26:PostgreSQL Server:/var/lib/pgsql:/bin/bash
    puppet:x:781:199::/home/puppet:/bin/bash
    nagios:x:782:456::/home/nagios:/bin/bash
    ambari-qa:x:1012:457::/home/ambari-qa:/bin/bash
    hbase:x:784:457::/home/hbase:/bin/bash
    hdfs:x:199:457:Hadoop HDFS:/usr/lib/hadoop:/bin/bash
    rrdcached:x:199:100:rrdcached:/var/rrdtool/rrdcached:/sbin/nologin
    apache:x:48:48:Apache:/var/www:/sbin/nologin
    mapred:x:199:457:Hadoop MapReduce:/usr/lib/hadoop:/bin/bash
    #hdfs:x:199:457:Hadoop HDFS:/usr/lib/hadoop:/bin/bash
    zookeeper:x:199:457:ZooKeeper:/var/run/zookeeper:/bin/bash

    cat /etc/group ->
    postgres:x:26:
    puppet:x:199:
    nagios:x:456:apache
    hadoop:x:457:hbase,mapred,hdfs
    apache:x:48:
    hdfs:x:199:
    mapred:x:199:
    hbase:x:199:
    nagiocmd:x:458:apache
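
Note that several of the accounts above share UID 199 (hdfs, rrdcached, mapred, zookeeper), and several groups share GID 199, which would match the wrong-owner symptom. A quick check for duplicate IDs on each node:

awk -F: '{print $3}' /etc/passwd | sort | uniq -d    # UIDs used more than once
awk -F: '{print $3}' /etc/group  | sort | uniq -d    # GIDs used more than once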

    thanks
    liran badiri



  • Author
    Replies
  • #40742

    Dave
    Moderator

    Hi Liran,

I haven't come across this issue before and have installed HDP 1.3.2 multiple times without fail.
Were your clusters of 3 machines cloned from the same machine? That would explain why they all experience the same issue.

    Thanks

    Dave

    #40723

    Liran Badiri
    Participant

Hi Dave,
Thanks for your reply.
A small correction to the title: I installed HDP 1.3.2.
To fix the problem, I deleted the pid files that had been created under the wrong directory (so Ambari cannot locate them).
Then I changed /etc/passwd, moving the hdfs user two lines up -> this solved the HDFS error, because the pid file owner now resolved to the hdfs user (with several accounts sharing UID 199, the UID maps to whichever entry comes first in /etc/passwd) and the file was created under the correct directory.
Now Ambari finds the pid file and shows the correct status of HDFS.

The remaining issue was with the mapred process -> I changed its UID and that solved the error.
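
For reference, the UID change on each node looked roughly like this (a sketch only: 1013 is just an example of a free UID, and the paths are examples too - adjust them to your layout and stop the affected services first):

usermod -u 1013 mapred                    # give mapred its own, unused UID
chown -R mapred:hadoop /var/run/hadoop/mapred /var/log/hadoop/mapred    # example paths - re-own the daemon's run/log dirs
rm -f /var/run/hadoop/mapred/*.pid        # drop any stale pid files, then restart the service from Ambari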

Is this a known bug in the HDP 1.3.2 install, or am I the first one to apply this manual fix? (I installed it on 2 clusters of 3 Linux VMs and hit the same issue on both.)

    thanks
    liran

    #40579

    Dave
    Moderator

    Hi Liran,

You want to kill the process whose PID is in the file (the one started as the different user), and remove the PID file.
Then you want to start the service via Ambari.

If Ambari installed and configured everything and only failed at starting the services, then you will be able to do this.
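
Something along these lines on the affected node, using the datanode pid file from your original error as the example (substitute whichever pid file Ambari is complaining about):

PIDFILE=/var/run/hadoop/hdfs/hadoop-hdfs-datanode.pid   # example - use the pid file from the failing service
kill "$(cat "$PIDFILE")"    # stop the instance that was started as the wrong user
rm -f "$PIDFILE"            # remove the stale pid file
# then start the service again from the Ambari web UI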

    Let me know how it goes,

    Thanks

    Dave
