HDFS Forum

Service failures in Ambari Web UI with HDP 2.0

  • #46224
    Dharanikumar Bodla
    Participant

    Hi to all,
    I am new to Ambari and Hadoop. The installation of Ambari with HDP 2.0 went fine on CentOS 6; /etc/hosts has master.hadoop (192.168.2.120) and slave.hadoop (192.168.2.121).
    ambari-agent is installed on slave.hadoop. My only problem is with the HDFS, MAPREDUCE2 and HBASE services.
    The following services report errors:
    HBASE:
    notice: /Stage[2]/Hdp-hbase::Hbase::Service_check/Anchor[hdp-hbase::hbase::service_check::end]: Dependency Exec[/tmp/hbase-smoke.sh] has failures: true
    notice: Finished catalog run in 301.72 seconds
    MAPREDUCE2:
    notice: /Stage[2]/Hdp-yarn::Mapred2::Service_check/Anchor[hdp-yarn::mapred2::service_check::end]: Dependency Exec[hadoop --config /etc/hadoop/conf fs -rm -r -f /user/ambari-qa/mapredsmokeoutput /user/ambari-qa/mapredsmokeinput] has failures: true
    notice: Finished catalog run in 3.65 seconds
    HDFS:
    notice: /Stage[2]/Hdp-hbase::Hbase::Service_check/Anchor[hdp-hbase::hbase::service_check::end]: Dependency Exec[/tmp/hbase-smoke.sh] has failures: true
    notice: Finished catalog run in 301.72 seconds
    Please find the errors listed above.
    I also have another question: how do I run a map/reduce program in the YARN environment? A detailed example would help.
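    A minimal sketch of what I mean, assuming the examples jar from a default HDP 2.0 install (the path may differ on your machines):

        # Run the bundled pi estimator on YARN: 10 map tasks, 100 samples each
        hadoop jar /usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar pi 10 100

        # List the applications YARN is currently running
        yarn application -list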

    thanks & regards,
    Dharani Kumar Bodla

  • #46365
    Robert Molina
    Moderator

    Hi Dharanikumar,
    The smoke tests that seem to be failing are only those for HBase and MapReduce; the HDFS error you referenced is actually from the HBase smoke test. Can you confirm that the HDFS services are running properly? Before the HBase and MapReduce smoke tests can go through successfully, HDFS has to be up and running. To confirm that HDFS is healthy, visit the NameNode web UI and check that it is not in safe mode and that the cluster has live DataNodes and available capacity.
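    For example, you could check the same things from the command line on the NameNode host (assuming the default hdfs service user):

        # Report live/dead DataNodes and configured vs. remaining capacity
        sudo -u hdfs hdfs dfsadmin -report

        # Confirm the NameNode is not stuck in safe mode
        sudo -u hdfs hdfs dfsadmin -safemode get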

    Regards,
    Robert

    #46440
    Dharanikumar Bodla
    Participant

    Hi Robert,
    Thanks for the information.
    I have another doubt: can I run multiple map/reduce jobs at a time from a single slave? If so, how do I run them?
    At present I can run jobs from two terminals, but they run one after the other; I need the jobs to run simultaneously.
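    Roughly what I am after is something like the following from a single terminal, where both jobs would run at the same time (the examples jar path and the input/output paths are only placeholders):

        # Submit two example jobs without waiting for the first to finish
        hadoop jar /usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar pi 4 1000 &
        hadoop jar /usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar wordcount /user/ambari-qa/in /user/ambari-qa/out &
        wait    # block until both background jobs complete

    I am not sure whether they would actually execute in parallel, since I guess that depends on the YARN scheduler and on the cluster having enough free containers for both jobs.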

    thanks & regards,
    Dharani Kumar bodla.

    #46494
    Dharanikumar Bodla
    Participant

    Hi to all,
    These are the errors I am getting in Nagios, even though the hosts are up and running:
    h.hadoop DATANODE::DataNode space CRITICAL 01-08-2014 19:25:17 0d 3h 13m 26s 2/2 CRITICAL: Data inaccessible, Status code = 200
    s.hadoop HDFS::Percent DataNodes with space available CRITICAL 01-08-2014 19:27:24 0d 3h 12m 11s 1/1 CRITICAL: total:<2>, affected:<1>
    What does "Percent DataNodes with space available CRITICAL: total:<2>, affected:<1>" mean, and how do I resolve it?

    thanks & regards,
    Dharani Kumar Bodla.

    #50926
    Robert Molina
    Moderator

    Hi Dharani Kumar Bodla,
    The error indicates that some of the data mounts on your DataNodes may not be accessible. I would log into each box and run df -h to check for space limitations or missing mounts.
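    For example, on each DataNode host (assuming the default Ambari data directory /hadoop/hdfs/data; substitute whatever dfs.datanode.data.dir is set to in your configuration):

        # Verify the data mount is present and has free space
        df -h /hadoop/hdfs/data

        # Cross-check per-DataNode capacity as HDFS sees it
        sudo -u hdfs hdfs dfsadmin -report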

    Regards,
    Robert
