
Ambari Forum

Ambari 1.6.1, Disk Usage: Data Unavailable (just on 1 slave)

  • #58098
    Pavel Hladik

Hi, we are running the HDP 2.0.6 stack, and up to Ambari version 1.5.1 everything worked well. We recently upgraded the Ambari Server and Agents to the latest version, 1.6.1, and now one of our slave hosts shows this in the Summary:

    Data: Data Unavailable
    Load Avg: 0.00

All other slaves and masters show correct values. In the Ganglia UI I can see that the Ganglia client is working, but under Dashboard – Heatmaps the slave host shows invalid data.

I tried restarting all services, including the services on the master where the Ganglia Server runs, and also restarted the Nagios service. I tried uninstalling the Ganglia Monitor and installing it again – nothing changed.

Ganglia client services running on the slave host that has the issue:
nobody 25351 0.2 0.0 59108 1564 ? Ssl Jul29 4:58 /usr/sbin/gmond --conf=/etc/ganglia/hdp/HDPSlaves/gmond.core.conf --pid-file=/var/run/ganglia/hdp/HDPSlaves/
nobody 25377 0.2 0.0 42812 996 ? Ssl Jul29 4:53 /usr/sbin/gmond --conf=/etc/ganglia/hdp/HDPJournalNode/gmond.core.conf --pid-file=/var/run/ganglia/hdp/HDPJournalNode/
nobody 25403 0.2 0.0 59108 1616 ? Ssl Jul29 4:57 /usr/sbin/gmond --conf=/etc/ganglia/hdp/HDPDataNode/gmond.core.conf --pid-file=/var/run/ganglia/hdp/HDPDataNode/

Can you please give me a hint on how to get this data back into Ambari? Thanks a lot.

  • Author
  • #58100
    Pavel Hladik

Here are my installed packages (marked with @) from HDP-UTILS- and the packages available from HDP-UTILS-:

    ganglia-gmond.x86_64 3.5.0-99 @HDP-UTILS-
    ganglia-gmond-modules-python.x86_64 3.5.0-99 @HDP-UTILS-
    libganglia.x86_64 3.5.0-99 @HDP-UTILS-
    ganglia-debuginfo.x86_64 3.5.0-99 HDP-UTILS-
    ganglia-devel.x86_64 3.5.0-99 HDP-UTILS-
    ganglia-gmetad.x86_64 3.5.0-99 HDP-UTILS-
    ganglia-web.noarch 3.5.7-99 HDP-UTILS-
    hdp_mon_ganglia_addons.noarch 1.2.2-1.el6 Updates-ambari-1.6.1

    Pavel Hladik

I see a strange exitcode of 1 in ambari-agent.log:

INFO 2014-07-30 15:12:50,845 - Running command ['/usr/bin/python2.6',
INFO 2014-07-30 15:12:50,973 - Result: {'structuredOut': {}, 'stdout': '', 'stderr': '', 'exitcode': 1}

All other exitcodes are 0.
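A quick way to spot such failures is to filter the agent log for non-zero exitcodes. This is just a sketch: it builds a stand-in log file so it can be tried anywhere; on a real HDP host you would point the grep at /var/log/ambari-agent/ambari-agent.log instead.

```shell
# Sketch: list agent commands that finished with a non-zero exitcode.
# A stand-in log is created here so the pipeline is self-contained; on a
# real host, replace "$log" with /var/log/ambari-agent/ambari-agent.log.
log=$(mktemp)
cat > "$log" <<'EOF'
INFO 2014-07-30 15:12:50,973 - Result: {'structuredOut': {}, 'stdout': '', 'stderr': '', 'exitcode': 1}
INFO 2014-07-30 15:12:51,100 - Result: {'structuredOut': {}, 'stdout': '', 'stderr': '', 'exitcode': 0}
EOF
# Extract every exitcode, then drop the successful (0) ones.
grep -o "'exitcode': [0-9]*" "$log" | grep -v ": 0$"   # prints: 'exitcode': 1
```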

    Arnaud LINZ

I have the same issue with Ambari 2.0.0. Everything is fine in the Ganglia UI, but the "metrics" JSON section of http://ambariserver:8080/api/v1/clusters/myCluster/hosts/faultyNode is all 0, resulting in invalid data on the heat maps.

Did you solve your problem? I've spent a day trying various things (such as uninstalling and reinstalling the Ganglia monitor) without success.
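That per-host metrics endpoint can be checked from the command line. The sketch below uses an inline stand-in for the server response so the zero-check itself is runnable anywhere; the URL and the admin:admin credentials are assumptions taken from the post and Ambari's defaults, not verified against this cluster.

```shell
# Sketch: detect the "all metrics are zero" symptom from the post.
# On a real cluster you would fetch the JSON first, e.g.:
#   curl -s -u admin:admin \
#     "http://ambariserver:8080/api/v1/clusters/myCluster/hosts/faultyNode?fields=metrics"
# Stand-in response so the check can be tried offline:
metrics='{"metrics":{"disk":{"disk_total":0.0,"disk_free":0.0},"load":{"load_one":0.0}}}'
# Pull out every numeric value; if none differs from 0.0, Ambari is not
# receiving data from Ganglia for this host.
echo "$metrics" | grep -Eo '[0-9]+\.[0-9]+' | grep -qv '^0\.0$' \
  && echo "metrics look live" || echo "all metrics are zero"
```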

    Sid Wagle

If you edit /etc/ganglia/hdp/HDPSlaves/gmond.core.conf and set debug_level to 10, then restart gmond with service hdp-gmond restart, the HDPSlaves daemon will start up in the foreground and print a large amount of log output indicating what is wrong.
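Those steps can be sketched as a small shell snippet. To keep the example safe to run anywhere it edits a scratch copy of the config; the real file on a slave is /etc/ganglia/hdp/HDPSlaves/gmond.core.conf, and the globals block shown is an assumed minimal excerpt of a Ganglia 3.5 gmond config.

```shell
# Sketch of the debug procedure above, on a scratch copy of gmond.core.conf.
conf=$(mktemp)
cat > "$conf" <<'EOF'
globals {
  daemonize = yes
  debug_level = 0
}
EOF
# debug_level >= 1 keeps gmond in the foreground and makes it log verbosely;
# 10 is the value suggested in the thread.
sed -i 's/debug_level = [0-9]*/debug_level = 10/' "$conf"
grep debug_level "$conf"   # prints:   debug_level = 10
# On the real host, apply the same edit to the actual config and then:
#   sudo service hdp-gmond restart
```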

    Pavel Hladik

This is an issue from July 2014. I'm now using Ambari 2.1 without Ganglia and everything is fine.

