Ambari Forum

Ambari 1.6.1, Disk Usage: Data Unavailable (just on 1 slave)

  • #58098
    Pavel Hladik
    Participant

    Hi, we are running the HDP 2.0.6 stack, and up to Ambari version 1.5.1 everything worked well. We recently upgraded the Ambari Server and Agents to the latest version, 1.6.1, and now one of our slave hosts shows this in the Summary:

    Data: Data Unavailable
    Load Avg: 0.00

    All other slaves and masters show correct values. If I go to the Ganglia UI, I can see that the Ganglia client is working. If I go to Dashboard > Heatmaps, I see invalid data for that slave host.

    I tried restarting all services, including the services on the master where the Ganglia Server runs, and also restarting the Nagios service. I tried uninstalling the Ganglia Monitor and installing it again, but nothing changed.

    These Ganglia client processes are running on the affected slave host:
    nobody 25351 0.2 0.0 59108 1564 ? Ssl Jul29 4:58 /usr/sbin/gmond --conf=/etc/ganglia/hdp/HDPSlaves/gmond.core.conf --pid-file=/var/run/ganglia/hdp/HDPSlaves/gmond.pid
    nobody 25377 0.2 0.0 42812 996 ? Ssl Jul29 4:53 /usr/sbin/gmond --conf=/etc/ganglia/hdp/HDPJournalNode/gmond.core.conf --pid-file=/var/run/ganglia/hdp/HDPJournalNode/gmond.pid
    nobody 25403 0.2 0.0 59108 1616 ? Ssl Jul29 4:57 /usr/sbin/gmond --conf=/etc/ganglia/hdp/HDPDataNode/gmond.core.conf --pid-file=/var/run/ganglia/hdp/HDPDataNode/gmond.pid

    Can you please give me a hint on how to get this data back into Ambari? Thanks a lot.
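
    Since the gmond processes above are running, one way to narrow this down is to probe them directly: gmond serves its collected metrics as an XML dump on its tcp_accept_channel, so a short script can confirm whether the affected slave actually appears in the data. A minimal sketch, assuming the HDPSlaves gmond listens on TCP port 8660 (the port here is an assumption, not stated in the thread):

```python
import socket
import xml.etree.ElementTree as ET

def fetch_gmond_xml(host, port=8660, timeout=5.0):
    # gmond writes its full XML dump on connect and then closes the
    # connection, so we simply read until EOF.
    with socket.create_connection((host, port), timeout=timeout) as sock:
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", "replace")

def hosts_reporting(xml_text):
    # Names of the hosts present in the dump; a slave missing from this
    # list would be consistent with "Data Unavailable" in Ambari.
    root = ET.fromstring(xml_text)
    return [h.get("NAME") for h in root.iter("HOST")]
```

    If the problem slave is missing from `hosts_reporting(...)`, the gap is on the Ganglia side; if it is present, the gap is between gmetad and Ambari.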


  • #58100
    Pavel Hladik
    Participant

    Here are my installed packages (marked with @) from HDP-UTILS-1.1.0.16 and the packages available from HDP-UTILS-1.1.0.17:

    ganglia-gmond.x86_64 3.5.0-99 @HDP-UTILS-1.1.0.16
    ganglia-gmond-modules-python.x86_64 3.5.0-99 @HDP-UTILS-1.1.0.16
    libganglia.x86_64 3.5.0-99 @HDP-UTILS-1.1.0.16
    ganglia-debuginfo.x86_64 3.5.0-99 HDP-UTILS-1.1.0.17
    ganglia-devel.x86_64 3.5.0-99 HDP-UTILS-1.1.0.17
    ganglia-gmetad.x86_64 3.5.0-99 HDP-UTILS-1.1.0.17
    ganglia-web.noarch 3.5.7-99 HDP-UTILS-1.1.0.17
    hdp_mon_ganglia_addons.noarch 1.2.2-1.el6 Updates-ambari-1.6.1
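
    For what it's worth, the @ prefix in that listing is how yum marks the repo a package was installed from; rows without it are only available, not installed. A small helper (a sketch, not part of Ambari) can split such `yum list` rows:

```python
def split_installed(rows):
    # Each `yum list` row is "name version repo"; a repo column starting
    # with '@' means the package is installed from that repo.
    installed, available = [], []
    for row in rows:
        name, _version, repo = row.split()
        (installed if repo.startswith("@") else available).append(name)
    return installed, available
```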

  • #58102
    Pavel Hladik
    Participant

    I see a strange exit code of 1 in ambari-agent.log:

    INFO 2014-07-30 15:12:50,845 PythonExecutor.py:78 - Running command ['/usr/bin/python2.6',
    u'/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HDFS/package/scripts/hdfs_client.py',
    'STATUS',
    '/var/lib/ambari-agent/data/status_command.json',
    u'/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HDFS/package',
    '/var/lib/ambari-agent/data/structured-out-status.json',
    'INFO']
    INFO 2014-07-30 15:12:50,973 PythonExecutor.py:112 - Result: {'structuredOut': {}, 'stdout': '', 'stderr': '', 'exitcode': 1}

    All other exit codes are 0.
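
    For context, the agent's status commands use the exit code alone to decide component state: 0 means the component reports healthy, and any non-zero value is treated as not running. A rough sketch of that convention (the helper and its `interpreter` argument are illustrative, not Ambari's actual code):

```python
import subprocess

def component_status(script, extra_args=(), interpreter="python2.6"):
    # Run a stack status script the way PythonExecutor does: the script's
    # exit code, not its output, decides the reported component state.
    result = subprocess.run([interpreter, script, "STATUS", *extra_args])
    # Exit code 0 => healthy; non-zero (like the 1 logged above) => the
    # component is treated as not running / unknown.
    return "RUNNING" if result.returncode == 0 else "UNKNOWN"
```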

