
HDP on Linux – Installation Forum

Ambari – Metrics are not shown

  • #43061
    Ardavan Moinzadeh
    Participant

    I have an HDP 1.3 cluster of 3 nodes installed in a public cloud environment, and after the installation I have a few alerts on my console:

    1- I am getting an “Ambari Agent process CRIT for 8 days CRITICAL – Socket timeout after 10 seconds” alert on two of my nodes, yet as far as I can tell the Ambari agent is running on both of them; killing the process and restarting it didn’t help (see the port-check sketch at the end of this post).
    2- I am getting a “JobTracker CPU utilization UNKNOWN for 8 days Unable to contact host: X” alert on the node where my JobTracker is installed.
    The MapReduce smoke test provided in the Ambari console finishes successfully; however, in the “Jobs” tab I don’t see anything, and the MapReduce summary window shows no information about the MapReduce job that is running:
    {
    JobTracker Started View Host
    TaskTrackers 3/3 Trackers Live View Hosts
    Job Trackers Uptime Not Running
    TaskTrackers Status 0 blacklist / 0 graylist / decommissioning
    JobTracker Heap n/a of n/a (0.0% used)
    Total Slots Capacity 0 maps / 0 reduces / n/a avg per node
    Total Jobs n/a submitted / n/a completed
    Map Slots n/a occupied / n/a reserved
    Reduce Slots n/a occupied / n/a reserved
    Tasks: Maps n/a running / n/a waiting
    Tasks: Reduces n/a running / n/a waiting
    }
    3- Also, for nodes 2 & 3 Ganglia is not showing any graphs or information, neither in Ambari nor at X.X.X/ganglia. Only my main master node, where Ambari is installed, shows up. As far as I can tell, the Ganglia components are running on all nodes (the port-check sketch at the end of this post also probes the gmond port).

    P.S. All required ports for the HDP installation are listening.

    I don’t know if these issues are related or not, but I’d appreciate any help/suggestions.
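
    A minimal port-check sketch (Python) of the kind of TCP probes Nagios performs for alerts 1 and 3 above. The ports are only the usual defaults (Ambari agent ping port 8670, Ganglia gmond 8649) and the hostname is a placeholder; treat both as assumptions to adjust for your cluster, not values confirmed in this thread.

    import socket

    # (name, host, port) triples to probe; hostname and ports are placeholders.
    CHECKS = [
        ("ambari-agent ping port", "node2.example.com", 8670),
        ("ganglia gmond",          "node2.example.com", 8649),
    ]

    def check_tcp(host, port, timeout=10):
        """Return True if a TCP connection succeeds within `timeout` seconds."""
        try:
            s = socket.create_connection((host, port), timeout=timeout)
            s.close()
            return True
        except (socket.error, socket.timeout):
            return False

    for name, host, port in CHECKS:
        status = "OK" if check_tcp(host, port) else "FAILED (timeout or refused)"
        print("%s on %s:%d -> %s" % (name, host, port, status))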

  • #43066
    Ardavan Moinzadeh
    Participant

    I thought this might help to diagnose the issue; these are the last few lines in /var/log/http/error_log (a quick RRD directory check follows the log):

    [Tue Nov 05 20:06:24 2013] [error] [client 10.4.1.50] PHP Warning: file_get_contents(/var/nagios/status.dat) [function.file-get-contents]: failed to open stream: No such file or directory in /usr/share/hdp/nagios/nagios_alerts.php on line 132
    [Tue Nov 05 20:06:39 2013] [error] [client 10.4.1.50] PHP Warning: file_get_contents(/var/nagios/status.dat) [function.file-get-contents]: failed to open stream: No such file or directory in /usr/share/hdp/nagios/nagios_alerts.php on line 132
    [Tue Nov 05 20:10:17 2013] [notice] caught SIGTERM, shutting down
    [Tue Nov 05 20:10:17 2013] [notice] suEXEC mechanism enabled (wrapper: /usr/sbin/suexec)
    [Tue Nov 05 20:10:17 2013] [notice] SSL FIPS mode disabled
    [Tue Nov 05 20:10:17 2013] [warn] module proxy_ajp_module is already loaded, skipping
    [Tue Nov 05 20:10:17 2013] [notice] Digest: generating secret for digest authentication ...
    [Tue Nov 05 20:10:17 2013] [notice] Digest: done
    [Tue Nov 05 20:10:17 2013] [notice] SSL FIPS mode disabled
    [Tue Nov 05 20:10:17 2013] [notice] Apache/2.2.3 (CentOS) configured -- resuming normal operations
    ERROR: opening '/var/lib/ganglia/rrds/HDPJobTracker/__SummaryInfo__/cpu_num.rrd': No such file or directory
    ERROR: opening '/var/lib/ganglia/rrds/HDPJobTracker/__SummaryInfo__/load_one.rrd': No such file or directory
    ERROR: opening '/var/lib/ganglia/rrds/HDPJobTracker/__SummaryInfo__/cpu_num.rrd': No such file or directory
    ERROR: opening '/var/lib/ganglia/rrds/HDPJobTracker/__SummaryInfo__/load_one.rrd': No such file or directory
    ERROR: invalid rpn expression in: 0
    ERROR: opening '/var/lib/ganglia/rrds/HDPJobTracker/__SummaryInfo__/cpu_num.rrd': No such file or directory
    ERROR: opening '/var/lib/ganglia/rrds/HDPJobTracker/__SummaryInfo__/load_one.rrd': No such file or directory
    ERROR: opening '/var/lib/ganglia/rrds/HDPJobTracker/__SummaryInfo__/cpu_num.rrd': No such file or directory
    ERROR: opening '/var/lib/ganglia/rrds/HDPJobTracker/__SummaryInfo__/load_one.rrd': No such file or directory
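
    Those rrdtool errors suggest gmetad has not created the HDPJobTracker summary RRDs at all. Below is a small, purely illustrative Python sketch (run on the Ganglia server host) that lists what actually exists under the RRD root seen in the log; the path is taken from the errors above, so adjust it if your gmetad rrd_rootdir differs.

    import os

    RRD_ROOT = "/var/lib/ganglia/rrds"  # rrd_rootdir seen in the errors above

    # For each Ganglia cluster directory (e.g. HDPJobTracker), print how many
    # .rrd files each host/summary subdirectory actually contains.
    for cluster in sorted(os.listdir(RRD_ROOT)):
        cluster_dir = os.path.join(RRD_ROOT, cluster)
        if not os.path.isdir(cluster_dir):
            continue
        print(cluster)
        for host in sorted(os.listdir(cluster_dir)):
            host_dir = os.path.join(cluster_dir, host)
            if not os.path.isdir(host_dir):
                continue
            rrds = [f for f in os.listdir(host_dir) if f.endswith(".rrd")]
            print("  %-35s %d rrd file(s)" % (host, len(rrds)))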
