HDP on Linux – Installation Forum

Gmond unable to create tcp_accept_channel

  • #27051
    finance turd

I’ve installed HDP using Ambari on a single-node VM, and all services are running except Ganglia monitoring (gmond). gmetad is running but reports in /var/log/messages that it cannot contact the node:
    Jun 7 13:41:44 bumblebee /usr/sbin/gmetad[2043]: data_thread() for [my cluster] failed to contact node
    Jun 7 13:41:44 bumblebee /usr/sbin/gmetad[2043]: data_thread() got no answer from any [my cluster] datasource

    Jun 7 13:40:19 bumblebee /usr/sbin/gmond[30538]: Unable to create tcp_accept_channel. Exiting.

    I have retried the install several times and keep hitting the same error. With some debugging, I found the script that starts gmond: /usr/libexec/hdp/ganglia/startGmond.sh

    I then pulled out the actual command it executes:
    /usr/sbin/gmond --conf=/etc/ganglia/hdp/HDPSlaves/gmond.core.conf --pid-file=/var/run/ganglia/hdp/HDPSlaves/gmond.pid

    It does seem that the gmond process itself cannot start with the given configuration. I have not changed the Ganglia configuration files since setup and can attach them if that would help… they are long, though.
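One way to see why the tcp_accept_channel cannot be created is to run gmond in the foreground with debugging enabled. This is a sketch, assuming the stock gmond `--debug` flag and Ganglia's default collection port 8649; the config path is the one from the command above:

```shell
# Run gmond in the foreground with verbose debugging (--debug > 0 keeps it
# from daemonizing), so the reason tcp_accept_channel creation fails is
# printed to the terminal instead of being lost in syslog.
if [ -x /usr/sbin/gmond ]; then
    /usr/sbin/gmond --conf=/etc/ganglia/hdp/HDPSlaves/gmond.core.conf --debug=2
fi

# A frequent cause of this error is another process already bound to
# gmond's default collection port (8649), e.g. a distro-started gmond.
if command -v ss >/dev/null 2>&1; then
    ss -tln | grep 8649 || echo "nothing listening on 8649"
fi
```

If the debug output shows a bind failure on 8649, the port-check above should reveal which process already holds it.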


  • Author
  • #27103

    Hi Finance,

    Thanks for trying out HDP and Ambari. The problem may lie in the unfortunate fact that installing Ganglia also installs its own startup scripts in /etc/init.d, which launch a process of their own before Ambari launches the one it manages. The fix for this is to:
    1) pkill -f gmond
    2) pkill -f gmetad
    3) chkconfig gmetad off
    4) go to Ambari services page and start Ganglia
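The steps above can be collected into a small cleanup script. This is a sketch that assumes a RHEL/CentOS-style host (where chkconfig manages init scripts), which is the typical platform for Ambari-managed HDP:

```shell
#!/bin/sh
# Stop the gmond/gmetad processes launched by Ganglia's own init scripts
# so they cannot conflict with the processes Ambari will start.
# pkill exits non-zero when nothing matched, so ignore that case.
pkill -f gmond  || true
pkill -f gmetad || true

# Keep the packaged init script from starting gmetad again at boot.
# chkconfig is the RHEL/CentOS tool; guard it so the script is portable.
if command -v chkconfig >/dev/null 2>&1; then
    chkconfig gmetad off
fi

echo "Cleanup done - now start Ganglia from the Ambari services page."
```

After running it, the final step is still manual: start the Ganglia service from the Ambari services page so that Ambari owns the resulting processes.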


