Gmond unable to create tcp_accept_channel


This topic contains 1 reply, has 2 voices, and was last updated by tedr 1 year, 10 months ago.

  • #27051

    finance turd
    Member

    I’ve installed using Ambari on a single-node VM, and all services are running except Ganglia monitoring (gmond). Gmetad is running but reports that it is unable to connect in /var/log/messages, as follows:
    Jun 7 13:41:44 bumblebee /usr/sbin/gmetad[2043]: data_thread() for [my cluster] failed to contact node 127.0.0.1
    Jun 7 13:41:44 bumblebee /usr/sbin/gmetad[2043]: data_thread() got no answer from any [my cluster] datasource

    Jun 7 13:40:19 bumblebee /usr/sbin/gmond[30538]: Unable to create tcp_accept_channel. Exiting.#012

    I have tried this install several times and keep getting the same error. With some debugging, I found the command that starts gmond: /usr/libexec/hdp/ganglia/startGmond.sh

    I then pulled out the actual command it’s executing:
    /usr/sbin/gmond --conf=/etc/ganglia/hdp/HDPSlaves/gmond.core.conf --pid-file=/var/run/ganglia/hdp/HDPSlaves/gmond.pid

    It does seem that the gmond process itself cannot start with the given configuration. I have not changed the Ganglia configuration at all since setup and can attach the files if that would help… they are long, though.
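    As a side note, “Unable to create tcp_accept_channel” generally means gmond could not bind its TCP listen port, most often because another gmond instance already holds it. The sketch below, under the assumption of the stock config layout (default port 8649; the inline config block is a sample, not the actual gmond.core.conf), pulls the port out of the config so it can be checked:

    ```shell
    # Sample tcp_accept_channel block as it appears in a stock gmond.conf
    # (in practice, run the awk against /etc/ganglia/hdp/HDPSlaves/gmond.core.conf).
    conf='tcp_accept_channel {
      port = 8649
    }'

    # Extract the port number from the tcp_accept_channel block.
    port=$(printf '%s\n' "$conf" |
      awk '/tcp_accept_channel/ {f=1} f && /port/ {gsub(/[^0-9]/, "", $3); print $3; exit}')
    echo "$port"

    # Then check whether anything is already bound to it, e.g.:
    #   ss -ltn | grep ":$port"
    # If a line comes back, a stray gmond (or something else) owns the port.
    ```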



  • #27103

    tedr
    Moderator

    Hi Finance,

    Thanks for trying out HDP and Ambari. The problem may lie in the unfortunate fact that when Ganglia is installed, it installs its own startup script in /etc/init.d, which launches a process of its own before Ambari launches the process it wants to manage. The fix for this is to:
    1) pkill -f gmond
    2) pkill -f gmetad
    3) chkconfig gmetad off
    4) go to Ambari services page and start Ganglia
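    The steps above can be sketched as a small script run as root (chkconfig is assumed present, as on the RHEL/CentOS systems HDP targets; the sketch uses pkill -x, which matches the process name exactly, where Ted's commands use -f; the final pgrep check is an extra verification step, not part of the original instructions):

    ```shell
    # Kill the stray daemons started by the packaged /etc/init.d scripts.
    pkill -x gmond  || true
    pkill -x gmetad || true

    # Keep init.d from relaunching gmetad at boot; guarded in case
    # chkconfig is unavailable on this system.
    command -v chkconfig >/dev/null 2>&1 && chkconfig gmetad off

    # Verify nothing survived before starting Ganglia from the Ambari UI.
    for d in gmond gmetad; do
      pgrep -x "$d" >/dev/null && echo "$d still running" || echo "$d stopped"
    done
    ```

    Once both report “stopped”, starting Ganglia from the Ambari services page should let Ambari's gmond bind the port cleanly.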

    Thanks,
    Ted.
