
HDP on Linux – Installation Forum

Manual Installation of Ganglia

  • #49345
    Gwenael Le Barzic

    Hello!

    I’m trying to install Ganglia on the following architecture:
    – 1 master
    – 3 slaves

    I already installed Hadoop and some other components with Ambari, but I didn’t install Ganglia at that time.

    So, in order to install Ganglia, I downloaded the companion tar.gz file and followed the manual installation guide for Ganglia.

    At the end of the installation guide, there are some steps to validate the installation. On the Ganglia master node, which is also my master node for Hadoop, I tried the following command:

    [root@master ~]# /etc/init.d/hdp-gmetad start
    Starting hdp-gmetad...
    Failed to create base directory '-B': Permission denied
    chgrp: cannot access '/var/run/ganglia/hdp/rrdcached.sock': No such file or directory
    chgrp: cannot access '/var/run/ganglia/hdp/': No such file or directory
    Failed to start /usr/bin/rrdcached
    Not starting /usr/sbin/gmetad because starting /usr/bin/rrdcached failed.

    I investigated and found that the problem comes from a variable that is not set, which is why the creation of the base directory fails.

    I created a second topic for this problem, because my previous one was more about a problem in my own repo folder.

    The command that launches rrdcached in the script includes this line:

    -m 664 -l unix:${RRDCACHED_ALL_ACCESS_UNIX_SOCKET} \

    Which gives the following once I echo the expanded command during the run of the script:
    su nobody -c /usr/bin/rrdcached -p /var/run/ganglia/hdp/
    -m 664 -l unix:/var/run/ganglia/hdp/rrdcached.sock
    -m 777 -P FLUSH,STATS,HELP -l unix:/var/run/ganglia/hdp/
    -b -B

    It seems ${RRDCACHED_BASE_DIR} is not set, which would explain the bare “-b -B” at the end of the command.
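
    Here is a minimal illustration of what I think happens (a sketch with a made-up BASE_DIR variable, not the actual line from the script): when an unquoted variable expands to nothing, the next flag slides into the argument position of -b.

    # Hypothetical reproduction of the empty-variable expansion
    BASE_DIR=
    set -- -b ${BASE_DIR} -B
    echo "$@"
    # prints: -b -B
    # rrdcached then takes '-B' as its base directory, which matches
    # "Failed to create base directory '-B': Permission denied"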

    Do you have any idea what I can test, or where I can investigate, to understand what is going on?
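
    As a first test, I suppose I could check where the variable is expected to be defined and export a value by hand before retrying (the value below is only a guess at a typical rrd location, not something taken from the guide):

    # Find where the script expects RRDCACHED_BASE_DIR to come from
    grep -n 'RRDCACHED_BASE_DIR' /etc/init.d/hdp-gmetad

    # Hypothetical workaround: export a value and retry
    export RRDCACHED_BASE_DIR=/var/lib/ganglia/rrds
    /etc/init.d/hdp-gmetad start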

    Best regards.


  • Author
  • #49346
    Gwenael Le Barzic

    Just as a note: on the three slaves, /etc/init.d/hdp-gmond start worked fine.

    On the contrary, on the master, even the command “/etc/init.d/hdp-gmond start” failed, with the following error message:
    [root@master ~]# /etc/init.d/hdp-gmond start
    Starting hdp-gmond...
    Failed to start /usr/sbin/gmond for cluster HDPHistoryServer
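
    For debugging, I suppose I could run gmond for the failing cluster in the foreground with debug output (the config path below just follows the HDP companion layout, I have not verified it):

    # Run the HDPHistoryServer gmond in debug mode instead of via the init script
    /usr/sbin/gmond -c /etc/ganglia/hdp/HDPHistoryServer/gmond.core.conf -d 2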

    For your information, here is the list of components per host:

    On the master:
    HDFS Client
    History Server
    Hive Client
    Hive Metastore
    MapReduce2 Client
    MySQL Server
    Oozie Client
    Oozie Server
    WebHCat Server
    YARN Client
    ZooKeeper Client
    ZooKeeper Server

    Slave 1:
    ZooKeeper Server

    Slave 2:
    ZooKeeper Server

    Slave 3:
    ZooKeeper Server

    Best regards.

