
HDP on Linux – Installation Forum


  • #37022
    bruno bonnin


I have installed the latest version of HDP2 and I have a problem with the Nagios instance provided by HDP: when I click any link on the right side of the Nagios web UI, I get the same error:

    Error: Could not read object configuration data!

    Here are some things you should check in order to resolve this error:

    Verify configuration options using the -v command-line option to check for errors.
    Check the Nagios log file for messages relating to startup or status data errors.

    Make sure you read the documentation on installing, configuring and running Nagios thoroughly before continuing. If all else fails, try sending a message to one of the mailing lists. More information can be found at

There are no errors in the logs (nagios and httpd), and I do receive mails from Nagios each time a problem occurs.
    Thanks for any help!


  • #62300
    Emiliano X

This post is old, but for future reference: the real problem is in the Nagios config and CGI scripts. The parameter “retry_check_interval” must be an integer for the CGIs to work, but on many lines it defaults to 0.5 or 0.25.

Ambari in HDP2.1 uses a template for this configuration, located in:


    You need to edit hadoop-services.cfg.j2 and replace every occurrence of “retry_check_interval 0.25” or “retry_check_interval 0.5” with “retry_check_interval 1”.

    Restart NAGIOS.
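
    The replacement above can be scripted. A minimal sketch, assuming a local copy of the template named `hadoop-services.cfg.j2` (the actual location under Ambari varies by version, so point `TEMPLATE` at the real file on your system):

    ```shell
    # Force every fractional retry_check_interval to the integer 1, keeping a backup.
    # TEMPLATE is a hypothetical local copy; set it to Ambari's real template path.
    TEMPLATE=hadoop-services.cfg.j2

    cp "$TEMPLATE" "$TEMPLATE.bak"
    sed -i -E 's/(retry_check_interval[[:space:]]+)0\.(25|5)\b/\11/' "$TEMPLATE"

    # Show the resulting lines so you can verify before restarting Nagios.
    grep -n 'retry_check_interval' "$TEMPLATE"
    ```

    The `sed` invocation uses GNU extensions (`-i`, `-E`); on other platforms, redirect to a temporary file instead of editing in place.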

Hope this helps someone.


The forum ‘HDP on Linux – Installation’ is closed to new topics and replies.

