
The legacy Hortonworks Forum is now closed; a read-only version of the former site remains available. The site will be taken offline on January 31, 2016.

HDP on Linux – Installation Forum

Cluster Configuration

  • #28554

    During installation of Hadoop I am required to provide cluster configuration information as well as database details. I believe I have the database details correct, but I would like more information on the clusterproperties.txt file. Below are the areas in question. Thanks

  • #28557

    Hi Mark,
    It sounds like you are doing the manual RPM install. Is there any reason why you don’t want to use Ambari to do the installation? Also, regarding your question, I am not quite clear what you are asking. Are you looking for information on what each of the hosts is, i.e., what the NAMENODE_HOST or OOZIE_SERVER_HOST should be?



    Hi Robert,
    I am setting up a POC environment to determine the viability of this approach for our business. I was not aware of the Ambari setup you mentioned; direction to it would be appreciated. Additionally, yes, I am looking for more detail on what each of these is and where it must reside, as well as a far more detailed description of what to populate in place of NAMENODE_HOST, etc. Thank you for your response.



    Hi Mark,

    You are doing the HDP on Windows install, correct? If so, the value after the ‘=’ should be the actual fully qualified host name of the box you intend to use for each service, such as NAMENODE_HOST=<hostname>, where <hostname> is replaced by the computer name of that box.

    And these names must be resolvable via DNS.

    I hope this helps,
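    To make the reply above concrete, a clusterproperties.txt entry maps each service role to a fully qualified host name. The sketch below is hypothetical: the host names are placeholders, and any key beyond the NAMENODE_HOST and OOZIE_SERVER_HOST mentioned in this thread is an assumption — consult the HDP installation guide for your version for the authoritative key list.

    ```
    # Each value must be a fully qualified host name resolvable via DNS.
    # Host names (node1.example.com, ...) are placeholders for this sketch.
    NAMENODE_HOST=node1.example.com
    OOZIE_SERVER_HOST=node2.example.com
    # The keys below are assumed for illustration; verify them against your HDP version's docs.
    SECONDARY_NAMENODE_HOST=node3.example.com
    SLAVE_HOSTS=node4.example.com,node5.example.com
    ```

    Before running the installer, you can confirm each name resolves from the install host, e.g. with `nslookup node1.example.com` or `ping node1.example.com`.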

The forum ‘HDP on Linux – Installation’ is closed to new topics and replies.
