HDP on Linux – Installation Forum

Hadoop cluster services

  • #47328
    Vinay Ch s k

    We are planning to build a 20-node Hadoop cluster using Apache Ambari (3 masters and 17 slaves). What is the best plan for distributing the services across the nodes?
    For example: NameNode on one master, with the Secondary NameNode and ResourceManager on the other two masters.
    What about HiveServer, the HBase Master, the Nagios server, etc.?
    So, I need a good plan for distributing them across the cluster, with reasons if possible.
    Thanks and Regards,
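
One way to express such a layout is an Ambari blueprint. The sketch below is illustrative only, under assumed placements (a three-node ZooKeeper quorum spread across the masters, HDFS and YARN masters on separate hosts); the blueprint name, stack version, and host-group split are assumptions, not a verified recommendation:

```json
{
  "Blueprints": {
    "blueprint_name": "hdp-20-node",
    "stack_name": "HDP",
    "stack_version": "2.1"
  },
  "host_groups": [
    {
      "name": "master_1",
      "cardinality": "1",
      "components": [
        { "name": "NAMENODE" },
        { "name": "ZOOKEEPER_SERVER" },
        { "name": "HBASE_MASTER" }
      ]
    },
    {
      "name": "master_2",
      "cardinality": "1",
      "components": [
        { "name": "SECONDARY_NAMENODE" },
        { "name": "ZOOKEEPER_SERVER" },
        { "name": "HIVE_SERVER" },
        { "name": "HIVE_METASTORE" }
      ]
    },
    {
      "name": "master_3",
      "cardinality": "1",
      "components": [
        { "name": "RESOURCEMANAGER" },
        { "name": "ZOOKEEPER_SERVER" },
        { "name": "NAGIOS_SERVER" }
      ]
    },
    {
      "name": "slaves",
      "cardinality": "17",
      "components": [
        { "name": "DATANODE" },
        { "name": "NODEMANAGER" },
        { "name": "HBASE_REGIONSERVER" }
      ]
    }
  ]
}
```

A blueprint like this is registered via Ambari's REST API and then paired with a cluster-creation template that maps actual hostnames to the host groups, so the same layout can be reapplied consistently.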

