Home Forums HDP on Linux – Installation How to configure multi node cluster

This topic contains 4 replies, has 5 voices, and was last updated by  Son Hai Ha 4 months ago.

  • Creator
  • #43839

    Durga Prasad


    Can anyone please explain how to set up a multi-node cluster?


Viewing 4 replies - 1 through 4 (of 4 total)


  • Author
  • #53994

    Son Hai Ha

    I hope this can help. I summarize the manual guide here: http://docs.hortonworks.com/HDPDocuments/Ambari-

    The process below describes installing Ambari 1.5.1 on a cluster of VMs in OpenStack, where some ports and resource websites are blocked by the company firewall. The VMs running Ambari use the standard “CentOS 6.4 minimal” image. We intended to install Hadoop 1.3.3 on the cluster.

    + Edit the file /etc/hosts on all hosts to use fully qualified domain names. Append one record per node to the end of the file, like this:
    ###.###.###.### node1.hadoop.test node1
    ###.###.###.### node2.hadoop.test node2

    so that nodes can ping each other by hostname.
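    To verify name resolution after editing /etc/hosts, a quick check can be run from each node (the host names below are the example placeholders from above, not real values):

    ```shell
    # Ping every node once by short name and by FQDN; report which names fail.
    # Names are placeholders from the /etc/hosts example above.
    for h in node1 node1.hadoop.test node2 node2.hadoop.test; do
        ping -c 1 "$h" > /dev/null 2>&1 && echo "$h OK" || echo "$h FAILED"
    done
    ```
    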

    + Set the hostname on each node (and persist it in /etc/sysconfig/network):
    hostname fully.qualified.domain.name
    vi /etc/sysconfig/network
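    On CentOS 6 the edit to /etc/sysconfig/network typically looks like the sketch below; the FQDN is the placeholder from the example above:

    ```shell
    # Set the hostname for the current session (takes effect immediately):
    hostname node1.hadoop.test   # placeholder FQDN

    # Persist it across reboots: /etc/sysconfig/network should contain lines like
    #   NETWORKING=yes
    #   HOSTNAME=node1.hadoop.test
    ```
    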


    + Disable iptables for Ambari on all hosts
    chkconfig iptables off
    /etc/init.d/iptables stop

    + Disable SELinux on all hosts
    setenforce 0
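    Note that setenforce 0 only lasts until the next reboot. To make the change permanent on CentOS 6, the usual approach (not spelled out in the post) is to edit /etc/selinux/config:

    ```shell
    # setenforce 0 disables SELinux only for the running system.
    # Persist the setting across reboots:
    sed -i 's/^SELINUX=.*/SELINUX=disabled/' /etc/selinux/config
    # Confirm the persisted value:
    grep '^SELINUX=' /etc/selinux/config
    ```
    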

    + Set the umask value on all hosts
    umask 022
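    As with setenforce, umask 022 applies only to the current shell session. A common way to apply it to all login shells (an assumption on my part, not from the guide) is:

    ```shell
    # Apply umask 022 to all future login shells:
    echo "umask 022" >> /etc/profile
    ```
    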

    + Run NTP on all hosts
    yum install ntp ntpdate ntp-doc (install)
    chkconfig ntpd on (enable the service)
    ntpdate pool.ntp.org (sync the time)
    /etc/init.d/ntpd start (start the daemon)

    + Disable IPv6 (optional, in case ambari-server listens on an IPv6 port)
    sysctl -w net.ipv6.conf.all.disable_ipv6=1
    sysctl -w net.ipv6.conf.default.disable_ipv6=1
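    Settings applied with sysctl -w are lost on reboot; to persist them, they can also be written to /etc/sysctl.conf:

    ```shell
    # Persist the IPv6 settings across reboots:
    cat >> /etc/sysctl.conf <<'EOF'
    net.ipv6.conf.all.disable_ipv6 = 1
    net.ipv6.conf.default.disable_ipv6 = 1
    EOF
    sysctl -p   # reload settings from /etc/sysctl.conf
    ```
    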

    + Set up your local repository (optional, if the Ambari server cannot reach the Hortonworks repositories)
    ++ Install the Apache web server:
    yum install httpd
    /etc/init.d/httpd start

    ++ Download the HDP packages from: http://public-repo-1.hortonworks.com/HDP-UTILS-
    yum install yum-utils createrepo
    mkdir -p /var/www/html/
    cd /var/www/html/

    Untar the downloaded file here.
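    A sketch of the remaining local-mirror steps; the archive name, directory, and repo file are hypothetical placeholders, not values from the post:

    ```shell
    # Placeholder archive name and paths -- adjust to the file actually downloaded.
    cd /var/www/html/
    tar -xzvf /path/to/HDP-UTILS-*.tar.gz    # extract the repo tarball here
    # Rebuild yum metadata; may be unnecessary if the tarball already ships repodata.
    createrepo /var/www/html/hdp-utils

    # On each cluster node, point yum at the local mirror:
    cat > /etc/yum.repos.d/hdp-local.repo <<'EOF'
    [HDP-LOCAL]
    name=Local HDP mirror
    baseurl=http://ambari.server.host/hdp-utils
    enabled=1
    gpgcheck=0
    EOF
    ```
    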

    - Open ports 8440 and 8441 in the security group; otherwise the Ambari agents cannot register with the Ambari server.
    - Open port 2181, 2888, 3888 for ZooKeeper
    - Open port 60000, 60010, 60020, and 60030 for HBase
    - Open port 50111 for WebHCat
    - Open port 50070, 50470, 8020, 9000, 50075, 50475, 50010, 50020, and 50090 for HDFS
    - Open ports 51111, 19888, 50060, 50030, and 9021 for MapReduce (13562 and 50300 are not specified in the manual guide but should be opened)
    - Open port 10000 and 9083 for Hive
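    Since the cluster runs in OpenStack, these ports are opened in the security group rather than via iptables. A hedged sketch using the legacy nova CLI of that era (group name and CIDR are assumptions; restrict the CIDR for production):

    ```shell
    # Open each required TCP port in the "default" security group (assumed name).
    for p in 8440 8441 2181 2888 3888 60000 60010 60020 60030 50111 \
             50070 50470 8020 9000 50075 50475 50010 50020 50090 \
             51111 19888 50060 50030 9021 13562 50300 10000 9083; do
        nova secgroup-add-rule default tcp "$p" "$p" 0.0.0.0/0
    done
    ```
    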

    Run Ambari Server Setup
    ambari-server setup

    Start Ambari Server
    ambari-server start

    Access the Ambari web UI at: http://ambari.server.host:8080/
    Follow the wizard to create your cluster.
    It will ask for the list of nodes you want to set up; enter them by FQDN.
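    If the wizard's SSH-based bootstrap fails (for example, because of the firewall), the agents can also be registered manually on each node. A sketch, assuming the standard Ambari 1.x agent config path:

    ```shell
    # Manual agent setup on each node (alternative to the wizard's SSH bootstrap).
    yum install ambari-agent
    # Point the agent at the Ambari server host:
    sed -i 's/^hostname=.*/hostname=ambari.server.host/' \
        /etc/ambari-agent/conf/ambari-agent.ini
    ambari-agent start
    ```
    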


    Vidy G

    I am trying to set up a two-node cluster using the HDP 2.0 sandbox. I believe we need two different VMs or physical machines to set up a two-node cluster. Is that correct?

    I set up a sandbox VM and cloned it to create a second VM. I used Ambari in sandbox 1 to configure sandbox 2 as the second node in the cluster, but Ambari failed to register the second sandbox. The log file reported issues with the hostname. I tried to modify the hostname of the second VM with no luck. Has anyone tried this before? If so, what would be a simple way of setting up a two-node cluster with HDP?


    Robert Molina

    Hi Durga,
    Have you looked into using HDP’s Ambari product to set up a multi-node cluster? Here is documentation with steps on how to do so.


