HBase and region server


This topic contains 2 replies, has 2 voices, and was last updated by  Veerabahu 1 year, 4 months ago.

  • Creator
    Topic
  • #43654

    Veerabahu
    Participant

    I have a 2-node cluster with HDP 2 installed through Ambari on Linux RHEL 5.8.
    Server 1 runs the HBase master and a region server.
    I have DataNodes on both servers, but I did not install a region server on server 2.
    Everything is working fine. The replication factor is the default of 3.
    My question is: what is the downside of not having a region server on server 2?
    Can I add the component through Ambari, since I see that option? When doing so, do I need to bring down all the services first and then add it?
    Are there any specific configuration changes I need to make after adding the region server, or will Ambari take care of it?

    Please advise



  • Author
    Replies
  • #44161

    Veerabahu
    Participant

    Thanks Kenny, I added the region server to server 2 through Ambari and everything is good.
    Thanks again.
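
    As a quick sanity check after adding it, here is a minimal sketch that queries the new component's state through the Ambari REST API. The cluster name c1, host name server2, port 8080, and the admin/admin credentials are placeholder assumptions, not values from this thread.

        # Hypothetical check that the RegionServer added to server2 is running,
        # using the Ambari REST API. Cluster "c1", host "server2", port 8080 and
        # the admin/admin credentials are placeholders.
        import requests

        url = ("http://ambari-server:8080/api/v1/clusters/c1"
               "/hosts/server2/host_components/HBASE_REGIONSERVER"
               "?fields=HostRoles/state")
        resp = requests.get(url, auth=("admin", "admin"))
        resp.raise_for_status()

        # Expect "STARTED" once the component is installed and running.
        print(resp.json()["HostRoles"]["state"])

    The same state is also visible on the host's page in the Ambari web UI, so the script is only a convenience.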

    #44082

    Kenny Zhang
    Moderator

    Hi,

    The downside is that you are not fully utilizing the computing resources in your cluster for HBase, and all the regions are currently hosted on server 1.
    Yes, you can add the component through Ambari, and you don't need to bring down all the services. Ambari will populate the same region server settings on server 2 as on server 1.

    Thanks,
    Kenny
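
    For anyone who prefers scripting over the web UI, a minimal sketch of the same add-component flow through the Ambari REST API follows. The cluster name c1, host name server2, port 8080, and the admin/admin credentials are placeholder assumptions; the install and start calls are asynchronous, and Ambari tracks them as background requests.

        # Hypothetical sketch: add and start an HBase RegionServer on server2
        # through the Ambari REST API instead of the web UI. Cluster "c1",
        # host "server2", port 8080 and admin/admin are placeholders.
        import requests

        BASE = "http://ambari-server:8080/api/v1/clusters/c1"
        AUTH = ("admin", "admin")
        HEADERS = {"X-Requested-By": "ambari"}  # Ambari requires this header on POST/PUT
        COMPONENT = BASE + "/hosts/server2/host_components/HBASE_REGIONSERVER"

        # 1. Register the RegionServer component on server2.
        requests.post(COMPONENT, auth=AUTH, headers=HEADERS).raise_for_status()

        # 2. Install it; Ambari pushes the HBase configuration it already manages.
        requests.put(COMPONENT, auth=AUTH, headers=HEADERS,
                     json={"HostRoles": {"state": "INSTALLED"}}).raise_for_status()

        # 3. Start it; the other services keep running throughout.
        requests.put(COMPONENT, auth=AUTH, headers=HEADERS,
                     json={"HostRoles": {"state": "STARTED"}}).raise_for_status()

    Once the new region server is up, running the balancer from the HBase shell (the balancer command) will move some regions from server 1 onto server 2.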
