Java heap size and GC limit exceeded error when running Hive query


This topic contains 0 replies, has 1 voice, and was last updated by Veerabahu 1 year, 4 months ago.


    Veerabahu
    Participant

    I have HDP 2 and I am getting a Java heap size error when I run queries joining two or more tables, each with about 1 to 4 million records.
    When I run a query without any joins on a single table it works fine, and joins on smaller tables also work fine.
    The Java heap size is currently the default that Ambari chose during the install.
    I have a 2-node cluster with 24 GB of RAM on each node.
    My replication factor is 3, which is the default.
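
    For illustration, a simplified version of the kind of join that fails (the table and column names here are invented, not my real schema):

        SELECT a.id, a.name, SUM(b.amount)
        FROM orders a
        JOIN payments b ON a.id = b.order_id
        GROUP BY a.id, a.name;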

    If I need to increase the heap size, which one should I increase, and can I do it through Ambari?
    Do I need to change the replication factor?
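
    For example, I assume settings along these lines could be raised, either per session or through Ambari's Hive and MapReduce2 configs (the values below are guesses I have not tested):

        -- Per-session overrides from the Hive CLI (hypothetical values)
        SET mapreduce.map.memory.mb=4096;
        SET mapreduce.map.java.opts=-Xmx3276m;
        SET mapreduce.reduce.memory.mb=4096;
        SET mapreduce.reduce.java.opts=-Xmx3276m;
        -- If the failure happens during a local map-side join, disabling
        -- automatic map-join conversion may also be worth trying
        SET hive.auto.convert.join=false;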

    Please advise.
    Thanks

