Hive / HCatalog Forum
Java heap size and GC limit exceed error when running Hive query
I have HDP2 and I get a Java heap size / GC overhead limit exceeded error when I run queries joining two or more tables, each with about 1 to 4 million records.
Queries without joins on a single table work fine, and joins on smaller tables also work fine.
The Java heap sizes are currently the defaults that Ambari chose during the install.
I have a 2-node cluster with 24 GB of RAM on each node.
My replication factor is 3, which is the default.
If I need to increase the heap size, which one should I increase, and can I do it through Ambari?
Do I need to change the replication factor?
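For context, per-task heap settings on Hadoop 2 / YARN can typically be overridden for a single Hive session rather than cluster-wide. The property names below are standard MapReduce/Hive settings, but the values are illustrative assumptions, not tuned recommendations for this cluster:

```
-- Illustrative per-session overrides (values are assumptions; adjust for your HDP release)
SET mapreduce.map.memory.mb=2048;        -- YARN container size for map tasks
SET mapreduce.map.java.opts=-Xmx1638m;   -- JVM heap inside that container (~80% of container)
SET mapreduce.reduce.memory.mb=4096;     -- reducers often need more memory for joins
SET mapreduce.reduce.java.opts=-Xmx3276m;
SET hive.auto.convert.join=false;        -- disable local map-joins if the client JVM runs out of heap
```

The Java heap (`-Xmx`) is usually kept somewhat below the YARN container size to leave room for non-heap JVM overhead; the same properties can also be set cluster-wide through Ambari's MapReduce2 and Hive configuration screens.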