Job fails with "TaskTree is running beyond memory-limits"
When I run my Hadoop application on HDP-1.3.2, the job fails with the error "TaskTree is running beyond memory-limits" and the task is killed. How can I set the memory limits for my MapReduce application through the Ambari web UI? Does anyone have any ideas? Thank you.
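For reference, on HDP 1.3 (MRv1) this error is raised when a task's process tree exceeds the per-task memory limit, so the usual fix is to raise the per-task memory settings. A minimal sketch of the relevant `mapred-site.xml` properties follows; the values shown are illustrative assumptions, not recommendations, and in Ambari the same properties can typically be edited under the MapReduce service's configuration tab:

```xml
<!-- mapred-site.xml (MRv1): illustrative values only -->
<!-- Per-task virtual memory limits, in MB; these must not exceed
     the cluster-wide maxima set by the administrator. -->
<property>
  <name>mapred.job.map.memory.mb</name>
  <value>2048</value>
</property>
<property>
  <name>mapred.job.reduce.memory.mb</name>
  <value>2048</value>
</property>
<!-- Child JVM heap should stay safely below the limits above. -->
<property>
  <name>mapred.child.java.opts</name>
  <value>-Xmx1536m</value>
</property>
```

Note that the JVM heap (`-Xmx`) must be smaller than `mapred.job.*.memory.mb`, since the limit applies to the whole process tree, not just the heap.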