Task tree is running beyond memory-limits


This topic contains 0 replies, has 1 voice, and was last updated by Wang Cong 1 year, 4 months ago.

  • Creator
  • #49772

    Wang Cong

When I run my Hadoop application on HDP-1.3.2, the job is killed with a "task tree is running beyond memory-limits" error. How can I set the memory for my MapReduce application through the Ambari web UI? Does anyone have any ideas? Thank you.
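    For context: this error is raised by the MRv1 task memory manager when a task's process tree grows past its configured virtual-memory limit, and HDP 1.x ships MRv1. The usual fix is to raise the per-job task memory limits in mapred-site.xml, which in Ambari can be edited under the MapReduce service's configuration tab. A minimal sketch, assuming the stock Hadoop 1.x property names; the values are illustrative, not recommendations:

    ```xml
    <!-- mapred-site.xml fragment: per-task memory limits (illustrative values) -->
    <property>
      <name>mapred.job.map.memory.mb</name>
      <value>2048</value> <!-- virtual-memory cap for each map task's process tree -->
    </property>
    <property>
      <name>mapred.job.reduce.memory.mb</name>
      <value>2048</value> <!-- same cap for reduce tasks -->
    </property>
    <property>
      <name>mapred.child.java.opts</name>
      <value>-Xmx1536m</value> <!-- JVM heap must stay below the caps above -->
    </property>
    ```

    Note that these per-job values must not exceed the cluster-wide maximums (mapred.cluster.max.map.memory.mb and mapred.cluster.max.reduce.memory.mb), and a TaskTracker restart is needed after changing them via Ambari.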

