
The legacy Hortonworks Forum is now closed. You can view a read-only version of the former site by clicking here. The site will be taken offline on January 31, 2016.

HDP on Windows – Installation Forum

MapRed Capacity Scheduler

  • #19841
    Ted Malone

In playing around with HDP on Windows, I noticed that mapred.capacity-scheduler.queue.default.capacity is set to 50, which effectively limits the job system to 50% of capacity. If I change this value on the TaskTracker host, the service throws an unhandled exception and dies.

    I notice that the default value for this setting on HDP for Linux is 100.

    Any thoughts on this?


  • #21041

    Hi Ted,
    If you leave the default value of 50 on the TaskTracker node, does the job run fine? What value did you change it to, and what error did you receive? It would also be good to know which OS you are using.


    Ted Malone

    This is on Windows Server 2012. The service threw an unhandled exception on starting no matter what value it was set to other than 50 (i.e., 50 works fine, but 51 does not).

    I found the root of the issue. It turns out that HDP on Windows has two queues configured by default (as opposed to the single default queue on HDP for Linux), and the queue capacities must add up to 100%. So, if you increase one queue's capacity, you must reduce the other.
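    For illustration, a sketch of what that adjustment might look like in capacity-scheduler.xml, assuming the two default queue names ("default" and "joblauncher") described in this thread; the specific values 75/25 are just an example:

    ```xml
    <!-- capacity-scheduler.xml: queue capacities must total 100% -->
    <property>
      <name>mapred.capacity-scheduler.queue.default.capacity</name>
      <value>75</value> <!-- raised from the shipped default of 50 -->
    </property>
    <property>
      <name>mapred.capacity-scheduler.queue.joblauncher.capacity</name>
      <value>25</value> <!-- reduced so the two queues still sum to 100 -->
    </property>
    ```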


    Hi Ted,

    That makes sense, about needing both queues to total 100%. I’m looking into why there would be two queues configured by default on Windows in the first place.


    Ted Malone

    If you look at the mapred-site.xml file you’ll find that there are two queues, one named “default” and the other named “joblauncher”. If you remove the joblauncher queue and then set the capacity of the default queue to 100 in the capacity-scheduler.xml file, everything works as advertised on a single queue.

    The queue list is controlled by the mapred.queue.names property, described as: “Comma separated list of queues configured for this jobtracker.”
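    A minimal single-queue setup along the lines described above might look like this (file names per stock Apache Hadoop 1.x; treat the exact layout as illustrative for HDP on Windows):

    ```xml
    <!-- mapred-site.xml: keep only the default queue -->
    <property>
      <name>mapred.queue.names</name>
      <value>default</value>
    </property>

    <!-- capacity-scheduler.xml: give the single remaining queue the full capacity -->
    <property>
      <name>mapred.capacity-scheduler.queue.default.capacity</name>
      <value>100</value>
    </property>
    ```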


    Hi Ted,

    Thanks for the info; I do see that there are two queues created there. I am checking into why it was done that way.


The forum ‘HDP on Windows – Installation’ is closed to new topics and replies.
