hadoop-common-user mailing list archives

From Renaud Delbru <renaud.del...@deri.org>
Subject Re: Best way to limit the number of concurrent tasks per job on hadoop 0.20.2
Date Thu, 27 Jan 2011 10:51:29 GMT
Hi Koji,

thanks for sharing the information,
Is the 0.20-security branch planned to become an official release at some point?

Cheers
-- 
Renaud Delbru

On 27/01/11 01:50, Koji Noguchi wrote:
> Hi Renaud,
>
> Hopefully it’ll be in 0.20-security branch that Arun is trying to push.
>
> Related (very abstract) Jira.
> https://issues.apache.org/jira/browse/MAPREDUCE-1872
>
> Koji
>
>
>
> On 1/25/11 12:48 PM, "Renaud Delbru" <renaud.delbru@deri.org> wrote:
>
>     As it seems that the capacity and fair schedulers in hadoop 0.20.2 do
>     not allow a hard upper limit on the number of concurrent tasks, does
>     anybody know of another solution to achieve this?
>     --
>     Renaud Delbru
>
>     On 25/01/11 11:49, Renaud Delbru wrote:
>     > Hi,
>     >
>     > we would like to limit the number of maximum tasks per job on our
>     > hadoop 0.20.2 cluster.
>     > Will the Capacity Scheduler [1] allow us to do this? Is it working
>     > correctly on hadoop 0.20.2? (I remember that a few months ago, when
>     > we looked at it, it seemed incompatible with hadoop 0.20.2.)
>     >
>     > [1]
>     http://hadoop.apache.org/common/docs/r0.20.2/capacity_scheduler.html
>     >
>     > Regards,
>
>
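
[Archive note] For anyone reading this thread later: subsequent versions of the
Fair Scheduler added per-pool caps that provide exactly the hard limit discussed
above. The sketch below shows what such an allocation file can look like; the
pool name and values are illustrative, and the maxMaps/maxReduces properties are
an assumption about a later Fair Scheduler release, not the one shipped with
hadoop 0.20.2 (which, per the thread, lacks a hard upper limit).

```xml
<?xml version="1.0"?>
<!-- Illustrative Fair Scheduler allocation file (fair-scheduler.xml),
     pointed to by mapred.fairscheduler.allocation.file in mapred-site.xml.
     Pool name "limited" and all values are hypothetical examples. -->
<allocations>
  <pool name="limited">
    <!-- Hard caps on concurrently running tasks for jobs in this pool;
         available only in later Fair Scheduler versions. -->
    <maxMaps>10</maxMaps>
    <maxReduces>5</maxReduces>
    <!-- Limit how many jobs from this pool may run at once. -->
    <maxRunningJobs>2</maxRunningJobs>
  </pool>
</allocations>
```

A job would then be submitted to the capped pool, e.g. with
-Dmapred.fairscheduler.pool=limited (property name assumed from the Fair
Scheduler's configuration conventions).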

