hadoop-hdfs-user mailing list archives

From Sultan Alamro <sultan.ala...@gmail.com>
Subject Re: Running multiple copies of each task
Date Thu, 17 Dec 2015 19:09:57 GMT
Thanks Namikaze!

Another question:

New tasks in Hadoop always have higher priority than speculative tasks.
Does anyone know how and where I can change this priority?


Thanks,
Sultan


On Thu, Dec 3, 2015 at 7:46 AM, Namikaze Minato <lloydsensei@gmail.com>
wrote:

> I think you are looking for mapreduce.reduce.speculative
> Be careful: for some reason, this fell into my spam folder.
>
> Regards,
> LLoyd
>
> On 3 December 2015 at 01:05, Sultan Alamro <sultan.alamro@gmail.com>
> wrote:
> > Hi there,
> >
> > I have been looking at the Hadoop 2.6.0 source code, trying to understand
> the
> > low-level details and how the framework actually works.
> >
> > I have a simple idea, and I am trying to figure out where and how it
> > can be implemented. The idea can be described in one sentence: "Run
> > multiple copies of each task." However, implementing it is not as
> > simple as I thought.
> >
> > As far as I am aware, I only need to modify a few classes. But which
> > classes?
> >
> > I just need someone to point me in the right direction.
> >
> >
> > Best,
> > Sultan
>
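For reference, the property LLoyd mentions (and its map-side counterpart) is set in mapred-site.xml. The snippet below is an illustrative sketch; in Hadoop 2.6 both properties default to true, so you would only set them explicitly to change that:

```xml
<!-- mapred-site.xml: speculative execution toggles (illustrative values;
     both default to true in Hadoop 2.6) -->
<configuration>
  <property>
    <name>mapreduce.map.speculative</name>
    <value>true</value>
  </property>
  <property>
    <name>mapreduce.reduce.speculative</name>
    <value>true</value>
  </property>
</configuration>
```

These only enable or disable launching speculative attempts; they do not change the scheduling priority of new versus speculative tasks, which is decided inside the MapReduce ApplicationMaster.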
