hadoop-common-user mailing list archives

From "Dmitry Pushkarev" <u...@stanford.edu>
Subject RE: task assignment management.
Date Mon, 08 Sep 2008 07:45:18 GMT
How about just specifying which machines a task should run on? I haven't seen such an option.

-----Original Message-----
From: Devaraj Das [mailto:ddas@yahoo-inc.com] 
Sent: Sunday, September 07, 2008 9:55 PM
To: core-user@hadoop.apache.org
Subject: Re: task assignment management.

No, that is not possible today. However, you might want to look at the
TaskScheduler class to see whether you can implement a scheduler that provides
this kind of task placement.
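
As a very rough sketch of the shape of such a plugin (the class name, the
whitelist rule, and the example.allowed.hosts property are made up, and the
method signatures are from memory of the 0.18/0.19 trunk, so treat this as an
illustration rather than working code):

    package org.apache.hadoop.mapred;   // TaskScheduler and friends are package-private,
                                        // so a custom scheduler usually lives in this package

    import java.io.IOException;
    import java.util.Collection;
    import java.util.Collections;
    import java.util.List;

    // Hypothetical rule-based scheduler: hand out tasks only to whitelisted hosts.
    class RuleBasedScheduler extends TaskScheduler {

      @Override
      public List<Task> assignTasks(TaskTrackerStatus tracker) throws IOException {
        // "example.allowed.hosts" is a made-up property for this sketch.
        String allowed = getConf().get("example.allowed.hosts", "");
        if (!allowed.contains(tracker.getHost())) {
          return Collections.emptyList();   // rule: this host gets no tasks
        }
        // Real logic would pick tasks from the queued jobs here,
        // much like the default JobQueueTaskScheduler does.
        return Collections.emptyList();
      }

      @Override
      public Collection<JobInProgress> getJobs(String queueName) {
        return Collections.emptyList();     // placeholder
      }
    }

The jobtracker would then be pointed at the class through the
mapred.jobtracker.taskScheduler property (check the exact name in your
hadoop-default.xml).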

One point regarding computationally intensive tasks in the current Hadoop: if a
machine cannot keep up with the rest of the cluster (so the task on it runs
slower than the others), speculative execution, if enabled, can help a lot.
Also, faster machines implicitly get more work than slower ones.
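
If you want to control that per job, a minimal sketch with the old JobConf API
(the class name SpeculativeConfExample is just for the example; the two setters
correspond to mapred.map.tasks.speculative.execution and
mapred.reduce.tasks.speculative.execution):

    import org.apache.hadoop.mapred.JobConf;

    public class SpeculativeConfExample {
      // Returns a job configuration with speculative execution switched on
      // for both map and reduce tasks.
      public static JobConf configure(Class<?> jobClass) {
        JobConf job = new JobConf(jobClass);
        job.setMapSpeculativeExecution(true);
        job.setReduceSpeculativeExecution(true);
        return job;
      }
    }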

On 9/8/08 3:27 AM, "Dmitry Pushkarev" <umka@stanford.edu> wrote:

> Dear Hadoop users,
> Is it possible to manage task assignment without using Java, to implement some
> simple rules? For example: do not launch more than one instance of the crawling
> task per machine, do not run data-intensive tasks on remote machines, do not
> run computationally intensive tasks on single-core machines, etc.
> Right now this is done by failing tasks that happen to start on the wrong
> machine (a sketch of that workaround follows below), but I hope to find a
> solution on the jobtracker side.
> ---
> Dmitry
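
For reference, the fail-on-the-wrong-machine workaround mentioned above boils
down to something like the mapper below (old mapred API; the HostCheckingMapper
name and the "crawler-" hostname convention are made up for the example):

    import java.io.IOException;
    import java.net.InetAddress;

    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.MapReduceBase;
    import org.apache.hadoop.mapred.Mapper;
    import org.apache.hadoop.mapred.OutputCollector;
    import org.apache.hadoop.mapred.Reporter;

    // Hypothetical mapper that fails fast when launched on a host that is
    // not supposed to run this kind of (e.g. crawling) task, so the
    // framework retries the attempt elsewhere.
    public class HostCheckingMapper extends MapReduceBase
        implements Mapper<LongWritable, Text, Text, Text> {

      public void map(LongWritable key, Text value,
                      OutputCollector<Text, Text> output, Reporter reporter)
          throws IOException {
        String host = InetAddress.getLocalHost().getHostName();
        if (!host.startsWith("crawler-")) {
          throw new IOException("Wrong machine for this task: " + host);
        }
        output.collect(new Text(host), value);   // real work would go here
      }
    }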
