hadoop-mapreduce-user mailing list archives

From Greg Roelofs <roel...@yahoo-inc.com>
Subject Re: how to avoid 2 jobs running on the same machine at the same time
Date Tue, 29 Mar 2011 21:59:19 GMT
> If a TaskTracker's maximum task limit is set to N, up to N tasks may
> run on it in parallel. N is set to two by default (since most machines
> today have 2+ cores). You can tweak this parameter down to one, and
> then you'll see at most one task running on the TaskTracker at a
> given time.

Depending on data locality, however, that could actually make things run
more slowly.  (In principle, anyway; I don't have even anecdotal evidence
to back that up. :-)  I'm sure one could construct a particular set of
hardware with a particular data layout for which it would be true, though.)
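
For reference, the limit being discussed is configured per TaskTracker in mapred-site.xml. A minimal sketch, assuming the classic (pre-YARN) property names for the map and reduce slot counts:

```xml
<!-- mapred-site.xml on each TaskTracker node (restart required).
     Sketch only: property names assume the classic MapReduce 1 config. -->
<configuration>
  <!-- At most one map task at a time on this TaskTracker (default: 2) -->
  <property>
    <name>mapred.tasktracker.map.tasks.maximum</name>
    <value>1</value>
  </property>
  <!-- At most one reduce task at a time on this TaskTracker (default: 2) -->
  <property>
    <name>mapred.tasktracker.reduce.tasks.maximum</name>
    <value>1</value>
  </property>
</configuration>
```

Note these are node-level settings read by the TaskTracker at startup, not per-job settings, so they must be changed on every node and the TaskTrackers restarted.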
