hadoop-mapreduce-user mailing list archives

From Harsh J <qwertyman...@gmail.com>
Subject Re: Mapper processing gzipped file
Date Wed, 19 Jan 2011 03:46:15 GMT
I don't think it would fail, whether the map task is data-local or not, for a Map-only job.
AFAIK, input streams are buffered and read incrementally (from a local
block, or over the network), so the gzipped input is never materialized
in full on the tasktracker's local disk.
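
The streaming point above can be illustrated with a minimal, self-contained sketch. This uses `java.util.zip.GZIPInputStream` as a stand-in for Hadoop's `GzipCodec` (an assumption for illustration; a real mapper would go through the codec and record reader), to show that a gzipped input can be consumed through a small fixed-size buffer without ever holding the whole decompressed file on local disk:

```java
import java.io.*;
import java.util.zip.*;

public class StreamingGzipDemo {
    public static void main(String[] args) throws IOException {
        // Build a gzipped payload in memory (stand-in for a .gz file on HDFS).
        String original = "line of input data\n".repeat(1000);
        ByteArrayOutputStream compressed = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(compressed)) {
            gz.write(original.getBytes("UTF-8"));
        }

        // Decompress through a small fixed-size buffer: only ~4 KB of the
        // stream is in flight at any moment, regardless of total file size.
        byte[] buf = new byte[4096];
        ByteArrayOutputStream decompressed = new ByteArrayOutputStream();
        try (GZIPInputStream in = new GZIPInputStream(
                new ByteArrayInputStream(compressed.toByteArray()))) {
            int n;
            while ((n = in.read(buf)) > 0) {
                // A mapper would parse and emit records here instead.
                decompressed.write(buf, 0, n);
            }
        }
        System.out.println(decompressed.toString("UTF-8").equals(original));
    }
}
```

The same principle holds whether the compressed bytes come from a local block replica or over the network from another datanode: the mapper pulls and decompresses them as a stream, so reserved non-DFS disk space is not the constraint the question worries about.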

On Wed, Jan 19, 2011 at 5:06 AM, rakesh kothari
<rkothari_iit@hotmail.com> wrote:
> Hi,
>
> There is a gzipped file that needs to be processed by a Map-only Hadoop job.
> If the size of this file is larger than the space reserved for non-DFS use on
> the tasktracker host processing it, and the map task is not data-local,
> would the job eventually fail? Is the Hadoop JobTracker smart enough to
> avoid scheduling the task on such nodes?
>
> Thanks,
> -Rakesh
>



-- 
Harsh J
www.harshj.com
