hadoop-mapreduce-user mailing list archives

From Susheel Kumar Gadalay <skgada...@gmail.com>
Subject Re: ignoring map task failure
Date Mon, 18 Aug 2014 17:57:34 GMT
Check the parameter yarn.app.mapreduce.client.max-retries.
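If that parameter fits your case, it is normally set in mapred-site.xml or on the job's configuration. A minimal sketch, not a tested recommendation; the value 3 is an assumption:

```
<!-- mapred-site.xml: illustrative sketch; the value is an assumption -->
<property>
  <name>yarn.app.mapreduce.client.max-retries</name>
  <value>3</value>
</property>
```

The same property can also be passed per job on the command line, e.g. -Dyarn.app.mapreduce.client.max-retries=3, if the job driver uses ToolRunner/GenericOptionsParser.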

On 8/18/14, parnab kumar <parnab.2007@gmail.com> wrote:
> Hi All,
>
>        I am running a job with between 1300 and 1400 map tasks. Some
> map tasks fail due to errors, and when 4 such maps fail the job naturally
> gets killed. How can I ignore the failed tasks and continue executing the
> other map tasks? I am okay with losing some data for the failed tasks.
>
> Thanks,
> Parnab
>
