hadoop-common-user mailing list archives

From syed kather <in.ab...@gmail.com>
Subject Re: Disable retries
Date Fri, 03 Aug 2012 00:23:41 GMT
Hi Marco Gallotta,
   Yes, there are some properties available, but I haven't tried them myself. Let me
know if they don't work.
   If you are using YARN, the property that caps ApplicationMaster retries is
"yarn.resourcemanager.am.max-retries", and on the job side there is a per-node
limit on task failures, "mapreduce.job.maxtaskfailures.per.tracker".

If it's Hadoop 1.0, use "mapred.map.max.attempts" and
"mapred.reduce.max.attempts"; each counts the total attempts per task, so
setting them to 1 disables retries.

Syed Abdul Kather
Sent from Samsung S3
On Aug 3, 2012 5:22 AM, "Marco Gallotta" <marco@gallotta.co.za> wrote:

> Hi there
>
> Is there a way to disable retries when a mapper/reducer fails? I'm writing
> data in my mapper and I'd rather catch the failure, recover from a backup
> (fairly lightweight in this case, as the output tables aren't big) and
> restart.
>
>
>
> --
> Marco Gallotta | Mountain View, California
> Software Engineer, Infrastructure | Loki Studios
> fb.me/marco.gallotta | twitter.com/marcog
> marco@gallotta.co.za | +1 (650) 417-3313
>
> Sent with Sparrow (http://www.sparrowmailapp.com/?sig)
>
>
