hadoop-common-user mailing list archives

From "Bejoy KS" <bejoy.had...@gmail.com>
Subject Re: Disable retries
Date Fri, 03 Aug 2012 00:02:58 GMT
Hi Marco

You can disable retries by setting
mapred.map.max.attempts and mapred.reduce.max.attempts  to 1.

Also, if you need to disable speculative execution, set
mapred.map.tasks.speculative.execution and mapred.reduce.tasks.speculative.execution to false.

With these two steps you can ensure that a task is attempted only once.

These properties can be set in mapred-site.xml or at the job level.
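For a cluster-wide default, a minimal mapred-site.xml sketch might look like the following (property names as above; the values shown are the ones discussed in this thread, not Hadoop's defaults):

```xml
<configuration>
  <!-- Attempt each map/reduce task only once; no retries on failure -->
  <property>
    <name>mapred.map.max.attempts</name>
    <value>1</value>
  </property>
  <property>
    <name>mapred.reduce.max.attempts</name>
    <value>1</value>
  </property>
  <!-- Disable speculative execution so no duplicate attempts are launched -->
  <property>
    <name>mapred.map.tasks.speculative.execution</name>
    <value>false</value>
  </property>
  <property>
    <name>mapred.reduce.tasks.speculative.execution</name>
    <value>false</value>
  </property>
</configuration>
```

At the job level the same properties can be set on the job's Configuration object before submission (e.g. conf.setInt("mapred.map.max.attempts", 1)), which overrides the site-wide values for that job only.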

Bejoy KS

Sent from handheld, please excuse typos.

-----Original Message-----
From: Marco Gallotta <marco@gallotta.co.za>
Date: Thu, 2 Aug 2012 16:52:00 
To: <common-user@hadoop.apache.org>
Reply-To: common-user@hadoop.apache.org
Subject: Disable retries

Hi there

Is there a way to disable retries when a mapper/reducer fails? I'm writing data in my mapper
and I'd rather catch the failure, recover from a backup (fairly lightweight in this case,
as the output tables aren't big) and restart.

Marco Gallotta | Mountain View, California
Software Engineer, Infrastructure | Loki Studios
fb.me/marco.gallotta | twitter.com/marcog
marco@gallotta.co.za | +1 (650) 417-3313

