hadoop-common-user mailing list archives

From Harsh J <ha...@cloudera.com>
Subject Re: Abort a job when a counter reaches to a threshold
Date Fri, 24 May 2013 08:22:51 GMT
Yes, there is a job-level endpoint that runs upon success, via OutputCommitter:
http://hadoop.apache.org/docs/current/api/org/apache/hadoop/mapreduce/OutputCommitter.html#commitJob(org.apache.hadoop.mapreduce.JobContext)
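For example, a minimal sketch of that approach (this assumes a FileOutputFormat-based
job and that the aggregated job counters can be fetched through the Cluster client
API from inside commitJob; the counter group/name and the 10% threshold are just
placeholders, not anything from your job):

import java.io.IOException;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Cluster;
import org.apache.hadoop.mapreduce.Counters;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.JobContext;
import org.apache.hadoop.mapreduce.TaskAttemptContext;
import org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter;

// Placeholder committer: the counter group/name ("DOCS", "FAILED"/"TOTAL")
// and the 10% threshold are examples only.
public class ThresholdOutputCommitter extends FileOutputCommitter {

  public ThresholdOutputCommitter(Path outputPath, TaskAttemptContext context)
      throws IOException {
    super(outputPath, context);
  }

  @Override
  public void commitJob(JobContext context) throws IOException {
    try {
      // Look up the aggregated counters for this job once all maps are done.
      Cluster cluster = new Cluster(context.getConfiguration());
      Job job = cluster.getJob(context.getJobID());
      Counters counters = job.getCounters();

      long failed = counters.findCounter("DOCS", "FAILED").getValue();
      long total  = counters.findCounter("DOCS", "TOTAL").getValue();

      // Refuse to commit (i.e. fail the job) if too many documents failed.
      if (total > 0 && (double) failed / total > 0.10) {
        throw new IOException("Aborting commit: " + failed + " of " + total
            + " documents failed");
      }
    } catch (InterruptedException ie) {
      throw new IOException(ie);
    }
    super.commitJob(context);
  }
}

You would plug this in by returning it from getOutputCommitter(TaskAttemptContext)
in a small FileOutputFormat subclass and setting that format on the job.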

On Fri, May 24, 2013 at 1:13 PM, abhinav gupta <abhinav_gpt@yahoo.com> wrote:
> Hi,
>
> While running a map-reduce job that has only mappers, I have a counter that
> counts the number of failed documents. After all the mappers are done, I want
> the job to fail if the fraction of failed documents is above a fixed
> threshold. (I need to do this at the end because I don't know the total
> number of documents up front.) How can I achieve this without implementing a
> reducer just for this?
> I know that there is a task-level cleanup method, but is there any job-level
> cleanup method that can be used to perform this check after all the tasks are
> done?
>
> Thanks
> Abhinav



-- 
Harsh J
