hadoop-mapreduce-user mailing list archives

From Vinod Kumar Vavilapalli <vino...@hortonworks.com>
Subject Re: Counters across all jobs
Date Mon, 10 Sep 2012 17:46:13 GMT

Counters are per-job in Hadoop MapReduce. You need an external aggregator for such cross-job
counters, e.g., a znode in ZooKeeper.
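
Something along these lines could work as the shared counter (an untested sketch; the znode
path and the way you construct the ZooKeeper client are placeholders, not a tested recipe):

    import org.apache.zookeeper.*;
    import org.apache.zookeeper.data.Stat;
    import java.nio.charset.StandardCharsets;

    public class ZkCounter {
      private final ZooKeeper zk;
      private final String path; // e.g. "/counters/ACount" -- hypothetical path

      public ZkCounter(ZooKeeper zk, String path) throws Exception {
        this.zk = zk;
        this.path = path;
        if (zk.exists(path, false) == null) {
          try {
            zk.create(path, "0".getBytes(StandardCharsets.UTF_8),
                ZooDefs.Ids.OPEN_ACL_UNSAFE, CreateMode.PERSISTENT);
          } catch (KeeperException.NodeExistsException ignored) {
            // another job created it concurrently
          }
        }
      }

      // Compare-and-swap loop: read the current value and version, write back the sum.
      public long addAndGet(long delta) throws Exception {
        while (true) {
          Stat stat = new Stat();
          byte[] data = zk.getData(path, false, stat);
          long next = Long.parseLong(new String(data, StandardCharsets.UTF_8)) + delta;
          try {
            zk.setData(path, Long.toString(next).getBytes(StandardCharsets.UTF_8),
                stat.getVersion());
            return next;
          } catch (KeeperException.BadVersionException e) {
            // another job updated the znode in between; retry
          }
        }
      }
    }

Each job (or each task, if you are careful about retries and speculative execution) would call
addAndGet() with its increment instead of, or in addition to, its own MapReduce counter.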

Also, is this just for display, or does your job logic depend on it? If it is the former, and
if you don't mind waiting until the jobs finish, you can post-process the counters of all the
jobs and calculate the aggregates.
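
For example, if your driver holds on to the Job objects, something like this would produce the
running totals you describe (a rough sketch; the counter enum and the job handles are
assumptions about how your controller is set up):

    import org.apache.hadoop.mapreduce.Job;

    // Hypothetical user-defined counter enum shared by all the jobs.
    enum MyCounters { ACOUNT }

    public class CounterAggregation {
      // Sum the counter over the completed jobs and print the running totals.
      public static void printRunningTotals(Job... jobs) throws Exception {
        long runningTotal = 0;
        for (Job job : jobs) {
          // getCounters() needs the job to be finished.
          runningTotal += job.getCounters().findCounter(MyCounters.ACOUNT).getValue();
          System.out.println(job.getJobName() + ": " + runningTotal);
        }
      }
    }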

Thanks,
+Vinod Kumar Vavilapalli
Hortonworks Inc.
http://hortonworks.com/

On Aug 28, 2012, at 1:20 AM, Kasi Subrahmanyam wrote:

> Hi,
> 
> I have around 4 jobs running in a controller.
> How can I have a single unique counter that is present in all the jobs and incremented
> wherever it is used in a job?
> 
> For example, consider a counter ACount.
> If job1 increments the counter by 2, job3 by 5, and job4 by 6,
> can I have the counter's output displayed in the JobTracker as
> job1:2
> job2:2
> job3:7
> job4:13
> 
> Thanks,
> Subbu
> 

