From: cdouglas@apache.org
To: mapreduce-commits@hadoop.apache.org
Reply-To: mapreduce-dev@hadoop.apache.org
Mailing-List: contact mapreduce-commits-help@hadoop.apache.org; run by ezmlm
Subject: svn commit: r925518 - in /hadoop/mapreduce/trunk: CHANGES.txt src/java/org/apache/hadoop/mapreduce/Mapper.java src/java/org/apache/hadoop/mapreduce/Reducer.java
Date: Sat, 20 Mar 2010 01:42:04 -0000
Message-Id: <20100320014204.CADD923889DD@eris.apache.org>
X-Mailer: svnmailer-1.0.8

Author: cdouglas
Date: Sat Mar 20 01:42:04 2010
New Revision: 925518

URL: http://svn.apache.org/viewvc?rev=925518&view=rev
Log:
MAPREDUCE-1407. Update javadoc in mapreduce.{Mapper,Reducer} to match actual
usage. Contributed by Benoit Sigoure

Modified:
    hadoop/mapreduce/trunk/CHANGES.txt
    hadoop/mapreduce/trunk/src/java/org/apache/hadoop/mapreduce/Mapper.java
    hadoop/mapreduce/trunk/src/java/org/apache/hadoop/mapreduce/Reducer.java

Modified: hadoop/mapreduce/trunk/CHANGES.txt
URL: http://svn.apache.org/viewvc/hadoop/mapreduce/trunk/CHANGES.txt?rev=925518&r1=925517&r2=925518&view=diff
==============================================================================
--- hadoop/mapreduce/trunk/CHANGES.txt (original)
+++ hadoop/mapreduce/trunk/CHANGES.txt Sat Mar 20 01:42:04 2010
@@ -1397,3 +1397,6 @@ Release 0.21.0 - Unreleased
 
     MAPREDUCE-1522. FileInputFormat may use the default FileSystem for the
     input path. (Tsz Wo (Nicholas), SZE via cdouglas)
+
+    MAPREDUCE-1407. Update javadoc in mapreduce.{Mapper,Reducer} to match
+    actual usage. (Benoit Sigoure via cdouglas)

Modified: hadoop/mapreduce/trunk/src/java/org/apache/hadoop/mapreduce/Mapper.java
URL: http://svn.apache.org/viewvc/hadoop/mapreduce/trunk/src/java/org/apache/hadoop/mapreduce/Mapper.java?rev=925518&r1=925517&r2=925518&view=diff
==============================================================================
--- hadoop/mapreduce/trunk/src/java/org/apache/hadoop/mapreduce/Mapper.java (original)
+++ hadoop/mapreduce/trunk/src/java/org/apache/hadoop/mapreduce/Mapper.java Sat Mar 20 01:42:04 2010
@@ -69,16 +69,16 @@ import org.apache.hadoop.mapreduce.task.
  * <p>Example:</p>
  * <p><blockquote><pre>
  * public class TokenCounterMapper 
- *     extends Mapper<Object, Text, Text, IntWritable>{
+ *     extends Mapper&lt;Object, Text, Text, IntWritable&gt;{
  *    
  *   private final static IntWritable one = new IntWritable(1);
  *   private Text word = new Text();
  *   
- *   public void map(Object key, Text value, Context context) throws IOException {
+ *   public void map(Object key, Text value, Context context) throws IOException, InterruptedException {
  *     StringTokenizer itr = new StringTokenizer(value.toString());
  *     while (itr.hasMoreTokens()) {
  *       word.set(itr.nextToken());
- *       context.collect(word, one);
+ *       context.write(word, one);
  *     }
  *   }
  * }
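
For readers following along, the corrected Mapper example compiles as a standalone class roughly as shown below. This is only an illustrative sketch against the new org.apache.hadoop.mapreduce API; the imports and the @Override annotation are added here for completeness and are not part of the commit itself.

    import java.io.IOException;
    import java.util.StringTokenizer;

    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    // Emits (token, 1) for every whitespace-separated token in the input
    // value, using Context.write() as the updated javadoc shows.
    public class TokenCounterMapper
        extends Mapper<Object, Text, Text, IntWritable> {

      private final static IntWritable one = new IntWritable(1);
      private Text word = new Text();

      @Override
      public void map(Object key, Text value, Context context)
          throws IOException, InterruptedException {
        StringTokenizer itr = new StringTokenizer(value.toString());
        while (itr.hasMoreTokens()) {
          word.set(itr.nextToken());
          context.write(word, one);  // formerly context.collect(word, one)
        }
      }
    }
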
@@ -141,4 +141,4 @@ public class Mapper

Modified: hadoop/mapreduce/trunk/src/java/org/apache/hadoop/mapreduce/Reducer.java
URL: http://svn.apache.org/viewvc/hadoop/mapreduce/trunk/src/java/org/apache/hadoop/mapreduce/Reducer.java?rev=925518&r1=925517&r2=925518&view=diff
==============================================================================
--- hadoop/mapreduce/trunk/src/java/org/apache/hadoop/mapreduce/Reducer.java (original)
+++ hadoop/mapreduce/trunk/src/java/org/apache/hadoop/mapreduce/Reducer.java Sat Mar 20 01:42:04 2010
  *   <p>In this phase the 
  *   {@link #reduce(Object, Iterable, Context)}
- *   method is called for each <key, (collection of values)> in
+ *   method is called for each &lt;key, (collection of values)&gt; in
  *   the sorted inputs.</p>
  *   
  *   <p>The output of the reduce task is typically written to a 
  *   {@link RecordWriter} via 

@@ -96,18 +96,18 @@ import org.apache.hadoop.mapred.RawKeyVa
  * 
  * <p>Example:</p>
  * <p><blockquote><pre>
- * public class IntSumReducer<Key> extends Reducer<Key,IntWritable,
- *                                                 Key,IntWritable> {
+ * public class IntSumReducer&lt;Key&gt; extends Reducer&lt;Key,IntWritable,
+ *                                                 Key,IntWritable&gt; {
  *   private IntWritable result = new IntWritable();
  * 
- *   public void reduce(Key key, Iterable<IntWritable> values, 
- *                      Context context) throws IOException {
+ *   public void reduce(Key key, Iterable&lt;IntWritable&gt; values,
+ *                      Context context) throws IOException, InterruptedException {
  *     int sum = 0;
  *     for (IntWritable val : values) {
  *       sum += val.get();
  *     }
  *     result.set(sum);
- *     context.collect(key, result);
+ *     context.write(key, result);
  *   }
  * }
  * </pre></blockquote></p>
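
Likewise, the corrected Reducer example can be checked as a standalone class along the following lines; again a sketch, with the imports and @Override annotation added here rather than taken from the commit.

    import java.io.IOException;

    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.mapreduce.Reducer;

    // Sums the IntWritable values seen for each key and emits a single
    // (key, sum) pair through Context.write(), as the updated javadoc shows.
    public class IntSumReducer<Key> extends Reducer<Key, IntWritable,
                                                    Key, IntWritable> {
      private IntWritable result = new IntWritable();

      @Override
      public void reduce(Key key, Iterable<IntWritable> values, Context context)
          throws IOException, InterruptedException {
        int sum = 0;
        for (IntWritable val : values) {
          sum += val.get();
        }
        result.set(sum);
        context.write(key, result);  // formerly context.collect(key, result)
      }
    }

Wired into a job, the two examples pair up in the usual word-count fashion; the driver class name below is a placeholder and not something this commit adds.

    Job job = new Job(new Configuration(), "word count");  // Job.getInstance(...) in later releases
    job.setJarByClass(WordCountDriver.class);               // hypothetical driver class
    job.setMapperClass(TokenCounterMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
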