Subject: Re: why my Reduce Class does not work?
From: Ken Goodhope <kengoodhope@gmail.com>
To: common-user@hadoop.apache.org
Date: Sun, 4 Jul 2010 12:45:05 -0700

You need @Override on your reduce method. Right now your reduce does not
override Reducer's, so you are getting the identity reduce, which writes
each (key, value) pair straight through to the output.

On 7/4/10, Vitaliy Semochkin wrote:
> Hi,
>
> I rewrote the WordCount sample to use the new Hadoop API,
> but my reduce task never seems to run.
>
> The result file always looks like this:
>
> some_word 1
> some_word 1
> another_word 1
> another_word 1
> ...
>
> Here is the code:
>
> import java.io.IOException;
> import java.util.StringTokenizer;
>
> import org.apache.hadoop.fs.Path;
> import org.apache.hadoop.io.IntWritable;
> import org.apache.hadoop.io.LongWritable;
> import org.apache.hadoop.io.Text;
> import org.apache.hadoop.mapreduce.Job;
> import org.apache.hadoop.mapreduce.Mapper;
> import org.apache.hadoop.mapreduce.Reducer;
> import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
> import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
>
> public class WordCount {
>
>   public static class WordCountMapper
>       extends Mapper<LongWritable, Text, Text, IntWritable> {
>
>     @Override
>     protected void map(LongWritable key, Text value, Context context)
>         throws IOException, InterruptedException {
>       StringTokenizer st = new StringTokenizer(value.toString());
>       while (st.hasMoreTokens()) {
>         context.write(new Text(st.nextToken()), new IntWritable(1));
>       }
>     }
>   }
>
>   public static class WordCountReduce
>       extends Reducer<Text, IntWritable, Text, IntWritable> {
>
>     @SuppressWarnings("unchecked")
>     public void reduce(Text key, Iterable<IntWritable> values,
>         Reducer.Context context)
>         throws IOException, InterruptedException {
>       int sum = 0;
>       for (IntWritable value : values) {
>         sum += value.get();
>       }
>       context.write(key, new IntWritable(sum));
>     }
>   }
>
>   public static void main(String[] args) throws IOException,
>       InterruptedException, ClassNotFoundException {
>     Job job = new Job();
>     job.setJobName("WordCounter");
>     job.setJarByClass(WordCount.class);
>     job.setMapperClass(WordCountMapper.class);
>     job.setReducerClass(WordCountReduce.class);
>     job.setOutputKeyClass(Text.class);
>     job.setOutputValueClass(IntWritable.class);
>     FileInputFormat.setInputPaths(job, new Path(args[0]));
>     FileOutputFormat.setOutputPath(job, new Path(args[1]));
>     System.exit(job.waitForCompletion(true) ? 0 : 1);
>   }
> }
>
> It looks like WordCountReduce is never invoked, but I don't see any
> warnings or errors in the log file.
>
> Any help is highly appreciated.
>
> Thanks in advance,
> Vitaliy S
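The identity-reduce behavior Ken points to comes from the signature mismatch: the raw `Reducer.Context` parameter makes the quoted `reduce` a new overload rather than an override of `Reducer.reduce(KEYIN, Iterable<VALUEIN>, Context)`, so the framework still calls the inherited pass-through implementation. An `@Override` annotation turns that silent mistake into a compile error. The same trap can be reproduced without Hadoop; the sketch below uses a different parameter mismatch (`List` vs `Iterable`) for simplicity, and `Base`, `Child`, and `OverrideDemo` are illustrative names, not Hadoop classes:

```java
import java.util.List;

// A minimal stand-in for Hadoop's Reducer: the base class ships a
// default "identity" reduce that passes every value through unchanged.
class Base<K, V> {
    protected void reduce(K key, Iterable<V> values, StringBuilder out) {
        for (V v : values) {
            out.append(key).append('\t').append(v).append('\n');
        }
    }
}

// The parameter type List<Integer> does not match Iterable<Integer>,
// so this method is an overload of reduce, not an override.
// Annotating it with @Override would make the compiler reject it,
// which is exactly how this kind of mistake gets caught.
class Child extends Base<String, Integer> {
    public void reduce(String key, List<Integer> values, StringBuilder out) {
        int sum = 0;
        for (int v : values) {
            sum += v;
        }
        out.append(key).append('\t').append(sum).append('\n');
    }
}

public class OverrideDemo {
    public static void main(String[] args) {
        StringBuilder out = new StringBuilder();
        Base<String, Integer> b = new Child();
        // Resolves to Base.reduce: Child never overrode it, so the
        // identity version runs and each value is emitted separately,
        // mirroring the "some_word 1 / some_word 1" output in the post.
        b.reduce("word", List.of(1, 1), out);
        System.out.print(out); // word\t1 twice, not word\t2
    }
}
```

With `@Override` added and the signature corrected to match the base exactly (`Context` instead of raw `Reducer.Context` in the Hadoop case), the subclass method either compiles as a true override or fails loudly at compile time.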