hadoop-mapreduce-user mailing list archives

From Devaraj k <devara...@huawei.com>
Subject RE: Reducer not firing
Date Tue, 17 Apr 2012 09:30:59 GMT
Can you check the task attempt logs in your cluster and find out what is happening in the reduce
phase? By default, the task attempt logs are present in $HADOOP_LOG_DIR/userlogs/<job-id>/. There
could be a bug in your reducer that is leading to this output.
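A quick way to follow this advice on a TaskTracker node might look like the sketch below. The default log location and the job/attempt ids are assumptions and vary by installation; substitute your own values:

```shell
# List the per-job task-attempt log directories (path is the 0.20.x default;
# $HADOOP_LOG_DIR may point elsewhere on your cluster).
ls "${HADOOP_LOG_DIR:-/var/log/hadoop}/userlogs/"

# Then inspect a reduce attempt's logs, e.g. its syslog and stderr:
# cat $HADOOP_LOG_DIR/userlogs/<job-id>/<attempt-id>/syslog
# cat $HADOOP_LOG_DIR/userlogs/<job-id>/<attempt-id>/stderr
```

If the reduce attempts never appear under userlogs at all, that suggests the reduce phase was never scheduled rather than failing mid-run.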

Thanks
Devaraj

________________________________________
From: Arko Provo Mukherjee [arkoprovomukherjee@gmail.com]
Sent: Tuesday, April 17, 2012 2:07 PM
To: mapreduce-user@hadoop.apache.org
Subject: Re: Reducer not firing

Hello,

Many thanks for the reply.

The 'no_of_reduce_tasks' is set to 2. I have a print statement before
the code I pasted below to check that.

Also, I can find two output files, part-r-00000 and part-r-00001, but
they contain the values that have been output by the Mapper logic.

Please let me know what I can check further.

Thanks a lot in advance!

Warm regards
Arko

On Tue, Apr 17, 2012 at 12:48 AM, Devaraj k <devaraj.k@huawei.com> wrote:
> Hi Arko,
>
>    What is the value of 'no_of_reduce_tasks'?
>
> If the number of reduce tasks is 0, then the map tasks will write their output directly into the
> job output path.
>
> Thanks
> Devaraj
>
> ________________________________________
> From: Arko Provo Mukherjee [arkoprovomukherjee@gmail.com]
> Sent: Tuesday, April 17, 2012 10:32 AM
> To: mapreduce-user@hadoop.apache.org
> Subject: Reducer not firing
>
> Dear All,
>
> I am porting code from the old API to the new API (Context objects)
> and running it on Hadoop 0.20.203.
>
> Job job_first = new Job();
>
> job_first.setJarByClass(My.class);
> job_first.setNumReduceTasks(no_of_reduce_tasks);
> job_first.setJobName("My_Job");
>
> FileInputFormat.addInputPath( job_first, new Path (Input_Path) );
> FileOutputFormat.setOutputPath( job_first, new Path (Output_Path) );
>
> job_first.setMapperClass(Map_First.class);
> job_first.setReducerClass(Reduce_First.class);
>
> job_first.setMapOutputKeyClass(IntWritable.class);
> job_first.setMapOutputValueClass(Text.class);
>
> job_first.setOutputKeyClass(NullWritable.class);
> job_first.setOutputValueClass(Text.class);
>
> job_first.waitForCompletion(true);
>
> The problem I am facing is that instead of emitting values to the
> reducers, the mappers are writing their output directly into the
> OutputPath, and the reducers are not processing anything.
>
> As I read from the available online materials, both my Map and
> Reduce methods use the context.write method to emit the values.
>
> Please help. Thanks a lot in advance!!
>
> Warm regards
> Arko
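[Editorial note appended to this archive: one common cause of this exact symptom, not confirmed as the cause in this thread, is a reduce() method whose signature does not match the new API. In the new API, Reducer's default reduce() is an identity pass-through; if your method takes, say, an Iterator instead of an Iterable, Java treats it as an overload rather than an override, the default runs, and the map output appears unchanged in part-r-* files. The sketch below uses a hypothetical stand-in class, not the real org.apache.hadoop.mapreduce.Reducer, purely to demonstrate the overload-vs-override pitfall:]

```java
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;

// Hypothetical stand-in mimicking the new-API Reducer's behavior:
// the default reduce() is an identity pass-through, as in Hadoop 0.20.x.
class Reducer<K, V> {
    StringBuilder out = new StringBuilder();

    protected void reduce(K key, Iterable<V> values) {
        // Identity default: write every (key, value) pair through unchanged.
        for (V v : values) {
            out.append(key).append("\t").append(v).append("\n");
        }
    }

    void run(K key, List<V> values) {
        // The framework calls reduce(K, Iterable<V>); a subclass method with a
        // different parameter type is never dispatched to.
        reduce(key, values);
    }
}

class BrokenReduce extends Reducer<Integer, String> {
    // BUG being demonstrated: Iterator instead of Iterable. This compiles as
    // an OVERLOAD, not an override, so run() still invokes the identity
    // default above and this method is never called.
    protected void reduce(Integer key, Iterator<String> values) {
        out.append("aggregated\n");
    }
}

public class Main {
    public static void main(String[] args) {
        BrokenReduce r = new BrokenReduce();
        r.run(1, Arrays.asList("a", "b"));
        System.out.print(r.out);  // identity output, not "aggregated"
    }
}
```

Annotating the real reduce method with @Override makes this class of mistake a compile-time error instead of silent pass-through behavior.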
