hadoop-user mailing list archives

From Aseem Anand <aseem.ii...@gmail.com>
Subject Re: Task does not enter reduce function after secondary sort
Date Mon, 05 Nov 2012 10:27:19 GMT
Hey,
That change, along with a few other minor fixes, got it working. It's strange,
though, that MapReduce programs I wrote against the same API had been working
until now without @Override. Thanks :).
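
For anyone who finds this thread later: roughly what the reducer looks like
after the change. This is a sketch reconstructed from the snippets below, not
the exact code; the NullWritable value type is assumed to match
job.setMapOutputValueClass(NullWritable.class) in the driver.

import java.io.IOException;

import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

public class SkyzKnnReducer extends Reducer<Text, NullWritable, Text, Text> {

    @Override
    public void reduce(Text key, Iterable<NullWritable> values, Context context)
            throws IOException, InterruptedException {
        // With the grouping comparator, every composite key "A:rest" that shares
        // the natural key A arrives in a single reduce call, in sorted order.
        String t = "HELLO" + key.toString();
        context.write(new Text(t), new Text(t));
    }
}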

Thanks,
Aseem
On Mon, Nov 5, 2012 at 3:33 AM, Harsh J <harsh@cloudera.com> wrote:

> Yep - it will show an error since your reduce(…) signature is wrong
> for the new API:
>
> http://hadoop.apache.org/docs/current/api/org/apache/hadoop/mapreduce/Reducer.html#reduce(KEYIN,%20java.lang.Iterable,%20org.apache.hadoop.mapreduce.Reducer.Context)
>
> Chuck the Reporter object (it's an old API thing, now built into
> Context itself) and transform it into:
>
> @Override
> public void reduce(Text key, Iterable<NullWritable> values, Context output) {
>     …
> }
>
> … and your IDE shouldn't complain anymore.
>
> On Mon, Nov 5, 2012 at 2:45 AM, Aseem Anand <aseem.iiith@gmail.com> wrote:
> > Hey,
> > Here are code snippets.
> >
> > In the driver class :
> >         job.setMapperClass(SkyzKnnMapperT.class);
> >         job.setReducerClass(SkyzKnnReducer.class);
> >         job.setGroupingComparatorClass(GroupComparator.class);
> >         job.setPartitionerClass(MyPartitioner.class);
> >         job.setSortComparatorClass(KeyComparator.class);
> >         job.setMapOutputKeyClass(Text.class);
> >         job.setMapOutputValueClass(NullWritable.class);
> >         job.setOutputKeyClass(Text.class);
> >         job.setOutputValueClass(Text.class);
> >
> > public class GroupComparator extends WritableComparator {
> >
> >     protected GroupComparator() {
> >         super(Text.class, true);
> >     }
> >
> >     @Override
> >     public int compare(WritableComparable w1, WritableComparable w2) {
> >         // consider only zone and day part of the key
> >         Text t1 = (Text) w1;
> >         Text t2 = (Text) w2;
> >         String[] t1Items = t1.toString().split(":");
> >         String[] t2Items = t2.toString().split(":");
> >         int comp = t1Items[0].compareTo(t2Items[0]);
> >         System.out.println("GROUP" + comp);
> >         return comp;
> >     }
> > }
> > public class SkyzKnnReducer extends Reducer<Text,Iterable,Text,Text> {
> >     public void reduce(Text key, Iterable<NullWritable> values,
> >                        Context output, Reporter reporter)
> >             throws IOException, InterruptedException {
> >         String t = key.toString();
> >         t = "HELLO" + t;
> >         output.write(new Text(t), new Text(t));
> >     }
> > }
> >
> > The composite key is of the form A:Rest_of_text, where A is the natural key.
> >
> > Adding the @Override annotation to this reduce method shows an error in
> > Eclipse. What else could be going wrong?
> >
> > Thanks,
> > Aseem
> > On Mon, Nov 5, 2012 at 2:33 AM, Harsh J <harsh@cloudera.com> wrote:
> >>
> >> Sounds like an override issue to me. If you can share your code, we
> >> can take a quick look - otherwise, try annotating your reduce(…)
> >> method with @Override and recompiling to see if it really is the right
> >> signature Java expects.
> >>
> >> On Mon, Nov 5, 2012 at 1:48 AM, Aseem Anand <aseem.iiith@gmail.com> wrote:
> >> > Hi,
> >> > I am using a secondary sort for my Hadoop program. My map function
> >> > emits (Text, NullWritable), where Text contains the composite key; the
> >> > appropriate comparison functions and a custom Partitioner are in place.
> >> > These seem to be working fine.
> >> >
> >> > I have been struggling with the problem that these values are not being
> >> > received by the reduce function; instead they automatically get written
> >> > to HDFS in x files, where x is the number of reducers. I have made sure
> >> > the reducer class is set to my reducer and not the identity reducer.
> >> >
> >> > Can someone please explain this behavior and what could possibly be
> >> > wrong?
> >> >
> >> > Thanks & Regards,
> >> > Aseem
> >>
> >>
> >>
> >> --
> >> Harsh J
> >
> >
>
>
>
> --
> Harsh J
>
