hadoop-common-user mailing list archives

From Alex Kozlov <ale...@cloudera.com>
Subject Re: Setting num reduce tasks
Date Thu, 21 Oct 2010 21:32:37 GMT
It looks like you are not passing the Configuration object correctly.  Are you
using the old or the new (mapreduce) API?  Do you have something like

Job job = new Job(conf, "My job with " + conf.get("mapred.reduce.tasks") + " reducers");

to create the job?  Would it be OK to share your job creation code?

Alex K
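[One common cause of this symptom (an assumption here, since the job-creation code was never posted in this thread) is mutating the Configuration after the Job has been constructed: the new-API Job constructor takes a copy of the Configuration it is given, so a later conf.setInt(...) never reaches the job. A minimal plain-Java sketch of that copy semantics, using java.util.Properties as a stand-in for Hadoop's Configuration and a FakeJob class standing in for org.apache.hadoop.mapreduce.Job:]

```java
import java.util.Properties;

public class ConfCopySketch {
    // Stand-in for the new-API Job: like the real constructor, it takes
    // a snapshot of the configuration it is handed, not a live reference.
    static class FakeJob {
        private final Properties conf;
        FakeJob(Properties original) {
            this.conf = (Properties) original.clone(); // copy at construction
        }
        int getNumReduceTasks() {
            // Hadoop defaults mapred.reduce.tasks to 1 when unset
            return Integer.parseInt(conf.getProperty("mapred.reduce.tasks", "1"));
        }
    }

    public static void main(String[] args) {
        // Case 1: set the reducer count BEFORE constructing the job
        Properties conf = new Properties();
        conf.setProperty("mapred.reduce.tasks", "20");
        FakeJob good = new FakeJob(conf);

        // Case 2: set it AFTER construction -- the job already holds its
        // own copy, so this write is invisible to it
        Properties conf2 = new Properties();
        FakeJob bad = new FakeJob(conf2);
        conf2.setProperty("mapred.reduce.tasks", "20");

        System.out.println(good.getNumReduceTasks()); // 20
        System.out.println(bad.getNumReduceTasks());  // 1
    }
}
```

[With the real new API, calling job.setNumReduceTasks(20) on the Job after construction also works, because it writes into the job's own copy of the configuration rather than the original object.]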

On Thu, Oct 21, 2010 at 2:25 PM, Matt Tanquary <matt.tanquary@gmail.com> wrote:

> Hi Alex,
>
> Yes, I confirmed from those locations that the job is setting the reducers
> to 1.
>
> Thanks
>
> On Thu, Oct 21, 2010 at 1:45 PM, Alex Kozlov <alexvk@cloudera.com> wrote:
>
> > Hi Matt, it might be that the parameter does not end up in the final
> > configuration for a number of reasons.  Can you check the job config xml
> > in jt:/var/log/hadoop/history or in the JT UI and see what the
> > mapred.reduce.tasks setting is?  -- Alex K
> >
> > > On Thu, Oct 21, 2010 at 1:39 PM, Matt Tanquary <matt.tanquary@gmail.com> wrote:
> >
> > > I am using the following to set my number of reduce tasks, however when
> > > I run my job it's always using just 1 reducer.
> > >
> > > conf.setInt("mapred.reduce.tasks", 20);
> > >
> > > 1 reducer will never finish this job. Please help me to understand why
> > > the setting I choose is not used.
> > >
> > > Thanks,
> > > -M@
> > >
> >
>
>
>
> --
> Have you thanked a teacher today? ---> http://www.liftateacher.org
>
