hadoop-mapreduce-user mailing list archives

From "E. Sammer" <e...@lifeless.net>
Subject Re: Setting #Reducers at runtime
Date Thu, 18 Feb 2010 16:08:43 GMT
On 2/18/10 10:39 AM, Rob Stewart wrote:
> Hi there,
> I am using Hadoop 0.20.1 and I am trying to submit jobs to the cluster
> as jars.
> Here's what I'm trying to do:
>  > hadoop jar $hadoopPath/hadoop-*-examples.jar join
> -Dmapred.reduce.tasks=10 -inFormat
> org.apache.hadoop.mapred.KeyValueTextInputFormat  -outKey
> org.apache.hadoop.io.Text file1.dat file2.dat output.dat
> However, my parameter setting of attempting to state 10 reducers for the
> job is not being honoured by Hadoop, instead choosing some other number.
> Where am I going wrong here? I do not want to have to change this value
> in the hadoop/conf/*.xml files, as I am attempting to show the expressive
> power of Hadoop. Note: The power to specify the number of reducers is
> possible in both Pig and Hive.
> Thanks,
> Rob Stewart


It's possible that something inside the jar is calling 
JobConf.setNumReduceTasks(n) after it parses the command line args. That 
would cause this type of behavior. I haven't looked at the source for 
the join example to confirm this, though.
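For reference, a driver that honours -Dmapred.reduce.tasks typically uses
the Tool/ToolRunner pattern, so that GenericOptionsParser applies the -D
options to the job's configuration before run() is invoked. The sketch
below uses the old 0.20 mapred API; the class name JoinDriver is my own
and the actual job setup is elided:

```java
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

// Hypothetical driver class illustrating the pattern; not the actual
// source of the join example.
public class JoinDriver extends Configured implements Tool {

    public int run(String[] args) throws Exception {
        // getConf() already reflects any -D options parsed by ToolRunner.
        JobConf job = new JobConf(getConf(), JoinDriver.class);
        job.setJobName("join");
        // ... set input/output formats, paths, mapper/reducer here ...

        // Crucially, do NOT call job.setNumReduceTasks(n) at this point:
        // it would silently clobber -Dmapred.reduce.tasks=10 given on
        // the command line.
        JobClient.runJob(job);
        return 0;
    }

    public static void main(String[] args) throws Exception {
        // ToolRunner strips the generic options (-D, -conf, -fs, ...)
        // before passing the remaining args to run().
        System.exit(ToolRunner.run(new JoinDriver(), args));
    }
}
```

If the example's driver hard-codes the reducer count anywhere after the
command line is parsed, the -D value on the command line loses.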

Eric Sammer
