hadoop-mapreduce-user mailing list archives

From "Bejoy KS" <bejoy.had...@gmail.com>
Subject Re: Supplying a jar for a map-reduce job
Date Wed, 21 Nov 2012 04:54:05 GMT
Hi Pankaj

AFAIK you can do that. Just provide the properties such as the mapper class, reducer class, input
format, output format etc. using the -D option at run time.
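To sketch what that invocation could look like (a hypothetical example; the driver and class names are assumed, and the property names are the Hadoop 0.23/2.x mapreduce ones, so double-check them against your version):

```shell
# Hypothetical generic launch: the driver jar contains no job-specific
# classes. The job logic lives in job-logic.jar (shipped via -libjars),
# and the mapper/reducer/format classes are named purely by -D properties,
# which GenericOptionsParser folds into the job Configuration.
hadoop jar generic-driver.jar com.example.GenericDriver \
  -libjars job-logic.jar \
  -D mapreduce.job.map.class=com.example.WordCountMapper \
  -D mapreduce.job.reduce.class=com.example.WordCountReducer \
  -D mapreduce.job.inputformat.class=org.apache.hadoop.mapreduce.lib.input.TextInputFormat \
  -D mapreduce.job.outputformat.class=org.apache.hadoop.mapreduce.lib.output.TextOutputFormat \
  /user/pankaj/input /user/pankaj/output
```

Note the -D flags only take effect like this if the driver goes through ToolRunner/GenericOptionsParser; a driver that hard-codes setMapperClass would overwrite them.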

Bejoy KS

Sent from handheld, please excuse typos.

-----Original Message-----
From: Pankaj Gupta <pankaj@brightroll.com>
Date: Tue, 20 Nov 2012 20:49:29 
To: user@hadoop.apache.org<user@hadoop.apache.org>
Reply-To: user@hadoop.apache.org
Subject: Supplying a jar for a map-reduce job


I am running map-reduce jobs on a Hadoop 0.23 cluster. Right now I supply the jar to use for
running the map-reduce job via the setJarByClass function on org.apache.hadoop.mapreduce.Job.
This makes my code depend on a class in the MR job at compile time. What I want is to be able
to run an MR job without depending on it at compile time. It would be great if I could take
a jar that contains the Mapper and Reducer classes and just pass it in to run the map-reduce
job. That would make it easy to choose an MR job to run at runtime. Is that possible?
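A minimal generic driver along these lines might look like the sketch below (class names are hypothetical; it assumes the Hadoop 0.23 mapreduce API, where Job.setJar takes a jar path instead of resolving one from a class, and where -D-supplied properties such as mapreduce.job.map.class arrive via ToolRunner):

```java
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

// Sketch of a generic driver: nothing here references the job's Mapper or
// Reducer at compile time. The job jar is passed as a path on the command
// line, and the job classes are named via -D properties.
public class GenericDriver extends Configured implements Tool {
  @Override
  public int run(String[] args) throws Exception {
    // args: <job-jar-path> <input> <output>; any -D properties (e.g.
    // mapreduce.job.map.class) have already been parsed into getConf()
    // by ToolRunner's GenericOptionsParser.
    Job job = Job.getInstance(getConf(), "generic-job");
    job.setJar(args[0]);  // ship the jar by path, not via setJarByClass
    FileInputFormat.addInputPath(job, new Path(args[1]));
    FileOutputFormat.setOutputPath(job, new Path(args[2]));
    return job.waitForCompletion(true) ? 0 : 1;
  }

  public static void main(String[] args) throws Exception {
    System.exit(ToolRunner.run(new GenericDriver(), args));
  }
}
```

Since this sketch leaves the mapper, reducer, formats, and output key/value classes entirely to configuration, the same driver binary can launch any job jar chosen at runtime.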

Thanks in advance,