hadoop-mapreduce-user mailing list archives

From Robert Evans <ev...@yahoo-inc.com>
Subject Re: How does Map and Reduce class are sent to remote node by hadoop ??
Date Fri, 27 May 2011 14:54:00 GMT
Francesco,

The mapreduce client will create a jar called job.jar and place it in a staging directory in HDFS.
This is the jar that you specified to your job conf, or I believe that it tries to guess
the jar based on the Mapper class and the Reducer class, but I am not sure of that.  Once
the JobTracker has told a TaskTracker to run a given job, the TaskTracker will download the
jar and then fork off a new JVM to execute the Mapper or Reducer.  If your jar has dependencies,
then these usually have to be shipped with it as part of the cache archive interface.
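The "guess the jar from the Mapper/Reducer class" step mentioned above boils down to asking the classloader where a class was loaded from. Here is a minimal, standalone sketch of that idea in plain Java; the method name `findContainingJar` and the class `FindJar` are illustrative, not Hadoop's exact implementation, and the URL parsing assumes the standard `jar:file:...!/...` form.

```java
// Sketch: locate the jar (or classpath directory) containing a given class,
// similar in spirit to what Hadoop does when guessing the job jar from the
// Mapper/Reducer class. Names here are illustrative, not Hadoop's API.
public class FindJar {
    static String findContainingJar(Class<?> clazz) {
        // Turn com.example.MyMapper into com/example/MyMapper.class
        String classFile = clazz.getName().replace('.', '/') + ".class";
        java.net.URL url = clazz.getClassLoader().getResource(classFile);
        if (url == null) {
            return null; // class not visible to this classloader
        }
        String location = url.toString();
        if (location.startsWith("jar:")) {
            // Form: jar:file:/path/to/job.jar!/com/example/MyMapper.class
            // Strip the "jar:" prefix and everything from "!" onward.
            return location.substring("jar:".length(), location.indexOf('!'));
        }
        // Class was loaded from a plain directory rather than a jar.
        return location;
    }

    public static void main(String[] args) {
        System.out.println(findContainingJar(FindJar.class));
    }
}
```

When the class really lives inside a jar, this yields something like `file:/path/to/job.jar`, which is exactly the file the client would then copy into the HDFS staging directory for the TaskTrackers to download.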

--Bobby Evans

On 5/27/11 9:16 AM, "Francesco De Luca" <f.deluca86@gmail.com> wrote:

Does anyone know the mechanism that hadoop uses to load the Map and Reduce classes on the remote node
where the JobTracker submits the tasks?

In particular, how does hadoop retrieve the .class files?

Thanks

