hadoop-common-user mailing list archives

From "Norbert Burger" <norbert.bur...@gmail.com>
Subject Re: how to pass user parameter for the mapper
Date Wed, 26 Dec 2007 04:09:44 GMT
As I understand it, the JobConf getters/setters are best suited to data
that is static across the entire job.

What's the recommended way to pass a variable to the mappers/reducers
that might be different for each InputSplit?

For example, let's say I'm using Hadoop's grep example to extract
information from a collection of logfiles, but each logfile has some
header info (which may or may not be part of the current InputSplit)
that every mapper/reducer needs access to.

How should I approach this?  Is overriding FileInputFormat so that the
header data is tacked onto each InputSplit the best way to approach this?

On Dec 25, 2007 12:34 PM, Ted Dziuba <ted@persai.com> wrote:
> You can get and set variables in the JobConf.  The map task's
> configure() method takes a JobConf as a parameter, and you can keep the
> reference as an instance variable.
> Ted
> helena21 wrote:
> > Hi everybody,
> >
> > please explain me the steps to pass user parameters for the mapper class.
> > thanks.
> >
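Ted's suggestion might look like the following under the old (org.apache.hadoop.mapred) API that was current at the time. The property name "grep.pattern" and the class names are illustrative, not from the thread: the driver sets the value on the JobConf, and every map task reads it back in configure().

```java
import java.io.IOException;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;

public class GrepParamExample {

  public static class GrepMapper extends MapReduceBase
      implements Mapper<LongWritable, Text, Text, LongWritable> {

    private Pattern pattern;

    // Called once per task before any map() calls; the JobConf carries
    // whatever the driver set, so every task sees the same value.
    @Override
    public void configure(JobConf job) {
      pattern = Pattern.compile(job.get("grep.pattern", ".*"));
    }

    public void map(LongWritable key, Text value,
                    OutputCollector<Text, LongWritable> out,
                    Reporter reporter) throws IOException {
      Matcher m = pattern.matcher(value.toString());
      while (m.find()) {
        out.collect(new Text(m.group()), new LongWritable(1));
      }
    }
  }

  public static void main(String[] args) throws IOException {
    JobConf conf = new JobConf(GrepParamExample.class);
    conf.setJobName("grep-with-param");
    // Driver-side: stash the user parameter in the job configuration.
    conf.set("grep.pattern", args.length > 0 ? args[0] : "ERROR");
    conf.setMapperClass(GrepMapper.class);
    conf.setOutputKeyClass(Text.class);
    conf.setOutputValueClass(LongWritable.class);
    // Input/output path setup omitted for brevity.
    JobClient.runJob(conf);
  }
}
```

Because the value travels in the job configuration, it is identical for every task, which is exactly why it is a poor fit for per-InputSplit data like the logfile headers in the question above.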
