hadoop-mapreduce-user mailing list archives

From Kris Nuttycombe <kris.nuttyco...@gmail.com>
Subject Reflective instantiation of Mappers and Reducers
Date Fri, 02 Apr 2010 19:05:49 GMT
Hi, all,

I'm new to Hadoop, and I'm having a hard time creating highly
configurable Mapper and Reducer instances: Hadoop seems to require
that Mappers and Reducers be instantiated through reflection, so I'm
wondering whether I'm doing things wrong.
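For context, the constraint in question is that Hadoop's job setup accepts a Class object (e.g. via job.setMapperClass) and the framework builds instances itself with ReflectionUtils.newInstance, which needs a no-argument constructor. A minimal sketch of that pattern in plain Java (the ConfigurableMapper class here is a hypothetical stand-in, not a real Hadoop Mapper):

```java
public class ReflectionDemo {
    // Stand-in for a Mapper: reflective creation requires a no-arg constructor.
    public static class ConfigurableMapper {
        private String pattern = "default";   // state we would like to configure
        public String getPattern() { return pattern; }
    }

    // Essentially what Hadoop's ReflectionUtils.newInstance does: the framework
    // only ever sees the Class, so any configuration applied to a pre-built
    // instance in driver code never reaches the task-side instance.
    public static String createViaReflection() throws Exception {
        Class<?> cls = Class.forName("ReflectionDemo$ConfigurableMapper");
        Object mapper = cls.getDeclaredConstructor().newInstance();
        return ((ConfigurableMapper) mapper).getPattern();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(createViaReflection());
    }
}
```

This is why composed, pre-configured objects don't survive: the reflective path always starts from a fresh, default-constructed instance.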

In my application I have a number of composable classes that the
end-user of the library can use to generate mapper and reducer
instances. Such composition can be done either in code or at runtime
by interpreting a script written in a custom DSL, and I'd like to
avoid maintaining separate mapper and reducer classes for the
different types of construction; that doesn't seem like it should be
the responsibility of the map/reduce part of the library.

What I'm wondering is: is there any way to simply serialize a Mapper
or Reducer object and have the serialized instance copied, passed
around, and used everywhere, instead of always having the Mapper and
Reducer instantiated by reflection? This would greatly simplify the
library design in my case.
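One workaround along these lines (a hedged sketch, not a confirmed Hadoop idiom) is to Java-serialize the composed logic object to a Base64 string, store that string in the job Configuration, and have a single generic Mapper class deserialize it in setup(). The Hadoop-independent core is just a serialization round-trip; the MapLogic class and the "mapper.logic" key are hypothetical names:

```java
import java.io.*;
import java.util.Base64;

public class SerializedLogicDemo {
    // Stand-in for a composed, user-configured piece of mapper logic.
    public static class MapLogic implements Serializable {
        private static final long serialVersionUID = 1L;
        private final String separator;
        public MapLogic(String separator) { this.separator = separator; }
        public String apply(String key, String value) { return key + separator + value; }
    }

    // Encode the configured object to a String that could be stored in the
    // job Configuration, e.g. conf.set("mapper.logic", encoded).
    public static String encode(Serializable obj) throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(obj);
        }
        return Base64.getEncoder().encodeToString(bytes.toByteArray());
    }

    // Decode on the task side, e.g. inside a generic Mapper's setup().
    public static Object decode(String encoded) throws IOException, ClassNotFoundException {
        byte[] raw = Base64.getDecoder().decode(encoded);
        try (ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(raw))) {
            return in.readObject();
        }
    }

    public static String roundTrip() throws Exception {
        MapLogic logic = new MapLogic("\t");             // composed in driver code
        String encoded = encode(logic);                  // would ride along in the Configuration
        MapLogic restored = (MapLogic) decode(encoded);  // rebuilt on the task side
        return restored.apply("k", "v");
    }
}
```

The reflectively-created generic Mapper then stays a single fixed class, while all the user-composed behavior travels as data.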


