hadoop-user mailing list archives

From Shahab Yunus <shahab.yu...@gmail.com>
Subject Re: Simplifying MapReduce API
Date Tue, 27 Aug 2013 15:48:39 GMT
For starters (experts may have deeper reasons): what if your map and reduce
logic each become complex enough to demand separate classes? Why force
clients to implement both by folding them into a single Job interface? With
the current design you can always implement both interfaces (map and
reduce) in one class if your logic is simple enough, or go the other route
of separate classes when that is required. I think this is the more
flexible arrangement: you can always build up from and on top of a granular
design, but not the other way around.
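To make that concrete, here is a minimal sketch of the flexibility argument. The SimpleMapper/SimpleReducer types below are hypothetical stand-ins, not Hadoop's actual Mapper/Reducer API: with separate contracts, simple logic can still live in one class that implements both, while complex map logic can split into its own class without dragging reduce logic along.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-ins for the two contracts; NOT the real Hadoop API.
interface SimpleMapper<I, O> {
    List<O> map(I input);
}

interface SimpleReducer<O, R> {
    R reduce(List<O> values);
}

// Simple logic: one class can still opt in to both contracts at once.
class WordLengthJob implements SimpleMapper<String, Integer>,
                               SimpleReducer<Integer, Integer> {
    public List<Integer> map(String line) {
        List<Integer> out = new ArrayList<>();
        for (String w : line.split("\\s+")) out.add(w.length());
        return out;
    }
    public Integer reduce(List<Integer> lengths) {
        int sum = 0;
        for (int n : lengths) sum += n;
        return sum;
    }
}

// Complex logic: the map side can grow into a dedicated class on its own,
// which a fused Job interface (requiring both methods) would not allow.
class TokenizingMapper implements SimpleMapper<String, Integer> {
    public List<Integer> map(String line) {
        List<Integer> out = new ArrayList<>();
        for (String w : line.trim().split("\\s+")) {
            if (!w.isEmpty()) out.add(w.length());
        }
        return out;
    }
}
```

The point is only about the shape of the contracts: granular interfaces compose into the fused form for free, but a fused interface cannot be split apart by its clients.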

I hope I understood your concern correctly...


On Tue, Aug 27, 2013 at 11:35 AM, Andrew Pennebaker wrote:

> There seems to be an abundance of boilerplate patterns in MapReduce:
> * Write a class extending Map (1), implementing Mapper (2), with a map
> method (3)
> * Write a class extending Reduce (4), implementing Reducer (5), with a
> reduce method (6)
> Could we achieve the same behavior with a single Job interface requiring
> map() and reduce() methods?
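For comparison, the single-interface design proposed above might look roughly like the following. This Job type is hypothetical (nothing like it exists in Hadoop; names and signatures are illustrative only), and it shows the cost: both methods become mandatory members of one class even when either side grows complex.

```java
import java.util.function.BiConsumer;

// Hypothetical fused contract; illustration only, not a Hadoop type.
interface Job<KIn, VIn, KMid, VMid, KOut, VOut> {
    void map(KIn key, VIn value, BiConsumer<KMid, VMid> emit);
    void reduce(KMid key, Iterable<VMid> values, BiConsumer<KOut, VOut> emit);
}

// Word count written against the fused interface: one class must carry
// both the map logic and the reduce logic.
class FusedWordCount
        implements Job<Long, String, String, Integer, String, Integer> {
    public void map(Long offset, String line,
                    BiConsumer<String, Integer> emit) {
        for (String w : line.split("\\s+")) emit.accept(w, 1);
    }
    public void reduce(String word, Iterable<Integer> counts,
                       BiConsumer<String, Integer> emit) {
        int sum = 0;
        for (int c : counts) sum += c;
        emit.accept(word, sum);
    }
}
```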
