hadoop-general mailing list archives

From "Dhruba Borthakur" <dhr...@gmail.com>
Subject Re: [VOTE] Should we create sub-projects for HDFS and Map/Reduce?
Date Wed, 06 Aug 2008 05:38:45 GMT
Are you talking about sub-projects for core, HDFS and MapReduce? Or is
there another way to get separate mailing lists/JIRAs for these
components?

I like the fact that these pieces are together. It keeps the code
compiling together, keeps API upgrades simple, and helps build a
developer community across all three pieces of code. Besides, JIRA
issues already have components, so we can always filter them by
component, can't we?

thanks,
dhruba

On Tue, Aug 5, 2008 at 9:18 PM, Owen O'Malley <oom@yahoo-inc.com> wrote:
> I think the time has come to split Hadoop Core into three pieces:
>
>  1. Core (src/core)
>  2. HDFS (src/hdfs)
>  3. Map/Reduce (src/mapred)
>
> There will be lots of details to work out, such as what we do with tools and
> contrib, but I think it is a good idea. This will create separate JIRAs and
> mailing lists for HDFS and Map/Reduce, which will make the community much
> more approachable. I propose that we wait until 0.19.0 is released to
> give us time to plan the split.
>
> -- Owen
>
