hadoop-general mailing list archives

From "Dhruba Borthakur" <dhr...@gmail.com>
Subject Re: [VOTE] Should we create sub-projects for HDFS and Map/Reduce?
Date Wed, 06 Aug 2008 20:29:47 GMT
What about releases? Does this mean that each sub-project will be
released separately?  If so, the life of an administrator becomes
even harder :-). He has to pick and choose each package, verify
that the packages are compatible with one another, run various
installation utilities to install them, etc.

-dhruba

On Wed, Aug 6, 2008 at 10:19 AM, Dhruba Borthakur <dhruba@gmail.com> wrote:
> +1.
>
> -dhruba
>
> On Wed, Aug 6, 2008 at 10:07 AM, Doug Cutting <cutting@apache.org> wrote:
>> +1
>>
>> I agree that it is time to do this.  Should we start using Ivy, so that the
>> inter-dependencies are easier to manage?
>>
>> Doug
>>
>> Owen O'Malley wrote:
>>>
>>> I think the time has come to split Hadoop Core into three pieces:
>>>
>>>  1. Core (src/core)
>>>  2. HDFS (src/hdfs)
>>>  3. Map/Reduce (src/mapred)
>>>
>>> There will be lots of details to work out, such as what we do with tools
>>> and contrib, but I think it is a good idea. This will create separate jiras
>>> and mailing lists for HDFS and map/reduce, which will make the community
>>> much more approachable. I would propose that we wait until 0.19.0 is
>>> released to give us time to plan the split.
>>>
>>> -- Owen
>>
>

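To make Doug's Ivy suggestion above concrete: below is a minimal sketch of an
Ivy module descriptor (ivy.xml) that a split-out HDFS sub-project could use to
declare its dependency on the shared Core piece. The organisation, module
names, and revision shown are hypothetical placeholders; the actual
coordinates would be settled during the split.

    <!-- hypothetical ivy.xml for an HDFS sub-project; names are illustrative -->
    <ivy-module version="2.0">
      <!-- module coordinates are placeholders, not the final names -->
      <info organisation="org.apache.hadoop" module="hadoop-hdfs"/>
      <dependencies>
        <!-- HDFS depends on the shared Core piece; rev is a placeholder -->
        <dependency org="org.apache.hadoop" name="hadoop-core" rev="0.19.0"/>
      </dependencies>
    </ivy-module>

With descriptors like this, Ivy can resolve the inter-project dependency
automatically at build time, which speaks to Doug's point that Ivy would make
the inter-dependencies easier to manage.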