hadoop-general mailing list archives

From Konstantin Boudnik <...@apache.org>
Subject Re: [DISCUSS] Move common, hdfs, mapreduce contrib components to apache-extras.org or elsewhere
Date Mon, 31 Jan 2011 16:49:01 GMT
On Sun, Jan 30, 2011 at 23:19, Owen O'Malley <omalley@apache.org> wrote:
> On Jan 30, 2011, at 7:42 PM, Nigel Daley wrote:
>> Now that http://apache-extras.org is launched
>> (https://blogs.apache.org/foundation/entry/the_apache_software_foundation_launches)
>> I'd like to start a discussion on moving contrib components out of common,
>> mapreduce, and hdfs.
> The PMC can't "move" code to Apache extras. It can only choose to abandon
> code that it doesn't want to support any longer. As a separate action some
> group of developers may create projects in Apache Extras based on the code
> from Hadoop.
> Therefore the question is really what code, if any, Hadoop wants to abandon.
> That is a good question and one that we should ask ourselves occasionally.
> After a quick consideration, my personal list would look like:
> failmon
> fault injection

This is the surest way to kill a component that is as tightly coupled
with the core code as fault injection is.

So, if you really want to kill it, then move it.

> fuse-dfs
> hod
> kfs
> Also note that pushing code out of Hadoop has a high cost. There are at
> least 3 forks of the hadoop-gpl-compression code. That creates a lot of
> confusion for the users. A lot of users never go to the trouble of figuring
> out which fork and branch of hadoop-gpl-compression works with the version
> of Hadoop they installed.
> -- Owen
