hdt-dev mailing list archives

From: Roman Shaposhnik <...@apache.org>
Subject: Re: Plugin organization and build system
Date: Mon, 14 Jan 2013 02:52:39 GMT
On Fri, Jan 11, 2013 at 11:27 AM, Adam Berry <adamb@apache.org> wrote:
> But I think that however we actually supply the implementations for
> different versions of Hadoop, they should be contributed via extension
> points. This would make it easiest to add newer versions down the line,
> and would also make it possible to build connectors for distributions
> like CDH, if anything special is required. Not that I'm suggesting we
> would build these vendor connectors ourselves, just that it makes the
> most sense to make this extensible.

FWIW: virtually all Hadoop ecosystem projects deal with incompatibilities
between Hadoop releases via shims. E.g.:
    https://github.com/apache/hive/tree/trunk/shims/src
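To illustrate the shim approach in the abstract (this is a hypothetical sketch, not Hive's actual shim code): a version-neutral interface hides the incompatible bits, per-version implementations fill it in, and a small loader picks one based on the detected Hadoop version string. The class names and the choice of config keys below are illustrative assumptions.

```java
// Hypothetical sketch of the shim pattern: one interface,
// per-version implementations, runtime selection by version string.
interface HadoopShims {
    // Example of an API that differs across Hadoop major versions.
    String getJobTrackerConfKey();
}

class Hadoop1Shims implements HadoopShims {
    public String getJobTrackerConfKey() { return "mapred.job.tracker"; }
}

class Hadoop2Shims implements HadoopShims {
    public String getJobTrackerConfKey() { return "yarn.resourcemanager.address"; }
}

public class ShimLoader {
    // Pick an implementation by major version, e.g. "1.2.1" -> Hadoop1Shims.
    static HadoopShims loadShims(String hadoopVersion) {
        String major = hadoopVersion.split("\\.")[0];
        return major.equals("1") ? new Hadoop1Shims() : new Hadoop2Shims();
    }

    public static void main(String[] args) {
        System.out.println(loadShims("1.2.1").getJobTrackerConfKey());
        System.out.println(loadShims("2.0.3").getJobTrackerConfKey());
    }
}
```

The point is that calling code only ever sees `HadoopShims`, so adding support for a new Hadoop release is a matter of adding one more implementation class.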

As for mixing and matching versions of plugins -- this is a really
tough nut to crack if you want to stay within the Hadoop family. Putting
my Bigtop hat on, I can say that the idea of a universally useful binary
convenience artifact is usually an uphill battle. The best a project can
hope for is to be compatible with whatever developers tend to use out of
the box, and to make the rest depend on custom builds provided either by
the ASF itself (Bigtop) or by vendors (Cloudera, Hortonworks, etc.).

Thanks,
Roman.
