hadoop-common-dev mailing list archives

From Philip Zeyliger <phi...@cloudera.com>
Subject Re: Developing cross-component patches post-split
Date Wed, 01 Jul 2009 21:10:04 GMT
>> - When we have one of these cross-project changes, how are we supposed
>> to do builds?
> I talked with Owen & Nigel about this, and we thought that, for now, it
> might be reasonable to have the mapreduce and hdfs trunk each contain an svn
> external link to the current common jar.  Then folks can commit new versions
> of hadoop-common-trunk.jar as they commit changes to common's trunk.  We'd
> need to remove or update this svn external link when branching.  Thoughts?

-1 to checking in jars.  It's quite a bit of bloat in the repository (which
admittedly affects the git.apache folks more than the svn folks), but it
also makes day-to-day development cumbersome.

It'd be nice to have a one-liner that builds the equivalent of the tarball
built by "ant binary" in the old world.  When you're working on something
that affects both common and hdfs, it'll be pretty painful to make the jars
in common, move them over to hdfs, and then compile hdfs.

Could the build.xml in hdfs call into common's build.xml and build common as
part of building hdfs?  Or perhaps have a separate "top-level" build file
that builds everything?
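
As a rough sketch of the first option: hdfs's build.xml could delegate to
common's build.xml via Ant's <ant> task and then pick the resulting jar up
off the classpath.  (The sibling-checkout layout, the "jar" target name, and
the build output paths below are all assumptions, not the actual project
layout.)

```xml
<!-- Hypothetical fragment of hdfs's build.xml.  Assumes common is checked
     out in a sibling directory and that its build.xml has a "jar" target
     producing hadoop-common-*.jar under build/. -->
<property name="common.dir" location="../hadoop-common"/>

<target name="compile-common">
  <!-- Delegate to common's own build.xml; don't leak hdfs properties in -->
  <ant dir="${common.dir}" target="jar" inheritAll="false"/>
</target>

<target name="compile" depends="compile-common">
  <!-- Put the freshly built common jar on hdfs's compile classpath -->
  <path id="hdfs.classpath">
    <fileset dir="${common.dir}/build">
      <include name="hadoop-common-*.jar"/>
    </fileset>
  </path>
</target>
```

The second option (a top-level build file) would look similar, but with a
driver build.xml invoking common, hdfs, and mapreduce in dependency order.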

-- Philip
