hadoop-common-dev mailing list archives

From Doug Cutting <cutt...@apache.org>
Subject Re: Developing cross-component patches post-split
Date Wed, 01 Jul 2009 20:54:02 GMT
Todd Lipcon wrote:
> - Whenever a task will need to touch both Common and one of the components
> (Mapred/HDFS), should there be two JIRAs, or is it sufficient to have just one
> "HADOOP" JIRA with separate patches uploaded for the two repositories?

Two Jiras, I think.  In the long run, such issues should be few.  E.g., 
we should not be changing the FileSystem API incompatibly very often.

> - If we're to do two separate JIRAs, is the best bet to use JIRA's "linking"
> feature to show the dependency between them?


> - When we have one of these cross-project changes, how are we supposed to do
> builds?

I talked with Owen & Nigel about this, and we thought that, for now, it 
might be reasonable to have the mapreduce and hdfs trunk each contain an 
svn external link to the current common jar.  Then folks can commit new 
versions of hadoop-common-trunk.jar as they commit changes to common's 
trunk.  We'd need to remove or update this svn external link when 
branching.  Thoughts?
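For illustration, wiring up such an external might look something like the
following sketch.  The repository URL and the lib/common target directory here
are assumptions for the example, not the actual repository layout:

```shell
# Hypothetical sketch of the proposed setup; the repository URL and the
# target directory 'lib/common' are illustrative assumptions only.
cd hdfs-trunk

# Define an svn:externals property so that 'lib/common' is populated
# from the directory in common's trunk holding the committed jar.
svn propset svn:externals \
    'lib/common https://svn.apache.org/repos/asf/hadoop/common/trunk/lib' .

# 'svn update' then pulls the current hadoop-common-trunk.jar in via
# the external, so hdfs builds compile against it.
svn update

svn commit -m "Link hdfs trunk against the committed common trunk jar"
```

When branching, the property would then be edited (e.g. via `svn propedit 
svn:externals .`) to pin a fixed location, or removed entirely, as noted above.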
