openoffice-dev mailing list archives

From Jürgen Schmidt <>
Subject Re: A systematic approach to IP review?
Date Tue, 20 Sep 2011 12:37:34 GMT
On Mon, Sep 19, 2011 at 1:59 PM, Rob Weir <> wrote:

> 2011/9/19 Jürgen Schmidt <>:
> > On Mon, Sep 19, 2011 at 2:27 AM, Rob Weir <> wrote:
> >
> >> If you haven't looked at it closely, it is probably worth a few minutes
> >> of your time to review our incubation status page, especially the
> >> items under "Copyright" and "Verify Distribution Rights".  It lists
> >> the things we need to do, including:
> >>
> >>  -- Check and make sure that the papers that transfer rights to the
> >> ASF have been received. It is only necessary to transfer rights for the
> >> package, the core code, and any new code produced by the project.
> >>
> >> -- Check and make sure that the files that have been donated have been
> >> updated to reflect the new ASF copyright.
> >>
> >> -- Check and make sure that for all code included with the
> >> distribution that is not under the Apache license, we have the right
> >> to combine with Apache-licensed code and redistribute.
> >>
> >> -- Check and make sure that all source code distributed by the project
> >> is covered by one or more of the following approved licenses: Apache,
> >> BSD, Artistic, MIT/X, MIT/W3C, MPL 1.1, or something with essentially
> >> the same terms.
> >>
> >> Some of this is already going on, but it is hard to get a sense of who
> >> is doing what and how much progress we have made.  I wonder if we can
> >> agree to a more systematic approach?  This will make it easier to see
> >> the progress we're making and it will also make it easier for others
> >> to help.
> >>
> >> Suggestions:
> >>
> >> 1) We need to get all files needed for the build into SVN.  Right now
> >> there are some that are copied down from the website
> >> during the build's bootstrap process.   Until we get the files all in
> >> one place it is hard to get a comprehensive view of our dependencies.
> >>
> >
> > Do you mean to check the files under ext_source into SVN and remove them
> > later on once we have cleaned up the code? Or do you mean to put them
> > somewhere on Apache Extras?
> > I would prefer to keep these binary files on Apache Extras if possible.
> >
> Why not just keep it in SVN?   Moving things to Apache Extras does not
> help us with the IP review.   In other words, if we have a dependency
> on an OSS module that has an incompatible license, then moving that
> module to Apache Extras does not make that dependency go away.  We
> still need to understand the nature of the dependency: a build tool, a
> dynamic runtime dependency, a statically linked library, an optional
> extension, a necessary core module.
> If we find out, for example, that something in ext-sources is only
> used as a build tool, and is not part of the release, then there is
> nothing that prevents us from hosting it in SVN.   But if something is
> a necessary library and it is under GPL, then this is a problem even
> if we store it on Apache-Extras.

I am not really happy with having all the binaries in the trunk tree
because of the large binary blobs, and I don't expect too many changes to
these dependencies. I would also like to avoid checking them out every time.

What do others think about a structure where we have "ext_sources" besides


If we can agree on such a structure, I would move forward and bring in some
new external sources: the proposed ucpp preprocessor (BSD license), used in
idlc and of course part of the SDK later on. I made some tests with it and
was able to build the sources on Windows in our Cygwin environment with a
new GNU make file. I was also able to build udkapi and offapi with this new
and adapted idlc/ucpp without any problems; the generated type library is
equal to the old one.

I have to run some more tests on other platforms as soon as I have them
available for testing. I decided to replace the preprocessor instead of
removing it for compatibility reasons, and because it was of course the
easier change. The next step is to check in detail how the ext_sources
process works in our build, and to adapt the new ucpp module accordingly.
If anybody is familiar with ext_sources and can point me to potential
hurdles, please let me know (on a new thread) ;-)


> >
> >>
> >> 2) Continue the CWS integrations.  Along with 1) this ensures that all
> >> the code we need for the release is in SVN.
> >>
> >> 3)  Files that Oracle include in their SGA need to have the Apache
> >> license header inserted and the Sun/Oracle copyright migrated to the
> >> NOTICE file.  Apache RAT (Release Audit Tool) [2] can be used to
> >> automate parts of this.
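
The check that RAT automates can be sketched as a small script; this is a
hypothetical illustration (the marker string and 2 KB window are assumptions,
not RAT's actual rules), useful for seeing what "files lacking an Apache
header" means in practice:

```python
import os

# First line of the standard ASF source header.
APACHE_MARKER = "Licensed to the Apache Software Foundation"

def files_missing_header(root):
    """Walk a source tree and return the relative paths of files whose
    first 2 KB do not contain the Apache license marker."""
    missing = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                # Read only the head of the file; headers sit at the top.
                with open(path, "r", errors="replace") as f:
                    head = f.read(2048)
            except OSError:
                continue  # unreadable file: skip rather than crash
            if APACHE_MARKER not in head:
                missing.append(os.path.relpath(path, root))
    return sorted(missing)
```

In a real audit one would of course run RAT itself; the point of the sketch
is only that the per-file decision is mechanical and therefore scriptable.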
> >>
> >> 4) Once the SGA files have the Apache headers, then we can make
> >> regular use of RAT to report on files that are lacking an Apache
> >> header.  Such files might be in one of the following categories:
> >>
> >> a) Files that Oracle owns the copyright on and which should be
> >> included in an amended SGA
> >>
> >> b) Files that have a compatible OSS license which we are permitted to
> >> use.  This might require that we add a mention of it to the NOTICE
> >> file.
> >>
> >> c) Files that have an incompatible OSS license.  These need to be
> >> removed/replaced.
> >>
> >> d) Files that have an OSS license that has not yet been
> >> reviewed/categorized by Apache legal affairs.  In that case we need to
> >> bring it to their attention.
> >>
> >> e) (Hypothetically) files that are not under an OSS license at all.
> >> E.g., a Microsoft header file.  These must be removed.
> >>
> >> 5) We should track the resolution of each file, and do this
> >> publicly.  The audit trail is important.  Some ways we could do this
> >> might be:
> >>
> >> a) Track this in SVN properties.  So set ip:sga for the SGA files,
> >> ip:mit for files that are MIT licensed, etc.  This should be reflected
> >> in headers as well, but this is not always possible.  For example, we
> >> might have binary files where we cannot add headers, or cases where
> >> the OSS files do not have headers, but where we can prove their
> >> provenance via other means.
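
A minimal sketch of what option (a) could look like in practice. The
property names follow the ip:sga / ip:mit idea above; the other category
names and the "cleared" value are hypothetical. The helper only builds the
`svn propset` command line, so the mapping from review category to property
stays explicit and auditable:

```python
# Hypothetical clearance categories; only "sga" and "mit" come from the
# proposal above, the rest are illustrative guesses.
CATEGORIES = {"sga", "mit", "bsd", "mpl11", "w3c"}

def propset_command(category, path):
    """Return the svn argv that records a file's clearance category
    as an SVN property (e.g. ip:sga) on that file."""
    if category not in CATEGORIES:
        raise ValueError("unknown clearance category: %s" % category)
    return ["svn", "propset", "ip:%s" % category, "cleared", path]
```

A review script could feed these argvs to subprocess, and `svn proplist -v`
would later reproduce the full audit trail straight from the repository.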
> >>
> >> b) Track this in a spreadsheet, one row per file.
> >>
> >> c) Track this in a text log file checked into SVN.
> >>
> >> d) Track this in an annotated script that runs RAT, where the
> >> annotations document the reason for cases where we tell it to ignore a
> >> file or directory.
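
Options (c) and (d) both boil down to a reviewed-files data file that a
script diffs against the actual tree. A sketch, assuming a made-up log
format of one tab-separated "path, status" pair per line (the format is an
assumption, not an agreed convention):

```python
def unreviewed(all_files, log_text):
    """Return tree files that do not yet appear in the review log.

    log_text holds one reviewed file per line: "<path>\t<status>".
    Blank lines and '#' comments are the annotations mentioned above.
    """
    reviewed = set()
    for line in log_text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        path, _sep, _status = line.partition("\t")
        reviewed.add(path)
    # Whatever is in the tree but not in the log still needs review.
    return sorted(set(all_files) - reviewed)
```

Because the log lives in SVN, the diff is reproducible and each file only
ever has to be reviewed once.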
> >>
> >> 6) Iterate until we have a clean RAT report.
> >>
> >> 7) Goal should be for anyone today to be able to see what work remains
> >> for IP clearance, as well as for someone 5 years from now to be able
> >> to tell what we did.  Tracking this on the community wiki is probably
> >> not good enough, since we've previously talked about dropping that
> >> wiki and going to MWiki.
> >>
> >
> > We talked about it, yes, but did we reach a final decision?
> >
> > The migrated wiki is available under and can be used. Do we want to
> > continue with this wiki now? It's still not clear to me at the moment.
> >
> > But we need a place to document the IP clearance, and under
> > we already have some information.
> >
> This is not really sufficient.  The wiki is talking about module-level
> dependencies.  That is a good start and useful for the high-level
> discussion, but we need to look file by file.  We need to catch the
> case where (hypothetically) there is a single GPL header file sitting
> in a core OOo source directory.  So we need to review hundreds of
> thousands of files.  Too big for a table on the wiki.
> Note also that doing this kind of check is a prerequisite for every
> release we do at Apache.  So agreeing on what tools and techniques we
> want to use for this process is important.  If we do it right, the
> next time we do a review it will be very fast and easy, since we'll be
> able to build upon the review we've already done. That's why I think
> that either using svn properties or scripts with annotated data files
> listing "cleared" files is the best approach.  Make the review process
> be data-driven and reproducible using automated tools.  It won't
> totally eliminate the need for manual inspection, but it will: 1) Help
> parallelize that effort, and 2) Ensure it is only done once per file.
> > Juergen
> >
> >
> >>
> >>
> >> -Rob
> >>
> >>
> >> [1]
> >>
> >> [2]
> >>
> >
