geronimo-dev mailing list archives

From "Prasad Kashyap" <goyathlay.geron...@gmail.com>
Subject Re: Geronimo build automation status (longish)
Date Mon, 04 Dec 2006 14:53:55 GMT
If I'm getting the picture right, AH is just our solution to
building/testing G in an automated environment. Others are free to
continue to build G the same way they always have.

Cheers
Prasad

On 12/4/06, John Sisson <jrsisson@gmail.com> wrote:
> Hi Jason,
>
> I had a quick look at the AntHill console and it looked pretty cool.  My
> initial thought was whether we would be discouraging potential ISVs from
> using Geronimo as a basis for their solutions by requiring them to license
> AntHill if they want to do their own automated builds/testing of
> Geronimo (e.g. so they can build and ship their own fix releases outside
> the Apache process).  The AntHill site does not list prices, so I can't
> comment on what licensing of AntHill for a non-open-source version of
> Geronimo would cost.
>
> If we are always going to be able to build Geronimo and test it manually
> (without AntHill), then maybe it isn't such a biggie.  Thought I'd raise
> it for discussion anyway.
>
> Regards,
> John
>
> Jason Dillon wrote:
> > Sorry, this has been long overdue.  I've been working on some
> > automation systems for Geronimo builds, covering the basic server
> > assemblies, the CTS assemblies, and TCK testsuite execution, with our
> > own testsuite to follow soon.
> >
> > I have used many different build automation platforms in the past...
> > but IMO they all have some deficiency.  Anyways, I elected to
> > implement a solution using AntHill, whose publisher, Urbancode, has a
> > policy of allowing free usage for open-source projects (just like
> > Atlassian's JIRA & Confluence).
> >
> > I've set up the latest version of AntHill 3 on our gbuild hosts, and
> > have been working on a configuration to allow reliable builds of
> > Geronimo.  One of the nice aspects of AntHill3 is its distributed
> > agent system, which allows workload to be split up over a set of
> > nodes.  A downside to this is that it becomes more difficult to link
> > Maven builds, as Maven uses a local repository cache for all
> > integration points.  But I have gotten around this issue by having AH
> > take all of the artifacts downloaded and produced by one build and seed
> > them into a clean local repo for the target project being built.
> >
> > A nice side effect of this is that there is a direct correlation
> > between one build and another.  And aside from any mystical SNAPSHOT
> > mismatches (which I hope to get fixed soon with my mvn patch
> > http://jira.codehaus.org/browse/MNG-2681), it is fairly safe to say
> > that artifacts generated/downloaded by one build will be used by a
> > dependent build.  The downside to this is that sometimes we have to
> > ship roughly 512MB of dependencies for larger builds (like the
> > cts-server builds for the TCK, which depend on all of the outputs of
> > the server builds, a local repo cache of about 512MB).
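> >
> > (For reference, this is roughly what that step boils down to, as a
> > sketch only; the repository path and checkout directory below are made
> > up and are not the real gbuild layout.)
> >
> >     import java.io.File;
> >
> >     // Sketch: run a Maven build against a private, per-build local
> >     // repository so the exact artifacts downloaded/produced can be
> >     // captured and handed to dependent builds.
> >     public class IsolatedRepoBuild {
> >         public static void main(String[] args) throws Exception {
> >             File repo = new File("/tmp/buildlife-1234/repository");  // hypothetical path
> >             ProcessBuilder mvn = new ProcessBuilder(
> >                     "mvn", "-Dmaven.repo.local=" + repo.getAbsolutePath(), "install");
> >             mvn.directory(new File("geronimo/server/trunk"));        // hypothetical checkout
> >             mvn.inheritIO();
> >             System.exit(mvn.start().waitFor());
> >         }
> >     }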
> >
> > An even nicer side effect of all of this, now that each build has a
> > set of artifacts which can be retrieved by another process, is that we
> > can take a successful build of Geronimo and run our testsuite on
> > it... either automatically or manually.  And as the testsuite gets
> > bigger and bigger, we can split up each of the suites and run each one
> > on a different system... or even on a different operating system or
> > architecture.
> >
> > Anyways... the options ahead of us are really interesting... and I
> > believe that right now AntHill3 is the best tool available to our
> > community for building a really rich and powerful build automation
> > system.
> >
> > I am however still working out some of the kinks...
> >
> > For example, to run our console-testsuite automatically on gbuild
> > hosts, we need to set up a virtual X server for Firefox to connect to,
> > which means we need to set up some tasks to execute Xvfb before tests
> > and shut it down afterwards, as well as put firefox-bin on the path,
> > etc.  Minor issues, but still work left to be done.
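> >
> > (Roughly the kind of pre/post hook that will be needed, sketched in
> > Java; the display number and firefox directory are assumptions, and
> > the real AH task definition will look different.)
> >
> >     // Sketch: start a virtual X display, run the browser-based tests
> >     // against it, then shut the display down afterwards.
> >     public class HeadlessTestRun {
> >         public static void main(String[] args) throws Exception {
> >             Process xvfb = new ProcessBuilder(
> >                     "Xvfb", ":1", "-screen", "0", "1024x768x24").start();
> >             int rc;
> >             try {
> >                 ProcessBuilder tests = new ProcessBuilder("mvn", "test");
> >                 tests.environment().put("DISPLAY", ":1");
> >                 // make firefox-bin resolvable (directory is a guess)
> >                 tests.environment().put("PATH",
> >                         "/usr/lib/firefox:" + System.getenv("PATH"));
> >                 tests.inheritIO();
> >                 rc = tests.start().waitFor();
> >             } finally {
> >                 xvfb.destroy();
> >             }
> >             System.exit(rc);
> >         }
> >     }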
> >
> > If you'd like to take a peek, you can log into the AntHill console here:
> >
> >     https://gbuild.org:9443
> >
> > Username: guest
> > Password: gbuild
> >
> > (NOTE: This user will not be able to see any of the CTS or TCK related
> > projects due to NDA mucky muck)
> >
> > I hope to have this wrapped up for the main G server builds over the
> > next few days, at which point I will enable the build status
> > notifications to be sent to dev@.  But right now, since I am testing,
> > it's probably not meaningful to send out those notifications.
> >
> > But I have found several build-related issues from testing this
> > system, which is usually done off of a clean svn co with a clean
> > mvn repo... so I'm confident that once it's going we will catch
> > more errors faster, which will hopefully reduce build-related errors
> > for the masses.
> >
> >  * * *
> >
> > Anyways, right now I have builds setup for:
> >
> >     Genesis - trunk
> >     Specs - trunk
> >     Geronimo Components (stage=bootstrap) - trunk & 1.2
> >     OpenEJB 2 - trunk & 2.2
> >     Geronimo Server (stage=assemble) - trunk & 1.2
> >     Geronimo CTS 1.2
> >
> > As noted above, these builds feed the exact outputs of one build into
> > another instead of sharing a local repo, so there is less chance that
> > other builds will cause mvn to behave strangely (or stranger than it
> > already does).
> >
> > I have started working on a workflow to run the server/testsuite/*
> > modules on "Geronimo Server" outputs, which should be close to
> > finished early next week.
> >
> > Some of these projects, those that generate Surefire reports, will
> > have a "Surefire_Report" attached to the buildlife.  This is a
> > consolidated report of all the tests for that project.  For example,
> > the last build of specs trunk had 300 tests (all passed).
> >
> >
> > NOTE: Screen shots are attached, since the current state of the install
> > may change as I configure and validate the setup.
> >
> > You can also see & download any of the outputs of the build.
> >
> >
> >  * * *
> >
> > Anyways, as mentioned, this all needs a little bit more love to get
> > closer to the perfect build automation system I have really been
> > trying to put together for our community.
> >
> > I should have, at the very least, nightly deploys of SNAPSHOTs hooked
> > up for G projects by early next week.  Then nightly website updates,
> > and then automated testsuite & TCK bits will follow shortly
> > afterwards... and eventually we could also use AH to generate the RC
> > and release builds, so that all builds are generated from the same
> > environment.  But probably sooner than that, we can promote G server
> > builds that pass the TCK or our testsuite, so that the exact binaries
> > used to build the CTS server or run the testsuite can be used by
> > others for more validation.
> >
> > I will send out email again later, with some brief wiki docs on what
> > all this AH jargon is, and how to spin off builds with a few clicks.
> >
> > --jason
> >
> >
> >
>
>
