geronimo-dev mailing list archives

From Christian Trutz <ch...@smilebase.org>
Subject Re: junit tests
Date Mon, 11 Aug 2003 20:15:06 GMT

hi folks,

some mock classes for org.apache.geronimo.Main would perhaps be very useful ...
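
something like this, perhaps (only a rough sketch -- the Lifecycle interface
below is made up, not the real Main API):

    import junit.framework.TestCase;

    // Lifecycle is a made-up stand-in for whatever interface Main really exposes
    interface Lifecycle {
        void start();
        void stop();
    }

    // hand-rolled mock that just records lifecycle calls, so tests of code
    // driving Main don't have to boot the whole server
    class MockMain implements Lifecycle {
        boolean started;
        boolean stopped;

        public void start() { started = true; }
        public void stop()  { stopped = true; }
    }

    public class MockMainTest extends TestCase {
        public void testStartRecorded() {
            MockMain main = new MockMain();
            main.start();              // in a real test the code under test would do this
            assertTrue(main.started);
        }
    }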

chris




On Mon, Aug 11, 2003 at 04:02:27PM -0400, Henri Yandell wrote:
> 
> Mmmm.
> 
> So we use meta-code to show which piece of code implements which part of
> the spec, and then we can generate documentation showing how much of the
> spec is being handled, and how well.
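> 
> A rough sketch of what that meta-code could look like, using a custom
> javadoc tag (the @geronimo.spec tag name and its attributes are just a
> guess, not an agreed convention):
> 
>     /**
>      * Container piece that implements one bite-sized spec requirement.
>      *
>      * @geronimo.spec spec="EJB 2.1" section="x.y.z" status="partial"
>      */
>     public class SomeContainerPiece {
>         // implementation of the requirement goes here
>     }
> 
> A custom doclet (or an XDoclet-style tag handler) could then sweep the
> source tree, collect the tags, and spit out a coverage report per spec
> section.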
> 
> Hen
> 
> On Mon, 11 Aug 2003, Erin Mulder wrote:
> 
> > Good call.
> >
> > A matrix of this sort is critical to compliance and will probably end up
> > being the basis for a lot of project management.
> >
> > In the long run, for each spec and all of its sections, it would be useful
> > to see:
> >     - Number of requirements
> >     - Percentage of requirements implemented
> >     - Percentage of requirements covered by tests
> >     - Percentage of requirements passing all tests
> >
> > For each requirement, it would be useful to see:
> >     - Development status
> >     - Design/development notes
> >     - Interpretation issues
> >     - References to all source code that helps to implement this requirement
> >     - All planned tests, even if not written
> >     - Tests written
> >     - Tests passing
> >     - Tests failing
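> >
> > To make that concrete, one entry in such a matrix might carry fields along
> > these lines (the names here are only a sketch, not a proposed schema):
> >
> >     // one row of the requirements matrix, roughly
> >     public class SpecRequirement {
> >         String spec;            // e.g. "EJB 2.1"
> >         String section;         // spec section the requirement comes from
> >         String summary;         // one-line statement of the requirement
> >         String status;          // development status
> >         String notes;           // design notes / interpretation issues
> >         String[] sourceRefs;    // source code that helps implement it
> >         String[] testsPlanned;  // planned tests, even if not written
> >         String[] testsWritten;  // tests written so far
> >         String[] testsPassing;  // currently passing
> >         String[] testsFailing;  // currently failing
> >     }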
> >
> > Obviously, this would all work best with some sort of database-backed
> > project manager with CVS and automated testing integration.  How much of
> > this sort of thing can Maven handle?  Does it do any sort of requirements
> > management?  If not, would it be worth extending/implementing something to
> > handle this?
> >
> > In any case, what if we get a head start using the Wiki?  There's already a
> > list of all the specs we're covering.  Perhaps we can expand that (or clone
> > it) for more of a status type list, and start cataloguing the status of each
> > spec section.  It would certainly help in visualizing our status and
> > mobilizing the forces for unit tests.
> >
> > Cheers,
> > Erin
> >
> > From: "n. alex rupp" <rupp0035@umn.edu>
> > > In the past I've seen projects where the unit testing effort was poorly
> > > documented and it hindered development.  These projects would ask new
> > > developers to help write unit tests against JSR such-and-such, but nowhere
> > > was there a list or a website which broke down that JSR or specification
> > > and documented which parts of the spec had equivalent unit tests in the
> > > project, what they were named and who was responsible for them.
> > >
> > > I think, given the large and complex scope of J2EE compliance, that we
> > > should break down the spec requirements into bite-sized pieces and store
> > > them in a requirements tracking database, which we could make available
> > > to the public through a web page.  On the web page we could list each
> > > requirement, its accompanying unit test (if any), perhaps some brief
> > > documentation on the issue, and maybe the names of people who've been
> > > working on it (for purposes of collaboration).
> > >
> > > Then, people who want to help with the project could go digging and find
> > > requirements with no tests or ones whose tests are broken, and write new
> > > ones for submission.  I know it's going to be tricky, but perhaps this
> > > might make it easier to go after compliance.
> > >
> > > Does that sound reasonable?
> > > --
> > > Alex (n_alex_rupp@users.sf.net)
> >
