geronimo-dev mailing list archives

From "Erin Mulder" <>
Subject Re: junit tests
Date Mon, 11 Aug 2003 19:55:43 GMT
Good call.

A matrix of this sort is critical to compliance and will probably end up
being the basis for a lot of project management.

In the long run, for each spec and all of its sections, it would be useful
to see:
    - Number of requirements
    - Percentage of requirements implemented
    - Percentage of requirements covered by tests
    - Percentage of requirements passing all tests

For each requirement, it would be useful to see (rough sketch below):
    - Development status
    - Design/development notes
    - Interpretation issues
    - References to all source code that helps to implement this requirement
    - All planned tests, even if not written
    - Tests written
    - Tests passing
    - Tests failing
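
To make that concrete, here's a rough sketch of what one record in such a
matrix might hold -- pure illustration, every name below is made up:

    import java.util.List;

    public class RequirementRecord {
        String spec;            // e.g. "EJB 2.1"
        String section;         // e.g. "7.5.3"
        String status;          // development status
        String notes;           // design notes / interpretation issues
        List sourceRefs;        // CVS paths implementing the requirement
        List plannedTests;      // planned but not yet written
        List writtenTests;      // written (passing or not)
        List passingTests;
        List failingTests;

        // The per-spec rollup falls out of a pile of these:
        //   % implemented = records marked done / total records
        //   % covered     = records with at least one written test / total
        //   % passing     = records whose written tests all pass / total
    }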

Obviously, this would all work best with some sort of database-backed
manager with CVS and automated testing integration.   How much of this sort
of thing can Maven handle?   Does it do any sort of requirements management?
If not, would it be worth extending/implementing something to handle this?

In any case, what if we get a head start using the Wiki?  There's already a
list of all the specs we're covering.   Perhaps we can expand that (or clone
it) into more of a status-type list, and start cataloguing the status of each
spec.  It would certainly help in visualizing our status and mobilizing the
forces for unit testing.
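
On the test side, one cheap convention that would make such a matrix nearly
self-maintaining: encode the spec and section number in the test class name,
so a script could harvest the requirement/test mapping straight from CVS.
A made-up example of the shape (class, method and section all invented):

    import junit.framework.TestCase;

    // Hypothetical convention: one TestCase per spec section, named
    // <Spec>_<Section>_Test, so the matrix can be generated by walking
    // the test source tree.
    public class Ejb21_Section7_5_3_Test extends TestCase {

        /** EJB 2.1, sect. 7.5.3: requirement text would be quoted here. */
        public void testRequirementHolds() throws Exception {
            // real assertions against the container go here
            assertTrue("placeholder until the real check is written", true);
        }
    }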


From: "n. alex rupp" <>
> In the past I've seen projects where the unit testing effort was poorly
> documented and it hindered development.  These projects would ask new
> developers to help write unit tests against JSR such-and-such, but nowhere
> was there a list or a website which broke down that JSR or specification and
> documented which parts of the spec had equivalent unit tests in the codebase,
> what they were named and who was responsible for them.
> I think, given the large and complex scope of J2EE compliance, that we
> should break down the spec requirements into bite-sized pieces and store
> them in a requirement-tracking database, which we could make available to
> the public through a web page.  On the web page we could list each
> requirement, its accompanying unit test (if any), perhaps some brief
> documentation on the issue, and maybe the names of people who've been
> working on it (for purposes of collaboration).
> Then, people who want to help with the project could go digging and find
> requirements with no tests or ones whose tests are broken, and write new
> ones for submission.  I know it's going to be tricky, but perhaps this will
> make it easier to go after compliance.
> Does that sound reasonable?
> --
> Alex (
