geronimo-dev mailing list archives

From Bruce Snyder <>
Subject Re: junit tests
Date Mon, 11 Aug 2003 21:20:06 GMT
This one time, at band camp, Erin Mulder said:

EM>A matrix of this sort is critical to compliance and will probably end up
EM>being the basis for a lot of project management.
EM>In the long run, for each spec and all of its sections, it would be useful
EM>to see:
EM>    - Number of requirements
EM>    - Percentage of requirements implemented
EM>    - Percentage of requirements covered by tests
EM>    - Percentage of requirements passing all tests
EM>For each requirement, it would be useful to see:
EM>    - Development status
EM>    - Design/development notes
EM>    - Interpretation issues
EM>    - References to all source code that helps to implement this requirement
EM>    - All planned tests, even if not written
EM>    - Tests written
EM>    - Tests passing
EM>    - Tests failing
EM>Obviously, this would all work best with some sort of database-backed
EM>manager with CVS and automated testing integration.   How much of this sort
EM>of thing can Maven handle?   Does it do any sort of requirements management?
EM>If not, would it be worth extending/implementing something to handle this?
EM>In any case, what if we get a head start using the Wiki?  There's already a
EM>list of all the specs we're covering.   Perhaps we can expand that (or clone
EM>it) into more of a status-type list, and start cataloguing the status of each
EM>spec.  It would certainly help in visualizing our status and mobilizing the
EM>forces for unit testing.
EM>From: "n. alex rupp" <>
EM>> In the past I've seen projects where the unit testing effort was poorly
EM>> documented, and it hindered development.  These projects would ask new
EM>> developers to help write unit tests against JSR such-and-such, but nowhere
EM>> was there a list or a website which broke down that JSR or specification and
EM>> documented which parts of the spec had equivalent unit tests in the codebase,
EM>> what they were named, and who was responsible for them.
EM>> I think, given the large and complex scope of J2EE compliance, that we
EM>> should break down the spec requirements into bite-sized pieces and store
EM>> them in a requirement-tracking database, which we could make available to
EM>> the public through a web page.  On the web page we could list each
EM>> requirement, its accompanying unit test (if any), perhaps some brief
EM>> documentation on the issue, and maybe the names of people who've been
EM>> working on it (for purposes of collaboration).
EM>> Then, people who want to help with the project could go digging, find
EM>> requirements with no tests or ones whose tests are broken, and write new
EM>> ones for submission.  I know it's going to be tricky, but perhaps this will
EM>> make it easier to go after compliance.
EM>> Does that sound reasonable?


This is a very good idea, especially since the whole goal is to be
certified. I was planning to start writing tests tonight. Maybe
we should all sync up on what needs to be done to achieve the above
suggestion before we begin writing tests willy-nilly.


Clover will provide us with the coverage status of which you speak. But
breaking down each requirement will certainly take a lot more work. It
would be very nice to automate this work because it is so generic.
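As an illustration of how generic that automation could be, the sketch below scans test source for a doc tag naming the requirement each test covers and tallies tests per requirement. The `@spec` tag and the `EJB-*` IDs are invented for this example; nothing here is an existing Geronimo or Clover feature.

```java
import java.util.Map;
import java.util.TreeMap;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical sketch: count how many tests claim each spec requirement,
// assuming tests carry an invented "@spec <requirement-id>" doc tag.
public class SpecCoverageReport {

    // Scan source text for @spec tags and tally tests per requirement ID.
    static Map<String, Integer> countSpecTags(String source) {
        Map<String, Integer> counts = new TreeMap<>();
        Matcher m = Pattern.compile("@spec\\s+(\\S+)").matcher(source);
        while (m.find()) {
            counts.merge(m.group(1), 1, Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        // Stand-in for reading real test sources off disk.
        String sample =
            "/** @spec EJB-6.5.1 */ public void testCreate() {}\n" +
            "/** @spec EJB-6.5.1 */ public void testRemove() {}\n" +
            "/** @spec EJB-7.2.3 */ public void testPassivate() {}\n";
        System.out.println(countSpecTags(sample));
    }
}
```

Requirements with a count of zero are exactly the "no tests yet" entries new contributors could pick up, which is the matrix Erin described.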

Sure, in the meantime, we could use the Wiki to track it manually,
but we will need to consider how the Wiki list will be converted to the
final solution.

I'd like to get started writing some tests tonight and get them checked
in, so let's decide what we're going to do so that efforts don't go in
the wrong direction or, even worse, get wasted.

perl -e 'print unpack("u30","<0G)U8V4\@4VYY9&5R\"F9E<G)E=\$\!F<FEI+F-O;0\`\`");'
