xml-general mailing list archives

From Shane_Curc...@lotus.com
Subject RE: Test Infrastructure Project Proposal - Re:Kelly - minitest/smoketest/acceptance test
Date Mon, 12 Feb 2001 22:22:29 GMT
Although it's a bit of a tangent, here's one way that I've seen projects
separate kinds of testing organizationally.
   ---- Kelly Campbell <camk@channelpoint.com> wrote: ----
   > I think another requirement that you've covered in a roundabout way,
   > but should be added explicitly, is that the test suites should be
   > easily runnable by individual developers, and the full suite should
   > be run prior to committing code changes.

   - Yes, it must be easy for developers to write individual new tests, to
   run them, and to run any existing tests.
   - Yes, some set of tests should be voted on by each individual project
   as its checkin criteria, i.e. the 'Minitest' to run.
   - As to running *all* tests for every checkin: if you run the suite
   against every currently implemented ProcessorWrapper 'flavor', you'd
   spend nearly an hour on a P-500 Win32 machine running tests for Xalan.
   A little bit much, eh?

   Below are a few definitions I've worked on with several different
   proprietary software teams to categorize testing efforts.  Obviously,
   many of the nuances and roles & responsibilities are *significantly*
   different in the open source world!  But it might be useful for some
   people to see one proven way to slice & dice your testing efforts.

   ---- Shane's 1-2-3 definitions of test sets ----
   Minitest
   "automated tests all developers must run and pass before checking in any
   code"

   Constraints: must execute in a limited amount of time (anywhere between
   30 seconds and 5 minutes, traditionally), have no extra dependencies,
   especially environment dependencies, and be thoroughly documented and
   simple, one step, to execute.  I.e. the barrier to entry/usage must be
   low so that we can get every developer to use it.
   There are several important maintenance points to the Minitest:
   - Every posted build must pass the Minitest posted with it as-is when
   posted.  This ensures that if a developer runs it and it fails, the
   failure was caused by the developer's changes and not by any other
   problem.
   - The Minitest and its documentation must be maintained regularly -
   nothing annoys developers more than a Minitest that keeps changing or
   is hard to use, and that is a prime reason for development staff
   abandoning it.
   Xalan notes: already (partially) implemented in Xalan-J 2.x:
   xml-xalan\java\build.xml target 'minitest' using the normal development
   Ant makefile
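
   To make the constraints above concrete, here's a rough sketch of what
   such a 'minitest' target could look like in an Ant build file.  The
   target name comes from the Xalan notes above, but the test class name,
   the 'compile' dependency, the classpath reference, and the 5-minute
   timeout are my own assumptions, not the actual xml-xalan\java\build.xml
   contents:

   ```xml
   <!-- Hypothetical sketch only; the real target lives in
        xml-xalan\java\build.xml and may differ. -->
   <target name="minitest" depends="compile"
           description="Fast checkin tests every developer must pass">
     <!-- fork="yes" is required for timeout to take effect;
          300000 ms enforces the ~5 minute upper bound. -->
     <java classname="org.apache.xalan.test.Minitest"
           fork="yes" failonerror="yes" timeout="300000">
       <classpath refid="test.classpath"/>
     </java>
   </target>
   ```

   The point of failonerror and the timeout is that the target itself
   enforces the "limited time, one step" constraints rather than relying
   on developers to remember them.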


   Smoketest
   "tests the build team/lab must run and pass before posting any builds
   for development (or any other) use"

   Constraints: must execute in a somewhat limited amount of time (between
   5 and 15-30 minutes, traditionally), have few, well-documented extra
   dependencies, and be thoroughly documented and simple to execute.
   I.e. the barrier to entry must be moderate, mainly so that maintenance
   of the test structure is kept to a minimum.

   Xalan notes: already (partially) implemented in Xalan-J 2.x:
   xml-xalan\java\build.xml target 'smoketest' using the normal development
   Ant makefile, although this needs updating and better definition.
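
   Putting the two gates together, the day-to-day flow might look roughly
   like the sketch below.  The 'ant minitest' and 'ant smoketest'
   invocations follow from the build.xml targets mentioned above; the use
   of cvs and the exact wrapper logic are just my guesses at the local
   setup:

   ```shell
   # Hypothetical checkin flow -- a sketch, not the project's actual script.
   cd xml-xalan/java

   # Developers: run the fast checkin gate before committing.
   if ant minitest; then
       cvs commit -m "my change"
   else
       echo "minitest failed -- fix before committing" >&2
   fi

   # Build team: run the longer gate before posting a build.
   ant smoketest
   ```

   The key design point is that the Minitest gates every commit while the
   Smoketest gates only posted builds, so the expensive runs happen far
   less often.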


   Acceptance test
   "tests the QA/QE team will run and require to pass before posting any
   build for general QA/QE use"

   Constraints: whatever the QA/QE team decides upon, since it's normally
   run by the testers themselves.

   ---- Shane's 1-2-3 definitions of test sets ----

   -Shane


