db-jdo-commits mailing list archives

From Apache Wiki <wikidi...@apache.org>
Subject [Jdo Wiki] Update of "TechnologyCompatibilityKit" by MichelleCaisse
Date Tue, 13 Sep 2005 23:47:11 GMT
Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Jdo Wiki" for change notification.

The following page has been changed by MichelleCaisse:
http://wiki.apache.org/jdo/TechnologyCompatibilityKit

The comment on the change is:
Draft How to Develop a Test

------------------------------------------------------------------------------
  
  = How to Develop a Test =
  
- These instructions assume that you already know how to check out and build JDO.  See SubversionRepository
for more information.
+ These instructions assume that you already know how to check out the JDO repository.  See
SubversionRepository for more information.
   1. Choose an assertion or group of related assertions to test (see JDO TCK Assertions,
below).
-  1. Choose a package, new or existing, for the tests.
-  1. Decide which assertions belong in a single test.
-  1. Obtain the test template, copy to the appropriate package, and edit as described below.
-  1. Choose a name for the test.
-  1. Decide which test superclass to subclass for this test.
+  1. Choose a package, new or existing, for the tests. If you are uncertain where to place
the test, raise a discussion on jdo-dev.
+  1. Decide which assertions belong in a single test. Group up to about six related assertions
in one test class, if appropriate. For example, a setter and getter pair should generally
be tested together, with separate assertions for each within the test case.
+  1. Choose a name for the test. Use a descriptive name in camel case. Browse the existing
tests for some examples. 
+  1. Obtain the [http://wiki.apache.org/jdo/TechnologyCompatibilityKit/TestTemplate test template], copy it into a file in the appropriate package, and replace the placeholders:
+ 
+   PACKAGE - The name of the package below org.apache.jdo.tck in which this test is placed.
+ 
+   TEST_SUPERCLASS - The name of the class that this test class extends
+ 
+   TITLE - A descriptive title for this test
+ 
+   KEYWORDS - A list of keywords describing the content of this test
+ 
+   ASSERTION_ID - The assertion id as listed in the assertions spreadsheet 
+ 
+   ASSERTION_TEXT - The assertion text as listed in the assertions spreadsheet
+ 
+   TEST_NAME - The name of this Java test
+ 
+   PC_CLASS - The name of the persistence capable class that the test instantiates. If none, delete localSetUp(). If more than one, add additional addTearDownClass(PC_CLASS.class) invocations.
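+ 
+   For orientation, a hypothetical filled-in skeleton is sketched below. The package, class name, keywords, assertion id, and pc class are illustrative only, and the header comment layout and the fail(ASSERTION_FAILED, ...) idiom follow conventions seen in existing tests rather than anything defined on this page; consult the template and neighboring tests for the authoritative form.
+ 
+   {{{
+ package org.apache.jdo.tck.api.persistencemanager;
+ 
+ import org.apache.jdo.tck.JDO_Test;
+ import org.apache.jdo.tck.pc.mylib.PCPoint;
+ 
+ /**
+  * <B>Title:</B> Make Persistent
+  * <B>Keywords:</B> persistencemanager makePersistent
+  * <B>Assertion ID:</B> A12.5.7-1
+  * <B>Assertion Description:</B> assertion text copied from the assertions spreadsheet
+  */
+ public class MakePersistent extends JDO_Test {
+ 
+     /** Message prefix used when reporting a failure of this assertion. */
+     private static final String ASSERTION_FAILED =
+         "Assertion A12.5.7-1 (MakePersistent) failed: ";
+ 
+     /** Register the pc class so that its extent is cleaned up by the tear-down machinery. */
+     protected void localSetUp() {
+         addTearDownClass(PCPoint.class);
+     }
+ 
+     /** One test method per related assertion, or per aspect of a single assertion. */
+     public void testMakePersistent() {
+         // exercise the API under test here and report problems with, for example,
+         // fail(ASSERTION_FAILED, "description of what went wrong");
+     }
+ }
+   }}}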
+ 
+  6. Decide which test superclass to extend for this test. If your test belongs to a package with its own superclass, use it; check what class the other test classes in the package extend. Otherwise, extend org.apache.jdo.tck.JDO_Test. If you are starting a new package, consider whether there are methods or fields that can be factored into a new superclass for the package.
-  1. Choose existing persistence capable classes from org/apache/jdo/tck/pc/* if your test
requires instantiating a pc class. In rare cases, existing pc classes may not be suitable
and you may write a new pc class or model.
+  1. Choose existing persistence capable classes from org/apache/jdo/tck/pc/* if your test requires instantiating a pc class. If no existing pc classes are suitable for your test, see Writing a Persistence Capable Class, below.
   1. Write the test (see Guidelines for Writing Test Code, below).
+  1. If the test requires them, provide a schema file, mapping file, xml test data, and a configuration file (see Configuration Files, Schemas, Metadata, and XML Test Data, below). Otherwise, write a temporary configuration file for debugging (for example, test/conf/myConfig.conf, the name used in the commands below) and add an entry to alltests.conf for this test. The temporary configuration file will look like this:
-  1. If the test requires, provide a schema file, mapping file, and xml test data (see Configuration
Files, Schemas, Metadata, and XML Test Data, below).
-  1. Write a configuration file. If your test requires a non-default schema, mapping, or xml test data, you will eventually check in the configuration file. In most cases, this is a temporary file for debugging your test and you will later add an entry to alltests.conf. In this case, add a file called onetest.conf to test/conf with the following content:
    {{{
  jdo.tck.description = Run one test for debugging
  jdo.tck.testdata = 
  jdo.tck.standarddata = 
  jdo.tck.mapping = 0
- jdo.tck.classes = org.apache.jdo.tck.<your package & test name>
+ jdo.tck.classes = org.apache.jdo.tck.your_package_and_test_name
  }}}
   1. Install the database schema for the test
    {{{
-   maven -Djdo.tck.cfglist=onetest.conf installSchema
+   maven -Djdo.tck.cfglist=myConfig.conf installSchema
    }}}
   1. Execute the test with
    {{{
-   maven -Djdo.tck.cfglist=onetest.conf runtck.jdori
+   maven -Djdo.tck.cfglist=myConfig.conf runtck.jdori
    }}}
   1. Debug the test.
-  1. When the test is ready for review, add an entry to test/conf/alltests.conf.
+  1. Run svn add for any new files you have created for check-in. Do '''not''' add your temporary
config file, if you needed one.
+  1. Execute the entire test suite to verify that your changes have not created any regressions.
+   {{{
+   maven build
+   }}}
   1. Create a patch and submit to jdo-dev for review. From the tck20 directory, do:
     {{{
     svn diff > myPatch.patch
@@ -71, +91 @@

  
  = Guidelines for Writing the Test Code =
  
- (more information needed here)
+ Use the [http://wiki.apache.org/geronimo/CodingStandards coding standards] used on the Geronimo
Project.
+ 
+ (more content to be provided)
+ 
+ == Testing Whether a Feature is Supported ==
+ 
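+ Some assertions apply only when the implementation supports an optional JDO feature. The sketch below is a hypothetical example that checks javax.jdo.PersistenceManagerFactory.supportedOptions() and simply returns when the feature is absent; the class name, option string, and the getPMF() helper are assumptions made for illustration, not definitions taken from this page.
+ 
+ {{{
+ import java.util.Collection;
+ 
+ import javax.jdo.PersistenceManagerFactory;
+ 
+ import org.apache.jdo.tck.JDO_Test;
+ 
+ /** Hypothetical example: skip the body of a test when an optional feature is absent. */
+ public class NontransactionalReadSupported extends JDO_Test {
+ 
+     public void testNontransactionalRead() {
+         PersistenceManagerFactory pmf = getPMF();
+         Collection options = pmf.supportedOptions();
+         if (!options.contains("javax.jdo.option.NontransactionalRead")) {
+             // the optional feature is not supported, so there is nothing to verify
+             return;
+         }
+         // ... exercise nontransactional read and verify the expected behavior ...
+     }
+ }
+ }}}
+ 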
+ == Using Facilities Provided by JDO_Test ==
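+ 
+ A minimal sketch, assuming JDO_Test offers a getPM() helper that returns a PersistenceManager created from the TCK-configured PersistenceManagerFactory; check JDO_Test itself for the facilities it actually provides.
+ 
+ {{{
+ import javax.jdo.PersistenceManager;
+ import javax.jdo.Transaction;
+ 
+ import org.apache.jdo.tck.JDO_Test;
+ 
+ /** Hypothetical sketch of a test method using the assumed getPM() helper. */
+ public class FacilitiesExample extends JDO_Test {
+ 
+     public void testSomething() {
+         PersistenceManager pm = getPM();
+         Transaction tx = pm.currentTransaction();
+         tx.begin();
+         // ... make instances persistent, run queries, and assert the expected results ...
+         tx.commit();
+     }
+ }
+ }}}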
  
  == Cleanup ==
  
@@ -98, +124 @@

  
  '''Note''': The order in which tear-down instances and classes are added is significant. The default implementation of "localTearDown" first deletes all added instances, in exactly the order in which they were added. Afterwards it deletes the extents of all added classes, again in the order in which they were added.
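+ 
+ A hypothetical sketch of registering tear-down work in localSetUp(); the Company and Department pc classes are only examples, and the ordering shown simply follows the note above.
+ 
+ {{{
+ import org.apache.jdo.tck.JDO_Test;
+ import org.apache.jdo.tck.pc.company.Company;
+ import org.apache.jdo.tck.pc.company.Department;
+ 
+ /** Hypothetical cleanup example; see the note above for the deletion order. */
+ public class CleanupExample extends JDO_Test {
+ 
+     protected void localSetUp() {
+         // If Department instances reference Company instances, register Department
+         // first so that its extent is deleted before the Company extent.
+         addTearDownClass(Department.class);
+         addTearDownClass(Company.class);
+     }
+ }
+ }}}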
  
+ == javadoc ==
+ 
+ Should we have javadoc for test methods?
+ 
  = Configuration Files, Schemas, Metadata, and XML Test Data =
+ 
+ (content to be provided)
+ 
+ = Writing a Persistence Capable Class =
  
  (content to be provided)
  
@@ -106, +140 @@

  
  This is the list of tasks required to complete the JDO 2.0 TCK.
   ||Activity wiki page||Description||Who||Expected Completion Date||
-  ||TestRunner||Rewrite maven.xml so that the same TCK tests can be run in multiple configurations.
For example, the same TCK test program needs to be run with and without security turned on,
and with application identity and datastore identity. When we add different mappings for Chapter
18 (ORM) tests, the same test will also need to be run with different mappings.||Michelle
Caisse||6/10/05||
+  ||TestRunner||Rewrite maven.xml so that the same TCK tests can be run in multiple configurations.
For example, the same TCK test program needs to be run with and without security turned on,
and with application identity and datastore identity. When we add different mappings for Chapter
18 (ORM) tests, the same test will also need to be run with different mappings.||Michelle
Caisse||6/10/05 done||
   ||["XMLMetadata"]||Develop xml metadata tests (Chapter 18)||Michelle Caisse & others
TBD||?/05||
   ||QueryTests11||Finish JDO 1 TCK query test classes||Michael Bouschen||done March 05||
  ||QueryTests||Write tests for the query language enhancements and new query APIs||Michael Bouschen||?/05||
   ||DetachedObjects||Design and write tests for detached objects||Matthew Adams||?/05||
   ||--||Rewrite examples in Chapter 15 to use the Company model; associate each example with
a specific test case via assertion conditional text.||Craig Russell||?/05||
   ||RunRules||Write the rules vendors must follow in running the TCK to demonstrate compliance
with the specification||Craig Russell||9/05||
-  ||TechnologyCompatibilityKit||Complete the list of assertions from the Proposed Final Draft||Michelle
Caisse||9/05||
+  ||AllTheOtherTests||Complete the list of assertions from the Proposed Final Draft, create
wiki page listing assertions needing to be implemented||Michelle Caisse||9/05 in progress||
-  ||WritingTestCases||Write a description of how to write a test case||Michelle Caisse||9/05||
+  ||TechnologyCompatibilityKit||Write a description of how to write a test case||Michelle
Caisse||9/05 in progress||
  
