Date: Tue, 18 Jul 2006 14:16:40 +0100
From: Oliver Deakin <oliver.deakin@googlemail.com>
To: harmony-dev@incubator.apache.org
Subject: Re: [classlib] Testing conventions - a proposal

George Harley wrote:
> Oliver Deakin wrote:
>> George Harley wrote:
>>>
>>> Here the annotation on MyTestClass applies to all of its test
>>> methods.
>>>
>>> So what are the well-known TestNG groups that we could define for
>>> use inside Harmony ? Here are some of my initial thoughts:
>>>
>>> * type.impl -- tests that are specific to Harmony
>>
>> So tests are implicitly API unless specified otherwise?
>>
>> I'm slightly confused by your definition of impl tests as "tests
>> that are specific to Harmony". Does this mean that impl tests are
>> only those that test classes in org.apache.harmony packages?
>> I thought that impl was our way of saying "tests that need to go on
>> the bootclasspath".
>>
>> I think I just need a little clarification...
>
> Hi Oliver,
>
> I was using the definition of implementation-specific tests that we
> currently have on the Harmony testing conventions web page. That is,
> implementation-specific tests are those that are dependent on some
> aspect of the Harmony implementation and would therefore not pass
> when run against the RI or other conforming implementations. It's
> orthogonal to the classpath/bootclasspath issue.

OK, that's what I imagined you meant. IMHO using api and impl in this
way makes the most sense (since, as you say, they do not really relate
to the classpath/bootclasspath issue).

So do we also need a pair of groups for classpath/bootclasspath tests?
I'm assuming that this is how we would handle this distinction, rather
than organising them into separate directories in the file system.
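Just to check we are picturing the same mechanism, I imagine the
tagging side would look roughly like the sketch below. It is untested,
the TestNG syntax is written from memory, and the class name and the
"classpath.boot" group are only placeholders for whatever names we
eventually agree on:

    import org.testng.annotations.Test;

    // Class-level annotation: every public test method in the class is
    // put into both groups, so a Harmony-specific test that has to run
    // on the bootclasspath is tagged once rather than per method.
    @Test(groups = { "type.impl", "classpath.boot" })
    public class HashMapImplTest {

        public void testHarmonySpecificBehaviour() {
            // ... test body ...
        }

        public void testAnotherHarmonySpecificDetail() {
            // ... test body ...
        }
    }

If I have read the TestNG docs correctly, a method-level
@Test(groups = ...) can still add further groups to an individual
method, which seems to cover the "down to the individual method" point
from your original note.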
>>> * state.broken.<platform> -- tests broken on a specific platform
>>>
>>> * state.broken -- tests broken on every platform but we want to
>>> decide whether or not to run from our suite configuration
>>>
>>> * os.<platform> -- tests that are to be run only on the
>>> specified platform (a test could be a member of more than one of
>>> these)
>>
>> And the defaults for these are an unbroken state and running on any
>> platform. That makes sense...
>>
>> Will the platform ids be organised in a similar way to the platform
>> ids we've discussed before for organisation of native code [1]?
>
> The actual string used to identify a particular platform can be
> whatever we want it to be, just so long as we are consistent. So,
> yes, the ids mentioned in the referenced email would seem a good
> starting point. Do we need to include a 32-bit/64-bit identifier ?

I cannot immediately think of any obvious 32/64-bit specific tests
that we might require in the future (although I'd be interested to
know if anyone can think of any!). However, if the need did arise,
then I would suggest that it is incorporated as another tag on the end
of the group name, e.g. os.linux.ppc.32.

>> So all tests are, by default, in an all-platforms (or shared) group.
>> If a test fails on all Windows platforms, it is marked with
>> state.broken.windows.
>> If a test fails on Windows but only on, say, amd hardware,
>> it is marked state.broken.windows.amd.
>
> Yes. Agreed.
>
>> Then when you come to run tests on your windows amd machine,
>> you want to include all tests in the all-platform (shared) group,
>> os.windows and os.windows.amd, and exclude all tests in
>> the state.broken, state.broken.windows and state.broken.windows.amd
>> groups.
>>
>> Does this tally with what you were thinking?
>
> Yes, that is the idea.
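Good. So on a Windows/AMD machine the run configuration (whether that
ends up in a suite XML file, an Ant task or a small launcher class)
boils down to including the shared group plus the os groups for that
box, and excluding the matching state.broken groups. In programmatic
form I imagine it reduces to something like the sketch below - again
untested and from my reading of the TestNG docs, with "shared" just a
placeholder for whatever we call the default all-platforms group, and
the test class being the hypothetical one from my earlier sketch:

    import org.testng.TestNG;

    public class RunTestsOnWindowsAmd {
        public static void main(String[] args) {
            TestNG testng = new TestNG();

            // Whatever classes make up the module's test suite.
            testng.setTestClasses(new Class[] { HashMapImplTest.class });

            // Include the default group plus this machine's platform
            // groups...
            testng.setGroups("shared,os.windows,os.windows.amd");

            // ...and keep out anything known to be broken here.
            testng.setExcludedGroups(
                "state.broken,state.broken.windows,state.broken.windows.amd");

            testng.run();
        }
    }

The same include/exclude lists could equally live in the suite XML, so
switching a group on or off would not mean touching the test source at
all.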
>>>
>>> What does everyone else think ? Does such a scheme sound
>>> reasonable ?
>>
>> I think so - it seems to cover our current requirements. Thanks for
>> coming up with this!
>
> Thanks, but I don't see it as final yet really. It would be great to
> prove the worth of this by doing a trial on one of the existing
> modules, ideally something that contains tests that are
> platform-specific.

Thanks for volunteering... ;)

...but seriously, do any of our modules currently contain
platform-specific tests? Have you attempted a TestNG trial on any of
the modules (with or without platform-specific tests) and, if so, was
it simpler/harder/better/worse than our current setup?

Regards,
Oliver

> Best regards,
> George
>
>> Regards,
>> Oliver
>>
>> [1]
>> http://mail-archives.apache.org/mod_mbox/incubator-harmony-dev/200605.mbox/%3c44687AAA.5080302@googlemail.com%3e
>>
>>> Thanks for reading this far.
>>>
>>> Best regards,
>>> George
>>>
>>>
>>> George Harley wrote:
>>>> Hi,
>>>>
>>>> Just seen Tim's note on test support classes and it really caught
>>>> my attention as I have been mulling over this issue for a little
>>>> while now. I think that it is a good time for us to return to the
>>>> topic of class library test layouts.
>>>>
>>>> The current proposal [1] sets out to segment our different types
>>>> of test by placing them in different file locations. After looking
>>>> at the recent changes to the LUNI module tests (where the layout
>>>> guidelines were applied) I have a real concern that there are
>>>> serious problems with this approach. We have started down a track
>>>> of just continually growing the number of test source folders as
>>>> new categories of test are identified, and IMHO that is going to
>>>> bring complexity and maintenance issues with these tests.
>>>>
>>>> Consider the dimensions of tests that we have ...
>>>>
>>>> API
>>>> Harmony-specific
>>>> Platform-specific
>>>> Run on classpath
>>>> Run on bootclasspath
>>>> Behaves differently between Harmony and the RI
>>>> Stress
>>>> ...and so on...
>>>>
>>>> If you weigh up all of the different possible permutations, and
>>>> then consider that the above list is highly likely to be extended
>>>> as things progress, it is obvious that we are eventually heading
>>>> for large amounts of related test code scattered or possibly
>>>> duplicated across numerous "hard wired" source directories. How
>>>> maintainable is that going to be ?
>>>>
>>>> If we want to run different tests in different configurations then
>>>> IMHO we need to be thinking a whole lot smarter. We need to be
>>>> thinking about keeping tests for specific areas of functionality
>>>> together (thus easing maintenance); we need something quick and
>>>> simple to re-configure if necessary (pushing whole directories of
>>>> files around the place does not seem a particularly lightweight
>>>> approach); and something that is not going to potentially mess up
>>>> contributed patches when the file they patch is found to have been
>>>> recently pushed from source folder A to B.
>>>>
>>>> To connect into another recent thread, there have been some posts
>>>> lately about handling test methods that fail on Harmony, which has
>>>> meant that entire test case classes have been excluded from our
>>>> test runs. I have also been noticing some API test methods that
>>>> pass fine on Harmony but fail when run against the RI. Are the
>>>> different behaviours down to errors in the Harmony implementation ?
>>>> An error in the RI implementation ? A bug in the RI Javadoc ? Only
>>>> after some investigation has been carried out do we know for sure.
>>>> That takes time. What do we do with the test methods in the
>>>> meantime ? Do we push them round the file system into yet another
>>>> new source folder ? IMHO we need a testing strategy that enables
>>>> such "problem" methods to be tracked easily without disruption to
>>>> the rest of the other tests.
>>>>
>>>> A couple of weeks ago I mentioned that the TestNG framework [2]
>>>> seemed like a reasonably good way of allowing us to both group
>>>> together different kinds of tests and permit the exclusion of
>>>> individual tests/groups of tests [3]. I would like to strongly
>>>> propose that we consider using TestNG as a means of providing the
>>>> different test configurations required by Harmony.
>>>> Using a combination of annotations and XML to capture the kinds of
>>>> sophisticated test configurations that people need, and that allows
>>>> us to specify down to the individual method, has got to be more
>>>> scalable and flexible than where we are headed now.
>>>>
>>>> Thanks for reading this far.
>>>>
>>>> Best regards,
>>>> George
>>>>
>>>> [1]
>>>> http://incubator.apache.org/harmony/subcomponents/classlibrary/testing.html
>>>> [2] http://testng.org
>>>> [3]
>>>> http://mail-archives.apache.org/mod_mbox/incubator-harmony-dev/200606.mbox/%3c44A163B3.6080005@googlemail.com%3e

--
Oliver Deakin
IBM United Kingdom Limited

---------------------------------------------------------------------
Terms of use : http://incubator.apache.org/harmony/mailing.html
To unsubscribe, e-mail: harmony-dev-unsubscribe@incubator.apache.org
For additional commands, e-mail: harmony-dev-help@incubator.apache.org