Message-ID: <44BE6FC6.8060603@googlemail.com>
Date: Wed, 19 Jul 2006 18:45:42 +0100
From: George Harley <george.c.harley@googlemail.com>
Reply-To: harmony-dev@incubator.apache.org
To: harmony-dev@incubator.apache.org
Subject: Re: [classlib] Testing conventions - a proposal

Hi Alexei,

I have just downloaded the latest working build of TestNG 5.0 [1], and
support for the "jvm" attribute is in there. Note that this is not the
official release build.

Best regards,
George

[1] http://testng.org/testng-5.0.zip


Alexei Zakharov wrote:
> Hi George,
>
> Agreed, we may experience problems if the VM hangs or crashes; I am
> suggesting this only as a temporary solution. BTW, the fact that the
> TestNG Ant task still doesn't have such attributes looks like a
> warning sign to me - TestNG may still be immature in some respects. I
> am still comparing TestNG and JUnit.
>
> Regards,
>
> 2006/7/19, George Harley :
>> Hi Alexei,
>>
>> It's encouraging to hear that (Ant + TestNG + sample tests) all
>> worked fine together on Harmony.
>> In answer to your question, I suppose that the ability to fork the
>> tests into a separate VM means that we do not run the risk of
>> possible bugs in Harmony affecting the test harness and therefore
>> the outcome of the tests.
>>
>> Best regards,
>> George
>>
>>
>> Alexei Zakharov wrote:
>> > Probably my previous message was not clear enough.
>> > Why can't we just invoke everything, including Ant, on top of
>> > Harmony for now? At least I was able to build and run the test-14
>> > examples from the TestNG 4.7 distribution solely on top of j9 +
>> > our classlib today.
>> >
>> > C:\Java\testng-4.7\test-14>set JAVA_HOME=c:\Java\harmony\enhanced\classlib\trunk\deploy\jdk\jre
>> >
>> > C:\Java\testng-4.7\test-14>ant -Dbuild.compiler=org.eclipse.jdt.core.JDTCompilerAdapter run
>> > Buildfile: build.xml
>> >
>> > prepare:
>> >
>> > compile:
>> >      [echo] -- Compiling JDK 1.4 tests --
>> >
>> > run:
>> >      [echo] -- Running JDK 1.4 tests --
>> >      [echo] -- testng-4.7-jdk14.jar --
>> >
>> > [testng-14] ===============================================
>> > [testng-14] TestNG JDK 1.4
>> > [testng-14] Total tests run: 179, Failures: 10, Skips: 0
>> > [testng-14] ===============================================
>> > ...
>> >
>> > Exactly the same results as with Sun JDK 1.4.
>> > Note: you may need to tweak the build.xml a little to achieve
>> > this.
>> >
>> > Thanks,
>> >
>> > 2006/7/19, George Harley :
>> >> Hi Richard,
>> >>
>> >> Actually, the Ant task always runs the tests in a forked VM. At
>> >> present, however, the task does not support specifying the forked
>> >> VM (i.e. there is no equivalent of the JUnit Ant task's "jvm"
>> >> attribute). This matter has already been raised with the TestNG
>> >> folks, who seem happy to introduce it.
>> >>
>> >> In the meantime we could run the tests using the Ant java task.
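[Illustration: George's interim suggestion — driving TestNG through the
plain Ant <java> task so that the forked JVM can be chosen — might look
roughly like the sketch below. The property names, paths, and suite
file name are assumptions for illustration, not project conventions;
org.testng.TestNG is TestNG's real command-line entry point.]

```xml
<!-- Sketch: run TestNG via Ant's <java> task so the forked JVM can be
     pointed at Harmony. ${harmony.jdk}, ${testng.jar},
     ${test.classes.dir} and ${test.report.dir} are illustrative. -->
<target name="test" depends="compile">
  <java classname="org.testng.TestNG"
        fork="true"
        jvm="${harmony.jdk}/jre/bin/java"
        failonerror="true">
    <classpath>
      <pathelement location="${testng.jar}"/>
      <pathelement location="${test.classes.dir}"/>
    </classpath>
    <arg value="-d"/>
    <arg value="${test.report.dir}"/>
    <arg value="testng.xml"/>
  </java>
</target>
```

The key point is that `fork="true"` plus the `jvm` attribute lets the
build select any VM executable, which is exactly what the TestNG Ant
task itself could not yet do at the time of this thread.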
>> >>
>> >> Best regards,
>> >> George
>> >>
>> >>
>> >> Richard Liang wrote:
>> >> > According to "TestNG Ant Task" [1], it seems that the TestNG
>> >> > Ant task does not support forking a new JVM; that is, we must
>> >> > launch Ant using Harmony itself. Any comments? Thanks a lot.
>> >> >
>> >> > [1] http://testng.org/doc/ant.html
>> >> >
>> >> > Best regards,
>> >> > Richard
>> >> >
>> >> > George Harley wrote:
>> >> >> Andrew Zhang wrote:
>> >> >>> On 7/18/06, George Harley wrote:
>> >> >>>>
>> >> >>>> Oliver Deakin wrote:
>> >> >>>> > George Harley wrote:
>> >> >>>> >>
>> >> >>>> >> Here the annotation on MyTestClass applies to all of its
>> >> >>>> >> test methods.
>> >> >>>> >>
>> >> >>>> >> So what are the well-known TestNG groups that we could
>> >> >>>> >> define for use inside Harmony? Here are some of my
>> >> >>>> >> initial thoughts:
>> >> >>>> >>
>> >> >>>> >> * type.impl -- tests that are specific to Harmony
>> >> >>>> >
>> >> >>>> > So tests are implicitly API unless specified otherwise?
>> >> >>>> >
>> >> >>>> > I'm slightly confused by your definition of impl tests as
>> >> >>>> > "tests that are specific to Harmony". Does this mean that
>> >> >>>> > impl tests are only those that test classes in
>> >> >>>> > org.apache.harmony packages? I thought that impl was our
>> >> >>>> > way of saying "tests that need to go on the
>> >> >>>> > bootclasspath".
>> >> >>>> >
>> >> >>>> > I think I just need a little clarification...
>> >> >>>> >
>> >> >>>>
>> >> >>>> Hi Oliver,
>> >> >>>>
>> >> >>>> I was using the definition of implementation-specific tests
>> >> >>>> that we currently have on the Harmony testing conventions
>> >> >>>> web page. That is, implementation-specific tests are those
>> >> >>>> that depend on some aspect of the Harmony implementation and
>> >> >>>> would therefore not pass when run against the RI or other
>> >> >>>> conforming implementations.
>> >> >>>> It's orthogonal to the classpath/bootclasspath issue.
>> >> >>>>
>> >> >>>> >> * state.broken.<platform> -- tests broken on a specific
>> >> >>>> >>   platform
>> >> >>>> >>
>> >> >>>> >> * state.broken -- tests broken on every platform, but
>> >> >>>> >>   where we want to decide whether or not to run them from
>> >> >>>> >>   our suite configuration
>> >> >>>> >>
>> >> >>>> >> * os.<platform> -- tests that are to be run only on the
>> >> >>>> >>   specified platform (a test could be a member of more
>> >> >>>> >>   than one of these)
>> >> >>>> >
>> >> >>>> > And the defaults for these are an unbroken state and runs
>> >> >>>> > on any platform. That makes sense...
>> >> >>>> >
>> >> >>>> > Will the platform ids be organised in a similar way to the
>> >> >>>> > platform ids we've discussed before for the organisation
>> >> >>>> > of native code [1]?
>> >> >>>> >
>> >> >>>>
>> >> >>>> The actual string used to identify a particular platform can
>> >> >>>> be whatever we want it to be, just so long as we are
>> >> >>>> consistent. So, yes, the ids mentioned in the referenced
>> >> >>>> email would seem a good starting point. Do we need to
>> >> >>>> include a 32-bit/64-bit identifier?
>> >> >>>>
>> >> >>>> > So all tests are, by default, in an all-platforms (or
>> >> >>>> > shared) group. If a test fails on all Windows platforms,
>> >> >>>> > it is marked with state.broken.windows. If a test fails on
>> >> >>>> > Windows but only on, say, AMD hardware, it is marked
>> >> >>>> > state.broken.windows.amd.
>> >> >>>> >
>> >> >>>>
>> >> >>>> Yes. Agreed.
>> >> >>>>
>> >> >>>> > Then when you come to run tests on your Windows AMD
>> >> >>>> > machine, you want to include all tests in the all-platform
>> >> >>>> > (shared) group, os.windows and os.windows.amd, and exclude
>> >> >>>> > all tests in the state.broken, state.broken.windows and
>> >> >>>> > state.broken.windows.amd groups.
>> >> >>>> >
>> >> >>>> > Does this tally with what you were thinking?
>> >> >>>> >
>> >> >>>>
>> >> >>>> Yes, that is the idea.
>> >> >>>>
>> >> >>>> >>
>> >> >>>> >> What does everyone else think? Does such a scheme sound
>> >> >>>> >> reasonable?
>> >> >>>> >
>> >> >>>> > I think so - it seems to cover our current requirements.
>> >> >>>> > Thanks for coming up with this!
>> >> >>>> >
>> >> >>>>
>> >> >>>> Thanks, but I don't see it as final yet, really. It would be
>> >> >>>> great to prove the worth of this by doing a trial on one of
>> >> >>>> the existing modules, ideally something that contains tests
>> >> >>>> that are platform-specific.
>> >> >>>
>> >> >>> Hello George, how about doing a trial on the NIO module?
>> >> >>>
>> >> >>> So far as I know, there are several platform-dependent tests
>> >> >>> in the NIO module. :)
>> >> >>>
>> >> >>> The assert statements are commented out in these tests, with
>> >> >>> a "FIXME" mark.
>> >> >>>
>> >> >>> Furthermore, I have also found some platform-dependent
>> >> >>> behaviours of FileChannel. If TestNG is applied to NIO, I
>> >> >>> will contribute new tests for FileChannel and fix the bugs in
>> >> >>> the source code.
>> >> >>>
>> >> >>> What's your opinion? Any suggestions/comments?
>> >> >>>
>> >> >>> Thanks!
>> >> >>>
>> >> >>
>> >> >> Hi Andrew,
>> >> >>
>> >> >> That sounds like a very good idea. If there is agreement in
>> >> >> the project that 5.0 annotations are the way to go (as opposed
>> >> >> to the pre-5.0 Javadoc comment support offered by TestNG)
>> >> >> then, to the best of my knowledge, all that is stopping us
>> >> >> from doing this trial is the lack of a 5.0 VM to run the
>> >> >> Harmony tests on. Hopefully that will be addressed soon. When
>> >> >> it is, I would be happy to get stuck into this trial.
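[Illustration: the include/exclude selection described above reduces to
two set checks — a test runs if it belongs to at least one included
group and to no excluded group. A minimal plain-Java sketch of that
rule follows; the class and method names are invented for illustration
and this is not the TestNG API itself, only the same semantics.]

```java
import java.util.Set;

// Sketch of the group-selection rule from the thread: a test method
// runs when it is in at least one included group and in no excluded
// group. Mirrors TestNG include/exclude semantics; names are
// illustrative.
public class GroupSelector {

    public static boolean shouldRun(Set<String> testGroups,
                                    Set<String> included,
                                    Set<String> excluded) {
        boolean isIncluded = testGroups.stream().anyMatch(included::contains);
        boolean isExcluded = testGroups.stream().anyMatch(excluded::contains);
        return isIncluded && !isExcluded;
    }

    public static void main(String[] args) {
        // A Windows/AMD run as described in the thread:
        Set<String> included = Set.of("shared", "os.windows", "os.windows.amd");
        Set<String> excluded = Set.of("state.broken",
                                      "state.broken.windows",
                                      "state.broken.windows.amd");

        // An ordinary shared test is selected...
        System.out.println(shouldRun(Set.of("shared"), included, excluded));     // true
        // ...a Windows-only test is selected...
        System.out.println(shouldRun(Set.of("os.windows"), included, excluded)); // true
        // ...a test known broken on Windows/AMD is filtered out...
        System.out.println(shouldRun(Set.of("os.windows", "state.broken.windows.amd"),
                                     included, excluded));                       // false
        // ...and a Linux-only test is simply never included.
        System.out.println(shouldRun(Set.of("os.linux"), included, excluded));   // false
    }
}
```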
>> >> >>
>> >> >> Best regards,
>> >> >> George
>> >> >>
>> >> >>> Best regards,
>> >> >>>> George
>> >> >>>>
>> >> >>>> > Regards,
>> >> >>>> > Oliver
>> >> >>>> >
>> >> >>>> > [1]
>> >> >>>> > http://mail-archives.apache.org/mod_mbox/incubator-harmony-dev/200605.mbox/%3c44687AAA.5080302@googlemail.com%3e
>> >> >>>> >
>> >> >>>> >>
>> >> >>>> >> Thanks for reading this far.
>> >> >>>> >>
>> >> >>>> >> Best regards,
>> >> >>>> >> George
>> >> >>>> >>
>> >> >>>> >>
>> >> >>>> >> George Harley wrote:
>> >> >>>> >>> Hi,
>> >> >>>> >>>
>> >> >>>> >>> I have just seen Tim's note on test support classes, and
>> >> >>>> >>> it really caught my attention as I have been mulling
>> >> >>>> >>> over this issue for a little while now. I think that it
>> >> >>>> >>> is a good time for us to return to the topic of class
>> >> >>>> >>> library test layouts.
>> >> >>>> >>>
>> >> >>>> >>> The current proposal [1] sets out to segment our
>> >> >>>> >>> different types of test by placing them in different
>> >> >>>> >>> file locations. After looking at the recent changes to
>> >> >>>> >>> the LUNI module tests (where the layout guidelines were
>> >> >>>> >>> applied), I have a real concern that there are serious
>> >> >>>> >>> problems with this approach. We have started down a
>> >> >>>> >>> track of continually growing the number of test source
>> >> >>>> >>> folders as new categories of test are identified, and
>> >> >>>> >>> IMHO that is going to bring complexity and maintenance
>> >> >>>> >>> issues with these tests.
>> >> >>>> >>>
>> >> >>>> >>> Consider the dimensions of tests that we have ...
>> >> >>>> >>>
>> >> >>>> >>>     API
>> >> >>>> >>>     Harmony-specific
>> >> >>>> >>>     Platform-specific
>> >> >>>> >>>     Run on the classpath
>> >> >>>> >>>     Run on the bootclasspath
>> >> >>>> >>>     Behaves differently between Harmony and the RI
>> >> >>>> >>>     Stress
>> >> >>>> >>>     ...and so on...
>> >> >>>> >>>
>> >> >>>> >>> If you weigh up all of the different possible
>> >> >>>> >>> permutations, and then consider that the above list is
>> >> >>>> >>> highly likely to be extended as things progress, it is
>> >> >>>> >>> obvious that we are eventually heading for large amounts
>> >> >>>> >>> of related test code scattered, or possibly duplicated,
>> >> >>>> >>> across numerous "hard-wired" source directories. How
>> >> >>>> >>> maintainable is that going to be?
>> >> >>>> >>>
>> >> >>>> >>> If we want to run different tests in different
>> >> >>>> >>> configurations then IMHO we need to be thinking a whole
>> >> >>>> >>> lot smarter. We need to be thinking about keeping tests
>> >> >>>> >>> for specific areas of functionality together (thus
>> >> >>>> >>> easing maintenance); we need something quick and simple
>> >> >>>> >>> to re-configure if necessary (pushing whole directories
>> >> >>>> >>> of files around the place does not seem a particularly
>> >> >>>> >>> lightweight approach); and we need something that is not
>> >> >>>> >>> going to potentially mess up contributed patches when
>> >> >>>> >>> the file they patch is found to have been recently moved
>> >> >>>> >>> from source folder A to B.
>> >> >>>> >>>
>> >> >>>> >>> To connect to another recent thread, there have been
>> >> >>>> >>> some posts lately about handling test methods that fail
>> >> >>>> >>> on Harmony, which have meant that entire test case
>> >> >>>> >>> classes have been excluded from our test runs. I have
>> >> >>>> >>> also been noticing some API test methods that pass fine
>> >> >>>> >>> on Harmony but fail when run against the RI.
>> >> >>>> >>> Are the different behaviours down to errors in the
>> >> >>>> >>> Harmony implementation? An error in the RI
>> >> >>>> >>> implementation? A bug in the RI Javadoc? Only after some
>> >> >>>> >>> investigation has been carried out do we know for sure.
>> >> >>>> >>> That takes time. What do we do with the test methods in
>> >> >>>> >>> the meantime? Do we push them around the file system
>> >> >>>> >>> into yet another new source folder? IMHO we need a
>> >> >>>> >>> testing strategy that enables such "problem" methods to
>> >> >>>> >>> be tracked easily without disruption to the rest of the
>> >> >>>> >>> tests.
>> >> >>>> >>>
>> >> >>>> >>> A couple of weeks ago I mentioned that the TestNG
>> >> >>>> >>> framework [2] seemed like a reasonably good way of
>> >> >>>> >>> allowing us both to group together different kinds of
>> >> >>>> >>> tests and to permit the exclusion of individual tests or
>> >> >>>> >>> groups of tests [3]. I would like to strongly propose
>> >> >>>> >>> that we consider using TestNG as a means of providing
>> >> >>>> >>> the different test configurations required by Harmony.
>> >> >>>> >>> Using a combination of annotations and XML to capture
>> >> >>>> >>> the kinds of sophisticated test configurations that
>> >> >>>> >>> people need, and that allows us to specify down to the
>> >> >>>> >>> individual method, has got to be more scalable and
>> >> >>>> >>> flexible than where we are headed now.
>> >> >>>> >>>
>> >> >>>> >>> Thanks for reading this far.
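[Illustration: the "annotations plus XML" configuration proposed above
could look something like the following testng.xml fragment. The suite,
package, class, and method names here are invented for illustration;
the element structure (suite/test/groups/run, classes/class/methods) is
TestNG's standard suite-file format.]

```xml
<!-- Hypothetical testng.xml for a Windows/AMD run of the grouping
     scheme discussed in this thread; all names are invented. -->
<suite name="harmony-classlib" verbose="1">
  <test name="windows-amd-run">
    <groups>
      <run>
        <include name="shared"/>
        <include name="os.windows"/>
        <include name="os.windows.amd"/>
        <exclude name="state.broken"/>
        <exclude name="state.broken.windows"/>
        <exclude name="state.broken.windows.amd"/>
      </run>
    </groups>
    <classes>
      <class name="org.apache.harmony.tests.java.nio.channels.FileChannelTest">
        <!-- exclusion "down to the individual method" -->
        <methods>
          <exclude name="test_transferTo"/>
        </methods>
      </class>
    </classes>
  </test>
</suite>
```

Re-configuring a run then means editing this one file rather than
moving test sources between directories, which is the maintenance point
being argued above.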
>> >> >>>> >>>
>> >> >>>> >>> Best regards,
>> >> >>>> >>> George
>> >> >>>> >>>
>> >> >>>> >>> [1]
>> >> >>>> >>> http://incubator.apache.org/harmony/subcomponents/classlibrary/testing.html
>> >> >>>> >>> [2] http://testng.org
>> >> >>>> >>> [3]
>> >> >>>> >>> http://mail-archives.apache.org/mod_mbox/incubator-harmony-dev/200606.mbox/%3c44A163B3.6080005@googlemail.com%3e

---------------------------------------------------------------------
Terms of use : http://incubator.apache.org/harmony/mailing.html
To unsubscribe, e-mail: harmony-dev-unsubscribe@incubator.apache.org
For additional commands, e-mail: harmony-dev-help@incubator.apache.org