Date: Wed, 22 Mar 2006 04:40:56 -0800
From: Leo Simons <mail@leosimons.com>
To: harmony-dev@incubator.apache.org
Subject: Re: [classlib] Testing
Message-ID: <20060322124056.GL45952@bali.sjc.webweaving.org>
In-Reply-To: <44213FE0.4070103@pobox.com>

On Wed, Mar 22, 2006 at 07:15:28AM -0500, Geir Magnusson Jr wrote:
> Pulling out of the various threads where we have been discussing, can
> we agree on the problem:
>
> We have unique problems compared to other Java projects because we
> need to find a way to reliably test the things that are commonly
> expected to be a solid point of reference - namely the core class
> library.
>
> Further, we've been implicitly doing "integration testing" because -
> so far - the only way we've been testing our code has been 'in situ'
> in the VM - not in an isolated test harness. To me, this turns it
> into an integration test.
>
> Sure, we're using JUnit, but because we are implementing the core
> java.* APIs, we aren't testing with a framework that has been
> independently tested for correctness, like we would when testing any
> other code.
>
> I hope I got that idea across - I believe that we have to go beyond
> normal testing approaches because we don't have a normal situation.

Where we define 'normal situation' as "running a test framework on top
of the Sun JDK and expecting any bugs to not be in that JDK". There are
plenty of projects out there that have to test things without having
such a "stable reference JDK" luxury... I imagine that testing GCC is
just as hard as this problem we have here :-)

> So I think there are three things we want to do (adopting the
> terminology that came from the discussion with Tim and Leo):
>
> 1) implementation tests
> 2) spec/API tests (I'll bundle these together)
> 3) integration/functional tests
>
> I believe that for #1, the issues related to being on the
> bootclasspath don't matter, because we aren't testing that aspect of
> the classes (which is how they behave integrated w/ the VM and
> security system) but rather the basic internal functioning.
>
> I'm not sure how to approach this, but I'll try. I'd love to hear how
> Sun, IBM or BEA deals with this, or be told why it isn't an issue :)
>
> Implementation tests: I'd like to see us be able to do #1 via the
> standard same-package technique (i.e. testing a.b.C w/ a.b.CTest) but
> we'll run into a tangle of classloader problems, I suspect, because
> we want to be testing java.* code in a system that already has java.*
> code. Can anyone see a way we can do this - test the class library
> from the implementation point of view - using some test harness + any
> known-good JRE, like Sun's or IBM's?

Ew, that won't work in the end since we should assume our own JRE is
going to be "known-better" :-). But it might be a nice way to
"bootstrap" (e.g. we test with an external JRE until we satisfy the
tests, and then we switch to testing with an earlier build).
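To make sure we're picturing the same thing, here is roughly what I
imagine such a same-package implementation test looking like (a sketch
only: HashMapTest is a made-up name, and the bootclasspath wiring
mentioned in the comment is exactly the tangle you describe):

    package java.util;

    import junit.framework.TestCase;

    /*
     * Implementation test for our java.util.HashMap. Living in the
     * same package lets it poke at package-private internals. The
     * catch: the VM has to load *our* java.util instead of the host
     * JRE's, e.g. via something like -Xbootclasspath/p:classlib.jar.
     */
    public class HashMapTest extends TestCase {

        public void testPutThenGet() {
            HashMap map = new HashMap();
            map.put("key", "value");
            assertEquals("value", map.get("key"));
        }

        public void testRemoveShrinksSize() {
            HashMap map = new HashMap();
            map.put("key", "value");
            assertEquals(1, map.size());
            map.remove("key");
            assertEquals(0, map.size());
        }
    }

Note there is nothing clever in the test itself, which is sort of the
point: all the difficulty lives in the harness that has to load it.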
For code that has side effects, or for which we can conceivably create
verifiable side effects (where a side effect is something outside of
the whole "java environment"), we can try to produce known-good input
and output. There are a variety of ways to automate things like that,
for example by using tracing on the relevant bits, manually verifying
a "known-good" trace, storing it, and comparing future runs against it.
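The comparison half of that can be almost embarrassingly simple,
something like this (again a sketch; GoldenTrace and its
one-line-per-event format are inventions of mine):

    import java.io.BufferedReader;
    import java.io.File;
    import java.io.FileReader;
    import java.io.IOException;
    import java.util.ArrayList;
    import java.util.Iterator;
    import java.util.List;

    /*
     * "Known-good trace" sketch: instrumented code reports one line
     * per observable side effect; at the end of a run we diff the
     * recorded events against a trace that a human verified once and
     * checked in.
     */
    public class GoldenTrace {

        private final List events = new ArrayList();

        /** Instrumentation hook: call wherever a side effect happens. */
        public void record(String event) {
            events.add(event);
        }

        /** True iff this run matches the stored, verified trace. */
        public boolean matches(File golden) throws IOException {
            BufferedReader in = new BufferedReader(new FileReader(golden));
            try {
                for (Iterator i = events.iterator(); i.hasNext();) {
                    String expected = in.readLine();
                    if (expected == null || !expected.equals(i.next())) {
                        return false; // diverged, or golden trace too short
                    }
                }
                return in.readLine() == null; // golden trace fully consumed
            } finally {
                in.close();
            }
        }
    }

The diffing is the easy bit, of course; deciding what counts as an
event and getting the tracing into the code is where the real work is.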
But I suspect there is a whole lot of code that is either inherently
all but side-effect-free, or where testing the side effects
automatically amounts to doing an integration test.

> Spec/API tests: these are, IMO, a kind of integration test, because
> proper spec/API behavior *is* dependent on factors beyond the actual
> code itself (like classloader configuration and security context).
> Because of this, the *.test.* pattern makes perfect sense. Assuming
> we could produce something useful for #1 (i.e. a test
> harness/framework), could we then augment it to simulate the
> classloader config + security config that we'd get in a real VM? That
> would give us the ability to test in isolation of the VM, and also
> let us 'break' the environment to ensure that the code fails in a
> predictable way.
>
> Integration/functional: this is a whole range of things, from doing
> the spec/API tests in an actual VM, to tests that exercise the code
> through interaction with external systems (like network, RMI, GUI,
> etc.)
>
> ***
>
> Now, it might be suggested that we just ignore the implementation
> testing (#1) and just do #2 and #3 as we are now, and hope we have a
> good enough test suite. It could be argued that when Sun started,
> they didn't have a known-good platform to do implementation testing
> on like we do now. I don't know if that's true.
>
> The difference is that we need to produce something of the same
> quality as Sun's Java 5, not Sun's Java 1.0. We've had 11 years since
> 1.0 to learn about testing, but they've had 11 years to get things
> solid.
>
> What to do....

No idea! Cool!

We should do #2 and #3 regardless. Identifying which-is-which (#1, #2,
#3) in all the current test suites seems like a good next step.
Obviously that doesn't really help us get the implementation testing
framework you describe, but it will help define the needs more
unambiguously.

Further ideas...

 -> look at how the native world does testing (hint: it usually has
    #ifdefs, uses perl along the way, and it is certainly "messy")
    -> emulate that
 -> build a bigger, better specification test
    -> and somehow "prove" it is "good enough"
 -> build a bigger, better integration test
    -> and somehow "prove" it is "good enough"

I'll admit my primary interest is the last one...

Leo
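P.S. On the "simulate the classloader config + security config" idea:
the security half could start as crudely as this (a sketch;
DenyAllSecurityManager is a made-up name, and a real harness would
want per-test policies rather than denying everything):

    import java.security.Permission;

    /*
     * Sketch: run spec/API tests under a deliberately hostile
     * security configuration and check that the class library
     * fails predictably when the environment is broken.
     */
    public class DenyAllSecurityManager extends SecurityManager {

        public void checkPermission(Permission perm) {
            throw new SecurityException("denied by harness: " + perm);
        }

        public static void main(String[] args) {
            System.setSecurityManager(new DenyAllSecurityManager());
            try {
                System.getProperty("user.home"); // must now fail
                throw new AssertionError("expected a SecurityException");
            } catch (SecurityException expected) {
                // the predictable failure we were looking for
            }
        }
    }

The classloader half is messier; presumably a harness classloader that
mirrors the delegation a real VM would give us, but I haven't thought
that through.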