From: Mikhail Loenko
To: harmony-dev@incubator.apache.org
Date: Wed, 25 Jan 2006 21:16:27 +0600
Subject: Re: test suite correctness (was: Re: [classlib] Unit and performance testing)

On 1/25/06, Tim Ellison wrote:
> Mikhail Loenko wrote:
> > So how will you distinguish whether a properly written test passed
> > or was skipped?
>
> by testing the test suite as the tests are developed (i.e. running it in
> different configurations and ensuring it does the right thing).
>
> (Of course, this is the recursive problem of determining test suite
> correctness; at some point you just have to define the base case.)
>
> > Both of them will say 'passed' and you will never
> > know whether it really passed or your config is slightly wrong and
> > the test was skipped
>
> I won't be reading the log messages of each automated test suite run to
> see if I spot any errors, but if you want to ...  If all the tests
> pass then the build system (and I) will say the code is good.

Well, the test suite might be correct, but the problem is how to make sure
that QA runs in the right configuration.

IMO one of the best solutions would be a separate status for skipped tests.
Then we could track them and figure out whether some functionality
remains untested. Otherwise we have to log and then grep the log messages.

> As Anton wrote earlier, there *is* an argument for doing some logging to
> help describe the configuration as determined by the tests where it is
> significant, but most of the time general logging messages ("test
> started", "reached this point", etc.) are unnecessary.

Agreed.

Thanks,
Mikhail

> Regards,
> Tim
>
> > Thanks,
> > Mikhail
> >
> > On 1/25/06, Tim Ellison wrote:
> >> Mikhail Loenko wrote:
> >>> One more reason when logs are necessary:
> >>>
> >>> If testing is possible in some configurations only
> >>> (like the set of providers contains something, or the default
> >>> encoding is ...), then
> >>> 1) the build failing in all the different configs would be annoying
> >>
> >> huh? if you can determine that the test is bogus in a given
> >> configuration, then simply skip the test.  Logging the 'expected'
> >> failure is no help (who's going to read it?!) and if you don't know the
> >> failure is expected then you have much bigger problems.
> >>
> >>> 2) One has to be able to scan logs for warnings to verify that
> >>> the functionality is tested when the config is as expected
> >>
> >> No, please, just let the tests pass if they are expected to pass.  We
> >> don't need to log conditionals to prove they were taken -- just write
> >> the tests properly.
> >>
> >> Regards,
> >> Tim
> >>
> >>> A different exit status for tests that cannot run in the given
> >>> configuration would help.
> >>>
> >>> Thanks,
> >>> Mikhail
> >>>
> >>> On 1/25/06, Anton Avtamonov wrote:
> >>>> On 1/25/06, Thorbjørn Ravn Andersen wrote:
> >>>>> Mikhail Loenko wrote:
> >>>>>
> >>>>>> fail() is not always convenient; for example, how would you print
> >>>>>> a stack trace to fail()? Meanwhile a stack trace is most often enough
> >>>>>>
> >>>>> If you need a stack trace, why not just throw a RuntimeException at that
> >>>>> point? JUnit will then include the stack trace in the report.
> >>>>>
> >>>>> --
> >>>>> Thorbjørn
> >>>>>
> >>>> Absolutely agree.
> >>>> As far as I know, the 'standard' test case signature is:
> >>>>
> >>>> public void testSomeTestName() throws Exception {
> >>>> }
> >>>>
> >>>> so that all checked and runtime exceptions are passed directly to the
> >>>> JUnit framework (which properly logs them).
> >>>>
> >>>> I do believe logging is a very useful feature. However, I think that the
> >>>> preferable place to do logging is the code rather than the tests. JUnit
> >>>> provides lots of functionality to write well-documented tests, and we
> >>>> don't have to add extra code for logging (which obviously makes test
> >>>> cases longer and harder to understand).
> >>>>
> >>>> I believe the right place to use logging is try/catch sections where the
> >>>> catch does nothing (the most usual case), so that we just silently
> >>>> ignore some error situations. Having logs there will allow us to
> >>>> understand the system execution paths and what went wrong and where.
> >>>> For such purposes, different logging levels work really fine.
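A minimal sketch of the pattern described just above (logging in a catch block
that would otherwise swallow the error silently), assuming java.util.logging;
the QuietCleanup class and closeQuietly() method are made-up names, purely for
illustration:

    import java.io.IOException;
    import java.io.InputStream;
    import java.util.logging.Level;
    import java.util.logging.Logger;

    // Illustrative only: a helper whose catch block used to be empty.
    // The failure is still ignored, but the execution path stays visible
    // when FINE logging is enabled.
    public class QuietCleanup {

        private static final Logger LOG =
                Logger.getLogger(QuietCleanup.class.getName());

        public static void closeQuietly(InputStream in) {
            if (in == null) {
                return;
            }
            try {
                in.close();
            } catch (IOException e) {
                // previously: an empty catch that silently dropped the error
                LOG.log(Level.FINE, "ignoring failure while closing stream", e);
            }
        }
    }

At FINE level the message stays out of normal output, but it can be switched on
when an execution path needs to be reconstructed.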
> >>>>
> >>>> --
> >>>> Anton Avtamonov,
> >>>> Intel Managed Runtime Division
> >>>>
> >> --
> >>
> >> Tim Ellison (t.p.ellison@gmail.com)
> >> IBM Java technology centre, UK.
> >>
> >
>
> --
>
> Tim Ellison (t.p.ellison@gmail.com)
> IBM Java technology centre, UK.
>
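To make the "separate status for skipped tests" idea from the thread concrete,
here is a minimal sketch against JUnit 3.8-style tests; the ConfigAwareTestCase
base class, its skip() helper, and the "SomeProvider" name are hypothetical,
not Harmony code:

    import java.util.ArrayList;
    import java.util.List;

    import junit.framework.TestCase;

    // Hypothetical base class: tests call skip() instead of silently
    // returning, so skipped tests can be reported separately from
    // genuine passes.
    public abstract class ConfigAwareTestCase extends TestCase {

        // Reasons collected across the run; a suite wrapper or the build
        // script could print this list and decide whether the skips are
        // acceptable.
        private static final List SKIPPED = new ArrayList();

        protected void skip(String reason) {
            SKIPPED.add(getName() + ": " + reason);
        }

        public static List getSkippedTests() {
            return SKIPPED;
        }
    }

A configuration-dependent test could then use it like this:

    import java.security.Security;

    // Hypothetical test: recorded as skipped when the configuration cannot
    // support it, instead of silently passing. "SomeProvider" is a placeholder.
    public class SomeProviderTest extends ConfigAwareTestCase {

        public void testProviderSpecificBehaviour() throws Exception {
            if (Security.getProvider("SomeProvider") == null) {
                skip("SomeProvider is not installed in this configuration");
                return;
            }
            // ... real assertions against the provider would go here ...
        }
    }

A suite wrapper or the build script could print getSkippedTests() after the run
and warn (or fail) when functionality that was expected to be covered in the
current configuration turns out to have been skipped, which is closer to the
"different exit status" Mikhail asks for than grepping log messages.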