From Tim Ellison <t.p.elli...@gmail.com>
Subject test suite correctness (was: Re: [classlib] Unit and performance testing)
Date Wed, 25 Jan 2006 14:14:49 GMT
Mikhail Loenko wrote:
> So how will you distinguish whether a properly written test passed
> or was skipped?

by testing the test suite as the tests are developed (i.e. running it in
different configurations and ensuring it does the right thing).

(Of course, this is the recursive problem of determining test suite
correctness, at some point you just have to define the base case.)
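
For example, a configuration-dependent test can check its precondition
and return early when it doesn't hold (a minimal sketch in JUnit 3.x
style; the provider name "XYZ" and the class name are invented):

    import java.security.Security;
    import junit.framework.TestCase;

    public class ProviderDependentTest extends TestCase {
        public void testCipherBehaviour() throws Exception {
            if (Security.getProvider("XYZ") == null) {
                // Configuration does not apply: pass without asserting.
                return;
            }
            // ... assertions that need provider "XYZ" go here ...
        }
    }

Running the suite on configurations with and without the provider
installed is then enough to confirm that the guard takes both paths.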

> Both of them will say 'passed' and you will never
> know whether it really passed or your config is slightly wrong and
> the test was skipped

I won't be reading the log messages of each automated test suite run to
see if I spot any errors, but if you want to ... <g>  If all the tests
pass then the build system (and I) will say the code is good.

As Anton wrote earlier, there *is* an argument for logging the
configuration as the tests determine it, where that is significant; but
most of the time general logging messages ("test started", "reached
this point", etc.) are unnecessary.

Regards,
Tim


> Thanks,
> Mikhail
> 
> On 1/25/06, Tim Ellison <t.p.ellison@gmail.com> wrote:
>> Mikhail Loenko wrote:
>>> One more reason when logs are necessary:
>>>
>>> If testing is possible only in some configurations
>>> (like the set of providers contains something, or the default encoding is ...), then
>>> 1) the build failing in all the other configs would be annoying
>> huh? if you can determine that the test is bogus in a given
>> configuration, then simply skip the test.  Logging the 'expected'
>> failure is no help (who's going to read it?!) and if you don't know the
>> failure is expected then you have much bigger problems.
>>
>>> 2) One has to be able to scan the logs for warnings, to verify that
>>> the functionality is tested when the config is as expected
>> No, please, just let the tests pass if they are expected to pass.  We
>> don't need to log conditionals to prove they were taken -- just write
>> the tests properly.
>>
>> Regards,
>> Tim
>>
>>> A different exit status for the tests that cannot test in the given
>>> configuration would help.
>>>
>>> Thanks,
>>> Mikhail
>>>
>>> On 1/25/06, Anton Avtamonov <anton.avtamonov@gmail.com> wrote:
>>>> On 1/25/06, Thorbjørn Ravn Andersen <thunderaxiom@gmail.com> wrote:
>>>>> Mikhail Loenko wrote:
>>>>>
>>>>>> fail() is not always convenient; for example, how would you print a
>>>>>> stack trace to fail()? Meanwhile a stack trace is most often enough.
>>>>>>
>>>>> If you need a stacktrace, why not just throw a RuntimeException at that
>>>>> point?  JUnit will then include the stack trace in the report.
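>>>>>
>>>>> For instance (a quick sketch; the test and the URL string are just
>>>>> for illustration):
>>>>>
>>>>>     import junit.framework.TestCase;
>>>>>
>>>>>     public class ParseTest extends TestCase {
>>>>>         public void testParse() {
>>>>>             try {
>>>>>                 new java.net.URL("not a url");
>>>>>             } catch (java.net.MalformedURLException e) {
>>>>>                 // JUnit prints the full trace of this in its report.
>>>>>                 throw new RuntimeException(e);
>>>>>             }
>>>>>         }
>>>>>     }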
>>>>>
>>>>> --
>>>>> Thorbjørn
>>>>>
>>>> Absolutely agree.
>>>> As far as I know, the 'standard' test case signature is:
>>>>
>>>> public void testSomeTestName() throws Exception {
>>>> }
>>>>
>>>> That way all checked and runtime exceptions are passed directly to the
>>>> JUnit framework (which logs them properly).
>>>>
>>>> I do believe logging is a very useful feature. However, I think that
>>>> the preferable place to do logging is the code rather than the tests.
>>>> JUnit provides lots of functionality for writing well-documented tests
>>>> and we don't have to add extra code for logging (which obviously makes
>>>> test cases longer and harder to understand).
>>>>
>>>> I believe the right place to use logging is try/catch blocks where the
>>>> catch does nothing (the most usual case) and we just silently ignore
>>>> some error situations. Having logs there will let us understand the
>>>> system's execution paths and what went wrong and where. For that
>>>> purpose the different logging levels work really well.
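>>>>
>>>> For example (a sketch using java.util.logging; the class and method
>>>> names are invented):
>>>>
>>>>     import java.util.logging.Level;
>>>>     import java.util.logging.Logger;
>>>>
>>>>     public class Resources {
>>>>         private static final Logger log =
>>>>                 Logger.getLogger(Resources.class.getName());
>>>>
>>>>         static void close(java.io.InputStream in) {
>>>>             try {
>>>>                 in.close();
>>>>             } catch (java.io.IOException e) {
>>>>                 // Previously swallowed silently; FINE keeps it quiet
>>>>                 // unless someone is tracing a problem.
>>>>                 log.log(Level.FINE, "close failed", e);
>>>>             }
>>>>         }
>>>>     }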
>>>>
>>>> --
>>>> Anton Avtamonov,
>>>> Intel Managed Runtime Division
>>>>
>> --
>>
>> Tim Ellison (t.p.ellison@gmail.com)
>> IBM Java technology centre, UK.
>>
> 

-- 

Tim Ellison (t.p.ellison@gmail.com)
IBM Java technology centre, UK.
