harmony-dev mailing list archives

From: Geir Magnusson Jr <g...@pobox.com>
Subject: Re: Unit testing revisited
Date: Wed, 22 Mar 2006 12:34:16 GMT


Leo Simons wrote:
> On Wed, Mar 22, 2006 at 06:41:56AM -0500, Geir Magnusson Jr wrote:
>>
[SNIP]
>> You forgot one - "integration test", which is a unit test that's been 
>> around long enough to shave. :)   (It's actually not a unit test...)
> 
>   "integration test" --> any test that is not an implementation test or
>         specification test. Typically these test the interactions between
>         multiple pieces rather than the correct behaviour of a single
>         piece.
> 
> I forgot another one:
> 
>   "gump run using harmony" --> the biggest frigging integration test you
>         can think of. Tests the interaction between harmony and millions
>         of lines of userland code.

     "frigging integration test" -->  A kind of integration test that
           uses a "frig", or "functional rig".  See
           http://gump.apache.org/

:)

> 
>>>>> We already see lots of errors caused by
>>>>> oversight of the classloader differences.
>>>> Right.  And I think the solution is to think about this in some other 
>>>> way than just running things in a VM, like a test harness that does the 
>>>> right thing in terms of the classes being tested (what would be in the 
>>>> boot classloader) and the classes doing the testing.
>>> I don't know about that. I'm sure that if the problem is well-defined
>>> enough, solutions will become apparent, and I still don't quite get why it
>>> is the subject of continuous debate (e.g. can't someone just go out and
>>> try what you propose and show that it works?).
>> The problem is 'completeness' because we have multiple problems to 
>> solve.
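
To make the harness idea above concrete - this is only a minimal sketch, 
not Harmony's actual harness, and the jar and class names are made up - 
the trick is to load the code under test through its own loader, separate 
from the loader that holds the test code:

    import java.net.URL;
    import java.net.URLClassLoader;

    public class IsolatingHarness {
        public static void main(String[] args) throws Exception {
            // Hypothetical jar holding the classes under test.
            URL[] underTest = { new URL("file:build/classes-under-test.jar") };

            // Parent is null, so lookups fall through only to the bootstrap
            // loader; the test classpath (JUnit, the test cases) is not
            // visible from here, mimicking the boot/application split.
            ClassLoader isolated = new URLClassLoader(underTest, null);

            // Hypothetical class name; reflection is the price of isolation.
            Class<?> cut = isolated.loadClass("org.example.util.Ring");
            System.out.println(cut.getName() + " loaded by " + cut.getClassLoader());
        }
    }

A harness like this still can't define java.* classes itself - only the 
boot classpath can - which is part of why the classloader differences 
keep biting.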
> 
> Uh-oh. Completeness is a scary word. I didn't see that coming.
> 
> <snip a couple of hackiness details />
>> I think that both of these solutions are
>>
>> a) messy - since only XP psychos really *enjoy* creating unit tests, we 
>> want to make it as painless as possible so as not to disincentivize 
>> developers.  Look at what we have so far.  IBM had to go off to the Unit 
>> Test Mines they run in a Secret Undisclosed Location in the Principality 
>> of BigBlueLand to provide unit tests for stuff they had already donated! 
>> :) [Thanks, btw]
> 
> The class library design is messy. Testing it will, one way or another, be
> a messy subject.
> 
>> b) subject to "mechanical failure" - we're doing all sorts of unnatural 
>> acts on code that is usually the "rock solid" basis for doing these 
>> unnatural things to other code (like in app servers), and I worry that 
>> such complexity will lead to failures or bugs that are very hard or 
>> impossible to find
> 
> Heh. You find *those* by running the app server tests :-). I suspect that
> running the J2EE TCK against geronimo running on harmony and comparing it
> with running the J2EE TCK against geronimo running on the sun jdk is
> going to be pretty insightful...

Like a mortar attack is insightful. :)

It will be an interesting test of "The Algebra of TCK-ness"

If A = the Sun JDK passes the Java SE TCK
If B(X) = Geronimo passes the Java EE TCK on a compliant JDK X
If C = the Harmony JDK passes the Java SE TCK

then it should be true that B(C).   No need to test!

:)


> 
>>> There is also the possibility that all the package-private materials are
>>> in reality fully exercised if you test the public parts of the package
>>> thoroughly enough. A coverage utility like clover can show that. XP
>>> (extreme programming) purists (like me) might argue that if you have
>>> package-private stuff that is not exercisable through the public API,
>>> then the package-private stuff needs to be factored out. But let's try not
>>> to argue too much :-)
>> I agree with the latter part.  What I worry about, though, is that despite 
>> the best of intentions, unit testing tends never to be complete and 
>> thorough.  I don't know whether things like clover indicate the quality of 
>> the coverage - but simply having coverage just isn't enough, IMO, as you 
>> may not exercise things completely enough that all internal functionality 
>> is directly tested.  Dunno.
> 
> You've never had the pleasure of being part of a project that was fully
> XP-run from the start, have you? It's not a pipe dream, but it's also not
> likely to be attainable for harmony (if we want to get anything running
> before 2020).

No, I haven't.  I don't think you could do Java SE as XP because design 
and planning are needed :)

> 
>>>>>> I
>>>>>> couldn't imagine that the Eclipse tests don't test package-protected
>>>>>> things.
>>>>> The only thing shared with Eclipse-land here is the *.tests.* package
>>>>> name element, hardly significant or unique I expect.
>>>> Well, it is around here. While I haven't done a survey, I'm used to 
>>>> projects keeping things in parallel trees to make it easy to test. 
>>> If with "here" you mean "the ASF" I'm happy to challenge the assertion :-)
>> Please point me to it!  I always want to see new ways of doing this. 
>> Challenge away!
> 
> Okay :-), top-of-head,
> 
> http://svn.apache.org/repos/asf/excalibur/trunk/framework/impl/src/test/org/apache/avalon/framework/context/test/ContextTestCase.java
> 
> (one of the last remaining bits of code that can be traced back to apache
> jserv, which was tested using testlet, which was around before JUnit). In
> general, the parts of jakarta and what grew out of it that are derivatives of
> the JServ line of work (including avalon, now excalibur, cocoon) often
> do things like this.
> 
> The fact that I typed that URL from memory and was right is kinda scary, isn't
> it? I've not worked on that code for *years* and it's moved a few dozen
> times...

That is scary.  It's also scary that you proposed Avalon as an example :)
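
As an aside, here is a minimal sketch of what the package choice buys you 
(made-up names, JUnit 3 style, not actual Harmony or Eclipse code): a test 
living in the same package as the code under test can reach package-private 
members, while a test in a separate *.tests.* package sees only the public 
API.

    // --- org/example/util/Ring.java : the code under test ---
    package org.example.util;

    public class Ring {
        int capacity() { return 16; }          // package-private
        public boolean isEmpty() { return true; }
    }

    // --- parallel test tree: org/example/util/RingTest.java ---
    // Same package, so the package-private capacity() is reachable.
    package org.example.util;

    import junit.framework.TestCase;

    public class RingTest extends TestCase {
        public void testDefaultCapacity() {
            assertEquals(16, new Ring().capacity());
        }
    }

    // --- "*.tests.*"-style tree: org/example/util/tests/RingAPITest.java ---
    // Different package: only the public API is visible.
    package org.example.util.tests;

    import junit.framework.TestCase;
    import org.example.util.Ring;

    public class RingAPITest extends TestCase {
        public void testIsEmpty() {
            assertTrue(new Ring().isEmpty());
            // new Ring().capacity();  // would not compile: not visible here
        }
    }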

> 
>> So the problem boils down to the fact that we are implicitly doing 
>> integration testing.  That's why I've been suggesting the framework - 
>> let us test the code in isolation first, using "implementation tests". 
>> Then, if our isolation framework is sexy enough, let's try to reproduce 
>> the same classloader/security model we would experience in a VM, and do 
>> spec/API testing.  *Then* we can do integration testing by running the 
>> code in the VM ("in situ") and do the proper (aka *.tests.*) 
>> spec/API/TCK testing.
>>
>> I'll post this as a separate message because this one is way too woolly 
>> at this point.
> 
> Okay, this does sound like "the core" of the matter. There you go.
> 
> I'll point out that every time you restrict an open source community to an
> ordered sequence of taking care of things, you do slow them down just
> a little (hey, that's an interesting assertion. Hmm. Should write a book
> about it, I guess), so make sure it's what you want :-).

Huh?

geir


