ant-dev mailing list archives

From "Stephane Bailliez" <>
Subject Re: JUNIT RUN
Date Tue, 10 Dec 2002 18:54:55 GMT
----- Original Message -----
From: <>

> Thank you Stephane. Are these Tools available for free?

Yes and no. Clover is free for open source projects only; otherwise you'll
have to pay a reasonable fee for a developer license and more for a full
integration license. JProbe is not available for free, and I don't remember
the exact prices, though a free evaluation is available.

For more, look at <> and <>.

There might be others, but those are all I can think of off the top of my
head; just google to check for more.

> Don't you think the Ant JUnit tasks should extend such functionality to
> support a report stating the number of classes that were run against the
> number of classes that were left out?

Yes and no. That information alone is not really of much value. It is just
like the number of tests for a class: you can write 15,000 tests for the same
class and still cover only 10% of it. This is what a coverage tool is for; it
gives you that information.
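To make the point concrete, here is a minimal sketch (the Calculator class
and its methods are my own invention, just for illustration): several passing
tests that all exercise the same method, leaving the rest of the class at
zero coverage no matter how many more such tests you add.

```java
// Sketch of why test count != coverage: every assertion below exercises
// add(), so divide() stays at 0% coverage however many tests we write.
class Calculator {
    static int add(int a, int b) { return a + b; }
    static int divide(int a, int b) { return a / b; } // never tested below
}

public class CoverageSketch {
    public static void main(String[] args) {
        // 15,000 variations of these would still leave divide() untouched.
        assertEq(5, Calculator.add(2, 3));
        assertEq(0, Calculator.add(0, 0));
        assertEq(-1, Calculator.add(1, -2));
        System.out.println("all add() tests pass; divide() coverage: 0%");
    }

    static void assertEq(int expected, int actual) {
        if (expected != actual)
            throw new AssertionError(expected + " != " + actual);
    }
}
```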

> Do you think it would be a wise decision if I re-extended the report for
> the JUnit tasks to have more information?

It depends on what information.
It is not a particularly trivial task. Do not forget that you have LOTS of
information to display, and doing it on a single page would require a 36-inch
monitor and an alien brain, unless you find a clever way to do so, which I
did not, but I'm not really specialized in human-interaction design.
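For what it's worth, the stock tasks already give you a hook for this:
<junitreport> applies XSLT stylesheets to the XML test results, and its
styledir attribute lets you point it at customized copies of
junit-frames.xsl / junit-noframes.xsl, which is where any extra information
would go. A rough sketch, assuming a typical layout (the target, directory
and path names are mine):

```xml
<target name="test-report">
  <junit printsummary="yes" fork="yes">
    <classpath refid="test.classpath"/>
    <formatter type="xml"/>
    <batchtest todir="reports/raw">
      <fileset dir="build/test-classes" includes="**/*Test.class"/>
    </batchtest>
  </junit>
  <junitreport todir="reports">
    <fileset dir="reports/raw" includes="TEST-*.xml"/>
    <!-- styledir holds your modified junit-frames.xsl / junit-noframes.xsl -->
    <report format="frames" styledir="xsl" todir="reports/html"/>
  </junitreport>
</target>
```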

I did this a long time ago at my previous company, where I had the daily
build spitting out tons of information, and I thought it was best to split it
up, especially as the information is of interest to different categories of
people. In short: the big bosses were interested in metrics and a little bit
in coverage; the manager and QA were equally interested in test results,
coverage and metrics; the developers focused more on test results and
coverage and a little bit on metrics.

This is what I observed in my shop. That does not mean it's like that
everywhere, but in short, when you are doing your brain work on metrics as a
manager, you absolutely don't want to be distracted by the stack traces of
failing tests.

To help comparisons, there was an aggregation phase over the results of all
projects, presented on a single page. Every day, managers knew the build
results, the test coverage and the metrics for a project, side by side with
the other projects, so this introduced a bit of friendly competition between
project members, should they be developer or
