directory-dev mailing list archives

From "Emmanuel Lecharny" <elecha...@gmail.com>
Subject Speeding up tests
Date Tue, 18 Sep 2007 23:10:07 GMT
Hi guys,

we currently have insanely long tests, especially the integration tests.
This is not only because our server is slow ;), but mainly because we
are using an old version of JUnit, which runs setUp() and tearDown()
for _each_ test. We have around 3000 tests currently, with around 420
core-unit tests and 180 server-unit tests.

Those last tests are really time consuming, as the server is started
and stopped for every single test, and each server start costs around
2 seconds. The whole run costs more than 15 minutes on my laptop ...

JUnit 4, the latest version, offers a new annotation system: you can
mark a setup and a teardown method with @BeforeClass and @AfterClass,
and they will run only once for all the tests in a class. If we tune
the tests correctly, we can reduce the number of server
startup/shutdown cycles from 600 to 100. The gain will be quite
important, as we may save 70% of the total cost.
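To make the idea concrete, here is a self-contained toy that mimics what the class-level lifecycle buys us. The annotations and runner below are a stand-in for JUnit 4 (real code would simply import org.junit.*), and StubServer is a hypothetical stand-in for the embedded directory server; the only point is that the expensive start/stop happens once per class, not once per test:

```java
import java.lang.annotation.*;
import java.lang.reflect.*;

// Hypothetical stand-in for the embedded directory server.
class StubServer {
    static int startups = 0;
    void start() { startups++; }   // the expensive 2-second step
    void stop()  { }
}

// Toy re-declarations of JUnit 4's annotations, just for this demo;
// real code would import org.junit.BeforeClass, AfterClass and Test.
@Retention(RetentionPolicy.RUNTIME) @interface BeforeClass {}
@Retention(RetentionPolicy.RUNTIME) @interface AfterClass {}
@Retention(RetentionPolicy.RUNTIME) @interface Test {}

class SearchTest {
    static StubServer server;

    @BeforeClass public static void startServer() { server = new StubServer(); server.start(); }
    @AfterClass  public static void stopServer()  { server.stop(); }

    @Test public void testSubtreeSearch() { /* search against server */ }
    @Test public void testOneLevelSearch() { /* another search */ }
}

public class MiniRunner {
    // Runs @BeforeClass once, then every @Test, then @AfterClass once.
    static void run(Class<?> c) throws Exception {
        for (Method m : c.getDeclaredMethods())
            if (m.isAnnotationPresent(BeforeClass.class)) m.invoke(null);
        Object instance = c.getDeclaredConstructor().newInstance();
        for (Method m : c.getDeclaredMethods())
            if (m.isAnnotationPresent(Test.class)) m.invoke(instance);
        for (Method m : c.getDeclaredMethods())
            if (m.isAnnotationPresent(AfterClass.class)) m.invoke(null);
    }

    public static void main(String[] args) throws Exception {
        run(SearchTest.class);
        // Two tests ran, but the server started only once.
        System.out.println("startups=" + StubServer.startups);
    }
}
```

With JUnit 3.8 the equivalent setUp() would have started the server once per test method instead.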

I have done some tests with JUnit 3.8, using a very ugly hack, and the
result is that SearchTest went down from 46 seconds to 6 seconds.

There are a few things to do now :
- I have not tested JUnit 4 yet, but as far as I can see it has
everything we need to speed up the tests
- if we are to switch to JUnit 4, we have to figure out what impact
it has on the existing code, on Eclipse integration, and more
importantly, on Maven (does Surefire support JUnit 4 natively ?)
- another alternative would be to use the more evolved TestNG
framework ; the very same steps would have to be followed (see the
previous point)
- if we want to start the server only once, we have to be very careful
when writing a new test : it should not impact the existing data
loaded. That means a test should leave the data in the same state when
it completes (successfully or not) as when it started.
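That last point can be enforced with a try/finally (or an @After method in JUnit 4) so that cleanup runs whether the test passes or fails. A minimal sketch, where EntryStore and its addEntry/deleteEntry methods are a hypothetical in-memory stand-in for the server's entry store:

```java
import java.util.HashSet;
import java.util.Set;

// Hypothetical in-memory stand-in for the server's entry store.
class EntryStore {
    private final Set<String> entries = new HashSet<String>();
    void addEntry(String dn)    { entries.add(dn); }
    void deleteEntry(String dn) { entries.remove(dn); }
    int size()                  { return entries.size(); }
}

public class CleanupDemo {
    public static void main(String[] args) {
        EntryStore store = new EntryStore();
        int before = store.size();
        try {
            store.addEntry("cn=test,ou=system");
            // ... assertions against the entry would go here ...
            throw new RuntimeException("simulated test failure");
        } catch (RuntimeException expected) {
            // even a failing test must not leak data
        } finally {
            store.deleteEntry("cn=test,ou=system");
        }
        // The store is back in its initial state despite the failure.
        System.out.println(store.size() == before); // prints "true"
    }
}
```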

There may be some other options, like scripting the tests (I was
initially thinking about using JMeter, but it's far from perfect), or
adding a very simple framework to compare expected entries with those
returned by the server.
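Such a comparison framework could be very small. A toy sketch, assuming entries are modeled as attribute maps (the class and method names are illustrative, not an existing API):

```java
import java.util.Map;
import java.util.Objects;
import java.util.Set;
import java.util.TreeSet;

public class EntryDiff {
    // Returns the names of attributes whose values differ between
    // the expected entry and the one returned by the server.
    static Set<String> diff(Map<String, String> expected, Map<String, String> actual) {
        Set<String> keys = new TreeSet<String>(expected.keySet());
        keys.addAll(actual.keySet());
        Set<String> differing = new TreeSet<String>();
        for (String k : keys)
            if (!Objects.equals(expected.get(k), actual.get(k)))
                differing.add(k);
        return differing;
    }

    public static void main(String[] args) {
        Map<String, String> expected = Map.of("cn", "test", "sn", "user");
        Map<String, String> actual   = Map.of("cn", "test", "sn", "other");
        System.out.println(diff(expected, actual)); // prints "[sn]"
    }
}
```

A test would then just assert that the diff is empty.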

At this point, I would like to get your opinion.

Thanks for any suggestion !

-- 
Regards,
Cordialement,
Emmanuel Lécharny
www.iktek.com
