activemq-dev mailing list archives

From "James Strachan" <>
Subject automated performance & system testing
Date Wed, 07 Jun 2006 19:21:30 GMT
For the longest time I've wanted a good distributed
performance/integration/system testing mechanism. Back in the day I
helped found this project...

which was a neat idea at the time - though the distribution, coordination, management & reporting were a bit tricky.

If you don't know it, gbuild is a kind of distributed Continuum - it uses
ActiveMQ to distribute builds across a cluster of machines. It all
started because Geronimo's TCK was taking days to run; so gbuild
splits the build up into small chunks and load-balances them across
the cluster.
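To make the load-balancing idea concrete, here's a minimal JDK-only sketch of the competing-consumers pattern gbuild uses (with an in-memory queue standing in for the ActiveMQ queue, and threads standing in for build machines - the class and names are purely illustrative, not gbuild code):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.concurrent.LinkedBlockingQueue;

public class BuildQueueSketch {
    public static void main(String[] args) throws InterruptedException {
        // The shared queue stands in for an ActiveMQ queue of build chunks.
        BlockingQueue<String> chunks = new LinkedBlockingQueue<>();
        for (int i = 1; i <= 10; i++) {
            chunks.add("tck-chunk-" + i);
        }

        // Each worker thread stands in for one build machine in the cluster;
        // competing consumers on one queue give free load balancing.
        ConcurrentMap<String, Integer> done = new ConcurrentHashMap<>();
        List<Thread> workers = new ArrayList<>();
        for (int w = 0; w < 3; w++) {
            String name = "agent-" + w;
            Thread t = new Thread(() -> {
                String chunk;
                // poll() returns null once the queue is drained
                while ((chunk = chunks.poll()) != null) {
                    // ...run the maven build for this chunk here...
                    done.merge(name, 1, Integer::sum);
                }
            });
            workers.add(t);
            t.start();
        }
        for (Thread t : workers) {
            t.join();
        }

        int total = done.values().stream().mapToInt(Integer::intValue).sum();
        System.out.println("chunks built: " + total); // prints "chunks built: 10"
    }
}
```

Every chunk is consumed exactly once, however unevenly the agents run - which is exactly why a message queue is a nice fit for splitting up a long TCK run.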

After tinkering with the gbuild stuff a light went on - we could use a
cluster of Continuum servers as our distributed testing framework -
then just use maven builds as the agents (which take care of
dependencies, classpaths and deploying results to a repo).

The source for gbuild is here...

There's also BeanFlow for writing little orchestrations in Java code...
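For flavour, here's the kind of fork-and-join orchestration shape this is about, sketched with plain JDK concurrency rather than BeanFlow's actual API (the class and the two "activities" are hypothetical):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class JoinFlowSketch {
    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(2);

        // Start two "activities" concurrently...
        Future<String> broker = pool.submit(() -> "broker started");
        Future<String> producers = pool.submit(() -> "producers started");

        // ...then join: block until both have completed before
        // moving on to the next step of the flow.
        System.out.println(broker.get() + ", " + producers.get());

        pool.shutdown();
    }
}
```

The point of a little orchestration library is expressing exactly this kind of fork/join/timeout logic declaratively in Java, instead of hand-rolling threads and futures each time.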

So we're getting closer to having the pieces in place.

A little while ago I hacked up some ideas on how we could build this
in a wiki page...

There's an early version of an m2 plugin which lets you run brokers
and performance producer/consumer tests

which now has some documentation too

I'm sure there are going to be some rough edges along the way to a
great distributed performance testing suite - e.g. we may have to
tweak the gbuild stuff to handle dependent sets of builds a little
better, so that, say, 10 different builds which have to run
concurrently on different machines can be scheduled properly etc. But
it's getting closer.

So right now it's easy to run performance tests of
producer/consumer/broker type models, specifying the connection URL and
arguments to use in the test.

e.g. try following the user manual and see if you can run some basic
performance tests on your hardware.
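The shape of an invocation would be roughly the following - treat the goal and property names here as placeholders to check against the user manual, not the plugin's definitive interface:

```
# start a broker, then fire a producer test at it
mvn activemq-perf:broker
mvn activemq-perf:producer -Durl=tcp://localhost:61616
```

The nice part is that because it's just a maven plugin, the same invocation works unchanged whether a person runs it locally or a gbuild agent runs it on a cluster machine.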

Then we can hopefully start adding tools to graph the performance
results over time, compare performance results on different hardware,
or compare the effects of different optimisation flags etc.

So far the plugin just deals with performance testing; I'm sure we
could extend the concept to automated integration/system testing too.

