db-derby-dev mailing list archives

From "Merlin Beedell at Demon" <mer...@thebeedells.demon.co.uk>
Subject Performance tests for very large data sets?
Date Tue, 13 Jun 2006 16:32:53 GMT
I hope that this is a common enough request that some results and comments 
already exist. Just point me in the right direction!

I would like to know how Derby performs (inserts and searches) as the 
record count grows from 1 to 1,000 million or more, particularly when there 
are foreign key constraints involved, and how this compares with other 
databases.

I fully recognise that such tests are quite subjective and vary greatly 
depending on various settings that do not change the SQL used.
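For example, the sort of settings I mean are Derby's page size and page 
cache, set in derby.properties (the values below are purely illustrative, 
not a recommendation):

```
# Illustrative values only -- tune for your own workload.
# Larger pages can help tables with wide rows; the default is 4096.
derby.storage.pageSize=32768
# Number of pages held in the cache; the default is 1000.
derby.storage.pageCacheSize=4000
```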

We have a situation where Derby performs well up to 100,000 records or so, 
then (insert) performance plummets. It uses a fairly large text foreign key, 
and we are changing this to an auto-number key instead, which we know will 
help.  The aim is to have good performance up to 6 GB worth of data (approx 
60-120 million rows over a small number of tables).
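In case it helps frame the discussion, this is roughly the kind of change 
we are making (table and column names here are made up for illustration): 
replacing a wide VARCHAR foreign key with a Derby generated identity key, so 
the referenced index holds small integers rather than long strings.

```sql
-- Illustrative only: the "before" shape, with a wide text column
-- serving as the foreign key target.
CREATE TABLE customer_old (
    cust_ref VARCHAR(200) NOT NULL PRIMARY KEY
);

-- "After": a generated identity column becomes the key, and the
-- text value is kept as ordinary (indexable) data.
CREATE TABLE customer (
    cust_id  INTEGER NOT NULL GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
    cust_ref VARCHAR(200) NOT NULL
);

-- Child rows now reference the small integer key.
CREATE TABLE orders (
    order_id INTEGER NOT NULL GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
    cust_id  INTEGER NOT NULL REFERENCES customer (cust_id)
);
```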

Some experience from other developers may well help us move forward, as well 
as any reasoned test results showing that Derby is a good choice for us.
