db-derby-dev mailing list archives

From "Jean T. Anderson" <...@bristowhill.com>
Subject Re: Performance tests for very large data sets?
Date Tue, 13 Jun 2006 22:23:24 GMT
You might be interested in Olav Sandstaa's ApacheCon US 2005
presentation "Performance Analysis of Apache Derby". You can download it
from here:

http://wiki.apache.org/apachecon/Us2005OnlineSessionSlides

 -jean


Merlin Beedell at Demon wrote:
> I hope that this is a common enough request that some results and
> comments already exist. Just point me in the right direction!
> 
> I would like to know how Derby performs (inserts and searches) as the
> record count grows from 1 to 1,000 million or greater, particularly when
> foreign key constraints are included, and how it compares with other
> databases.
> 
> I fully recognise that such tests are quite subjective and vary greatly
> depending on configuration settings that do not change the SQL used.
> 
> We have a situation where Derby performs well up to 100,000 records or
> so, then insert performance plummets. The schema uses a fairly large text
> foreign key, and we are changing this to an auto-number key instead,
> which we know will help.  The aim is to have good performance up to 6 GB
> of data (approx 60-120 million rows over a small number of tables).
> 
> Some experience from other developers may well help us move forward, as
> would any reasoned test results that show that Derby is a good choice
> for us.
> 

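The change Merlin describes, replacing a wide text foreign key with an
auto-number key, corresponds to Derby's identity columns (GENERATED ALWAYS
AS IDENTITY). The Java/JDBC sketch below shows one possible shape for that
schema plus batched inserts; the table and column names (CUSTOMERS, EVENTS,
CUST_ID, CUST_CODE, PAYLOAD) and the connection URL are hypothetical, not
the poster's actual schema.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class IdentityKeySketch {
        public static void main(String[] args) throws SQLException {
            // Hypothetical embedded database name; a real application
            // would use its own connection URL and schema.
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:derby:perfdb;create=true")) {

                try (Statement st = conn.createStatement()) {
                    // Parent table: a generated identity column becomes the
                    // primary key; the wide text value stays as an ordinary
                    // column instead of being the key itself.
                    st.executeUpdate(
                        "CREATE TABLE CUSTOMERS ("
                        + " CUST_ID INTEGER NOT NULL GENERATED ALWAYS AS IDENTITY,"
                        + " CUST_CODE VARCHAR(200) NOT NULL,"
                        + " PRIMARY KEY (CUST_ID))");

                    // Child table: the foreign key now references the compact
                    // integer key rather than the long text value.
                    st.executeUpdate(
                        "CREATE TABLE EVENTS ("
                        + " EVENT_ID INTEGER NOT NULL GENERATED ALWAYS AS IDENTITY,"
                        + " CUST_ID INTEGER NOT NULL REFERENCES CUSTOMERS (CUST_ID),"
                        + " PAYLOAD VARCHAR(1000),"
                        + " PRIMARY KEY (EVENT_ID))");

                    // One parent row; with GENERATED ALWAYS AS IDENTITY its
                    // CUST_ID will be 1 in a freshly created database.
                    st.executeUpdate(
                        "INSERT INTO CUSTOMERS (CUST_CODE) VALUES ('ACME-0001')");
                }

                // Bulk-load the child table in batches inside explicit
                // transactions rather than committing each row individually.
                conn.setAutoCommit(false);
                try (PreparedStatement ps = conn.prepareStatement(
                        "INSERT INTO EVENTS (CUST_ID, PAYLOAD) VALUES (?, ?)")) {
                    for (int i = 0; i < 100000; i++) {
                        ps.setInt(1, 1);
                        ps.setString(2, "event " + i);
                        ps.addBatch();
                        if (i % 1000 == 999) {
                            ps.executeBatch();
                            conn.commit();
                        }
                    }
                    ps.executeBatch();
                    conn.commit();
                }
            }
        }
    }

The narrower integer key keeps the foreign-key index compact, so far more
entries fit on each index page and in the page cache, which is part of why
the switch tends to help once the data set outgrows memory. Batch size and
cache settings (e.g. derby.storage.pageCacheSize) would still need tuning
against the real workload.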
