db-derby-user mailing list archives

From "Steve Pannier" <steve.pann...@qlogic.com>
Subject RE: Performance question
Date Thu, 12 Jul 2007 19:43:11 GMT
Bryan,

Thanks for your reply.

>> > footprint, and is Java-based.  I have been testing Derby to see how it
>> > performs storing large amounts of data in this environment, and I've
>> > seen a slight degradation of performance over time.  I'm using Derby
>> 
>> Hi Steve, I think this is very interesting data.
>> 
>> Can you run your experiment out farther, say to 2M, 5M, or 10M rows?

My last test of 1 million "iterations" inserted 400M records into my
database (since each iteration is 400 inserts).  Did you mean to suggest
re-running the test with 2M, 5M, or 10M *iterations* as opposed to
*rows*?

>> 
>> Can you characterize the overall behavior of the machine during the
>> experiment? What is the CPU load like? What is the disk load like?
>> How does the Derby memory profile look? Do any of these things appear
>> to change as the experiment goes on?

I didn't watch CPU load over the whole test run, which lasted the
entire weekend.  But I did see (from 'top') that the load was at ~82%
for the first several minutes after starting the test.  One note: my
test program doesn't store data on a per-second basis (i.e. it doesn't
sleep between iterations).  It just inserts the data as fast as the
system will allow.  (This is not a requirement, though - I just do it
this way to get lots of data into the database as fast as possible,
since we want to run retrieval tests also.)  So higher CPU loads are
expected during the test runs.
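
For reference, the core of the test loop looks roughly like the
following.  This is just a sketch of the shape of it, not our actual
code - the database name, table, and columns are made-up placeholders,
and it assumes the STATS table has already been created:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class InsertTest {
        public static void main(String[] args) throws Exception {
            // Embedded driver; "statsdb" is a placeholder database name.
            Class.forName("org.apache.derby.jdbc.EmbeddedDriver");
            Connection conn =
                DriverManager.getConnection("jdbc:derby:statsdb;create=true");
            conn.setAutoCommit(false);

            PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO stats (id, val) VALUES (?, ?)");

            long iterations = 1000000L;          // 1M iterations
            for (long i = 0; i < iterations; i++) {
                for (int j = 0; j < 400; j++) {  // 400 inserts per iteration
                    ps.setLong(1, i * 400 + j);
                    ps.setInt(2, j);
                    ps.addBatch();
                }
                ps.executeBatch();
                conn.commit();  // commit each iteration; no sleep in between
            }

            ps.close();
            conn.close();
        }
    }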

As far as the Derby memory profile goes - how would I go about
measuring that?

And not being much of a system-monitoring guru, do you have any tips on
how to monitor CPU, disk, and the memory profile over the life of the
experiment?
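
The only thing I've thought of so far is polling the JVM heap from
inside the test program, something like the fragment below - on the
assumption that heap usage is a fair proxy for Derby's memory profile
when Derby runs embedded in the same JVM (with the network server you'd
watch the server's JVM instead):

    // Crude heap-usage probe, called periodically from the test loop.
    // With embedded Derby, the page cache lives in this same heap.
    static void logHeap() {
        Runtime rt = Runtime.getRuntime();
        long usedMb = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
        System.out.println("heap used: " + usedMb + " MB");
    }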

>> 
>> Lastly, can you try your experiment with Derby 10.3, which is in beta
>> testing now?  There were a lot of performance changes in 10.3, although
>> nothing that was specifically targeted at exactly your scenario.
>> 
>> I think there could be a lot of possible explanations of the behavior
>> you're seeing.
>> 
>> On the other hand, starting with a completely fresh system and seeing
>> only an 8% slowdown as it gets dirty is, overall, pretty good, don't you
>> think?

Well, the slowdown I saw was minimal over the one weekend that I ran
the test.  And if a single weekend were the maximum continuous period we
had to store data, that would be acceptable.  But our requirement is to
store/save statistical data for up to 13 months (sorry, I didn't make
that clear in my original post), so if the performance continues to
degrade over that time period, then it would be problematic.

I was hoping there would be a tuning parameter I could tweak - maybe
bump up pageSize or pageCacheSize, or set some other parameter that
would make a difference.
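
For example, this is the sort of thing I had in mind (the values here
are just guesses on my part, not recommendations):

    // Derby reads these before the engine boots, so set them (or put
    // them in derby.properties) before loading the driver.
    // pageCacheSize is in pages (default 1000).  pageSize only affects
    // tables created afterward; valid values are 4096, 8192, 16384,
    // and 32768.
    System.setProperty("derby.storage.pageCacheSize", "4000");
    System.setProperty("derby.storage.pageSize", "16384");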

>> 
>> thanks,
>> 
>> bryan

One other comment about the tests I'm running: their purpose is to test
the maximum limits of our data storage requirements, so they are not
meant to simulate normal usage of our application.  But we do need to
know what the limits are, and whether we can even meet the requirements
set forth by our marketing group - thus the tests.

I hope to be able to continue running these tests, but we have other
projects that keep pulling me away from this one.  If time permits, I'll
try some of your suggestions.

Regards.
 
Steve

