db-derby-user mailing list archives

From "Stavros Macrakis" <macra...@alum.mit.edu>
Subject Efficient loading of calculated data
Date Wed, 12 Dec 2007 21:06:20 GMT
Hi, I have an application whose output is about 500,000 pairs (string,
integer) -- this is the result of some fairly fancy text processing.
I'd like to put this data into a (new) Derby table. Using an individual
INSERT statement for each row takes over an hour, which seems much too
long. Using the bulk import feature involves writing the data out to a
file and then importing from that file, which seems rather roundabout.
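[For reference, the bulk-import route mentioned above goes through Derby's built-in system procedure SYSCS_UTIL.SYSCS_IMPORT_TABLE. A sketch of the call, with a hypothetical table name and file path:]

```sql
-- Derby's built-in bulk import procedure (table and file names here
-- are placeholders, not from the original message).
CALL SYSCS_UTIL.SYSCS_IMPORT_TABLE(
    NULL,              -- schema name (NULL = current schema)
    'WORD_COUNTS',     -- target table
    '/tmp/pairs.txt',  -- data file the application wrote out
    NULL,              -- column delimiter (NULL = comma)
    NULL,              -- character delimiter (NULL = double quote)
    NULL,              -- codeset (NULL = platform default)
    0);                -- 0 = append; non-zero = replace existing rows
```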

So... What is the recommended way to insert a large number of rows
from an application? Is the answer the same for 10^3 or 10^8 rows? Do
the data types involved (e.g. large text field with newlines) make any
difference to the answer?
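[A common answer to slow row-at-a-time loading, independent of Derby, is to disable autocommit, reuse one prepared statement, and commit in batches rather than per row. A minimal sketch of that pattern, shown here with Python's built-in sqlite3 purely to illustrate the batching idea (the table name and batch size are arbitrary, and Derby/JDBC would use PreparedStatement.addBatch/executeBatch instead):]

```python
import sqlite3

def load_pairs(conn, pairs, batch_size=10_000):
    """Insert (string, integer) pairs with one prepared statement,
    committing once per batch instead of once per row."""
    cur = conn.cursor()
    batch = []
    for pair in pairs:
        batch.append(pair)
        if len(batch) >= batch_size:
            cur.executemany("INSERT INTO word_counts VALUES (?, ?)", batch)
            conn.commit()
            batch.clear()
    if batch:  # flush the final partial batch
        cur.executemany("INSERT INTO word_counts VALUES (?, ?)", batch)
        conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE word_counts (word TEXT, n INTEGER)")
load_pairs(conn, ((f"w{i}", i) for i in range(500_000)))
```

With 500,000 rows this kind of batching typically turns an hours-long load into one taking seconds to minutes, since the per-transaction overhead is paid once per batch rather than once per row.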

Thanks,

         -s
