db-derby-user mailing list archives

From "Sedillo, Derek \(Mission Systems\)" <Derek.Sedi...@ngc.com>
Subject Large multi-record insert performance
Date Wed, 14 Mar 2007 15:05:39 GMT
Hello,

I am trying to find the most efficient way to handle multi-record
inserts into a single table in Derby.  Currently we use Oracle to
perform multi-record inserts with 'one' insert statement for large
amounts of data.  This bulk loading process greatly enhances
performance.  For example:

INSERT INTO SSJ
VALUES (:tmp_ssj_data);  // where tmp_ssj_data is an array (100s or 1000s) of C structured records

Is there any way in Java DB to perform something similar?  So let's say
table t has three integer columns. 
INSERT INTO t VALUES (?,?,?) 
Now you have 1000 rows to insert into t that look like this:
(1,1,1), (2,2,2), (3,3,3), (4,4,4), ...
How would you pass these 1000 records in as VALUES with a 'single'
insert statement?
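
(For concreteness, the table I have in mind would be something like the
following; the column names are just made up for illustration:)
CREATE TABLE t (c1 INT, c2 INT, c3 INT)
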
I am thinking of two options so far:
1.  Use a prepared statement and 'set' all 1000 record values.  Then
execute.  (A rough sketch of what I mean is below, after option 2.)
     But I question whether calling the set methods for all 1000 records
would buy anything in terms of performance.

2.  The other option would be to iterate over an array of 1000 records and
build a VALUES clause string that looks like:
String records = "(1,1,1), (2,2,2), (3,3,3), (4,4,4), ...(1000, 1000,
1000)";
    Then append the records to the insert statement and execute it (also
sketched below):
String query = "INSERT INTO t VALUES " + records;
Execute query;
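
To make option 1 concrete, here is roughly what I have in mind, assuming
the table t above already exists and using standard JDBC batching
(addBatch/executeBatch).  The connection URL and class name are just for
illustration:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class BatchInsertSketch {
    public static void main(String[] args) throws SQLException {
        // Illustrative embedded Derby URL; the database name is made up
        Connection conn = DriverManager.getConnection("jdbc:derby:testdb;create=true");
        conn.setAutoCommit(false);  // commit once at the end rather than per row

        // One prepared INSERT; set the parameters per row and queue each row in a batch
        PreparedStatement ps = conn.prepareStatement("INSERT INTO t VALUES (?, ?, ?)");
        for (int i = 1; i <= 1000; i++) {
            ps.setInt(1, i);
            ps.setInt(2, i);
            ps.setInt(3, i);
            ps.addBatch();          // queue the row instead of executing it immediately
        }
        ps.executeBatch();          // send all 1000 rows to the database in one call
        conn.commit();

        ps.close();
        conn.close();
    }
}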
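
And a rough sketch of option 2, again assuming table t exists.  The VALUES
string is built in a loop and the whole thing is sent as one statement (I
am assuming Derby will accept a multi-row VALUES list of this size):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.sql.Statement;

public class ValuesStringInsertSketch {
    public static void main(String[] args) throws SQLException {
        // Illustrative embedded Derby URL; the database name is made up
        Connection conn = DriverManager.getConnection("jdbc:derby:testdb;create=true");

        // Build "(1,1,1), (2,2,2), ..., (1000,1000,1000)" in memory
        StringBuilder records = new StringBuilder();
        for (int i = 1; i <= 1000; i++) {
            if (i > 1) {
                records.append(", ");
            }
            records.append("(").append(i).append(",").append(i).append(",").append(i).append(")");
        }

        // Append the rows to a single INSERT and execute it once
        String query = "INSERT INTO t VALUES " + records;
        Statement stmt = conn.createStatement();
        stmt.executeUpdate(query);
        stmt.close();
        conn.close();
    }
}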

Which method would be the most efficient way to perform these 1000
inserts in Derby?  Is there a more optimized way to do this?

Thank you for any experience or ideas you may have regarding large data
inserts.

Derek Sedillo 
SWAFS DBA / Software Engineer 
Northrop Grumman Mission Systems 
Tel: (719) 570-8256 


