Hi Derek,
Have you tried batch inserts?
PreparedStatement pstmt = connection.prepareStatement("INSERT INTO t VALUES (?, ?, ?)");
for (int i = 1; i <= 1000; i++) {
  pstmt.setInt(1, i);
  pstmt.setInt(2, i);
  pstmt.setInt(3, i);
  pstmt.addBatch();  // queue this row; executeBatch() sends all queued rows at once
}
int[] results = pstmt.executeBatch();

From: Sedillo, Derek (Mission Systems) [mailto:Derek.Sedillo@ngc.com]
Sent: Wednesday, March 14, 2007 11:06 AM
To: derby-user@db.apache.org
Subject: Large multi-record insert performance


I am trying to find the most efficient way to handle multi-record inserts into a single table in Derby. Currently we use Oracle to perform multi-record inserts with 'one' insert statement for large amounts of data. This bulk-loading process greatly enhances performance. For example:

VALUES (:tmp_ssj_data); 
//where tmp_ssj_data is an array (100s or 1000s) of C structured records

Is there any way in Java DB to perform something similar? So let's say table t has three integer columns.
Now you have 1000 rows to insert into t that look like this:
(1,1,1), (2,2,2), (3,3,3), (4,4,4), …

How would you pass these 1000 records in as VALUES with a 'single' insert statement?
I am thinking of two options so far:
1.  Use a prepared statement and 'set' all 1000 record values, then execute.
     But I question whether calling the set methods for all 1000 records would buy anything in terms of performance.

2.  The other option would be to iterate over an array of 1000 records and build a VALUES clause string like:
String records = "(1,1,1), (2,2,2), (3,3,3), (4,4,4), …(1000, 1000, 1000)";
    Then append the records to the insert statement and execute:
String query = "INSERT INTO t VALUES " + records;
Execute query;
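Option 2 can be sketched in plain Java; the class and method names here are illustrative only, and the (i,i,i) row values mirror the example above:

```java
// Sketch of option 2: build a single multi-row INSERT ... VALUES statement.
// The resulting string would be passed to java.sql.Statement.executeUpdate().
public class MultiRowInsert {
    static String buildInsert(String table, int rowCount) {
        StringBuilder sql = new StringBuilder("INSERT INTO " + table + " VALUES ");
        for (int i = 1; i <= rowCount; i++) {
            if (i > 1) sql.append(", ");
            // Each row is (i,i,i), matching the example data above.
            sql.append("(").append(i).append(",").append(i).append(",").append(i).append(")");
        }
        return sql.toString();
    }

    public static void main(String[] args) {
        // Prints: INSERT INTO t VALUES (1,1,1), (2,2,2), (3,3,3)
        System.out.println(buildInsert("t", 3));
        // With a live connection one might then do:
        //   statement.executeUpdate(buildInsert("t", 1000));
    }
}
```

Note that building SQL by string concatenation is only safe here because the values are program-generated integers; real user data should go through a PreparedStatement.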

Which method would be the most efficient way to perform these 1000 inserts in Derby? Is there a more optimized way to do this?

Thank you for any experience or ideas you may have regarding large data inserts.

Derek Sedillo
SWAFS DBA / Software Engineer
Northrop Grumman Mission Systems
Tel: (719) 570-8256