db-derby-user mailing list archives

From "Lance J. Andersen" <Lance.Ander...@Sun.COM>
Subject Re: Large multi-record insert performance
Date Wed, 14 Mar 2007 22:33:46 GMT


Even if the backend does not provide any optimization for batch processing,
I would hope that there would still be some efficiency, especially in a
networked environment, versus building the strings and invoking execute()
1000 times, if only in the amount of data on the wire...
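
For what it's worth, a rough sketch of what I have in mind, run against the
network server (the URL, database name, and table here are illustrative only,
and it assumes derbyclient.jar is on the classpath):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class BatchInsertSketch {
        public static void main(String[] args) throws Exception {
            // Illustrative connection URL; adjust host/port/database as needed.
            Connection conn = DriverManager.getConnection(
                    "jdbc:derby://localhost:1527/testdb;create=true");
            conn.setAutoCommit(false);
            conn.createStatement().executeUpdate(
                    "CREATE TABLE t (a INT, b INT, c INT)");
            PreparedStatement pstmt = conn.prepareStatement(
                    "INSERT INTO t VALUES (?, ?, ?)");
            for (int i = 1; i <= 1000; i++) {
                pstmt.setInt(1, i);
                pstmt.setInt(2, i);
                pstmt.setInt(3, i);
                pstmt.addBatch();   // queued client-side until executeBatch()
            }
            int[] counts = pstmt.executeBatch();
            conn.commit();
            pstmt.close();
            conn.close();
        }
    }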

Mike Matrigali wrote:
> Did you try the most straightforward approach, i.e. just doing 1000
> inserts using Derby in embedded mode? For example:
>
> > PreparedStatement pstmt = connection.prepareStatement(
> >         "INSERT INTO t VALUES (?, ?, ?)");
> > for (int i = 1; i <= 1000; i++) {
> >     pstmt.setInt(1, i);
> >     pstmt.setInt(2, i);
> >     pstmt.setInt(3, i);
> >     pstmt.execute();
> > }
> connection.commit();
>
> I haven't done the comparison, but I don't think derby is going to do
> much optimization with the batch. I would definitely start there. The
> usual problem is the commit, so turning autocommit off and doing as
> many inserts as possible per commit is the best place to start.
>
> If you are not using embedded derby then network overhead may be more of
> a problem.
>
>
> Sedillo, Derek (Mission Systems) wrote:
>> Thomas,
>>
>> Thank you. I have not tried the batch feature yet, but you have given
>> me some great stuff to work with.
>>
>> I will have to try this and compare the timings to the single INSERT
>> string approach below. I may also have to adjust the number of rows per
>> commit up or down to achieve optimal performance.
>>
>> Iterate an array of 1000 records and create a VALUES clause string that
>> looks like:
>>     String records = "(1,1,1), (2,2,2), (3,3,3), (4,4,4), …(1000, 1000, 1000)";
>> Then append the records to the INSERT statement and execute it:
>>     String query = "INSERT INTO t VALUES " + records;
>>
>> - Derek
>> ------------------------------------------------------------------------
>> From: Thomas J. Taylor [mailto:thomas.taylor@itqa.miami.edu]
>> Sent: Wednesday, March 14, 2007 12:01 PM
>> To: 'Derby Discussion'
>> Subject: RE: Large multi-record insert performance
>>
>> Hi Derek,
>>
>> Have you tried batch inserts?
>>
>> PreparedStatement pstmt = connection.prepareStatement(
>>         "INSERT INTO t VALUES (?, ?, ?)");
>> for (int i = 1; i <= 1000; i++) {
>>     pstmt.setInt(1, i);
>>     pstmt.setInt(2, i);
>>     pstmt.setInt(3, i);
>>     pstmt.addBatch();
>> }
>> int[] results = pstmt.executeBatch();
>>
>> Thomas
>> ------------------------------------------------------------------------
>> From: Sedillo, Derek (Mission Systems) [mailto:Derek.Sedillo@ngc.com]
>> Sent: Wednesday, March 14, 2007 11:06 AM
>> To: derby-user@db.apache.org
>> Subject: Large multi-record insert performance
>>
>> Hello,
>>
>> I am trying to find the most efficient way to handle multi-record
>> inserts into a single table in Derby. Currently we use Oracle to
>> perform multi-record inserts with 'one' insert statement for large 
>> amounts of data. This bulk loading process greatly enhances 
>> performance. For example:
>>
>>     INSERT INTO SSJ
>>     VALUES (:tmp_ssj_data);
>> where tmp_ssj_data is an array (100s or 1000s) of C structured records.
>>
>> Is there any way in Java DB to perform something similar? So let's say
>> table t has three integer columns:
>>     INSERT INTO t VALUES (?, ?, ?)
>> Now you have 1000 rows to insert into t that look like this:
>>     (1,1,1), (2,2,2), (3,3,3), (4,4,4), …
>> How would you pass these 1000 records in as VALUES with a 'single'
>> insert statement?
>> I am thinking of two options so far:
>> 1. Use a prepared statement and 'set' all 1000 record values, then
>> execute. But I question whether calling the set method for all 1000
>> records would buy anything in terms of performance.
>>
>> 2. The other option would be to iterate an array of 1000 records and
>> create a VALUES clause string that looks like:
>>     String records = "(1,1,1), (2,2,2), (3,3,3), (4,4,4), …(1000, 1000, 1000)";
>> Then append the records to the INSERT statement and execute it, as in
>> the sketch below:
>>     String query = "INSERT INTO t VALUES " + records;
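>>
>> A rough sketch of option 2 (illustrative only; it assumes an open
>> java.sql.Connection named connection and the same three-column table t):
>>
>>     // assumes: import java.sql.Statement;
>>     StringBuilder records = new StringBuilder();
>>     for (int i = 1; i <= 1000; i++) {
>>         if (i > 1) {
>>             records.append(", ");
>>         }
>>         records.append("(").append(i).append(",")
>>                .append(i).append(",").append(i).append(")");
>>     }
>>     Statement stmt = connection.createStatement();
>>     int rows = stmt.executeUpdate("INSERT INTO t VALUES " + records);
>>     stmt.close();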
>>
>> Which method would be the most efficient way to perform these 1000 
>> inserts in Derby? Is there a more optimized way to do this?
>>
>> Thank you for any experience or ideas you may have regarding large 
>> data inserts.
>>
>> Derek Sedillo
>> SWAFS DBA / Software Engineer
>> Northrop Grumman Mission Systems
>> Tel: (719) 570-8256
>>
>>
>
