db-derby-user mailing list archives

From Nurullah Akkaya <nurullah_akk...@yahoo.com>
Subject Re: Large multi-record insert performance
Date Thu, 15 Mar 2007 10:20:50 GMT
Using a prepared statement plus batching will be your fastest
approach. Even though you pass 1000 records per query, Derby still
has to parse that query each time, whereas a prepared statement is
parsed only once. And if you use batching, all your data is passed
at once, so Derby can make better use of your disk drive (instead
of flushing the index after each insert, it does it once for the
whole batch; this was the case in MySQL, I may be wrong for Derby).
Using prepared statements and batches of 100,000 in embedded mode
on an AMD 1800 with 1 GB RAM and an IDE hard drive, I can achieve
up to 500 (15-byte) inserts per second all day long. (I even have a
composite index on the table that includes all the columns, so the
rows are pretty much written twice.)
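For reference, the approach described above (prepared statement, addBatch, autocommit off, a single commit at the end) might be sketched roughly as follows against an in-memory embedded Derby database. The class name, table definition, and row count are placeholders for illustration, not anything from the thread:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class BatchInsertDemo {
    public static void main(String[] args) throws SQLException {
        // In-memory embedded Derby database; requires derby.jar on the classpath.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:derby:memory:demo;create=true")) {
            try (Statement s = conn.createStatement()) {
                s.executeUpdate("CREATE TABLE t (a INT, b INT, c INT)");
            }
            conn.setAutoCommit(false);  // avoid a commit (and flush) per row
            try (PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO t VALUES (?, ?, ?)")) {
                for (int i = 1; i <= 1000; i++) {
                    ps.setInt(1, i);
                    ps.setInt(2, i);
                    ps.setInt(3, i);
                    ps.addBatch();      // queue the row instead of executing it
                }
                ps.executeBatch();      // send all 1000 rows in one call
            }
            conn.commit();              // commit once for the whole batch
            try (Statement s = conn.createStatement();
                 ResultSet rs = s.executeQuery("SELECT COUNT(*) FROM t")) {
                rs.next();
                System.out.println(rs.getInt(1)); // prints 1000
            }
        }
    }
}
```

Whether `executeBatch()` itself is faster in Derby than looping `execute()` is exactly the open question in this thread; the uncontested win is the single commit at the end.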



On Mar 15, 2007, at 12:24 AM, Mike Matrigali wrote:

> Did you try the most straightforward approach, i.e. just doing 1000
> inserts using Derby in embedded mode?  E.g.:
>
> > PreparedStatement pstmt = connection.prepareStatement("INSERT INTO t
> > VALUES (?, ?, ?)");
> > for (int i = 1; i <= 1000; i++) {
> >   pstmt.setInt(1, i);
> >   pstmt.setInt(2, i);
> >   pstmt.setInt(3, i);
> >   pstmt.execute();
> > }
>   connection.commit();
>
> I haven't done the comparison, but I don't think Derby is going to do
> much optimization with the batch.  I would definitely start there.  The
> usual problem is the commit, so turning autocommit off and doing as
> many inserts as possible per commit is the best place to start.
>
> If you are not using embedded Derby, then network overhead may be
> more of a problem.
>
>
> Sedillo, Derek (Mission Systems) wrote:
>> Thomas,
>> Thank you, I have not tried the batch feature yet, but you have
>> given me some great stuff to work with.
>> I will have to try this and compare the timings to the single
>> INSERT string approach below.  I may also have to adjust the
>> commit number up/down to achieve optimal performance.
>>  Iterate an array of 1000 records and create a VALUES clause  
>> string to look like:
>> /String records = "(1,1,1), (2,2,2), (3,3,3), (4,4,4), …(1000,  
>> 1000, 1000)";/
>>     Then append the insert statement with the records and execute.
>> /String query = "INSERT INTO t VALUES " + records;/
>> /Execute query;/
>>  - Derek
>> ------------------------------------------------------------------------
>> *From:* Thomas J. Taylor [mailto:thomas.taylor@itqa.miami.edu]
>> *Sent:* Wednesday, March 14, 2007 12:01 PM
>> *To:* 'Derby Discussion'
>> *Subject:* RE: Large multi-record insert performance
>> Hi Derek,
>>  Have you tried batch inserts?
>> PreparedStatement pstmt = connection.prepareStatement("INSERT INTO t VALUES (?, ?, ?)");
>> for (int i = 1; i <= 1000; i++) {
>>   pstmt.setInt(1, i);
>>   pstmt.setInt(2, i);
>>   pstmt.setInt(3, i);
>>   pstmt.addBatch();
>> }
>> int[] results = pstmt.executeBatch();
>>  Thomas
>> ------------------------------------------------------------------------
>> *From:* Sedillo, Derek (Mission Systems)  
>> [mailto:Derek.Sedillo@ngc.com]
>> *Sent:* Wednesday, March 14, 2007 11:06 AM
>> *To:* derby-user@db.apache.org
>> *Subject:* Large multi-record insert performance
>> Hello,
>> I am trying to find the most efficient way to handle multi-record
>> inserts into a single table in Derby.  Currently we use Oracle to  
>> perform multi-record inserts with 'one' insert statement for large  
>> amounts of data.  This bulk loading process greatly enhances  
>> performance.  For example:
>> /INSERT INTO SSJ
>> VALUES (:tmp_ssj_data);/  where *tmp_ssj_data* is an array (100s
>> or 1000s) of C structured records
>> Is there any way in Java DB to perform something similar?  So let's
>> say table t has three integer columns.
>> /INSERT INTO t VALUES (?,?,?)/
>> Now you have 1000 rows to insert into t that look like this:
>> (1,1,1), (2,2,2), (3,3,3), (4,4,4), …
>> How would you pass these 1000 records in as VALUES with a 'single'  
>> insert statement?
>> I am thinking of two options so far:
>> 1.  Use a prepared statement and 'set' all 1000 record values,
>> then execute.  But I question whether calling the set method
>> for all 1000 records would buy anything in terms of performance.
>> 2.  The other option would be to iterate an array of 1000 records  
>> and create a VALUES clause string to look like:
>> /String records = "(1,1,1), (2,2,2), (3,3,3), (4,4,4), …(1000,  
>> 1000, 1000)";/
>>     Then append the insert statement with the records and execute.
>> /String query = "INSERT INTO t VALUES " + records;/
>> /Execute query;/
>> Which method would be the most efficient way to perform these 1000  
>> inserts in Derby?  Is there a more optimized way to do this?
>> Thank you for any experience or ideas you may have regarding large  
>> data inserts.
>> Derek Sedillo
>> SWAFS DBA / Software Engineer
>> *Northrop Grumman Missions Systems*
>> Tel: (719) 570-8256
>

