db-derby-user mailing list archives

From Mike Matrigali <mikem_...@sbcglobal.net>
Subject Re: Large multi-record insert performance
Date Wed, 14 Mar 2007 22:40:52 GMT


Lance J. Andersen wrote:
> 
> 
> Even if the backend does not provide optimization for batch processing,
> I would hope that there would still be some efficiency, especially in a
> networked environment, compared to building the strings and invoking
> execute() 1000 times, in terms of the amount of data on the wire...
> 
>
I could not tell from the question whether this was network or not.  I
agree that in the network case limiting the number of execute calls is
probably best.  In embedded I am not sure - I would not be surprised if
doing 1000 in a batch is slower than just doing the executes.
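
For reference, a minimal sketch of the two approaches being compared,
assuming a hypothetical table T(ID INT, NAME VARCHAR(50)) and an
already-open Connection; the table, columns and row count are
illustrative only, not taken from the original question:

import java.sql.Connection;
import java.sql.PreparedStatement;

public class InsertComparison {

    // One execute (and, in client/server mode, one round trip) per row.
    static void insertOneByOne(Connection conn, int rows) throws Exception {
        try (PreparedStatement ps =
                 conn.prepareStatement("INSERT INTO T (ID, NAME) VALUES (?, ?)")) {
            for (int i = 0; i < rows; i++) {
                ps.setInt(1, i);
                ps.setString(2, "row-" + i);
                ps.executeUpdate();
            }
        }
    }

    // Same parameterized statement, queued with addBatch() and sent in a
    // single executeBatch() call.
    static void insertBatched(Connection conn, int rows) throws Exception {
        try (PreparedStatement ps =
                 conn.prepareStatement("INSERT INTO T (ID, NAME) VALUES (?, ?)")) {
            for (int i = 0; i < rows; i++) {
                ps.setInt(1, i);
                ps.setString(2, "row-" + i);
                ps.addBatch();
            }
            ps.executeBatch();
        }
    }
}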

In either case I really would stay away from string manipulation as much
as possible, and also stay away from things that create very long SQL
statements, like a 1000-term VALUES clause.
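
To make that last point concrete (again only a sketch, with a
hypothetical single-column table T): the commented-out fragment is the
string-built, many-term VALUES statement being discouraged, and the code
below it is the prepare-once, bind-and-execute alternative.

import java.sql.Connection;
import java.sql.PreparedStatement;

public class AvoidLongValuesClause {

    static void load(Connection conn, String[] names) throws Exception {
        // Discouraged: concatenating one statement with hundreds of VALUES terms.
        //   StringBuilder sql = new StringBuilder("INSERT INTO T (NAME) VALUES ");
        //   for (String n : names) { sql.append("('").append(n).append("'), "); }
        //   ... the engine has to parse one very large statement, and hand-built
        //   quoting is error prone.

        // Preferred: prepare one short parameterized statement and reuse it.
        try (PreparedStatement ps =
                 conn.prepareStatement("INSERT INTO T (NAME) VALUES (?)")) {
            for (String n : names) {
                ps.setString(1, n);
                ps.executeUpdate();
            }
        }
    }
}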

