db-derby-user mailing list archives

From "Bloois, Rene de" <rene.de.blo...@logica.com>
Subject Re: OutOfMemoryException when executing 180,000 batches
Date Fri, 02 Dec 2011 10:56:46 GMT


You could also try to split the batch into smaller batches, like this:

int batchSize = 0;
while( ... there is data ... )
{
	statement.setXxxx( .... );
	// etc...
	
	statement.addBatch();
	batchSize++;
	
	if( batchSize >= 1000 ) // Maximum batch size
	{
		statement.executeBatch();
		batchSize = 0;
	}
}
if( batchSize > 0 )
	statement.executeBatch();

That solved it for me when inserting 1 million records into Oracle, and it is
still really fast. It should work on Derby too, since this is standard JDBC.

The advantage of this approach is that you don't need to split up your
transaction if you don't want to.
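
For reference, here is a minimal, self-contained sketch of the same pattern
against an embedded Derby database. The JDBC URL, table name, column layout
and generated test rows are just assumptions for illustration; only the
batching loop itself is what matters:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.Statement;

public class BatchInsertExample
{
	public static void main( String[] args ) throws Exception
	{
		// Assumes the embedded Derby driver (derby.jar) is on the classpath
		try( Connection conn = DriverManager.getConnection( "jdbc:derby:testDB;create=true" ) )
		{
			conn.setAutoCommit( false ); // keep everything in one transaction

			try( Statement ddl = conn.createStatement() )
			{
				// Hypothetical table, created here just so the example runs on its own
				ddl.executeUpdate( "CREATE TABLE MYTABLE ( ID INT PRIMARY KEY, NAME VARCHAR( 50 ) )" );
			}

			try( PreparedStatement statement =
					conn.prepareStatement( "INSERT INTO MYTABLE ( ID, NAME ) VALUES ( ?, ? )" ) )
			{
				int batchSize = 0;
				for( int i = 0; i < 180000; i++ ) // stand-in for "while there is data"
				{
					statement.setInt( 1, i );
					statement.setString( 2, "row " + i );

					statement.addBatch();
					batchSize++;

					if( batchSize >= 1000 ) // flush every 1000 rows
					{
						statement.executeBatch();
						batchSize = 0;
					}
				}
				if( batchSize > 0 ) // execute whatever is left over
					statement.executeBatch();
			}

			conn.commit(); // the 180,000 inserts are still committed as one transaction
		}
	}
}

Each executeBatch() call clears the statement's batch, so the driver only ever
buffers up to 1000 queued inserts instead of all 180,000, while the single
commit at the end still keeps the whole load in one transaction.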

Regards,
René



Mikael Aronsson wrote:
> 
> How about the suggestion of splitting it up into smaller transactions? Or
> are you already doing this?
> 
> ----- Original Message ----- 
> From: "Ritu" <ritu.goel@tcs.com>
> To: <derby-user@db.apache.org>
> Sent: Thursday, December 01, 2011 5:45 AM
> Subject: Re: OutOfMemoryException when executing 180,000 batches
> 
> 
>> Dag H. Wanvik <dag.wanvik@...> writes:
>>
>>> Hi
>> I am not using an in-memory database, only a disk-based one.
>> I am trying to import a 150 MB .csv file into the database, but I get the
>> error java.lang.OutOfMemoryError: Java heap space.
>>
>> How can I resolve this problem?
>>
>>> >  Thanks
>>> > Ritu
>>>
>>>
>>
>>
>>
>> 
> 
> 
> 


