cayenne-user mailing list archives

From Andrus Adamchik <>
Subject Re: raw dataport
Date Wed, 14 Jun 2006 13:45:31 GMT
Probably will be a nice addition after 1.2 (especially if we could preserve the DataRows and avoid raw JDBC).


On Jun 14, 2006, at 9:47 AM, Tore Halset wrote:

> Hello.
> Anyone got dataport to work on huge databases with lots of rows and lots of blobs/clobs? I had problems porting over one of our databases yesterday. One of the tables has ~12M rows with clobs. Even though INSERT_BATCH_SIZE is 1000, it would just go on forever without committing the first 1000 rows. It would also gladly throw OutOfMemoryErrors.
> I ended up writing a new DataPort.processInsert that uses the model to create plain JDBC SQL statements. I also changed the partial-commit algorithm to commit based on the number of bytes read/written since the previous commit instead of the number of rows (a sketch of this approach follows the message). After the change, DataPort would port anything without problems :) The 17GB MS SQL database got over to PostgreSQL on my old PowerBook in a few hours without any memory problems.
> So, what do you think? Am I using the current DataPort incorrectly? Should this feature replace the current DataPort, be enabled with a raw flag, or perhaps be available as a new Ant task? It is at least useful for me :) After 1.2 of course.
> Regards,
>  - Tore.
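
For readers wanting the shape of the byte-threshold commit Tore describes, here is a minimal plain-JDBC sketch. The ARTICLE table, its columns, and the BYTE_BATCH_SIZE constant are illustrative assumptions; none of this is Cayenne's actual DataPort code:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class ByteBatchedPort {

    // Illustrative threshold: commit once roughly this many bytes
    // have been written since the last commit.
    private static final long BYTE_BATCH_SIZE = 8L * 1024 * 1024; // 8 MB

    public static void port(Connection source, Connection target) throws SQLException {
        target.setAutoCommit(false);
        long bytesSinceCommit = 0;

        try (Statement read = source.createStatement();
             ResultSet rows = read.executeQuery("SELECT ID, BODY FROM ARTICLE");
             PreparedStatement insert = target.prepareStatement(
                     "INSERT INTO ARTICLE (ID, BODY) VALUES (?, ?)")) {

            while (rows.next()) {
                int id = rows.getInt(1);
                String body = rows.getString(2); // CLOB read as a String for simplicity

                insert.setInt(1, id);
                insert.setString(2, body);
                insert.executeUpdate();

                // Count payload bytes, not rows, so a handful of wide
                // CLOB rows cannot blow up the open transaction.
                bytesSinceCommit += (body == null) ? 0 : 2L * body.length();
                if (bytesSinceCommit >= BYTE_BATCH_SIZE) {
                    target.commit();
                    bytesSinceCommit = 0;
                }
            }
            target.commit(); // flush the final partial batch
        }
    }
}

Committing on accumulated payload size rather than row count keeps the open transaction bounded even when individual CLOB rows vary wildly in size, which is what makes a fixed row-count batch misbehave on tables like the one described above.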
