db-torque-dev mailing list archives

From Brian McCallister <mccallis...@forthillcompany.com>
Subject datadump and large databases
Date Wed, 16 Apr 2003 15:02:17 GMT
I posed this question on the users list, but it really belongs here.

Are there any significant differences between the datadump task in 3.0 
and 3.1? It presently fails with memory problems on large databases. 
While I realize it was never really meant for large-database work, it 
sure as heck could be a convenient tool for database migration if it 
did work nicely =)

The exact situation I am hitting is:

Using PostgreSQL 7.3.2 with the current release of org.postgresql.Driver, 
about 600 megs of data in an uncompressed SQL dump, and a 1.5 GB heap 
allocated to Ant, I get an out-of-memory error partway through. This is 
on the Torque 3.0 release.
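(For reference, the heap allocation is nothing exotic, just the usual Ant 
environment setting, along the lines of:

    ANT_OPTS=-Xmx1536m ant datadump

where datadump stands in for whatever your dump target is actually called.)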

I haven't had a chance to look at the code; I just wanted to ask whether 
by chance 3.1 streams the data instead of, as it appears, keeping a 
fairly large in-memory model of the whole thing =)

-Brian McCallister

