cocoon-users mailing list archives

From "Boisvert, Éric" <>
Subject processing large files
Date Fri, 28 Oct 2005 15:30:50 GMT
Hi all

I need to process large XML files, and as I tested with increasingly larger
files, the processing time suddenly increased a lot. For instance, a 200 KB
file took 0.8 seconds, a 400 KB file 2.5 seconds, and when I get near 1 MB it
jumps to 30 seconds (more than ten times the time, for roughly twice the
size). I played with the pipeline caching, outputBufferSize, etc., and even
boosted CATALINA_OPTS to 512 MB, but nothing helped. I suspect this is related
to the fact that at some point the incoming document can no longer be loaded
entirely in memory.
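For what it's worth, that kind of sudden jump usually points to the whole
document being buffered in memory (DOM-style) rather than streamed. As an
illustration only (a minimal sketch, not Cocoon-specific; the element counting
and synthetic document are my own invention), a SAX pass handles each element
as it arrives and discards it, so memory stays roughly constant regardless of
file size:

```java
import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.helpers.DefaultHandler;
import java.io.StringReader;

public class SaxCount {
    public static void main(String[] args) throws Exception {
        // Build a synthetic document in place of a real large file.
        StringBuilder sb = new StringBuilder("<root>");
        for (int i = 0; i < 100000; i++) sb.append("<item>x</item>");
        sb.append("</root>");

        final int[] count = {0};
        SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
        parser.parse(new InputSource(new StringReader(sb.toString())),
                new DefaultHandler() {
                    @Override
                    public void startElement(String uri, String localName,
                                             String qName, Attributes atts) {
                        // Each element is visited once and then discarded;
                        // the full tree is never held in memory.
                        count[0]++;
                    }
                });
        System.out.println(count[0]);  // 100001: the root plus 100000 items
    }
}
```

If some component in the pipeline forces a full in-memory build of the
document, a streaming approach like this is what avoids the superlinear blowup.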

Does anyone have an idea how to fix this?

Cheers and thanks

Eric Boisvert
Spécialiste TI-GI / IT-IM specialist, 418-654-3705, facsimile/télécopieur 
490, rue de la Couronne, Québec (Québec), G1K 9A9

Laboratoire de cartographie numérique et de photogrammétrie (LCNP)
Digital Cartography and Photogrammetry Laboratory (DCPL)
Commission géologique du Canada (Québec) / Geological Survey of Canada
Ressources naturelles Canada / Natural Resources Canada
Gouvernement du Canada / Government of Canada

