commons-user mailing list archives

From Mario Ivankovits <>
Subject Re: [commons vfs]: Handling large packages, memory error
Date Thu, 01 Jun 2006 09:11:31 GMT
> When extracting large zip packages (~1.6Gb), Java heap space error is
> thrown. Currently used attribute -Xmx500M isn't enough. Memory usage rises
> gradually during extraction (slowly, but surely). We have a need to handle
> packages up to 2Gb size.
Does this zip file contain many entries?
If that is the case, we are out of luck currently: VFS reads the
whole directory structure into memory.
I don't see how we could avoid this without introducing a HUGE
performance impact.

If this is not the case, it would be nice if you could start up a
profiler to see what is using so much memory; maybe it's again a limitation? Did you try some simple code directly?
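As a sanity check, a minimal sketch of such "simple code" using plain java.util.zip (not VFS) is below. It streams entries with ZipInputStream rather than building a directory structure in memory, so memory use should stay flat regardless of entry count. The class name and the in-memory test zip are illustrative, not from the original mail.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;
import java.util.zip.ZipOutputStream;

public class StreamZipScan {

    // Walk every entry of the zip via streaming, counting entries and
    // uncompressed bytes. Only one entry's read buffer is live at a time;
    // the central directory is never loaded wholesale into memory.
    static long[] scan(InputStream in) throws IOException {
        long entries = 0, bytes = 0;
        byte[] buf = new byte[8192];
        try (ZipInputStream zin = new ZipInputStream(in)) {
            ZipEntry e;
            while ((e = zin.getNextEntry()) != null) {
                entries++;
                int n;
                while ((n = zin.read(buf)) != -1) {
                    bytes += n;
                }
                zin.closeEntry();
            }
        }
        return new long[] { entries, bytes };
    }

    public static void main(String[] args) throws Exception {
        // Build a small in-memory zip purely for demonstration; against a
        // real ~1.6Gb archive you would pass a FileInputStream instead.
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ZipOutputStream zout = new ZipOutputStream(bos)) {
            for (int i = 0; i < 3; i++) {
                zout.putNextEntry(new ZipEntry("file" + i + ".txt"));
                zout.write(("payload-" + i).getBytes("UTF-8"));
                zout.closeEntry();
            }
        }
        long[] r = scan(new ByteArrayInputStream(bos.toByteArray()));
        System.out.println(r[0] + " entries, " + r[1] + " bytes");
    }
}
```

If this streaming variant runs in constant memory on the same archive while the VFS-based extraction does not, that points at the in-memory directory structure rather than the zip data itself.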

