commons-user mailing list archives

From Mario Ivankovits <ma...@ops.co.at>
Subject Re: [commons vfs]: Handling large packages, memory error
Date Thu, 01 Jun 2006 09:11:31 GMT
Hi!
> When extracting large zip packages (~1.6Gb), Java heap space error is
> thrown. Currently used attribute -Xmx500M isn't enough. Memory usage rises
> gradually during extraction (slowly, but surely). We have a need to handle
> packages up to 2Gb size.
>   
Does this zip file contain many entries?
If that is the case then we are out of luck currently: VFS reads the
whole directory structure into memory.
I don't see what we can do to avoid this without introducing a HUGE
performance impact.

If this is not the case, it would be nice if you could start up a
profiler to see what is using so much memory; maybe it's a
java.util.zip limitation again. Did you try some simple code using
java.util.zip directly?
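A minimal test of that kind might look like the sketch below. It creates a small throwaway archive (a stand-in for the real 1.6Gb package, whose path is not given here) and then reads it back with plain java.util.zip, streaming each entry through a fixed-size buffer so memory use stays constant regardless of archive size. The class name and the temp-file setup are illustrative, not from the original mail.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;
import java.util.zip.ZipOutputStream;

public class ZipStreamTest {

    // Reads a zip with plain java.util.zip, streaming through a fixed
    // buffer; returns a summary of what was read.
    static String extractSummary() throws IOException {
        // Small test archive standing in for the real large package.
        Path zip = Files.createTempFile("test", ".zip");
        try (ZipOutputStream out =
                 new ZipOutputStream(Files.newOutputStream(zip))) {
            out.putNextEntry(new ZipEntry("hello.txt"));
            out.write("hello world".getBytes("UTF-8"));
            out.closeEntry();
        }

        byte[] buf = new byte[8192];  // fixed buffer: constant memory use
        long total = 0;
        int entries = 0;
        try (ZipInputStream in =
                 new ZipInputStream(Files.newInputStream(zip))) {
            ZipEntry e;
            while ((e = in.getNextEntry()) != null) {
                entries++;
                int n;
                while ((n = in.read(buf)) != -1) {
                    total += n;  // write to a FileOutputStream here to really extract
                }
            }
        }
        Files.delete(zip);
        return entries + " entries, " + total + " bytes";
    }

    public static void main(String[] args) throws IOException {
        System.out.println(extractSummary());
    }
}
```

If this loop also grows the heap on the real package, the limitation is in java.util.zip itself rather than in VFS.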

Ciao,
Mario


---------------------------------------------------------------------
To unsubscribe, e-mail: commons-user-unsubscribe@jakarta.apache.org
For additional commands, e-mail: commons-user-help@jakarta.apache.org

