commons-user mailing list archives

From "hrjk" <har...@zeniitti.net>
Subject Re: [commons vfs]: Handling large packages, memory error
Date Fri, 02 Jun 2006 04:47:19 GMT
Hi.

> Hi!
>> When extracting large zip packages (~1.6 GB), a Java heap space error
>> is thrown. The currently used -Xmx500M setting isn't enough. Memory
>> usage rises gradually during extraction (slowly, but surely). We need
>> to handle packages of up to 2 GB.
>>
> Does this zip file contain many entries?
> If that is the case then we are currently out of luck: VFS reads the
> whole directory structure into memory.
> I don't see what we can do to avoid this without introducing a HUGE
> performance impact.

Yes, the zip files contain a lot of entries. However, we don't experience
problems when we extract, say, one big zip. The problems occur later, once
we have extracted about five of them. It seems that VFS doesn't free the
memory it used, even after the object we used has gone out of scope.
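To give an idea of the pattern, our extraction code is roughly of the
following shape (a simplified sketch only, not our actual code; the URIs,
paths and selector are placeholders):

import org.apache.commons.vfs.FileObject;
import org.apache.commons.vfs.FileSystemManager;
import org.apache.commons.vfs.Selectors;
import org.apache.commons.vfs.VFS;

public class ExtractZips {
    public static void main(String[] args) throws Exception {
        FileSystemManager fsm = VFS.getManager();

        // Each archive is resolved as a layered zip: file system and its
        // contents are copied out to a plain directory.
        for (int i = 0; i < args.length; i++) {
            FileObject zipRoot = fsm.resolveFile("zip:file://" + args[i]);
            FileObject target = fsm.resolveFile("file:///tmp/extracted/" + i);

            target.copyFrom(zipRoot, Selectors.SELECT_ALL);

            // Both FileObjects go out of scope after this iteration, yet the
            // heap keeps growing from one archive to the next.
            zipRoot.close();
            target.close();
        }
    }
}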

We have also tried to clear the cache with
getFilesCache().clear(usedFileObject.getFileSystem()), but even that
doesn't seem to help.
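In context, the clean-up we attempted looks roughly like this (again just a
sketch; usedFileObject stands for the FileObject of the archive we have just
extracted):

import org.apache.commons.vfs.FileObject;
import org.apache.commons.vfs.FileSystemException;
import org.apache.commons.vfs.FileSystemManager;
import org.apache.commons.vfs.VFS;

public class VfsCleanup {

    // Try to release everything VFS has cached for the file system that
    // the given FileObject (e.g. the root of an extracted zip) belongs to.
    public static void release(FileObject usedFileObject) throws FileSystemException {
        FileSystemManager fsm = VFS.getManager();
        fsm.getFilesCache().clear(usedFileObject.getFileSystem());
        usedFileObject.close();
    }
}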

Do you see this as a potential VFS problem?

> If this is not the case, it would be nice if you could start up a
> profiler to see what uses so much memory; maybe it's again a
> java.util.zip limitation. Did you try some simple code using
> java.util.zip directly?
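If I read that suggestion correctly, a minimal baseline test would look
something like this (just a sketch; the archive path comes in as an
argument):

import java.io.FileInputStream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;

public class ZipBaseline {
    public static void main(String[] args) throws Exception {
        // Stream through every entry of the archive with java.util.zip only,
        // keeping no per-entry state, to see whether the heap still grows.
        ZipInputStream zin = new ZipInputStream(new FileInputStream(args[0]));
        byte[] buf = new byte[8192];
        long entries = 0;
        try {
            ZipEntry entry;
            while ((entry = zin.getNextEntry()) != null) {
                while (zin.read(buf) != -1) {
                    // read and discard the entry data
                }
                zin.closeEntry();
                entries++;
            }
        } finally {
            zin.close();
        }
        long used = Runtime.getRuntime().totalMemory() - Runtime.getRuntime().freeMemory();
        System.out.println(entries + " entries read, ~" + (used / (1024 * 1024)) + " MB in use");
    }
}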

Could you point me to a working profiler? I have tried to find one for
Eclipse 3.1.2, but so far without luck.

Thanks a lot for your help.

BR,
-hrjk



---------------------------------------------------------------------
To unsubscribe, e-mail: commons-user-unsubscribe@jakarta.apache.org
For additional commands, e-mail: commons-user-help@jakarta.apache.org

