commons-issues mailing list archives

From "Christian Grobmeier (JIRA)" <>
Subject [jira] Updated: (COMPRESS-16) unable to extract a TAR file that contains an entry which is 10 GB in size
Date Mon, 04 May 2009 16:03:30 GMT


Christian Grobmeier updated COMPRESS-16:

    Attachment: patch-for-compress.txt

Based on the Ant patch by David Wartell, I created a patch for Commons Compress.
I won't commit it until it has been tested - if somebody has > 8 GB files around, please
help with testing :-)

> unable to extract a TAR file that contains an entry which is 10 GB in size
> --------------------------------------------------------------------------
>                 Key: COMPRESS-16
>                 URL:
>             Project: Commons Compress
>          Issue Type: Bug
>         Environment: I am using win xp sp3, but this should be platform independent.
>            Reporter: Sam Smith
>             Fix For: 1.1
>         Attachments: ant-8GB-tar.patch, patch-for-compress.txt
> I made a TAR file which contains a file entry where the file is 10 GB in size.
> When I attempt to extract the file using TarInputStream, it fails with the following
> stack trace:
> unexpected EOF with 24064 bytes unread
> 		at
> 		at org.apache.commons.compress.archivers.tar.TarInputStream.copyEntryContents(
> So, TarInputStream does not seem to support large (> 8 GB?) files.
> Here is something else to note: I created that TAR file using TarOutputStream, which
> did not complain when asked to write a 10 GB file into the TAR file, so I assume that
> TarOutputStream has no file size limits? That, or does it silently create corrupted
> TAR files (which would be the worst situation of all...)?
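For background (my addition, not from the original report): the classic ustar tar header stores an entry's size as 11 octal digits in a 12-byte field, which caps a single entry at 8 GiB - 1 bytes. GNU tar and star work around this with a "base-256" binary encoding, flagged by setting the high bit of the field's first byte - a workaround in the same spirit as the attached patches. A minimal Java sketch of that encoding (hypothetical class and method names, not the Commons Compress API):

```java
public class TarSizeLimit {
    // Classic ustar: 11 octal digits plus a terminator in a 12-byte
    // field, so the largest representable size is 8 GiB - 1 bytes.
    static final long USTAR_MAX_SIZE = 077777777777L; // 8589934591

    // GNU "base-256" extension: set the high bit of the first byte as
    // a flag, then store the size as a big-endian binary number in the
    // remaining 11 bytes.
    static byte[] encodeBase256(long size) {
        byte[] field = new byte[12];
        field[0] = (byte) 0x80;            // flag: binary, not octal
        for (int i = 11; i > 0; i--) {     // fill big-endian from the right
            field[i] = (byte) (size & 0xFF);
            size >>>= 8;
        }
        return field;
    }

    public static void main(String[] args) {
        long tenGiB = 10L * 1024 * 1024 * 1024;
        System.out.println(tenGiB > USTAR_MAX_SIZE); // true: octal field overflows

        // Round-trip the binary encoding to check it survives intact.
        byte[] field = encodeBase256(tenGiB);
        long decoded = 0;
        for (int i = 1; i < 12; i++) {
            decoded = (decoded << 8) | (field[i] & 0xFF);
        }
        System.out.println(decoded == tenGiB); // true
    }
}
```

This also explains the symptom above: an octal size field silently truncated at write time would make the reader expect fewer bytes than were archived, producing exactly an "unexpected EOF" on extraction.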

This message is automatically generated by JIRA.
You can reply to this email to add a comment to the issue online.
