commons-issues mailing list archives

From "Stefan Bodewig (Commented) (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (COMPRESS-16) unable to extract a TAR file that contains an entry which is 10 GB in size
Date Tue, 28 Feb 2012 11:57:49 GMT

    [ https://issues.apache.org/jira/browse/COMPRESS-16?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13218088#comment-13218088 ]

Stefan Bodewig commented on COMPRESS-16:
----------------------------------------

Our code is wrong.

to_chars in src/create.c in GNU tar uses only the remaining bytes of the field and sets the
first byte to 255 or 128 for negative or positive numbers respectively.  Negative numbers only
occur in time fields, where we don't support anything non-octal ATM anyway, so this isn't a
real problem right now.  It becomes one once we support the star/GNU tar/POSIX dialects for
the other numeric fields as well, which would be required for COMPRESS-177.

I suggest broadening and reopening COMPRESS-177 to something like "extend STAR/POSIX support
to all numeric fields", or alternatively creating a new issue and closing this one again.
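For reference, the base-256 ("binary") layout written by GNU tar's to_chars can be sketched
like this (a hypothetical helper class, not the Commons Compress API: the first byte is the
marker 0x80 for non-negative values, while negative values fill the whole field with a
two's-complement representation whose first byte comes out as 0xFF):

```java
import java.math.BigInteger;
import java.util.Arrays;

public class Base256 {

    // Encode value into a numeric header field of the given length in
    // base-256: marker byte 0x80 plus big-endian magnitude for non-negative
    // values; 0xFF sign-fill plus two's-complement bytes for negative ones.
    static byte[] encode(long value, int length) {
        byte[] field = new byte[length];
        if (value >= 0) {
            field[0] = (byte) 0x80;          // base-256 marker
            long v = value;
            for (int i = length - 1; i > 0; i--) {
                field[i] = (byte) (v & 0xFF);
                v >>>= 8;
            }
            if (v != 0) {
                throw new IllegalArgumentException("value too large for field");
            }
        } else {
            Arrays.fill(field, (byte) 0xFF); // two's-complement sign extension
            long v = value;
            for (int i = length - 1; i > 0; i--) {
                field[i] = (byte) (v & 0xFF);
                v >>= 8;                     // arithmetic shift keeps the sign
            }
        }
        return field;
    }

    // Decode a base-256 field back to a long.
    static long decode(byte[] field) {
        if ((field[0] & 0x80) == 0) {
            throw new IllegalArgumentException("not a base-256 field");
        }
        byte[] copy = field.clone();
        if ((field[0] & 0x40) != 0) {
            // Negative: the whole field is two's complement (first byte 0xFF
            // for any value that realistically fits a tar header field).
            return new BigInteger(copy).longValueExact();
        }
        copy[0] &= 0x7F; // clear the marker bit, keep any high magnitude bits
        return new BigInteger(1, copy).longValueExact();
    }
}
```

With a 12-byte size field this round-trips the 10 GB value from this issue, which an
11-digit octal field cannot hold.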
                
> unable to extract a TAR file that contains an entry which is 10 GB in size
> --------------------------------------------------------------------------
>
>                 Key: COMPRESS-16
>                 URL: https://issues.apache.org/jira/browse/COMPRESS-16
>             Project: Commons Compress
>          Issue Type: Bug
>          Components: Archivers
>         Environment: I am using win xp sp3, but this should be platform independent.
>            Reporter: Sam Smith
>             Fix For: 1.4
>
>         Attachments: 0001-Accept-GNU-tar-files-with-entries-over-8GB-in-size.patch, 0002-Allow-creating-tar-archives-with-files-over-8GB.patch,
> 0004-Prefer-octal-over-binary-size-representation.patch, ant-8GB-tar.patch, patch-for-compress.txt
>
>
> I made a TAR file which contains a file entry where the file is 10 GB in size.
> When I attempt to extract the file using TarInputStream, it fails with the following stack trace:
> 	java.io.IOException: unexpected EOF with 24064 bytes unread
> 		at org.apache.commons.compress.archivers.tar.TarInputStream.read(TarInputStream.java:348)
> 		at org.apache.commons.compress.archivers.tar.TarInputStream.copyEntryContents(TarInputStream.java:388)
> So, TarInputStream does not seem to support large (> 8 GB?) files.
> Here is something else to note: I created that TAR file using TarOutputStream, which did
> not complain when asked to write a 10 GB file into the TAR file, so I assume that TarOutputStream
> has no file size limits?  That, or does it silently create corrupted TAR files (which would
> be the worst situation of all...)?
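As a quick sanity check on the 8 GB figure in the report: the classic ustar size field holds
11 octal digits (12 bytes including a terminator), so the largest size it can represent is
8^11 - 1 bytes, one byte short of 8 GiB (hypothetical class name, not part of any library):

```java
public class OctalLimit {

    // Largest value representable in an 11-digit octal ustar size field:
    // 8^11 = 2^33, so the maximum is 2^33 - 1.
    static long maxOctalSize() {
        return (1L << 33) - 1;
    }

    public static void main(String[] args) {
        System.out.println(maxOctalSize());                     // 8589934591
        System.out.println(Long.toOctalString(maxOctalSize())); // 77777777777
    }
}
```

A 10 GB (10737418240-byte) entry exceeds this limit, which is why the size has to fall back
to an extended encoding such as GNU tar's base-256 or a POSIX pax header.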

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators: https://issues.apache.org/jira/secure/ContactAdministrators!default.jspa
For more information on JIRA, see: http://www.atlassian.com/software/jira
