hadoop-common-dev mailing list archives

From "Eli Collins (JIRA)" <j...@apache.org>
Subject [jira] [Resolved] (HADOOP-502) Summer buffer overflow exception
Date Thu, 11 Aug 2011 19:00:28 GMT

     [ https://issues.apache.org/jira/browse/HADOOP-502?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Eli Collins resolved HADOOP-502.

    Resolution: Won't Fix

Out of date.

> Summer buffer overflow exception
> --------------------------------
>                 Key: HADOOP-502
>                 URL: https://issues.apache.org/jira/browse/HADOOP-502
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: fs
>    Affects Versions: 0.5.0
>            Reporter: Owen O'Malley
>            Assignee: Owen O'Malley
> The extended error message with the offending values finally paid off, and I was able to get the values that were causing the Summer buffer overflow exception.
> java.lang.RuntimeException: Summer buffer overflow b.len=4096, off=0, summed=512, read=2880, bytesPerSum=1, inSum=512
>         at org.apache.hadoop.fs.FSDataInputStream$Checker.read(FSDataInputStream.java:100)
>         at org.apache.hadoop.fs.FSDataInputStream$PositionCache.read(FSDataInputStream.java:170)
>         at java.io.BufferedInputStream.read1(BufferedInputStream.java:254)
>         at java.io.BufferedInputStream.read(BufferedInputStream.java:313)
>         at java.io.DataInputStream.read(DataInputStream.java:80)
>         at org.apache.hadoop.util.CopyFiles$DFSCopyFilesMapper.copy(CopyFiles.java:190)
>         at org.apache.hadoop.util.CopyFiles$DFSCopyFilesMapper.map(CopyFiles.java:391)
>         at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:46)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:196)
>         at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:1075)
> Caused by: java.lang.ArrayIndexOutOfBoundsException
>         at java.util.zip.CRC32.update(CRC32.java:43)
>         at org.apache.hadoop.fs.FSDataInputStream$Checker.read(FSDataInputStream.java:98)
>         ... 9 more
> Tracking through the code, what happens is that inside FSDataInputStream.Checker.read(), verifySum gets an EOFException and turns off checksumming. Among other things, this sets bytesPerSum to 1. Unfortunately, that leads to the ArrayIndexOutOfBoundsException.
> I think the problem is that the original EOFException was logged and ignored. I propose that we allow the original EOFException to propagate back to the caller. (That way, a missing checksum file will still disable checksum checking, but we will detect truncated checksum files.)
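
The proposed behavior can be sketched roughly as follows. This is a minimal, hypothetical illustration, not the actual Hadoop code: the class and method names (ChecksumReader, readSumByte) are invented for the example, and the checksum source is simulated with a byte array. The point is only the error-handling split the report proposes: swallow FileNotFoundException (no checksum file exists) but let EOFException (truncated checksum file) propagate instead of disabling checksums and corrupting state.

```java
import java.io.EOFException;
import java.io.FileNotFoundException;
import java.io.IOException;

// Hypothetical sketch of the handling proposed in HADOOP-502.
class ChecksumReader {
    boolean sumsEnabled = true;

    // Simulated checksum source: null means "no checksum file exists";
    // a too-short array stands in for a truncated checksum file.
    private final byte[] sums;
    private int pos = 0;

    ChecksumReader(byte[] sums) { this.sums = sums; }

    private byte readSumByte() throws IOException {
        if (sums == null) {
            throw new FileNotFoundException("no checksum file");
        }
        if (pos >= sums.length) {
            // Truncated checksum file: signal EOF rather than returning
            // bogus data.
            throw new EOFException("truncated checksum file");
        }
        return sums[pos++];
    }

    byte read() throws IOException {
        try {
            return readSumByte();
        } catch (FileNotFoundException e) {
            // No checksum file at all: quietly disable verification,
            // as before.
            sumsEnabled = false;
            return -1;
        }
        // EOFException is deliberately NOT caught here, so a truncated
        // checksum file surfaces to the caller instead of silently
        // setting bytesPerSum to 1.
    }
}
```

Under this split, the original failure path (EOF swallowed, bytesPerSum forced to 1, CRC32.update indexing out of bounds) cannot occur, because the truncated-file case never reaches the state-mangling fallback.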

This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira

