hadoop-common-user mailing list archives

From Hardik Pandya <smarty.ju...@gmail.com>
Subject Re: Spill Failed Caused by ArrayIndexOutOfBoundsException
Date Mon, 06 Jan 2014 22:31:53 GMT
The error is happening during the sort-and-spill phase:

org.apache.hadoop.mapred.MapTask$MapOutputBuffer.sortAndSpill

It seems the raw comparator is reading int values out of the serialized key
buffer and failing during the compare:

Caused by: java.lang.ArrayIndexOutOfBoundsException: 99614720
        at org.apache.hadoop.io.WritableComparator.readInt(WritableComparator.java:158)
        at org.apache.hadoop.io.BooleanWritable$Comparator.compare(BooleanWritable.java:103)
        at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.compare(MapTask.java:1116)
        at org.apache.hadoop.util.QuickSort.sortInternal(QuickSort.java:95)
        at org.apache.hadoop.util.QuickSort.sort(QuickSort.java:59)
        at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.sortAndSpill(MapTask.java:1404)
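
Roughly, the raw compare in those frames boils down to something like the
sketch below. This is just a paraphrase of what the trace shows (readInt
called from the comparator registered for the key class), not the actual
Hadoop 1.2.1 source, and the class name is mine:

import org.apache.hadoop.io.WritableComparable;
import org.apache.hadoop.io.WritableComparator;

// Sketch of the raw-compare step in the frames above: decode an int at each
// key's offset in the spill buffer and compare the decoded values.
// readInt(b, s) touches bytes s..s+3, so a key offset too close to the end of
// the buffer pushes the read past it and raises ArrayIndexOutOfBoundsException.
public class IntAtOffsetComparator extends WritableComparator {

  protected IntAtOffsetComparator(Class<? extends WritableComparable> keyClass) {
    super(keyClass);
  }

  @Override
  public int compare(byte[] b1, int s1, int l1, byte[] b2, int s2, int l2) {
    int left = readInt(b1, s1);    // reads the 4 bytes at s1..s1+3
    int right = readInt(b2, s2);   // reads the 4 bytes at s2..s2+3
    return left < right ? -1 : (left == right ? 0 : 1);
  }
}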


On Mon, Jan 6, 2014 at 3:21 PM, Paul Mahon <pmahon@decarta.com> wrote:

> I have a Hadoop program that I'm running with version 1.2.1 which
> fails in a peculiar place. Most mappers complete without error, but
> some fail with this stack trace:
>
> java.io.IOException: Spill failed
>         at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:1297)
>         at org.apache.hadoop.mapred.MapTask$NewOutputCollector.close(MapTask.java:698)
>         at org.apache.hadoop.mapred.MapTask.closeQuietly(MapTask.java:1793)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:779)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:364)
>         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
>         at org.apache.hadoop.mapred.Child.main(Child.java:249)
> Caused by: java.lang.ArrayIndexOutOfBoundsException: 99614720
>         at org.apache.hadoop.io.WritableComparator.readInt(WritableComparator.java:158)
>         at org.apache.hadoop.io.BooleanWritable$Comparator.compare(BooleanWritable.java:103)
>         at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.compare(MapTask.java:1116)
>         at org.apache.hadoop.util.QuickSort.sortInternal(QuickSort.java:95)
>         at org.apache.hadoop.util.QuickSort.sort(QuickSort.java:59)
>         at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.sortAndSpill(MapTask.java:1404)
>         at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.access$1800(MapTask.java:858)
>         at org.apache.hadoop.mapred.MapTask$MapOutputBuffer$SpillThread.run(MapTask.java:1349)
>
> I've noticed that the array index is exactly the size of bufvoid, but I'm
> not sure whether that has any significance.
>
> The exception isn't happening in my WritableComparable or any of my
> code; it's all inside Hadoop. I'm not sure how to track down what I'm
> doing to cause the problem. Has anyone seen a problem like this, or
> does anyone have suggestions on where to look in my code?
>
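
On the where-to-look question: a small harness along the lines of the sketch
below will run the same registered raw comparator over a hand-serialized key
outside the cluster, so you can see the serialized key length next to the
bytes the comparator reads. The class name and setup are only illustrative,
and a fresh DataOutputBuffer leaves slack at the end of its backing array, so
this will not necessarily reproduce the out-of-bounds read by itself:

import org.apache.hadoop.io.BooleanWritable;
import org.apache.hadoop.io.DataOutputBuffer;
import org.apache.hadoop.io.WritableComparator;

// Illustrative harness (names are mine, not from the job in question):
// serialize two keys the way the map-output buffer would, then hand the raw
// bytes to the comparator registered for the key class.
public class RawCompareCheck {
  public static void main(String[] args) throws Exception {
    BooleanWritable first = new BooleanWritable(true);
    BooleanWritable second = new BooleanWritable(false);

    DataOutputBuffer buf = new DataOutputBuffer();
    first.write(buf);
    int len1 = buf.getLength();            // serialized size of the first key
    second.write(buf);
    int len2 = buf.getLength() - len1;     // serialized size of the second key

    // Same comparator the sort-and-spill path resolves for this key class.
    WritableComparator cmp = WritableComparator.get(BooleanWritable.class);
    int result = cmp.compare(buf.getData(), 0, len1, buf.getData(), len1, len2);

    System.out.println("serialized lengths: " + len1 + ", " + len2
        + "  raw compare result: " + result);
  }
}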
