lucene-dev mailing list archives

From "Michael McCandless (JIRA)" <>
Subject [jira] [Updated] (LUCENE-2501) ArrayIndexOutOfBoundsException in ByteBlockPool.allocSlice
Date Thu, 02 Aug 2012 18:26:02 GMT


Michael McCandless updated LUCENE-2501:

    Attachment: LUCENE-2501.patch

OK I found a possible cause behind this ... it was something I had
fixed but didn't pull out and backport to 3.x LUCENE-3684.

It's a thread safety issue, when FieldInfo.indexOptions changes from
DOCS_AND_FREQS_AND_POSITIONS to not indexing positions.  If this
happens in one thread while a new thread is suddenly indexing that
same field, there's a narrow window where the 2nd thread's
FreqProxTermsWriterPerField can mis-report the streamCount as 1 when
it should be 2.

Attached patch (3.6.x) should fix it.  I tried to get a thread test to
provoke this but couldn't ... I think the window is too small (if I
forcefully add sleeps at the "right time" in
FreqProxTermsWriterPerField then I could provoke it...).
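The race described above boils down to a shared flag being consulted more than once. Below is a minimal, hypothetical sketch of that shape (all names are mine, not the actual Lucene code): `writePositions` stands in for `FieldInfo.indexOptions`, and the "safe" variant shows a fix shape consistent with the description above, where the setting is snapshotted once per field and all later decisions derive from the local copy.

```java
// Hypothetical sketch of the hazard; the real code is in
// FreqProxTermsWriterPerField, where streamCount is derived from
// FieldInfo.indexOptions.
public class StreamCountSketch {
    // Stands in for FieldInfo.indexOptions; another thread may clear it.
    static volatile boolean writePositions = true;

    // Buggy shape: every decision re-reads the shared flag, so the stream
    // count and a later "write prox bytes?" check can disagree if the
    // flag flips in between the two reads.
    static int streamCountRacy() {
        return writePositions ? 2 : 1;
    }

    // Safer shape: all decisions derive from one snapshot taken when the
    // per-field writer is set up, so its state stays internally consistent
    // even if the shared flag changes underneath it.
    static int streamCountSafe(boolean positionsSnapshot) {
        return positionsSnapshot ? 2 : 1;
    }

    public static void main(String[] args) {
        boolean snapshot = writePositions;       // read the flag exactly once
        int streams = streamCountSafe(snapshot);
        writePositions = false;                  // simulated concurrent downgrade
        System.out.println(streams);             // still consistent with snapshot
    }
}
```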

> ArrayIndexOutOfBoundsException in ByteBlockPool.allocSlice
> ----------------------------------------------------------
>                 Key: LUCENE-2501
>                 URL:
>             Project: Lucene - Core
>          Issue Type: Bug
>          Components: core/index
>    Affects Versions: 3.0.1
>            Reporter: Tim Smith
>         Attachments: LUCENE-2501.patch
> I'm seeing the following exception during indexing:
> {code}
> Caused by: java.lang.ArrayIndexOutOfBoundsException: 14
> at org.apache.lucene.index.ByteBlockPool.allocSlice(
> at org.apache.lucene.index.TermsHashPerField.writeByte(
> at org.apache.lucene.index.TermsHashPerField.writeVInt(
> at org.apache.lucene.index.FreqProxTermsWriterPerField.writeProx(
> at org.apache.lucene.index.FreqProxTermsWriterPerField.newTerm(
> at org.apache.lucene.index.TermsHashPerField.add(
> at org.apache.lucene.index.DocInverterPerField.processFields(
> at org.apache.lucene.index.DocFieldProcessorPerThread.processDocument(
> at org.apache.lucene.index.DocumentsWriter.updateDocument(
> at org.apache.lucene.index.DocumentsWriter.updateDocument(
> at org.apache.lucene.index.IndexWriter.updateDocument(
> ... 37 more
> {code}
> This seems to be caused by the following code:
> {code}
>     final int level = slice[upto] & 15;
>     final int newLevel = nextLevelArray[level];
>     final int newSize = levelSizeArray[newLevel];
> {code}
> This can result in "level" being a value between 0 and 14, but the
> array nextLevelArray is only of size 10.
> I suspect the solution would be to either max the level to 10, or to
> add more entries to nextLevelArray so it has 15 entries.
> However, I don't know if something more is going wrong here and this
> is just where the exception hits from a deeper issue.
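For context on why the out-of-range level points at a deeper issue: in ByteBlockPool each slice ends in a marker byte written as `16 | level`, so for valid data `slice[upto] & 15` can only be 0..9. A value of 10..14 means the byte read was never a real end marker, i.e. the pool was already corrupted upstream. A self-contained sketch (the two arrays are copied from ByteBlockPool; `nextSize` is a hypothetical wrapper around the quoted lines):

```java
// Sketch of ByteBlockPool's slice-growth lookup. The constants match
// Lucene's ByteBlockPool; nextSize() is a hypothetical stand-in for the
// quoted lines from allocSlice.
public class SliceLevels {
    static final int[] NEXT_LEVEL_ARRAY = {1, 2, 3, 4, 5, 6, 7, 8, 9, 9};
    static final int[] LEVEL_SIZE_ARRAY = {5, 14, 20, 30, 40, 40, 80, 80, 120, 200};

    static int nextSize(int endByte) {
        int level = endByte & 15;                 // low 4 bits of the marker byte
        int newLevel = NEXT_LEVEL_ARRAY[level];   // AIOOBE if level is 10..14
        return LEVEL_SIZE_ARRAY[newLevel];
    }

    public static void main(String[] args) {
        // A valid end marker (16 | 0) grows a level-0 slice of 5 bytes
        // into a level-1 slice of 14 bytes.
        System.out.println(nextSize(16 | 0));
        // A byte that is not a marker, e.g. 14 from mis-written data,
        // yields level 14 and throws the exception seen in this issue.
        try {
            nextSize(14);
        } catch (ArrayIndexOutOfBoundsException e) {
            System.out.println("AIOOBE as in the report");
        }
    }
}
```

So padding nextLevelArray to 15 entries would only mask the symptom; the crash site is downstream of wherever the bogus byte was written.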

This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators.
