lucene-dev mailing list archives

From "Uwe Schindler" <...@thetaphi.de>
Subject RE: svn commit: r928283 - in /lucene/java/branches/flex_1458: ./ backwards/src/ backwards/src/java/org/apache/lucene/search/ backwards/src/test/org/apache/lucene/analysis/ backwards/src/test/org/apache/lucene/document/ backwards/src/test/org/apache/lucene
Date Sat, 27 Mar 2010 21:24:11 GMT
Thanks! Was it hard?

-----
Uwe Schindler
H.-H.-Meier-Allee 63, D-28213 Bremen
http://www.thetaphi.de
eMail: uwe@thetaphi.de


> -----Original Message-----
> From: mikemccand@apache.org [mailto:mikemccand@apache.org]
> Sent: Saturday, March 27, 2010 10:21 PM
> To: java-commits@lucene.apache.org
> Subject: svn commit: r928283 - in /lucene/java/branches/flex_1458: ./
> backwards/src/ backwards/src/java/org/apache/lucene/search/
> backwards/src/test/org/apache/lucene/analysis/
> backwards/src/test/org/apache/lucene/document/
> backwards/src/test/org/apache/lucene/...
> 
> Author: mikemccand
> Date: Sat Mar 27 21:21:26 2010
> New Revision: 928283
> 
> URL: http://svn.apache.org/viewvc?rev=928283&view=rev
> Log:
> LUCENE-2329: merge to flex branch (from trunk rev 926791)
> 
> Added:
> 
>     lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/ParallelPostingsArray.java
>       - copied unchanged from r926791, lucene/dev/trunk/lucene/src/java/org/apache/lucene/index/ParallelPostingsArray.java
> Removed:
>     lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/RawPostingList.java
> Modified:
>     lucene/java/branches/flex_1458/   (props changed)
>     lucene/java/branches/flex_1458/CHANGES.txt
>     lucene/java/branches/flex_1458/backwards/src/   (props changed)
>     lucene/java/branches/flex_1458/backwards/src/java/org/apache/lucene/search/MultiTermQueryWrapperFilter.java   (props changed)
>     lucene/java/branches/flex_1458/backwards/src/test/org/apache/lucene/analysis/TestISOLatin1AccentFilter.java   (props changed)
>     lucene/java/branches/flex_1458/backwards/src/test/org/apache/lucene/document/TestDateTools.java   (props changed)
>     lucene/java/branches/flex_1458/backwards/src/test/org/apache/lucene/document/TestNumberTools.java   (props changed)
>     lucene/java/branches/flex_1458/backwards/src/test/org/apache/lucene/index/TestBackwardsCompatibility.java   (props changed)
>     lucene/java/branches/flex_1458/backwards/src/test/org/apache/lucene/util/TestAttributeSource.java   (props changed)
>     lucene/java/branches/flex_1458/build.xml   (props changed)
>     lucene/java/branches/flex_1458/contrib/   (props changed)
>     lucene/java/branches/flex_1458/contrib/CHANGES.txt   (props changed)
>     lucene/java/branches/flex_1458/contrib/highlighter/src/test/   (props changed)
>     lucene/java/branches/flex_1458/contrib/instantiated/src/test/org/apache/lucene/store/instantiated/TestIndicesEquals.java   (props changed)
>     lucene/java/branches/flex_1458/src/java/org/apache/lucene/analysis/Tokenizer.java   (props changed)
>     lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/FreqProxFieldMergeState.java
>     lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/FreqProxTermsWriter.java
>     lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/FreqProxTermsWriterPerField.java
>     lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermVectorsTermsWriter.java
>     lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermVectorsTermsWriterPerField.java
>     lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermsHash.java
>     lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermsHashConsumer.java
>     lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermsHashConsumerPerField.java
>     lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermsHashPerField.java
>     lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermsHashPerThread.java
>     lucene/java/branches/flex_1458/src/java/org/apache/lucene/search/MultiTermQueryWrapperFilter.java   (props changed)
>     lucene/java/branches/flex_1458/src/java/org/apache/lucene/store/Directory.java
>     lucene/java/branches/flex_1458/src/test/org/apache/lucene/analysis/TestISOLatin1AccentFilter.java   (props changed)
>     lucene/java/branches/flex_1458/src/test/org/apache/lucene/document/TestDateTools.java   (props changed)
>     lucene/java/branches/flex_1458/src/test/org/apache/lucene/document/TestNumberTools.java   (props changed)
>     lucene/java/branches/flex_1458/src/test/org/apache/lucene/index/TestBackwardsCompatibility.java   (props changed)
>     lucene/java/branches/flex_1458/src/test/org/apache/lucene/util/TestAttributeSource.java   (props changed)
> 
> Propchange: lucene/java/branches/flex_1458/
> ------------------------------------------------------------------------------
> --- svn:mergeinfo (original)
> +++ svn:mergeinfo Sat Mar 27 21:21:26 2010
> @@ -1,4 +1,4 @@
> -/lucene/dev/trunk/lucene:926653,926948-928243,928255
> +/lucene/dev/trunk/lucene:926653,926791,926948-928243,928255
>  /lucene/java/branches/lucene_2_4:748824
>  /lucene/java/branches/lucene_2_9:817269-818600,829134,829881,831036,896850,909334
>  /lucene/java/branches/lucene_2_9_back_compat_tests:818601-821336
> 
> Modified: lucene/java/branches/flex_1458/CHANGES.txt
> URL: http://svn.apache.org/viewvc/lucene/java/branches/flex_1458/CHANGES.txt?rev=928283&r1=928282&r2=928283&view=diff
> ==============================================================================
> --- lucene/java/branches/flex_1458/CHANGES.txt (original)
> +++ lucene/java/branches/flex_1458/CHANGES.txt Sat Mar 27 21:21:26 2010
> @@ -6,9 +6,11 @@ Changes in backwards compatibility polic
> 
>  * LUCENE-1458, LUCENE-2111: Changes from flexible indexing:
>    - MultiReader ctor now throws IOException
> -  - Directory.copy now copies all files (not just index files), since
> -    what is and isn't and index file is now dependent on the codec
> -    used. (Mike McCandless)
> +
> +  - Directory.copy/Directory.copyTo now copies all files (not just
> +    index files), since what is and isn't and index file is now
> +    dependent on the codecs used. (Mike McCandless)
> +
>    - UnicodeUtil now uses BytesRef for UTF-8 output, and some method
>      signatures have changed to CharSequence.  These are advanced APIs
>      and subject to change suddenly.  (Robert Muir, Mike McCandless)
> @@ -319,6 +321,15 @@ Optimizations
>    TermAttributeImpl, move DEFAULT_TYPE constant to TypeInterface, improve
>    null-handling for TypeAttribute.  (Uwe Schindler)
> 
> +* LUCENE-2329: Switch TermsHash* from using a PostingList object per unique
> +  term to parallel arrays, indexed by termID. This reduces garbage collection
> +  overhead significantly, which results in great indexing performance wins
> +  when the available JVM heap space is low. This will become even more
> +  important when the DocumentsWriter RAM buffer is searchable in the future,
> +  because then it will make sense to make the RAM buffers as large as
> +  possible. (Mike McCandless, Michael Busch)
> +
> +
>  Build
> 
> * LUCENE-2124: Moved the JDK-based collation support from contrib/collation
> 
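[Editor's note] The LUCENE-2329 entry quoted above replaces one PostingList object per unique term with parallel int arrays indexed by termID, so the JVM tracks a handful of large arrays instead of millions of small objects. A minimal sketch of that layout, with hypothetical class and field names (this is not the actual Lucene code):

```java
// Sketch of the parallel-arrays idea from LUCENE-2329: one int slot per
// term in each array, all indexed by the same termID. Names are illustrative.
public class ParallelPostingsDemo {
    static final class ParallelPostings {
        int[] docFreqs;    // per-term doc frequency
        int[] lastDocIDs;  // per-term last docID seen

        ParallelPostings(int size) {
            docFreqs = new int[size];
            lastDocIDs = new int[size];
        }

        // All arrays must grow together so an existing termID stays valid.
        ParallelPostings resize(int newSize) {
            ParallelPostings bigger = new ParallelPostings(newSize);
            System.arraycopy(docFreqs, 0, bigger.docFreqs, 0, docFreqs.length);
            System.arraycopy(lastDocIDs, 0, bigger.lastDocIDs, 0, lastDocIDs.length);
            return bigger;
        }
    }

    public static void main(String[] args) {
        ParallelPostings p = new ParallelPostings(2);
        int termID = 0;              // would be assigned by the terms hash
        p.docFreqs[termID] = 1;
        p.lastDocIDs[termID] = 42;
        p = p.resize(4);             // grow; existing slots are preserved
        System.out.println(p.docFreqs[termID] + " " + p.lastDocIDs[termID]);
    }
}
```

Compared with allocating one small object per term, the arrays are allocated in a few large chunks, which is what cuts the garbage-collection overhead the changelog entry describes.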
> Propchange: lucene/java/branches/flex_1458/backwards/src/
> ------------------------------------------------------------------------------
> --- svn:mergeinfo (original)
> +++ svn:mergeinfo Sat Mar 27 21:21:26 2010
> @@ -1,4 +1,4 @@
> -/lucene/dev/trunk/lucene/backwards/src:926653,926948-928243,928255
> +/lucene/dev/trunk/lucene/backwards/src:926653,926791,926948-928243,928255
>  /lucene/java/branches/lucene_2_4/src:748824
>  /lucene/java/branches/lucene_2_9/src:817269-818600,825998,829134,829881,831036
>  /lucene/java/branches/lucene_2_9_back_compat_tests/src:818601-821336
> 
> Propchange: lucene/java/branches/flex_1458/backwards/src/java/org/apache/lucene/search/MultiTermQueryWrapperFilter.java
> ------------------------------------------------------------------------------
> --- svn:mergeinfo (original)
> +++ svn:mergeinfo Sat Mar 27 21:21:26 2010
> @@ -1,4 +1,4 @@
> -/lucene/dev/trunk/lucene/backwards/src/java/org/apache/lucene/search/MultiTermQueryWrapperFilter.java:926653,926948-928243,928255
> +/lucene/dev/trunk/lucene/backwards/src/java/org/apache/lucene/search/MultiTermQueryWrapperFilter.java:926653,926791,926948-928243,928255
>  /lucene/java/branches/lucene_2_4/src/java/org/apache/lucene/search/MultiTermQueryWrapperFilter.java:748824
>  /lucene/java/branches/lucene_2_9/src/java/org/apache/lucene/search/MultiTermQueryWrapperFilter.java:817269-818600,825998,829134,829881,831036
>  /lucene/java/branches/lucene_2_9_back_compat_tests/src/java/org/apache/lucene/search/MultiTermQueryWrapperFilter.java:818601-821336
> 
> Propchange: lucene/java/branches/flex_1458/backwards/src/test/org/apache/lucene/analysis/TestISOLatin1AccentFilter.java
> ------------------------------------------------------------------------------
> --- svn:mergeinfo (original)
> +++ svn:mergeinfo Sat Mar 27 21:21:26 2010
> @@ -1,4 +1,4 @@
> -/lucene/dev/trunk/lucene/backwards/src/test/org/apache/lucene/analysis/TestISOLatin1AccentFilter.java:926653,926948-928243,928255
> +/lucene/dev/trunk/lucene/backwards/src/test/org/apache/lucene/analysis/TestISOLatin1AccentFilter.java:926653,926791,926948-928243,928255
>  /lucene/java/branches/lucene_2_4/src/test/org/apache/lucene/analysis/TestISOLatin1AccentFilter.java:748824
>  /lucene/java/branches/lucene_2_9/src/test/org/apache/lucene/analysis/TestISOLatin1AccentFilter.java:825998,829134,829881,831036
>  /lucene/java/branches/lucene_3_0/src/test/org/apache/lucene/analysis/TestISOLatin1AccentFilter.java:880754,880793,880823,881216,881317,881376,881473,881549,881820,882467,882890,883076,883080,912383
> 
> Propchange: lucene/java/branches/flex_1458/backwards/src/test/org/apache/lucene/document/TestDateTools.java
> ------------------------------------------------------------------------------
> --- svn:mergeinfo (original)
> +++ svn:mergeinfo Sat Mar 27 21:21:26 2010
> @@ -1,4 +1,4 @@
> -/lucene/dev/trunk/lucene/backwards/src/test/org/apache/lucene/document/TestDateTools.java:926653,926948-928243,928255
> +/lucene/dev/trunk/lucene/backwards/src/test/org/apache/lucene/document/TestDateTools.java:926653,926791,926948-928243,928255
>  /lucene/java/branches/lucene_2_4/src/test/org/apache/lucene/document/TestDateTools.java:748824
>  /lucene/java/branches/lucene_2_9/src/test/org/apache/lucene/document/TestDateTools.java:825998,829134,829881,831036
>  /lucene/java/branches/lucene_3_0/src/test/org/apache/lucene/document/TestDateTools.java:880754,880793,880823,881216,881317,881376,881473,881549,881820,882467,882890,883076,883080,912383
> 
> Propchange: lucene/java/branches/flex_1458/backwards/src/test/org/apache/lucene/document/TestNumberTools.java
> ------------------------------------------------------------------------------
> --- svn:mergeinfo (original)
> +++ svn:mergeinfo Sat Mar 27 21:21:26 2010
> @@ -1,4 +1,4 @@
> -/lucene/dev/trunk/lucene/backwards/src/test/org/apache/lucene/document/TestNumberTools.java:926653,926948-928243,928255
> +/lucene/dev/trunk/lucene/backwards/src/test/org/apache/lucene/document/TestNumberTools.java:926653,926791,926948-928243,928255
>  /lucene/java/branches/lucene_2_4/src/test/org/apache/lucene/document/TestNumberTools.java:748824
>  /lucene/java/branches/lucene_2_9/src/test/org/apache/lucene/document/TestNumberTools.java:825998,829134,829881,831036
>  /lucene/java/branches/lucene_3_0/src/test/org/apache/lucene/document/TestNumberTools.java:880754,880793,880823,881216,881317,881376,881473,881549,881820,882467,882890,883076,883080,912383
> 
> Propchange: lucene/java/branches/flex_1458/backwards/src/test/org/apache/lucene/index/TestBackwardsCompatibility.java
> ------------------------------------------------------------------------------
> --- svn:mergeinfo (original)
> +++ svn:mergeinfo Sat Mar 27 21:21:26 2010
> @@ -1,4 +1,4 @@
> -/lucene/dev/trunk/lucene/backwards/src/test/org/apache/lucene/index/TestBackwardsCompatibility.java:926653,926948-928243,928255
> +/lucene/dev/trunk/lucene/backwards/src/test/org/apache/lucene/index/TestBackwardsCompatibility.java:926653,926791,926948-928243,928255
>  /lucene/java/branches/lucene_2_4/src/test/org/apache/lucene/index/TestBackwardsCompatibility.java:748824
>  /lucene/java/branches/lucene_2_9/src/test/org/apache/lucene/index/TestBackwardsCompatibility.java:825998,829134,829881,831036
>  /lucene/java/branches/lucene_3_0/src/test/org/apache/lucene/index/TestBackwardsCompatibility.java:880754,880793,880823,881216,881317,881376,881473,881549,881820,882467,882890,883076,883080,912383
> 
> Propchange: lucene/java/branches/flex_1458/backwards/src/test/org/apache/lucene/util/TestAttributeSource.java
> ------------------------------------------------------------------------------
> --- svn:mergeinfo (original)
> +++ svn:mergeinfo Sat Mar 27 21:21:26 2010
> @@ -1,4 +1,4 @@
> -/lucene/dev/trunk/lucene/backwards/src/test/org/apache/lucene/util/TestAttributeSource.java:926653,926948-928243,928255
> +/lucene/dev/trunk/lucene/backwards/src/test/org/apache/lucene/util/TestAttributeSource.java:926653,926791,926948-928243,928255
>  /lucene/java/branches/lucene_2_4/src/test/org/apache/lucene/util/TestAttributeSource.java:748824
>  /lucene/java/branches/lucene_2_9/src/test/org/apache/lucene/util/TestAttributeSource.java:817269-818600,825998,829134,829881,831036
>  /lucene/java/branches/lucene_2_9_back_compat_tests/src/test/org/apache/lucene/util/TestAttributeSource.java:818601-821336
> 
> Propchange: lucene/java/branches/flex_1458/build.xml
> ------------------------------------------------------------------------------
> --- svn:mergeinfo (original)
> +++ svn:mergeinfo Sat Mar 27 21:21:26 2010
> @@ -1,4 +1,4 @@
> -/lucene/dev/trunk/lucene/build.xml:926653,926948-928243,928255
> +/lucene/dev/trunk/lucene/build.xml:926653,926791,926948-928243,928255
>  /lucene/java/branches/lucene_2_9/build.xml:909334
>  /lucene/java/trunk/build.xml:916596,916685,916755,916769,917019,917148,917154,917161,917203-917204,919060,919119,919718,919731,919869,920237-920499,922013,922277,922525,922528,922583,922593,922602,922624,922797,922799,922806,922886-922887,923112,923238,924151,924207,924286,924321-924731,924781,925176-925462
>  /lucene/solr/branches/newtrunk/lucene/build.xml:926020-926289
> 
> Propchange: lucene/java/branches/flex_1458/contrib/
> ------------------------------------------------------------------------------
> --- svn:mergeinfo (original)
> +++ svn:mergeinfo Sat Mar 27 21:21:26 2010
> @@ -1,4 +1,4 @@
> -/lucene/dev/trunk/lucene/contrib:926653,926948-928243,928255
> +/lucene/dev/trunk/lucene/contrib:926653,926791,926948-928243,928255
>  /lucene/java/branches/lucene_2_4/contrib:748824
>  /lucene/java/branches/lucene_2_9/contrib:817269-818600,825998,829134,829816,829881,831036,896850,909334
>  /lucene/java/branches/lucene_2_9_back_compat_tests/contrib:818601-821336
> 
> Propchange: lucene/java/branches/flex_1458/contrib/CHANGES.txt
> ------------------------------------------------------------------------------
> --- svn:mergeinfo (original)
> +++ svn:mergeinfo Sat Mar 27 21:21:26 2010
> @@ -1,4 +1,4 @@
> -/lucene/dev/trunk/lucene/contrib/CHANGES.txt:926653,926948-928243,928255
> +/lucene/dev/trunk/lucene/contrib/CHANGES.txt:926653,926791,926948-928243,928255
>  /lucene/java/branches/lucene_2_4/contrib/CHANGES.txt:748824
>  /lucene/java/branches/lucene_2_9/contrib/CHANGES.txt:817269-818600,825998,826775,829134,829816,829881,831036,896850
>  /lucene/java/branches/lucene_2_9_back_compat_tests/contrib/CHANGES.txt:818601-821336
> 
> Propchange: lucene/java/branches/flex_1458/contrib/highlighter/src/test/
> ------------------------------------------------------------------------------
> --- svn:mergeinfo (original)
> +++ svn:mergeinfo Sat Mar 27 21:21:26 2010
> @@ -1,4 +1,4 @@
> -/lucene/dev/trunk/lucene/contrib/highlighter/src/test:926653,926948-928243,928255
> +/lucene/dev/trunk/lucene/contrib/highlighter/src/test:926653,926791,926948-928243,928255
>  /lucene/java/branches/lucene_2_4/contrib/highlighter/src/test:748824
>  /lucene/java/branches/lucene_2_9/contrib/highlighter/src/test:817269-818600,825998,826775,829134,829816,829881,831036,896850,909334
>  /lucene/java/branches/lucene_2_9_back_compat_tests/contrib/highlighter/src/test:818601-821336
> 
> Propchange: lucene/java/branches/flex_1458/contrib/instantiated/src/test/org/apache/lucene/store/instantiated/TestIndicesEquals.java
> ------------------------------------------------------------------------------
> --- svn:mergeinfo (original)
> +++ svn:mergeinfo Sat Mar 27 21:21:26 2010
> @@ -1,4 +1,4 @@
> -/lucene/dev/trunk/lucene/contrib/instantiated/src/test/org/apache/lucene/store/instantiated/TestIndicesEquals.java:926653,926948-928243,928255
> +/lucene/dev/trunk/lucene/contrib/instantiated/src/test/org/apache/lucene/store/instantiated/TestIndicesEquals.java:926653,926791,926948-928243,928255
>  /lucene/java/branches/lucene_2_9/contrib/instantiated/src/test/org/apache/lucene/store/instantiated/TestIndicesEquals.java:909334
>  /lucene/java/trunk/contrib/instantiated/src/test/org/apache/lucene/store/instantiated/TestIndicesEquals.java:916596,916685,916755,916769,917019,917148,917154,917161,917203-917204,919060,919119,919718,919731,919869,920237-920499,922013,922277,922525,922528,922583,922593,922602,922624,922797,922799,922806,922886-922887,923112,923238,924151,924207,924286,924321-924731,924781,925176-925462
>  /lucene/solr/branches/newtrunk/lucene/contrib/instantiated/src/test/org/apache/lucene/store/instantiated/TestIndicesEquals.java:926020-926289
> 
> Propchange: lucene/java/branches/flex_1458/src/java/org/apache/lucene/analysis/Tokenizer.java
> ------------------------------------------------------------------------------
> --- svn:mergeinfo (original)
> +++ svn:mergeinfo Sat Mar 27 21:21:26 2010
> @@ -1,4 +1,4 @@
> -/lucene/dev/trunk/lucene/src/java/org/apache/lucene/analysis/Tokenizer.java:926653,926948-928243,928255
> +/lucene/dev/trunk/lucene/src/java/org/apache/lucene/analysis/Tokenizer.java:926653,926791,926948-928243,928255
>  /lucene/java/branches/lucene_2_9/src/java/org/apache/lucene/analysis/Tokenizer.java:909334
>  /lucene/java/trunk/src/java/org/apache/lucene/analysis/Tokenizer.java:916596,916685,916755,916769,917019,917148,917154,917161,917203-917204,919060,919119,919718,919731,919869,920237-920499,922013,922277,922525,922528,922583,922593,922602,922624,922797,922799,922806,922886-922887,923112,923238,924151,924207,924286,924321-924731,924781,925176-925462
>  /lucene/solr/branches/newtrunk/lucene/src/java/org/apache/lucene/analysis/Tokenizer.java:926020-926289
> 
> Modified: lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/FreqProxFieldMergeState.java
> URL: http://svn.apache.org/viewvc/lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/FreqProxFieldMergeState.java?rev=928283&r1=928282&r2=928283&view=diff
> ==============================================================================
> --- lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/FreqProxFieldMergeState.java (original)
> +++ lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/FreqProxFieldMergeState.java Sat Mar 27 21:21:26 2010
> @@ -21,6 +21,8 @@ import java.io.IOException;
>  import java.util.Comparator;
>  import org.apache.lucene.util.BytesRef;
> 
> +import org.apache.lucene.index.FreqProxTermsWriterPerField.FreqProxPostingsArray;
> +
>  // TODO FI: some of this is "generic" to TermsHash* so we
>  // should factor it out so other consumers don't have to
>  // duplicate this code
> @@ -32,9 +34,10 @@ final class FreqProxFieldMergeState {
>    final FreqProxTermsWriterPerField field;
>    final int numPostings;
>    private final ByteBlockPool bytePool;
> -  final RawPostingList[] postings;
> -
> -  private FreqProxTermsWriter.PostingList p;
> +  final int[] termIDs;
> +  final FreqProxPostingsArray postings;
> +  int currentTermID;
> +
>    final BytesRef text = new BytesRef();
> 
>    private int postingUpto = -1;
> @@ -48,8 +51,9 @@ final class FreqProxFieldMergeState {
>    public FreqProxFieldMergeState(FreqProxTermsWriterPerField field, Comparator<BytesRef> termComp) {
>      this.field = field;
>      this.numPostings = field.termsHashPerField.numPostings;
> -    this.postings = field.termsHashPerField.sortPostings(termComp);
>      this.bytePool = field.perThread.termsHashPerThread.bytePool;
> +    this.termIDs = field.termsHashPerField.sortPostings(termComp);
> +    this.postings = (FreqProxPostingsArray) field.termsHashPerField.postingsArray;
>    }
> 
>    boolean nextTerm() throws IOException {
> @@ -58,15 +62,16 @@ final class FreqProxFieldMergeState {
>        return false;
>      }
> 
> -    p = (FreqProxTermsWriter.PostingList) postings[postingUpto];
> +    currentTermID = termIDs[postingUpto];
>      docID = 0;
> 
>      // Get BytesRef
> -    bytePool.setBytesRef(text, p.textStart);
> +    final int textStart = postings.textStarts[currentTermID];
> +    bytePool.setBytesRef(text, textStart);
> 
> -    field.termsHashPerField.initReader(freq, p, 0);
> +    field.termsHashPerField.initReader(freq, currentTermID, 0);
>      if (!field.fieldInfo.omitTermFreqAndPositions) {
> -      field.termsHashPerField.initReader(prox, p, 1);
> +      field.termsHashPerField.initReader(prox, currentTermID, 1);
>      }
> 
>      // Should always be true
> @@ -78,12 +83,12 @@ final class FreqProxFieldMergeState {
> 
>    public boolean nextDoc() throws IOException {
>      if (freq.eof()) {
> -      if (p.lastDocCode != -1) {
> +      if (postings.lastDocCodes[currentTermID] != -1) {
>          // Return last doc
> -        docID = p.lastDocID;
> +        docID = postings.lastDocIDs[currentTermID];
>          if (!field.omitTermFreqAndPositions)
> -          termFreq = p.docFreq;
> -        p.lastDocCode = -1;
> +          termFreq = postings.docFreqs[currentTermID];
> +        postings.lastDocCodes[currentTermID] = -1;
>          return true;
>        } else
>          // EOF
> @@ -101,7 +106,7 @@ final class FreqProxFieldMergeState {
>          termFreq = freq.readVInt();
>      }
> 
> -    assert docID != p.lastDocID;
> +    assert docID != postings.lastDocIDs[currentTermID];
> 
>      return true;
>    }
> 
> Modified: lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/FreqProxTermsWriter.java
> URL: http://svn.apache.org/viewvc/lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/FreqProxTermsWriter.java?rev=928283&r1=928282&r2=928283&view=diff
> ==============================================================================
> --- lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/FreqProxTermsWriter.java (original)
> +++ lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/FreqProxTermsWriter.java Sat Mar 27 21:21:26 2010
> @@ -39,13 +39,6 @@ final class FreqProxTermsWriter extends
>    }
> 
>    @Override
> -  void createPostings(RawPostingList[] postings, int start, int count) {
> -    final int end = start + count;
> -    for(int i=start;i<end;i++)
> -      postings[i] = new PostingList();
> -  }
> -
> -  @Override
>    void closeDocStore(SegmentWriteState state) {}
> 
>    @Override
> @@ -307,16 +300,4 @@ final class FreqProxTermsWriter extends
> 
>      termsConsumer.finish();
>    }
> -
> -  static final class PostingList extends RawPostingList {
> -    int docFreq;                                    // # times this term occurs in the current doc
> -    int lastDocID;                                  // Last docID where this term occurred
> -    int lastDocCode;                                // Code for prior doc
> -    int lastPosition;                               // Last position where this term occurred
> -  }
> -
> -  @Override
> -  int bytesPerPosting() {
> -    return RawPostingList.BYTES_SIZE + 4 * DocumentsWriter.INT_NUM_BYTE;
> -  }
>  }
> 
> Modified: lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/FreqProxTermsWriterPerField.java
> URL: http://svn.apache.org/viewvc/lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/FreqProxTermsWriterPerField.java?rev=928283&r1=928282&r2=928283&view=diff
> ==============================================================================
> --- lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/FreqProxTermsWriterPerField.java (original)
> +++ lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/FreqProxTermsWriterPerField.java Sat Mar 27 21:21:26 2010
> @@ -18,8 +18,9 @@ package org.apache.lucene.index;
>   */
> 
>  import java.io.IOException;
> -import org.apache.lucene.document.Fieldable;
> +
>  import org.apache.lucene.analysis.tokenattributes.PayloadAttribute;
> +import org.apache.lucene.document.Fieldable;
> 
>  // TODO: break into separate freq and prox writers as
>  // codecs; make separate container (tii/tis/skip/*) that can
> @@ -87,7 +88,7 @@ final class FreqProxTermsWriterPerField
>      }
>    }
> 
> -  final void writeProx(FreqProxTermsWriter.PostingList p, int proxCode) {
> +  final void writeProx(final int termID, int proxCode) {
>      final Payload payload;
>      if (payloadAttribute == null) {
>        payload = null;
> @@ -102,66 +103,111 @@ final class FreqProxTermsWriterPerField
>        hasPayloads = true;
>      } else
>        termsHashPerField.writeVInt(1, proxCode<<1);
> -    p.lastPosition = fieldState.position;
> +
> +    FreqProxPostingsArray postings = (FreqProxPostingsArray) termsHashPerField.postingsArray;
> +    postings.lastPositions[termID] = fieldState.position;
> +
>    }
> 
>    @Override
> -  final void newTerm(RawPostingList p0) {
> +  final void newTerm(final int termID) {
>      // First time we're seeing this term since the last
>      // flush
>      assert docState.testPoint("FreqProxTermsWriterPerField.newTerm start");
> -    FreqProxTermsWriter.PostingList p = (FreqProxTermsWriter.PostingList) p0;
> -    p.lastDocID = docState.docID;
> +
> +    FreqProxPostingsArray postings = (FreqProxPostingsArray) termsHashPerField.postingsArray;
> +    postings.lastDocIDs[termID] = docState.docID;
>      if (omitTermFreqAndPositions) {
> -      p.lastDocCode = docState.docID;
> +      postings.lastDocCodes[termID] = docState.docID;
>      } else {
> -      p.lastDocCode = docState.docID << 1;
> -      p.docFreq = 1;
> -      writeProx(p, fieldState.position);
> +      postings.lastDocCodes[termID] = docState.docID << 1;
> +      postings.docFreqs[termID] = 1;
> +      writeProx(termID, fieldState.position);
>      }
>    }
> 
>    @Override
> -  final void addTerm(RawPostingList p0) {
> +  final void addTerm(final int termID) {
> 
>      assert docState.testPoint("FreqProxTermsWriterPerField.addTerm start");
> -
> -    FreqProxTermsWriter.PostingList p = (FreqProxTermsWriter.PostingList) p0;
> -
> -    assert omitTermFreqAndPositions || p.docFreq > 0;
> +
> +    FreqProxPostingsArray postings = (FreqProxPostingsArray) termsHashPerField.postingsArray;
> +
> +    assert omitTermFreqAndPositions || postings.docFreqs[termID] > 0;
> 
>      if (omitTermFreqAndPositions) {
> -      if (docState.docID != p.lastDocID) {
> -        assert docState.docID > p.lastDocID;
> -        termsHashPerField.writeVInt(0, p.lastDocCode);
> -        p.lastDocCode = docState.docID - p.lastDocID;
> -        p.lastDocID = docState.docID;
> +      if (docState.docID != postings.lastDocIDs[termID]) {
> +        assert docState.docID > postings.lastDocIDs[termID];
> +        termsHashPerField.writeVInt(0, postings.lastDocCodes[termID]);
> +        postings.lastDocCodes[termID] = docState.docID - postings.lastDocIDs[termID];
> +        postings.lastDocIDs[termID] = docState.docID;
>        }
>      } else {
> -      if (docState.docID != p.lastDocID) {
> -        assert docState.docID > p.lastDocID;
> +      if (docState.docID != postings.lastDocIDs[termID]) {
> +        assert docState.docID > postings.lastDocIDs[termID];
>          // Term not yet seen in the current doc but previously
>          // seen in other doc(s) since the last flush
> 
>          // Now that we know doc freq for previous doc,
>          // write it & lastDocCode
> -        if (1 == p.docFreq)
> -          termsHashPerField.writeVInt(0, p.lastDocCode|1);
> +        if (1 == postings.docFreqs[termID])
> +          termsHashPerField.writeVInt(0, postings.lastDocCodes[termID]|1);
>          else {
> -          termsHashPerField.writeVInt(0, p.lastDocCode);
> -          termsHashPerField.writeVInt(0, p.docFreq);
> +          termsHashPerField.writeVInt(0, postings.lastDocCodes[termID]);
> +          termsHashPerField.writeVInt(0, postings.docFreqs[termID]);
>          }
> -        p.docFreq = 1;
> -        p.lastDocCode = (docState.docID - p.lastDocID) << 1;
> -        p.lastDocID = docState.docID;
> -        writeProx(p, fieldState.position);
> +        postings.docFreqs[termID] = 1;
> +        postings.lastDocCodes[termID] = (docState.docID - postings.lastDocIDs[termID]) << 1;
> +        postings.lastDocIDs[termID] = docState.docID;
> +        writeProx(termID, fieldState.position);
>        } else {
> -        p.docFreq++;
> -        writeProx(p, fieldState.position-p.lastPosition);
> +        postings.docFreqs[termID]++;
> +        writeProx(termID, fieldState.position-postings.lastPositions[termID]);
>        }
>      }
>    }
> +
> +  @Override
> +  ParallelPostingsArray createPostingsArray(int size) {
> +    return new FreqProxPostingsArray(size);
> +  }
> 
> +  static final class FreqProxPostingsArray extends ParallelPostingsArray {
> +    public FreqProxPostingsArray(int size) {
> +      super(size);
> +      docFreqs = new int[size];
> +      lastDocIDs = new int[size];
> +      lastDocCodes = new int[size];
> +      lastPositions = new int[size];
> +    }
> +
> +    int docFreqs[];                                    // # times this term occurs in the current doc
> +    int lastDocIDs[];                                  // Last docID where this term occurred
> +    int lastDocCodes[];                                // Code for prior doc
> +    int lastPositions[];                               // Last position where this term occurred
> +
> +    @Override
> +    ParallelPostingsArray resize(int newSize) {
> +      FreqProxPostingsArray newArray = new FreqProxPostingsArray(newSize);
> +      copy(this, newArray);
> +      return newArray;
> +    }
> +
> +    void copy(FreqProxPostingsArray fromArray, FreqProxPostingsArray toArray) {
> +      super.copy(fromArray, toArray);
> +      System.arraycopy(fromArray.docFreqs, 0, toArray.docFreqs, 0, fromArray.docFreqs.length);
> +      System.arraycopy(fromArray.lastDocIDs, 0, toArray.lastDocIDs, 0, fromArray.lastDocIDs.length);
> +      System.arraycopy(fromArray.lastDocCodes, 0, toArray.lastDocCodes, 0, fromArray.lastDocCodes.length);
> +      System.arraycopy(fromArray.lastPositions, 0, toArray.lastPositions, 0, fromArray.lastPositions.length);
> +    }
> +
> +  }
> +
> +  @Override
> +  int bytesPerPosting() {
> +    return ParallelPostingsArray.BYTES_PER_POSTING + 4 * DocumentsWriter.INT_NUM_BYTE;
> +  }
> +
>    public void abort() {}
>  }
> 
> 
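[Editor's note] In the FreqProxTermsWriterPerField diff above, newTerm/addTerm stop mutating fields of a per-term object (p.lastDocID, p.lastDocCode) and instead index shared arrays by termID, with the doc delta shifted left by one bit. A toy sketch of that bookkeeping (simplified, hypothetical names; the real code also writes vInts and tracks frequencies):

```java
// Simplified sketch of the termID-indexed bookkeeping from the quoted
// newTerm/addTerm diff. Not the actual Lucene implementation.
public class DocCodeDemo {
    static int[] lastDocIDs = new int[4];
    static int[] lastDocCodes = new int[4];

    // First time a term is seen since the last flush.
    static void newTerm(int termID, int docID) {
        lastDocIDs[termID] = docID;
        lastDocCodes[termID] = docID << 1;  // absolute docID, shifted
    }

    // Term seen again in a later doc: store the shifted delta.
    static void addTerm(int termID, int docID) {
        lastDocCodes[termID] = (docID - lastDocIDs[termID]) << 1;
        lastDocIDs[termID] = docID;
    }

    public static void main(String[] args) {
        newTerm(0, 5);   // term 0 first appears in doc 5
        addTerm(0, 9);   // seen again in doc 9: delta 4, shifted -> 8
        System.out.println(lastDocCodes[0] + " " + lastDocIDs[0]);
    }
}
```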
> Modified: lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermVectorsTermsWriter.java
> URL: http://svn.apache.org/viewvc/lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermVectorsTermsWriter.java?rev=928283&r1=928282&r2=928283&view=diff
> ==============================================================================
> --- lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermVectorsTermsWriter.java (original)
> +++ lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermVectorsTermsWriter.java Sat Mar 27 21:21:26 2010
> @@ -48,13 +48,6 @@ final class TermVectorsTermsWriter exten
>    }
> 
>    @Override
> -  void createPostings(RawPostingList[] postings, int start, int count) {
> -    final int end = start + count;
> -    for(int i=start;i<end;i++)
> -      postings[i] = new PostingList();
> -  }
> -
> -  @Override
>    synchronized void flush(Map<TermsHashConsumerPerThread,Collection<TermsHashConsumerPerField>> threadsAndFields, final SegmentWriteState state) throws IOException {
> 
>      if (tvx != null) {
> @@ -290,15 +283,4 @@ final class TermVectorsTermsWriter exten
>        finishDocument(this);
>      }
>    }
> -
> -  static final class PostingList extends RawPostingList {
> -    int freq;                                       // How many times this term occurred in the current doc
> -    int lastOffset;                                 // Last offset we saw
> -    int lastPosition;                               // Last position where this term occurred
> -  }
> -
> -  @Override
> -  int bytesPerPosting() {
> -    return RawPostingList.BYTES_SIZE + 3 * DocumentsWriter.INT_NUM_BYTE;
> -  }
>  }
> 
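[Editor's note] Both the removed `bytesPerPosting()` above and its replacements estimate per-term RAM as a fixed base plus 4 bytes per extra int column. A standalone sketch of that arithmetic (the constants here are assumptions for illustration, not Lucene's actual values, which live in `DocumentsWriter` and `ParallelPostingsArray`):

```java
// Rough per-term RAM estimate in the style of bytesPerPosting():
// the base columns every consumer has (textStart, intStart, byteStart)
// plus the consumer's own columns, at 4 bytes per int.
class PostingRam {
  static final int INT_NUM_BYTE = 4;  // assumed int width
  static final int BASE_COLUMNS = 3;  // the shared columns of the base array

  static int bytesPerPosting(int extraColumns) {
    return (BASE_COLUMNS + extraColumns) * INT_NUM_BYTE;
  }
}
```

For example, term vectors add three columns (freqs, lastOffsets, lastPositions), and freq/prox adds four, matching the `+ 3 *` and `+ 4 *` multipliers in the diff.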
> Modified: lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermVectorsTermsWriterPerField.java
> URL: http://svn.apache.org/viewvc/lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermVectorsTermsWriterPerField.java?rev=928283&r1=928282&r2=928283&view=diff
> ==============================================================================
> --- lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermVectorsTermsWriterPerField.java (original)
> +++ lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermVectorsTermsWriterPerField.java Sat Mar 27 21:21:26 2010
> @@ -126,10 +126,11 @@ final class TermVectorsTermsWriterPerFie
>      assert perThread.vectorFieldsInOrder(fieldInfo);
> 
>      perThread.doc.addField(termsHashPerField.fieldInfo.number);
> +    TermVectorsPostingsArray postings = (TermVectorsPostingsArray) termsHashPerField.postingsArray;
> 
>      // TODO: we may want to make this sort in same order
>      // as Codec's terms dict?
> -    final RawPostingList[] postings = termsHashPerField.sortPostings(BytesRef.getUTF8SortedAsUTF16Comparator());
> +    final int[] termIDs = termsHashPerField.sortPostings(BytesRef.getUTF8SortedAsUTF16Comparator());
> 
>      tvf.writeVInt(numPostings);
>      byte bits = 0x0;
> @@ -147,11 +148,11 @@ final class TermVectorsTermsWriterPerFie
>      final ByteBlockPool termBytePool = perThread.termsHashPerThread.termBytePool;
> 
>      for(int j=0;j<numPostings;j++) {
> -      final TermVectorsTermsWriter.PostingList posting = (TermVectorsTermsWriter.PostingList) postings[j];
> -      final int freq = posting.freq;
> -
> +      final int termID = termIDs[j];
> +      final int freq = postings.freqs[termID];
> +
>        // Get BytesRef
> -      termBytePool.setBytesRef(flushTerm, posting.textStart);
> +      termBytePool.setBytesRef(flushTerm, postings.textStarts[termID]);
> 
>        // Compute common byte prefix between last term and
>        // this term
> @@ -176,12 +177,12 @@ final class TermVectorsTermsWriterPerFie
>        tvf.writeVInt(freq);
> 
>        if (doVectorPositions) {
> -        termsHashPerField.initReader(reader, posting, 0);
> +        termsHashPerField.initReader(reader, termID, 0);
>          reader.writeTo(tvf);
>        }
> 
>        if (doVectorOffsets) {
> -        termsHashPerField.initReader(reader, posting, 1);
> +        termsHashPerField.initReader(reader, termID, 1);
>          reader.writeTo(tvf);
>        }
>      }
> @@ -205,11 +206,11 @@ final class TermVectorsTermsWriterPerFie
>    }
> 
>    @Override
> -  void newTerm(RawPostingList p0) {
> +  void newTerm(final int termID) {
>      assert docState.testPoint("TermVectorsTermsWriterPerField.newTerm start");
> -    TermVectorsTermsWriter.PostingList p = (TermVectorsTermsWriter.PostingList) p0;
> +    TermVectorsPostingsArray postings = (TermVectorsPostingsArray) termsHashPerField.postingsArray;
> 
> -    p.freq = 1;
> +    postings.freqs[termID] = 1;
> 
>      if (doVectorOffsets) {
>        int startOffset = fieldState.offset + offsetAttribute.startOffset();
> @@ -217,38 +218,76 @@ final class TermVectorsTermsWriterPerFie
> 
>        termsHashPerField.writeVInt(1, startOffset);
>        termsHashPerField.writeVInt(1, endOffset - startOffset);
> -      p.lastOffset = endOffset;
> +      postings.lastOffsets[termID] = endOffset;
>      }
> 
>      if (doVectorPositions) {
>        termsHashPerField.writeVInt(0, fieldState.position);
> -      p.lastPosition = fieldState.position;
> +      postings.lastPositions[termID] = fieldState.position;
>      }
>    }
> 
>    @Override
> -  void addTerm(RawPostingList p0) {
> +  void addTerm(final int termID) {
> 
>      assert docState.testPoint("TermVectorsTermsWriterPerField.addTerm start");
> 
> -    TermVectorsTermsWriter.PostingList p = (TermVectorsTermsWriter.PostingList) p0;
> -    p.freq++;
> +    TermVectorsPostingsArray postings = (TermVectorsPostingsArray) termsHashPerField.postingsArray;
> +
> +    postings.freqs[termID]++;
> 
>      if (doVectorOffsets) {
>        int startOffset = fieldState.offset + offsetAttribute.startOffset();
>        int endOffset = fieldState.offset + offsetAttribute.endOffset();
> 
> -      termsHashPerField.writeVInt(1, startOffset - p.lastOffset);
> +      termsHashPerField.writeVInt(1, startOffset - postings.lastOffsets[termID]);
>        termsHashPerField.writeVInt(1, endOffset - startOffset);
> -      p.lastOffset = endOffset;
> +      postings.lastOffsets[termID] = endOffset;
>      }
> 
>      if (doVectorPositions) {
> -      termsHashPerField.writeVInt(0, fieldState.position - p.lastPosition);
> -      p.lastPosition = fieldState.position;
> +      termsHashPerField.writeVInt(0, fieldState.position - postings.lastPositions[termID]);
> +      postings.lastPositions[termID] = fieldState.position;
>      }
>    }
> 
>    @Override
>    void skippingLongTerm() {}
> +
> +  @Override
> +  ParallelPostingsArray createPostingsArray(int size) {
> +    return new TermVectorsPostingsArray(size);
> +  }
> +
> +  static final class TermVectorsPostingsArray extends ParallelPostingsArray {
> +    public TermVectorsPostingsArray(int size) {
> +      super(size);
> +      freqs = new int[size];
> +      lastOffsets = new int[size];
> +      lastPositions = new int[size];
> +    }
> +
> +    int[] freqs;                                       // How many times this term occurred in the current doc
> +    int[] lastOffsets;                                 // Last offset we saw
> +    int[] lastPositions;                               // Last position where this term occurred
> +
> +    @Override
> +    ParallelPostingsArray resize(int newSize) {
> +      TermVectorsPostingsArray newArray = new TermVectorsPostingsArray(newSize);
> +      copy(this, newArray);
> +      return newArray;
> +    }
> +
> +    void copy(TermVectorsPostingsArray fromArray, TermVectorsPostingsArray toArray) {
> +      super.copy(fromArray, toArray);
> +      System.arraycopy(fromArray.freqs, 0, toArray.freqs, 0, fromArray.freqs.length);
> +      System.arraycopy(fromArray.lastOffsets, 0, toArray.lastOffsets, 0, fromArray.lastOffsets.length);
> +      System.arraycopy(fromArray.lastPositions, 0, toArray.lastPositions, 0, fromArray.lastPositions.length);
> +    }
> +  }
> +
> +  @Override
> +  int bytesPerPosting() {
> +    return ParallelPostingsArray.BYTES_PER_POSTING + 3 * DocumentsWriter.INT_NUM_BYTE;
> +  }
>  }
> 
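[Editor's note] The `addTerm()` changes above keep writing positions and offsets as deltas from the previous occurrence (`fieldState.position - postings.lastPositions[termID]`); only where the "last" value lives has changed. A toy sketch of that delta encoding, detached from any Lucene API:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the delta encoding addTerm() performs: each occurrence is
// written as the difference from the previous one, so frequent terms
// produce small numbers that vInt-encode into few bytes.
class DeltaWriter {
  final List<Integer> out = new ArrayList<>();
  int lastPosition = 0;

  void addOccurrence(int position) {
    out.add(position - lastPosition);  // small delta instead of absolute value
    lastPosition = position;           // becomes the base for the next delta
  }
}
```

In the committed code the per-term `lastPosition`/`lastOffset` state is a column in the parallel array rather than a field on a posting object.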
> Modified: lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermsHash.java
> URL: http://svn.apache.org/viewvc/lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermsHash.java?rev=928283&r1=928282&r2=928283&view=diff
> ==============================================================================
> --- lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermsHash.java (original)
> +++ lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermsHash.java Sat Mar 27 21:21:26 2010
> @@ -17,16 +17,12 @@ package org.apache.lucene.index;
>   * limitations under the License.
>   */
> 
> +import java.io.IOException;
>  import java.util.Collection;
> -import java.util.Map;
>  import java.util.HashMap;
> -import java.util.Iterator;
>  import java.util.HashSet;
> -import java.util.Arrays;
> -import java.io.IOException;
> -
> -import org.apache.lucene.util.ArrayUtil;
> -import org.apache.lucene.util.RamUsageEstimator;
> +import java.util.Iterator;
> +import java.util.Map;
> 
>  /** This class implements {@link InvertedDocConsumer}, which
>   *  is passed each token produced by the analyzer on each
> @@ -40,13 +36,8 @@ final class TermsHash extends InvertedDo
> 
>    final TermsHashConsumer consumer;
>    final TermsHash nextTermsHash;
> -  final int bytesPerPosting;
> -  final int postingsFreeChunk;
>    final DocumentsWriter docWriter;
> 
> -  private RawPostingList[] postingsFreeList = new RawPostingList[1];
> -  private int postingsFreeCount;
> -  private int postingsAllocCount;
>    boolean trackAllocations;
> 
>    public TermsHash(final DocumentsWriter docWriter, boolean trackAllocations, final TermsHashConsumer consumer, final TermsHash nextTermsHash) {
> @@ -54,14 +45,6 @@ final class TermsHash extends InvertedDo
>      this.consumer = consumer;
>      this.nextTermsHash = nextTermsHash;
>      this.trackAllocations = trackAllocations;
> -
> -    // Why + 4*POINTER_NUM_BYTE below?
> -    //   +1: Posting is referenced by postingsFreeList array
> -    //   +3: Posting is referenced by hash, which
> -    //       targets 25-50% fill factor; approximate this
> -    //       as 3X # pointers
> -    bytesPerPosting = consumer.bytesPerPosting() + 4*DocumentsWriter.POINTER_NUM_BYTE;
> -    postingsFreeChunk = (DocumentsWriter.BYTE_BLOCK_SIZE / bytesPerPosting);
>    }
> 
>    @Override
> @@ -86,18 +69,6 @@ final class TermsHash extends InvertedDo
>        nextTermsHash.abort();
>    }
> 
> -  void shrinkFreePostings(Map<InvertedDocConsumerPerThread,Collection<InvertedDocConsumerPerField>> threadsAndFields, SegmentWriteState state) {
> -
> -    assert postingsFreeCount == postingsAllocCount: Thread.currentThread().getName() + ": postingsFreeCount=" + postingsFreeCount + " postingsAllocCount=" + postingsAllocCount + " consumer=" + consumer;
> -
> -    final int newSize = ArrayUtil.getShrinkSize(postingsFreeList.length, postingsAllocCount, RamUsageEstimator.NUM_BYTES_OBJECT_REF);
> -    if (newSize != postingsFreeList.length) {
> -      RawPostingList[] newArray = new RawPostingList[newSize];
> -      System.arraycopy(postingsFreeList, 0, newArray, 0, postingsFreeCount);
> -      postingsFreeList = newArray;
> -    }
> -  }
> -
>    @Override
>    synchronized void closeDocStore(SegmentWriteState state) throws IOException {
>      consumer.closeDocStore(state);
> @@ -144,91 +115,12 @@ final class TermsHash extends InvertedDo
> 
>      consumer.flush(childThreadsAndFields, state);
> 
> -    shrinkFreePostings(threadsAndFields, state);
> -
>      if (nextTermsHash != null)
>        nextTermsHash.flush(nextThreadsAndFields, state);
>    }
> 
>    @Override
>    synchronized public boolean freeRAM() {
> -
> -    if (!trackAllocations)
> -      return false;
> -
> -    boolean any;
> -    final int numToFree;
> -    if (postingsFreeCount >= postingsFreeChunk)
> -      numToFree = postingsFreeChunk;
> -    else
> -      numToFree = postingsFreeCount;
> -    any = numToFree > 0;
> -    if (any) {
> -      Arrays.fill(postingsFreeList, postingsFreeCount-numToFree, postingsFreeCount, null);
> -      postingsFreeCount -= numToFree;
> -      postingsAllocCount -= numToFree;
> -      docWriter.bytesAllocated(-numToFree * bytesPerPosting);
> -      any = true;
> -    }
> -
> -    if (nextTermsHash != null)
> -      any |= nextTermsHash.freeRAM();
> -
> -    return any;
> -  }
> -
> -  synchronized public void recyclePostings(final RawPostingList[] postings, final int numPostings) {
> -
> -    assert postings.length >= numPostings;
> -
> -    // Move all Postings from this ThreadState back to our
> -    // free list.  We pre-allocated this array while we were
> -    // creating Postings to make sure it's large enough
> -    assert postingsFreeCount + numPostings <= postingsFreeList.length;
> -    System.arraycopy(postings, 0, postingsFreeList, postingsFreeCount, numPostings);
> -    postingsFreeCount += numPostings;
> -  }
> -
> -  synchronized public void getPostings(final RawPostingList[] postings) {
> -
> -    assert docWriter.writer.testPoint("TermsHash.getPostings start");
> -
> -    assert postingsFreeCount <= postingsFreeList.length;
> -    assert postingsFreeCount <= postingsAllocCount: "postingsFreeCount=" + postingsFreeCount + " postingsAllocCount=" + postingsAllocCount;
> -
> -    final int numToCopy;
> -    if (postingsFreeCount < postings.length)
> -      numToCopy = postingsFreeCount;
> -    else
> -      numToCopy = postings.length;
> -    final int start = postingsFreeCount-numToCopy;
> -    assert start >= 0;
> -    assert start + numToCopy <= postingsFreeList.length;
> -    assert numToCopy <= postings.length;
> -    System.arraycopy(postingsFreeList, start,
> -                     postings, 0, numToCopy);
> -
> -    // Directly allocate the remainder if any
> -    if (numToCopy != postings.length) {
> -      final int extra = postings.length - numToCopy;
> -      final int newPostingsAllocCount = postingsAllocCount + extra;
> -
> -      consumer.createPostings(postings, numToCopy, extra);
> -      assert docWriter.writer.testPoint("TermsHash.getPostings after create");
> -      postingsAllocCount += extra;
> -
> -      if (trackAllocations)
> -        docWriter.bytesAllocated(extra * bytesPerPosting);
> -
> -      if (newPostingsAllocCount > postingsFreeList.length)
> -        // Pre-allocate the postingsFreeList so it's large
> -        // enough to hold all postings we've given out
> -        postingsFreeList = new RawPostingList[ArrayUtil.oversize(newPostingsAllocCount, RamUsageEstimator.NUM_BYTES_OBJECT_REF)];
> -    }
> -
> -    postingsFreeCount -= numToCopy;
> -
> -    if (trackAllocations)
> -      docWriter.bytesUsed(postings.length * bytesPerPosting);
> +    return false;
>    }
>  }
> 
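[Editor's note] With the posting free list deleted above, `TermsHash.freeRAM()` becomes a no-op: there is no pooled memory left to release, so RAM is instead tracked incrementally as the parallel arrays are allocated and grown. A hypothetical tracker illustrating that accounting style (not `DocumentsWriter`'s real API):

```java
import java.util.Arrays;

// Minimal RAM-accounting sketch: report bytes when an array is first
// allocated, then report only the delta on each growth step.
class RamTracker {
  long allocated;
  void bytesAllocated(long b) { allocated += b; }
}

class TrackedArray {
  static final int BYTES_PER_SLOT = 4;  // assumed int width
  int[] slots;
  final RamTracker tracker;

  TrackedArray(int size, RamTracker tracker) {
    this.tracker = tracker;
    slots = new int[size];
    tracker.bytesAllocated((long) size * BYTES_PER_SLOT);
  }

  void grow() {  // ~1.5x growth, as in growParallelPostingsArray()
    int oldSize = slots.length;
    int newSize = (int) (oldSize * 1.5);
    slots = Arrays.copyOf(slots, newSize);
    tracker.bytesAllocated((long) (newSize - oldSize) * BYTES_PER_SLOT);
  }
}
```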
> Modified: lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermsHashConsumer.java
> URL: http://svn.apache.org/viewvc/lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermsHashConsumer.java?rev=928283&r1=928282&r2=928283&view=diff
> ==============================================================================
> --- lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermsHashConsumer.java (original)
> +++ lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermsHashConsumer.java Sat Mar 27 21:21:26 2010
> @@ -22,8 +22,6 @@ import java.util.Collection;
>  import java.util.Map;
> 
>  abstract class TermsHashConsumer {
> -  abstract int bytesPerPosting();
> -  abstract void createPostings(RawPostingList[] postings, int start, int count);
>    abstract TermsHashConsumerPerThread addThread(TermsHashPerThread perThread);
>    abstract void flush(Map<TermsHashConsumerPerThread,Collection<TermsHashConsumerPerField>> threadsAndFields, final SegmentWriteState state) throws IOException;
>    abstract void abort();
> 
> Modified: lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermsHashConsumerPerField.java
> URL: http://svn.apache.org/viewvc/lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermsHashConsumerPerField.java?rev=928283&r1=928282&r2=928283&view=diff
> ==============================================================================
> --- lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermsHashConsumerPerField.java (original)
> +++ lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermsHashConsumerPerField.java Sat Mar 27 21:21:26 2010
> @@ -31,7 +31,11 @@ abstract class TermsHashConsumerPerField
>    abstract void finish() throws IOException;
>    abstract void skippingLongTerm() throws IOException;
>    abstract void start(Fieldable field);
> -  abstract void newTerm(RawPostingList p) throws IOException;
> -  abstract void addTerm(RawPostingList p) throws IOException;
> +  abstract void newTerm(int termID) throws IOException;
> +  abstract void addTerm(int termID) throws IOException;
>    abstract int getStreamCount();
> +
> +  abstract ParallelPostingsArray createPostingsArray(int size);
> +  abstract int bytesPerPosting();
> +
>  }
> 
> Modified: lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermsHashPerField.java
> URL: http://svn.apache.org/viewvc/lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermsHashPerField.java?rev=928283&r1=928282&r2=928283&view=diff
> ==============================================================================
> --- lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermsHashPerField.java (original)
> +++ lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermsHashPerField.java Sat Mar 27 21:21:26 2010
> @@ -29,6 +29,7 @@ import org.apache.lucene.util.BytesRef;
>  final class TermsHashPerField extends InvertedDocConsumerPerField {
> 
>    final TermsHashConsumerPerField consumer;
> +
>    final TermsHashPerField nextPerField;
>    final TermsHashPerThread perThread;
>    final DocumentsWriter.DocState docState;
> @@ -50,17 +51,22 @@ final class TermsHashPerField extends In
>    private int postingsHashSize = 4;
>    private int postingsHashHalfSize = postingsHashSize/2;
>    private int postingsHashMask = postingsHashSize-1;
> -  private RawPostingList[] postingsHash = new RawPostingList[postingsHashSize];
> -  private RawPostingList p;
> +  private int[] postingsHash;
> +
> +  ParallelPostingsArray postingsArray;
>    private final BytesRef utf8;
>    private Comparator<BytesRef> termComp;
> 
> +  private final int bytesPerPosting;
> +
>    public TermsHashPerField(DocInverterPerField docInverterPerField, final TermsHashPerThread perThread, final TermsHashPerThread nextPerThread, final FieldInfo fieldInfo) {
>      this.perThread = perThread;
>      intPool = perThread.intPool;
>      bytePool = perThread.bytePool;
>      termBytePool = perThread.termBytePool;
>      docState = perThread.docState;
> +    postingsHash = new int[postingsHashSize];
> +    Arrays.fill(postingsHash, -1);
>      fieldState = docInverterPerField.fieldState;
>      this.consumer = perThread.consumer.addField(this, fieldInfo);
> 
> @@ -72,6 +78,21 @@ final class TermsHashPerField extends In
>        nextPerField = (TermsHashPerField) nextPerThread.addField(docInverterPerField, fieldInfo);
>      else
>        nextPerField = null;
> +
> +    //   +3: Posting is referenced by hash, which
> +    //       targets 25-50% fill factor; approximate this
> +    //       as 3X # pointers
> +    bytesPerPosting = consumer.bytesPerPosting() + 3*DocumentsWriter.INT_NUM_BYTE;
> +  }
> +
> +  void initPostingsArray() {
> +    assert postingsArray == null;
> +
> +    postingsArray = consumer.createPostingsArray(postingsHashSize);
> +
> +    if (perThread.termsHash.trackAllocations) {
> +      perThread.termsHash.docWriter.bytesAllocated(bytesPerPosting * postingsHashSize);
> +    }
>    }
> 
>    void shrinkHash(int targetSize) {
> @@ -85,7 +106,9 @@ final class TermsHashPerField extends In
>      }
> 
>      if (newSize != postingsHash.length) {
> -      postingsHash = new RawPostingList[newSize];
> +      postingsHash = new int[newSize];
> +      Arrays.fill(postingsHash, -1);
> +      postingsArray = null;
>        postingsHashSize = newSize;
>        postingsHashHalfSize = newSize/2;
>        postingsHashMask = newSize-1;
> @@ -97,8 +120,7 @@ final class TermsHashPerField extends In
>        compactPostings();
>      assert numPostings <= postingsHash.length;
>      if (numPostings > 0) {
> -      perThread.termsHash.recyclePostings(postingsHash, numPostings);
> -      Arrays.fill(postingsHash, 0, numPostings, null);
> +      Arrays.fill(postingsHash, 0, numPostings, -1);
>        numPostings = 0;
>      }
>      postingsCompacted = false;
> @@ -112,23 +134,34 @@ final class TermsHashPerField extends In
>      if (nextPerField != null)
>        nextPerField.abort();
>    }
> +
> +  private void growParallelPostingsArray() {
> +    int oldSize = postingsArray.byteStarts.length;
> +    int newSize = (int) (oldSize * 1.5);
> +    this.postingsArray = this.postingsArray.resize(newSize);
> +
> +    if (perThread.termsHash.trackAllocations) {
> +      perThread.termsHash.docWriter.bytesAllocated(bytesPerPosting * (newSize - oldSize));
> +    }
> +  }
> 
> -  public void initReader(ByteSliceReader reader, RawPostingList p, int stream) {
> +  public void initReader(ByteSliceReader reader, int termID, int stream) {
>      assert stream < streamCount;
> -    final int[] ints = intPool.buffers[p.intStart >> DocumentsWriter.INT_BLOCK_SHIFT];
> -    final int upto = p.intStart & DocumentsWriter.INT_BLOCK_MASK;
> +    int intStart = postingsArray.intStarts[termID];
> +    final int[] ints = intPool.buffers[intStart >> DocumentsWriter.INT_BLOCK_SHIFT];
> +    final int upto = intStart & DocumentsWriter.INT_BLOCK_MASK;
>      reader.init(bytePool,
> -                p.byteStart+stream*ByteBlockPool.FIRST_LEVEL_SIZE,
> +                postingsArray.byteStarts[termID]+stream*ByteBlockPool.FIRST_LEVEL_SIZE,
>                  ints[upto+stream]);
>    }
> 
>    private synchronized void compactPostings() {
>      int upto = 0;
>      for(int i=0;i<postingsHashSize;i++) {
> -      if (postingsHash[i] != null) {
> +      if (postingsHash[i] != -1) {
>          if (upto < i) {
>            postingsHash[upto] = postingsHash[i];
> -          postingsHash[i] = null;
> +          postingsHash[i] = -1;
>          }
>          upto++;
>        }
> @@ -139,42 +172,42 @@ final class TermsHashPerField extends In
>    }
> 
>    /** Collapse the hash table & sort in-place. */
> -  public RawPostingList[] sortPostings(Comparator<BytesRef> termComp) {
> +  public int[] sortPostings(Comparator<BytesRef> termComp) {
>      this.termComp = termComp;
>      compactPostings();
>      quickSort(postingsHash, 0, numPostings-1);
>      return postingsHash;
>    }
> 
> -  void quickSort(RawPostingList[] postings, int lo, int hi) {
> +  void quickSort(int[] termIDs, int lo, int hi) {
>      if (lo >= hi)
>        return;
>      else if (hi == 1+lo) {
> -      if (comparePostings(postings[lo], postings[hi]) > 0) {
> -        final RawPostingList tmp = postings[lo];
> -        postings[lo] = postings[hi];
> -        postings[hi] = tmp;
> +      if (comparePostings(termIDs[lo], termIDs[hi]) > 0) {
> +        final int tmp = termIDs[lo];
> +        termIDs[lo] = termIDs[hi];
> +        termIDs[hi] = tmp;
>        }
>        return;
>      }
> 
>      int mid = (lo + hi) >>> 1;
> 
> -    if (comparePostings(postings[lo], postings[mid]) > 0) {
> -      RawPostingList tmp = postings[lo];
> -      postings[lo] = postings[mid];
> -      postings[mid] = tmp;
> +    if (comparePostings(termIDs[lo], termIDs[mid]) > 0) {
> +      int tmp = termIDs[lo];
> +      termIDs[lo] = termIDs[mid];
> +      termIDs[mid] = tmp;
>      }
> 
> -    if (comparePostings(postings[mid], postings[hi]) > 0) {
> -      RawPostingList tmp = postings[mid];
> -      postings[mid] = postings[hi];
> -      postings[hi] = tmp;
> -
> -      if (comparePostings(postings[lo], postings[mid]) > 0) {
> -        RawPostingList tmp2 = postings[lo];
> -        postings[lo] = postings[mid];
> -        postings[mid] = tmp2;
> +    if (comparePostings(termIDs[mid], termIDs[hi]) > 0) {
> +      int tmp = termIDs[mid];
> +      termIDs[mid] = termIDs[hi];
> +      termIDs[hi] = tmp;
> +
> +      if (comparePostings(termIDs[lo], termIDs[mid]) > 0) {
> +        int tmp2 = termIDs[lo];
> +        termIDs[lo] = termIDs[mid];
> +        termIDs[mid] = tmp2;
>        }
>      }
> 
> @@ -184,51 +217,52 @@ final class TermsHashPerField extends In
>      if (left >= right)
>        return;
> 
> -    RawPostingList partition = postings[mid];
> +    int partition = termIDs[mid];
> 
>      for (; ;) {
> -      while (comparePostings(postings[right], partition) > 0)
> +      while (comparePostings(termIDs[right], partition) > 0)
>          --right;
> 
> -      while (left < right && comparePostings(postings[left], partition) <= 0)
> +      while (left < right && comparePostings(termIDs[left], partition) <= 0)
>          ++left;
> 
>        if (left < right) {
> -        RawPostingList tmp = postings[left];
> -        postings[left] = postings[right];
> -        postings[right] = tmp;
> +        int tmp = termIDs[left];
> +        termIDs[left] = termIDs[right];
> +        termIDs[right] = tmp;
>          --right;
>        } else {
>          break;
>        }
>      }
> 
> -    quickSort(postings, lo, left);
> -    quickSort(postings, left + 1, hi);
> +    quickSort(termIDs, lo, left);
> +    quickSort(termIDs, left + 1, hi);
>    }
> 
>    /** Compares term text for two Posting instance and
>     *  returns -1 if p1 < p2; 1 if p1 > p2; else 0. */
> -  int comparePostings(RawPostingList p1, RawPostingList p2) {
> +  int comparePostings(int term1, int term2) {
> 
> -    if (p1 == p2) {
> +    if (term1 == term2) {
>        // Our quicksort does this, eg during partition
>        return 0;
>      }
> 
> -    termBytePool.setBytesRef(perThread.tr1, p1.textStart);
> -    termBytePool.setBytesRef(perThread.tr2, p2.textStart);
> +    termBytePool.setBytesRef(perThread.tr1, postingsArray.textStarts[term1]);
> +    termBytePool.setBytesRef(perThread.tr2, postingsArray.textStarts[term2]);
> 
>      return termComp.compare(perThread.tr1, perThread.tr2);
>    }
> 
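[Editor's note] The quicksort and `comparePostings` changes above implement an indirect sort: only int termIDs move, while the term bytes stay put in the shared pool and are consulted through the comparator. A compact standalone way to see the idea (toy data, boxed sort for brevity; the real code sorts a primitive `int[]` in place):

```java
import java.util.Arrays;
import java.util.Comparator;

class IndirectSortDemo {
  // Sort IDs by the text each ID points at, never moving the text itself.
  static Integer[] sortIDs(String[] textByID) {
    Integer[] ids = new Integer[textByID.length];
    for (int i = 0; i < ids.length; i++) ids[i] = i;
    Arrays.sort(ids, Comparator.comparing(id -> textByID[id]));
    return ids;
  }
}
```

Sorting 4-byte IDs instead of object references keeps the hot array small and avoids touching the term bytes except inside the comparison.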
>    /** Test whether the text for current RawPostingList p equals
>     *  current tokenText in utf8. */
> -  private boolean postingEquals() {
> -
> -    final byte[] text = termBytePool.buffers[p.textStart >> DocumentsWriter.BYTE_BLOCK_SHIFT];
> +  private boolean postingEquals(final int termID) {
> +    final int textStart = postingsArray.textStarts[termID];
> +    final byte[] text = termBytePool.buffers[textStart >> DocumentsWriter.BYTE_BLOCK_SHIFT];
>      assert text != null;
> -    int pos = p.textStart & DocumentsWriter.BYTE_BLOCK_MASK;
> +
> +    int pos = textStart & DocumentsWriter.BYTE_BLOCK_MASK;
> 
>      final int len;
>      if ((text[pos] & 0x80) == 0) {
> @@ -267,6 +301,9 @@ final class TermsHashPerField extends In
>      } else {
>        throw new IllegalArgumentException("Could not find a term attribute (that implements TermToBytesRefAttribute) in the TokenStream");
>      }
> +    if (postingsArray == null) {
> +      initPostingsArray();
> +    }
>      consumer.start(f);
>      if (nextPerField != null) {
>        nextPerField.start(f);
> @@ -285,7 +322,6 @@ final class TermsHashPerField extends In
>    // because token text has already been "interned" into
>    // textStart, so we hash by textStart
>    public void add(int textStart) throws IOException {
> -
>      int code = textStart;
> 
>      int hashPos = code & postingsHashMask;
> @@ -293,37 +329,39 @@ final class TermsHashPerField extends In
>      assert !postingsCompacted;
> 
>      // Locate RawPostingList in hash
> -    p = postingsHash[hashPos];
> +    int termID = postingsHash[hashPos];
> 
> -    if (p != null && p.textStart != textStart) {
> +    if (termID != -1 && postingsArray.textStarts[termID] != textStart) {
>        // Conflict: keep searching different locations in
>        // the hash table.
>        final int inc = ((code>>8)+code)|1;
>        do {
>          code += inc;
>          hashPos = code & postingsHashMask;
> -        p = postingsHash[hashPos];
> -      } while (p != null && p.textStart != textStart);
> +        termID = postingsHash[hashPos];
> +      } while (termID != -1 && postingsArray.textStarts[termID] != textStart);
>      }
> 
> -    if (p == null) {
> +    if (termID == -1) {
> 
>        // First time we are seeing this token since we last
>        // flushed the hash.
> 
> -      // Refill?
> -      if (0 == perThread.freePostingsCount)
> -        perThread.morePostings();
> -
> -      // Pull next free RawPostingList from free list
> -      p = perThread.freePostings[--perThread.freePostingsCount];
> -      assert p != null;
> +      // New posting
> +      termID = numPostings++;
> +      if (termID >= postingsArray.textStarts.length) {
> +        growParallelPostingsArray();
> +      }
> +      if (perThread.termsHash.trackAllocations) {
> +        perThread.termsHash.docWriter.bytesUsed(bytesPerPosting);
> +      }
> +
> +      assert termID >= 0;
> 
> -      p.textStart = textStart;
> +      postingsArray.textStarts[termID] = textStart;
> 
> -      assert postingsHash[hashPos] == null;
> -      postingsHash[hashPos] = p;
> -      numPostings++;
> +      assert postingsHash[hashPos] == -1;
> +      postingsHash[hashPos] = termID;
> 
>        if (numPostings == postingsHashHalfSize)
>          rehashPostings(2*postingsHashSize);
> @@ -339,20 +377,21 @@ final class TermsHashPerField extends In
>        intUptoStart = intPool.intUpto;
>        intPool.intUpto += streamCount;
> 
> -      p.intStart = intUptoStart + intPool.intOffset;
> +      postingsArray.intStarts[termID] = intUptoStart + intPool.intOffset;
> 
>        for(int i=0;i<streamCount;i++) {
>          final int upto = bytePool.newSlice(ByteBlockPool.FIRST_LEVEL_SIZE);
>          intUptos[intUptoStart+i] = upto + bytePool.byteOffset;
>        }
> -      p.byteStart = intUptos[intUptoStart];
> +      postingsArray.byteStarts[termID] = intUptos[intUptoStart];
> 
> -      consumer.newTerm(p);
> +      consumer.newTerm(termID);
> 
>      } else {
> -      intUptos = intPool.buffers[p.intStart >> DocumentsWriter.INT_BLOCK_SHIFT];
> -      intUptoStart = p.intStart & DocumentsWriter.INT_BLOCK_MASK;
> -      consumer.addTerm(p);
> +      int intStart = postingsArray.intStarts[termID];
> +      intUptos = intPool.buffers[intStart >> DocumentsWriter.INT_BLOCK_SHIFT];
> +      intUptoStart = intStart & DocumentsWriter.INT_BLOCK_MASK;
> +      consumer.addTerm(termID);
>      }
>    }
> 
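[Editor's note] The hunk above rewrites the term hash as open addressing over ints: slots hold termIDs, `-1` marks an empty slot, and collisions re-probe with an odd increment derived from the hash code. A self-contained toy of that lookup (hypothetical class, fixed power-of-two table, no growth):

```java
import java.util.Arrays;

class IntHashDemo {
  final int[] slots;  // termIDs; -1 means empty
  final int mask;

  IntHashDemo(int size) {            // size must be a power of two
    slots = new int[size];
    Arrays.fill(slots, -1);          // -1 is the "empty" sentinel
    mask = size - 1;
  }

  // Returns the slot where textStart lives, or the empty slot to put it in.
  int findSlot(int code, int[] textStarts, int textStart) {
    int hashPos = code & mask;
    int termID = slots[hashPos];
    if (termID != -1 && textStarts[termID] != textStart) {
      final int inc = ((code >> 8) + code) | 1;  // odd, so every slot is reachable
      do {
        code += inc;
        hashPos = code & mask;
        termID = slots[hashPos];
      } while (termID != -1 && textStarts[termID] != textStart);
    }
    return hashPos;
  }
}
```

Replacing object references with ints here removes the null checks on `RawPostingList` and lets one `Arrays.fill(postingsHash, -1)` reset the table on flush.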
> @@ -371,20 +410,20 @@ final class TermsHashPerField extends In
>      int hashPos = code & postingsHashMask;
> 
>      // Locate RawPostingList in hash
> -    p = postingsHash[hashPos];
> +    int termID = postingsHash[hashPos];
> 
> -    if (p != null && !postingEquals()) {
> +    if (termID != -1 && !postingEquals(termID)) {
>        // Conflict: keep searching different locations in
>        // the hash table.
>        final int inc = ((code>>8)+code)|1;
>        do {
>          code += inc;
>          hashPos = code & postingsHashMask;
> -        p = postingsHash[hashPos];
> -      } while (p != null && !postingEquals());
> +        termID = postingsHash[hashPos];
> +      } while (termID != -1 && !postingEquals(termID));
>      }
> 
> -    if (p == null) {
> +    if (termID == -1) {
> 
>        // First time we are seeing this token since we last
>        // flushed the hash.
> @@ -414,18 +453,23 @@ final class TermsHashPerField extends In
>          bytePool.nextBuffer();
>        }
> 
> -      // Refill?
> -      if (0 == perThread.freePostingsCount) {
> -        perThread.morePostings();
> +      // New posting
> +      termID = numPostings++;
> +      if (termID >= postingsArray.textStarts.length) {
> +        growParallelPostingsArray();
> +      }
> +      if (perThread.termsHash.trackAllocations) {
> +        perThread.termsHash.docWriter.bytesUsed(bytesPerPosting);
>        }
> 
> -      // Pull next free RawPostingList from free list
> -      p = perThread.freePostings[--perThread.freePostingsCount];
> -      assert p != null;
> +      assert termID != -1;
> +      assert postingsHash[hashPos] == -1;
> +
> +      postingsHash[hashPos] = termID;
> 
>        final byte[] text = bytePool.buffer;
>        final int textUpto = bytePool.byteUpto;
> -      p.textStart = textUpto + bytePool.byteOffset;
> +      postingsArray.textStarts[termID] = textUpto + bytePool.byteOffset;
> 
>        // We first encode the length, followed by the UTF8
>        // bytes.  Length is encoded as vInt, but will consume
> @@ -446,10 +490,6 @@ final class TermsHashPerField extends In
>          System.arraycopy(utf8.bytes, 0, text, textUpto+2, utf8.length);
>        }
> 
> -      assert postingsHash[hashPos] == null;
> -      postingsHash[hashPos] = p;
> -      numPostings++;
> -
>        if (numPostings == postingsHashHalfSize) {
>          rehashPostings(2*postingsHashSize);
>        }
> @@ -467,25 +507,25 @@ final class TermsHashPerField extends In
>        intUptoStart = intPool.intUpto;
>        intPool.intUpto += streamCount;
> 
> -      p.intStart = intUptoStart + intPool.intOffset;
> +      postingsArray.intStarts[termID] = intUptoStart + intPool.intOffset;
> 
>        for(int i=0;i<streamCount;i++) {
>          final int upto = bytePool.newSlice(ByteBlockPool.FIRST_LEVEL_SIZE);
>          intUptos[intUptoStart+i] = upto + bytePool.byteOffset;
>        }
> -      p.byteStart = intUptos[intUptoStart];
> +      postingsArray.byteStarts[termID] = intUptos[intUptoStart];
> 
> -      consumer.newTerm(p);
> +      consumer.newTerm(termID);
> 
>      } else {
> -      // System.out.println("  already seen");
> -      intUptos = intPool.buffers[p.intStart >> DocumentsWriter.INT_BLOCK_SHIFT];
> -      intUptoStart = p.intStart & DocumentsWriter.INT_BLOCK_MASK;
> -      consumer.addTerm(p);
> +      final int intStart = postingsArray.intStarts[termID];
> +      intUptos = intPool.buffers[intStart >> DocumentsWriter.INT_BLOCK_SHIFT];
> +      intUptoStart = intStart & DocumentsWriter.INT_BLOCK_MASK;
> +      consumer.addTerm(termID);
>      }
> 
>      if (doNextCall)
> -      nextPerField.add(p.textStart);
> +      nextPerField.add(postingsArray.textStarts[termID]);
>    }
> 
>    int[] intUptos;
> @@ -535,14 +575,16 @@ final class TermsHashPerField extends In
> 
>      final int newMask = newSize-1;
> 
> -    RawPostingList[] newHash = new RawPostingList[newSize];
> +    int[] newHash = new int[newSize];
> +    Arrays.fill(newHash, -1);
>      for(int i=0;i<postingsHashSize;i++) {
> -      RawPostingList p0 = postingsHash[i];
> -      if (p0 != null) {
> +      int termID = postingsHash[i];
> +      if (termID != -1) {
>          int code;
>          if (perThread.primary) {
> -          final int start = p0.textStart & DocumentsWriter.BYTE_BLOCK_MASK;
> -          final byte[] text = bytePool.buffers[p0.textStart >> DocumentsWriter.BYTE_BLOCK_SHIFT];
> +          final int textStart = postingsArray.textStarts[termID];
> +          final int start = textStart & DocumentsWriter.BYTE_BLOCK_MASK;
> +          final byte[] text = bytePool.buffers[textStart >> DocumentsWriter.BYTE_BLOCK_SHIFT];
>            code = 0;
> 
>            final int len;
> @@ -561,19 +603,19 @@ final class TermsHashPerField extends In
>              code = (code*31) + text[pos++];
>            }
>          } else {
> -          code = p0.textStart;
> +          code = postingsArray.textStarts[termID];
>          }
> 
>          int hashPos = code & newMask;
>          assert hashPos >= 0;
> -        if (newHash[hashPos] != null) {
> +        if (newHash[hashPos] != -1) {
>            final int inc = ((code>>8)+code)|1;
>            do {
>              code += inc;
>              hashPos = code & newMask;
> -          } while (newHash[hashPos] != null);
> +          } while (newHash[hashPos] != -1);
>          }
> -        newHash[hashPos] = p0;
> +        newHash[hashPos] = termID;
>        }
>      }
> 
> 
> Modified: lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermsHashPerThread.java
> URL: http://svn.apache.org/viewvc/lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermsHashPerThread.java?rev=928283&r1=928282&r2=928283&view=diff
> ==============================================================================
> --- lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermsHashPerThread.java (original)
> +++ lucene/java/branches/flex_1458/src/java/org/apache/lucene/index/TermsHashPerThread.java Sat Mar 27 21:21:26 2010
> @@ -36,9 +36,6 @@ final class TermsHashPerThread extends I
>    final boolean primary;
>    final DocumentsWriter.DocState docState;
> 
> -  final RawPostingList freePostings[] = new RawPostingList[256];
> -  int freePostingsCount;
> -
>    // Used when comparing postings via termRefComp, in TermsHashPerField
>    final BytesRef tr1 = new BytesRef();
>    final BytesRef tr2 = new BytesRef();
> @@ -100,20 +97,6 @@ final class TermsHashPerThread extends I
>        nextPerThread.abort();
>    }
> 
> -  // perField calls this when it needs more postings:
> -  void morePostings() throws IOException {
> -    assert freePostingsCount == 0;
> -    termsHash.getPostings(freePostings);
> -    freePostingsCount = freePostings.length;
> -    assert noNullPostings(freePostings, freePostingsCount, "consumer=" + consumer);
> -  }
> -
> -  private static boolean noNullPostings(RawPostingList[] postings, int count, String details) {
> -    for(int i=0;i<count;i++)
> -      assert postings[i] != null: "postings[" + i + "] of " + count + " is null: " + details;
> -    return true;
> -  }
> -
>    @Override
>    public void startDocument() throws IOException {
>      consumer.startDocument();
> @@ -143,9 +126,8 @@ final class TermsHashPerThread extends I
>      intPool.reset();
>      bytePool.reset();
> 
> -    if (recyclePostings) {
> -      termsHash.recyclePostings(freePostings, freePostingsCount);
> -      freePostingsCount = 0;
> +    if (primary) {
> +      bytePool.reset();
>      }
>    }
>  }
> 
> Propchange: lucene/java/branches/flex_1458/src/java/org/apache/lucene/search/MultiTermQueryWrapperFilter.java
> ------------------------------------------------------------------------------
> --- svn:mergeinfo (original)
> +++ svn:mergeinfo Sat Mar 27 21:21:26 2010
> @@ -1,4 +1,4 @@
> -/lucene/dev/trunk/lucene/src/java/org/apache/lucene/search/MultiTermQueryWrapperFilter.java:926653,926948-928243,928255
> +/lucene/dev/trunk/lucene/src/java/org/apache/lucene/search/MultiTermQueryWrapperFilter.java:926653,926791,926948-928243,928255
>  /lucene/java/branches/lucene_2_9/src/java/org/apache/lucene/search/MultiTermQueryWrapperFilter.java:909334
>  /lucene/java/trunk/src/java/org/apache/lucene/search/MultiTermQueryWrapperFilter.java:916596,916685,916755,916769,917019,917148,917154,917161,917203-917204,919060,919119,919718,919731,919869,920237-920499,922013,922277,922525,922528,922583,922593,922602,922624,922797,922799,922806,922886-922887,923112,923238,924151,924207,924286,924321-924731,924781,925176-925462
>  /lucene/solr/branches/newtrunk/lucene/src/java/org/apache/lucene/search/MultiTermQueryWrapperFilter.java:926020-926289
> 
> Modified: lucene/java/branches/flex_1458/src/java/org/apache/lucene/store/Directory.java
> URL: http://svn.apache.org/viewvc/lucene/java/branches/flex_1458/src/java/org/apache/lucene/store/Directory.java?rev=928283&r1=928282&r2=928283&view=diff
> ==============================================================================
> --- lucene/java/branches/flex_1458/src/java/org/apache/lucene/store/Directory.java (original)
> +++ lucene/java/branches/flex_1458/src/java/org/apache/lucene/store/Directory.java Sat Mar 27 21:21:26 2010
> @@ -19,15 +19,9 @@ package org.apache.lucene.store;
> 
>  import java.io.IOException;
>  import java.io.Closeable;
> -import java.util.Collection;
> -import java.util.Collections;
> +import java.util.Collection; // for javadocs
> 
> -import java.util.ArrayList;
> -import static java.util.Arrays.asList;
> -import java.util.Collection;
> -import java.util.Iterator;
> -import java.util.List;
> -import org.apache.lucene.index.IndexFileNameFilter;
> +import java.util.Arrays;
>  import org.apache.lucene.util.IOUtils;
> 
>  /** A Directory is a flat list of files.  Files may be written once, when they
> @@ -199,18 +193,8 @@ public abstract class Directory implemen
>     *
>     * @param to        destination directory
>     */
> -  // TODO / noncommit: Use codec? for filename filter
>    public final void copyTo(Directory to) throws IOException {
> -    /*List<String> filenames = new ArrayList<String>();
> -    IndexFileNameFilter filter = IndexFileNameFilter.getFilter();
> -
> -    for (String name : listAll())
> -      if (filter.accept(null, name))
> -        filenames.add(name);
> -
> -    copyTo(to, filenames);
> -    */
> -    copyTo(to, asList(listAll()));
> +    copyTo(to, Arrays.asList(listAll()));
>    }
> 
>    /**
> 
> Propchange: lucene/java/branches/flex_1458/src/test/org/apache/lucene/analysis/TestISOLatin1AccentFilter.java
> ------------------------------------------------------------------------------
> --- svn:mergeinfo (original)
> +++ svn:mergeinfo Sat Mar 27 21:21:26 2010
> @@ -1,4 +1,4 @@
> -/lucene/dev/trunk/lucene/src/test/org/apache/lucene/analysis/TestISOLatin1AccentFilter.java:926653,926948-928243,928255
> +/lucene/dev/trunk/lucene/src/test/org/apache/lucene/analysis/TestISOLatin1AccentFilter.java:926653,926791,926948-928243,928255
>  /lucene/java/branches/lucene_2_4/src/test/org/apache/lucene/analysis/TestISOLatin1AccentFilter.java:748824
>  /lucene/java/branches/lucene_2_9/src/test/org/apache/lucene/analysis/TestISOLatin1AccentFilter.java:829134,829881,831036,896850,909334
>  /lucene/java/branches/lucene_3_0/src/test/org/apache/lucene/analysis/TestISOLatin1AccentFilter.java:880793,896906
> 
> Propchange: lucene/java/branches/flex_1458/src/test/org/apache/lucene/document/TestDateTools.java
> ------------------------------------------------------------------------------
> --- svn:mergeinfo (original)
> +++ svn:mergeinfo Sat Mar 27 21:21:26 2010
> @@ -1,4 +1,4 @@
> -/lucene/dev/trunk/lucene/src/test/org/apache/lucene/document/TestDateTools.java:926653,926948-928243,928255
> +/lucene/dev/trunk/lucene/src/test/org/apache/lucene/document/TestDateTools.java:926653,926791,926948-928243,928255
>  /lucene/java/branches/lucene_2_4/src/test/org/apache/lucene/document/TestDateTools.java:748824
>  /lucene/java/branches/lucene_2_9/src/test/org/apache/lucene/document/TestDateTools.java:829134,829881,831036,896850,909334
>  /lucene/java/branches/lucene_3_0/src/test/org/apache/lucene/document/TestDateTools.java:880793,896906
> 
> Propchange: lucene/java/branches/flex_1458/src/test/org/apache/lucene/document/TestNumberTools.java
> ------------------------------------------------------------------------------
> --- svn:mergeinfo (original)
> +++ svn:mergeinfo Sat Mar 27 21:21:26 2010
> @@ -1,4 +1,4 @@
> -/lucene/dev/trunk/lucene/src/test/org/apache/lucene/document/TestNumberTools.java:926653,926948-928243,928255
> +/lucene/dev/trunk/lucene/src/test/org/apache/lucene/document/TestNumberTools.java:926653,926791,926948-928243,928255
>  /lucene/java/branches/lucene_2_4/src/test/org/apache/lucene/document/TestNumberTools.java:748824
>  /lucene/java/branches/lucene_2_9/src/test/org/apache/lucene/document/TestNumberTools.java:829134,829881,831036,896850,909334
>  /lucene/java/branches/lucene_3_0/src/test/org/apache/lucene/document/TestNumberTools.java:880793,896906
> 
> Propchange: lucene/java/branches/flex_1458/src/test/org/apache/lucene/index/TestBackwardsCompatibility.java
> ------------------------------------------------------------------------------
> --- svn:mergeinfo (original)
> +++ svn:mergeinfo Sat Mar 27 21:21:26 2010
> @@ -1,4 +1,4 @@
> -/lucene/dev/trunk/lucene/src/test/org/apache/lucene/index/TestBackwardsCompatibility.java:926653,926948-928243,928255
> +/lucene/dev/trunk/lucene/src/test/org/apache/lucene/index/TestBackwardsCompatibility.java:926653,926791,926948-928243,928255
>  /lucene/java/branches/lucene_2_4/src/test/org/apache/lucene/index/TestBackwardsCompatibility.java:748824
>  /lucene/java/branches/lucene_2_9/src/test/org/apache/lucene/index/TestBackwardsCompatibility.java:829134,829881,831036,896850,909334
>  /lucene/java/branches/lucene_3_0/src/test/org/apache/lucene/index/TestBackwardsCompatibility.java:880793,896906
> 
> Propchange: lucene/java/branches/flex_1458/src/test/org/apache/lucene/util/TestAttributeSource.java
> ------------------------------------------------------------------------------
> --- svn:mergeinfo (original)
> +++ svn:mergeinfo Sat Mar 27 21:21:26 2010
> @@ -1,4 +1,4 @@
> -/lucene/dev/trunk/lucene/src/test/org/apache/lucene/util/TestAttributeSource.java:926653,926948-928243,928255
> +/lucene/dev/trunk/lucene/src/test/org/apache/lucene/util/TestAttributeSource.java:926653,926791,926948-928243,928255
>  /lucene/java/branches/lucene_2_9/src/test/org/apache/lucene/util/TestAttributeSource.java:909334
>  /lucene/java/trunk/src/test/org/apache/lucene/util/TestAttributeSource.java:916596,916685,916755,916769,917019,917148,917154,917161,917203-917204,919060,919119,919718,919731,919869,920237-920499,922013,922277,922525,922528,922583,922593,922602,922624,922797,922799,922806,922886-922887,923112,923238,924151,924207,924286,924321-924731,924781,925176-925462
>  /lucene/solr/branches/newtrunk/lucene/src/test/org/apache/lucene/util/TestAttributeSource.java:926020-926289
> 
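For readers skimming the archive: the heart of this patch is replacing the per-term RawPostingList objects in TermsHashPerField with dense int termIDs that index into parallel arrays (ParallelPostingsArray), so the open-addressing hash table becomes a plain int[] with -1 as the empty-slot sentinel. Below is a minimal, self-contained sketch of that pattern. It is not Lucene's actual class: the name ParallelTermHash is made up, growth/rehash is omitted, and the key comparison is simplified to comparing a single int where Lucene compares the term's UTF-8 bytes.

```java
import java.util.Arrays;

// Illustrative sketch (hypothetical class, not Lucene's API): dense int
// termIDs plus parallel arrays instead of one object per term.
class ParallelTermHash {
    final int[] postingsHash; // open-addressing table of termIDs; -1 marks an empty slot
    final int[] textStarts;   // parallel per-term array, indexed by termID
    int numTerms;

    ParallelTermHash(int size) { // size must be a power of two
        postingsHash = new int[size];
        Arrays.fill(postingsHash, -1);
        textStarts = new int[size];
    }

    /** Returns the termID for this key, assigning the next dense ID on first sight. */
    int add(int textStart) {
        final int mask = postingsHash.length - 1;
        int code = textStart;
        int hashPos = code & mask;
        int termID = postingsHash[hashPos];
        // Same probe step as the patch: inc = ((code>>8)+code)|1 is always
        // odd, so with a power-of-two table every slot is eventually visited.
        if (termID != -1 && textStarts[termID] != textStart) {
            final int inc = ((code >> 8) + code) | 1;
            do {
                code += inc;
                hashPos = code & mask;
                termID = postingsHash[hashPos];
            } while (termID != -1 && textStarts[termID] != textStart);
        }
        if (termID == -1) { // first time we see this term
            termID = numTerms++;
            textStarts[termID] = textStart;
            postingsHash[hashPos] = termID;
        }
        return termID;
    }

    public static void main(String[] args) {
        ParallelTermHash h = new ParallelTermHash(8);
        System.out.println(h.add(42)); // first term gets ID 0
        System.out.println(h.add(7));  // second term gets ID 1
        System.out.println(h.add(42)); // repeated term returns its existing ID 0
    }
}
```

The design win the commit is after: one int per hash slot instead of an object reference, and per-term fields packed into growable parallel arrays, which cuts GC pressure and lets bytesPerPosting be accounted for explicitly.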




