lucene-java-user mailing list archives

From "Uwe Schindler" <...@thetaphi.de>
Subject RE: Memory issues
Date Sat, 03 Sep 2011 21:41:05 GMT
There is no difference between 2.9 and 3.0; it is exactly the same code, with
only Java 5-specific API modifications and removal of deprecated methods.

The issue you are seeing seems to be that your index has grown beyond some
limit of your JVM.
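The arithmetic behind that suspicion can be sketched roughly as follows. This
is a back-of-the-envelope estimate using the ~300 million document count from
the question below; the class and method names here are illustrative, not
Lucene APIs:

```java
// Rough estimate of the per-field memory a FieldCache-backed string sort
// needs. FieldCacheEstimate and ordArrayBytes are hypothetical names for
// illustration only, not part of Lucene.
public class FieldCacheEstimate {

    // The string-index cache holds one int ordinal per document,
    // i.e. 4 bytes per document, per sorted field.
    static long ordArrayBytes(long maxDoc) {
        return 4L * maxDoc;
    }

    // A Java array length is a signed 32-bit int; the practical cap is
    // Integer.MAX_VALUE (the exact usable bound is VM-dependent).
    static final long MAX_ARRAY_LENGTH = Integer.MAX_VALUE;

    public static void main(String[] args) {
        long maxDoc = 300_000_000L; // ~300M docs, per the question
        System.out.println("ord array: " + ordArrayBytes(maxDoc) + " bytes");
        // ~1.2 GB for the ord array alone. Note the 300M-entry *length*
        // is well under MAX_ARRAY_LENGTH, so a "Requested array size
        // exceeds VM limit" error suggests some other, much larger
        // allocation (for example, a lookup array sized near 2 billion).
    }
}
```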

Uwe

-----
Uwe Schindler
H.-H.-Meier-Allee 63, D-28213 Bremen
http://www.thetaphi.de
eMail: uwe@thetaphi.de

> -----Original Message-----
> From: Michael Bell [mailto:mjb@gwava.com]
> Sent: Saturday, September 03, 2011 8:10 PM
> To: java-user@lucene.apache.org
> Subject: Memory issues
> 
> Ok, one customer of ours insists on running a Really Big Single Server
> Lucene index.
> 
> To be exact, there are about 300 million documents. This is running on a
> 64-bit JVM/64-bit OS with 24 GB(!) RAM allocated.
> 
> Until very recently all was well. We then updated the program version,
> which upgraded Lucene 2.9.3 to Lucene 3.0.2.
> 
> Now, their searches work fine IF you do not SORT the results. If you do
> SORT, you get stuff like
> 
> 2011-08-30 13:01:31,489 [TP-Processor8] ERROR com.gwava.utils.ServerErrorHandlerStrategy - reportError: nastybadthing :: com.gwava.indexing.lucene.internal.LuceneSearchController.performSearchOperation:229 :: EXCEPTION : java.lang.OutOfMemoryError: Requested array size exceeds VM limit
> java.lang.OutOfMemoryError: Requested array size exceeds VM limit
>   at org.apache.lucene.search.FieldCacheImpl$StringIndexCache.createValue(FieldCacheImpl.java:624)
>   at org.apache.lucene.search.FieldCacheImpl$Cache.get(FieldCacheImpl.java:183)
>   at org.apache.lucene.search.FieldCacheImpl.getStringIndex(FieldCacheImpl.java:611)
>   at org.apache.lucene.search.FieldComparator$StringOrdValComparator.setNextReader(FieldComparator.java:805)
>   at org.apache.lucene.search.TopFieldCollector$OneComparatorNonScoringCollector.setNextReader(TopFieldCollector.java:96)
>   at org.apache.lucene.search.IndexSearcher.search(IndexSearcher.java:209)
>   at org.apache.lucene.search.IndexSearcher.search(IndexSearcher.java:199)
>   at org.apache.lucene.search.IndexSearcher.search(IndexSearcher.java:177)
>   at org.apache.lucene.search.Searcher.search(Searcher.java:49)
>   at com.gwava.indexing.lucene.internal.LuceneSearchController.performSearchOperation(LuceneSearchController.java:223)
>   at
> 
> Looking at the sort class, the API docs appear to say it would create an
> array of 1.2 billion (4*300m). Is this correct? Is the issue going beyond
> the signed int32 limit on array size (~2 billion entries), or is it really
> a memory issue? How best to diagnose?
> 
> Thanks
> 
> 
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: java-user-unsubscribe@lucene.apache.org
> For additional commands, e-mail: java-user-help@lucene.apache.org


