lucene-solr-user mailing list archives

From "Jaeger, Jay - DOT" <Jay.Jae...@dot.wi.gov>
Subject RE: Out of memory
Date Tue, 13 Sep 2011 21:06:20 GMT
numDocs is not the number of documents in memory.  It is the number of documents currently
in the index (which is kept on disk).  Same goes for maxDoc, except that it counts all of
the documents that have ever been in the index since it was created or last optimized
(including deleted documents).
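If you want to see the distinction for yourself, here is a minimal sketch against the
Lucene Java API (3.x-era method names, which may differ in your version; the index path
is a placeholder) that reads both counts straight off the on-disk index:

import java.io.File;
import org.apache.lucene.index.IndexReader;
import org.apache.lucene.store.FSDirectory;

public class IndexCounts {
    public static void main(String[] args) throws Exception {
        // Open the on-disk index read-only; this does not pull documents into memory.
        IndexReader reader =
            IndexReader.open(FSDirectory.open(new File("/path/to/index")), true);
        int numDocs = reader.numDocs();  // live (non-deleted) documents
        int maxDoc  = reader.maxDoc();   // every doc ID assigned since creation / last optimize
        System.out.println("numDocs: " + numDocs);
        System.out.println("maxDoc:  " + maxDoc);
        System.out.println("deleted: " + (maxDoc - numDocs));
        reader.close();
    }
}

In your stats below, maxDoc - numDocs = 1878, so deleted documents are only a tiny
fraction of that index.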

Your subject line indicates that something is giving you some kind of out-of-memory error.  We
might be better able to help if you provide more information about your exact problem.

JRJ


-----Original Message-----
From: Rohit [mailto:rohit@in-rev.com] 
Sent: Tuesday, September 13, 2011 2:29 PM
To: solr-user@lucene.apache.org
Subject: Out of memory

I have Solr running on a machine with 18GB RAM, with 4 cores. One of the
cores is very big, containing 77,516,851 docs. The stats for its searcher are given
below:

searcherName : Searcher@5a578998 main 
caching : true 
numDocs : 77516851 
maxDoc : 77518729 
lockFactory=org.apache.lucene.store.NativeFSLockFactory@5a9c5842 
indexVersion : 1308817281798 
openedAt : Tue Sep 13 18:59:52 GMT 2011 
registeredAt : Tue Sep 13 19:00:55 GMT 2011 
warmupTime : 63139

- Is there a way to reduce the number of docs loaded into memory for this core?

- At any given time I don't need data from more than the past 15 days, unless someone
queries for it explicitly. How can this be achieved? (See the sketch after this list.)

- Would it be better to go for Solr replication or distribution if there is little
other option left?
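To illustrate the 15-day case, what I have in mind is something like this SolrJ
sketch (the date field name "timestamp", the core URL, and the SolrJ 3.x class names
are placeholders for my actual setup):

import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
import org.apache.solr.client.solrj.response.QueryResponse;

public class RecentDocs {
    public static void main(String[] args) throws Exception {
        // Placeholder URL and core name.
        CommonsHttpSolrServer server =
            new CommonsHttpSolrServer("http://localhost:8983/solr/bigcore");

        SolrQuery query = new SolrQuery("*:*");
        // Filter to the last 15 days. Rounding down to /DAY lets the filter
        // cache reuse one cached entry all day instead of one per request.
        query.addFilterQuery("timestamp:[NOW/DAY-15DAYS TO NOW]");

        QueryResponse rsp = server.query(query);
        System.out.println("numFound: " + rsp.getResults().getNumFound());
    }
}

The equivalent raw request would be /select?q=*:*&fq=timestamp:[NOW/DAY-15DAYS TO NOW].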

Regards,

Rohit

Mobile: +91-9901768202

About Me: http://about.me/rohitg
