lucene-solr-user mailing list archives

From <>
Subject RE: never desallocate RAM...during search
Date Mon, 23 Jun 2008 12:16:25 GMT
Hi Nicolas,

I was doing something similar to your solution to get better search performance.
I downloaded your patch, but I ran into a problem in one class. I'm not sure if I'm
doing something wrong, but to compile the project I had to change

	//private Similarity similarity;

to

	private SimilarityFactory similarityFactory;

Am I doing something incorrectly, or is this a small bug?
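For what it's worth, the change above swaps a concrete Similarity field for a factory that produces one on demand. A minimal self-contained sketch of that pattern, assuming nothing about the actual patch (the interface and class names below are hypothetical stand-ins, not Lucene/Solr code):

```java
// Hypothetical stand-ins: the real patch presumably uses Lucene's
// Similarity and a Solr-side SimilarityFactory with different signatures.
interface Similarity {
    float lengthNorm(int numTerms);
}

interface SimilarityFactory {
    Similarity getSimilarity();
}

public class FactoryFieldSketch {
    // The field holds a factory rather than a concrete Similarity,
    // so the implementation can be chosen or configured lazily.
    private final SimilarityFactory similarityFactory;

    public FactoryFieldSketch(SimilarityFactory similarityFactory) {
        this.similarityFactory = similarityFactory;
    }

    public Similarity similarity() {
        return similarityFactory.getSimilarity();
    }

    public static void main(String[] args) {
        FactoryFieldSketch s =
                new FactoryFieldSketch(() -> numTerms -> 1.0f / numTerms);
        System.out.println(s.similarity().lengthNorm(4)); // prints 0.25
    }
}
```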



-----Original Message-----
From: Nicolas DESSAIGNE [] 
Sent: Friday, June 20, 2008 12:01
Subject: RE: never desallocate RAM...during search

Hi Robert,

We actually had a problem similar to yours (slow highlighting of big
documents). We fixed it by extending the copyField functionality:

We just updated the patch. It should work perfectly on trunk.

Please tell us if it answers your problem.
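For context, copyField is declared in Solr's schema.xml; the patch extends that mechanism. A plain, stock declaration looks roughly like this (field names and types are illustrative, not from the patch):

```xml
<!-- Copy the body text into a second field, e.g. one analyzed
     differently for highlighting. Names here are illustrative. -->
<field name="content"           type="text" indexed="true" stored="true"/>
<field name="content_highlight" type="text" indexed="true" stored="true"/>

<copyField source="content" dest="content_highlight"/>
```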


-----Original Message-----
From: []
Sent: Wednesday, June 18, 2008 15:49
To:
Subject: RE: never desallocate RAM...during search

Hi Otis,

Thank you for your attention.

I've been reading the Lucene and Solr mailing lists for days and no one has
reported anything similar, which is why this behaviour seems a bit strange to me.

I can try what you suggest about GC, but is what I'm describing normal
behaviour? Do I need to configure my JVM with special GC parameters for Solr?

Thanks a lot. I hope I can arrive at a solution with your help.

-----Original Message-----
From: Otis Gospodnetic []
Sent: Wednesday, June 18, 2008 14:55
Subject: Re: never desallocate RAM...during search

I don't have the answer about why cache still shows "true", but as far as
memory usage goes, based on your description I'd guess the memory is
allocated and used by the JVM, which typically tries not to run GC unless it
needs to.  So if you want to get rid of that used memory, you need to talk
to the JVM and persuade it to run GC.  I don't think there is a way to
manage memory usage directly.  There is System.gc() that you can call, but
that's only a "suggestion" for the JVM to run GC.
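A small illustration of that point, using nothing beyond the standard library: System.gc() only requests a collection, and Runtime exposes the heap figures you can watch before and after.

```java
public class GcHintDemo {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();

        // Allocate some short-lived garbage so there is something to collect.
        for (int i = 0; i < 1_000; i++) {
            byte[] junk = new byte[64 * 1024];
        }

        long usedBefore = rt.totalMemory() - rt.freeMemory();

        // Only a *hint*: the JVM is free to ignore it entirely.
        System.gc();

        long usedAfter = rt.totalMemory() - rt.freeMemory();
        System.out.println("used before GC hint: " + usedBefore);
        System.out.println("used after GC hint:  " + usedAfter);
        // Deliberately no check that usedAfter < usedBefore:
        // the JVM makes no such promise.
    }
}
```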

Otis --
Sematext -- -- Lucene - Solr - Nutch

----- Original Message ----
> From: Roberto Nieto <>
> To: solr-user <>
> Sent: Wednesday, June 18, 2008 7:43:12 AM
> Subject: never desallocate RAM...during search
> Hi users,
> Somedays ago I made a question about RAM use during searchs but I didn't
> solve my problem with the ideas that some expert users told me. After
> somes test I can make a more specific question hoping someone can help me.
> My problem is that i need highlighting and i have quite big docs (txt of
> 40MB). The conclusion of my tests is that if I set "rows" to 10, the
> of the first 10 results are cached. This if something normal because its
> probable needed for the highlighting, but this memory is never desallocate
> although I set solr's caches to 0. With this, the memory grows up until is
> close to the heap, then the gc start to desallocate memory..but at that
> point the searches are quite slow. Is this a normal behavior? Can I
> configure some solr parameter to force the desallocation of results after
> each search? [I´m using solr 1.2]
> Another thing that I found is that although I comment (in solrconfig) all
> this options:
> ----> filterCache, queryResultCache, documentCache,
> useFilterForSortedQuery, boolTofilterOptimizer
> In the stats always appear "caching:true".
> I'm probably leaving some stupid thing but I can't find it.
> If anyone can help me..i'm quite desperate.
> Rober.
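For reference, the caches Roberto mentions are declared in solrconfig.xml, and disabling them means removing or commenting out those declarations. The stock entries look roughly like this (attribute values are the shipped examples, not recommendations, and may differ in Solr 1.2):

```xml
<!-- Commenting these out disables the corresponding caches. -->
<filterCache      class="solr.LRUCache" size="512" initialSize="512" autowarmCount="256"/>
<queryResultCache class="solr.LRUCache" size="512" initialSize="512" autowarmCount="256"/>
<documentCache    class="solr.LRUCache" size="512" initialSize="512" autowarmCount="0"/>
```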
