lucene-java-user mailing list archives

From "Rickard Bäckman" <backman.rick...@gmail.com>
Subject Re: Incremental updates / slow searches.
Date Wed, 11 Oct 2006 10:09:18 GMT
Thanks for the suggestions.

We tried to reduce the number of times we open a new searcher, with some
progress. However, a lot of our searches still time out. We are currently
opening a new searcher and warming it up before doing the switch. We even
map the fields we use for deleting to the document ids, so that we can do
deletes without running any searches.
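
For reference, here is a simplified sketch of the warm-then-swap pattern
(Lucene 2.x-era API; the class name, index path, and warm-up queries are
only illustrative, not our actual code):

import org.apache.lucene.index.Term;
import org.apache.lucene.search.Hits;
import org.apache.lucene.search.IndexSearcher;
import org.apache.lucene.search.Query;
import org.apache.lucene.search.TermQuery;

public class SearcherManager {
    private static final String INDEX_PATH = "/path/to/index"; // illustrative path
    private volatile IndexSearcher current;                    // searcher serving live traffic

    // Queries used only to warm the new searcher before the swap; pick
    // queries that touch the fields you actually search and sort on.
    private static final Query[] WARM_QUERIES = {
        new TermQuery(new Term("body", "lucene"))
    };

    /** Open a searcher on the updated index, warm it, then swap it in. */
    public synchronized void reopenAndWarm() throws Exception {
        IndexSearcher fresh = new IndexSearcher(INDEX_PATH);
        for (int i = 0; i < WARM_QUERIES.length; i++) {
            Hits hits = fresh.search(WARM_QUERIES[i]); // warm-up search
            hits.length(); // we only care that the search ran, not the results
        }
        IndexSearcher old = current;
        current = fresh;      // searches issued after this point see the new index
        if (old != null) {
            old.close();      // only safe once in-flight searches on 'old' have finished
        }
    }

    public IndexSearcher getSearcher() {
        return current;
    }
}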

Analyzing / merging is done on the same machine as the searches.
I guess you are suggesting building the index on another machine and
copying it to the search machine? Or is there another solution that still
uses incremental updates on the searching machine?
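
For completeness, a rough sketch of a batched delete-then-add pass like the
one described in my original message below, with the merge factor / max
buffered docs settings we are asking about (again Lucene 2.x-era API; the
"id" field, batch structure, and settings values are illustrative, not our
actual code):

import java.util.Iterator;
import java.util.List;

import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.document.Document;
import org.apache.lucene.index.IndexReader;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.index.Term;

public class BatchUpdater {
    private static final String INDEX_PATH = "/path/to/index"; // illustrative path

    /** ids: unique keys of documents to replace; docs: their new versions. */
    public void applyBatch(List ids, List docs) throws Exception {
        // 1) Delete by term on the unique-id field; no search is needed for this.
        IndexReader reader = IndexReader.open(INDEX_PATH);
        for (Iterator it = ids.iterator(); it.hasNext();) {
            reader.deleteDocuments(new Term("id", (String) it.next()));
        }
        reader.close(); // flushes the deletions to disk

        // 2) Add the new documents in one pass with larger in-memory buffers.
        IndexWriter writer = new IndexWriter(INDEX_PATH, new StandardAnalyzer(), false);
        writer.setMergeFactor(10);        // higher = faster indexing, more segments to search
        writer.setMaxBufferedDocs(1000);  // buffer more docs in RAM before flushing a segment
        for (Iterator it = docs.iterator(); it.hasNext();) {
            writer.addDocument((Document) it.next());
        }
        writer.close();
        // After the batch, reopen and warm a new IndexSearcher before swapping it in.
    }
}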

/Rickard

On 10/9/06, Yonik Seeley <yonik@apache.org> wrote:
>
> The biggest thing would be to limit how often you open a new
> IndexSearcher, and when you do, warm up the new searcher in the
> background while you continue serving searches with the existing
> searcher.  This is the strategy that Solr uses.
>
> There is also the question of whether you are analyzing/merging docs on the
> same servers that you are executing searches on.  You can use a
> separate box to build the index and distribute changes to boxes used
> for searching.
>
> -Yonik
> http://incubator.apache.org/solr Solr, the open-source Lucene search
> server
>
> On 10/9/06, Rickard Bäckman <backman.rickard@gmail.com> wrote:
> > Hi,
> >
> > we are using a search system based on Lucene and have recently tried to
> > add incremental updating of the index instead of building a new index
> > every now and then. However, we now run into problems as our searches
> > start to take a very long time to complete.
> >
> > Our index is about 8-9 GB and we are sending lots of updates per second
> > (we are probably merging in 200-300 within a few seconds). Today we
> > buffer a bunch of updates and then merge them into the existing index as
> > a batch, first doing deletes and then inserts.
> >
> > We are currently not using any special tuning of Lucene.
> >
> > Does anyone have similar experiences with Lucene, or advice on how to
> > reduce the time it takes to perform a search? In particular, what would
> > be an optimal combination of update size, merge factor, and max buffered
> > docs?
> >
> > /Rickard
> >
> >
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: java-user-unsubscribe@lucene.apache.org
> For additional commands, e-mail: java-user-help@lucene.apache.org
>
>
