lucene-java-user mailing list archives

From Dan Funk <...@danandtamara.com>
Subject Re: best strategy to deal with large index file
Date Sat, 17 Dec 2005 03:12:04 GMT
Are there specific queries that cause the out of memory problem? Or will any
query do it?

How large is the index?

MultiSearcher allows you to search over multiple indexes and is well
supported throughout the API.  How you split your indexes depends on what
you want to achieve.  There are many people here on the list developing indexes
for large data sets.  Please be a little more specific about what you are
indexing and how you are searching.
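As a rough illustration (assuming the Lucene 1.4-era API; the index paths and
field name below are just placeholders, not anything specific to your setup),
searching two on-disk indexes through a MultiSearcher looks something like this:

import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.queryParser.QueryParser;
import org.apache.lucene.search.Hits;
import org.apache.lucene.search.IndexSearcher;
import org.apache.lucene.search.MultiSearcher;
import org.apache.lucene.search.Query;
import org.apache.lucene.search.Searchable;
import org.apache.lucene.search.Searcher;

public class MultiIndexSearch {
    public static void main(String[] args) throws Exception {
        // Open each smaller index separately (paths here are hypothetical).
        Searchable[] searchables = new Searchable[] {
            new IndexSearcher("/data/index-part1"),
            new IndexSearcher("/data/index-part2")
        };

        // MultiSearcher merges hits from all of the underlying indexes.
        Searcher searcher = new MultiSearcher(searchables);

        // "contents" is a placeholder field name; use whatever field you index.
        Query query = QueryParser.parse("your search terms", "contents",
                new StandardAnalyzer());

        Hits hits = searcher.search(query);
        System.out.println("Total hits: " + hits.length());

        searcher.close();
    }
}

The rest of the search code stays the same, since MultiSearcher is just another
Searcher as far as the caller is concerned.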

On 12/16/05, Jeff Liang <jeff@messagesolution.com> wrote:
>
> Hi all,
>
> My index file is huge because of a large set of data.  When I do a search, I
> sometimes get an OutOfMemory exception.  I don't know what usually causes
> the OutOfMemory exception.  Does it happen during the search
> because the index file is too big, or because there are too many
> hits?  The memory exception happens when I call
> IndexSearcher.search().
> It's also bad for backup, because I can't do an incremental backup after
> adding new documents since I only have one big index file.
>
> What's the best strategy to deal with a large index file?  What's a good
> way to split the index file?
> I start the JVM with 800MB.
> thanks,
>
> Jeff
