lucene-java-user mailing list archives

From Paul Hill <p...@metajure.com>
Subject RE: any good idea for loading fields into memory?
Date Mon, 25 Jun 2012 17:06:27 GMT
OK, fair enough: you want to keep everything very fast. I'm surprised that large documents
are slower to search; I'm consistently impressed by the search times. Finding good hit fragments
in a big document can be slow, but for me (searching human-created documents) it never is.

> the reason for the slowdown is getting field values; when I use my wrapped IndexSearcher,
> which loads fields into arrays, it's as fast as before.

I recently read a note somewhere advising not to load the stored document (getDocument) during
a search, because using the (segment-level) field caches is faster.
But I can't say what the trade-off would be if you wanted most fields at each step of the search.
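To make the field-cache suggestion concrete, here is a minimal sketch of a custom Collector that pulls a field value per hit from the segment-level FieldCache instead of calling getDocument. This assumes the Lucene 3.x Collector API (current around the time of this message); the "category" field name is just an example, not anything from the original thread.

```java
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import org.apache.lucene.index.IndexReader;
import org.apache.lucene.search.Collector;
import org.apache.lucene.search.FieldCache;
import org.apache.lucene.search.Scorer;

// Collects one field value per hit via the per-segment FieldCache,
// avoiding stored-field I/O (i.e. no searcher.doc(id) per hit).
public class FieldCacheCollector extends Collector {
    private String[] values;                 // uninverted values for the current segment
    private final List<String> hits = new ArrayList<String>();

    @Override
    public void setScorer(Scorer scorer) {
        // Scores are not needed here.
    }

    @Override
    public void setNextReader(IndexReader reader, int docBase) throws IOException {
        // One array per segment; cached, so this is cheap after the first call.
        values = FieldCache.DEFAULT.getStrings(reader, "category"); // example field name
    }

    @Override
    public void collect(int doc) {
        // 'doc' is the per-segment doc id, which indexes the cache array directly.
        hits.add(values[doc]);
    }

    @Override
    public boolean acceptsDocsOutOfOrder() {
        return true; // order doesn't matter when just gathering field values
    }

    public List<String> getHits() {
        return hits;
    }
}
```

You'd pass an instance to `searcher.search(query, collector)`. The trade-off noted above still applies: each `getStrings` call uninverts one field per segment, so fetching many fields this way multiplies that memory cost.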

Good luck,
-Paul