lucene-java-user mailing list archives

From Erick Erickson <erickerick...@gmail.com>
Subject Re: scalability limit in terms of numbers of large documents
Date Sat, 14 Aug 2010 11:09:22 GMT
As asked, that's really an unanswerable question. The math is pretty easy
in terms of running out of document IDs, but "searched quickly" depends
on too many variables.
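
The document-ID math Erick refers to can be sketched as follows. A minimal back-of-the-envelope calculation, assuming one Lucene document per chapter: Lucene document IDs are Java ints, so a single index is capped at roughly 2^31 documents (in recent Lucene versions the exact cap is IndexWriter.MAX_DOCS, slightly below Integer.MAX_VALUE). The class name and helper below are illustrative, not part of any Lucene API.

```java
// Back-of-the-envelope ceiling on books per index, given that each
// chapter is indexed as one Lucene document. 2_147_483_519L is
// IndexWriter.MAX_DOCS in recent Lucene versions; in 2010 the practical
// cap was Integer.MAX_VALUE, which changes the result only marginally.
public class DocIdMath {
    public static long maxBooks(long maxDocs, int chaptersPerBook) {
        return maxDocs / chaptersPerBook;
    }

    public static void main(String[] args) {
        long maxDocs = 2_147_483_519L;
        // Worst case (50 chapters/book) and best case (15 chapters/book):
        System.out.println(maxBooks(maxDocs, 50)); // ~43 million books
        System.out.println(maxBooks(maxDocs, 15)); // ~143 million books
    }
}
```

So the hard doc-ID ceiling sits in the tens of millions of books per index, far beyond the point where a single index would remain fast to search, which is Erick's point about sharding below.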

I suspect, though, that long before you ran out of document IDs, you'd need
to shard your index. Have you looked at Solr?

Best
Erick

On Fri, Aug 13, 2010 at 9:24 PM, andynuss <andrew_nuss@yahoo.com> wrote:

>
> Hi,
>
> Let's say that I am indexing large book documents broken into chapters — a
> typical book that you buy at Amazon. What would be the approximate limit on
> the number of books that can be indexed slowly and searched quickly? The
> search unit would be a chapter, so assume that a book is divided into 15-50
> chapters.  Any ideas?
>
> Andy
> --
> View this message in context:
> http://lucene.472066.n3.nabble.com/scalability-limit-in-terms-of-numbers-of-large-documents-tp1142517p1142517.html
> Sent from the Lucene - Java Users mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: java-user-unsubscribe@lucene.apache.org
> For additional commands, e-mail: java-user-help@lucene.apache.org
>
>
