lucene-java-user mailing list archives

From Vincent Le Maout <>
Subject over 300 GB to index: feasibility and performance issues
Date Mon, 26 Jul 2004 15:34:29 GMT
Hi everyone,

I have to index a huge amount of data: about 10 million documents
totalling roughly 300 GB. Is there any technical limitation in Lucene that
could prevent me from processing such an amount (apart, of course, from
the external limits imposed by the hardware: RAM, disks, the operating
system, and so on)? If possible, does anyone have an idea of the resources
needed: RAM, CPU time, index size, access time on such a collection?
If not, is it possible to extrapolate an estimate from previous
benchmarks?
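[For a first sanity check, index size can be extrapolated from the commonly quoted rule of thumb that a Lucene index over plain text ends up at roughly 25-35% of the raw text size. The ratios and the class below are illustrative assumptions, not measured figures for this collection:]

```java
// Back-of-envelope sizing for a 300 GB / 10M-document collection.
// The 25-35% index-to-text ratio is a rough rule of thumb for Lucene
// indexes over plain text, not a guarantee; stored fields, term vectors,
// or unusual vocabularies can push it well outside this range.
public class IndexSizing {
    // Estimated on-disk index size for a given raw text size and ratio.
    static double estimateIndexGb(double rawGb, double ratio) {
        return rawGb * ratio;
    }

    public static void main(String[] args) {
        double rawGb = 300.0;
        System.out.printf("low  estimate: %.0f GB%n",
                estimateIndexGb(rawGb, 0.25)); // 75 GB
        System.out.printf("high estimate: %.0f GB%n",
                estimateIndexGb(rawGb, 0.35)); // 105 GB
    }
}
```

[Per-document cost scales roughly linearly, so timing the indexing of a few thousand representative documents and multiplying out to 10 million gives a comparable back-of-envelope estimate for CPU time.]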

Thanks in advance.

Vincent Le Maout

