lucy-user mailing list archives

From Ken Youens-Clark <>
Subject [lucy-user] Running out of memory while loading
Date Mon, 22 Oct 2012 16:03:32 GMT

I'm experimenting with using Lucy to index the data I normally store in relational (MySQL)
databases.  I take the text from the db and put it into Lucy stores.  Each db gets its own
directory so that it's easy for me to update just part of my search index when a db changes.
So far, I've processed about 30 dbs into a total of about 2.3GB of Lucy indexes.  The problem
is that my machine (a pretty decent, dual-core Linux host) keeps running out of memory,
especially when indexing a large db with 100K+ records.  My sysadmin keeps killing the
process because it will take down the machine.
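
For reference, my indexing loop is roughly the following sketch (the table, column, and
path names here are invented for illustration; the Lucy calls are the standard Perl API):

```perl
#!/usr/bin/env perl
use strict;
use warnings;
use DBI;
use Lucy::Plan::Schema;
use Lucy::Plan::FullTextType;
use Lucy::Analysis::EasyAnalyzer;
use Lucy::Index::Indexer;

# One schema shared by every per-db index.
my $schema   = Lucy::Plan::Schema->new;
my $analyzer = Lucy::Analysis::EasyAnalyzer->new( language => 'en' );
my $type     = Lucy::Plan::FullTextType->new( analyzer => $analyzer );
$schema->spec_field( name => 'content', type => $type );

my $dbh = DBI->connect( 'dbi:mysql:some_db', 'user', 'pass' );

# Each db gets its own index directory.
my $indexer = Lucy::Index::Indexer->new(
    index  => '/usr/local/lucy/some_db',
    schema => $schema,
    create => 1,
);

my $sth = $dbh->prepare('SELECT text_col FROM some_table');
$sth->execute;
while ( my ($text) = $sth->fetchrow_array ) {
    $indexer->add_doc( { content => $text } );
}

# A single commit at the very end, after all 100K+ add_doc calls.
$indexer->commit;
```

The memory blowup happens during that while loop on the big dbs; everything is added
through one Indexer with one commit at the end.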

I'm using the latest Perl and Lucy source.  Any ideas?
