lucy-user mailing list archives

From: Dag Lem <...@nimrod.no>
Subject: Re: [lucy-user] Running out of memory while loading
Date: Tue, 23 Oct 2012 12:35:10 GMT
Ken Youens-Clark <kyclark@gmail.com> writes:

> Hi,
> 
> I'm just experimenting with the use of Lucy to index the data I formally
> store in relational (MySQL) databases.  I'm just taking the text from the
> db and putting it into Lucy stores.  Each db gets its own directory so
> that it's easy for me to update just part of my search index when a db
> changes.  So far, I've processed about 30 dbs into a total of about 2.3GB
> of Lucy indexes.  The problem is that my machine (a pretty decent,
> dual-core Linux host) keeps running out of memory, esp. when indexing a
> large db with 100K+ records.  My sysadmin keeps killing it as it will
> take down the machine.
> 
> I'm using the latest Perl and Lucy source.  Any ideas?
> 
> Ken
> 

One possible cause of your problems is that your database driver
attempts to blast the complete result set from your SQL query into
client memory in one go, so that you are likely to run out of memory
before even starting on the indexing.

I don't really know DBD::mysql, assuming that's what you're using, but
I know from first-hand experience that you have to take special care
with DBD::Pg.
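
For what it's worth, with DBD::mysql the usual remedy is the
mysql_use_result attribute, which makes the driver stream rows from the
server one at a time instead of buffering the entire result set in
client memory first. A rough, untested sketch (the table and column
names are just placeholders):

    use DBI;

    my $dbh = DBI->connect("dbi:mysql:database=mydb", $user, $pass,
                           { RaiseError => 1 });

    # mysql_use_result streams rows from the server instead of
    # slurping the whole result set into client memory.
    my $sth = $dbh->prepare("SELECT id, body FROM documents",
                            { mysql_use_result => 1 });
    $sth->execute;

    while (my ($id, $body) = $sth->fetchrow_array) {
        # hand each row to the Lucy indexer here
    }

One caveat: with mysql_use_result the connection is tied up until the
last row has been fetched, so don't issue other queries on the same
handle in the meantime. With DBD::Pg the equivalent trick is a
server-side cursor (DECLARE ... CURSOR FOR SELECT ..., then FETCH in
batches until no rows come back).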

-- 
Best regards,

Dag Lem
