lucene-pylucene-dev mailing list archives

From Moshe Cohen <>
Subject Re: Java Memory errors and "too many open files" when using Pylucene
Date Wed, 20 May 2009 16:48:26 GMT
Version being used: 2.4.1.
I have already tried most of the well-documented Lucene ideas. The seemingly
weird thing is that the index is always quite small; I have run much larger
indices on Solr with no such errors.

It started with a memory error. After increasing the JVM heap on init I got
the "too many open files" error, and after increasing the OS limit I got a
memory error again. I got further along at each stage, but ultimately I still
hit an error. I can work around the problem by simply restarting the program,
which is what led me to suspect resource leaks specific to PyLucene.

Are there any useful monitoring functions that can retrieve the resource
usage state along the way?
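One hedged way to watch this from Python itself, on Linux, is to count the
process's own file descriptors under /proc (the path is Linux-specific; other
systems would need a tool like lsof instead):

```python
import os

def open_fd_count():
    """Return the number of file descriptors open in this process (Linux only)."""
    return len(os.listdir("/proc/%d/fd" % os.getpid()))

# Usage: log this between identical units of indexing/search work; a count
# that grows monotonically across iterations points at a descriptor leak.
before = open_fd_count()
f = open(os.devnull)   # open one extra descriptor...
after = open_fd_count()
f.close()              # ...and release it again
```

Logging this once per indexing batch should show whether the descriptor count
is climbing toward the OS limit before the error fires.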


On Wed, May 20, 2009 at 7:31 PM, Andi Vajda <> wrote:

> On Wed, 20 May 2009, Moshe Cohen wrote:
>> I would like to know if there are any PyLucene-specific issues with regard
>> to the two JVM errors in the subject.
> About memory errors: be sure to give the Java VM enough memory when
> initializing it with initVM(). More about this at [1].
> About open files: is your index using the Lucene compound file format? If
> not, switching to it could help. If you are already, what is your OS and
> its open-file limit? Have you tried increasing it? Are you closing all
> 'things' that you think ought to be closed? What version of PyLucene and
> Java Lucene are you using?
> Unless there is an egregious leak somewhere, both issues should reproduce
> with the same Java program (PyLucene just wraps a Java VM, the same Java
> Lucene code is run). You may want to ask the same questions on
> [2].
> Andi..
> [1]
> [2]
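The "are you closing all 'things'" point in the reply is the usual culprit for
both errors. A minimal sketch of the close-everything discipline, using
try/finally; a plain file stands in for the PyLucene objects here so the
pattern is runnable anywhere, and the PyLucene names in the comments are
illustrative, not the exact 2.4.1 API:

```python
import tempfile, os

def search_once(path):
    # In PyLucene this would be an IndexSearcher/IndexReader over a store;
    # a plain file handle stands in so the pattern is self-contained.
    handle = open(path, "rb")      # e.g. searcher = IndexSearcher(store)
    try:
        return handle.read(16)     # e.g. hits = searcher.search(query)
    finally:
        handle.close()             # always close, even on exceptions;
                                   # leaked handles accumulate into
                                   # "too many open files"

# Usage: repeated calls do not grow the process's open-file count,
# because every handle is released on the way out.
tmp = tempfile.NamedTemporaryFile(delete=False)
tmp.write(b"hello pylucene")
tmp.close()
for _ in range(1000):
    search_once(tmp.name)
os.unlink(tmp.name)
```

The same shape applies to every IndexReader, IndexWriter, and IndexSearcher a
long-running program creates; any of them left unclosed holds index files open
until the JVM exits.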
