lucene-java-user mailing list archives

From "bruno da silva" <brunogi...@gmail.com>
Subject Re: memory leak getting docs
Date Wed, 05 Nov 2008 20:21:29 GMT
Hello Marc
I'd suggest creating the IndexSearcher outside of your method and passing the
IndexReader in as a parameter, like this:

private Document getDocumentData(IndexReader reader, String id)

You don't have a memory leak; you have intensive memory use.
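
Something like the sketch below, using the same Lucene 2.x-style API as your
code (GlobalFields.ID and the list of ids are assumed to come from your app;
processIds is just a made-up driver method):

import java.io.IOException;

import org.apache.lucene.document.Document;
import org.apache.lucene.index.IndexReader;
import org.apache.lucene.index.Term;
import org.apache.lucene.index.TermDocs;
import org.apache.lucene.search.IndexSearcher;

public class DocLookup {

    // Open the searcher once for the whole batch, not once per id.
    public void processIds(String indexPath, Iterable<String> ids) throws IOException {
        IndexSearcher searcher = new IndexSearcher(indexPath);
        IndexReader reader = searcher.getIndexReader();
        try {
            for (String id : ids) {
                Document doc = getDocumentData(reader, id);
                // ... copy/modify fields, delete the old doc, add the new one ...
            }
        } finally {
            searcher.close(); // also closes the reader it opened from the path
        }
    }

    // Same lookup as before, but the reader is passed in instead of being
    // created (and thrown away) on every call.
    private Document getDocumentData(IndexReader reader, String id) throws IOException {
        TermDocs td = reader.termDocs(new Term(GlobalFields.ID, id));
        try {
            return td.next() ? reader.document(td.doc()) : null;
        } finally {
            td.close();
        }
    }
}

Opening a new IndexSearcher re-reads index structures every time, so doing it
5,000 times piles up garbage until the collector catches up; one shared
searcher avoids that and is also much faster. For the delete-and-re-add step
you describe, there is a second sketch below your quoted message.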



On Wed, Nov 5, 2008 at 3:11 PM, Marc Sturlese <marc.sturlese@gmail.com> wrote:

>
> Hey there, I have posted about this problem before, but I don't think I
> explained myself very well.
> I'll try to explain my problem in context:
> I get ids from a database and look for the document in the index that
> corresponds to each id. There is exactly one match for every id. Once I have
> the document, I have to modify some fields. What I do is create a new doc,
> copy the fields of the old doc while applying the modifications, delete the
> old doc from the index, and add the new one.
> I have to do this for about 5,000 documents.
> My problem is that when I execute the app, its memory use keeps growing
> until it reaches 700 MB or even more.
> I have realized that the memory leak appears when getting the documents.
> I'll leave the piece of code I use to do that here... any advice would be
> really helpful because it's driving me mad.
>
> private Document getDocumentData(String id) {
>        IndexSearcher searcher = null ;
>        IndexReader reader = null;
>        Document doc = null ;
>        TermDocs td = null;
>
>        try {
>
>            searcher = new IndexSearcher(path_my_index) ;
>            reader=searcher.getIndexReader();
>
>            td=reader.termDocs(new Term(GlobalFields.ID,id)) ;
>
>            if(td.next())
>            {
>                doc = reader.document(td.doc()) ;
>            }
>            if(td != null)
>            {
>                td.close() ;
>            }
>            td=null;
>        } catch (Exception ex) {
>            logger.error("Troubles geting data: " + ex.getMessage()) ;
>            logger.error("Stack: " + ErrorStack.getError(ex)) ;
>        } finally {
>            if(searcher != null) {
>                try {
>                    reader.close() ;
>                    searcher.close() ;
>                } catch(Exception e){}
>                reader = null ;
>                searcher = null ;
>            }
>        }
>        return doc ;
>    }
>
> This function is called for every doc whose information I have to get.
> PS: If I call the garbage collector manually at the end of the function, the
> memory leak never appears... but the app runs really slowly!!
> Thanks in advance
>
> Marc
>
>
> --
> View this message in context:
> http://www.nabble.com/memory-leak-getting-docs-tp20348816p20348816.html
> Sent from the Lucene - Java Users mailing list archive at Nabble.com.
>
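
For the copy/modify/delete/add step you describe, if your Lucene version is 2.1
or later you can also let IndexWriter do the delete-and-re-add in one call with
updateDocument(Term, Document). A rough sketch: buildModifiedDocument is a
placeholder for your copy-and-modify logic, StandardAnalyzer is just an example
analyzer, and GlobalFields.ID plus the index path are assumed from your code.
It relies on the ID field being indexed as a single untokenized term, which
your TermDocs lookup already assumes.

import java.io.IOException;

import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.document.Document;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.index.Term;

public class DocRewriter {

    public void rewriteAll(String indexPath, Iterable<String> ids) throws IOException {
        // false = open the existing index rather than creating a new one
        IndexWriter writer = new IndexWriter(indexPath, new StandardAnalyzer(), false);
        try {
            for (String id : ids) {
                Document newDoc = buildModifiedDocument(id);
                // Deletes every document whose ID term matches, then adds newDoc.
                writer.updateDocument(new Term(GlobalFields.ID, id), newDoc);
            }
        } finally {
            writer.close();
        }
    }

    private Document buildModifiedDocument(String id) {
        // Placeholder: look up the old document, copy its fields with the
        // required modifications, and return the result.
        return new Document();
    }
}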


-- 
Bruno da Silva

Consultant
Abilis Solutions
1407 de la Montagne
Montreal, Quebec
H3G 1Z3
Cell: 514 - 572 3453
bdasilva@abilis.ca
