lucene-java-user mailing list archives

From "Kevin A. Burton" <bur...@newsmonster.org>
Subject Re: OptimizeIt -- Re: force gc idiom - Re: OutOfMemory example
Date Mon, 13 Sep 2004 21:55:23 GMT
David Spencer wrote:

> Jiří Kuhn wrote:
>
>> This doesn't work either!
>
>
> You're right.
> I'm running under JDK 1.5 and trying larger values for -Xmx, and it
> still fails.
>
> Running under (Borland's) OptimizeIt shows that the number of Term and
> TermInfo instances (both in org.apache.lucene.index) increases every
> time through the loop, by several hundred each.

Yes... I'm running into a similar situation on JDK 1.4.2 with Lucene 
1.3. I profiled it with the JMP debugger, and all of my memory is taken 
up by Term and TermInfo instances...
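
For anyone trying to reproduce this, here's a minimal sketch of the
general shape of the loop under discussion -- the index path, field
name, and term are placeholders of mine, not the actual test case from
this thread:

    import org.apache.lucene.index.Term;
    import org.apache.lucene.search.Hits;
    import org.apache.lucene.search.IndexSearcher;
    import org.apache.lucene.search.TermQuery;

    public class SearchLoop {
        public static void main(String[] args) throws Exception {
            for (int i = 0; i < 100000; i++) {
                // Open, search, close on every pass.
                IndexSearcher searcher = new IndexSearcher("/tmp/test-index");
                Hits hits = searcher.search(
                        new TermQuery(new Term("contents", "lucene")));
                System.out.println(i + ": " + hits.length() + " hits, free="
                        + Runtime.getRuntime().freeMemory());
                searcher.close();

                // The "force gc" idiom from earlier in the thread; per the
                // reports above it does not stop the Term/TermInfo counts
                // from growing each iteration.
                System.gc();
            }
        }
    }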

> I can trace through some Term instances on OptimizeIt's reference 
> graph, but it's unclear to me what's right. One *guess* is that the 
> WeakHashMap in either SegmentReader or FieldCacheImpl is the 
> problem.
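
If that guess is on the mark, one workaround worth trying (just a
sketch, not a confirmed fix -- the path and structure here are mine) is
to keep a single reader around and close the old one explicitly when
the index is reopened, instead of counting on the weakly-referenced
cache entries being collected:

    import org.apache.lucene.index.IndexReader;
    import org.apache.lucene.search.IndexSearcher;

    public class Reopen {
        // Placeholder path, not from the thread.
        static String path = "/tmp/test-index";
        static IndexReader reader;
        static IndexSearcher searcher;

        // Called whenever the index is known to have changed.
        static void reopen() throws java.io.IOException {
            if (searcher != null) searcher.close();
            if (reader != null) reader.close();  // drop the old SegmentReader(s)
                                                 // and whatever their caches hold
            reader = IndexReader.open(path);
            searcher = new IndexSearcher(reader);
        }
    }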

Kevin

-- 

Please reply using PGP.

    http://peerfear.org/pubkey.asc    
    
    NewsMonster - http://www.newsmonster.org/
    
Kevin A. Burton, Location - San Francisco, CA, Cell - 415.595.9965
       AIM/YIM - sfburtonator,  Web - http://peerfear.org/
GPG fingerprint: 5FB2 F3E2 760E 70A8 6174 D393 E84D 8D04 99F1 4412
  IRC - freenode.net #infoanarchy | #p2p-hackers | #newsmonster



