Message-ID: <12655880.post@talk.nabble.com>
Date: Thu, 13 Sep 2007 07:49:42 -0700 (PDT)
From: testn
To: java-user@lucene.apache.org
Reply-To: java-user@lucene.apache.org
Subject: Re: Java Heap Space -Out Of Memory Error
In-Reply-To: <12655013.post@talk.nabble.com>
MIME-Version: 1.0
Content-Type: text/plain; charset=us-ascii
References: <12475468.post@talk.nabble.com> <12476607.post@talk.nabble.com> <12478804.post@talk.nabble.com>
<12484208.post@talk.nabble.com> <12492218.post@talk.nabble.com> <12496515.post@talk.nabble.com> <12500824.post@talk.nabble.com> <12515624.post@talk.nabble.com> <12528983.post@talk.nabble.com> <12595489.post@talk.nabble.com> <12650012.post@talk.nabble.com> <12652816.post@talk.nabble.com> <12655013.post@talk.nabble.com>

Shouldn't the files be "segments_8" and "segments.gen"? Why does the error say "Segments"? The case is different.

Sebastin wrote:
>
> "java.io.IOException: File Not Found - Segments" is the error message.
>
> testn wrote:
>>
>> What is the error message? Probably Mike, Erick or Yonik can help you
>> better on this, since I'm no expert in the index area.
>>
>> Sebastin wrote:
>>>
>>> Hi testn,
>>> 1. I optimized the large indexes (10 GB) using Luke. It optimized
>>> all the content into a single CFS file and generated the segments.gen
>>> and segments_8 files. When I search for an item, it shows an error
>>> that the segments file is not there. Could you help me with this?
>>>
>>> testn wrote:
>>>>
>>>> 1. You can close the searcher once you're done. If you want to
>>>> reopen the index, you can close and reopen only the 3 updated
>>>> readers and keep reusing the 2 old IndexReaders. That should reduce
>>>> the time to reopen.
>>>> 2. Make sure that you optimize it every once in a while.
>>>> 3. You might consider separating the indices into separate storage
>>>> and using a ParallelReader.
>>>>
>>>> Sebastin wrote:
>>>>>
>>>>> The problems in my application are as follows:
>>>>> 1. I am not able to see the updated records in my index store,
>>>>> because I instantiate the IndexReader and IndexSearcher classes
>>>>> only once, on the first search. Further searches use the same
>>>>> IndexReaders (5 directories) and IndexSearcher with different
>>>>> queries.
>>>>>
>>>>> 2. My search is very, very slow. The first 2 directories, of 10 GB
>>>>> each, hold old index records and receive no updates; the remaining
>>>>> 3 directories are updated every second.
>>>>>
>>>>> 3. I am indexing 20 million records per day, so the index store
>>>>> keeps growing, and that makes search very, very slow.
>>>>>
>>>>> 4. I am using a searcherOne class as a global application helper
>>>>> class, with APPLICATION scope. It consists of get/set methods for
>>>>> one IndexReader and one IndexSearcher, which hold the IndexReader
>>>>> and IndexSearcher objects after the first search; they are used for
>>>>> all other searches.
>>>>>
>>>>> 5. I am using Lucene 2.2.0 in a web application that has 15 fields
>>>>> per document (5 fields indexed, 10 fields stored). I am not using
>>>>> any sort in my query. A single query fetches at most 600 records
>>>>> from the index store (5 directories).
>>>>>
>>>>> hossman wrote:
>>>>>>
>>>>>> : I set IndexSearcher as the application Object after the first
>>>>>> : search.
>>>>>> ...
>>>>>> : how can i reconstruct the new IndexSearcher for every hour to
>>>>>> : see the updated records .
>>>>>>
>>>>>> I'm confused ... my understanding, based on the comments you made
>>>>>> below (in an earlier message), was that you already *were*
>>>>>> constructing a new IndexSearcher once an hour -- but every time
>>>>>> you do that, your memory usage grows, and sometimes you get OOM
>>>>>> Errors.
>>>>>>
>>>>>> If that's not what you said, then I think you need to explain, in
>>>>>> detail, in one message, exactly what your problem is.
>>>>>> And don't assume we understand anything -- tell us *EVERYTHING*
>>>>>> (like, for example, what the word "crore" means, how "searcherOne"
>>>>>> is implemented, and the answer to the specific question I asked in
>>>>>> my last message: does your application contain, anywhere in it,
>>>>>> any code that will close anything (IndexSearchers or
>>>>>> IndexReaders)?
>>>>>>
>>>>>> : > : I use StandardAnalyzer. The records daily range from 5 crore
>>>>>> : > : to 6 crore. Every second I am updating my index. I
>>>>>> : > : instantiate the IndexSearcher object one time for all the
>>>>>> : > : searches. For an hour, can I see the updated records in the
>>>>>> : > : index store by reinstantiating the IndexSearcher object? But
>>>>>> : > : the problem is, when I reinstantiate IndexSearcher, my RAM
>>>>>> : > : memory gets appended. Is there any
>>>>>>
>>>>>> : > IndexSearcher are you explicitly closing both the old
>>>>>> : > IndexSearcher as well as all 4 of those old IndexReaders and
>>>>>> : > the MultiReader?
>>>>>>
>>>>>> -Hoss
>>>>>>
>>>>>
>>>>
>>>
>>
>

--
View this message in context: http://www.nabble.com/Java-Heap-Space--Out-Of-Memory-Error-tf4376803.html#a12655880
Sent from the Lucene - Java Users mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: java-user-unsubscribe@lucene.apache.org
For additional commands, e-mail: java-user-help@lucene.apache.org
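[Editorial note appended to the archived message] The swap-and-close discipline Hoss is probing for -- publish the freshly opened searcher, then explicitly close the one it replaces -- can be sketched independently of the Lucene API. The sketch below is illustrative, not Lucene code: `SearcherHolder` is a hypothetical stand-in for the application-scoped "searcherOne" helper described in the thread, written against `java.io.Closeable` so it applies to any resource; skipping the `close()` call is exactly what makes memory grow on every hourly reopen.

```java
import java.io.Closeable;
import java.io.IOException;
import java.util.concurrent.atomic.AtomicReference;

// Hypothetical stand-in for the application-scoped "searcherOne" helper.
// It holds the current searcher and, on each refresh, swaps in the freshly
// opened one and closes the one it replaces -- the step whose absence
// makes memory grow on every reopen.
class SearcherHolder<T extends Closeable> {
    private final AtomicReference<T> current = new AtomicReference<T>();

    SearcherHolder(T initial) {
        current.set(initial);
    }

    T get() {
        return current.get();
    }

    // Publish the new searcher, then release the old one's resources.
    void refresh(T fresh) {
        T old = current.getAndSet(fresh);
        if (old != null && old != fresh) {
            try {
                old.close(); // skipping this leaks the old reader/searcher
            } catch (IOException e) {
                // in a real application, log this; the swap has already happened
            }
        }
    }
}
```

In the setup from the thread, the holder would wrap the IndexSearcher (and the readers over the five directories): the hourly job opens new readers for the three updated directories, builds a new searcher over all five, and calls refresh, which closes the previous searcher so the garbage collector can actually reclaim it.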