lucene-java-user mailing list archives

Site index · List index
Message view « Date » · « Thread »
Top « Date » · « Thread »
From: Ryan McKinley <ryan...@gmail.com>
Subject: Re: restoring a corrupt index?
Date: Sun, 11 Nov 2007 17:48:15 GMT

> 
> Ryan, are you able to update to that commit I just did?  If so, I think
> you should run the tool without -fix and post back what it printed.  It
> should report an error on that one segment due to the missing file.
> Then, run -fix to remove that segment (please back up your index first!).
> Then, if you have a zillion segments in the index, try optimizing it.
> 
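For reference, the tool mentioned above is Lucene's CheckIndex. A rough sketch
of how it is typically invoked from the command line (the jar name and index
path here are placeholders, not details from this thread):

    java -cp lucene-core.jar org.apache.lucene.index.CheckIndex /path/to/index
    java -cp lucene-core.jar org.apache.lucene.index.CheckIndex /path/to/index -fix

The first run only reports problems; adding -fix removes any segments that
CheckIndex could not read, which is why backing up the index first matters.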

Ugh.  Thanks for all your help and pointers.  I've been able to salvage a
functioning index (with 800K fewer docs), so I'll just go back to an old
index and build from there.

Optimizing reduced 180K files to 1800; I'm guessing that had something to do
with why it hit the max open file limit.

I'm running a standard solr-trunk from a few weeks back (nothing custom, no
'embedded' stuff).  The index is constantly being updated, so I am not
surprised by the mass of unused files that optimize cleans up.
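For anyone hitting the same situation, here is a minimal Lucene-level sketch
of what such an optimize amounts to (the index path is a placeholder, and in
Solr the optimize would normally be sent through the update handler rather
than done directly like this):

    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.index.IndexWriter;
    import org.apache.lucene.store.FSDirectory;

    public class OptimizeIndex {
        public static void main(String[] args) throws Exception {
            // Placeholder path; create=false keeps the existing index.
            FSDirectory dir = FSDirectory.getDirectory("/path/to/index");
            IndexWriter writer = new IndexWriter(dir, new StandardAnalyzer(), false);
            // optimize() merges all segments into one, after which the files
            // belonging to the old segments can be deleted.
            writer.optimize();
            writer.close();
            dir.close();
        }
    }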

If I understand Yonik's comments correctly, something changed in Lucene 2.3
so that hitting the max open file limit can no longer corrupt the index.  Is
that true?  In the meantime, would optimizing more often help avoid this?
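On that last question: short of optimizing more often, the usual knobs for
keeping the open-file count down are the merge factor and the compound file
format (in Solr these are set in solrconfig.xml as mergeFactor and
useCompoundFile).  A Lucene-level sketch, with a placeholder path and
illustrative values only:

    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.index.IndexWriter;
    import org.apache.lucene.store.FSDirectory;

    public class TuneOpenFiles {
        public static void main(String[] args) throws Exception {
            FSDirectory dir = FSDirectory.getDirectory("/path/to/index"); // placeholder path
            IndexWriter writer = new IndexWriter(dir, new StandardAnalyzer(), false);
            // A lower mergeFactor means fewer segments pile up between merges;
            // the compound file format packs each segment into a single .cfs file
            // instead of many per-segment files.
            writer.setMergeFactor(10);        // 10 is the Lucene default; lower it to reduce file counts
            writer.setUseCompoundFile(true);
            writer.close();
            dir.close();
        }
    }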

thanks
ryan