lucene-java-user mailing list archives

From Rahil <>
Subject Re: IOException Access Denied errors [ modified]
Date Wed, 24 May 2006 15:22:25 GMT
Hi Dan

Dan Armbrust wrote:

> The MySQL drivers are horrible at dealing with large result sets - 
> that article gives you the workaround to tell it to stream the results 
> back as they are needed (like it should in the first place), but I have 
> found that it isn't reliable - it tends to drop out at random points 
> during the query, so you will get a different number of rows each 
> time you rerun the query.  In MySQL, the only reliable way I have 
> found to get all of the results from a large table is to use the 
> "LIMIT" keyword in the query, ask for only X rows at a time (I usually 
> use 10,000, but use whatever works best with your system), and then 
> keep rerunning the query, incrementing the start position of the 
> "LIMIT" clause.  This issue also varies a lot from version to version 
> of the driver - some versions have been completely broken, and others 
> are only slightly broken.  Too bad we can't get Lucene-quality code 
> everywhere :)

The incremental query seems to work better. Thanks.
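For anyone following along, the incrementing-LIMIT loop Dan describes can be sketched roughly like this. The table and column names are hypothetical placeholders, and this only builds the paged query strings; in real code each one would be executed against the connection in turn:

```java
import java.util.ArrayList;
import java.util.List;

public class LimitPager {

    /**
     * Builds the sequence of paged queries, incrementing the start
     * position of MySQL's "LIMIT offset, rowCount" clause each time.
     */
    static List<String> pagedQueries(String baseQuery, long totalRows, int batchSize) {
        List<String> queries = new ArrayList<String>();
        for (long offset = 0; offset < totalRows; offset += batchSize) {
            queries.add(baseQuery + " LIMIT " + offset + ", " + batchSize);
        }
        return queries;
    }

    public static void main(String[] args) {
        // Hypothetical table "docs"; 25,000 rows in batches of 10,000
        // yields three queries at offsets 0, 10000, and 20000.
        for (String q : pagedQueries("SELECT id, body FROM docs", 25000L, 10000)) {
            System.out.println(q);
        }
    }
}
```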

> >> Exception in thread "main" Access is denied
> To me, that really seems like you have an issue with the location that 
> you are writing the index to.  I would make sure you have full write 
> permissions to the location, and make sure there aren't some old / 
> invalid files sitting in there.

Oh, I'm really quite tired of trying to resolve this "Access denied" 
issue. I've deleted and recreated the index directory umpteen times! 
Finally I created a brand-new directory in a brand-new location and ran 
my index program. Lucene indexed the results of the first 30000 queries 
(in increments of 10000) successfully, but then threw this age-old 
error yet again! Can you or anyone else make any sense of this? I 
surely can't!
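One quick sanity check along the lines Dan suggests: before opening the IndexWriter, verify programmatically that the target directory really is writable by actually creating a file in it. This is a minimal sketch using only java.io; the directory path is just an example:

```java
import java.io.File;
import java.io.IOException;

public class IndexDirCheck {

    /**
     * Returns true if the directory exists (or can be created),
     * is a directory, and a file can actually be created inside it.
     */
    static boolean isUsableIndexDir(File dir) throws IOException {
        if (!dir.exists() && !dir.mkdirs()) {
            return false;
        }
        if (!dir.isDirectory() || !dir.canWrite()) {
            return false;
        }
        // Prove we can really write there, not just that the flag says so.
        File probe = File.createTempFile("probe", ".tmp", dir);
        return probe.delete();
    }

    public static void main(String[] args) throws IOException {
        File dir = new File(System.getProperty("java.io.tmpdir"), "lucene-index-check");
        System.out.println(dir + " usable: " + isUsableIndexDir(dir));
    }
}
```

If this prints false for your index location, the "Access denied" error is coming from the filesystem, not from Lucene itself.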

Also, the indexing seems noticeably slow. With 1.4 GB of memory 
allocated at run time, indexing each batch of 10000 result sets takes 
approximately 20 seconds. With a database of a million records, the 
total indexing time will be ~35 minutes. Is that normal?
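The arithmetic behind that estimate checks out: a million records in batches of 10,000 is 100 batches, at ~20 seconds each that is ~2000 seconds, i.e. a bit over 33 minutes, consistent with the ~35 minute figure. A quick back-of-the-envelope calculation:

```java
public class IndexTimeEstimate {

    /** Estimated total seconds: number of batches times seconds per batch. */
    static double estimateSeconds(long totalDocs, int batchSize, double secondsPerBatch) {
        double batches = Math.ceil((double) totalDocs / batchSize);
        return batches * secondsPerBatch;
    }

    public static void main(String[] args) {
        // 1,000,000 docs, 10,000 per batch, ~20 s per batch
        double seconds = estimateSeconds(1000000L, 10000, 20.0);
        System.out.println(seconds / 60.0 + " minutes"); // about 33.3 minutes
    }
}
```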


> Dan
