couchdb-user mailing list archives

From Adam Kocoloski <>
Subject Re: Best practice for view updates across large data sets
Date Wed, 28 Oct 2009 02:53:53 GMT
On Oct 27, 2009, at 8:25 PM, Larry wrote:

> As I had expected, I'm starting to experience lengthy re-indexing
> times when changing/updating our views. We have just over 300K
> documents currently, and it will keep growing. One of our views
> takes about 20 minutes or so to index when installed. This locks up
> key aspects of our application, and we would like to find a way to
> keep the application continuously functional. I know that our view
> scripts can certainly be optimized, and that's something we're
> working on as our knowledge and experience with CouchDB grows.
> However, given where we are now, I was wondering if there is a
> "best practice" or any tips that users may have on updating views
> across large data sets.
> Thanks for the help!
> larry

Hi Larry, one trick you may find useful in 0.10 is to take advantage
of the fact that the view index files are identified by the hash of
their contents.  This means that you can have your _design/production
document and your _design/development document, and when you're
satisfied with the dev version of your app and you want to deploy it,
you can just update _design/production to be identical to
_design/development -- your production system will automatically use
the prebuilt indexes from _design/development with zero downtime.
You can even use HTTP COPY to do this if you like.  Cheers,
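
For what it's worth, the COPY step might look something like the
following sketch (the database name "mydb" and the host/port are
assumptions; adjust for your setup):

```shell
# Build the index on the dev design doc first, so production can
# reuse it.  Querying any view in the doc triggers the build.
curl http://localhost:5984/mydb/_design/development/_view/some_view?limit=0

# Look up the current revision of _design/production; CouchDB
# requires it when overwriting an existing document.
REV=$(curl -s -I http://localhost:5984/mydb/_design/production \
      | grep -i '^Etag:' | tr -d '"\r' | awk '{print $2}')

# COPY the dev doc over the production doc.  The Destination header
# names the target; the ?rev= qualifier permits the overwrite.
curl -X COPY http://localhost:5984/mydb/_design/development \
     -H "Destination: _design/production?rev=$REV"
```

Because the copied design doc's view definitions are byte-identical
to the development doc's, the index file hash matches and production
queries hit the already-built index immediately.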
