couchdb-user mailing list archives

From "Charles S. Koppelman-Milstein" <>
Subject Mass updates
Date Thu, 09 May 2013 03:24:37 GMT
I am trying to understand whether Couch is the way to go to meet some of
my organization's needs.  It seems pretty terrific.
The main concern I have is maintaining a consistent state across code
releases.  Presumably, our data model will change over the course of
time, and when it does, we need to make the several million old
documents conform to the new model.

Although I would love to pipe a view through an update handler and call
it a day, I don't believe that option exists.  The two ways I
understand to do this are:

1. Query all documents, update each doc client-side, and PUT those
changes in the _bulk_docs API (presumably this should be done in batches)
2. Query the ids for all docs, and one at a time, PUT them through an
update handler
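To make option 1 concrete, here is a minimal sketch of paging through _all_docs and POSTing changed documents back through _bulk_docs.  The database URL and the migrate() transform (which renames a hypothetical "fullname" field) are assumptions for illustration, not part of any real schema:

```python
import json
from urllib.request import Request, urlopen

# Hypothetical database URL; adjust host and db name for your setup.
COUCH = "http://localhost:5984/mydb"


def migrate(doc):
    """Transform one document to the new model.

    Example transform only: rename a hypothetical 'fullname' field.
    """
    if "fullname" in doc:
        doc["full_name"] = doc.pop("fullname")
    return doc


def bulk_update(batch_size=500):
    """Page through _all_docs and write changes back via _bulk_docs.

    Fetches batch_size + 1 rows per page and uses the extra row's id
    as the next startkey, the usual CouchDB pagination pattern.
    """
    startkey = None
    while True:
        url = f"{COUCH}/_all_docs?include_docs=true&limit={batch_size + 1}"
        if startkey is not None:
            url += f'&startkey="{startkey}"'
        rows = json.load(urlopen(url))["rows"]

        docs = [migrate(row["doc"]) for row in rows[:batch_size]]
        body = json.dumps({"docs": docs}).encode()
        req = Request(f"{COUCH}/_bulk_docs", data=body,
                      headers={"Content-Type": "application/json"},
                      method="POST")
        urlopen(req)  # each doc's _rev must be current or Couch reports a conflict

        if len(rows) <= batch_size:
            break  # no extra row fetched, so this was the last page
        startkey = rows[batch_size]["id"]
```

Since the docs carry their current _rev from _all_docs, the _bulk_docs response should be checked for per-document conflicts if anything else writes to the database during the migration.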

Are these options reasonably performant?  If we have to do a mass update
once per deployment, it's not terrible if it isn't lightning-fast, but it
shouldn't take terribly long.  Also, I have read that update handlers
have indexes built against them.  If this is a fire-once option, is that
a concern?

Which option is better?  Is there an even better way?

