couchdb-user mailing list archives

From Jim Klo <>
Subject Re: Mass updates
Date Thu, 09 May 2013 20:28:36 GMT
Could you not use a VDU function that fixes the structure in a target DB, and then replicate
from the old DB to the new one?
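For reference, a validate_doc_update (VDU) function can only accept or reject documents as they arrive, so during replication it acts as a gatekeeper on the target DB rather than rewriting docs itself. A minimal sketch, assuming a hypothetical schema_version field marking the new model:

```javascript
// Sketch of a VDU function for the target DB's design document.
// The "schema_version" field is a hypothetical marker for the new model.
// A VDU can only accept or reject: here it refuses any replicated doc
// that does not already conform to the new model.
function validate_doc_update(newDoc, oldDoc, userCtx) {
  if (newDoc._deleted) {
    return; // always allow deletions through
  }
  if (newDoc.schema_version !== 2) {
    throw({ forbidden: "document does not conform to schema version 2" });
  }
}
```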

- JK

Sent from my iPhone

On May 8, 2013, at 8:25 PM, "Charles S. Koppelman-Milstein" <> wrote:

> I am trying to understand whether Couch is the way to go to meet some of
> my organization's needs.  It seems pretty terrific.
> The main concern I have is maintaining a consistent state across code
> releases.  Presumably, our data model will change over the course of
> time, and when it does, we need to make the several million old
> documents conform to the new model.
> Although I would love to pipe a view through an update handler and call
> it a day, I don't believe that option exists.  The two ways I
> understand to do this are:
> 1. Query all documents, update each doc client-side, and PUT those
> changes in the _bulk_docs API (presumably this should be done in batches)
> 2. Query the ids for all docs, and one at a time, PUT them through an
> update handler
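An update handler for option 2 might look like the sketch below. The handler name, field changes, and schema_version marker are all hypothetical; in a design doc the function body would be stored as a string and invoked per document at a path like _design/migrate/_update/to_v2/<docid>.

```javascript
// Sketch of an update handler that migrates one document per request.
// Named here so it can be exercised directly; in CouchDB it lives
// inside a design document's "updates" object.
var migrate_to_v2 = function (doc, req) {
  if (!doc) {
    return [null, "missing"];          // no doc with that id
  }
  if (doc.schema_version === 2) {
    return [null, "already migrated"]; // null doc means nothing is written
  }
  // hypothetical migration: rename a field and stamp the new version
  doc.full_name = doc.name;
  delete doc.name;
  doc.schema_version = 2;
  return [doc, "migrated"];            // CouchDB writes doc, returns body
};
```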
> Are these options reasonably performant?  If we have to do a mass update
> once per deployment, it needn't be lightning-fast, but it shouldn't take
> terribly long.  Also, I have read that update handlers have indexes built
> against them.  If this is a fire-once option, is that worthwhile?
> Which option is better?  Is there an even better way?
> Thanks,
> Charles
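Option 1 from the quoted message can be sketched as follows: page through _all_docs with include_docs=true, run a pure client-side transform over each doc, and write each batch back through _bulk_docs. The database URL, batch size, and the fields migrateDoc touches are all assumptions for illustration; fetch is the global available in Node 18+.

```javascript
// Sketch of option 1: batched client-side migration via _bulk_docs.
// DB_URL and the field rename in migrateDoc are hypothetical.
const DB_URL = "http://localhost:5984/mydb";

// Pure transform from the old model to the new one; copies the input
// so the caller's object is never mutated. _id and _rev carry over,
// which _bulk_docs needs to update the existing revisions.
function migrateDoc(doc) {
  const migrated = Object.assign({}, doc);
  migrated.full_name = migrated.name;
  delete migrated.name;
  migrated.schema_version = 2;
  return migrated;
}

async function migrateAll(batchSize = 500) {
  let startkey = null;
  for (;;) {
    // fetch one extra row so its key can seed the next page
    const params = new URLSearchParams({
      include_docs: "true",
      limit: String(batchSize + 1),
    });
    if (startkey !== null) params.set("startkey", JSON.stringify(startkey));
    const page = await (await fetch(`${DB_URL}/_all_docs?${params}`)).json();

    const docs = page.rows
      .slice(0, batchSize)
      .map((row) => row.doc)
      .filter((doc) => doc.schema_version !== 2)
      .map(migrateDoc);

    if (docs.length > 0) {
      await fetch(`${DB_URL}/_bulk_docs`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ docs }),
      });
    }

    if (page.rows.length <= batchSize) break; // last page reached
    startkey = page.rows[batchSize].key;      // next page starts here
  }
}
```

The startkey/limit+1 pagination avoids the cost of skip on large databases, and keeping the transform pure makes it easy to test without a running server.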
