couchdb-user mailing list archives

From Ask Bjørn Hansen <...@apache.org>
Subject Re: Largest CouchDB dbs?
Date Mon, 03 Nov 2008 09:23:18 GMT

On Nov 2, 2008, at 21:19, Paul Davis wrote:

> Also on second reading of the email, make sure to leverage loading
> documents via _bulk_docs. When I load huge numbers of documents I tend
> to try and load a couple thousand at a time. If an update fails I fall
> back to a binary search to find the offending record. There's an open
> request for identifying the offending records.
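
(For anyone skimming the archive, the pattern Paul describes looks roughly like the following. This is only a minimal sketch, not his actual code: the database URL, batch size, and use of Python's requests library are my own assumptions.)

import requests

COUCH = "http://localhost:5984/mydb"   # hypothetical database URL
BATCH_SIZE = 2000                      # "a couple thousand at a time"

def bulk_post(docs):
    """POST a list of documents to _bulk_docs and return the HTTP response."""
    return requests.post(COUCH + "/_bulk_docs", json={"docs": docs})

def find_offenders(docs):
    """Binary-search a failing batch to isolate the document(s) it rejects."""
    if len(docs) == 1:
        return docs                    # a single doc that still fails is the offender
    mid = len(docs) // 2
    offenders = []
    for half in (docs[:mid], docs[mid:]):
        if bulk_post(half).status_code >= 400:
            offenders.extend(find_offenders(half))
    return offenders

def load(all_docs):
    """Load documents in batches; fall back to a binary search when a batch fails."""
    for i in range(0, len(all_docs), BATCH_SIZE):
        batch = all_docs[i:i + BATCH_SIZE]
        if bulk_post(batch).status_code >= 400:
            print("batch failed, offending docs:", find_offenders(batch))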

Yes, I do that. I'm down to 400 documents per request though; with larger batches I was getting failures (500 errors) far too often. I don't believe it's bad data, because submitting the same batch again and again will eventually work (at least that's what happens on the occasional errors I see when submitting just hundreds of documents).
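
What I mean by resubmitting is roughly the shape below. Again, just a sketch and not my actual loader; the retry count and back-off are made up, and the database URL is a placeholder.

import time
import requests

COUCH = "http://localhost:5984/mydb"   # hypothetical database URL

def post_with_retries(docs, attempts=5):
    """Resubmit a failing _bulk_docs batch a few times; it usually goes through eventually."""
    for attempt in range(attempts):
        resp = requests.post(COUCH + "/_bulk_docs", json={"docs": docs})
        if resp.status_code < 400:
            return resp                # succeeded, possibly after several retries
        time.sleep(2 ** attempt)       # back off a bit before resubmitting
    raise RuntimeError("batch still failing after %d attempts" % attempts)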

I wrote about this a few days ago in a mail with the subject "_bulk_docs limit".


  - ask
