incubator-couchdb-user mailing list archives

From Dave Cottlehuber <d...@muse.net.nz>
Subject Re: limit on number of docs updated via _bulk_docs?
Date Tue, 05 Jun 2012 07:50:21 GMT
On 5 June 2012 05:07, Tim Tisdall <tisdall@gmail.com> wrote:
> Hopefully someone can give me an idea on this problem because I think
> I've about exhausted ideas.
>
> I'm doing a series of document updates fairly rapidly.  I was doing
> the updates via PUT and was having no problems except for the DB file
> size growing way too fast.  I changed things to update the database in
> batches using _bulk_docs.  Now I seem to have a problem with
> connections to couchdb timing out after about 11000 doc updates.  I've
> tried batch sizes from 5 to 500 docs, but each time the program dies
> with a connection timeout after about the same total number of doc
> updates.
>
> I thought it might be a problem with my code (it's in PHP, and that's
> usually the problem ;) ), but I tried something that I think rules
> that out.  I ran the script and stopped it after about 5000 updates,
> then manually restarted it right away.  The second time, the script
> died after about 6000 updates.  So, still around 11000 updates in
> total across 2 different processes.
>
> Any thoughts, guesses, things to try, or things to test?
>
> -Tim
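
The batching flow described above can be sketched as follows. This is a
minimal Python illustration, not Tim's actual PHP code; the database URL,
batch size, and document contents are placeholder assumptions.

```python
# Sketch of batched updates via POST /db/_bulk_docs (stdlib only).
import json
from urllib import request

COUCH = "http://localhost:5984/mydb"  # assumed database URL

def chunked(docs, size):
    """Yield successive batches of at most `size` docs."""
    for i in range(0, len(docs), size):
        yield docs[i:i + size]

def bulk_update(docs, batch_size=100):
    """POST each batch to _bulk_docs and yield the per-doc results."""
    for batch in chunked(docs, batch_size):
        payload = json.dumps({"docs": batch}).encode()
        req = request.Request(
            COUCH + "/_bulk_docs",
            data=payload,
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with request.urlopen(req) as resp:
            # Each row reports success or a per-doc error/conflict.
            yield json.load(resp)
```

Whatever the client looks like, the symptom (timeouts after a roughly
fixed total count regardless of batch size) is what the debug log below
should help pin down.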

I'd start off by setting the couch to [log] level = debug and seeing what
comes out. Feel free to put a section of the log up on friendpaste.com or
similar for comment.
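
For reference, that setting lives in local.ini (CouchDB 1.x layout;
restart or use the _config API afterwards):

```ini
[log]
level = debug
```

On 1.x it can also be changed at runtime via the config API, along the
lines of `curl -X PUT http://localhost:5984/_config/log/level -d '"debug"'`
(host and port assumed to be the defaults).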

A+
Dave
