couchdb-user mailing list archives

From Benoit Chesneau <>
Subject Re: How many filtered replications is too many?
Date Fri, 06 Jul 2012 09:58:13 GMT
On Fri, Jul 6, 2012 at 3:37 AM, Andreas Kemkes <> wrote:
> I'm trying to split up a monolithic database into smaller ones using filtered continuous replications in couchdb 1.2.
> I need about 200 of these replications (on a single server) and would like to parallelize as much as possible.  Yet, when I do, the cpu load gets very high and the system seems to be crawling, replication seems to be slow, and I'm seeing timeout and other errors.
> How can I best determine what the bottleneck is?
> Are there suggestions on how to configure couchdb to handle it better (I've increased max_dbs_open to 200)?
> How do I best achieve good throughput?
> This will be a one-time task, so any large measurement / monitoring effort is probably
> Any suggestions are much appreciated (including suggestions for different approaches).
> Thanks,
> Andreas
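
For context, a filtered replication of the kind described here pairs a JavaScript filter function in a design document with a replication document posted to `_replicate`. A minimal sketch, assuming a made-up design document `_design/split` and a `type` field to split on (none of these names are from the original message):

```
// Design document holding the filter (illustrative names):
{
  "_id": "_design/split",
  "filters": {
    "by_type": "function(doc, req) { return doc.type === req.query.type; }"
  }
}

// Replication document, one per target database; CouchDB runs the
// filter function above against every changed document:
{
  "source": "monolithic_db",
  "target": "orders_db",
  "continuous": true,
  "filter": "split/by_type",
  "query_params": { "type": "order" }
}
```

With ~200 such replications, each change in the source is pushed through 200 separate JavaScript filter invocations, which is where the CPU load comes from.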

This is exactly why the view changes feed exists in rcouch.
With the view changes feed, and a view-based filter on top of it,
replication consults an index instead of running a filter function
against every document.
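
For reference, stock CouchDB 1.2 also lets a view act as a changes filter via `filter=_view` (reusing the hypothetical `split` design document from above; note that unlike rcouch's view changes feed, this still evaluates the map function per document rather than reading the index):

```
GET /monolithic_db/_changes?filter=_view&view=split/by_type

// or in a replication document:
{
  "source": "monolithic_db",
  "target": "orders_db",
  "continuous": true,
  "filter": "_view",
  "query_params": { "view": "split/by_type" }
}
```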

- benoît
