couchdb-user mailing list archives

From Andreas Kemkes <a5s...@yahoo.com>
Subject How many filtered replications is too many?
Date Fri, 06 Jul 2012 01:37:47 GMT
I'm trying to split a monolithic database into smaller ones using filtered continuous replications
in CouchDB 1.2.
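
For reference, I start each replication by writing a document into the _replicator
database, roughly like the sketch below. The database names and the "split/by_type"
filter are illustrative, not my real ones, and it assumes a local server with no auth
(admin party):

    import json
    import requests

    COUCH = "http://localhost:5984"

    def start_replication(source, target, filter_name, query_params):
        # Persistent replication: documents in _replicator survive restarts.
        doc = {
            "source": source,
            "target": target,
            "filter": filter_name,          # e.g. "split/by_type" (hypothetical)
            "query_params": query_params,   # available to the filter as req.query
            "continuous": True,
            "create_target": True,
        }
        resp = requests.put("%s/_replicator/split-%s" % (COUCH, target),
                            data=json.dumps(doc))
        resp.raise_for_status()

    # Example: route documents matching type=orders into their own database.
    start_replication("monolith", "monolith_orders", "split/by_type",
                      {"type": "orders"})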

I need about 200 of these replications (all on a single server) and would like to parallelize
as much as possible.  Yet when I do, the CPU load gets very high, the system crawls, replication
is slow, and I'm seeing timeouts and other errors.

How can I best determine what the bottleneck is?
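
The only lightweight check I've found so far is polling _active_tasks, which in 1.2
reports one entry per running replication (field names are taken from the 1.2 docs;
I'm not sure how meaningful "progress" is for continuous filtered replications):

    import requests

    # Poll _active_tasks and print per-replication progress.
    for task in requests.get("http://localhost:5984/_active_tasks").json():
        if task.get("type") == "replication":
            print(task.get("replication_id"), task.get("progress"))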

Are there suggestions on how to configure CouchDB to handle this better (I've already increased
max_dbs_open to 200)?
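
For context, these are the knobs I've been experimenting with, set through the _config
API; the values are guesses to tune, not recommendations:

    import requests

    def set_config(section, key, value):
        # CouchDB's _config API takes the new value as a JSON string.
        resp = requests.put("http://localhost:5984/_config/%s/%s" % (section, key),
                            data='"%s"' % value)
        resp.raise_for_status()

    set_config("replicator", "worker_processes", "1")   # default 4; fewer workers per job
    set_config("replicator", "http_connections", "5")   # default 20; fewer sockets per job
    set_config("couchdb", "max_dbs_open", "500")        # headroom above the ~200 targets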

How do I best achieve good throughput?
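
One different approach I've been wondering about, since this is one-time: run the
replications as one-shot jobs (continuous: false) in small waves, so only a handful
compete for CPU at a time. A sketch, where BATCH and the document layout are
assumptions to tune:

    import json
    import time
    import requests

    COUCH = "http://localhost:5984"
    BATCH = 20  # hypothetical wave size; tune to what the box can sustain

    def active_replications():
        tasks = requests.get(COUCH + "/_active_tasks").json()
        return [t for t in tasks if t.get("type") == "replication"]

    def run_in_waves(repl_docs):
        # repl_docs: list of _replicator documents, each carrying an "_id",
        # "source", "target", "filter", and "query_params".
        pending = list(repl_docs)
        while pending or active_replications():
            while pending and len(active_replications()) < BATCH:
                doc = pending.pop(0)
                doc["continuous"] = False  # one-shot: finishes and frees its slot
                requests.put("%s/_replicator/%s" % (COUCH, doc["_id"]),
                             data=json.dumps(doc)).raise_for_status()
            time.sleep(10)  # let the current wave make progress before re-checking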

This will be a one-time task, so any elaborate measurement or monitoring effort is probably overkill.

Any suggestions are much appreciated (including suggestions for different approaches).

Thanks,

Andreas