couchdb-user mailing list archives

From Brian Candler <>
Subject Re: Suggestions on optimizing document look up
Date Wed, 01 Apr 2009 21:31:21 GMT
On Wed, Apr 01, 2009 at 12:34:44PM -0700, Manjunath Somashekhar wrote:
> As of now what we are doing is a simple look up like:
> def getDocById(self, id):
>      return self.db[id]
> For doing a million lookups like this it takes about 50-60 mins on my
> laptop. Is there a better way of doing the same?

This is a million separate HTTP requests? Are you doing them sequentially,
or doing multiple requests in parallel? You could try batching them. That
is, by POSTing to _all_docs with "keys"=>["k1","k2","k3",...] you can get a
bunch of documents at once.
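As a rough, untested sketch of what that batching could look like (this assumes
the Python "requests" library and plain HTTP rather than couchdb-python; the
database URL and batch size below are placeholders, not anything from your setup):

    import json
    import requests

    COUCH = "http://localhost:5984/mydb"   # placeholder server/database
    BATCH = 1000                            # keys per POST, tune to taste

    def get_docs_by_ids(ids):
        docs = {}
        for i in range(0, len(ids), BATCH):
            chunk = ids[i:i + BATCH]
            # POST the keys to _all_docs; include_docs=true makes CouchDB
            # return the full document bodies, not just id/rev pairs.
            resp = requests.post(
                COUCH + "/_all_docs",
                params={"include_docs": "true"},
                data=json.dumps({"keys": chunk}),
                headers={"Content-Type": "application/json"},
            )
            for row in resp.json()["rows"]:
                if "doc" in row:
                    docs[row["key"]] = row["doc"]
        return docs

That turns a million GETs into a thousand POSTs, which should help a lot if the
per-request overhead is what's killing you.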

A million separate requests in 60 minutes is about 277 requests per second, which
is fairly respectable, especially if they are jumping all over the database
(which means a head seek for each one, if the blocks aren't already in cache).
