incubator-couchdb-user mailing list archives

From Will Heger <>
Subject Re: growable arrays in reductions
Date Thu, 06 Dec 2012 23:08:02 GMT
In the end, I could write a list function, but I'd do so at the cost
of caching and incremental updates.  For example, I have a grocery cart
that is described by a series of transactions: items added, items
removed.  If I wanted to keep a total bill, taxes, and item count in a
reduction, that would be a pretty canonical reduction along the lines
of the Event Sourcing design pattern.
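To make the scenario concrete, here is a rough sketch of what I mean by a
canonical reduce over the cart transactions.  The field names (cart_id,
type, price, tax) and the use of integer cents are my own assumptions, and
the tiny emit() harness is just there so the view functions can run outside
CouchDB:

```javascript
// Minimal harness so the view functions run outside CouchDB (assumption:
// real CouchDB supplies emit() itself).
var emitted = [];
function emit(key, value) { emitted.push({ key: key, value: value }); }

// Map: one partial summary per transaction document.
// Amounts are in integer cents to keep the arithmetic exact.
function map(doc) {
  var sign = (doc.type === "remove") ? -1 : 1;
  emit(doc.cart_id, { bill: sign * doc.price,
                      tax:  sign * doc.tax,
                      count: sign });
}

// Reduce: sum the partial summaries.  Because the output has the same
// shape as each input value, the same body handles rereduce.
function reduce(keys, values, rereduce) {
  var acc = { bill: 0, tax: 0, count: 0 };
  for (var i = 0; i < values.length; i++) {
    acc.bill  += values[i].bill;
    acc.tax   += values[i].tax;
    acc.count += values[i].count;
  }
  return acc;
}

// Simulate two adds and one removal in cart "c1".
[ { cart_id: "c1", type: "add",    price: 400, tax: 40 },
  { cart_id: "c1", type: "add",    price: 600, tax: 60 },
  { cart_id: "c1", type: "remove", price: 400, tax: 40 }
].forEach(map);

var summary = reduce(null, emitted.map(function (e) { return e.value; }), false);
// summary: { bill: 600, tax: 60, count: 1 }, fixed size regardless of row count
```

The point is that the reduce value here stays a fixed-size object no matter
how many transactions it covers, which is exactly what the documentation's
guideline seems to want.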

My question is whether appending a list of the underlying transaction
ids would create a problem.

"As a rule of thumb, the data returned by reduce functions should
remain "smallish" and not grow faster than log(num_rows_processed)."

I'm not totally clear on how to parse this statement.  Is the size
related to the size of the mapped input documents?  And is it measured
collectively or individually?
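One way I can read the guideline (my interpretation, not from the docs) is
that it constrains the size of each stored reduce value, measured
individually, as a function of how many rows that value covers.  A quick
side-by-side of the two shapes I'm weighing:

```javascript
// A scaling reduce: output is a single number, constant size no matter
// how many rows were reduced.
function reduceSum(keys, values, rereduce) {
  var total = 0;
  for (var i = 0; i < values.length; i++) total += values[i];
  return total;
}

// The pattern my question is about: accumulating the underlying ids.
// The output grows linearly with num_rows_processed, much faster than
// the log(num_rows_processed) the guideline allows.
function reduceIds(keys, values, rereduce) {
  var ids = [];
  for (var i = 0; i < values.length; i++) {
    ids = ids.concat(rereduce ? values[i] : [values[i]]);
  }
  return ids;
}

// With 500 transactions, reduceSum stays a few bytes while reduceIds
// carries all 500 ids.
var rows = [];
for (var n = 0; n < 500; n++) rows.push("tx" + n);

var sum = reduceSum(null, rows.map(function () { return 1; }), false);
var ids = reduceIds(null, rows, false);
```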

Having the underlying ids would allow me to "close the loop" from a
transactions standpoint.  For example, say I have ten different clients
contributing to this one cart.  Any particular client can then
instantly recognize whether her contribution is factored into the
summary by scanning for her id within the transaction list.
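The client-side check itself is trivial; assuming a reduce value shaped like
the hypothetical one below (the ids field and transaction id names are made
up for illustration), it's just a membership test:

```javascript
// Hypothetical reduce value carrying the contributing transaction ids.
var reduceValue = { bill: 600, ids: ["tx-alice-1", "tx-bob-7", "tx-carol-3"] };

// Has this client's transaction been folded into the summary yet?
function isIncluded(value, txId) {
  return value.ids.indexOf(txId) !== -1;
}

var seen = isIncluded(reduceValue, "tx-carol-3");  // true: it's in the list
```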

There are other methods for achieving this, but if it is not going to
cause a problem, it is presently the most elegant approach for my
application.  Beyond that, I'm just interested in how much growth is
allowable.

"From 0.10 onwards, CouchDB uses a heuristic to detect reduce
functions that won't scale to give the developer an early warning"

So far Couch has not complained to me about any of the reductions I've
written, but I still feel like I'm flying a bit blind.
