incubator-couchdb-user mailing list archives

From Robert Newson <rnew...@apache.org>
Subject Re: Why do couchdb reduce functions have to be commutative
Date Mon, 02 Dec 2013 14:10:21 GMT
Because the order in which we pass keys and values to the reduce function
is not defined. In sharded situations (like BigCouch, which is being
merged into CouchDB), each node generates an intermediate reduce value
over an effectively random subset of the keys/values, and a final
rereduce is then run over all the intermediates. The constraints on
reduce functions exist in anticipation of clustering.
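To make that concrete, here is a small sketch (plain Node.js, not CouchDB internals) of the two-pass scheme described above, using a standard sum reduce written in the usual CouchDB `reduce(keys, values, rereduce)` shape. The shard contents are hypothetical illustrative data:

```javascript
// A typical CouchDB-style reduce function: sums values. Because addition
// is commutative and associative, it gives the same answer no matter how
// the inputs are partitioned or ordered.
function reduce(keys, values, rereduce) {
  // Summing works identically on raw values (rereduce === false) and on
  // intermediate sums (rereduce === true).
  return values.reduce(function (a, b) { return a + b; }, 0);
}

// Simulate a 3-shard cluster: each shard holds an effectively random
// subset of the emitted values (hypothetical data).
var shards = [[5, 1], [2, 8, 3], [4]];

// Pass 1: each node computes an intermediate reduce over its own subset.
var intermediates = shards.map(function (subset) {
  return reduce(null, subset, false);
});

// Pass 2: a final rereduce over the intermediates, which may arrive in
// any order from the nodes.
var total = reduce(null, intermediates, true);
// total is 23, however the values were partitioned or ordered.
```

A non-commutative reduce (say, one that concatenates values assuming sorted key order) would return different results depending on how the cluster happened to split the rows, which is why the constraint applies even before you run a cluster.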

B.


On 1 December 2013 21:45, Oliver Dain <opublic@dains.org> wrote:
> Hey CouchDB users,
>
> I've just started messing around with CouchDB and I understand why CouchDB
> reduce functions need to be associative, but I don't understand why they
> also have to be commutative. I posted a much more detailed version of this
> question to StackOverflow yesterday, but haven't gotten an answer yet (my
> SO experience says that means I probably won't ever get one). Figured it
> might be smart to explicitly loop in the couch community.
>
> The original StackOverflow question is here:
>
> http://stackoverflow.com/questions/20303355/why-do-couchdb-reduce-functions-have-to-be-commutative
>
> Any thoughts would be appreciated!
>
> Thanks,
> Oliver
