couchdb-user mailing list archives

From Ross Bates <>
Subject Array being chunked in reduce
Date Wed, 29 Apr 2009 19:10:51 GMT
Hi All - I have about 600 documents, each with 3 attributes:

parent, child, level

When I run the following map/reduce everything looks correct at first: I get an
array back where the key is the level and the value is a collection of arrays
that share the same level value. So if there were 5 unique levels there would
be 5 rows.

function(doc) {
    emit(parseInt(doc.level), [doc.child, doc.parent]);
}

function(keys, values) {
    var b = values;
    return b;
}
The problem is that when I run this on a larger data set the value array
starts getting chunked into arbitrary sizes (50, 83, etc.). When I say
chunked, I mean the row has a bunch of additional nested arrays to store the
values.

Is there some sort of max array size imposed by CouchDB?
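For illustration, here is a plain-Node sketch (not CouchDB itself; the batch
size of 3 and the `simulate` helper are assumptions made for the example) of
how a reduce that simply returns its values can come back nested once partial
results are re-reduced in batches, and how flattening on the `rereduce` pass
avoids that:

```javascript
// A reduce that just collects its values, ignoring the rereduce flag.
function naiveReduce(keys, values, rereduce) {
  return values;
}

// A reduce that flattens on the rereduce pass, so batching does not
// produce nested arrays.
function flatteningReduce(keys, values, rereduce) {
  if (rereduce) {
    return values.reduce(function (acc, v) { return acc.concat(v); }, []);
  }
  return values;
}

// Simulate reduce being called on batches of values, followed by a
// single rereduce pass over the partial results.
function simulate(reduceFn, values, batchSize) {
  var partials = [];
  for (var i = 0; i < values.length; i += batchSize) {
    partials.push(reduceFn(null, values.slice(i, i + batchSize), false));
  }
  return reduceFn(null, partials, true);
}

var rows = [["c1", "p1"], ["c2", "p2"], ["c3", "p3"],
            ["c4", "p4"], ["c5", "p5"]];

// naive: an array of batch arrays, i.e. the "chunking" described above.
var naive = simulate(naiveReduce, rows, 3);

// flat: a single flat list of [child, parent] pairs.
var flat = simulate(flatteningReduce, rows, 3);
```

With the naive reduce the result contains one sub-array per batch, while the
flattening reduce returns one flat list regardless of how the values were
chunked.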

Many Thanks,
