couchdb-user mailing list archives

From Paul Davis <paul.joseph.da...@gmail.com>
Subject Re: Array being chunked in reduce
Date Wed, 29 Apr 2009 19:28:32 GMT
Ross Bates wrote:
> Hi All - I have about 600 documents with 3 attributes:
>
> parent, child, level
>
> When I run the following map/reduce, everything looks correct at
> first: I get an array back where the key is the level and the value
> is a collection of arrays that share the same level value. So if
> there were 5 unique levels there would be 5 rows.
>
> function(doc) {
>     emit( parseInt(doc.level), [doc.child,doc.parent]);
> };
>
>
> function(a,b){
> return b;
> }
>
>
> The problem is that when I run this on a larger data set the value
> array starts getting chunked into arbitrary sizes (50, 83, etc.).
> When I say chunked, the row has a bunch of additional nested arrays
> to store the key/value.
>
> Is there some sort of max array size imposed by CouchDB?
>
> Many Thanks,
> Ross
>
Ross,

Firstly, the chunked bits you're seeing are CouchDB running rereduce on
your reduce output. The full signature of a reduce function is:

    function(keys, values, rereduce)

where rereduce is a boolean that flags whether the function is being
called on its own output.
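
For example (a minimal sketch, not specific to your data), a reduce
that counts rows has to treat the two passes differently, because on
the rereduce pass the values are the outputs of earlier reduce calls
rather than the values emitted by the map:

    function(keys, values, rereduce) {
        if (rereduce) {
            // values are counts returned by earlier reduce calls
            return sum(values);
        } else {
            // values are the [child, parent] pairs from the map
            return values.length;
        }
    }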

Also, the function as you have it would eventually tip over, as it's
returning too much data. For your particular use case I would suggest
omitting the reduce function entirely and then doing a selection on
the map view plus any post-processing client side.
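
For instance, you could fetch the map view's rows (optionally
restricted with ?key=N) and group them yourself. A minimal sketch,
with made-up view names and assuming rows in the shape your map emits:

    // Hypothetical client-side grouping of rows returned by e.g.
    // GET /db/_design/app/_view/by_level (names are placeholders).
    function groupByLevel(rows) {
        var byLevel = {};
        for (var i = 0; i < rows.length; i++) {
            var level = rows[i].key;
            if (!byLevel[level]) byLevel[level] = [];
            byLevel[level].push(rows[i].value); // [child, parent] pair
        }
        return byLevel;
    }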

HTH,
Paul Davis
