couchdb-dev mailing list archives

From Paul Carey <>
Subject Re: reduce_limit error
Date Sun, 16 Aug 2009 15:49:14 GMT
On Tue, May 5, 2009 at 8:50 PM, Brian Candler<> wrote:
> On Mon, May 04, 2009 at 03:08:38PM -0700, Chris Anderson wrote:
>> I'm checking in a patch that should cut down on the number of mailing
>> list questions asking why a particular reduce function is hella slow.
>> Essentially the patch throws an error if the reduce function's return
>> value is more than half the size of the values array that was passed
>> in. (The check is skipped if the return value is below a fixed size,
>> 200 bytes for now.)
> I think that 200 byte limit is too low, as I have now had to turn off the
> reduce_limit on my server for this:
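The heuristic quoted above can be sketched as follows. This is a hypothetical JavaScript rendering for illustration only (the actual check is performed inside CouchDB's query server, and the function name here is made up): the serialized reduce output must be no more than half the size of the serialized input, unless the output is already small.

```javascript
// Hypothetical sketch of the reduce_limit heuristic, not CouchDB's
// actual implementation. Both arguments are JSON-serialized strings.
function checkReduceLimit(inputJson, outputJson) {
  var SKIP_BELOW = 200; // bytes; outputs smaller than this are exempt
  if (outputJson.length < SKIP_BELOW) {
    return;
  }
  // Error when the output has not shrunk to at most half the input size.
  if (outputJson.length * 2 > inputJson.length) {
    throw new Error("reduce_overflow_error: reduce output must shrink more rapidly");
  }
}
```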

I'm using a reduce function to sort data so that clients can query
for the most recent piece of data. For example:

function most_recent_reading_map(doc) {
  if (doc.type === "TemperatureReading") {
    emit(doc.station_id, doc);
  }
}

function most_recent_reading_reduce(keys, values) {
  var sorted = values.sort(function (a, b) {
    return b.created_at.localeCompare(a.created_at);
  });
  return sorted[0];
}
The main reason I might want to do this is to simplify client logic,
but another valid reason is to avoid sending and processing
unnecessarily large chunks of JSON.

This kind of reduce function may fall foul of the
reduce_overflow_error, but only if the document is greater than 200
bytes. So, I'm echoing the opinion that 200 bytes is too low. I also
believe that throwing an exception is a bit draconian as it could
result in an unjustified failure in production. I think a warning
would be more appropriate.

