couchdb-dev mailing list archives

From Paul Carey <>
Subject Re: reduce_limit error
Date Tue, 18 Aug 2009 08:00:13 GMT
>> function most_recent_reading_map(doc) {
>>  if(doc.type === "TemperatureReading") {
>>    emit(doc.station_id, doc);
>>  }
>> }
>> function most_recent_reading_reduce(keys, values) {
>>  var sorted = values.sort(function (a,b) {
>>    return b.created_at.localeCompare(a.created_at);
>>  });
>>  return sorted[0];
>> }
> you should never accumulate a list in a reduce function...
> if you want to create a compressed final JSON output, the thing to do
> would be to run a list function on a group reduce query, and have it
> make the final aggregate. that way you don't end up with an infinitely
> long overflowing list in your reduce values.

Given that the reduce function returns a single value, I don't
understand why you consider it to be accumulating a list. I see it as
being roughly equivalent to returning a very large scalar.

Since the argument against letting a reduce function's output grow too
fast is based on degraded performance on large datasets, I ran a test
case with 1 million docs. The query returns in about 0.03s, which is
significantly faster than a group_level based query against a dataset
of similar size.
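For what it's worth, here is a sketch of the kind of reduce I have in
mind, trimmed so the reduced value stays small. The `temperature` field
name is assumed for illustration; the rereduce branch is identical
because the intermediate values are already single readings.

```javascript
// Hypothetical sketch: reduce to the single newest reading per key,
// returning only the fields needed so the reduced value stays bounded.
function most_recent_reading_reduce(keys, values, rereduce) {
  // Sort newest-first by ISO 8601 timestamp; string comparison suffices.
  var sorted = values.sort(function (a, b) {
    return b.created_at.localeCompare(a.created_at);
  });
  var newest = sorted[0];
  // Return a trimmed value rather than the whole doc.
  return { created_at: newest.created_at, temperature: newest.temperature };
}
```

The output is a single small object regardless of how many readings a
station has, which is why I don't see it as accumulating a list.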
