couchdb-user mailing list archives

From "Ryszard Szopa" <>
Subject strange behavior when updating views in database with large amounts of data
Date Sun, 22 Jun 2008 02:15:11 GMT

I have a database with 20 thousand moderately large documents (some
text, some numbers, nothing really special). I create an extremely
simple design document:

{"views": {"all": {"map": "function(doc) { log(doc._id); emit(null, null); }"}}}
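For reference, the way I upload it looks roughly like this (a Python sketch; the view name "all" and the database name "testing1" are filled in from the .view file path below, and the request is only built here, not sent):

```python
import json
import urllib.request

# The design document, as reconstructed above; "all" stands in for the
# actual view name.
design_doc = {
    "views": {
        "all": {"map": "function(doc) { log(doc._id); emit(null, null); }"}
    }
}

# Build (but do not send) the PUT request that creates the design document.
# The URL assumes CouchDB's default port and the database name "testing1".
req = urllib.request.Request(
    "http://localhost:5984/testing1/_design/testing",
    data=json.dumps(design_doc).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="PUT",
)
print(req.get_method(), req.full_url)
```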

When I query this view, beam starts working, using a lot of CPU and just a
little memory. The view server doesn't seem to be active at all: you don't
see any effects of calling log in the map function, and the file
/usr/local/var/lib/couchdb/.testing1_design/testing.view stays at 4 KB. This
lasts a very long time (more than 2 hours; in fact I had lost hope that it
would ever finish), and after that time, BANG! the data starts being sent to
the view server, and after a very short time the view becomes accessible. (I
was looking at the log file with tail in order to paste some log messages
into this e-mail when it happened, which is why I noticed.)

Is this normal? What on earth had CouchDB been doing during the two hours
before it sent anything to the view server?

When I inserted 200 documents from a similar dataset at once and then
updated the view, it took some time, but it was a lot faster (even though I
was fetching the data from the Internet, which probably introduced
significant overhead). I also tried with 30 thousand trivially small
documents and didn't see such a lag there either.
The lag is perfectly reproducible. In fact, I spent the whole day
reproducing it in different settings.

I am using CouchDB built from Saturday morning's revision, on Ubuntu Linux,
on a machine with a 1.6 GHz Pentium and 2 GB of RAM.

I hope my report gives you some clue as to what is wrong.


    -- Richard
