couchdb-user mailing list archives

From Julian Moritz <>
Subject Re: performance issues
Date Mon, 05 Apr 2010 19:17:36 GMT

Adam Kocoloski wrote:
> On Apr 5, 2010, at 2:52 PM, Julian Moritz <> wrote:
> Hi Julian, it is still true that CouchDB will use only one couchjs
> process for all the map functions in a single design doc. It uses a
> second couchjs for the reduce functions, and of course separate design
> docs get their own processes as well.
> In my experience simple view indexing was almost always limited by the
> Erlang VM, so parallelizing was premature. If you've got a modern
> SpiderMonkey and you're still CPU limited perhaps that's no longer the
> case.  Can you remind us of the Couch and SM versions here?
> Adam
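If only one couchjs handles all the map functions in a design doc, one workaround might be to split the maps across two design docs so each gets its own process. A rough sketch (the doc names, view names, and the split on the word's first letter are all made-up illustrations, not anything from CouchDB itself):

```javascript
// Sketch: two design docs, each indexing half the keyspace.
// Because CouchDB assigns one couchjs process per design doc's maps,
// saving these as separate docs lets both halves index concurrently.

const designDocA = {
  _id: "_design/words_a",
  views: {
    by_word: {
      // Only handle words in the first half of the alphabet.
      map: "function(doc){ if (doc.word < 'n') emit(doc.word, null); }"
    }
  }
};

const designDocB = {
  _id: "_design/words_b",
  views: {
    by_word: {
      // Only handle words in the second half of the alphabet.
      map: "function(doc){ if (doc.word >= 'n') emit(doc.word, null); }"
    }
  }
};
```

Queries would then have to hit both views and merge the results client-side, so this only pays off if indexing (not querying) is the bottleneck.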

I'm not using any version of SpiderMonkey. If I installed SpiderMonkey on
my Ubuntu laptop, I couldn't use most of the programs I need (e.g.
Firefox and Eclipse).

I've configured CouchDB (version 0.11) like this:


Best Regards

>>> I've developed a (in my eyes) simple view. I have a wordlist which does
>>> not contain unique words. I want to store it in a view, with every word
>>> occurring once and ordered randomly. Therefore I have a simple map
>>> function:
>>>
>>> function(doc) {
>>>   emit([hash(doc.word), doc.word], null);
>>> }
>>>
>>> and a simple reduce:
>>>
>>> function(keys, values, rereduce) {
>>>   return true;
>>> }
>>> Calling that view with group=true does what I want.
>>>
>>> When storing plenty of words in the database, one of my two CPU cores is
>>> used completely by couchjs.
>>>
>>> Isn't the view built using two (or all) CPU cores? I thought (obviously
>>> I was wrong) that it would be calculated in parallel, and that using a
>>> quad-core (or more cores) would make storing faster.
>>>
>>> Is there a solution for that? Should I use another query server?
>>> Regards
>>> Julian
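The view logic quoted above can be checked outside CouchDB. The following standalone Node.js sketch simulates what a group=true query does with that map function: collapse rows sharing a key, leaving each distinct word once. The hash() helper is a made-up stand-in (the original thread never defines it), and the grouping is a simplified simulation of CouchDB's behavior, not its actual implementation:

```javascript
// Hypothetical stand-in for the hash() helper used in the original
// map function: a simple 32-bit string hash.
function hash(word) {
  let h = 0;
  for (const c of word) h = (h * 31 + c.charCodeAt(0)) >>> 0;
  return h;
}

// The map function from the thread: the key starts with hash(word),
// so rows sort pseudo-randomly while still carrying the word.
function map(doc, emit) {
  emit([hash(doc.word), doc.word], null);
}

// Simulate a group=true query: sort rows by key, then collapse rows
// that share a key. Each distinct word survives exactly once.
function queryGroupTrue(docs) {
  const rows = [];
  for (const doc of docs) map(doc, (key, value) => rows.push({ key, value }));
  rows.sort((a, b) => a.key[0] - b.key[0] || a.key[1].localeCompare(b.key[1]));
  const grouped = new Map();
  for (const row of rows) grouped.set(JSON.stringify(row.key), true);
  return [...grouped.keys()].map(k => JSON.parse(k));
}

const docs = [{ word: "foo" }, { word: "bar" }, { word: "foo" }];
const keys = queryGroupTrue(docs);
// Duplicate "foo" is collapsed: keys.length === 2
```

Note this only shows the deduplication semantics; it says nothing about how many couchjs processes CouchDB uses to build the index.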
