incubator-couchdb-user mailing list archives

From Alexander Gabriel <a...@barbalex.ch>
Subject Writing thousands of documents
Date Thu, 28 Feb 2013 01:25:24 GMT
Hi experts

In my CouchApp I need to import thousands of docs, at most about 30'000 at
a time.
I'm using jquery.couch.js.

In a loop I build up the 30'000 objects, then call a function that uses

$db.openDoc(GUID, {
  success: function (art) {
    art[x].DsName = object;
    $db.saveDoc(art);
  }
});

to update the 30'000 docs.

The docs are fetched:
1. For the initial upload in development, via ActiveX from an Access database,
using the JavaScript Interface Library for Microsoft Access from
http://accessdb.sourceforge.net
2. In production, from .csv files the user can upload

My problem is that even though the 30'000 docs are looped over sequentially,
all 30'000 are opened first. Only THEN are all 30'000 saved in one rush. This
overwhelms the browser, which crashes when memory usage exceeds 1.4 GB.
A typical doc is about 21 KB.

What I would expect is that opening docs is slower than looping, so the
first PUTs would come after many GETs. What I can't understand, though, is
why the first PUT does not happen before all the GET operations have finished.

What would be a good strategy to prevent this from happening?
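One strategy (my suggestion, not something from the app itself) would be to skip the per-doc PUTs entirely and write the updated docs through CouchDB's _bulk_docs endpoint, which jquery.couch.js exposes as `$db.bulkSave`, in fixed-size batches so that only one batch is in memory and in flight at a time. A sketch, with `$db` stubbed so the control flow can be run standalone:

```javascript
// Stub standing in for the jquery.couch.js database object; it just
// records each batch size and acknowledges the batch synchronously.
var batches = [];
var $db = {
  bulkSave: function (payload, opts) {
    batches.push(payload.docs.length);
    opts.success([]);
  }
};

// Send the docs in batches of batchSize; the next batch is posted only
// after the previous one has been acknowledged.
function saveInBatches(docs, batchSize) {
  if (docs.length === 0) return;               // nothing left to send
  $db.bulkSave({ docs: docs.slice(0, batchSize) }, {
    success: function () {
      saveInBatches(docs.slice(batchSize), batchSize);
    }
  });
}

// 7 dummy docs in batches of 3 → batch sizes 3, 3, 1
var docs = [];
for (var i = 0; i < 7; i++) docs.push({ DsName: "doc" + i });
saveInBatches(docs, 3);
// batches is now [3, 3, 1]
```

A batch size of a few hundred 21 KB docs would keep memory flat while still cutting the 30'000 requests down to a few hundred.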

Can I somehow get the loop to wait until after the success callback of the
saveDoc operation?
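One pattern that would do exactly that (a sketch on my part, not taken from the app): replace the for-loop with a function that calls itself from the saveDoc success callback, so the next GET only starts after the previous PUT has completed. `$db` is stubbed here so the control flow can be run standalone; in the real app it would be the jquery.couch.js database object.

```javascript
// Stub of the jquery.couch.js database object, just enough to show the
// control flow; it records each request and calls success synchronously.
var log = [];
var $db = {
  openDoc: function (id, opts) { log.push("GET " + id); opts.success({ _id: id }); },
  saveDoc: function (doc, opts) { log.push("PUT " + doc._id); opts.success(doc); }
};

// Instead of a loop that queues all 30'000 openDoc calls up front, each
// saveDoc success callback starts the next openDoc, so only one doc is
// held in memory at a time.
function processNext(guids, i) {
  if (i >= guids.length) return;               // all docs handled
  $db.openDoc(guids[i], {
    success: function (art) {
      art.DsName = "updated";                  // apply the change
      $db.saveDoc(art, {
        success: function () {
          processNext(guids, i + 1);           // next GET only after the PUT
        }
      });
    }
  });
}

processNext(["a", "b"], 0);
// log is now: ["GET a", "PUT a", "GET b", "PUT b"]
```

With the real asynchronous callbacks the stack unwinds between requests, so the recursion depth is not a problem for 30'000 docs; processing a small batch per step would be a straightforward speed-up.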

The app is here:
http://www.barbalex.iriscouch.com/artendb/_design/artendb/index.html
https://github.com/barbalex/artendb
