couchdb-user mailing list archives

From James Marca <jma...@translab.its.uci.edu>
Subject possible bug using bulk docs?
Date Fri, 01 May 2009 20:41:34 GMT
Hi All, 

I'm getting an out-of-memory crash when uploading a large number of
documents using bulk docs, and I'm wondering whether this is a known
issue or user error.

I set the logging to debug and this is what I see:


[Fri, 01 May 2009 20:27:30 GMT] [debug] [<0.108.0>] 'POST'
/d12_june2007/_bulk_docs {1,1}
Headers: [{'Connection',"TE, close"},
          {'Content-Length',"257418477"},
          {'Host',"127.0.0.1:5984"},
          {"Te","deflate,gzip;q=0.3"},
          {'User-Agent',"libwww-perl/5.805"}]

[Fri, 01 May 2009 20:31:01 GMT] [info] [<0.108.0>] 127.0.0.1 - -
'POST' /d12_june2007/_bulk_docs 201


In top I can watch the RAM usage go up and down until finally it peaks
and the server is just gone.

That content length is from 1278 documents.  I can easily split that
pile into smaller batches (and that is what I am going to do), but
I thought I'd raise the issue here, as this feels like a software bug
somewhere.
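As a workaround, the upload can be split so that no single `_bulk_docs`
request carries the full 257 MB payload. The sketch below (my own
illustration, not from the original report; the 10 MB cap and the
`batch_docs` helper name are assumptions) chunks documents by
approximate serialized size before each POST:

```python
import json

def batch_docs(docs, max_bytes=10 * 1024 * 1024):
    """Yield lists of docs whose combined JSON size stays under
    max_bytes, so each _bulk_docs POST carries a bounded payload.
    (Approximate: a doc larger than max_bytes is sent alone.)"""
    batch, size = [], 0
    for doc in docs:
        doc_size = len(json.dumps(doc))
        if batch and size + doc_size > max_bytes:
            yield batch
            batch, size = [], 0
        batch.append(doc)
        size += doc_size
    if batch:
        yield batch

# Each batch would then be POSTed separately, e.g.:
#   POST http://127.0.0.1:5984/d12_june2007/_bulk_docs
#   body: json.dumps({"docs": batch})
```

With a cap like this, memory use on the server side should stay
roughly proportional to one batch rather than to the whole upload.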

I am running release 0.9 on Gentoo x86_64, installed via the Gentoo
ebuild, with the latest Erlang release available on Gentoo (compiled
yesterday).

I'm more than happy to pull new code from git or svn to test things
out too.

Regards, 
James

-- 
James E. Marca
Researcher
Institute of Transportation Studies
AIRB Suite 4000
University of California
Irvine, CA 92697-3600
jmarca@translab.its.uci.edu
