couchdb-user mailing list archives

From ev...@bath.ac.uk
Subject Re: Large attachments
Date Thu, 25 Nov 2010 11:01:01 GMT
Quoting Benoit Chesneau <bchesneau@gmail.com>:

> On Mon, Nov 22, 2010 at 3:51 PM, Bram Neijt <bneijt@gmail.com> wrote:
>> Bit of a misunderstanding here; it is about downloads, not uploads.
>>
>> For example:
>> dd if=/dev/urandom of=/tmp/test.bin count=50000 bs=10240
>> Put test.bin as an attachment in a CouchDB database, then run:
>> for i in {1..50}; do curl http://localhost:5984/[test database]/[doc_id]/test.bin > /dev/null 2>&1 & done
>>
>> This will create 50 curl processes that download from your CouchDB.
>> Watching CouchDB's memory consumption, it looks like it is loading
>> large parts of the file into memory.
>>
>> Bram
>>
> What is the exact memory usage?

The process appears to grow by approximately the size of the attachment
per client connection, implying that the entire attachment is being
buffered before being sent to the client.
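
For reference, here is roughly how the memory growth can be sampled while
the concurrent downloads are in flight. This is only a rough sketch: it
assumes a Linux host, that the CouchDB Erlang VM shows up in ps as
"beam.smp", and it reuses placeholder database/document names from Bram's
example, so adjust both to match your setup.

  # Placeholder URL; substitute your own database, doc id and attachment
  URL="http://localhost:5984/testdb/doc_id/test.bin"

  # Start 50 concurrent downloads of the attachment
  for i in {1..50}; do
      curl -s "$URL" > /dev/null &
  done

  # Sample the resident set size (in KB) of the CouchDB VM once a second
  # while the downloads run; change the process name if yours differs
  for s in {1..30}; do
      ps -C beam.smp -o rss=
      sleep 1
  done
  wait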

I've raised this as an issue:
https://issues.apache.org/jira/browse/COUCHDB-964

David


