couchdb-user mailing list archives

From Robert Newson <robert.new...@gmail.com>
Subject Re: Large attachments
Date Thu, 25 Nov 2010 13:47:13 GMT
David,

I've failed to reproduce this locally by following your instructions.
My memory usage was stable (OS X). Another user has tried the test on
Linux with R13 and reports stable memory usage also.

Can you provide more details of the OS, hardware and the manner in
which you are monitoring the memory usage itself? I'd like to
eliminate as many distracting factors as possible.
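
For comparison, something along these lines is what I'd consider a
comparable measurement (a minimal sketch, assuming a single beam.smp
process and a ps that supports -o rss=; the pgrep pattern may need
adjusting for your install):

# Sample the Erlang VM's resident set size once per second.
PID=$(pgrep -f beam.smp | head -n1)
while kill -0 "$PID" 2>/dev/null; do
    ps -o rss= -p "$PID"   # RSS in kilobytes
    sleep 1
done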

Thanks,
B.

On Thu, Nov 25, 2010 at 11:01 AM,  <evxdo@bath.ac.uk> wrote:
> Quoting Benoit Chesneau <bchesneau@gmail.com>:
>
>> On Mon, Nov 22, 2010 at 3:51 PM, Bram Neijt <bneijt@gmail.com> wrote:
>>>
>>> Bit of a misunderstanding here: it is about downloads, not uploads.
>>>
>>> For example:
>>> dd if=/dev/urandom of=/tmp/test.bin count=50000 bs=10240
>>> Put test.bin as an attachment in a couchdb database, then run:
>>> for i in {1..50}; do curl http://localhost:5984/[test database]/[doc_id]/test.bin > /dev/null 2>&1 & done
>>>
>>> This will create 50 curl processes which download from your couchdb.
>>> Looking at the memory consumption of couchdb, it seems like it is
>>> loading large parts of the file into memory.
>>>
>>> Bram
>>>
>> What is the exact memory usage?
>
> The process appears to grow by approximately the attachment size per
> client connection, implying that the entire attachment is buffered in
> memory before being sent to the client.
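>
> To put a number on that, something like the following rough sketch
> (assuming a single beam.smp process and the same placeholders as the
> repro above) prints the RSS delta across the 50 connections:
>
> PID=$(pgrep -f beam.smp | head -n1)
> BEFORE=$(ps -o rss= -p "$PID")
> for i in {1..50}; do curl -s http://localhost:5984/[test database]/[doc_id]/test.bin > /dev/null & done
> sleep 5   # give the downloads time to start streaming
> AFTER=$(ps -o rss= -p "$PID")
> echo "RSS grew by $(( (AFTER - BEFORE) / 1024 )) MB for 50 connections"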
>
> I've raised this as an issue:
> https://issues.apache.org/jira/browse/COUCHDB-964
>
> David
