couchdb-user mailing list archives

From Robert Newson <robert.new...@gmail.com>
Subject Re: Large attachments
Date Mon, 22 Nov 2010 14:11:37 GMT
Curl buffers binary uploads, depending on how you perform the operation.
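A sketch of the distinction, with invented file, database, and revision
values: curl's --data-binary @file reads the whole file into memory
before sending it, while -T file streams it from disk.

    # Buffers the entire file in curl's own memory first:
    curl -X PUT -H "Content-Type: application/octet-stream" \
         --data-binary @big.bin \
         "http://localhost:5984/db/doc/big.bin?rev=1-abc123"

    # Streams the file from disk instead:
    curl -H "Content-Type: application/octet-stream" \
         -T big.bin \
         "http://localhost:5984/db/doc/big.bin?rev=1-abc123"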

B.

On Mon, Nov 22, 2010 at 2:03 PM, Bram Neijt <bneijt@gmail.com> wrote:
> I can reproduce this problem: if I upload a 500 MB attachment and
> start 10 concurrent curl commands, memory usage increases
> dramatically in the following environment:
> Description:    Ubuntu 10.10
> Release:        10.10
> Codename:       maverick
> {"couchdb":"Welcome","version":"1.0.1"}
>
> Bram
>
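A reproduction along the lines Bram describes might look like this
(database, document, and file names are invented; PUTting an attachment
to a nonexistent document creates the document, so no rev is needed):

    # Create a 500 MB file and a database:
    dd if=/dev/urandom of=big.bin bs=1M count=500
    curl -X PUT http://localhost:5984/test
    # Upload it as a standalone attachment:
    curl -H "Content-Type: application/octet-stream" \
         -T big.bin http://localhost:5984/test/doc/big.bin
    # Ten concurrent downloads of the attachment:
    for i in $(seq 1 10); do
        curl -s -o /dev/null http://localhost:5984/test/doc/big.bin &
    done
    wait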
>> On Tue, Nov 16, 2010 at 5:56 PM, <evxdo@bath.ac.uk> wrote:
>> Well, I'm just doing a GET directly to the document_id + attachment:
>> http://localhost:5984/database/doc_id/attachment
>>
>> Clicking on the attachment in Futon would have the same effect.
>>
>> David
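One way to observe what David describes, reusing his URL (the Erlang VM
process may be named beam or beam.smp depending on the build):

    curl -s -o /dev/null http://localhost:5984/database/doc_id/attachment &
    watch -n1 'ps -C beam.smp -o rss='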
>>
>> Quoting Jan Lehnardt <jan@apache.org>:
>>
>>> Hi David,
>>>
>>> On 16 Nov 2010, at 14:00, evxdo@bath.ac.uk wrote:
>>>
>>>> Hi everyone,
>>>>
>>>> I'm trying to work with some large attachments (around 1.5 GB). When I
>>>> go to download these (as a standalone attachment), the CouchDB process
>>>> grows in size by at least the size of the attachment before the download
>>>> starts. This implies that the attachment is being loaded into memory
>>>> entirely before being sent to the client. Has anyone else seen this
>>>> behaviour? Is this a bug, or is there a configuration change I can make
>>>> to resolve this?
>>>>
>>>> I've tried disabling compression on attachments in case it's the
>>>> compression that's causing the problem.
>>>>
>>>> I'm using 1.0.1.
>>>
>>> What does your request look like?
>>>
>>> The standalone attachment API does not buffer.
>>>
>>> Cheers
>>> Jan
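For reference, the attachment compression David mentions disabling is
controlled by the [attachments] section of local.ini in 1.0.x; the
values below are what appear to be the defaults, so treat them as an
assumption:

    [attachments]
    ; 1 (fastest) to 9 (best compression); 0 disables gzip entirely
    compression_level = 8
    compressible_types = text/*, application/javascript, application/json, application/xml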
