couchdb-dev mailing list archives

From "Robert Newson (JIRA)" <>
Subject [jira] Commented: (COUCHDB-964) Large memory usage downloading attachments
Date Thu, 25 Nov 2010 13:55:16 GMT


Robert Newson commented on COUCHDB-964:

I've failed to reproduce this locally by following your instructions.
My memory usage was stable (OS X). Another user has tried the test on
Linux with R13 and reports stable memory usage also.

Can you provide more details on the OS, the hardware, and how you are
monitoring the memory usage itself? I'd like to eliminate as many
confounding factors as possible.
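For comparable reports, one way to monitor the memory usage is to sample the resident set size of the Erlang VM directly. This is a minimal sketch, assuming a Linux or OS X system where the VM process is named `beam.smp` (on some builds it is plain `beam`; adjust the pattern accordingly):

```shell
#!/bin/sh
# Sample the resident set size (RSS, in kilobytes) of the Erlang VM
# running CouchDB once per second, until the process exits.
PID=$(pgrep -f beam.smp | head -n 1)
while kill -0 "$PID" 2>/dev/null; do
    ps -o rss= -p "$PID"
    sleep 1
done
```

Logging RSS over the life of the test, rather than eyeballing `top`, makes it easier to distinguish genuine growth per connection from the Erlang VM's normal allocator behaviour.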

> Large memory usage downloading attachments
> ------------------------------------------
>                 Key: COUCHDB-964
>                 URL:
>             Project: CouchDB
>          Issue Type: Bug
>          Components: HTTP Interface
>    Affects Versions: 1.0.1
>         Environment: Linux, Erlang R14B
>            Reporter: David Orrell
> When downloading a large attachment the CouchDB process appears to load the entire attachment
in memory before data is sent to the client. I have a 1.5 GB attachment and the CouchDB process
grows by approximately this amount per client connection.
> For example (as reported by Bram Nejit):
> dd if=/dev/urandom of=/tmp/test.bin count=50000 bs=10240
> Put test.bin as an attachment in a CouchDB database
> Run:
> for i in {0..50}; do curl http://localhost:5984/[test database]/[doc_id]/test.bin > /dev/null 2>&1 & done
> This will create 51 curl processes downloading from your CouchDB concurrently. Watching the memory
consumption of the couchdb process, it appears to load large parts of the file into memory.

This message is automatically generated by JIRA.
You can reply to this email to add a comment to the issue online.
