incubator-couchdb-user mailing list archives

From Norman Barker <norman.bar...@gmail.com>
Subject Re: chunked response and couch_doc_open
Date Fri, 23 Oct 2009 18:11:37 GMT
On Fri, Oct 23, 2009 at 11:33 AM, Paul Davis
<paul.joseph.davis@gmail.com> wrote:
> On Fri, Oct 23, 2009 at 1:27 PM, Norman Barker <norman.barker@gmail.com> wrote:
>> Hi,
>>
>> is there a way (in Erlang) to open a couchdb document and to iterate
>> over the document body without having to open up all of the document
>> in memory?
>>
>> I would like to use a chunked response to keep the system having a low
>> memory overhead.
>>
>> Not a particular couch question, is there a method in erlang to find
>> the size (as in number of bytes) of a particular term?
>>
>> many thanks,
>>
>> Norman
>>
>
> Norman,
>
> Well, for document JSON we store Erlang term binaries on disk so
> there's no real way to stream a doc across the wire from disk without
> loading the whole thing into RAM. Have you noticed CouchDB having
> memory issues on read loads? It's generally pretty light on its memory
> requirements for reads.
>
> The only way to get the size of a Term in bytes that I know of is the
> brute force: size(term_to_binary(Term)) method.
>
> Paul Davis
>

I am sending sizeable JSON documents (a couple of MB), and as this
scales with X concurrent users the problem grows. I have crashed Erlang
when the process gets up to about 1 GB of memory (note, this was on
Windows). The workaround is to increase the memory allocation.
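For reference, the brute-force size check Paul describes can be sketched
like this (a minimal module of my own for illustration; the module and
function names are made up, but term_to_binary/1 and byte_size/1 are
standard Erlang):

```erlang
%% Sketch of the brute-force term size check: serialize the term to
%% the external term format, then measure the resulting binary.
-module(term_size).
-export([bytes/1]).

%% Returns the size in bytes of Term's external (serialized) form.
%% Note this is the wire/disk size, not the in-memory heap size.
bytes(Term) ->
    byte_size(term_to_binary(Term)).
```

The serialized size is what matters for a chunked response, since that
is roughly what would go over the wire.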

Erlang (and CouchDB) is fantastic in that it is so light to run
compared to a J2EE server; streaming documents out would be a good
optimisation. Running a CouchDB instance in under 30 MB of memory
would be my ideal.
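To make the streaming idea concrete, here is a rough sketch of how a
chunked reply might look using the couch_httpd helpers. The
couch_httpd function names are my reading of the CouchDB source at the
time and should be checked against your tree; split_into_chunks/2 is a
hypothetical helper of mine. As Paul points out, the document body is
still fully materialised first, so this only bounds what is in flight
on the socket, not peak memory:

```erlang
%% Sketch only, untested: emit a document body as a chunked HTTP
%% response. Assumes couch_httpd:start_chunked_response/3,
%% send_chunk/2 and last_chunk/1 as found in the CouchDB source.
-module(chunked_doc).
-export([send_doc_chunked/2, split_into_chunks/2]).

send_doc_chunked(Req, DocBody) when is_binary(DocBody) ->
    {ok, Resp} = couch_httpd:start_chunked_response(Req, 200,
        [{"Content-Type", "application/json"}]),
    lists:foreach(
        fun(Chunk) -> couch_httpd:send_chunk(Resp, Chunk) end,
        split_into_chunks(DocBody, 64 * 1024)),  %% 64 KB chunks
    couch_httpd:last_chunk(Resp).

%% Hypothetical helper: split a binary into fixed-size chunks.
split_into_chunks(Bin, Size) when byte_size(Bin) =< Size ->
    [Bin];
split_into_chunks(Bin, Size) ->
    <<Chunk:Size/binary, Rest/binary>> = Bin,
    [Chunk | split_into_chunks(Rest, Size)].
```

True streaming from disk would need the on-disk format to change,
since term binaries have to be deserialized whole.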

If you can point me in the right direction, this is something I can
contribute back; most of my Erlang code so far has been specific to my
application.

Many thanks,

Norman
