couchdb-user mailing list archives

From Mike Marino <>
Subject Re: Storage limitations?
Date Sun, 10 Nov 2013 22:00:56 GMT
Hi Mark,

On Sun, Nov 10, 2013 at 10:10 PM, Mark Deibert <> wrote:
> Each comment is normally a very small amount of information: an id,
> userName, a date, and a small text field. Probably no more than a short
> sentence on average. Why do I need to go through the trouble of creating
> new comment docs for each? This totally complicates the comment read code
> (there will be 1000 reads to every 1 write, at least) both in the UI and the
> db for no reason.

I'm not sure why this should complicate the comment read code.  You
can generate a view that emits as its key the particular page/post to
which the comment refers.  As the value, you can emit either the subset
of data you need from the comment, or simply emit null and call the
view with include_docs=true to get the full documents for the particular
post.  This would be a single HTTP call.
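As a minimal sketch of such a view (the `type` and `post_id` field names are assumptions about the comment documents, and the tiny harness stands in for CouchDB's built-in emit so the snippet runs on its own):

```javascript
// Map function for a CouchDB view: key = the post the comment belongs to,
// value = null, so callers can pass include_docs=true to get full docs.
const map = function (doc) {
  if (doc.type === "comment" && doc.post_id) {
    emit(doc.post_id, null);
  }
};

// Stand-in for CouchDB's emit() so the sketch runs outside the server.
const rows = [];
function emit(key, value) { rows.push({ key: key, value: value }); }

map({ _id: "c1", type: "comment", post_id: "post-42", text: "Nice!" });
map({ _id: "p1", type: "post", title: "Hello" }); // not a comment: ignored

console.log(rows); // one row, keyed by "post-42"
```

With a design document name like _design/comments and a view named by_post (names chosen here for illustration), the single HTTP call would look like GET /db/_design/comments/_view/by_post?key="post-42"&include_docs=true.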


> On Sun, Nov 10, 2013 at 1:34 PM, Jens Alfke <> wrote:
>> On Nov 9, 2013, at 11:47 AM, Robert Newson <> wrote:
>> attachments are different to documents. They're stored as a series of
>> binary chunks and so they can be streamed in and out; you can go large
>> with attachments.
>> But on the other hand, all attachments will get copied during a database
>> compaction, so they slow down the process and require more free disk space.
>> If you have many gigabytes of attachments, you might consider storing them
>> externally and putting URL links in the documents.
>> As for comments, just add new documents for each comment and use a
>> view to bring the article and comment thread together. No need to
>> update a document that way.
>> Yup. The guide has a
>> chapter-long example of a blog application that shows how to do comments
>> this way.
>> —Jens
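
The "article and comment thread in one view" trick works through key collation: emit array keys so each post sorts directly ahead of its comments. A sketch under the same assumed field names (`type`, `post_id`, `date`), again with a small harness in place of CouchDB's emit, and a string sort that approximates CouchDB's collation for these simple keys:

```javascript
// One view emits both posts and comments; array-key collation groups
// each post immediately ahead of its comments, comments ordered by date.
const map = function (doc) {
  if (doc.type === "post") {
    emit([doc._id, 0], null);                // post sorts first
  } else if (doc.type === "comment") {
    emit([doc.post_id, 1, doc.date], null);  // comments follow, by date
  }
};

// Harness: collect rows, then sort keys the way CouchDB would collate
// them (string comparison is a rough stand-in for ICU collation here).
const rows = [];
function emit(key, value) { rows.push({ key: key, value: value }); }

map({ _id: "post-42", type: "post", title: "Hello" });
map({ _id: "c2", type: "comment", post_id: "post-42", date: "2013-11-10" });
map({ _id: "c1", type: "comment", post_id: "post-42", date: "2013-11-09" });

rows.sort((a, b) => (JSON.stringify(a.key) < JSON.stringify(b.key) ? -1 : 1));
```

A single range query such as ?startkey=["post-42"]&endkey=["post-42", 2]&include_docs=true then returns the article followed by its whole comment thread in one request, with no document ever updated.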
