Reading a blob like that is essentially a seek followed by a large streaming read, so I do not believe it would be an issue. I have never run such a workload myself, but a simple experiment should clear the air.
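
If it helps as a starting point, here is a rough sketch of such an experiment against a single local node. It is written against today's CQL and the DataStax Python driver (pip install cassandra-driver) rather than the Thrift client, and the keyspace, table name, and payload size are just placeholders:

# Minimal sketch of the "simple experiment": write one multi-megabyte blob
# and time the read back. Assumes a single-node cluster on localhost;
# keyspace/table names and payload size are made up for the test.
import os
import time
from cassandra.cluster import Cluster

cluster = Cluster(['127.0.0.1'])
session = cluster.connect()

session.execute("""
    CREATE KEYSPACE IF NOT EXISTS blob_test
    WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}
""")
session.execute(
    "CREATE TABLE IF NOT EXISTS blob_test.docs (id text PRIMARY KEY, data blob)")

# ~3 MB, the upper end of a "typical" document in this thread; much larger
# payloads may run into default mutation-size limits and need chunking.
payload = os.urandom(3 * 1024 * 1024)

insert = session.prepare("INSERT INTO blob_test.docs (id, data) VALUES (?, ?)")
start = time.time()
session.execute(insert, ('doc-1', payload))
print('write: %.3fs' % (time.time() - start))

select = session.prepare("SELECT data FROM blob_test.docs WHERE id = ?")
start = time.time()
row = session.execute(select, ('doc-1',)).one()
print('read:  %.3fs (%d bytes)' % (time.time() - start, len(row.data)))

cluster.shutdown()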

Cheers
Avinash

On Wed, Mar 17, 2010 at 7:42 PM, Carlos Sanchez <carlos.sanchez@riskmetrics.com> wrote:
We could have blobs as large as 50 MB compressed (XML compresses quite well). Typical documents we would deal with would be between 500 KB and 3 MB.

Carlos


________________________________________
From: Avinash Lakshman [avinash.lakshman@gmail.com]
Sent: Wednesday, March 17, 2010 8:49 PM
To: user@cassandra.apache.org
Subject: Re: Storing large blobs

My question would be: how large is large? Perhaps you could compress the blobs and then store them, but that depends on the answer to the first question.
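
For example, a quick sketch of the compression step with Python's zlib (purely illustrative; the sample document and helper names are made up):

# Rough sketch: deflate an XML document before storing it as a blob and
# inflate it again after reading; XML typically compresses very well.
import zlib

def compress_doc(xml_text):
    return zlib.compress(xml_text.encode('utf-8'))

def decompress_doc(blob):
    return zlib.decompress(blob).decode('utf-8')

xml = "<portfolio>" + "<position id='1'>0.0</position>" * 50000 + "</portfolio>"
blob = compress_doc(xml)
print('original: %d bytes, compressed: %d bytes' % (len(xml.encode('utf-8')), len(blob)))
assert decompress_doc(blob) == xml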

Cheers
Avinash

On Wed, Mar 17, 2010 at 5:10 PM, Carlos Sanchez <carlos.sanchez@riskmetrics.com> wrote:
Has anyone had experience storing large blobs in Cassandra? Is Cassandra really tailored for large content?

Carlos

