cassandra-user mailing list archives

From Jens Rantil <>
Subject Re: Increasing size of "Batch of prepared statements"
Date Wed, 22 Oct 2014 18:14:41 GMT

Apologies for the late answer.

On Mon, Oct 6, 2014 at 2:38 PM, shahab <> wrote:

> But do you mean that inserting columns with large size (let's say a text
> with 20-30 K) is potentially problematic in Cassandra?

AFAIK, the size _warning_ you are getting relates to the size of the batch
of prepared statements (INSERT INTO mykeyspace.mytable VALUES (?,?,?,?)).
That is, it has nothing to do with the actual content of your row. 20-30 K
shouldn't be a problem. But it's considered good practice to split larger
files (say, anything over 5 MB) into chunks, since that makes operations
easier on your cluster and makes the load more likely to spread evenly
across the cluster.
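As a rough sketch of that chunking idea (the names, the 5 MB threshold, and the row layout are illustrative assumptions, not anything prescribed by Cassandra), you could split a large payload before inserting and store each piece as its own row:

```python
# Sketch: split a large payload into fixed-size chunks so each
# Cassandra row stays small. CHUNK_SIZE is the illustrative 5 MB
# rule of thumb mentioned above, not a Cassandra-enforced limit.
CHUNK_SIZE = 5 * 1024 * 1024  # 5 MB

def chunk_blob(data, chunk_size=CHUNK_SIZE):
    """Yield (chunk_index, chunk_bytes) pairs covering all of `data`."""
    for index, offset in enumerate(range(0, len(data), chunk_size)):
        yield index, data[offset:offset + chunk_size]
```

Each chunk would then be written as its own row, e.g. keyed by a hypothetical (file_id, chunk_index), and the file reassembled by reading the chunks back in index order.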

> What shall i do if I want columns with large size?

Just don't insert too many rows in a single batch and you should be fine.
Like Shane's JIRA ticket said, the warning is to let you know you are not
following best practice when adding too many rows in a single batch. It can
create bottlenecks on a single Cassandra node.
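One way to keep batches small is to cap the number of statements per batch on the client side. A minimal sketch (the cap of 20 is an arbitrary illustration; in practice you would tune it against the warn threshold in cassandra.yaml):

```python
# Sketch: group rows into batches of at most `max_batch_size`
# before executing each group as one batch. The cap is illustrative.
def batched(rows, max_batch_size=20):
    """Yield lists of at most `max_batch_size` rows, in order."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == max_batch_size:
            yield batch
            batch = []
    if batch:
        yield batch
```

Each yielded list would then be bound into one batch of prepared statements, keeping every batch comfortably under the warning size.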


Jens Rantil
Backend engineer
Tink AB

Phone: +46 708 84 18 32

