cassandra-user mailing list archives

From: Paul Nickerson <>
Subject: Re: Out of Memory Error While Opening SSTables on Startup
Date: Tue, 10 Feb 2015 20:15:12 GMT
Thank you Rob. I tried a 12 GiB heap size, and still crashed out. There
are 1,617,289 files under OpsCenter/rollups60.
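
(For reference, this is roughly how I counted; the path assumes the default
data directory, /var/lib/cassandra/data, and the glob covers the table-ID
suffix that 2.1 appends to table directories:)

    # total component files under the rollups60 table directory
    find /var/lib/cassandra/data/OpsCenter/rollups60* -type f | wc -l

    # each SSTable is several component files; counting only Data.db
    # components gives the actual number of SSTables
    find /var/lib/cassandra/data/OpsCenter/rollups60* -name '*-Data.db' | wc -l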

Once I downgraded Cassandra to 2.1.1 (apt-get install cassandra=2.1.1), I
was able to start up Cassandra OK with the default heap size formula.
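
(In case it helps anyone else: as I understand it, the default formula in
cassandra-env.sh on 2.1 is max(min(1/2 RAM, 1024 MB), min(1/4 RAM, 8192 MB)).
A sketch of the same arithmetic in shell:)

    # sketch of the default heap calculation in cassandra-env.sh:
    #   MAX_HEAP_SIZE = max(min(1/2 RAM, 1024 MB), min(1/4 RAM, 8192 MB))
    system_memory_in_mb=$(free -m | awk '/^Mem:/ {print $2}')
    half=$((system_memory_in_mb / 2));    [ "$half" -gt 1024 ] && half=1024
    quarter=$((system_memory_in_mb / 4)); [ "$quarter" -gt 8192 ] && quarter=8192
    echo "default MAX_HEAP_SIZE: $(( half > quarter ? half : quarter ))M"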

Now my cluster is running multiple versions of Cassandra. I think I will
downgrade the rest to 2.1.1.
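
(Roughly the plan for each remaining node, one node at a time; apt-mark hold
is there so a routine apt-get upgrade doesn't pull the newer package back in:)

    sudo service cassandra stop
    sudo apt-get install cassandra=2.1.1   # apt will warn about the downgrade
    sudo apt-mark hold cassandra           # prevent an accidental re-upgrade
    sudo service cassandra start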

 ~ Paul Nickerson

On Tue, Feb 10, 2015 at 2:05 PM, Robert Coli <> wrote:

> On Tue, Feb 10, 2015 at 11:02 AM, Paul Nickerson <> wrote:
>> I am getting an out of memory error when I try to start Cassandra on one
>> of my nodes. Cassandra will run for a minute, and then exit without
>> outputting any error in the log file. It is happening while SSTableReader
>> is opening a couple hundred thousand SSTable files.
> ...
>> Does anyone know how I might get Cassandra on this node running again?
>> I'm not very familiar with correctly tuning Java memory parameters, and I'm
>> not sure if that's the right solution in this case anyway.
> Try running 2.1.1, and/or increasing heap size beyond 8 GB.
> Are there actually that many SSTables on disk?
> =Rob
