Just check the process owner to be sure (top, htop, ps, ...)
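For example, something along these lines will show which user actually owns the Cassandra JVM and which file limits that live process is running with (the <pid> below is just a placeholder for whatever the first command returns):

    # find the Cassandra JVM and the user it runs as
    ps aux | grep -i [c]assandra

    # check the limits the running process actually has (replace <pid> with the PID from above)
    cat /proc/<pid>/limits | grep 'open files'

If it turns out to be the cassandra user, the limits need to be raised for that user, not for root.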

http://docs.datastax.com/en/cassandra/2.0/cassandra/install/installRecommendSettings.html#reference_ds_sxl_gf3_2k__user-resource-limits

C*heers,

Alain

2015-07-01 7:33 GMT+02:00 Neha Trivedi <nehajtrivedi@gmail.com>:
Arun,
I am logging on to the server as root and running (sudo service cassandra start)

regards
Neha

On Wed, Jul 1, 2015 at 11:00 AM, Neha Trivedi <nehajtrivedi@gmail.com> wrote:
Thanks Arun ! I will try and get back !

On Wed, Jul 1, 2015 at 10:32 AM, Arun <arunsirik@gmail.com> wrote:
Looks like you have a "too many open files" issue. Increase the ulimit for the user.

If you are starting the Cassandra daemon as the cassandra user, increase the ulimit for that user.
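For instance, on a package install the limits usually go in /etc/security/limits.d/cassandra.conf; the values below are just the commonly recommended production settings (see the DataStax recommended-settings page), so treat them as a starting point and adjust for your environment:

    # /etc/security/limits.d/cassandra.conf  (assumed path for package installs)
    cassandra - memlock unlimited
    cassandra - nofile  100000
    cassandra - nproc   32768
    cassandra - as      unlimited

Restart Cassandra afterwards so the daemon picks up the new limits.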


> On Jun 30, 2015, at 21:16, Neha Trivedi <nehajtrivedi@gmail.com> wrote:
>
> Hello,
> I have a 4 node cluster with SimpleSnitch.
> Cassandra version: 2.1.3
>
> I am trying to add a new node (cassandra 2.1.7) and I get the following error.
>
> ERROR [STREAM-IN-] 2015-06-30 05:13:48,516 JVMStabilityInspector.java:94 - JVM state determined to be unstable.  Exiting forcefully due to:
> java.io.FileNotFoundException: /var/lib/cassandra/data/-Index.db (Too many open files)
>
> I increased the MAX_HEAP_SIZE, and then I get:
> ERROR [CompactionExecutor:9] 2015-06-30 23:31:44,792 CassandraDaemon.java:223 - Exception in thread Thread[CompactionExecutor:9,1,main]
> java.lang.RuntimeException: java.io.FileNotFoundException: /var/lib/cassandra/data/-Data.db (Too many open files)
>     at org.apache.cassandra.io.compress.CompressedThrottledReader.open(CompressedThrottledReader.java:52) ~[apache-cassandra-2.1.7.jar:2.1.7]
>
> Is it because of the different versions of Cassandra (2.1.3 and 2.1.7)?
>
> regards
> N