cassandra-user mailing list archives

From Sikander Rafiq <hafiz_ra...@hotmail.com>
Subject Re: No Host AvailableException during querying Cassandra.
Date Mon, 30 Jan 2017 09:32:50 GMT
Hi Admin,

Please remove my email address from the list. Thanks.

Sikander




________________________________
From: Venkata D <dvenkatj2eedev@gmail.com>
Sent: Friday, January 27, 2017 10:01 PM
To: user@cassandra.apache.org
Subject: No Host AvailableException during querying Cassandra.

Hello All,

We are using DSE 4.6.6 & Cassandra 2.0.14.425.

I am facing this exception right now. We have hit it a couple of times before, and repair jobs
helped us temporarily.

As the data grows significantly, we are seeing this exception more and more often. Does anyone
have any thoughts on this?


Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure:
Task 114 in stage 17.0 failed 4 times, most recent failure: Lost task 114.3 in stage 17.0
(TID 196, ): com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried
for query failed (tried: [All IP addresses] - use getErrors() for details)
        com.datastax.driver.core.exceptions.NoHostAvailableException.copy(NoHostAvailableException.java:65)
        com.datastax.driver.core.DefaultResultSetFuture.extractCauseFromExecutionException(DefaultResultSetFuture.java:259)
        com.datastax.driver.core.ArrayBackedResultSet$MultiPage.prepareNextRow(ArrayBackedResultSet.java:279)
        com.datastax.driver.core.ArrayBackedResultSet$MultiPage.isExhausted(ArrayBackedResultSet.java:239)
        com.datastax.driver.core.ArrayBackedResultSet$1.hasNext(ArrayBackedResultSet.java:122)
        com.datastax.spark.connector.rdd.reader.PrefetchingResultSetIterator.hasNext(PrefetchingResultSetIterator.scala:16)
        scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
        scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)
        com.datastax.spark.connector.util.CountingIterator.hasNext(CountingIterator.scala:10)
        scala.collection.Iterator$$anon$14.hasNext(Iterator.scala:388)
        scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
        scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
        org.apache.spark.storage.MemoryStore.unrollSafely(MemoryStore.scala:235)
        org.apache.spark.CacheManager.putInBlockManager(CacheManager.scala:163)
        org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:70)
        org.apache.spark.rdd.RDD.iterator(RDD.scala:227)
        org.apache.spark.rdd.FilteredRDD.compute(FilteredRDD.scala:34)
        org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:262)
        org.apache.spark.rdd.RDD.iterator(RDD.scala:229)
        org.apache.spark.rdd.FilteredRDD.compute(FilteredRDD.scala:34)
        org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:262)
        org.apache.spark.rdd.RDD.iterator(RDD.scala:229)
        org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62)
        org.apache.spark.scheduler.Task.run(Task.scala:54)
        org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:177)
        java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        java.lang.Thread.run(Thread.java:745)
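
The "use getErrors() for details" hint in that message is worth following up: NoHostAvailableException carries a per-host map of the underlying failures (read timeouts, overloaded or unresponsive nodes, connection errors), which usually points at the real cause. A minimal standalone sketch of dumping that map with the Java driver directly - the contact point, keyspace and query below are placeholders, not values from this job:

    import com.datastax.driver.core.Cluster
    import com.datastax.driver.core.exceptions.NoHostAvailableException
    import scala.collection.JavaConverters._

    object DumpHostErrors {
      def main(args: Array[String]): Unit = {
        // "cassandra-host" is a placeholder; use one of your own contact points.
        val cluster = Cluster.builder().addContactPoint("cassandra-host").build()
        try {
          val session = cluster.connect()
          // Placeholder query; run whatever statement reproduces the failure.
          session.execute("SELECT * FROM my_ks.my_table LIMIT 1")
        } catch {
          case e: NoHostAvailableException =>
            // getErrors() maps each coordinator the driver tried to the error it returned.
            e.getErrors.asScala.foreach { case (host, error) =>
              println(s"$host -> $error")
            }
        } finally {
          cluster.close()
        }
      }
    }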


Thanks,
Venkat.
