spark-issues mailing list archives

From "Nan Zhu (JIRA)" <>
Subject [jira] [Commented] (SPARK-20251) Spark streaming skips batches in a case of failure
Date Sun, 09 Apr 2017 23:57:41 GMT


Nan Zhu commented on SPARK-20251:

Why is this an invalid report? I have been observing the same behavior recently, since upgrading
to Spark 2.1.

My basic point: an exception thrown from the DStream.compute() method should shut the app
down instead of letting it proceed (since error handling in Spark Streaming works by releasing
the await lock that awaitTermination is blocked on).

I am still looking at the relevant threads within Spark Streaming to see what is happening.

Can we change this back to a valid issue and give me more time to investigate?

> Spark streaming skips batches in a case of failure
> --------------------------------------------------
>                 Key: SPARK-20251
>                 URL:
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.1.0
>            Reporter: Roman Studenikin
> We are experiencing strange behaviour in a Spark Streaming application. Sometimes it just
skips a batch after a job failure and starts working on the next one.
> We expect it to attempt to reprocess the batch, not to skip it. Is this a bug, or are we
missing important configuration params?
> Screenshots from spark UI:

This message was sent by Atlassian JIRA

