spark-issues mailing list archives

From "Apache Spark (JIRA)" <j...@apache.org>
Subject [jira] [Assigned] (SPARK-11137) Make StreamingContext.stop() exception-safe
Date Mon, 18 Jan 2016 17:03:40 GMT

     [ https://issues.apache.org/jira/browse/SPARK-11137?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-11137:
------------------------------------

    Assignee: Apache Spark

> Make StreamingContext.stop() exception-safe
> -------------------------------------------
>
>                 Key: SPARK-11137
>                 URL: https://issues.apache.org/jira/browse/SPARK-11137
>             Project: Spark
>          Issue Type: Bug
>          Components: Streaming
>    Affects Versions: 1.5.1
>            Reporter: Felix Cheung
>            Assignee: Apache Spark
>            Priority: Minor
>
> In StreamingContext.stop(), when an exception is thrown, the rest of the stop/cleanup actions are aborted.
> Discussed in https://github.com/apache/spark/pull/9116, where srowen commented:
> Hm, this is getting unwieldy. There are several nested try blocks here. The same argument
> goes for many of these methods -- if one fails, should they not continue trying? A tidier
> solution would be to execute a series of () -> Unit code blocks that perform some cleanup
> and make sure that they each fire in succession, regardless of the others. The final one to
> remove the shutdown hook could occur outside synchronization.
> I realize we're expanding the scope of the change here, but is it maybe worthwhile to
> go all the way here?
> Really, something similar could be done for SparkContext, and there's an existing JIRA
> for it somewhere.
> At least, I'd prefer to either narrowly fix the deadlock here, or fix all of the
> finally-related issues separately and all at once.
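
The pattern srowen describes above (run every cleanup block and catch non-fatal exceptions from each one, so a failure in one step does not abort the remaining steps) can be sketched roughly as follows. This is a minimal illustration in Scala, not Spark's actual implementation; CleanupRunner and the step names are hypothetical.

    import scala.util.control.NonFatal

    // Hypothetical helper (not Spark's actual API): run each named cleanup block
    // in order, catching non-fatal exceptions so one failure does not abort the rest.
    object CleanupRunner {
      def runAll(label: String, blocks: (String, () => Unit)*): Unit = {
        blocks.foreach { case (name, block) =>
          try {
            block()
          } catch {
            case NonFatal(e) =>
              // A real implementation would go through Spark's logging instead.
              println(s"$label: cleanup step '$name' failed: ${e.getMessage}")
          }
        }
      }
    }

    object Demo extends App {
      // All three steps fire even though the second one throws.
      CleanupRunner.runAll("StreamingContext.stop",
        "stop scheduler"        -> (() => println("scheduler stopped")),
        "remove metrics source" -> (() => throw new IllegalStateException("boom")),
        "stop UI"               -> (() => println("UI stopped"))
      )
    }

The final block that removes the shutdown hook could then be run outside the synchronized section, as the comment suggests.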



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

