spark-issues mailing list archives

From "Sean Owen (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (SPARK-4180) SparkContext constructor should throw exception if another SparkContext is already running
Date Fri, 06 Feb 2015 22:34:38 GMT

     [ https://issues.apache.org/jira/browse/SPARK-4180?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen updated SPARK-4180:
-----------------------------
    Fix Version/s:     (was: 1.2.0)

> SparkContext constructor should throw exception if another SparkContext is already running
> ------------------------------------------------------------------------------------------
>
>                 Key: SPARK-4180
>                 URL: https://issues.apache.org/jira/browse/SPARK-4180
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>            Reporter: Josh Rosen
>            Assignee: Josh Rosen
>            Priority: Blocker
>              Labels: backport-needed
>
> Spark does not currently support multiple concurrently-running SparkContexts in the same JVM (see SPARK-2243). Therefore, SparkContext's constructor should throw an exception if there is an active SparkContext that has not been shut down via {{stop()}}.
> PySpark already does this, but the Scala SparkContext should do the same thing. The current behavior with multiple active contexts is unspecified / not well understood, and it may be the source of confusing errors (see the user error report in SPARK-4080, for example).
> This should be pretty easy to add: just add an {{activeSparkContext}} field to the SparkContext companion object and {{synchronize}} on it in the constructor and {{stop()}} methods; see PySpark's {{context.py}} file for an example of this approach.
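
A minimal Scala sketch of the guard described in the issue. The object name, the helper names {{markActive}}/{{clearActive}}, and the use of {{IllegalStateException}} are illustrative assumptions for this sketch, not the actual Spark patch:

{code:scala}
// Sketch only: in the real change this state would live in the SparkContext
// companion object; names and error type here are hypothetical.
object SparkContextGuard {
  private val lock = new Object()
  private var activeSparkContext: Option[AnyRef] = None

  // Would be called from SparkContext's constructor.
  def markActive(sc: AnyRef): Unit = lock.synchronized {
    if (activeSparkContext.isDefined) {
      throw new IllegalStateException(
        "Only one SparkContext may be running in this JVM (see SPARK-2243); " +
          "call stop() on the running context before constructing a new one.")
    }
    activeSparkContext = Some(sc)
  }

  // Would be called from stop().
  def clearActive(sc: AnyRef): Unit = lock.synchronized {
    if (activeSparkContext.exists(_ eq sc)) {
      activeSparkContext = None
    }
  }
}
{code}

With a guard like this, constructing a second context before stopping the first fails fast with a clear error instead of leaving two contexts racing with unspecified behavior.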



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

