spark-issues mailing list archives

From "ASF GitHub Bot (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-26362) Deprecate 'spark.driver.allowMultipleContexts' to disallow multiple Spark contexts
Date Thu, 13 Dec 2018 15:43:00 GMT

    [ https://issues.apache.org/jira/browse/SPARK-26362?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16720318#comment-16720318 ]

ASF GitHub Bot commented on SPARK-26362:
----------------------------------------

HyukjinKwon opened a new pull request #23311: [SPARK-26362][CORE] Deprecate 'spark.driver.allowMultipleContexts' to discourage creating multiple SparkContexts
URL: https://github.com/apache/spark/pull/23311
 
 
   ## What changes were proposed in this pull request?
   
   Creating multiple SparkContexts is discouraged, and Spark has warned against it for the last four years (see SPARK-4180). It can cause arbitrary and mysterious errors.
   
   Honestly, I didn't even know Spark still allows it; multiple contexts appear never to have been officially supported - see SPARK-2243.
   
   I believe now is a good time to deprecate this configuration.
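   
   In rough terms, the change amounts to emitting a deprecation warning whenever the flag is set at context creation. A minimal sketch of that pattern, using hypothetical names (`MultipleContextsCheck`, `warnIfSet`) rather than Spark's actual internals:
   
   ```scala
   // Hypothetical sketch of the deprecation pattern; the real change lives
   // in SparkContext's configuration handling, not a standalone object.
   object MultipleContextsCheck {
     val DeprecatedKey = "spark.driver.allowMultipleContexts"
   
     // Returns the flag's value and emits a deprecation warning via `log`
     // whenever the user has set the deprecated key explicitly.
     def warnIfSet(conf: Map[String, String], log: String => Unit): Boolean =
       conf.get(DeprecatedKey) match {
         case Some(value) =>
           log(s"'$DeprecatedKey' is deprecated as of Spark 3.0.0, and creation " +
             "of multiple SparkContexts will be disallowed afterward.")
           value.toBoolean
         case None =>
           false // unset: multiple contexts remain disallowed by default
       }
   }
   ```
   
   The existing behavior is kept for now; only the warning is added, so users have a release cycle to migrate before the restriction becomes hard.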
   
   ## How was this patch tested?
   
   Manually tested:
   
   ```bash
   $ ./bin/spark-shell --conf=spark.driver.allowMultipleContexts=true
   ...
   Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
   Setting default log level to "WARN".
   To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
   18/12/13 23:36:48 WARN SparkContext: 'spark.driver.allowMultipleContexts' is deprecated as of Spark 3.0.0, and creation of multiple SparkContexts will be disallowed afterward.
   ...
   ```

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


> Deprecate 'spark.driver.allowMultipleContexts' to disallow multiple Spark contexts
> ----------------------------------------------------------------------------------
>
>                 Key: SPARK-26362
>                 URL: https://issues.apache.org/jira/browse/SPARK-26362
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 3.0.0
>            Reporter: Hyukjin Kwon
>            Priority: Major
>
> Creating multiple Spark contexts is discouraged, and Spark has warned against it for four years (see SPARK-4180).
> It can cause arbitrary and mysterious errors. (Honestly, I didn't even know Spark still allows it.)



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

