spark-reviews mailing list archives

From pwendell <...@git.apache.org>
Subject [GitHub] spark pull request: [SPARK-3139] Made ContextCleaner to not block ...
Date Wed, 27 Aug 2014 03:23:56 GMT
Github user pwendell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2143#discussion_r16755854
  
    --- Diff: core/src/main/scala/org/apache/spark/ContextCleaner.scala ---
    @@ -76,6 +76,20 @@ private[spark] class ContextCleaner(sc: SparkContext) extends Logging {
       private val blockOnCleanupTasks = sc.conf.getBoolean(
         "spark.cleaner.referenceTracking.blocking", true)
     
    +  /**
    +   * Whether the cleaning thread will block on shuffle cleanup tasks.
    +   * This overrides the global setting `blockOnCleanupTasks`
    +   *
    +   * When context cleaner is configured to block on every delete request, it can throw timeout
    +   * exceptions on cleanup of shuffle blocks, as reported in SPARK-3139. To avoid that, this
    +   * parameter by default disables blocking on shuffle cleanups. Note that this does not affect
    +   * the cleanup of RDDs and broadcasts. This is intended to be a temporary workaround,
    +   * until the real Akka issue (referred to in the comment above `blockOnCleanupTasks`) is
    +   * resolved.
    +   */
    +  private val blockOnShuffleCleanupTasks = sc.conf.getBoolean(
    +    "spark.cleaner.referenceTracking.blocking.shuffle", false)
    --- End diff --
    
    gotcha - I think it's fine as-is. This is not a user-visible config anyway.
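
    For illustration, the gated behavior in the diff above can be sketched standalone (a minimal sketch, not Spark's actual `ContextCleaner`; the `SimpleConf` class here is a hypothetical stand-in for `SparkConf.getBoolean`):

    ```scala
    // Sketch: a global blocking flag plus a shuffle-specific override that
    // defaults to false, mirroring the two getBoolean calls in the diff.
    // SimpleConf is illustrative only, not a Spark API.
    object CleanerConfigSketch {
      final case class SimpleConf(settings: Map[String, String]) {
        def getBoolean(key: String, defaultValue: Boolean): Boolean =
          settings.get(key).map(_.toBoolean).getOrElse(defaultValue)
      }

      def main(args: Array[String]): Unit = {
        // Only the global flag is set; the shuffle override is left unset.
        val conf = SimpleConf(Map("spark.cleaner.referenceTracking.blocking" -> "true"))

        // Global setting: block the cleaning thread on RDD/broadcast cleanup.
        val blockOnCleanupTasks =
          conf.getBoolean("spark.cleaner.referenceTracking.blocking", true)

        // Shuffle-specific override: defaults to false so shuffle cleanup is
        // non-blocking, avoiding the SPARK-3139 timeout exceptions.
        val blockOnShuffleCleanupTasks =
          conf.getBoolean("spark.cleaner.referenceTracking.blocking.shuffle", false)

        println(s"block on RDD/broadcast cleanup: $blockOnCleanupTasks")
        println(s"block on shuffle cleanup: $blockOnShuffleCleanupTasks")
      }
    }
    ```

    With neither key set, RDD/broadcast cleanup blocks (default true) while shuffle cleanup does not (default false), which is the split the doc comment describes.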


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org

