spark-reviews mailing list archives

From squito <...@git.apache.org>
Subject [GitHub] spark pull request: [SPARK-8880] Fix confusing Stage.attemptId mem...
Date Wed, 08 Jul 2015 21:05:53 GMT
Github user squito commented on a diff in the pull request:

    https://github.com/apache/spark/pull/7275#discussion_r34198482
  
    --- Diff: core/src/main/scala/org/apache/spark/scheduler/Stage.scala ---
    @@ -62,22 +62,28 @@ private[spark] abstract class Stage(
     
       var pendingTasks = new HashSet[Task[_]]
     
    +  /** The ID to use for the next new attempt for this stage. */
       private var nextAttemptId: Int = 0
     
       val name = callSite.shortForm
       val details = callSite.longForm
     
    -  /** Pointer to the latest [StageInfo] object, set by DAGScheduler. */
    -  var latestInfo: StageInfo = StageInfo.fromStage(this)
    +  /**
    +   * Pointer to the [StageInfo] object for the most recent attempt. This needs to be initialized
    +   * here, before any attempts have actually been created, because the DAGScheduler uses this
    +   * StageInfo to tell SparkListeners when a job starts (which happens before any stage attempts
    +   * have been created).
    +   */
    +  private var _latestInfo: StageInfo = StageInfo.fromStage(this, nextAttemptId)
     
    -  /** Return a new attempt id, starting with 0. */
    -  def newAttemptId(): Int = {
    -    val id = nextAttemptId
    +  /** Creates a new attempt for this stage by creating a new StageInfo with a new attempt ID. */
    +  def makeNewStageAttempt(numPartitionsToCompute: Int) = {
    --- End diff ---
    
    needs a return type
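
    The reviewer's point is that public methods should declare their result type explicitly rather than rely on inference. Below is a minimal standalone sketch (not the actual Spark class; the fields and body are simplified stand-ins) showing the shape of the fix: adding an explicit `: Unit` to a method that only mutates state.

    ```scala
    // Simplified illustration only: a method whose result type is inferred
    // (`def makeNewStageAttempt(...) = { ... }`) silently exposes whatever the
    // last expression happens to be. Declaring ": Unit" makes the contract explicit.
    class Stage {
      /** The ID to use for the next new attempt for this stage. */
      private var nextAttemptId: Int = 0
      private var _latestAttemptId: Int = -1

      /** Creates a new attempt; the explicit return type documents the intent. */
      def makeNewStageAttempt(numPartitionsToCompute: Int): Unit = {
        _latestAttemptId = nextAttemptId
        nextAttemptId += 1
      }

      def latestAttemptId: Int = _latestAttemptId
    }

    object Demo extends App {
      val s = new Stage
      s.makeNewStageAttempt(4)
      s.makeNewStageAttempt(4)
      println(s.latestAttemptId) // prints 1
    }
    ```

    Without the annotation, a later refactor that changed the last expression of the body could change the method's inferred public type without any compile error at the definition site.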


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org

