spark-reviews mailing list archives

From jerryshao <...@git.apache.org>
Subject [GitHub] spark pull request: [SPARK-4352][YARN][WIP] Incorporate locality p...
Date Thu, 16 Jul 2015 06:18:52 GMT
Github user jerryshao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/6394#discussion_r34758587
  
    --- Diff: core/src/main/scala/org/apache/spark/scheduler/DAGScheduler.scala ---
    @@ -872,6 +872,25 @@ class DAGScheduler(
         // will be posted, which should always come after a corresponding SparkListenerStageSubmitted
         // event.
         stage.latestInfo = StageInfo.fromStage(stage, Some(partitionsToCompute.size))
    +    val taskIdToLocations = try {
    +      stage match {
    +        case s: ShuffleMapStage =>
    +          partitionsToCompute.map { id => (id, getPreferredLocs(stage.rdd, id))}.toMap
    +        case s: ResultStage =>
    +          val job = s.resultOfJob.get
    +          partitionsToCompute.map { id =>
    +            val p = job.partitions(id)
    +            (id, getPreferredLocs(stage.rdd, p))
    +          }.toMap
    +      }
    +    } catch {
    +      case NonFatal(e) =>
    +        abortStage(stage, s"Task creation failed: $e\n${e.getStackTraceString}")
    +        runningStages -= stage
    +        return
    +    }
    +    stage.latestInfo.taskLocalityPreferences = Some(taskIdToLocations.values.toSeq)
    --- End diff --
    
    Yeah, this is one way; I will change it. Another concern is the MiMa test, since this
is a public class, so I will test it locally. Thanks for your advice :).
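
For context, the pattern in the diff (building `taskIdToLocations` for each partition before submitting tasks) can be sketched standalone. This is a minimal illustration only: `getPreferredLocs` below and the host names are stand-ins, not Spark's real API, which returns `Seq[TaskLocation]` derived from the RDD's `preferredLocations` and the block manager's cache information.

```scala
// Hedged sketch of the partition -> preferred-locations mapping in the diff.
// getPreferredLocs here is a hypothetical stand-in: pretend each partition's
// data lives on one of two hosts, chosen by partition id.
object LocalityPrefsSketch {
  def getPreferredLocs(partition: Int): Seq[String] =
    Seq(s"host${partition % 2}")

  // Mirrors the ShuffleMapStage branch of the diff: map each partition id
  // to its preferred locations and collect the pairs into a Map.
  def preferredLocations(partitionsToCompute: Seq[Int]): Map[Int, Seq[String]] =
    partitionsToCompute.map { id => (id, getPreferredLocs(id)) }.toMap

  def main(args: Array[String]): Unit =
    println(preferredLocations(Seq(0, 1, 2, 3)).toSeq.sortBy(_._1))
}
```

The `ResultStage` branch differs only in that it first translates the task index through `job.partitions(id)` before asking for locations, since a result stage may compute a subset of the RDD's partitions.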



