spark-reviews mailing list archives

From andrewor14 <...@git.apache.org>
Subject [GitHub] spark pull request: [SPARK-6284][MESOS] Add mesos role, principal ...
Date Wed, 01 Jul 2015 22:22:53 GMT
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/4960#discussion_r33731141
  
    --- Diff: core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala
---
    @@ -79,17 +119,59 @@ private[mesos] trait MesosSchedulerUtils extends Logging {
       /**
        * Signal that the scheduler has registered with Mesos.
        */
    +  protected def getResource(res: List[Resource], name: String): Double = {
    +    // A resource can have multiple values in the offer since it can either be from
    +    // a specific role or wildcard.
    +    res.filter(_.getName == name).map(_.getScalar.getValue).sum
    +  }
    +
       protected def markRegistered(): Unit = {
         registerLatch.countDown()
       }
     
       /**
    -   * Get the amount of resources for the specified type from the resource list
    +   * Partition the existing set of resources into two groups, those remaining to be
    +   * scheduled and those requested to be used for a new task.
    +   * @param resources The full list of available resources
    +   * @param resourceName The name of the resource to take from the available resources
    +   * @param count The amount of resources to take from the available resources
    +   * @return The remaining resources list and the used resources list.
        */
    -  protected def getResource(res: List[Resource], name: String): Double = {
    -    for (r <- res if r.getName == name) {
    -      return r.getScalar.getValue
    +  def partitionResources(
    +      resources: List[Resource],
    +      resourceName: String,
    +      count: Double): (List[Resource], List[Resource]) = {
    +    var remain = count
    +    var requestedResources = new ArrayBuffer[Resource]
    +    val remainingResources = resources.collect {
    +      case r => {
    --- End diff --
    
    The use of `collect` here is a little strange: people usually use it to filter things
    out, but here we don't do that. To simplify the code, I think you can just use `map`. Also, style:
    ```
    val remainingResources = resources.map { r =>
      if (remain > 0 ...) {
        ...
      } else {
        ...
      }
    }
    ```
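
    A runnable sketch of that map-based rewrite, under stated assumptions: `SimpleResource` and `PartitionDemo` are hypothetical stand-ins for the Mesos `Resource` protobuf and the real `MesosSchedulerUtils` trait, but the body mirrors the logic in the diff above. Since `map` transforms every element (unlike `collect`, which can drop elements via a partial function), each offer resource yields exactly one "remaining" entry, while the consumed portion accumulates separately:

    ```scala
    import scala.collection.mutable.ArrayBuffer

    // Hypothetical stand-in for org.apache.mesos.Protos.Resource.
    case class SimpleResource(name: String, value: Double)

    object PartitionDemo {
      /**
       * Split the offered resources into (remaining, requested) using map:
       * every input resource maps to its leftover value, and the amounts
       * taken for the task are collected on the side.
       */
      def partitionResources(
          resources: List[SimpleResource],
          resourceName: String,
          count: Double): (List[SimpleResource], List[SimpleResource]) = {
        var remain = count
        val requested = ArrayBuffer[SimpleResource]()
        val remaining = resources.map { r =>
          if (remain > 0 && r.name == resourceName) {
            val usage = math.min(remain, r.value)
            remain -= usage
            requested += SimpleResource(resourceName, usage)
            r.copy(value = r.value - usage)  // leftover after taking `usage`
          } else {
            r  // untouched: different name, or nothing left to take
          }
        }
        (remaining, requested.toList)
      }

      def main(args: Array[String]): Unit = {
        val offer = List(SimpleResource("cpus", 4.0), SimpleResource("mem", 1024.0))
        val (left, used) = partitionResources(offer, "cpus", 3.0)
        println(left)
        println(used)
      }
    }
    ```

    Taking 3 cpus from an offer of 4 leaves 1 cpu remaining and the mem resource untouched, which shows why `map` suffices: no element is ever filtered out, only transformed.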




