spark-reviews mailing list archives

From jcuquemelle <...@git.apache.org>
Subject [GitHub] spark pull request #19881: [SPARK-22683][CORE] Add tasksPerExecutorSlot para...
Date Tue, 20 Mar 2018 17:18:35 GMT
Github user jcuquemelle commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19881#discussion_r175852840
  
    --- Diff: docs/configuration.md ---
    @@ -1795,6 +1796,19 @@ Apart from these, the following properties are also available, and may be useful
         Lower bound for the number of executors if dynamic allocation is enabled.
       </td>
     </tr>
    +<tr>
    +  <td><code>spark.dynamicAllocation.fullParallelismDivisor</code></td>
    +  <td>1</td>
    +  <td>
    +    By default, the dynamic allocation will request enough executors to maximize the
    +    parallelism according to the number of tasks to process. While this minimizes the
    +    latency of the job, with small tasks this setting wastes a lot of resources due to
    --- End diff --
    
    done
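
    For context, here is a minimal sketch of how the setting documented in the diff above
    might be used, assuming the property name from this diff revision
    (spark.dynamicAllocation.fullParallelismDivisor); the name could change before the
    change is merged:

        import org.apache.spark.sql.SparkSession

        // Illustrative configuration only: enables dynamic allocation and sets the
        // divisor proposed in this diff to 2, so roughly half as many executors are
        // requested for the same number of pending tasks. The property name is taken
        // from this diff revision and may not match the merged version.
        val spark = SparkSession.builder()
          .appName("dynamic-allocation-divisor-example")
          .config("spark.dynamicAllocation.enabled", "true")
          .config("spark.shuffle.service.enabled", "true")
          .config("spark.dynamicAllocation.fullParallelismDivisor", "2")
          .getOrCreate()

    Per the description in the diff, a divisor greater than 1 trades some job latency for
    lower resource usage when tasks are small.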


---

