spark-reviews mailing list archives

From jcuquemelle <>
Subject [GitHub] spark pull request #19881: [SPARK-22683][CORE] Add tasksPerExecutorSlot para...
Date Tue, 20 Mar 2018 17:18:35 GMT
Github user jcuquemelle commented on a diff in the pull request:
    --- Diff: docs/ ---
    @@ -1795,6 +1796,19 @@ Apart from these, the following properties are also available, and may be useful
         Lower bound for the number of executors if dynamic allocation is enabled.
    +  <td><code>spark.dynamicAllocation.fullParallelismDivisor</code></td>
    +  <td>1</td>
    +  <td>
    +    By default, the dynamic allocation will request enough executors to maximize the
    +    parallelism according to the number of tasks to process. While this minimizes the
    +    latency of the job, with small tasks this setting wastes a lot of resources due to
    --- End diff ---
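The quoted doc text says that, by default, dynamic allocation requests enough executors for every pending task to run at once, and that a divisor greater than 1 scales that target down. A minimal sketch of that arithmetic, assuming a hypothetical `target_executors` helper (the function name, the `cores_per_executor` parameter, and the exact rounding are illustrative assumptions, not Spark's actual `ExecutorAllocationManager` code):

```python
import math

def target_executors(pending_tasks, cores_per_executor, divisor=1):
    # Hypothetical illustration of the behavior described in the diff:
    # with divisor = 1 (the proposed default), request enough executors
    # so every pending task could run concurrently; a larger divisor
    # trades job latency for fewer executors, which matters when tasks
    # are small and full parallelism would waste resources.
    full_parallelism = math.ceil(pending_tasks / cores_per_executor)
    return max(1, math.ceil(full_parallelism / divisor))

print(target_executors(1000, 4))     # divisor 1: full parallelism -> 250
print(target_executors(1000, 4, 2))  # divisor 2: half the executors -> 125
```

With many small tasks, doubling the divisor halves the requested executors at the cost of each executor slot processing roughly twice as many tasks sequentially.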


