spark-commits mailing list archives

From andrewo...@apache.org
Subject spark git commit: [SPARK-5636] Ramp up faster in dynamic allocation
Date Fri, 06 Feb 2015 18:56:08 GMT
Repository: spark
Updated Branches:
  refs/heads/branch-1.3 156839181 -> 0a903059c


[SPARK-5636] Ramp up faster in dynamic allocation

A recent patch, #4051, made the initial number of executors default to 0. With that change, any
Spark application using dynamic allocation's default settings will ramp up very slowly. Since we
never request more executors than are needed to saturate the pending tasks, it is safe to ramp up
quickly. The current default of 60 seconds may be too slow.
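The "safe to ramp up quickly" argument can be illustrated with a minimal sketch of the exponential ramp-up policy (this is not the actual Spark implementation; `tasksPerExecutor` and `pendingTasks` are illustrative assumptions): each sustained backlog interval doubles the number of executors requested, but the running target is always capped at what the pending tasks can use.

```scala
// Sketch only: exponential ramp-up capped by the task backlog.
object RampUpSketch {
  def main(args: Array[String]): Unit = {
    val tasksPerExecutor = 4   // assumption: task slots per executor
    val pendingTasks = 100     // assumption: current backlog
    val maxNeeded = math.ceil(pendingTasks.toDouble / tasksPerExecutor).toInt

    var target = 0             // initial executors (0 after #4051)
    var toAdd = 1              // doubles after each sustained interval
    var interval = 0
    while (target < maxNeeded) {
      // Never request past the number needed to saturate pending tasks.
      target = math.min(target + toAdd, maxNeeded)
      toAdd *= 2
      interval += 1
      println(s"after interval $interval: target = $target")
    }
  }
}
```

With these numbers the target saturates (25 executors) after 5 intervals, so with a 5-second timeout the application is fully ramped up in well under a minute, whereas a 60-second interval would stretch the same ramp-up over several minutes.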

Author: Andrew Or <andrew@databricks.com>

Closes #4409 from andrewor14/dynamic-allocation-interval and squashes the following commits:

d3cc485 [Andrew Or] Lower request interval


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/0a903059
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/0a903059
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/0a903059

Branch: refs/heads/branch-1.3
Commit: 0a903059cfb538cb0abd028d088436b8789bcff5
Parents: 1568391
Author: Andrew Or <andrew@databricks.com>
Authored: Fri Feb 6 10:54:23 2015 -0800
Committer: Andrew Or <andrew@databricks.com>
Committed: Fri Feb 6 10:56:04 2015 -0800

----------------------------------------------------------------------
 .../scala/org/apache/spark/ExecutorAllocationManager.scala     | 6 +++---
 docs/configuration.md                                          | 2 +-
 2 files changed, 4 insertions(+), 4 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/0a903059/core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala b/core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala
index 5d5288b..8b38366 100644
--- a/core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala
+++ b/core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala
@@ -76,15 +76,15 @@ private[spark] class ExecutorAllocationManager(
   private val maxNumExecutors = conf.getInt("spark.dynamicAllocation.maxExecutors",
     Integer.MAX_VALUE)
 
-  // How long there must be backlogged tasks for before an addition is triggered
+  // How long there must be backlogged tasks for before an addition is triggered (seconds)
   private val schedulerBacklogTimeout = conf.getLong(
-    "spark.dynamicAllocation.schedulerBacklogTimeout", 60)
+    "spark.dynamicAllocation.schedulerBacklogTimeout", 5)
 
   // Same as above, but used only after `schedulerBacklogTimeout` is exceeded
   private val sustainedSchedulerBacklogTimeout = conf.getLong(
     "spark.dynamicAllocation.sustainedSchedulerBacklogTimeout", schedulerBacklogTimeout)
 
-  // How long an executor must be idle for before it is removed
+  // How long an executor must be idle for before it is removed (seconds)
   private val executorIdleTimeout = conf.getLong(
     "spark.dynamicAllocation.executorIdleTimeout", 600)
 

http://git-wip-us.apache.org/repos/asf/spark/blob/0a903059/docs/configuration.md
----------------------------------------------------------------------
diff --git a/docs/configuration.md b/docs/configuration.md
index 4c86cb7..00e973c 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -1140,7 +1140,7 @@ Apart from these, the following properties are also available, and may be useful
 </tr>
 <tr>
   <td><code>spark.dynamicAllocation.schedulerBacklogTimeout</code></td>
-  <td>60</td>
+  <td>5</td>
   <td>
    If dynamic allocation is enabled and there have been pending tasks backlogged for more than
     this duration (in seconds), new executors will be requested. For more detail, see this

