spark-reviews mailing list archives

From dongjoon-hyun <...@git.apache.org>
Subject [GitHub] spark pull request #18555: [SPARK-21353][CORE]add checkValue in spark.intern...
Date Mon, 10 Jul 2017 03:23:00 GMT
Github user dongjoon-hyun commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126329304
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -51,29 +62,61 @@ package object config {
         ConfigBuilder(SparkLauncher.EXECUTOR_EXTRA_LIBRARY_PATH).stringConf.createOptional
     
       private[spark] val EXECUTOR_USER_CLASS_PATH_FIRST =
    -    ConfigBuilder("spark.executor.userClassPathFirst").booleanConf.createWithDefault(false)
    +    ConfigBuilder("spark.executor.userClassPathFirst")
    +      .doc("Same functionality as spark.driver.userClassPathFirst, " +
    +        "but applied to executor instances.")
    +      .booleanConf
    +      .createWithDefault(false)
     
       private[spark] val EXECUTOR_MEMORY = ConfigBuilder("spark.executor.memory")
    +    .doc("Amount of memory to use per executor process (e.g. 512, 1g, 2g, 8g).")
         .bytesConf(ByteUnit.MiB)
    +    .checkValue(v => v > 0 && v <= Long.MaxValue / (1024 * 1024),
    +      "The executor memory must be greater than 0 MiB " +
    +      s"and less than ${Long.MaxValue / (1024 * 1024)} MiB.")
         .createWithDefaultString("1g")
     
    -  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython").internal()
    -    .booleanConf.createWithDefault(false)
    +  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython")
    +    .internal()
    +    .booleanConf
    +    .createWithDefault(false)
     
    -  private[spark] val CPUS_PER_TASK = ConfigBuilder("spark.task.cpus").intConf.createWithDefault(1)
    +  private[spark] val CPUS_PER_TASK = ConfigBuilder("spark.task.cpus")
    +    .doc("Number of cores to allocate for each task. ")
    +    .intConf
    +    .checkValue(_ > 0,
    +      "Number of cores to allocate for each task must be positive.")
    +    .createWithDefault(1)
     
       private[spark] val DYN_ALLOCATION_MIN_EXECUTORS =
    -    ConfigBuilder("spark.dynamicAllocation.minExecutors").intConf.createWithDefault(0)
    +    ConfigBuilder("spark.dynamicAllocation.minExecutors")
    +      .doc("Lower bound for the number of executors if dynamic allocation is enabled.")
    +      .intConf
    --- End diff --
    
    There is an existing test case that expects a `SparkException` from the validation in [ExecutorAllocationManager](https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala#L177-L179).
    
    I think you can replace the `SparkException` with the exception your `checkValue` throws in `ExecutorAllocationManagerSuite`.
    ```
        // Min < 0
        val conf1 = conf.clone().set("spark.dynamicAllocation.minExecutors", "-1")
        intercept[SparkException] { contexts += new SparkContext(conf1) }
    ```
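
    For reference, a minimal sketch of how that test could look after this change, assuming the `checkValue` added in this PR surfaces an `IllegalArgumentException` when the invalid value is read (the exact exception type depends on how `ConfigBuilder.checkValue` reports failures):
    ```
        // Min < 0: with checkValue on spark.dynamicAllocation.minExecutors, the invalid
        // value is rejected when the config is read during SparkContext initialization,
        // so the test intercepts IllegalArgumentException instead of the SparkException
        // previously thrown by ExecutorAllocationManager.
        val conf1 = conf.clone().set("spark.dynamicAllocation.minExecutors", "-1")
        intercept[IllegalArgumentException] { contexts += new SparkContext(conf1) }
    ```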


