spark-reviews mailing list archives

From GitBox <...@apache.org>
Subject [GitHub] [spark] venkata91 commented on a change in pull request #28874: [SPARK-32036] Replace references to blacklist/whitelist language with more appropriate terminology, excluding the blacklisting feature.
Date Sat, 20 Jun 2020 01:28:16 GMT

venkata91 commented on a change in pull request #28874:
URL: https://github.com/apache/spark/pull/28874#discussion_r443089270



##########
File path: core/src/test/scala/org/apache/spark/ui/UISeleniumSuite.scala
##########
@@ -48,24 +48,24 @@ import org.apache.spark.util.CallSite
 
 private[spark] class SparkUICssErrorHandler extends DefaultCssErrorHandler {
 
-  private val cssWhiteList = List("bootstrap.min.css", "vis-timeline-graph2d.min.css")
+  private val cssExcludeList = List("bootstrap.min.css", "vis-timeline-graph2d.min.css")

Review comment:
       Also, it seems that in some cases we use `exclude` for both `whitelist` and `blacklist`:
here `exclude` replaces `whiteList`, while [here](https://github.com/apache/spark/blob/8f414bc6b4eeb59203ecb33c26c762a57bf5429e/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosClusterScheduler.scala#L542)
it replaces `blackList`. In general, I prefer `allowedList` for `whitelist` and `denyList`, `rejectList`,
`stopList`, etc. for `blacklist`; that makes the intent easier to comprehend quickly. I understand it's
hard to use the same word everywhere because of the context.
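
The distinction the comment draws can be made concrete with a small sketch. The names below (`ListNamingSketch`, `allowedCssFiles`, `deniedAgents`) are illustrative only and do not come from the PR; the point is simply that allow-list and deny-list semantics read more clearly under distinct names than under a shared `exclude`.

```scala
// Illustrative sketch only; names are hypothetical, not from the Spark codebase.
object ListNamingSketch {

  // Allow-list semantics: only entries on the list are accepted
  // (what the old `whitelist` variables expressed).
  private val allowedCssFiles = List("bootstrap.min.css", "vis-timeline-graph2d.min.css")

  def isAllowed(fileName: String): Boolean =
    allowedCssFiles.exists(allowed => fileName.endsWith(allowed))

  // Deny-list semantics: entries on the list are rejected
  // (what the old `blacklist` variables expressed).
  private val deniedAgents = List("agent-1", "agent-2")

  def isDenied(agent: String): Boolean =
    deniedAgents.contains(agent)
}
```

With a single name like `cssExcludeList`, a reader has to check the call sites to know which of the two behaviors is meant.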

##########
File path: core/src/test/scala/org/apache/spark/ui/UISeleniumSuite.scala
##########
@@ -48,24 +48,24 @@ import org.apache.spark.util.CallSite
 
 private[spark] class SparkUICssErrorHandler extends DefaultCssErrorHandler {
 
-  private val cssWhiteList = List("bootstrap.min.css", "vis-timeline-graph2d.min.css")
+  private val cssExcludeList = List("bootstrap.min.css", "vis-timeline-graph2d.min.css")

Review comment:
       Same here: instead of `exclude`, how about `allowed`?
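
For concreteness, the renamed field might read roughly as below. The diff only shows the list itself, so the error-skipping logic and the `CssErrorSink`/`handleCssError` callback here are assumptions sketched for illustration, not the handler's real HtmlUnit interface; only the list contents come from the diff.

```scala
// Hypothetical sketch of the `allowed` naming suggested above.
// `CssErrorSink` and `handleCssError` are stand-ins, not HtmlUnit's real API.
trait CssErrorSink {
  def handleCssError(sourceUri: String, message: String): Unit
}

class AllowListCssErrorHandler extends CssErrorSink {

  // CSS files whose parse errors the UI tests deliberately tolerate.
  private val cssAllowList = List("bootstrap.min.css", "vis-timeline-graph2d.min.css")

  private def isAllowed(uri: String): Boolean =
    cssAllowList.exists(allowed => uri.endsWith(allowed))

  override def handleCssError(sourceUri: String, message: String): Unit = {
    // Assumed behavior: only report errors for files not on the allow list.
    if (!isAllowed(sourceUri)) {
      throw new IllegalStateException(s"CSS error in $sourceUri: $message")
    }
  }
}
```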

##########
File path: R/pkg/tests/fulltests/test_sparkSQL.R
##########
@@ -3921,14 +3921,14 @@ test_that("No extra files are created in SPARK_HOME by starting session and maki
   # before creating a SparkSession with enableHiveSupport = T at the top of this test file
   # (filesBefore). The test here is to compare that (filesBefore) against the list of files before
   # any test is run in run-all.R (sparkRFilesBefore).
-  # sparkRWhitelistSQLDirs is also defined in run-all.R, and should contain only 2 whitelisted dirs,
+  # sparkRIncludedSQLDirs is also defined in run-all.R, and should contain only 2 included dirs,

Review comment:
       Does it make sense to have `allowed` here instead of `included`?




