spark-reviews mailing list archives

From vanzin <...@git.apache.org>
Subject [GitHub] spark pull request #20888: [SPARK-23775][TEST] Make DataFrameRangeSuite not ...
Date Fri, 13 Apr 2018 17:38:05 GMT
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20888#discussion_r181460382
  
    --- Diff: sql/core/src/test/scala/org/apache/spark/sql/DataFrameRangeSuite.scala ---
    @@ -152,22 +154,28 @@ class DataFrameRangeSuite extends QueryTest with SharedSQLContext
with Eventuall
       }
     
       test("Cancelling stage in a query with Range.") {
    +    val slices = 10
    +
         val listener = new SparkListener {
    -      override def onJobStart(jobStart: SparkListenerJobStart): Unit = {
    -        eventually(timeout(10.seconds), interval(1.millis)) {
    -          assert(DataFrameRangeSuite.stageToKill > 0)
    +      override def onTaskStart(taskStart: SparkListenerTaskStart): Unit = {
    +        eventually(timeout(10.seconds)) {
    +          assert(DataFrameRangeSuite.isTaskStarted)
             }
    -        sparkContext.cancelStage(DataFrameRangeSuite.stageToKill)
    +        sparkContext.cancelStage(taskStart.stageId)
    +        DataFrameRangeSuite.semaphore.release(slices)
    --- End diff --
    
    Then do:
    
    ```
    val lock = new Object()
    lock.synchronized { lock.wait() }
    ```
    
    Again, you just want to go to sleep waiting for an interrupt. There are countless
ways to do that.
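
A minimal, self-contained sketch of that idea (outside Spark, with a hypothetical `WaitForInterrupt` object): the worker thread parks on a monitor via `lock.wait()` and is woken by `Thread.interrupt()`, which is the same mechanism a cancelled task would rely on.

```scala
// Sketch: park a thread on a monitor until it is interrupted.
// `lock.wait()` must be called while holding the monitor, hence `synchronized`.
object WaitForInterrupt {
  def main(args: Array[String]): Unit = {
    val lock = new Object()
    @volatile var wasInterrupted = false

    val worker = new Thread(() => {
      try {
        // Sleep indefinitely until notified or interrupted.
        lock.synchronized { lock.wait() }
      } catch {
        case _: InterruptedException => wasInterrupted = true
      }
    })

    worker.start()
    Thread.sleep(100)   // give the worker time to park on the monitor
    worker.interrupt()  // simulates the cancellation wake-up
    worker.join()
    println(s"wasInterrupted=$wasInterrupted")
  }
}
```

Note that `InterruptedException` is thrown from `wait()` only when the thread is actually blocked; the `@volatile` flag just makes the result visible to the main thread after `join()`.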


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org

