spark-reviews mailing list archives

From srowen <...@git.apache.org>
Subject [GitHub] spark issue #19529: [SPARK-22308] Support alternative unit testing styles in...
Date Sun, 29 Oct 2017 10:32:07 GMT
Github user srowen commented on the issue:

    https://github.com/apache/spark/pull/19529
  
    @nkronenfeld @gatorsmile I think this has been failing the master build (Maven only) for a few days:
    
    https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-maven-hadoop-2.6/3981/consoleFull
    
    ```
    SQLQuerySuite:
    *** RUN ABORTED ***
      org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
    org.apache.spark.sql.test.GenericFunSpecSuite.<init>(GenericFunSpecSuite.scala:28)
    sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    java.lang.reflect.Constructor.newInstance(Constructor.java:422)
    java.lang.Class.newInstance(Class.java:442)
    org.scalatest.tools.DiscoverySuite$.getSuiteInstance(DiscoverySuite.scala:66)
    org.scalatest.tools.DiscoverySuite$$anonfun$1.apply(DiscoverySuite.scala:38)
    org.scalatest.tools.DiscoverySuite$$anonfun$1.apply(DiscoverySuite.scala:37)
    scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
    scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
    scala.collection.Iterator$class.foreach(Iterator.scala:893)
    scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
    scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    scala.collection.AbstractIterable.foreach(Iterable.scala:54)
    scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
    scala.collection.AbstractTraversable.map(Traversable.scala:104)
    org.scalatest.tools.DiscoverySuite.<init>(DiscoverySuite.scala:37)
    org.scalatest.tools.Runner$.genDiscoSuites$1(Runner.scala:1165)
    org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1250)
    ```
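
    As an aside, the check in that error can be silenced by setting `spark.driver.allowMultipleContexts=true`, but that only hides the symptom; two live contexts in one JVM is still the underlying problem. A minimal sketch of setting it on a plain driver (illustration only, not the test harness):

    ```scala
    import org.apache.spark.{SparkConf, SparkContext}

    // Suppresses the "Only one SparkContext may be running" check; it does not
    // make multiple contexts in one JVM safe, so it is a workaround, not a fix.
    val conf = new SparkConf()
      .setMaster("local[2]")
      .setAppName("allow-multiple-contexts-demo")
      .set("spark.driver.allowMultipleContexts", "true")
    val sc = new SparkContext(conf)
    ```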
    
    I suspect the tests execute in a slightly different way under Maven, and that exposes a problem with how the SQL tests use and reuse SparkContexts. It's likely that this change caused it.
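
    The trace shows the context being created in `GenericFunSpecSuite.<init>` while ScalaTest is still instantiating suites during discovery, so any context left running by an earlier suite trips the check. Roughly the pattern I'd expect instead (a sketch only, with a hypothetical suite name and ScalaTest's `FunSpec`, not the actual shared-session traits the SQL tests use):

    ```scala
    import org.apache.spark.sql.SparkSession
    import org.scalatest.FunSpec

    // Hypothetical suite illustrating lazy, reusable session setup.
    class ExampleFunSpecSuite extends FunSpec {
      // Lazy, so no SparkContext is created while ScalaTest is merely
      // discovering and instantiating suites; getOrCreate() reuses an
      // existing session/context if one is already running in the JVM.
      private lazy val spark: SparkSession = SparkSession.builder()
        .master("local[2]")
        .appName("example-funspec-suite")
        .getOrCreate()

      describe("a simple query") {
        it("returns the expected row count") {
          assert(spark.range(10).count() == 10)
        }
      }
      // Teardown is deliberately omitted here: stopping a context that
      // other suites share would cause the opposite failure mode.
    }
    ```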


---
