spark-issues mailing list archives

From "holdenk (JIRA)" <>
Subject [jira] [Created] (SPARK-10812) Spark Hadoop Util does not support stopping a non-yarn Spark Context & starting a Yarn spark context.
Date Thu, 24 Sep 2015 21:25:04 GMT
holdenk created SPARK-10812:

             Summary: Spark Hadoop Util does not support stopping a non-YARN SparkContext & starting a YARN SparkContext.
                 Key: SPARK-10812
             Project: Spark
          Issue Type: Bug
          Components: YARN
            Reporter: holdenk
            Priority: Minor

While this is likely not a huge issue for real production systems, it can affect test suites that set up a SparkContext, tear it down, and then stand up a new SparkContext with a different master (e.g. some tests in local mode and some in YARN mode). Discovered during work on spark-testing-base against Spark 1.4.1, but the logic that triggers it appears to be present in master as well (see the SparkHadoopUtil object). A valid workaround for users encountering this issue is to fork a separate JVM, but that can be heavyweight.
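A minimal sketch of the failing sequence described above (hypothetical repro, not from the original report; it assumes a reachable YARN cluster and Spark 1.4.x-era `yarn-client` master syntax):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object YarnAfterLocalRepro {
  def main(args: Array[String]): Unit = {
    // Phase 1: a local-mode context. This initializes the SparkHadoopUtil
    // singleton with the plain (non-YARN) implementation.
    val local = new SparkContext(
      new SparkConf().setMaster("local[2]").setAppName("local-phase"))
    local.stop()

    // Phase 2: a YARN-mode context in the same JVM. During submission,
    // YarnSparkHadoopUtil$.get casts the cached SparkHadoopUtil instance to
    // YarnSparkHadoopUtil, which fails with a ClassCastException because the
    // singleton was already created for the local-mode context above.
    val yarn = new SparkContext(
      new SparkConf().setMaster("yarn-client").setAppName("yarn-phase"))
    yarn.stop()
  }
}
```

Forking a fresh JVM between the two phases avoids the stale singleton, which is why that workaround succeeds despite its overhead.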

[info] SampleMiniClusterTest:
[info] Exception encountered when attempting to run a suite with class name: com.holdenkarau.spark.testing.SampleMiniClusterTest
*** ABORTED ***
[info]   java.lang.ClassCastException: org.apache.spark.deploy.SparkHadoopUtil cannot be cast to org.apache.spark.deploy.yarn.YarnSparkHadoopUtil
[info]   at org.apache.spark.deploy.yarn.YarnSparkHadoopUtil$.get(YarnSparkHadoopUtil.scala:163)
[info]   at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:257)
[info]   at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:561)
[info]   at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:115)
[info]   at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:57)
[info]   at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:141)
[info]   at org.apache.spark.SparkContext.<init>(SparkContext.scala:497)
[info]   at com.holdenkarau.spark.testing.SharedMiniCluster$class.setup(SharedMiniCluster.scala:186)
[info]   at com.holdenkarau.spark.testing.SampleMiniClusterTest.setup(SampleMiniClusterTest.scala:26)
[info]   at com.holdenkarau.spark.testing.SharedMiniCluster$class.beforeAll(SharedMiniCluster.scala:103)

This message was sent by Atlassian JIRA
