spark-user mailing list archives

From Saiph Kappa <>
Subject How to submit spark job to YARN from scala code
Date Thu, 17 Dec 2015 16:50:03 GMT

Since it is not currently possible to submit a Spark job in cluster deploy mode to a cluster running standalone (this deploy mode cannot be specified within the code), can I do it with YARN?

I tried to do something like this (in Scala):


... // Client object - main method
System.setProperty("SPARK_YARN_MODE", "true")
val sparkConf = new SparkConf()
try {
  val args = new ClientArguments(argStrings, sparkConf)
  new Client(args, sparkConf).run()
} catch {
  case e: Exception => {
    Console.err.println(e.getMessage)
    System.exit(1)
  }
}

However, it is not possible to create a new instance of Client, since
org.apache.spark.deploy.yarn.Client is private to the spark package.

Is there any way I can submit Spark jobs from code in cluster mode,
without invoking the spark-submit script myself?
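For reference, one approach (not discussed above, and offered only as a sketch) is Spark's launcher API, org.apache.spark.launcher.SparkLauncher, which is public and available since Spark 1.4. The jar path and main class below are placeholders:

```scala
import org.apache.spark.launcher.SparkLauncher

// Sketch: submit an application to YARN in cluster mode from Scala code
// using the public launcher API. Paths and class names are placeholders.
object SubmitToYarn {
  def main(args: Array[String]): Unit = {
    val process = new SparkLauncher()
      .setAppResource("/path/to/my-app.jar")  // placeholder application jar
      .setMainClass("com.example.MyApp")      // placeholder main class
      .setMaster("yarn")
      .setDeployMode("cluster")
      .launch()                               // spawns the submission as a child process
    process.waitFor()                         // block until the submission process exits
  }
}
```

Note the caveat: SparkLauncher still runs spark-submit under the hood as a child process, so it avoids shelling out by hand but does not bypass the script entirely.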
