spark-user mailing list archives

From Marcelo Vanzin <van...@cloudera.com>
Subject Re: Question about yarn-cluster mode and spark.driver.allowMultipleContexts
Date Wed, 02 Dec 2015 18:28:41 GMT
On Tue, Dec 1, 2015 at 9:43 PM, Anfernee Xu <anfernee.xu@gmail.com> wrote:
> But I have a single server(JVM) that is creating SparkContext, are you
> saying Spark supports multiple SparkContext in the same JVM? Could you
> please clarify on this?

I'm confused. Nothing you said so far requires multiple contexts. From
your original message:

> I have a long running backend server where I will create a short-lived Spark job

You can have a single SparkContext and submit multiple jobs to it. And
that works regardless of cluster manager or deploy mode.

-- 
Marcelo


