spark-dev mailing list archives

From Prabhu Joseph <prabhujose.ga...@gmail.com>
Subject Concurrency does not improve for Spark Jobs with Same Spark Context
Date Fri, 19 Feb 2016 05:51:35 GMT
Hi All,

   When running concurrent Spark jobs on YARN (Spark 1.5.2) that share a
single SparkContext, the jobs take noticeably longer to complete than when
each job runs with its own SparkContext. The jobs are submitted from
separate threads.
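For reference, the submission pattern described above can be sketched roughly as follows. This is a minimal sketch, not the actual test code: the job body (a parallelize/count over dummy data), the app name, and the thread-pool size are illustrative assumptions.

```scala
import java.util.concurrent.{Executors, TimeUnit}

import org.apache.spark.{SparkConf, SparkContext}

object ConcurrentJobs {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("concurrent-jobs-test")
    val sc = new SparkContext(conf) // single SparkContext shared by all jobs
    val pool = Executors.newFixedThreadPool(3)

    // Submit three independent jobs from separate threads against the
    // shared context (placeholder workload standing in for the real jobs).
    (1 to 3).foreach { i =>
      pool.submit(new Runnable {
        def run(): Unit = {
          val n = sc.parallelize(1 to 1000000).map(_ * 2).count()
          println(s"job $i finished with count $n")
        }
      })
    }

    pool.shutdown()
    pool.awaitTermination(Long.MaxValue, TimeUnit.SECONDS)
    sc.stop()
  }
}
```

Note that within a single SparkContext the default job scheduler is FIFO, so jobs submitted from different threads queue behind one another for resources; Spark's FAIR scheduling mode (`spark.scheduler.mode=FAIR`) is the documented way to let such jobs share executors more evenly.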

Test Case:

    A.  3 Spark jobs submitted serially
    B.  3 Spark jobs submitted concurrently, each with its own SparkContext
    C.  3 Spark jobs submitted concurrently with a shared SparkContext
    D.  3 Spark jobs submitted concurrently with a shared SparkContext and
triple the resources

A and B take equal time, but C and D take 2-3 times longer than A, which
suggests that concurrency does not improve with a shared SparkContext.
[Spark Job Server]

Thanks,
Prabhu Joseph
