spark-user mailing list archives

From "Starch, Michael D (398M)" <Michael.D.Sta...@jpl.nasa.gov>
Subject Reusing Spark Functions
Date Wed, 14 Oct 2015 17:18:31 GMT
All,

Is a Function object in Spark reused on a given executor, or is it sent and deserialized
with each new task?

On my project, we have functions that incur a very large setup cost but could then be called
many times.  Currently, I am relying on object deserialization to run this intensive setup, and
I am wondering whether this function is reused (within the context of the executor) or whether
I am deserializing this object over and over again for each task sent to a given worker.

Are there other ways to share objects between tasks on the same executor?
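One common pattern for this, assuming the standard model where each executor is a single JVM that runs many tasks, is a lazily initialized singleton held in a static field: the expensive setup runs once per JVM (i.e., once per executor), and every task that runs there shares the result. This is only a sketch; the class and method names are hypothetical, not part of any Spark API:

```java
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical sketch: a lazily initialized per-JVM singleton. Each Spark
// executor is a single JVM, so state held in a static field is created once
// and then shared by every task that runs in that executor.
public class ExpensiveResource {
    // Counts how many times the expensive setup actually ran.
    public static final AtomicInteger setupCount = new AtomicInteger();

    private static volatile ExpensiveResource instance;

    private ExpensiveResource() {
        // ...expensive setup would go here; it runs at most once per JVM.
        setupCount.incrementAndGet();
    }

    // Double-checked locking so that concurrent tasks in the same
    // executor still trigger the setup only once.
    public static ExpensiveResource get() {
        if (instance == null) {
            synchronized (ExpensiveResource.class) {
                if (instance == null) {
                    instance = new ExpensiveResource();
                }
            }
        }
        return instance;
    }
}
```

A Spark Function would then call ExpensiveResource.get() inside its call() method rather than holding the heavy state as a serialized field, so the closure that ships with each task stays small. Note this shares state only among tasks on the same executor; nothing is shared across executors.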

Many thanks,

Michael
---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org

