spark-user mailing list archives

From "Starch, Michael D (398M)" <>
Subject Reusing Spark Functions
Date Wed, 14 Oct 2015 17:18:31 GMT

Is a Function object in Spark reused on a given executor, or is it sent and deserialized with
each new task?

On my project, we have functions that incur a very large setup cost but can then be called
many times.  Currently, I am using object deserialization to run this intensive setup.  I
am wondering if this function is reused (within the context of the executor), or am I deserializing
this object over and over again for each task sent to a given worker.

Are there other ways to share objects between tasks on the same executor?
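[A common answer to this kind of question, not from the original thread: Spark serializes the
closure into each task, so per-task deserialization of the function object is the default.
One standard workaround is to hold the expensive state in a lazily initialized per-JVM
singleton, so setup runs once per executor JVM and every task on that executor reuses it.
The sketch below is plain Java with a hypothetical `ExpensiveResource` standing in for the
costly setup; it only demonstrates the once-per-JVM initialization pattern, not Spark itself.]

```java
// Sketch (an assumption, not from the original thread): the lazy-singleton
// holder idiom. The expensive setup runs once per JVM; every "task" that
// lands on the same JVM reuses the same instance.
public class SharedSetupDemo {

    static final class ExpensiveResource {
        // Counts how many times the costly setup actually ran.
        static int initCount = 0;

        // Holder idiom: initialized on first access, exactly once per JVM.
        private static final ExpensiveResource INSTANCE = new ExpensiveResource();

        private ExpensiveResource() {
            initCount++;           // stands in for the very large setup cost
        }

        static ExpensiveResource get() {
            return INSTANCE;
        }

        int apply(int x) {         // stands in for the reusable function body
            return x * 2;
        }
    }

    public static void main(String[] args) {
        int sum = 0;
        // Simulate many tasks being scheduled on the same executor JVM.
        for (int task = 0; task < 100; task++) {
            sum += ExpensiveResource.get().apply(task);
        }
        System.out.println("initializations=" + ExpensiveResource.initCount);
        System.out.println("sum=" + sum);
    }
}
```

[Other options worth checking in the Spark docs: broadcast variables for read-only
shared data, and `mapPartitions` to amortize setup over a whole partition instead
of per record.]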

Many thanks,

