spark-user mailing list archives

From Sai Prasanna <ansaiprasa...@gmail.com>
Subject SPARK Shell RDD reuse
Date Fri, 18 Apr 2014 07:07:20 GMT
Hi All,

In the interactive shell the Spark context remains the same. So if I run a query
multiple times, will the RDDs created by previous runs be reused in the
subsequent runs and not recomputed until I exit and restart the shell?

Or is there a way to programmatically force an RDD to be reused or recomputed,
depending on whether it already exists?
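For illustration, something roughly like the pattern below is what I have in mind
(just a sketch against the shell's pre-defined SparkContext `sc`; the file path
and variable names are placeholders):

    // mark the RDD for in-memory storage so later actions in this shell session reuse it
    val logs = sc.textFile("hdfs:///tmp/sample.log")   // placeholder path
    logs.cache()

    logs.filter(_.contains("ERROR")).count()           // first action computes and populates the cache
    logs.filter(_.contains("WARN")).count()            // subsequent actions read the cached partitions

    // drop the cached copies so the next action recomputes from the source
    logs.unpersist()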

Thanks!
