spark-user mailing list archives

From 牛兆捷 <nzjem...@gmail.com>
Subject Re: memory size for caching RDD
Date Thu, 04 Sep 2014 06:27:15 GMT
But is it possible to make it resizable? When we don't have many RDDs to
cache, we could give some of that memory to other uses.
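For reference, a minimal sketch of how the fraction is set today (Spark 1.x; the app name and the value "0.4" below are just placeholders): the setting is read from SparkConf when the SparkContext is created, so each executor sizes its storage region once at startup and it cannot be changed afterwards.

import org.apache.spark.{SparkConf, SparkContext}

// Sketch: spark.storage.memoryFraction is fixed in SparkConf before the
// SparkContext is created; executors carve out the cache region at startup.
val conf = new SparkConf()
  .setAppName("memory-fraction-example")        // placeholder app name
  .set("spark.storage.memoryFraction", "0.4")   // example value; the default is 0.6

val sc = new SparkContext(conf)
// From here on, changing the setting has no effect on already-running executors.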


2014-09-04 13:45 GMT+08:00 Patrick Wendell <pwendell@gmail.com>:

> Changing this is not supported; it is immutable, similar to other Spark
> configuration settings.
>
> On Wed, Sep 3, 2014 at 8:13 PM, 牛兆捷 <nzjemail@gmail.com> wrote:
> > Dear all:
> >
> > Spark uses memory to cache RDDs, and the size of that memory is specified
> > by "spark.storage.memoryFraction".
> >
> > Once the Executor starts, does Spark support adjusting/resizing this
> > memory dynamically?
> >
> > Thanks.
> >
> > --
> > *Regards,*
> > *Zhaojie*
>



-- 
*Regards,*
*Zhaojie*
