spark-dev mailing list archives

From "Liu, Raymond" <raymond....@intel.com>
Subject RE: memory size for caching RDD
Date Thu, 04 Sep 2014 06:32:55 GMT
You don’t need to. The memory is not statically allocated to the RDD cache; the fraction is only
an upper limit. Memory that the RDD cache does not use remains available for other usage, except
for the portions governed by the other memoryFraction settings, e.g. spark.shuffle.memoryFraction,
which likewise sets an upper limit.
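
For illustration, a minimal sketch of setting these limits on a SparkConf (Spark 1.x-era
settings; the app name is arbitrary and the values shown are the defaults, not recommendations):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.storage.StorageLevel

// Both fractions are fixed upper limits on executor heap usage,
// set once when the application is configured.
val conf = new SparkConf()
  .setAppName("memory-fraction-example")
  .set("spark.storage.memoryFraction", "0.6")   // cap on cached RDD blocks (default 0.6)
  .set("spark.shuffle.memoryFraction", "0.2")   // cap on shuffle buffers (default 0.2)
val sc = new SparkContext(conf)

// Caching claims memory only as blocks are materialized; heap under the
// storage cap that holds no cached blocks stays available to running tasks.
val data = sc.parallelize(1 to 1000000).persist(StorageLevel.MEMORY_ONLY)
data.count()  // the first action materializes the cached blocks

Note that in the 1.x code the storage cap is further scaled by a safety fraction
(spark.storage.safetyFraction, 0.9 by default), so a 4 GB executor with the defaults caps
cached blocks at roughly 4096 MB × 0.6 × 0.9 ≈ 2.2 GB.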

Best Regards,
Raymond Liu

From: 牛兆捷 [mailto:nzjemail@gmail.com]
Sent: Thursday, September 04, 2014 2:27 PM
To: Patrick Wendell
Cc: user@spark.apache.org; dev@spark.apache.org
Subject: Re: memory size for caching RDD

But is it possible to make it resizable? When we don't have many RDDs to cache, we could give
some of that memory to other uses.

2014-09-04 13:45 GMT+08:00 Patrick Wendell <pwendell@gmail.com>:
Changing this is not supported; it is immutable, like other Spark
configuration settings.

On Wed, Sep 3, 2014 at 8:13 PM, 牛兆捷 <nzjemail@gmail.com> wrote:
> Dear all:
>
> Spark uses memory to cache RDDs, and the size of that memory is specified by
> "spark.storage.memoryFraction".
>
> Once the Executor starts, does Spark support adjusting/resizing the memory
> size of this part dynamically?
>
> Thanks.
>
> --
> *Regards,*
> *Zhaojie*



--
Regards,
Zhaojie
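
As a side note on inspecting the cap at runtime, a small sketch (the helper name
printCacheCapacity is made up) using SparkContext.getExecutorMemoryStatus, which reports,
per executor, the maximum memory available for caching and how much of it remains:

import org.apache.spark.SparkContext

// Hypothetical helper: report each executor's cache capacity and free space.
def printCacheCapacity(sc: SparkContext): Unit =
  sc.getExecutorMemoryStatus.foreach { case (executor, (max, remaining)) =>
    println(f"$executor: cache cap ${max / 1048576.0}%.0f MB, free ${remaining / 1048576.0}%.0f MB")
  }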
