spark-user mailing list archives

From mohan s <iammoha...@gmail.com>
Subject Re: Question about executor memory setting
Date Thu, 29 Sep 2016 17:14:48 GMT
Hi

Kindly go through the link below. It gives a good explanation of Spark memory allocation.
https://www.slideshare.net/cloudera/top-5-mistakes-to-avoid-when-writing-apache-spark-applications?from_m_app=ios

Regards

Mohan.s

> On 28-Sep-2016, at 7:57 AM, Dogtail L <spark.rui92@gmail.com> wrote:
> 
> Hi all,
> 
> May I ask a question about executor memory settings? I was running PageRank with an
> input size of 2.8GB on one workstation for testing. I gave PageRank one executor.
> 
> In case 1, I set --executor-cores to 4 and --executor-memory to 1GB; the stage (stage
> 2) completion time is 14 min. The detailed stage info is below:
> 
> 
> 
> In case 2, I set --executor-cores to 4 and --executor-memory to 6GB; the stage (stage
> 2) completion time is 34 min. The detailed stage info is below:
> 
> 
> I am totally confused: why is the stage completion time more than twice as long when
> executor-memory is larger? From the web UI, I found that when executor memory is 6GB,
> the shuffle spill (Disk) per task is smaller, which means fewer IO operations, yet the
> task completion time is longer. Could anyone give me some hints? Great thanks!
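For reference, a rough sketch of how Spark's unified memory model sizes the usable heap in the two cases above. The 300 MB reserved amount and the 0.6 value for spark.memory.fraction are assumptions based on Spark 2.x defaults (they differ in earlier versions); note also that a larger JVM heap tends to mean longer garbage-collection pauses, which is one possible explanation for the slowdown.

```python
# Sketch of Spark 2.x unified memory sizing (assumed defaults:
# 300 MB reserved heap, spark.memory.fraction = 0.6).
RESERVED_MB = 300
MEMORY_FRACTION = 0.6  # spark.memory.fraction default in Spark 2.x

def unified_memory_mb(executor_memory_mb):
    """Approximate memory available for execution + storage combined."""
    return (executor_memory_mb - RESERVED_MB) * MEMORY_FRACTION

# The 1GB and 6GB cases from the question above.
for heap_mb in (1024, 6144):
    print(f"{heap_mb} MB heap -> ~{unified_memory_mb(heap_mb):.0f} MB usable")
# 1024 MB heap -> ~434 MB usable
# 6144 MB heap -> ~3506 MB usable
```

So the 6GB executor does get far more usable memory (hence the smaller shuffle spill the web UI reports), but the usable figure alone does not capture GC cost, which grows with heap size.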
