spark-user mailing list archives

From Soumya Simanta <soumya.sima...@gmail.com>
Subject Re: Problem with giving memory to executors on YARN
Date Sat, 20 Sep 2014 12:21:55 GMT
There are 128 cores on each box. Yes, there are other applications running on
the cluster. YARN is assigning two containers to my application. I'll
investigate this a little more. PS: I'm new to YARN.
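For reference, here's roughly how I plan to check the allocation - just a
sketch assuming the stock Hadoop YARN command-line tools are on the path;
<application_id> below is a placeholder for my application's ID:

    # list running YARN applications to find the application ID
    yarn application -list

    # show one application's status, including its state and tracking URL
    # (the tracking URL leads to the YARN UI mentioned below)
    yarn application -status <application_id>

From inside the Spark shell, sc.getConf.getAll.foreach(println) should also
confirm which executor memory/core settings actually took effect.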



On Fri, Sep 19, 2014 at 4:49 PM, Vipul Pandey <vipandey@gmail.com> wrote:

> How many cores do you have in your boxes?
> It looks like you are assigning 32 cores "per" executor - is that what you
> want? Are there other applications running on the cluster? You might want
> to check the YARN UI to see how many containers are getting allocated to
> your application.
>
>
> On Sep 19, 2014, at 1:37 PM, Soumya Simanta <soumya.simanta@gmail.com>
> wrote:
>
> I'm launching a Spark shell with the following parameters
>
> ./spark-shell --master yarn-client --executor-memory 32g --driver-memory
> 4g --executor-cores 32 --num-executors 8
>
> but when I look at the Spark UI, it shows only 209.3 GB of total memory.
>
>
> Executors (10)
>
>    - *Memory:* 55.9 GB Used (209.3 GB Total)
>
> This is a 10-node YARN cluster where each node has 48 GB of memory.
>
> Any idea what I'm missing here?
>
> Thanks
> -Soumya
