spark-user mailing list archives

From Paolo Platter <paolo.plat...@agilelab.it>
Subject Executor memory
Date Wed, 04 Feb 2015 10:15:13 GMT
Hi spark,

I’m experiencing an issue with allocated executor memory.

I have a standalone cluster with a master and three workers (16 GB of RAM each).
In spark-env.sh I configured SPARK_WORKER_MEMORY=10g, and when I check the Spark UI everything looks fine:

[screenshot: Spark master UI, each worker showing 10 GB of memory]
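
For completeness, the relevant line in my spark-env.sh on each worker is the following (assuming nothing else overrides it):

export SPARK_WORKER_MEMORY=10g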

When I launch the spark-shell (it’s the same with spark-submit) I allocate 10 GB:

export SPARK_JAVA_OPTS="-Dspark.executor.memory=10g"
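
Submitting an application behaves the same way; as far as I can tell these launcher flags are equivalent ways to set it (the class and jar names below are just placeholders):

./bin/spark-shell --executor-memory 10g
./bin/spark-submit --executor-memory 10g --class com.example.MyApp myApp.jar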

[screenshot: Spark master UI, the application holding 10 GB on each worker]

As you can see, the memory is correctly allocated on the workers, but when I check the Application Detail UI I see this:

[screenshot: Application Detail UI, executors showing only slightly more than half of the 10 GB]

Why is the application not using all of the allocated memory, but only something more than 50% of it?
I can’t find a configuration option that clearly explains this behaviour.
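
The only candidates I can see are spark.storage.memoryFraction (default 0.6) and spark.storage.safetyFraction (default 0.9). If the UI were reporting only the storage portion of the heap, the numbers would roughly match:

10 GB heap x 0.6 (memoryFraction) x 0.9 (safetyFraction) ≈ 5.4 GB

Is that what the application UI is actually showing?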

Thanks

Paolo