spark-user mailing list archives

From Paolo Platter <>
Subject Executor memory
Date Wed, 04 Feb 2015 10:15:13 GMT
Hi all,

I’m experiencing an issue with allocated executor memory.

I have a standalone cluster with a master and three workers (16 GB of RAM each).
I configured WORKER_MEMORY = 10g, and when I check the Spark UI everything looks as expected:


When I launch the spark-shell (it’s the same with app submit), I allocate 10 GB:

export SPARK_JAVA_OPTS="-Dspark.executor.memory=10g"
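For context, here is a hedged sketch of an equivalent way to request the same executor memory directly on a SparkConf (SPARK_JAVA_OPTS is deprecated in Spark 1.x in favour of configuration properties); the app name is made up for illustration:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch: set executor memory programmatically instead of via
// SPARK_JAVA_OPTS. Same value as the -D flag above.
val conf = new SparkConf()
  .setAppName("memory-test")            // hypothetical app name
  .set("spark.executor.memory", "10g")
val sc = new SparkContext(conf)
```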


As you can see, memory is correctly allocated on the workers, but when I check the Application
UI details I get this data:


Why is the application not using all of the allocated memory, but just something more than 50%?
I’m not able to find any specific configuration for this behaviour.
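In case it helps, this is how I can confirm what the driver actually registered, from the spark-shell prompt (sc.getConf is the SparkConf the context was created with; the fallback string is just for display):

```scala
// From the spark-shell prompt: print the executor memory setting the
// driver actually received, or a placeholder if it was never set.
println(sc.getConf.get("spark.executor.memory", "not set"))
```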

