spark-user mailing list archives

From: Antony Mayi <antonym...@yahoo.com.INVALID>
Subject: java.lang.OutOfMemoryError: GC overhead limit exceeded
Date: Tue, 27 Jan 2015 20:43:38 GMT
Hi,
I am using spark.yarn.executor.memoryOverhead=8192, yet my executors keep crashing with this error.
Does that mean I genuinely don't have enough RAM, or is this a matter of config tuning?
Other config options used:
spark.storage.memoryFraction=0.3
SPARK_EXECUTOR_MEMORY=14G

I am running Spark 1.2.0 as yarn-client on a cluster of 10 nodes (the workload is ALS trainImplicit on a ~15GB dataset).
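For reference, a minimal sketch of how these settings fit together with the MLlib ALS.trainImplicit API. The class name, input path, CSV layout, and ALS rank/iteration values below are illustrative placeholders, not the exact values from my job:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.mllib.recommendation.{ALS, Rating}

    // Launched with something like (settings as quoted above):
    //   spark-submit --master yarn-client --executor-memory 14G \
    //     --conf spark.yarn.executor.memoryOverhead=8192 \
    //     --conf spark.storage.memoryFraction=0.3 \
    //     --class AlsImplicitSketch als-job.jar
    object AlsImplicitSketch {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("als-implicit"))

        // Build Rating records from the input dataset; the path and the
        // user,item,count CSV layout are placeholders for illustration.
        val ratings = sc.textFile("hdfs:///path/to/ratings").map { line =>
          val Array(user, item, count) = line.split(",")
          Rating(user.toInt, item.toInt, count.toDouble)
        }

        // rank = 10, iterations = 10 are illustrative values only
        val model = ALS.trainImplicit(ratings, 10, 10)

        println("user feature vectors: " + model.userFeatures.count())
        sc.stop()
      }
    }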
Thanks for any ideas,
Antony