spark-issues mailing list archives

From "Sean Owen (JIRA)" <>
Subject [jira] [Updated] (SPARK-5772) spark-submit --driver-memory not correctly taken into account
Date Thu, 12 Feb 2015 13:31:12 GMT


Sean Owen updated SPARK-5772:
    Component/s:     (was: Spark Core)
                 Spark Submit
       Priority: Minor  (was: Major)

{{spark-submit}} handles JVM properties for the driver specially when they are configured
in the config file. The same special handling is not applied to {{--driver-memory}}. My hunch
is that this is intentionally unsupported, but it is probably at least worth a warning.
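
Per the report quoted below, the config-file route did size the driver heap correctly
on 1.2. A minimal sketch of that workaround (the value is the reporter's; the file
lives under the Spark conf directory):

    # conf/spark-defaults.conf -- read by the spark-submit launcher
    # before the driver JVM starts, so the driver's -Xmx can be set from it
    spark.driver.memory   83971m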

> spark-submit --driver-memory not correctly taken into account
> -------------------------------------------------------------
>                 Key: SPARK-5772
>                 URL:
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 1.2.0, 1.2.1
>         Environment: Debian 7
>            Reporter: Guillaume Charhon
>            Priority: Minor
> spark-submit --driver-memory not taken correctly into account
> The spark.driver.memory setting does not seem to be correctly taken into account. I came
> across this issue because I hit a java.lang.OutOfMemoryError: Java heap space while
> training a random forest.
> I did all my tests with 1 master and 4 worker nodes. All machines have 16 cores and 106
> GB of RAM, running Debian 7 on Google Compute Engine.
> When I hit the memory error, I noticed that the master had only 265.4 MB registered on
> the Executors page of the Web UI. Worker machines have 42.4 GB.
> on the command line:
> ../hadoop/spark-install/bin/spark-submit --driver-memory=83971m
> --> does NOT work (master memory is not correct)
> in spark-defaults.conf:
> spark.driver.memory 83971m
> --> works
> in
> -->works
> The spark.driver.memory is displayed with the correct value (83971m) on the Web UI
> (http://spark-m:4040/environment/) for all the tests.
> However, we can see on the Executors page of the Web UI (http://spark-m:4040/executors/)
> that the memory is not correctly allocated when the option is provided on the
> spark-submit command line.

This message was sent by Atlassian JIRA

