livy-user mailing list archives

From Junaid Nasir <>
Subject Session taking all the available resources even with number of cores specified
Date Thu, 11 Jan 2018 10:22:37 GMT
Hi everyone,

I am using Livy 0.4 with a Spark 2.1.0 standalone cluster. I can create sessions
and run jobs, but one session takes up all the available resources. I have tried
setting executorCores and numExecutors, as well as the conf options shown below.
This command works fine when running a session from the command line:

                  ./spark-2.1.0/bin/pyspark --master spark://master:7077 --executor-cores 2 --num-executors 1 --total-executor-cores 4

POST request to livy:8998/sessions:

                  {
                      "kind": "pyspark",
                      "proxyUser": "root",
                      "conf": {
                          "": ",,",
                          "spark.executor.cores": 2,
                          "": 2,
                          "livy.spark.driver.cores": 2,
                          "livy.spark.executor.cores": 2,
                          "livy.spark.executor.instances": 1
                      },
                      "executorMemory": "1G",
                      "executorCores": 2,
                      "numExecutors": 1,
                      "driverCores": 1,
                      "driverMemory": "1G"
                  }
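For what it's worth, a sketch of the request body I would expect to work: in standalone mode an application grabs every available core unless spark.cores.max is set, and the --total-executor-cores flag that works on the command line maps to exactly that property. The "4" cap below is an assumption matching the CLI example above, not something Livy documents as a session field:

```python
import json

# Hypothetical session payload: cap this app at 4 cores total so other
# sessions can share the standalone cluster. In standalone mode an app
# takes all cores unless spark.cores.max is set; --total-executor-cores
# on the pyspark command line sets the same property.
payload = {
    "kind": "pyspark",
    "proxyUser": "root",
    "conf": {
        "spark.cores.max": "4",       # total cores across all executors (assumed cap)
        "spark.executor.cores": "2",  # cores per executor
    },
    "executorMemory": "1G",
    "executorCores": 2,
    "numExecutors": 1,
    "driverCores": 1,
    "driverMemory": "1G",
}
print(json.dumps(payload, indent=2))
```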


Is there any configuration I can use to limit the cores, so that I can run
multiple sessions on the same cluster?