livy-user mailing list archives

From: Junaid Nasir <jna...@an10.io>
Subject: Session taking all the available resources even with number of cores specified
Date: Thu, 11 Jan 2018 10:22:37 GMT
Hi everyone,

I am using Livy 0.4 with a Spark 2.1.0 standalone cluster. I can create sessions
and run jobs, but a single session takes up all the available resources. I have tried
setting executorCores and numExecutors, as well as spark.total.executor.cores.
This command works fine when starting a session from the command line:

    ./spark-2.1.0/bin/pyspark --master spark://master:7077 --executor-cores 2 --num-executors 1 --total-executor-cores 4
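
For reference, spark-submit translates those flags into the Spark properties below; I am
assuming, without having verified it, that the same property names are what belong in the
Livy session "conf":

    # Assumed property equivalents of the flags above (spark-defaults.conf style)
    # --executor-cores 2
    spark.executor.cores       2
    # --num-executors 1 (mainly honoured on YARN)
    spark.executor.instances   1
    # --total-executor-cores 4 (caps the total cores one application takes on a standalone master)
    spark.cores.max            4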

POST request to livy:8998/sessions:

                  {    "kind": "pyspark",    "proxyUser": "root",    "conf": {        "spark.cassandra.connection.host":
"10.128.1.1,10.128.1.2,10.128.1.3",        "spark.executor.cores": 2,        "spark.total.executor.cores":
2,        "livy.spark.driver.cores": 2,        "livy.spark.executor.cores": 2,        "livy.spark.executor.instances":
1    },    "executorMemory": "1G",    "executorCores": 2,    "numExecutors": 1,    "driverCores":
1,    "driverMemory": "1G"}
                
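
Given that --total-executor-cores maps to spark.cores.max, would setting that property in the
session conf be the right way to cap a session? As far as I can tell from the Spark docs, a
standalone application takes every available core unless spark.cores.max (or
spark.deploy.defaultCores) is set. Something like this unverified sketch:

    {
        "kind": "pyspark",
        "proxyUser": "root",
        "conf": {
            "spark.executor.cores": 2,
            "spark.cores.max": 4
        },
        "executorMemory": "1G",
        "driverMemory": "1G"
    }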


Is there any configuration I can set to limit the cores, so that I can run
multiple sessions on the same cluster?

Regards,
Junaid