spark-user mailing list archives

From Sachin Aggarwal <>
Subject notebook connecting Spark On Yarn
Date Wed, 15 Feb 2017 11:41:10 GMT

I am trying to create multiple notebooks connecting to Spark on YARN. After
starting a few jobs, my cluster ran out of containers. All new notebook
requests are stuck in the busy state because the Jupyter Kernel Gateway
cannot get a container in which to start the application master.

Some jobs do not release their containers for approximately 10-15 minutes,
so users cannot figure out what is wrong or why their kernel is still in
the busy state.

Is there any property or workaround by which I can return a valid response
to users telling them that there are no containers left?
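One option (a sketch, not something the Jupyter Kernel Gateway supports out of the box) is to query the YARN ResourceManager REST API at `/ws/v1/cluster/metrics` before launching a kernel, and fail fast with a clear message when the cluster reports no free capacity. The resource thresholds below are assumptions; the sample payload is made up but follows the shape the API returns:

```python
import json

# Sample payload in the shape returned by the YARN ResourceManager REST API
# at http://<rm-host>:8088/ws/v1/cluster/metrics (values here are invented).
SAMPLE_METRICS = json.dumps({
    "clusterMetrics": {
        "availableMB": 2048,
        "availableVirtualCores": 1,
        "containersPending": 7,
    }
})

def has_capacity_for_master(metrics_json, master_mb=1024, master_vcores=1):
    """Return True if the cluster reports enough free memory and vcores
    to start one application-master container (thresholds are guesses)."""
    m = json.loads(metrics_json)["clusterMetrics"]
    return (m["availableMB"] >= master_mb
            and m["availableVirtualCores"] >= master_vcores)

if has_capacity_for_master(SAMPLE_METRICS):
    print("enough capacity: launch the kernel")
else:
    print("no containers left: reject the request instead of staying busy")
```

In a real deployment the JSON would come from an HTTP GET against the ResourceManager, and the rejection would be surfaced to the notebook user instead of leaving the kernel in the busy state.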

Can I label or reserve a few containers for application masters, equal to
the maximum number of kernels I allow in my cluster, so that every new
kernel gets at least one container for its master? This could also be
dynamic and priority-based: if no container is left, YARN could preempt
some containers and hand them to the new request.


Thanks & Regards

Sachin Aggarwal
