spark-issues mailing list archives

From Michał Gabrysiak (JIRA) <j...@apache.org>
Subject [jira] [Created] (SPARK-11588) Submitting of Spark (Streaming) Job and killing/stopping of them
Date Mon, 09 Nov 2015 08:24:11 GMT
Michał Gabrysiak created SPARK-11588:
----------------------------------------

             Summary: Submitting of Spark (Streaming) Job and killing/stopping of them
                 Key: SPARK-11588
                 URL: https://issues.apache.org/jira/browse/SPARK-11588
             Project: Spark
          Issue Type: Question
          Components: Spark Core, Spark Submit
    Affects Versions: 1.5.1
         Environment: Centos 6.7
            Reporter: Michał Gabrysiak


While searching for a way to submit a Spark job on the cluster side, I found this article:
http://arturmkrtchyan.com/apache-spark-hidden-rest-api. I used it (the Spark REST API) and I
think it works fine; a rough sketch of the request I send is included after the questions below.
I have a few questions about this API:
1. I can't find any information about this REST API in the Spark documentation. Why? Why is
this API hidden?
2. If I want to use this REST API from my code, can I start Spark on YARN? In other words, is
a YARN (or Mesos) cluster supported?
3. How can I run the driver on the cluster side programmatically from code, other than through
the REST API? When I create a SparkContext I can set the "spark.submit.deployMode" property
(to "cluster"), but I have the impression that the driver still runs locally.
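
For reference, the request I send to the REST submission endpoint looks roughly like the
following (the jar path, main class, and master host are placeholders for my real values;
6066 is the default REST port of the standalone master):

  # submit the application to the standalone master's REST endpoint
  curl -X POST http://<master-host>:6066/v1/submissions/create \
    --header "Content-Type: application/json" \
    --data '{
      "action": "CreateSubmissionRequest",
      "appResource": "hdfs:///path/to/my-streaming-app.jar",
      "mainClass": "com.example.MyStreamingApp",
      "appArgs": [],
      "clientSparkVersion": "1.5.1",
      "environmentVariables": { "SPARK_ENV_LOADED": "1" },
      "sparkProperties": {
        "spark.app.name": "MyStreamingApp",
        "spark.master": "spark://<master-host>:6066",
        "spark.jars": "hdfs:///path/to/my-streaming-app.jar",
        "spark.submit.deployMode": "cluster",
        "spark.driver.supervise": "false"
      }
    }'

The response contains a submissionId (something like driver-20151109-...), which is the
<driver-id> I refer to below.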

The second group of questions is about killing the driver. When I start the driver on the
cluster side, I can see it in the Spark UI (see screen1.png), but when I press the "(kill)"
link/button the driver keeps running. I searched for information on how to kill drivers and
found these options:
a) bin/spark-class org.apache.spark.deploy.Client kill <spark-master> <driver-id>, but I got
a warning that Client is deprecated
b) spark-submit --kill <spark-master> <driver-id> doesn't work
c) using the Spark REST API I got the same message as in point b), but the driver is still
running (the request I used is sketched after this list)
d) ps -aux | grep <driver-id> plus kill -9 <pid_of_driver> works, but it is not a good solution
(I want to stop the driver programmatically)
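
For completeness, the REST requests I used in point c) look roughly like this (again assuming
the default REST port 6066 and the <driver-id> / submission id returned by the create call and
shown in the Spark UI):

  # ask the standalone master to kill the driver
  curl -X POST http://<master-host>:6066/v1/submissions/kill/<driver-id>

  # check whether the driver actually stopped
  curl http://<master-host>:6066/v1/submissions/status/<driver-id>

In my case the kill request is accepted, but the driver keeps running, as described in point c).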






--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

