hive-issues mailing list archives

From "zhihai xu (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (HIVE-16422) Should kill running Spark Jobs when a query is cancelled.
Date Wed, 12 Apr 2017 02:31:41 GMT

     [ https://issues.apache.org/jira/browse/HIVE-16422?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

zhihai xu updated HIVE-16422:
-----------------------------
    Status: Patch Available  (was: Open)

> Should kill running Spark Jobs when a query is cancelled.
> ---------------------------------------------------------
>
>                 Key: HIVE-16422
>                 URL: https://issues.apache.org/jira/browse/HIVE-16422
>             Project: Hive
>          Issue Type: Bug
>          Components: Spark
>    Affects Versions: 2.1.0
>            Reporter: zhihai xu
>            Assignee: zhihai xu
>         Attachments: HIVE-16422.000.txt
>
>
> Should kill running Spark jobs when a query is cancelled. When a query is cancelled,
> Driver.close calls Driver.releaseDriverContext, which calls DriverContext.shutdown,
> which in turn calls shutdown() on every running task:
> {code}
>   public synchronized void shutdown() {
>     LOG.debug("Shutting down query " + ctx.getCmd());
>     shutdown = true;
>     for (TaskRunner runner : running) {
>       if (runner.isRunning()) {
>         Task<?> task = runner.getTask();
>         LOG.warn("Shutting down task : " + task);
>         try {
>           task.shutdown();
>         } catch (Exception e) {
>           console.printError("Exception on shutting down task " + task.getId() + ": " + e);
>         }
>         Thread thread = runner.getRunner();
>         if (thread != null) {
>           thread.interrupt();
>         }
>       }
>     }
>     running.clear();
>   }
> {code}
> Since SparkTask does not implement the shutdown method to kill the running Spark job, the
> Spark job may still be running after the query is cancelled. So it would be good to kill
> the Spark job in SparkTask.shutdown to save cluster resources.
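> A minimal sketch of what such a SparkTask.shutdown override could look like, assuming the
> task keeps a handle to its submitted Spark job (the jobRef field and cancelJob() method
> below are illustrative names, not necessarily the ones used in the attached patch):
> {code}
>   // Illustrative override in SparkTask: makes DriverContext.shutdown() (shown above)
>   // actually stop the remote Spark job instead of doing nothing for Spark tasks.
>   @Override
>   public void shutdown() {
>     super.shutdown();
>     if (jobRef != null) {
>       try {
>         // Ask the Spark client to cancel the submitted job so cluster executors are freed.
>         jobRef.cancelJob();
>       } catch (Exception e) {
>         LOG.warn("Failed to kill Spark job while cancelling query", e);
>       }
>     }
>   }
> {code}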



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
