hive-issues mailing list archives

From "zhihai xu (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HIVE-16456) Kill spark job when InterruptedException happens or driverContext.isShutdown is true.
Date Fri, 28 Apr 2017 20:54:04 GMT

    [ https://issues.apache.org/jira/browse/HIVE-16456?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15989448#comment-15989448 ]

zhihai xu commented on HIVE-16456:
----------------------------------

Thanks [~xuefuz]! I created a Review Request for my patch at the following RB link:
https://reviews.apache.org/r/58856/

> Kill spark job when InterruptedException happens or driverContext.isShutdown is true.
> -------------------------------------------------------------------------------------
>
>                 Key: HIVE-16456
>                 URL: https://issues.apache.org/jira/browse/HIVE-16456
>             Project: Hive
>          Issue Type: Improvement
>            Reporter: zhihai xu
>            Assignee: zhihai xu
>            Priority: Minor
>         Attachments: HIVE-16456.000.patch
>
>
> Kill the Spark job when an InterruptedException happens or driverContext.isShutdown is true.
> If an InterruptedException occurs in RemoteSparkJobMonitor or LocalSparkJobMonitor, it is
> better to kill the job. There is also a race condition between submitting the Spark job and
> query/operation cancellation, so it is better to check driverContext.isShutdown right after
> submitting the Spark job. This guarantees the job is killed no matter when shutdown is
> called. This is similar to HIVE-15997.
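
A minimal sketch of the intended behavior (SparkJobRef, DriverContext, and cancelJob below are
simplified stand-ins for illustration, not the actual Hive classes): kill the job when the
monitor thread is interrupted, and re-check isShutdown immediately after submission to close
the race window.

    // Hypothetical sketch only; the interfaces are local stand-ins, not Hive's real API.
    public class SparkJobCancellationSketch {

        /** Stand-in for a handle to a submitted Spark job. */
        interface SparkJobRef {
            boolean cancelJob();                        // assumed kill/cancel entry point
            String getState() throws InterruptedException;
        }

        /** Stand-in for the driver-side shutdown flag. */
        interface DriverContext {
            boolean isShutdown();
        }

        /** Post-submit shutdown check plus a monitor loop that kills the job on interrupt. */
        static int monitor(SparkJobRef jobRef, DriverContext driverContext) {
            // Race: cancellation may have arrived between the check that allowed
            // submission and the submission itself, so re-check right after submit.
            if (driverContext.isShutdown()) {
                jobRef.cancelJob();
                return 1;
            }
            try {
                while (!"SUCCEEDED".equals(jobRef.getState())) {
                    if (driverContext.isShutdown()) {
                        jobRef.cancelJob();
                        return 1;
                    }
                    Thread.sleep(1000L);                // polling interval
                }
                return 0;
            } catch (InterruptedException ie) {
                // Interrupted while monitoring: kill the Spark job rather than
                // leaving it running on the cluster.
                jobRef.cancelJob();
                Thread.currentThread().interrupt();
                return 1;
            }
        }
    }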



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
