hive-issues mailing list archives

From "zhihai xu (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HIVE-16456) Kill spark job when InterruptedException happens or driverContext.isShutdown is true.
Date Thu, 04 May 2017 19:54:04 GMT

    [ https://issues.apache.org/jira/browse/HIVE-16456?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15997349#comment-15997349 ]

zhihai xu commented on HIVE-16456:
----------------------------------

I also updated the new patch on the review board:
https://reviews.apache.org/r/58856/diff/1-2/

> Kill spark job when InterruptedException happens or driverContext.isShutdown is true.
> -------------------------------------------------------------------------------------
>
>                 Key: HIVE-16456
>                 URL: https://issues.apache.org/jira/browse/HIVE-16456
>             Project: Hive
>          Issue Type: Improvement
>            Reporter: zhihai xu
>            Assignee: zhihai xu
>            Priority: Minor
>         Attachments: HIVE-16456.000.patch, HIVE-16456.001.patch
>
>
> Kill the Spark job when an InterruptedException happens or driverContext.isShutdown is true.
> If an InterruptedException occurs in RemoteSparkJobMonitor or LocalSparkJobMonitor, it is
> better to kill the job. There is also a race condition between submitting the Spark job
> and query/operation cancellation, so it is better to check driverContext.isShutdown right
> after submitting the Spark job. This guarantees the job is killed no matter when shutdown
> is called. It is similar to HIVE-15997.
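
For context, the pattern described in the quoted description could look roughly like the
following Java sketch. This is not the actual HIVE-16456 patch; SparkJobRef, DriverContext,
and SparkSessionSketch here are simplified stand-ins for the corresponding Hive classes, and
the method names are assumptions used only to illustrate the check-after-submit and
kill-on-interrupt behavior.

    // Hypothetical sketch, not the real Hive classes or the actual patch.
    interface SparkJobRef {
        void cancelJob();                              // ask the cluster to kill the running job
        int monitorJob() throws InterruptedException;  // block until the job finishes
    }

    interface DriverContext {
        boolean isShutdown();                          // true once the query/operation is cancelled
    }

    interface SparkSessionSketch {
        SparkJobRef submit() throws InterruptedException;
    }

    class SparkTaskSketch {
        int execute(DriverContext driverContext, SparkSessionSketch session) {
            SparkJobRef jobRef = null;
            try {
                // Submit the Spark job.
                jobRef = session.submit();

                // Close the race between submission and cancellation: if the operation was
                // cancelled while submit() was in flight, kill the just-submitted job.
                if (driverContext.isShutdown()) {
                    jobRef.cancelJob();
                    return 1; // non-zero return code: the task did not complete
                }

                // Block until the job finishes; this is where the job monitors run.
                return jobRef.monitorJob();
            } catch (InterruptedException ie) {
                // The monitoring thread was interrupted (e.g. by query cancellation):
                // kill the job instead of leaving it running on the cluster.
                if (jobRef != null) {
                    jobRef.cancelJob();
                }
                Thread.currentThread().interrupt();
                return 1;
            }
        }
    }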



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
