hadoop-common-issues mailing list archives

From "Jagadesh Kiran N (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (HADOOP-11624) Prevent fail-over during client shutdown
Date Sat, 20 Jun 2015 09:49:00 GMT

     [ https://issues.apache.org/jira/browse/HADOOP-11624?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jagadesh Kiran N updated HADOOP-11624:
    Labels:   (was: newbie++)

> Prevent fail-over during client shutdown
> ----------------------------------------
>                 Key: HADOOP-11624
>                 URL: https://issues.apache.org/jira/browse/HADOOP-11624
>             Project: Hadoop Common
>          Issue Type: Bug
>            Reporter: Kihwal Lee
>            Priority: Critical
> We've seen an HBase RS hanging during shutdown. It turns out the ipc client was throwing
> {{java.nio.channels.ClosedByInterruptException}} during the shutdown. The HA failover retry
> logic then decided to fail over and retry. If the interrupt was received because the user
> was shutting down the client, failover should not happen.

This message was sent by Atlassian JIRA
