hive-issues mailing list archives

From "Xuefu Zhang (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HIVE-12650) Spark-submit is killed when Hive times out. Killing spark-submit doesn't cancel AM request. When AM is finally launched, it tries to connect back to Hive and gets refused.
Date Fri, 25 Mar 2016 04:13:25 GMT

    [ https://issues.apache.org/jira/browse/HIVE-12650?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15211391#comment-15211391 ]

Xuefu Zhang commented on HIVE-12650:
------------------------------------

Thanks, Rui. I think it's fine to list all possible causes in an error message when we don't
actually know the exact one. We can also suggest to the user where to look further (such as
the YARN logs).

I understand that prewarming containers complicates things a bit, but I'm not sure about
your proposal. Could you provide a patch showing the changes you have in mind?

> Spark-submit is killed when Hive times out. Killing spark-submit doesn't cancel AM request.
> When AM is finally launched, it tries to connect back to Hive and gets refused.
> ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: HIVE-12650
>                 URL: https://issues.apache.org/jira/browse/HIVE-12650
>             Project: Hive
>          Issue Type: Bug
>    Affects Versions: 1.1.1, 1.2.1
>            Reporter: JoneZhang
>            Assignee: Xuefu Zhang
>
> I think hive.spark.client.server.connect.timeout should be set greater than spark.yarn.am.waitTime.
> The default value for spark.yarn.am.waitTime is 100s, and the default value for
> hive.spark.client.server.connect.timeout is 90s, which is not good. We can increase it to a
> larger value such as 120s.
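
For illustration, a minimal sketch of how the two settings discussed above could be aligned
(assuming hive.spark.client.server.connect.timeout is specified in milliseconds and
spark.yarn.am.waitTime accepts a duration suffix):

    -- Hive session (or hive-site.xml): raise the client/server connect timeout
    -- above spark.yarn.am.waitTime, e.g. 120s expressed as 120000 ms
    SET hive.spark.client.server.connect.timeout=120000;

    # spark-defaults.conf: the AM wait time, shown here at its 100s default
    spark.yarn.am.waitTime=100s

With values like these, Hive keeps waiting for the remote driver to connect back for longer
than YARN is allowed to take to launch the AM.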



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
