hive-issues mailing list archives

From "Aihua Xu (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HIVE-18916) SparkClientImpl doesn't error out if spark-submit fails
Date Wed, 13 Jun 2018 19:20:00 GMT

    [ https://issues.apache.org/jira/browse/HIVE-18916?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16511568#comment-16511568 ]

Aihua Xu commented on HIVE-18916:
---------------------------------

[~stakiar] One thought on how to get the error: rather than scanning the log for "Error",
could we keep STDERR separate from STDOUT for the bin/spark-submit process, so that when it
fails we can capture the error message directly from STDERR? Is that possible?
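A minimal sketch of that idea, assuming plain {{ProcessBuilder}} usage (this is an illustration, not the actual {{SparkClientImpl}} code; the {{sh -c}} command stands in for a failing bin/spark-submit):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class SeparateStreams {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Hypothetical stand-in for bin/spark-submit: writes to stdout,
        // writes an error to stderr, and exits non-zero.
        ProcessBuilder pb = new ProcessBuilder("sh", "-c",
                "echo normal-output; echo submit-failed 1>&2; exit 1");
        // Deliberately NOT calling pb.redirectErrorStream(true):
        // keeping the streams separate is the point, so stderr can be
        // surfaced when the process fails.
        Process p = pb.start();

        // Drain stderr into a buffer (output here is small, so reading
        // it fully before waitFor() cannot block).
        StringBuilder stderr = new StringBuilder();
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(p.getErrorStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                stderr.append(line).append('\n');
            }
        }

        int exitCode = p.waitFor();
        System.out.println("exit code: " + exitCode);
        if (exitCode != 0) {
            // Surface the captured stderr immediately instead of only
            // logging the exit code and waiting for a connection timeout.
            System.out.println("captured stderr: " + stderr.toString().trim());
        }
    }
}
```

With this split, a non-zero exit code could be turned into an exception carrying the stderr text, rather than the misleading "Timed out waiting for client connection" described in the issue.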

> SparkClientImpl doesn't error out if spark-submit fails
> -------------------------------------------------------
>
>                 Key: HIVE-18916
>                 URL: https://issues.apache.org/jira/browse/HIVE-18916
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>            Reporter: Sahil Takiar
>            Assignee: Sahil Takiar
>            Priority: Major
>         Attachments: HIVE-18916.1.WIP.patch, HIVE-18916.2.patch, HIVE-18916.3.patch
>
>
> If {{spark-submit}} returns a non-zero exit code, {{SparkClientImpl}} will simply log
the exit code, but won't throw an error. Eventually, the connection timeout will get triggered
and an exception like {{Timed out waiting for client connection}} will be logged, which is
pretty misleading.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
