spark-issues mailing list archives

From "Shay Rojansky (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-7736) Exception not failing Python applications (in yarn cluster mode)
Date Thu, 25 Jun 2015 20:56:04 GMT

    [ https://issues.apache.org/jira/browse/SPARK-7736?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14601923#comment-14601923 ]

Shay Rojansky commented on SPARK-7736:
--------------------------------------

The problem is simply the final status YARN reports for the application. If a Spark application
throws an exception after having instantiated the SparkContext, the application terminates as
expected, but YARN still lists the job as SUCCEEDED. This makes it hard for users to see what
happened to their jobs in the YARN UI.
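
For example, a minimal PySpark script along these lines should reproduce it (the file name
repro.py and the app name are just placeholders I'm using for illustration):

    from pyspark import SparkContext

    sc = SparkContext(appName="exception-after-context")

    # Run a trivial job so the context is demonstrably up and working.
    sc.parallelize(range(10)).count()

    # The driver dies here, but per this issue YARN still reports the
    # application's final status as SUCCEEDED instead of FAILED.
    raise RuntimeError("deliberate failure after SparkContext creation")

Submitted in cluster mode, e.g.:

    spark-submit --master yarn-cluster repro.py

the YARN UI (or yarn application -status <appId>) shows the application as SUCCEEDED even
though the driver exited with this exception.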

Let me know if this is still unclear.

> Exception not failing Python applications (in yarn cluster mode)
> ----------------------------------------------------------------
>
>                 Key: SPARK-7736
>                 URL: https://issues.apache.org/jira/browse/SPARK-7736
>             Project: Spark
>          Issue Type: Bug
>          Components: YARN
>         Environment: Spark 1.3.1, Yarn 2.7.0, Ubuntu 14.04
>            Reporter: Shay Rojansky
>
> It seems that exceptions thrown in Python Spark apps after the SparkContext is instantiated
> don't cause the application to fail, at least in YARN: the application is marked as SUCCEEDED.
> Note that any exception raised before the SparkContext is instantiated correctly places the
> application in the FAILED state.




