airflow-commits mailing list archives
From "Ivan Vergiliev (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (AIRFLOW-1515) Airflow 1.8.1 tasks not being marked as upstream_failed when one of the parents fails
Date Tue, 05 Sep 2017 13:09:00 GMT

    [ https://issues.apache.org/jira/browse/AIRFLOW-1515?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16153612#comment-16153612
] 

Ivan Vergiliev commented on AIRFLOW-1515:
-----------------------------------------

I'm seeing this as well on 1.8.0. As far as I can tell, this is related to the `flag_upstream_failed`
property of `DepContext` and the logic around it in `airflow/ti_deps/deps/trigger_rule_dep.py:_evaluate_trigger_rule`.
If I'm reading this correctly, when `flag_upstream_failed` is set to False, the task's state is
never set to `UPSTREAM_FAILED`, so the task stays stuck in a "something is blocking this task
from being scheduled" state.
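To illustrate what I mean, here is a minimal sketch of the behaviour (not Airflow's actual code; the function and state names below are illustrative, assuming an `all_success`-style rule with a single `flag_upstream_failed` switch):

```python
# Simplified model of the flag_upstream_failed behaviour described above.
# Names (evaluate_trigger_rule, TaskState) are illustrative, not Airflow's API.
from enum import Enum


class TaskState(Enum):
    NONE = "none"                        # no terminal state assigned yet
    UPSTREAM_FAILED = "upstream_failed"  # terminal: a parent failed
    SCHEDULABLE = "schedulable"          # dependencies met, task can run


def evaluate_trigger_rule(upstream_states, flag_upstream_failed):
    """Return the state a downstream task ends up in for an all_success rule."""
    failed = sum(1 for s in upstream_states if s == "failed")
    if failed > 0:
        if flag_upstream_failed:
            # The dep check actively marks the task, giving it a terminal state.
            return TaskState.UPSTREAM_FAILED
        # With the flag off, the check only reports "dependency not met",
        # so the task lingers in NONE -- the stuck state reported here.
        return TaskState.NONE
    return TaskState.SCHEDULABLE
```

With the flag off, a failed parent leaves the downstream task in `NONE` forever instead of flipping it to `UPSTREAM_FAILED`.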

> Airflow 1.8.1 tasks not being marked as upstream_failed when one of the parents fails
> -------------------------------------------------------------------------------------
>
>                 Key: AIRFLOW-1515
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-1515
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: core, DAG, DagRun
>    Affects Versions: 1.8.1
>            Reporter: Jose Sanchez
>         Attachments: airflow_bug.png, rofl_test.py
>
>
> The trigger rule "all_done" is not working when a task's parents are marked as `State.NONE`
> instead of `State.UPSTREAM_FAILED`. I am submitting a very small DAG as an example and a picture
> of the run, where the last task should have been executed before the DAG was marked as failed.
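A short sketch of why the reported DAG hangs (again illustrative, not Airflow code; the terminal-state set and `all_done_met` helper are assumptions for the example):

```python
# 'all_done' fires once every parent has reached a *terminal* state.
TERMINAL_STATES = {"success", "failed", "upstream_failed", "skipped"}


def all_done_met(upstream_states):
    """True when an all_done trigger rule would allow the task to run."""
    return all(s in TERMINAL_STATES for s in upstream_states)


# Parents correctly flagged as upstream_failed: the rule is satisfied.
print(all_done_met(["success", "upstream_failed"]))  # True

# Parents left in NONE: never terminal, so the rule is never satisfied
# and the last task is skipped when the DAG run is marked failed.
print(all_done_met(["success", "none"]))  # False
```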



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
