airflow-commits mailing list archives

From "Siddharth Anand (JIRA)" <>
Subject [jira] [Closed] (AIRFLOW-396) DAG status still running when all its tasks are complete
Date Fri, 05 Aug 2016 02:25:20 GMT


Siddharth Anand closed AIRFLOW-396.
    Resolution: Implemented

This was fixed in a later release.

The commit that fixed it is:

You can either cherry-pick the change or upgrade to

> DAG status still running when all its tasks are complete
> --------------------------------------------------------
>                 Key: AIRFLOW-396
>                 URL:
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: DagRun, scheduler
>    Affects Versions: Airflow 1.7.0
>            Reporter: Nadeem Ahmed Nazeer
>            Assignee: Siddharth Anand
>            Priority: Minor
>         Attachments: DagRuns.png, Dag_code.txt, Dag_no_input_files_still_running_state.png,
> Graph view.png, Task_run.png, Task_run_dag_no_input_files_working.png, Tree_view.png
> Hello,
> I am facing a situation with Airflow where it doesn't mark a DAG as success even
> though all of the tasks in that DAG are complete.
> I have a BranchPythonOperator that forks into either running all downstream tasks or
> running just a single task (a DummyOperator as an endpoint), depending on whether files
> exist to be processed for that cycle.
> I see that for the DAG runs that go to the dummy operator, the status of the DAG always
> shows running even though it is complete. I can't figure out what is stopping the
> scheduler from marking this DAG as success. Since it stays in the running state, the
> scheduler keeps checking the status of this DAG, which is unnecessary.
> My Airflow version is 1.7.0. This is the only version whose scheduler works for me;
> on any later version, the scheduler gets stuck without a trace and won't schedule
> anything.
> Please advise.
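For readers hitting the same pattern, a minimal sketch of the branching logic the reporter describes: a callable that a BranchPythonOperator would invoke to pick either the processing branch or a dummy endpoint, depending on whether input files exist. The task ids, file pattern, and function name here are illustrative assumptions, not taken from the reporter's attached DAG code.

```python
import glob

# Hypothetical branch callable for a BranchPythonOperator.
# Returns the task_id of the branch to follow: the downstream
# processing branch if any input files match, else a DummyOperator
# endpoint. Task ids are made-up names for this sketch.
def choose_branch(input_pattern):
    if glob.glob(input_pattern):
        return "process_files"       # files found: run the full pipeline
    return "no_files_endpoint"       # nothing to do: go to the dummy endpoint

# Wiring inside a DAG would look roughly like this (untested sketch,
# Airflow 1.7-era API):
#
#   branch = BranchPythonOperator(
#       task_id="check_for_files",
#       python_callable=lambda: choose_branch("/data/incoming/*.csv"),
#       dag=dag,
#   )
#   branch >> process_files
#   branch >> no_files_endpoint
```

The bug in AIRFLOW-396 was that when the dummy branch was taken, the skipped tasks on the other branch kept the DagRun from ever being marked success, so the fix belongs to DagRun state evaluation rather than to the DAG definition itself.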

This message was sent by Atlassian JIRA
