airflow-dev mailing list archives

From Nadeem Ahmed Nazeer <>
Subject DAG status still running when all its tasks are complete
Date Mon, 01 Aug 2016 21:29:07 GMT

I am facing a situation with Airflow where it doesn't mark a DAG as
success even though all of the tasks in that DAG are complete.

I have a BranchPythonOperator that forks into either running all downstream
tasks or just a single task (a dummy operator used as an endpoint),
depending on whether files exist to be processed for that cycle.
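The branch decision described above can be sketched as a plain Python callable of the kind a BranchPythonOperator would invoke. This is a minimal illustration, not my actual code; the task ids ("process_files", "no_files_endpoint") and the directory parameter are hypothetical names:

```python
import os

def choose_branch(input_dir):
    """Return the task id of the downstream path to follow.

    If input files exist for this cycle, run the full pipeline;
    otherwise skip to the dummy endpoint task.
    """
    if os.path.isdir(input_dir) and os.listdir(input_dir):
        return "process_files"       # files found: run all downstream tasks
    return "no_files_endpoint"       # nothing to process: dummy endpoint
```

In the DAG, the callable's return value selects which downstream task runs; the tasks on the branch that is not taken are marked as skipped.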

I see that DAG runs that go to the dummy operator always show a status of
running even though they are complete. I can't figure out what is stopping
the scheduler from marking these DAG runs as success. Because they stay in
the running state, the scheduler keeps re-checking their status on every
cycle, which is unnecessary.

Please advise.

