airflow-dev mailing list archives

From siddharth anand <san...@apache.org>
Subject Re: DAG status still running when all its tasks are complete
Date Wed, 03 Aug 2016 05:27:23 GMT
Hi Nadeem,
Can you open a JIRA, attach a DAG which I can run to reproduce your issue,
and assign the JIRA to me?
-s

On Tue, Aug 2, 2016 at 8:40 PM, Nadeem Ahmed Nazeer <nazeer@neon-lab.com>
wrote:

> Could someone please shed some light on this DAG status?
>
> My Airflow version is 1.7.0. It is the only version whose scheduler works
> for me; on any later version the scheduler gets stuck without a trace and
> won't schedule anything.
>
> Thanks,
> Nadeem
>
> On Mon, Aug 1, 2016 at 2:29 PM, Nadeem Ahmed Nazeer <nazeer@neon-lab.com>
> wrote:
>
> > Hello,
> >
> > I am facing a situation with Airflow where it doesn't mark DAGs as
> > successful even though all of the tasks in the DAG are complete.
> >
> > I have a BranchPythonOperator which forks into either running all the
> > downstream tasks or just a single task (a DummyOperator as an endpoint),
> > depending on whether there are files to be processed for that cycle.
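> >
> > The structure is roughly like the minimal sketch below (the task ids and
> > the file check are placeholders, and the 1.7-style import paths are my
> > assumption, not the real DAG):
> >
> >     from datetime import datetime
> >
> >     from airflow import DAG
> >     from airflow.operators.dummy_operator import DummyOperator
> >     from airflow.operators.python_operator import BranchPythonOperator
> >
> >     default_args = {'start_date': datetime(2016, 8, 1)}
> >     dag = DAG('branch_example', default_args=default_args,
> >               schedule_interval='@daily')
> >
> >     def choose_path():
> >         # Placeholder for the real check; return the task_id of the
> >         # branch to follow for this cycle.
> >         files_exist = False  # stand-in for "are there files to process?"
> >         return 'process_files' if files_exist else 'no_files_endpoint'
> >
> >     branch = BranchPythonOperator(task_id='check_for_files',
> >                                   python_callable=choose_path, dag=dag)
> >     # In the real DAG the downstream processing tasks hang off this one.
> >     process = DummyOperator(task_id='process_files', dag=dag)
> >     endpoint = DummyOperator(task_id='no_files_endpoint', dag=dag)
> >
> >     branch.set_downstream(process)
> >     branch.set_downstream(endpoint)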
> >
> > I see that for the DAG runs that go down the dummy-operator path, the
> > status of the DAG always shows "running" even though it is complete. I
> > can't figure out what is stopping the scheduler from marking these runs
> > as successful. Because they stay in the running state, the scheduler
> > keeps re-checking their status on every cycle, which is unnecessary.
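> >
> > For reference, the stuck state is also visible straight from the metadata
> > database. A rough sketch, assuming the DagRun model and settings.Session
> > that ship with 1.7.x, and 'branch_example' as a placeholder dag_id:
> >
> >     from airflow import settings
> >     from airflow.models import DagRun
> >
> >     session = settings.Session()
> >     # Print every recorded run of this DAG together with its state; the
> >     # runs that took the dummy-operator path are the ones that stay in
> >     # "running".
> >     for run in session.query(DagRun).filter(DagRun.dag_id == 'branch_example'):
> >         print("{} {}".format(run.execution_date, run.state))
> >     session.close()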
> >
> > Please advise.
> >
> > Thanks,
> > Nadeem
> >
> >
>
