airflow-dev mailing list archives

From Vardan Gupta <vardangupta...@gmail.com>
Subject Re: dag_run timeouts
Date Tue, 19 Feb 2019 10:21:13 GMT
<<Bumping it again, in case it was missed earlier>>

On Fri, Feb 15, 2019 at 12:49 PM vardanguptacse@gmail.com <
vardanguptacse@gmail.com> wrote:

> Hi team,
>
> We wanted to enable dag_run timeouts on our DAGs, but when we went
> through the behavior of dagrun_timeout, we found that it only takes
> effect when the following conditions are met:
>
> 1) The dag_run must be a scheduled one, i.e. not manually created
> 2) max_active_runs must be configured on the DAG
>
> When it works:
> During dag_run creation, if the count of existing dag_runs equals the
> configured max_active_runs and a previous run has been running longer
> than the configured timeout, that run is marked as failed (
> https://github.com/apache/airflow/blob/master/airflow/jobs.py#L784)
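>
> For reference, here is a minimal sketch of a DAG that meets both
> conditions; the dag_id, schedule and task below are illustrative,
> assuming Airflow 1.10-style imports:
>
>     from datetime import datetime, timedelta
>     from airflow import DAG
>     from airflow.operators.bash_operator import BashOperator
>
>     dag = DAG(
>         dag_id="example_with_dagrun_timeout",  # hypothetical name
>         start_date=datetime(2019, 1, 1),
>         schedule_interval="@daily",          # condition 1: scheduled, not manual
>         dagrun_timeout=timedelta(hours=1),   # run can be failed once it exceeds 1h
>         max_active_runs=1,                   # condition 2: must be configured
>     )
>
>     long_task = BashOperator(
>         task_id="long_running_task",
>         bash_command="sleep 7200",
>         dag=dag,
>     )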
>
> How can we achieve the following:
>
> 1) Timeouts for manually created dag_runs
> 2) Enabling timeouts on existing dag_runs without requiring new dag_runs
> to be triggered
> 3) Stopping the running tasks once the dag_run is marked as failed; at
> the moment they keep running until they reach a terminal state
>
> Workaround:
> We also explored execution_timeout at the individual task level in
> combination with a corresponding trigger rule; this works perfectly for us.
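>
> As an illustration of that workaround (task ids, commands and timeouts
> are just examples), each task gets an execution_timeout, and a downstream
> task uses a trigger rule so it still runs after an upstream timeout
> failure:
>
>     from datetime import datetime, timedelta
>     from airflow import DAG
>     from airflow.operators.bash_operator import BashOperator
>     from airflow.utils.trigger_rule import TriggerRule
>
>     dag = DAG(
>         dag_id="example_with_task_timeouts",  # hypothetical name
>         start_date=datetime(2019, 1, 1),
>         schedule_interval="@daily",
>     )
>
>     work = BashOperator(
>         task_id="work",
>         bash_command="sleep 600",
>         execution_timeout=timedelta(minutes=5),  # task is failed after 5 minutes
>         dag=dag,
>     )
>
>     cleanup = BashOperator(
>         task_id="cleanup",
>         bash_command="echo 'cleaning up'",
>         trigger_rule=TriggerRule.ALL_DONE,       # runs even if 'work' timed out
>         dag=dag,
>     )
>
>     work >> cleanup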
>
> Regards,
> Vardan Gupta
>
