airflow-commits mailing list archives

From "Apache Spark (JIRA)" <j...@apache.org>
Subject [jira] [Assigned] (AIRFLOW-1886) Failed jobs are not being counted towards max_active_runs_per_dag
Date Sun, 02 Sep 2018 18:09:01 GMT

     [ https://issues.apache.org/jira/browse/AIRFLOW-1886?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]

Apache Spark reassigned AIRFLOW-1886:
-------------------------------------

    Assignee: Oleg Yamin  (was: Holden Karau's magical unicorn)

> Failed jobs are not being counted towards max_active_runs_per_dag
> -----------------------------------------------------------------
>
>                 Key: AIRFLOW-1886
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-1886
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: DagRun
>    Affects Versions: 1.8.1
>            Reporter: Oleg Yamin
>            Assignee: Oleg Yamin
>            Priority: Major
>
> Currently, I have set max_active_runs_per_dag = 2 in airflow.cfg, but when a DAG run
> aborts, the scheduler keeps submitting the next DAG run in the queue without counting
> the incomplete run that is already in the queue. I am using 1.8.1, but I see that
> jobs.py in the latest version still does not address this issue.
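The check the reporter expects can be sketched in a few lines. This is a simplified model under stated assumptions, not the actual jobs.py implementation: the helper name `can_schedule_new_run` and the `ACTIVE_STATES` set are hypothetical, though the state names mirror Airflow's DagRun states.

```python
# Sketch of the slot-counting behavior the report asks for: incomplete
# runs (not just cleanly running ones) should count against the cap.
# Hypothetical helper; real logic lives in the scheduler (jobs.py).

MAX_ACTIVE_RUNS_PER_DAG = 2  # mirrors max_active_runs_per_dag in airflow.cfg

# States that should occupy a slot (assumed set for illustration).
ACTIVE_STATES = {"running", "queued", "up_for_retry"}

def can_schedule_new_run(dag_run_states):
    """Return True only when runs still occupying a slot, including
    incomplete/aborted-but-queued runs, are below the configured cap."""
    active = sum(1 for state in dag_run_states if state in ACTIVE_STATES)
    return active < MAX_ACTIVE_RUNS_PER_DAG
```

With this rule, two queued-or-running runs block a third (`can_schedule_new_run(["running", "queued"])` is False), while terminal states free their slots (`can_schedule_new_run(["success", "failed"])` is True), which is the behavior the reporter says 1.8.1 lacks.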



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
