airflow-commits mailing list archives

From "Siddharth Anand (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (AIRFLOW-1886) Failed jobs are not being counted towards max_active_runs_per_dag
Date Wed, 06 Dec 2017 00:48:32 GMT

     [ https://issues.apache.org/jira/browse/AIRFLOW-1886?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]

Siddharth Anand updated AIRFLOW-1886:
-------------------------------------
    Description: Currently, I have set max_active_runs_per_dag = 2 in airflow.cfg, but
when a DAG run aborts, the scheduler keeps submitting the next DAG run in the queue without
counting the incomplete run already in the queue. I am using 1.8.1, but I see that jobs.py
in the latest version still does not address this issue.  (was: Currently, I have setup max_active_runs_per_dag
= 2 in airflow.cfg but when a DAG aborts, it will keep submitting next DAG in the queue not
counting the current incomplete DAG that is already in the queue. I am using 1.8.1 but i see
that the jobs.py in latest version is still not addressing this issue.)

> Failed jobs are not being counted towards max_active_runs_per_dag
> -----------------------------------------------------------------
>
>                 Key: AIRFLOW-1886
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-1886
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: DagRun
>    Affects Versions: 1.8.1
>            Reporter: Oleg Yamin
>            Assignee: Oleg Yamin
>
> Currently, I have set max_active_runs_per_dag = 2 in airflow.cfg, but when a DAG run aborts,
> the scheduler keeps submitting the next DAG run in the queue without counting the incomplete
> run already in the queue. I am using 1.8.1, but I see that jobs.py in the latest version
> still does not address this issue.
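A minimal sketch of the behavior being reported (this is illustrative only, not Airflow's actual jobs.py logic; the function name and state strings are hypothetical): if the scheduler counts only running DagRuns against max_active_runs_per_dag, a failed/aborted run frees a slot and the next run is submitted, which is the bug the reporter describes.

```python
# Hypothetical sketch, not Airflow internals: compare counting only
# running DagRuns vs. also counting failed/aborted ones toward the cap.

MAX_ACTIVE_RUNS_PER_DAG = 2  # mirrors the airflow.cfg setting in the report

def can_schedule_new_run(dag_run_states, count_failed=True):
    """Return True if a new DagRun may be created for this DAG.

    dag_run_states: list of state strings, e.g. ["running", "failed"].
    count_failed: when True, failed/aborted runs still occupy a slot,
    which is the behavior the reporter expects.
    """
    blocking_states = {"running"}
    if count_failed:
        blocking_states.add("failed")
    active = sum(1 for state in dag_run_states if state in blocking_states)
    return active < MAX_ACTIVE_RUNS_PER_DAG

# One running run plus one failed run: the reporter expects no new run.
print(can_schedule_new_run(["running", "failed"]))  # -> False

# Counting only running runs, a third run is submitted (the reported bug).
print(can_schedule_new_run(["running", "failed"], count_failed=False))  # -> True
```

The sketch shows that the fix reduces to which DagRun states are treated as occupying a slot when the scheduler compares the active-run count against max_active_runs_per_dag.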



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
