airflow-dev mailing list archives

From Sachin <>
Subject dag_run marked as failed even when there are tasks with status UP_FOR_RETRY
Date Wed, 12 Jul 2017 05:02:04 GMT

I am using Airflow.

I have a DAG with a BashOperator whose bash_command is 'dat' (an invalid command, so the task is expected to fail).

When I trigger the DAG with "airflow trigger_dag testretry1", it runs the task, which fails with:

ERROR - Bash command failed

I expect it to attempt a 2nd time, since retries and retry_delay are
defined. But after the 1st attempt the dag_run is marked as failed and the
task never runs a 2nd time. The task instance status is up_for_retry at
this point.

As soon as I change the dag_run back to "running" (the UI has an option to
change the status of a dag_run), the task runs a 2nd time and fails (as
expected).

Expecting this:
Since I have retries=1 and retry_delay=2 mins, I expect the task instance
to run and fail the 1st time (task instance status=up_for_retry, dag_run
status=running), then run again after 2 minutes and fail (task instance
status=failed, dag_run status=failed).

This is what is happening:
The task instance runs and fails the 1st time (task instance
status=up_for_retry, dag_run status=failed), and then the task instance
never runs a 2nd time.
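To make the expected semantics concrete, here is a plain-Python sketch of what I understand retries=1 to mean (this is only an illustration of the behavior I expect, not Airflow internals; the function and task names are made up):

```python
from datetime import timedelta

def run_with_retries(task, retries, retry_delay):
    """Simulate expected scheduler behavior: a task with retries=N
    should be attempted N+1 times before the run is marked failed."""
    attempts = 0
    while attempts <= retries:
        attempts += 1
        try:
            task()
            return "success", attempts
        except Exception:
            if attempts <= retries:
                # task is up_for_retry; the dag_run should stay "running"
                # and the scheduler should wait retry_delay, then retry
                pass
            else:
                # retries exhausted; only now should the dag_run fail
                return "failed", attempts

def always_fails():
    # stands in for the bash task running the invalid 'dat' command
    raise RuntimeError("Bash command failed")

state, attempts = run_with_retries(always_fails, retries=1,
                                   retry_delay=timedelta(minutes=2))
# With retries=1, the task should be attempted twice before failing.
```

In other words, with retries=1 I expect two attempts in total, and the dag_run should only be marked failed after the second attempt also fails.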

Here is my DAG definition:

from airflow.models import DAG
from airflow.operators import BashOperator
from datetime import datetime, timedelta

default_args = {
  'owner': 'Sachin Parmar',
  'depends_on_past': False,
  'email': [''],
  'email_on_failure': False,
  'email_on_retry': False,
  'retries': 1,
  'retry_delay': timedelta(minutes=2),
}

dag = DAG('testretry1', default_args=default_args, schedule_interval=None)

task1 = BashOperator(
  task_id='task1',  # task_id reconstructed; the archive truncated this call
  bash_command='dat',  # intentionally invalid command, so the task fails
  dag=dag)

Is anything wrong with the DAG or my understanding? Please correct me.

Sachin Parmar
