airflow-dev mailing list archives

From twinkle <twinkle.sachd...@gmail.com>
Subject Airflow and Celery - co-ordination issue
Date Thu, 02 Mar 2017 14:27:10 GMT
Hi,

We plan to use Airflow with Celery as the backend.
 Today, within a DAG run, even though some of the tasks in the DAG were
shown as successful, Airflow was not scheduling the next eligible tasks.
Looking at Celery Flower, the following exception was observed:

Traceback (most recent call last):
  File
"/home/allocation/.pyenv/versions/2.7.12/lib/python2.7/site-packages/celery/app/trace.py",
line 367, in trace_task
    R = retval = fun(*args, **kwargs)
  File
"/home/allocation/.pyenv/versions/2.7.12/lib/python2.7/site-packages/celery/app/trace.py",
line 622, in __protected_call__
    return self.run(*args, **kwargs)
  File
"/home/allocation/.pyenv/versions/2.7.12/lib/python2.7/site-packages/airflow/executors/celery_executor.py",
line 45, in execute_command
    raise AirflowException('Celery command failed')
AirflowException: Celery command failed
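For context, a rough sketch of what the worker-side task in the traceback appears to do (the function and exception names come from the traceback's `celery_executor.py`; the body is my assumption, not the actual Airflow source): the queued CLI command is run in a subprocess, and any nonzero exit is collapsed into the generic "Celery command failed" error, which is why Flower shows a failure with no further detail.

```python
import subprocess


class AirflowException(Exception):
    """Stand-in for airflow.exceptions.AirflowException."""


def execute_command(command):
    """Run the airflow CLI command the scheduler queued on this worker.

    Sketch only: any nonzero exit code from the subprocess is turned
    into the generic 'Celery command failed' error seen in Flower.
    """
    try:
        subprocess.check_call(command, shell=True)
    except subprocess.CalledProcessError:
        raise AirflowException('Celery command failed')
```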

There were no failure logs on the Airflow side, and it marked the
task as succeeded.

Looking at the metadata table, I found the state of the task recorded as
FAILURE. It seems some link in the chain is broken: Airflow partially
registers the failure, which is why it stopped scheduling further tasks,
but not completely, since the UI showed a different state.
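To illustrate the mismatch, the FAILURE state came from a query like the following. This uses an in-memory sqlite stand-in for the real metadata DB; the table and column names follow Celery's database result backend (`celery_taskmeta`), which is an assumption about the setup, and the row contents are invented for the example.

```python
import sqlite3

# In-memory stand-in for the metadata DB, for illustration only.
# Table/column names follow Celery's database result backend
# (celery_taskmeta); the task_id and row contents are invented.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE celery_taskmeta (task_id TEXT, status TEXT, traceback TEXT)"
)
conn.execute(
    "INSERT INTO celery_taskmeta VALUES "
    "('example-task-id', 'FAILURE', 'AirflowException: Celery command failed')"
)
row = conn.execute(
    "SELECT status FROM celery_taskmeta WHERE task_id = 'example-task-id'"
).fetchone()
print(row[0])  # FAILURE here, while the Airflow UI showed success
```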

Has anyone else experienced this?

Regards,
Twinkle
