airflow-dev mailing list archives

From George Leslie-Waksman <geo...@cloverhealth.com.INVALID>
Subject Re: Tasks stay queued when they fail in celery
Date Fri, 04 Aug 2017 17:21:39 GMT
We've seen this before as well; it's a bug in the Celery Executor that has
a number of different manifestations.

There is at least one open issue relating to this bug:
https://issues.apache.org/jira/browse/AIRFLOW-1463
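
For context, the execute_command Celery task that shows up in the traceback
below is essentially a thin subprocess wrapper. The sketch here is approximate
(reconstructed from the traceback in this thread, not the exact source), but it
shows where the exception comes from:

    import subprocess

    from airflow.exceptions import AirflowException


    def execute_command(command):
        # The Celery worker just shells out to the airflow command it was given.
        try:
            subprocess.check_call(command, shell=True)
        except subprocess.CalledProcessError:
            # The failure is raised inside the worker process; whether it ever
            # gets mapped back onto the Airflow task instance depends on the
            # executor syncing Celery task states correctly, which is where
            # this bug bites.
            raise AirflowException('Celery command failed')

So the worker knows the command failed, but the scheduler side can miss that
result, and the task instance is left sitting in the queued state.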

I have been working on a fix, but it's likely to be a few more days before I
can make further progress.

--George

On Fri, Jul 28, 2017 at 5:05 PM David Capwell <dcapwell@gmail.com> wrote:

> Over the past few days we keep seeing tasks stuck in the queued state.
> Looking into Celery, we see that the corresponding task has actually failed.
>
> Traceback (most recent call last):
>   File "/python/lib/python2.7/site-packages/celery/app/trace.py", line
> 367, in trace_task
>     R = retval = fun(*args, **kwargs)
>   File "/python/lib/python2.7/site-packages/celery/app/trace.py", line
> 622, in __protected_call__
>     return self.run(*args, **kwargs)
>   File
> "/python/lib/python2.7/site-packages/airflow/executors/celery_executor.py",
> line 59, in execute_command
>     raise AirflowException('Celery command failed')
> AirflowException: Celery command failed
>
>
> Why does Airflow not learn about this and recover? And what can we do to
> prevent it?
>
> Thanks for taking the time to read this.
>
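
In the meantime, until a real fix lands, one possible stopgap is a small
cron'd script that resets task instances that have been sitting in the queued
state for too long, so the scheduler will look at them again. This is only a
sketch, assuming your Airflow version has the queued_dttm column on
TaskInstance; the one-hour threshold is arbitrary:

    from datetime import datetime, timedelta

    from airflow import settings
    from airflow.models import TaskInstance
    from airflow.utils.state import State

    session = settings.Session()
    cutoff = datetime.utcnow() - timedelta(hours=1)  # arbitrary staleness threshold

    # Task instances that have been queued for longer than the cutoff.
    stuck = (
        session.query(TaskInstance)
        .filter(TaskInstance.state == State.QUEUED)
        .filter(TaskInstance.queued_dttm < cutoff)
        .all()
    )

    for ti in stuck:
        # Clearing the state makes the scheduler consider the task for
        # scheduling again on its next loop.
        ti.state = State.NONE

    session.commit()
    session.close()

Treat it as a band-aid rather than a fix; it will also re-queue tasks that are
legitimately waiting on a busy worker, so pick the threshold accordingly.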
