airflow-dev mailing list archives

From "Dubey, Somesh" <>
Subject RE: airflow 1.9 tasks randomly failing | k8 - hive
Date Fri, 24 Aug 2018 14:34:41 GMT
Thanks so much, Emmanuel, for the reply.
How did Airflow classify those processes as zombies? Did any log say so?
We have not been able to get any useful logs from the failing tasks.

Thanks again. You are potentially saving us hours of work and sleepless nights, as this is
breaking production for us.

-----Original Message-----
From: Emmanuel Brard [] 
Sent: Friday, August 24, 2018 6:48 AM
Subject: Re: airflow 1.9 tasks randomly failing | k8 - hive


We have a similar setup with Airflow 1.9 on Kubernetes with the Celery executor.

We saw some Airflow tasks being killed inside the container because of cgroup limits (set
by Kubernetes and passed down to the Docker daemon), while the Airflow worker process itself
(the airflow celery command) survived. The killed tasks ended up as zombies, which the
scheduler identified and set to failed. We decreased the number of Celery slots on each
worker, reducing memory pressure, and the failures stopped. It happened mostly with one DAG
that had simple operators but was quite big, with many subdags.
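The scheduler writes these zombie detections to its own log, so they are usually greppable. A minimal sketch, assuming a conventional scheduler log location (the default path and the AIRFLOW_LOG_DIR override are assumptions, adjust them to your deployment):

```shell
# Sketch: surface zombie-detection events in the scheduler logs.
# The default path below is an assumption; override it via AIRFLOW_LOG_DIR.
LOG_DIR="${AIRFLOW_LOG_DIR:-/usr/local/airflow/logs/scheduler}"
grep -ri "zombie" "$LOG_DIR" 2>/dev/null | tail -n 20
```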

Maybe you are facing the same issue?
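For reference, the slot reduction is a one-line change in airflow.cfg; on 1.9 the Celery concurrency key should be celeryd_concurrency (a sketch, the value 8 is purely illustrative, tune it against your pod's memory limit):

```ini
[celery]
# Fewer slots per worker -> lower peak memory per pod, fewer cgroup kills.
# 8 is an illustrative value, not a recommendation.
celeryd_concurrency = 8
```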


On Fri, Aug 24, 2018 at 3:40 PM Dubey, Somesh <> wrote:

> We are on Airflow 1.9 on Kubernetes. We have many Hive DAGs which we
> call via JDBC (through a REST endpoint), and tasks fail randomly. One
> day one fails, and the next day some other one does.
> The pods are all healthy, there are no node evictions or other issues,
> and retries typically work. Even when a task fails in Airflow, it
> completes successfully on the Hive side.
> We do not get useful logs of the failure: typically we see partial SQL
> in sysout in the task log, with no error message, even with the log
> level increased to 5 on the JDBC connection.
> This only happens in production; we have not been able to reproduce
> the issue in dev.
> The closest we have gotten in dev to replicating similar behavior is
> to kill the PID (manually killing airflow run --raw) on a worker node.
> In that case, too, things run fine on the Hive side but the task fails
> in Airflow, and a similar task log with no error message is produced.
> Have you seen this kind of behavior? Any help is much appreciated.
> Thanks so much,
> Somesh
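The dev reproduction described above (killing the raw task runner on a worker) can be sketched as follows; the process pattern is an assumption based on how the 1.9 task runner shows up in ps:

```shell
# Sketch of the dev reproduction: kill the raw task runner on a worker,
# mimicking an external (e.g. cgroup OOM) kill. The pattern assumes the
# runner appears as "airflow run ... --raw" in the process list.
pids=$(pgrep -f "airflow run .*--raw" || true)
if [ -n "$pids" ]; then
    # SIGKILL leaves the runner no chance to write an error to the task
    # log, matching the "partial output, no error message" symptom.
    kill -9 $pids
fi
```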


GetYourGuide AG

Stampfenbachstrasse 48  

8006 Zürich


