airflow-dev mailing list archives

From harish singh <>
Subject AirflowTaskTimeout: Timeout: pipeline getting stalled
Date Wed, 22 Mar 2017 19:50:52 GMT
Hi guys,

So I have Airflow 1.8 running at my company now. Overall, performance
has improved and scheduling has been faster.
The jobs run and the pipeline does progress, but I am running into a few
issues. Please help if you have seen this before. Any help will be
appreciated.

1. Jobs get scheduled -> queued, but never reach Running.
I read an email from Bolke where the suggestion was to increase the size of
But this hasn't worked.
I manually cleared the tasks and saw Airflow run them after clearing.
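For reference, these are the sizing knobs in airflow.cfg I have been looking at; the values below are only illustrative, not a recommendation:

```ini
[core]
# Max task instances running concurrently across the whole installation
parallelism = 32
# Max task instances allowed to run concurrently within a single DAG
dag_concurrency = 16
# Max active DAG runs per DAG
max_active_runs_per_dag = 16
```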

2. For the same issue above, I saw Timeout errors.
I still haven't been able to understand why this happens.
This is the entire trace:

[2017-03-22 19:35:16,332] {} INFO - Filling up the DagBag from /usr/local/airflow/pipeline/
[2017-03-22 19:35:22,451] {} INFO - loading setup.cfg file
[2017-03-22 19:35:51,041] {} ERROR - Process timed out
[2017-03-22 19:35:51,041] {} ERROR - Failed to import: /usr/local/airflow/pipeline/
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/airflow/", line 263, in process_file
    m = imp.load_source(mod_name, filepath)
  File "/usr/local/airflow/pipeline/", line 167, in <module>
    create_tasks(dbguid, version, dag, override_start_date)
  File "/usr/local/airflow/pipeline/", line 104, in create_tasks
    t = create_task(dbguid, dag, taskInfo, version, override_date)
  File "/usr/local/airflow/pipeline/", line 85, in create_task
    retries, 1, depends_on_past, version, override_dag_date)
  File "/usr/local/airflow/pipeline/dags/", line 90, in create_python_operator
    depends_on_past=depends_on_past)
  File "/usr/local/lib/python2.7/dist-packages/airflow/utils/", line 86, in wrapper
    result = func(*args, **kwargs)
  File line 65, in __init__
    super(PythonOperator, self).__init__(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/airflow/utils/", line 70, in wrapper
    sig = signature(func)
  File "/usr/local/lib/python2.7/dist-packages/funcsigs/", line 105, in signature
    return Signature.from_function(obj)
  File "/usr/local/lib/python2.7/dist-packages/funcsigs/", line 594, in from_function
    __validate_parameters__=False)
  File "/usr/local/lib/python2.7/dist-packages/funcsigs/", line 518, in __init__
    for param in parameters))
  File "/usr/lib/python2.7/", line 52, in __init__
    self.__update(*args, **kwds)
  File "/usr/lib/python2.7/", line 548, in update
    self[key] = value
  File "/usr/lib/python2.7/", line 61, in __setitem__
    last[1] = root[0] = self.__map[key] = [last, root, key]
  File "/usr/local/lib/python2.7/dist-packages/airflow/utils/", line 38, in handle_timeout
    raise AirflowTaskTimeout(self.error_message)
AirflowTaskTimeout: Timeout
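As far as I can tell, the timeout fires while the scheduler is importing the DAG file itself, using a SIGALRM-based guard in airflow.utils. Here is a minimal, self-contained sketch of that mechanism to show why slow top-level work in a DAG file produces exactly this error; the class and exception names below are hypothetical stand-ins, not Airflow's actual API:

```python
import signal
import time

class ImportTimeout(Exception):
    """Hypothetical stand-in for AirflowTaskTimeout."""

class timeout(object):
    """Minimal sketch of a SIGALRM-based timeout guard, the same
    mechanism airflow.utils.timeout appears to use when parsing DAG
    files. This is an illustration, not Airflow's code."""

    def __init__(self, seconds, error_message="Process timed out"):
        self.seconds = seconds
        self.error_message = error_message

    def handle_timeout(self, signum, frame):
        # Called by the OS when the alarm fires mid-import.
        raise ImportTimeout(self.error_message)

    def __enter__(self):
        signal.signal(signal.SIGALRM, self.handle_timeout)
        signal.alarm(self.seconds)

    def __exit__(self, exc_type, exc_val, exc_tb):
        signal.alarm(0)  # always cancel the pending alarm

caught = None
try:
    with timeout(1):
        time.sleep(2)  # stands in for slow top-level work in a DAG file
except ImportTimeout as exc:
    caught = exc

print(caught)  # the same "Process timed out" message seen in the log
```

So any DAG file that spends longer than the import timeout doing work at module level (e.g. database calls inside create_tasks) gets killed mid-import, which would explain why the tasks end up stuck until manually cleared.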

3. "_cmd" doesn't work anymore for fetching the sqlalchemy connection string.
Even when I am using MySQL (the connection URL doesn't include 'sqlite'), I get:
"error: cannot use sqlite with the LocalExecutor"
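For context, this is the kind of "_cmd" setting I mean. The key name follows the documented "<option>_cmd" pattern; the command and path below are just an example from my setup, not something in the Airflow docs:

```ini
[core]
# Instead of putting the connection string inline, e.g.
#   sql_alchemy_conn = mysql://user:pass@host:3306/airflow
# fetch it from a command at startup:
sql_alchemy_conn_cmd = cat /etc/airflow/sql_alchemy_conn
```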
