airflow-commits mailing list archives

From "nakostibas (JIRA)" <>
Subject [jira] [Created] (AIRFLOW-37) "No such transport" errors when using CeleryExecutor
Date Tue, 03 May 2016 00:48:12 GMT
nakostibas created AIRFLOW-37:

             Summary: "No such transport" errors when using CeleryExecutor
                 Key: AIRFLOW-37
             Project: Apache Airflow
          Issue Type: Bug
         Environment: Linux 4.2.0-35-generic #40~14.04.1-Ubuntu SMP x86_64 GNU/Linux
            Reporter: nakostibas
            Priority: Critical

Airflow functions fine when using the LocalExecutor, both on the example DAGs and on DAGs I have written. When I switch to the CeleryExecutor, "airflow scheduler" produces the following stack trace:

[2016-05-03 00:23:15,825] {} INFO - [celery] queuing ('example_branch_operator', 'run_this_first', datetime.datetime(2016, 4, 27, 0, 0)) through celery, queue=default
[2016-05-03 00:23:15,827] {} ERROR - 'No such transport: '
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/airflow/", line 657, in _execute
  File "/usr/local/lib/python2.7/dist-packages/airflow/executors/", line 86, in heartbeat
    self.execute_async(key, command=command, queue=queue)
  File "/usr/local/lib/python2.7/dist-packages/airflow/executors/", line 64, in execute_async
    args=[command], queue=queue)
  File "/usr/local/lib/python2.7/dist-packages/celery/app/", line 565, in apply_async
    **dict(self._get_exec_options(), **options)
  File "/usr/local/lib/python2.7/dist-packages/celery/app/", line 348, in send_task
    with self.producer_or_acquire(producer) as P:
  File "/usr/local/lib/python2.7/dist-packages/celery/app/", line 403, in producer_or_acquire
    producer, self.amqp.producer_pool.acquire, block=True,
  File "/usr/local/lib/python2.7/dist-packages/celery/app/", line 502, in producer_pool
  File "/usr/local/lib/python2.7/dist-packages/celery/app/", line 609, in pool
    self._pool = self.connection().Pool(limit=limit)
  File "/usr/local/lib/python2.7/dist-packages/celery/app/", line 386, in connection
    'BROKER_CONNECTION_TIMEOUT', connect_timeout
  File "/usr/local/lib/python2.7/dist-packages/kombu/", line 165, in __init__
    if not get_transport_cls(transport).can_parse_url:
  File "/usr/local/lib/python2.7/dist-packages/kombu/transport/", line 109, in
    _transport_cache[transport] = resolve_transport(transport)
  File "/usr/local/lib/python2.7/dist-packages/kombu/transport/", line 89, in resolve_transport
    raise KeyError('No such transport: {0}'.format(transport))
KeyError: 'No such transport: '

The Celery section of my airflow.cfg is set up as follows:
celery_app_name = airflow.executors.celery_executor
celeryd_concurrency = 16
worker_log_server_port = 8793
broker_url = 'redis://localhost:6379/0'
celery_result_backend = 'redis://localhost:6379/0'
flower_port = 5555
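One thing that may be worth checking (an assumption on my part, not something confirmed above): airflow.cfg is read with Python's ConfigParser, which does not strip quote characters, so a quoted broker_url value would reach kombu with the literal quotes included. A quick sketch of how a leading quote empties the URL scheme that kombu uses to pick a transport, which would match the empty transport in the KeyError:

```python
# Sketch (assumption, not verified against this install): ConfigParser keeps
# quote characters, so a quoted broker_url value reaches kombu verbatim.
try:
    from urllib.parse import urlparse   # Python 3
except ImportError:
    from urlparse import urlparse       # Python 2, as in the trace above

quoted = "'redis://localhost:6379/0'"   # value as read from a quoted airflow.cfg entry
unquoted = "redis://localhost:6379/0"

# A URL scheme must start with a letter, so the leading quote leaves the
# scheme empty -- consistent with "No such transport: " (empty name) above.
print(repr(urlparse(quoted).scheme))    # ''
print(repr(urlparse(unquoted).scheme))  # 'redis'
```

If that is indeed the cause, removing the quotes (broker_url = redis://localhost:6379/0) would be worth trying.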

Redis / Celery appear to be working on their own, as I can successfully execute the example
Celery application here:

