airflow-commits mailing list archives

From "nakostibas (JIRA)" <j...@apache.org>
Subject [jira] [Created] (AIRFLOW-37) "No such transport" errors when using CeleryExecutor
Date Tue, 03 May 2016 00:48:12 GMT
nakostibas created AIRFLOW-37:
---------------------------------

             Summary: "No such transport" errors when using CeleryExecutor
                 Key: AIRFLOW-37
                 URL: https://issues.apache.org/jira/browse/AIRFLOW-37
             Project: Apache Airflow
          Issue Type: Bug
         Environment: Linux 4.2.0-35-generic #40~14.04.1-Ubuntu SMP x86_64 GNU/Linux
            Reporter: nakostibas
            Priority: Critical


Airflow runs fine with the LocalExecutor, both on the sample DAGs and on DAGs I have written. When I switch to the CeleryExecutor, "airflow scheduler" fails with the following stack trace:

{code:none}
[2016-05-03 00:23:15,825] {celery_executor.py:62} INFO - [celery] queuing ('example_branch_operator', 'run_this_first', datetime.datetime(2016, 4, 27, 0, 0)) through celery, queue=default
[2016-05-03 00:23:15,827] {jobs.py:660} ERROR - 'No such transport: '
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/airflow/jobs.py", line 657, in _execute
    executor.heartbeat()
  File "/usr/local/lib/python2.7/dist-packages/airflow/executors/base_executor.py", line 86, in heartbeat
    self.execute_async(key, command=command, queue=queue)
  File "/usr/local/lib/python2.7/dist-packages/airflow/executors/celery_executor.py", line 64, in execute_async
    args=[command], queue=queue)
  File "/usr/local/lib/python2.7/dist-packages/celery/app/task.py", line 565, in apply_async
    **dict(self._get_exec_options(), **options)
  File "/usr/local/lib/python2.7/dist-packages/celery/app/base.py", line 348, in send_task
    with self.producer_or_acquire(producer) as P:
  File "/usr/local/lib/python2.7/dist-packages/celery/app/base.py", line 403, in producer_or_acquire
    producer, self.amqp.producer_pool.acquire, block=True,
  File "/usr/local/lib/python2.7/dist-packages/celery/app/amqp.py", line 502, in producer_pool
    self.app.pool,
  File "/usr/local/lib/python2.7/dist-packages/celery/app/base.py", line 609, in pool
    self._pool = self.connection().Pool(limit=limit)
  File "/usr/local/lib/python2.7/dist-packages/celery/app/base.py", line 386, in connection
    'BROKER_CONNECTION_TIMEOUT', connect_timeout
  File "/usr/local/lib/python2.7/dist-packages/kombu/connection.py", line 165, in __init__
    if not get_transport_cls(transport).can_parse_url:
  File "/usr/local/lib/python2.7/dist-packages/kombu/transport/__init__.py", line 109, in get_transport_cls
    _transport_cache[transport] = resolve_transport(transport)
  File "/usr/local/lib/python2.7/dist-packages/kombu/transport/__init__.py", line 89, in resolve_transport
    raise KeyError('No such transport: {0}'.format(transport))
KeyError: 'No such transport: '
{code}
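The empty transport name in the final KeyError is the telling part: kombu never extracted a scheme from the broker URL at all. A minimal sketch of how that can happen (kombu relies on the standard library's urlparse to split broker URLs; the quoted value mirrors the config below):

{code:python}
from urlparse import urlparse  # urllib.parse in Python 3

# A bare URL yields the expected scheme, which kombu maps to a transport.
print(urlparse('redis://localhost:6379/0').scheme)
# -> 'redis'

# A value wrapped in literal quote characters has no valid scheme, so the
# transport name resolves to the empty string and resolve_transport('')
# raises: KeyError: 'No such transport: '
print(urlparse("'redis://localhost:6379/0'").scheme)
# -> ''
{code}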

The [celery] section of my airflow.cfg is set up as follows:
{code:none}
[celery]
celery_app_name = airflow.executors.celery_executor
celeryd_concurrency = 16
worker_log_server_port = 8793
broker_url = 'redis://localhost:6379/0'
celery_result_backend = 'redis://localhost:6379/0'
flower_port = 5555
{code}
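Worth noting: ConfigParser keeps surrounding quotes as part of the value, so the broker URL kombu receives literally starts with a quote character. If the quotes are the trigger (an assumption, not confirmed here), the unquoted form would be:

{code:none}
[celery]
broker_url = redis://localhost:6379/0
celery_result_backend = redis://localhost:6379/0
{code}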

Redis and Celery appear to be working on their own: I can successfully run the example Celery application from the first-steps tutorial here:
http://docs.celeryproject.org/en/latest/getting-started/first-steps-with-celery.html#application
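
For reference, that first-steps check amounts to running the tutorial app against the same broker. A sketch of the code from the page above, pointed at the same Redis URL:

{code:python}
from celery import Celery

# tasks.py from the Celery first-steps tutorial, using the same Redis broker.
app = Celery('tasks', broker='redis://localhost:6379/0')

@app.task
def add(x, y):
    return x + y
{code}

Starting a worker with "celery -A tasks worker" and calling add.delay(4, 4) succeeds, which suggests the broker itself is healthy and the failure lies in how the URL from airflow.cfg reaches kombu.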




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
