airflow-commits mailing list archives

From "Sean Byrne (JIRA)" <>
Subject [jira] [Commented] (AIRFLOW-1979) Redis celery backend not work on 1.9.0 (configuration is ignored)
Date Thu, 29 Mar 2018 22:40:00 GMT


Sean Byrne commented on AIRFLOW-1979:

I'm using 1.9 from PyPI and was also encountering this issue.

When using 1.9 with Celery 3.1.19, it seemed Airflow was not passing the correct broker_url
and result_backend to the Celery executor. While debugging, I saw that the following
if always returned true, even when "celery_config_options" was not in my airflow.cfg file:
if configuration.has_option('celery', 'celery_config_options'):
    celery_configuration = import_string(
        configuration.get('celery', 'celery_config_options')
    )
else:
    celery_configuration = DEFAULT_CELERY_CONFIG

import_string itself seemed to work fine: it picked up the correct broker_url from airflow.cfg,
but the value got overwritten before the task actually started. I couldn't tell exactly where
this happened.
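For readers unfamiliar with it, import_string just resolves a dotted path to a Python object. A minimal sketch of that behavior (a stand-in for illustration, not Airflow's actual implementation) looks like:

```python
import importlib

def import_string(dotted_path):
    # Split "pkg.module.attr" into a module path and an attribute name,
    # import the module, and return the named attribute.
    module_path, _, attr_name = dotted_path.rpartition('.')
    module = importlib.import_module(module_path)
    return getattr(module, attr_name)

# e.g. import_string('celery_config.CELERY_CONFIG') would return the
# CELERY_CONFIG object from a celery_config.py module on the path.
```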

I worked around the issue by following this guide [] to
override the default config.
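For reference, the workaround amounts to pointing celery_config_options at your own config object. A minimal sketch, where the module name, Redis URLs, and exact keys are assumptions for illustration (keys follow Celery's lowercase setting names; adjust to your Celery version):

```python
# celery_config.py -- a hypothetical module referenced from airflow.cfg:
#
#   [celery]
#   celery_config_options = celery_config.CELERY_CONFIG
#
# The Redis URLs below are placeholders; point them at your own broker
# and result backend.
CELERY_CONFIG = {
    'accept_content': ['json', 'pickle'],
    'event_serializer': 'json',
    'worker_prefetch_multiplier': 1,
    'task_acks_late': True,
    'broker_url': 'redis://my-redis-host:6379/0',
    'result_backend': 'redis://my-redis-host:6379/1',
}
```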


Also of note: when using Celery 4.1, the configuration carried over on worker start, but
the worker failed due to other errors when using Postgres as a Celery backend.

Hope this helps.

> Redis celery backend not work on 1.9.0 (configuration is ignored)
> -----------------------------------------------------------------
>                 Key: AIRFLOW-1979
>                 URL:
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: celery, worker
>    Affects Versions: 1.9.0
>            Reporter: Norio Akagi
>            Priority: Major
> Worker tries to connect to RabbitMQ based on a default setting and shows an error as
> {noformat}
> [2018-01-09 16:45:42,778] {} INFO - Generating grammar tables from /usr/lib/python2.7/lib2to3/Grammar.txt
> [2018-01-09 16:45:42,802] {} INFO - Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
> [2018-01-09 16:45:43,051] {} WARNING - section/key [celery/celery_ssl_active] not found in config
> [2018-01-09 16:45:43,051] {} WARNING - Celery Executor will run without
> [2018-01-09 16:45:43,052] {} INFO - Using executor CeleryExecutor
> [2018-01-09 16:45:43,140: WARNING/MainProcess] /usr/local/lib/python2.7/dist-packages/celery/apps/
> Starting from version 3.2 Celery will refuse to accept pickle by default.
> The pickle serializer is a security concern as it may give attackers
> the ability to execute any command.  It's important to secure
> your broker from unauthorized access when using pickle, so we think
> that enabling pickle should require a deliberate action and not be
> the default choice.
> If you depend on pickle then you should set a setting to disable this
> warning and to be sure that everything will continue working
> when you upgrade to Celery 3.2::
>     CELERY_ACCEPT_CONTENT = ['pickle', 'json', 'msgpack', 'yaml']
> You must only enable the serializers that you will actually use.
>   warnings.warn(CDeprecationWarning(W_PICKLE_DEPRECATED))
> [2018-01-09 16:45:43,240: ERROR/MainProcess] consumer: Cannot connect to amqp://guest:**@ [Errno 111] Connection refused.
> Trying again in 2.00 seconds...
> {noformat}
> I deploy Airflow on Kubernetes, so each component (web, scheduler, worker, and flower)
> is containerized and distributed among nodes. I set {{AIRFLOW__CELERY__CELERY_RESULT_BACKEND}}
> and {{AIRFLOW__CELERY__BROKER_URL}} in environment variables, and they can be seen when
> I run {{printenv}} in a container, but they look completely ignored.
> Moving these values to {{airflow.cfg}} doesn't work either.
> It worked just perfectly on 1.8 and suddenly stopped working when I upgraded Airflow to 1.9.0.
> Do you have any idea what may cause this configuration issue?
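As a side note, the environment variables in the report follow Airflow's AIRFLOW__{SECTION}__{KEY} naming convention (section and key upper-cased, joined by double underscores). A quick sketch of the mapping:

```python
def airflow_env_var(section, key):
    # Build the environment-variable name Airflow checks before falling
    # back to airflow.cfg: AIRFLOW__{SECTION}__{KEY}, upper-cased.
    return 'AIRFLOW__{}__{}'.format(section.upper(), key.upper())

# The reporter's two variables:
#   airflow_env_var('celery', 'broker_url')
#   airflow_env_var('celery', 'celery_result_backend')
```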

This message was sent by Atlassian JIRA
