airflow-commits mailing list archives

From "Maxime Beauchemin (JIRA)" <>
Subject [jira] [Commented] (AIRFLOW-31) Use standard imports for hooks/operators
Date Tue, 03 May 2016 01:07:12 GMT


Maxime Beauchemin commented on AIRFLOW-31:

The original idea was to provide a clean namespace under `airflow.operators` that would show
you directly all (and only) the operators that were available in your environment.

In practice this results in importing everything upfront when referencing `airflow.operators`,
and in some magic/non-standard behavior.

I'd do it in a heartbeat if it weren't for the drawback that changing this results in having
to change every single DAG file out there. There might be a hack where we could warn of future
deprecation when referencing operators and hooks "the old way". It's probably possible to
override `airflow.operators.__getattribute__` to implement this behavior.
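Later Pythons grew a cleaner hook for exactly this than overriding `__getattribute__`: a module-level `__getattr__` (PEP 562, Python 3.7+). A minimal sketch of such a deprecation shim, using stand-in module and class names (nothing below is Airflow's actual code):

```python
import importlib
import types
import warnings

# Hypothetical mapping from flat "old way" names to the submodules that
# really define them. The entries stand in for airflow.operators.* names.
_DEPRECATED_NAMES = {
    "OrderedDict": "collections",  # pretend this is an operator class
    "Path": "pathlib",
}

def _make_deprecating_module(name):
    """Build a module whose flat attribute access warns, then re-imports."""
    mod = types.ModuleType(name)

    def __getattr__(attr):
        if attr in _DEPRECATED_NAMES:
            submodule = _DEPRECATED_NAMES[attr]
            warnings.warn(
                f"Importing {attr} directly from {name} is deprecated; "
                f"import it from {submodule} instead.",
                DeprecationWarning,
                stacklevel=2,
            )
            return getattr(importlib.import_module(submodule), attr)
        raise AttributeError(f"module {name!r} has no attribute {attr!r}")

    # PEP 562: attribute lookup falls back to __getattr__ in the module dict.
    mod.__getattr__ = __getattr__
    return mod

demo = _make_deprecating_module("demo_operators")

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    cls = demo.OrderedDict  # old-style flat access triggers the shim

print(cls.__name__)                 # -> OrderedDict
print(caught[0].category.__name__)  # -> DeprecationWarning
```

Existing DAG files would keep working unchanged while emitting a `DeprecationWarning` that points users at the submodule import.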

> Use standard imports for hooks/operators
> ----------------------------------------
>                 Key: AIRFLOW-31
>                 URL:
>             Project: Apache Airflow
>          Issue Type: Improvement
>    Affects Versions: Airflow 2.0
>            Reporter: Jeremiah Lowin
>            Assignee: Jeremiah Lowin
>              Labels: enhancement
> (Migrated from
> Currently, Airflow uses a relatively complex import mechanism to import hooks and operators
without polluting the namespace with submodules. I would like to propose that Airflow abandon
that system and use standard Python importing.
> Here are a few major reasons why I think the current system has run its course.
> h3. Polluting namespace
> The biggest advantage of the current system, as I understand it, is that only Operators
appear in the `airflow.operators` namespace.  The submodules that actually contain the operators
do not.
> So for example while `airflow.operators.python_operator.PythonOperator` is a thing, `PythonOperator`
is in the `airflow.operators` namespace but `python_operator` is not.
> I think this sort of namespace pollution was helpful when Airflow was a smaller project,
but as the number of hooks/operators grows -- and especially as the `contrib` hooks/operators
grow -- I'd argue that namespacing is a *good thing*. It provides structure and organization,
and opportunities for documentation (through module docstrings).
> In fact, I'd argue that the current namespace is itself getting quite polluted -- the
only way to know what's available is to use something like IPython tab-completion to browse
an alphabetical list of Operator names, or to read the source file and grok the import definition
(which no one installing from PyPI is likely to do).
> h3. Conditional imports
> There's a second advantage to the current system: any module that fails to import
is silently ignored. That makes it easy to have optional dependencies. For example, if someone
doesn't have `boto` installed, then they don't have an `S3Hook` either. Same for a HiveOperator.
> Again, as Airflow grows and matures, I think this is a little too magic. If my environment
is missing a dependency, I want to hear about it.
> On the other hand, the `contrib` namespace sort of depends on this -- we don't want users
to have to install every single dependency. So I propose that contrib modules all live in
their submodules: `from airflow.contrib.operators.my_operator import MyOperator`. As mentioned
previously, having structure and namespacing is a good thing as the project gets more complex.
> Other ways to handle this include putting "non-standard" dependencies inside the operator/hook
rather than the module (see `HiveOperator`/`HiveHook`), so it can be imported but not used.
Another is judicious use of `try`/`except ImportError`. The simplest is to make people import
things explicitly from submodules.
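The `try`/`except ImportError` option can be sketched as follows. The dependency and hook names below are made up for illustration (the real cases would be e.g. `boto` and `S3Hook`); the pattern keeps the module importable while failing loudly on use:

```python
# "Judicious try/except ImportError": the module imports cleanly even when
# the optional dependency is missing, and the failure surfaces only when
# someone actually tries to use the hook.
try:
    import some_optional_dependency  # hypothetical; stands in for boto
    HAS_DEPENDENCY = True
except ImportError:
    some_optional_dependency = None
    HAS_DEPENDENCY = False

class OptionalHook:
    """Importable without the dependency; raises a clear error on use."""
    def __init__(self):
        if not HAS_DEPENDENCY:
            raise ImportError(
                "some_optional_dependency is required to use OptionalHook"
            )

hook_error = None
try:
    OptionalHook()
except ImportError as exc:
    hook_error = str(exc)

print(HAS_DEPENDENCY)  # False here, since the dependency is made up
```

Compared to silently dropping the module, the user who is "missing a dependency" gets told exactly which one and why.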
> h3. Operator dependencies
> Right now, operators can't depend on each other if they aren't in the same file. This
is for the simple reason that there is no guarantee on what order the operators will be loaded.
It all comes down to which dictionary key gets loaded first. One day Operator B could be loaded
after Operator A; the next day it might be loaded before. Consequently, A and B can't depend
on each other. Worse, if a user makes two operators that do depend on each other, they won't
get an error message when one fails to import.
> For contrib modules in particular, this is sort of killer.
> h3. Ease of use
> It's *hard* to set up imports for a new operator. The dictionary-based import instructions
aren't obvious to new users, and errors are silently dismissed, which makes debugging difficult.
> h3. Identity
> Surprisingly, `airflow.operators.SubDagOperator != airflow.operators.subdag_operator.SubDagOperator`.
See #1168.
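The identity problem is easy to reproduce outside Airflow: loading one source file under two different module names (roughly what the dictionary-based importer ends up doing) yields two distinct class objects, so `is` comparisons and `isinstance` checks break. File and class names below are illustrative:

```python
# Minimal reproduction: one source file, imported under two module names,
# produces two class objects that are not identical.
import importlib.util
import os
import tempfile

SOURCE = "class SubDagOperator:\n    pass\n"

def load_as(module_name, path):
    """Load the file at `path` as a module named `module_name`."""
    spec = importlib.util.spec_from_file_location(module_name, path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module

with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "subdag_operator.py")
    with open(path, "w") as f:
        f.write(SOURCE)

    a = load_as("pkg_a.subdag_operator", path)
    b = load_as("pkg_b.subdag_operator", path)

same = a.SubDagOperator is b.SubDagOperator
print(same)  # False: same source, two loads, two distinct classes
```

Standard imports avoid this because `sys.modules` caches each module under one name, so every import path hands back the same class object.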
> h2. Proposal
> Use standard python importing for hooks/operators/etc.
> - `` files use straightforward, standard Python imports
> - major operators are available at `airflow.operators.OperatorName` or `airflow.operators.operator_module.OperatorName`.
> - contrib operators are only available at `airflow.contrib.operators.operator_module.OperatorName`
in order to manage dependencies
> - operator authors are encouraged to use `__all__` to define their module's exports
> Possibly delete namespace afterward
> - in `operators/`, run a function at the end of the file which deletes all
modules from the namespace, leaving only `Operators`. This keeps the namespace clear but lets
people use familiar import mechanisms.
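That cleanup could look roughly like the following; the function operates on a plain dict standing in for the package namespace, and everything about it (name, placement, what counts as keepable) is a sketch:

```python
import json
import types

def _prune_modules(namespace):
    """Delete module objects from a namespace dict, keeping only the
    classes (etc.) they exported -- a sketch of the cleanup idea above."""
    module_names = [name for name, value in namespace.items()
                    if isinstance(value, types.ModuleType)]
    for name in module_names:
        del namespace[name]

# Simulated package namespace after standard `import` statements ran:
namespace = {
    "json": json,                     # submodule entry to prune
    "types": types,                   # another submodule entry
    "JSONDecoder": json.JSONDecoder,  # "operator"-style entry to keep
}
_prune_modules(namespace)
print(sorted(namespace))  # -> ['JSONDecoder']
```

In the real package this would presumably run against `globals()` at the end of the `__init__` module, after all the standard imports have executed.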
> Possibly use an import function to handle `ImportError` gracefully
> - rewrite `import_module_attrs` to take one module name at a time instead of a dictionary.
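One possible shape for that per-module helper; the function name comes from the issue text, but the signature and the warn-instead-of-silence behavior are assumptions, not Airflow's actual API:

```python
import importlib
import warnings

def import_module_attrs(namespace, module_name, attrs):
    """Import `attrs` from one module into `namespace`.

    Returns True on success. On ImportError it warns rather than silently
    skipping the module, so a missing dependency is at least visible.
    """
    try:
        module = importlib.import_module(module_name)
    except ImportError as exc:
        warnings.warn(f"could not import {module_name}: {exc}")
        return False
    for attr in attrs:
        namespace[attr] = getattr(module, attr)
    return True

ns = {}
ok = import_module_attrs(ns, "fractions", ["Fraction"])
with warnings.catch_warnings():
    warnings.simplefilter("ignore")  # silence the demo's expected warning
    bad = import_module_attrs(ns, "no_such_module_xyz", ["Whatever"])

print(ok, bad, "Fraction" in ns)  # -> True False True
```

Callers that genuinely want the old silent behavior can ignore the return value; callers that care can check it or escalate the warning to an error.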

This message was sent by Atlassian JIRA
