airflow-commits mailing list archives

From GitBox <...@apache.org>
Subject [GitHub] [airflow] kaxil commented on a change in pull request #6515: [AIRFLOW-XXX] GSoD: How to make DAGs production ready
Date Mon, 25 Nov 2019 14:50:02 GMT
kaxil commented on a change in pull request #6515: [AIRFLOW-XXX] GSoD: How to make DAGs production ready
URL: https://github.com/apache/airflow/pull/6515#discussion_r350227048
 
 

 ##########
 File path: docs/best-practices.rst
 ##########
 @@ -0,0 +1,296 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Best Practices
+==============
+
+Running Airflow in production is seamless. It comes bundled with all the plugins and configs
+necessary to run most of the DAGs. However, you can come across certain pitfalls, which can cause occasional errors.
+Let's take a look at what you need to do at various stages to avoid these pitfalls, starting from writing the DAG
+to the actual deployment in the production environment.
+
+
+Writing a DAG
+^^^^^^^^^^^^^^
+Creating a new DAG in Airflow is quite simple. However, there are many things that you need to take care of
+to ensure the DAG run or failure does not produce unexpected results.
+
+Creating a task
+---------------
+
+You should treat tasks in Airflow as equivalent to transactions in a database. This implies that you should never produce
+incomplete results from your tasks. An example is not to produce incomplete data in ``HDFS`` or ``S3`` at the end of a task.
+
+Airflow can retry a task if it fails. Thus, the tasks should produce the same outcome on every re-run.
+Some of the ways you can avoid producing a different result:
+
+* Do not use INSERT during a task re-run; an INSERT statement might lead to duplicate rows in your database.
+  Replace it with UPSERT.
+* Read and write in a specific partition. Never read the latest available data in a task.
+  Someone may update the input data between re-runs, which results in different outputs.
+  A better way is to read the input data from a specific partition. You can use ``execution_date`` as a partition
+  (see the sketch after this list).
+  You should follow this partitioning method while writing data in S3/HDFS, as well.
+* The Python datetime ``now()`` function gives the current datetime object.
+  This function should never be used inside a task, especially for critical computation,
+  as it leads to different outcomes on each run.
+  It's fine to use it, for example, to generate a temporary log.
+
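+A minimal sketch of the partitioning idea, assuming a ``dag`` object is already defined and using made-up S3 paths
+and a hypothetical ``process_partition`` callable:
+
+.. code:: python
+
+    from airflow.operators.python_operator import PythonOperator
+
+    def process_partition(ds, **kwargs):
+        # ``ds`` is the execution date as YYYY-MM-DD, so every re-run of the same task
+        # instance reads and writes exactly the same partition (paths are illustrative).
+        input_path = "s3://my-bucket/raw/dt={}/".format(ds)
+        output_path = "s3://my-bucket/processed/dt={}/".format(ds)
+        # ... read from input_path, transform, and overwrite output_path ...
+
+    process_task = PythonOperator(
+        task_id="process_partition",
+        python_callable=process_partition,
+        provide_context=True,  # passes ``ds`` and the rest of the context as keyword arguments
+        dag=dag,
+    )
+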
+.. tip::
+
+    You should define repetitive parameters such as ``connection_id`` or S3 paths in ``default_args`` rather than declaring them for each task.
+    Using ``default_args`` helps to avoid mistakes such as typographical errors.
+
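+A short sketch of this pattern (the connection id and the other values below are only illustrative):
+
+.. code:: python
+
+    from datetime import datetime
+
+    from airflow import DAG
+
+    default_args = {
+        "owner": "airflow",
+        "retries": 2,
+        # A parameter that many tasks share, declared once instead of on every task.
+        "aws_conn_id": "aws_default",
+    }
+
+    dag = DAG(
+        dag_id="my_dag",
+        default_args=default_args,
+        start_date=datetime(2019, 1, 1),
+        schedule_interval="@daily",
+    )
+
+Operators attached to this DAG pick up matching ``default_args`` entries automatically, so a typo only has to be fixed in one place.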
+
+Deleting a task
+----------------
+
+Never delete a task from a DAG. In case of deletion, the historical information of the task disappears from the Airflow UI.
+It is advised to create a new DAG in case the tasks need to be deleted.
+
+
+Communication
+--------------
+
+Airflow executes tasks of a DAG on different servers in case you are using :doc:`Kubernetes executor <../executor/kubernetes>`
+or :doc:`Celery executor <../executor/celery>`.
+Therefore, you should not store any file or config in the local filesystem as the next task is likely to run on a different
+server without access to it, for example, a task that downloads the data file that the next task processes.
+In the case of :class:`Local executor <airflow.executors.local_executor.LocalExecutor>`,
+storing a file on disk can make retries harder, e.g., if your task requires a config file that is deleted by another task in the DAG.
+
+If possible, use ``XCom`` to communicate small messages between tasks; a good way of passing
+larger data between tasks is to use remote storage such as S3/HDFS.
+For example, if we have a task that stores processed data in S3, that task can push the S3
+path for the output data in ``XCom``,
+and the downstream tasks can pull the path from XCom and use it to read the data.
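+
+A rough sketch of this pattern with two hypothetical tasks (the task ids and S3 paths are made up; a ``dag`` object is assumed):
+
+.. code:: python
+
+    from airflow.operators.python_operator import PythonOperator
+
+    def write_to_s3(**context):
+        output_path = "s3://my-bucket/processed/dt={}/".format(context["ds"])
+        # ... write the processed data to output_path ...
+        return output_path  # the returned value is pushed to XCom automatically
+
+    def read_from_s3(**context):
+        # Pull the path pushed by the upstream task and read the data from there.
+        input_path = context["ti"].xcom_pull(task_ids="write_to_s3")
+        # ... read and process the data at input_path ...
+
+    write_task = PythonOperator(
+        task_id="write_to_s3", python_callable=write_to_s3, provide_context=True, dag=dag)
+    read_task = PythonOperator(
+        task_id="read_from_s3", python_callable=read_from_s3, provide_context=True, dag=dag)
+    write_task >> read_task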
+
+The tasks should also not store any authentication parameters such as passwords or tokens inside them.
+Wherever possible, use :ref:`Connections <concepts-connections>` to store data
+securely in the Airflow backend and retrieve them using a unique connection id.
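+
+For instance, a task can look up credentials at run time from a connection id (``my_api_default`` below is only a
+placeholder; any connection stored in the Airflow backend works the same way):
+
+.. code:: python
+
+    from airflow.hooks.base_hook import BaseHook
+
+    def call_external_service(**kwargs):
+        # The credentials live (encrypted) in the Airflow backend, not in the DAG file.
+        conn = BaseHook.get_connection("my_api_default")
+        login, password = conn.login, conn.password
+        # ... authenticate against the external service with login/password ...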
+
+
+Variables
+---------
+
+You should avoid usage of Variables outside an operator's ``execute()`` method or Jinja templates if possible,
+as Variables create a connection to the metadata DB of Airflow to fetch the value, which can
+slow down parsing and place extra load on the DB.
+
+Airflow parses all the DAGs in the background at a specific interval.
+The interval is set using the ``processor_poll_interval`` config and is 1 second by default.
+During parsing, Airflow creates a new connection to the metadata DB for each DAG.
+It can result in a lot of open connections.
+
+The best way of using variables is via a Jinja template, which will delay reading the value
+until the task execution. The template syntax to do this is:
+
+.. code::
+
+    {{ var.value.<variable_name> }}
+
+or if you need to deserialize a JSON object from the variable:
+
+.. code::
+
+    {{ var.json.<variable_name> }}
+
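+For example, a templated field of an operator can reference the variable so that the lookup only happens when the task
+runs (``my_variable`` and the script name are placeholders; a ``dag`` object is assumed):
+
+.. code:: python
+
+    from airflow.operators.bash_operator import BashOperator
+
+    run_report = BashOperator(
+        task_id="run_report",
+        # The variable is resolved at task run time, not while the DAG file is parsed.
+        bash_command="run_report.sh {{ var.value.my_variable }}",
+        dag=dag,
+    )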
+
+.. note::
+
+    In general, you should not write any code outside the tasks. The code outside the tasks
+    runs every time Airflow parses the DAG, which happens every second by default.
+
+
+Testing a DAG
+^^^^^^^^^^^^^
+
+Airflow users should treat DAGs as production level code. The DAGs should have various tests
+to ensure that they produce expected results.
+You can write a wide variety of tests for a DAG. Let's take a look at some of them.
+
+DAG Loader Test
+---------------
+
+This test should ensure that your DAG does not contain a piece of code that raises an error
+while loading.
+No additional code needs to be written by the user to run this test.
+
+.. code::
+
+ python your-dag-file.py
+
+Running the above command without any error ensures your DAG does not contain any uninstalled
+dependencies, syntax errors, etc.
+
+You can look into :ref:`Testing a DAG <testing>` for details on how to test individual
+operators.
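+
+If you prefer to run the same check from your test suite, a small sketch using ``DagBag`` (the ``dags/`` folder path is a
+placeholder) can assert that every DAG file imports cleanly:
+
+.. code:: python
+
+    from airflow.models import DagBag
+
+    def test_no_import_errors():
+        dag_bag = DagBag(dag_folder="dags/", include_examples=False)
+        # import_errors maps DAG file paths to the exception raised while loading them.
+        assert len(dag_bag.import_errors) == 0, dag_bag.import_errors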
+
+Unit tests
+-----------
+
+Unit tests ensure that there is no incorrect code in your DAG. You can write a unit test
+for your tasks as well as your DAG.
+
+**Unit test for loading a DAG:**
+
+.. code::
 
 Review comment:
   ```suggestion
   .. code:: python
   ```

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services
