airflow-commits mailing list archives

From a..@apache.org
Subject [04/38] incubator-airflow-site git commit: Docs from 1.10.1
Date Thu, 29 Nov 2018 15:58:02 GMT
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/1f06fa0e/integration.html
----------------------------------------------------------------------
diff --git a/integration.html b/integration.html
index 326ae1e..12a5e30 100644
--- a/integration.html
+++ b/integration.html
@@ -159,6 +159,20 @@
 <li class="toctree-l4"><a class="reference internal" href="#bigqueryhook">BigQueryHook</a></li>
 </ul>
 </li>
+<li class="toctree-l3"><a class="reference internal" href="#cloud-sql">Cloud SQL</a><ul>
+<li class="toctree-l4"><a class="reference internal" href="#cloud-sql-operators">Cloud SQL Operators</a></li>
+<li class="toctree-l4"><a class="reference internal" href="#cloud-sql-hook">Cloud SQL Hook</a></li>
+</ul>
+</li>
+<li class="toctree-l3"><a class="reference internal" href="#compute-engine">Compute Engine</a><ul>
+<li class="toctree-l4"><a class="reference internal" href="#compute-engine-operators">Compute Engine Operators</a></li>
+</ul>
+</li>
+<li class="toctree-l3"><a class="reference internal" href="#cloud-functions">Cloud Functions</a><ul>
+<li class="toctree-l4"><a class="reference internal" href="#cloud-functions-operators">Cloud Functions Operators</a></li>
+<li class="toctree-l4"><a class="reference internal" href="#cloud-functions-hook">Cloud Functions Hook</a></li>
+</ul>
+</li>
 <li class="toctree-l3"><a class="reference internal" href="#cloud-dataflow">Cloud DataFlow</a><ul>
 <li class="toctree-l4"><a class="reference internal" href="#dataflow-operators">DataFlow Operators</a></li>
 <li class="toctree-l4"><a class="reference internal" href="#dataflowhook">DataFlowHook</a></li>
@@ -322,6 +336,15 @@ flexibility.</p>
 </div>
 </li>
 </ul>
+<p>To ensure that Airflow generates URLs with the correct scheme when
+running behind a TLS-terminating proxy, you should configure the proxy
+to set the <cite>X-Forwarded-Proto</cite> header, and enable the <cite>ProxyFix</cite>
+middleware in your <cite>airflow.cfg</cite>:</p>
+<div class="highlight-default notranslate"><div class="highlight"><pre><span></span><span class="n">enable_proxy_fix</span> <span class="o">=</span> <span class="kc">True</span>
+</pre></div>
+</div>
+<p>Note: you should only enable the <cite>ProxyFix</cite> middleware when running
+Airflow behind a trusted proxy (AWS ELB, nginx, etc.).</p>
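+<p>For reference, a minimal sketch of the corresponding <cite>airflow.cfg</cite>
+stanza (assuming the setting belongs under the <cite>[webserver]</cite> section):</p>
+<div class="highlight-default notranslate"><div class="highlight"><pre><span></span>[webserver]
+enable_proxy_fix = True
+</pre></div>
+</div>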
 </div>
 <div class="section" id="azure-microsoft-azure">
 <span id="azure"></span><h2>Azure: Microsoft Azure<a class="headerlink" href="#azure-microsoft-azure" title="Permalink to this headline">¶</a></h2>
@@ -503,6 +526,37 @@ using a SAS token by adding {“sas_token”: “YOUR_TOKEN”}.</p>
 </dd></dl>
 
 <dl class="method">
+<dt id="airflow.contrib.hooks.wasb_hook.WasbHook.delete_file">
+<code class="descname">delete_file</code><span class="sig-paren">(</span><em>container_name</em>, <em>blob_name</em>, <em>is_prefix=False</em>, <em>ignore_if_missing=False</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/hooks/wasb_hook.html#WasbHook.delete_file"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.hooks.wasb_hook.WasbHook.delete_file" title="Permalink to this definition">¶</a></dt>
+<dd><p>Delete a file from Azure Blob Storage.</p>
+<table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
+<li><strong>container_name</strong> (<em>str</em>) – Name of the container.</li>
+<li><strong>blob_name</strong> (<em>str</em>) – Name of the blob.</li>
+<li><strong>is_prefix</strong> (<em>bool</em>) – If blob_name is a prefix, delete all matching files</li>
+<li><strong>ignore_if_missing</strong> – if True, then return success even if the</li>
+</ul>
+</td>
+</tr>
+</tbody>
+</table>
+<p>blob does not exist.
+:type ignore_if_missing: bool
+:param kwargs: Optional keyword arguments that</p>
+<blockquote>
+<div><cite>BlockBlobService.create_blob_from_path()</cite> takes.</div></blockquote>
+<table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+</tbody>
+</table>
+</dd></dl>
+
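+<p>A minimal usage sketch for <cite>delete_file</cite>, assuming a <cite>wasb_default</cite>
+connection and hypothetical container and blob names:</p>
+<div class="highlight-default notranslate"><div class="highlight"><pre><span></span>from airflow.contrib.hooks.wasb_hook import WasbHook
+
+hook = WasbHook(wasb_conn_id='wasb_default')
+
+# Delete a single blob; succeed even if it is already gone.
+hook.delete_file('mycontainer', 'data/file.csv', ignore_if_missing=True)
+
+# Delete every blob whose name starts with the given prefix.
+hook.delete_file('mycontainer', 'data/', is_prefix=True)
+</pre></div>
+</div>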
+<dl class="method">
 <dt id="airflow.contrib.hooks.wasb_hook.WasbHook.get_conn">
 <code class="descname">get_conn</code><span class="sig-paren">(</span><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/hooks/wasb_hook.html#WasbHook.get_conn"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.hooks.wasb_hook.WasbHook.get_conn" title="Permalink to this definition">¶</a></dt>
 <dd><p>Return the BlockBlobService object.</p>
@@ -994,6 +1048,14 @@ Operators are in the contrib section.</p>
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.contrib.operators.emr_add_steps_operator.EmrAddStepsOperator.execute">
+<code class="descname">execute</code><span class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/emr_add_steps_operator.html#EmrAddStepsOperator.execute"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.emr_add_steps_operator.EmrAddStepsOperator.execute" title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 </div>
@@ -1020,6 +1082,14 @@ emr_connection extra. (templated)</li>
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.contrib.operators.emr_create_job_flow_operator.EmrCreateJobFlowOperator.execute">
+<code class="descname">execute</code><span class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/emr_create_job_flow_operator.html#EmrCreateJobFlowOperator.execute"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.emr_create_job_flow_operator.EmrCreateJobFlowOperator.execute" title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 </div>
@@ -1042,6 +1112,14 @@ emr_connection extra. (templated)</li>
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.contrib.operators.emr_terminate_job_flow_operator.EmrTerminateJobFlowOperator.execute">
+<code class="descname">execute</code><span class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/emr_terminate_job_flow_operator.html#EmrTerminateJobFlowOperator.execute"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.emr_terminate_job_flow_operator.EmrTerminateJobFlowOperator.execute" title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 </div>
@@ -1127,6 +1205,79 @@ Overrides for this config may be passed as the job_flow_overrides.</p>
 </dd></dl>
 
 <dl class="method">
+<dt id="airflow.hooks.S3_hook.S3Hook.copy_object">
+<code class="descname">copy_object</code><span class="sig-paren">(</span><em>source_bucket_key</em>, <em>dest_bucket_key</em>, <em>source_bucket_name=None</em>, <em>dest_bucket_name=None</em>, <em>source_version_id=None</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/hooks/S3_hook.html#S3Hook.copy_object"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.hooks.S3_hook.S3Hook.copy_object" title="Permalink to this definition">¶</a></dt>
+<dd><p>Creates a copy of an object that is already stored in S3.</p>
+<p>Note: the S3 connection used here needs to have access to both
+source and destination bucket/key.</p>
+<table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
+<li><strong>source_bucket_key</strong> (<em>str</em>) – <p>The key of the source object.</p>
+<p>It can be either a full s3:// style URL or a relative path from the root level.</p>
+<p>When it’s specified as a full s3:// URL, please omit source_bucket_name.</p>
+</li>
+<li><strong>dest_bucket_key</strong> (<em>str</em>) – <p>The key of the object to copy to.</p>
+<p>The convention to specify <cite>dest_bucket_key</cite> is the same
+as <cite>source_bucket_key</cite>.</p>
+</li>
+<li><strong>source_bucket_name</strong> (<em>str</em>) – <p>Name of the S3 bucket the source object is in.</p>
+<p>It should be omitted when <cite>source_bucket_key</cite> is provided as a full s3:// URL.</p>
+</li>
+<li><strong>dest_bucket_name</strong> (<em>str</em>) – <p>Name of the S3 bucket to which the object is copied.</p>
+<p>It should be omitted when <cite>dest_bucket_key</cite> is provided as a full s3:// URL.</p>
+</li>
+<li><strong>source_version_id</strong> (<em>str</em>) – Version ID of the source object (OPTIONAL)</li>
+</ul>
+</td>
+</tr>
+</tbody>
+</table>
+</dd></dl>
+
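+<p>A minimal usage sketch for <cite>copy_object</cite>, with hypothetical bucket and
+key names, showing both conventions described above:</p>
+<div class="highlight-default notranslate"><div class="highlight"><pre><span></span>from airflow.hooks.S3_hook import S3Hook
+
+hook = S3Hook(aws_conn_id='aws_default')
+
+# Full s3:// style URLs: omit the bucket name arguments.
+hook.copy_object('s3://src-bucket/data/file.csv',
+                 's3://dest-bucket/data/file.csv')
+
+# Relative keys: pass the bucket names explicitly.
+hook.copy_object('data/file.csv', 'data/file.csv',
+                 source_bucket_name='src-bucket',
+                 dest_bucket_name='dest-bucket')
+</pre></div>
+</div>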
+<dl class="method">
+<dt id="airflow.hooks.S3_hook.S3Hook.create_bucket">
+<code class="descname">create_bucket</code><span class="sig-paren">(</span><em>bucket_name</em>, <em>region_name=None</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/hooks/S3_hook.html#S3Hook.create_bucket"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.hooks.S3_hook.S3Hook.create_bucket" title="Permalink to this definition">¶</a></dt>
+<dd><p>Creates an Amazon S3 bucket.</p>
+<table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
+<li><strong>bucket_name</strong> (<em>str</em>) – The name of the bucket</li>
+<li><strong>region_name</strong> (<em>str</em>) – The name of the aws region in which to create the bucket.</li>
+</ul>
+</td>
+</tr>
+</tbody>
+</table>
+</dd></dl>
+
+<dl class="method">
+<dt id="airflow.hooks.S3_hook.S3Hook.delete_objects">
+<code class="descname">delete_objects</code><span class="sig-paren">(</span><em>bucket</em>, <em>keys</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/hooks/S3_hook.html#S3Hook.delete_objects"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.hooks.S3_hook.S3Hook.delete_objects" title="Permalink to this definition">¶</a></dt>
+<dd><table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
+<li><strong>bucket</strong> (<em>str</em>) – Name of the bucket in which you are going to delete object(s)</li>
+<li><strong>keys</strong> (<em>str</em><em> or </em><em>list</em>) – <p>The key(s) to delete from S3 bucket.</p>
+<p>When <code class="docutils literal notranslate"><span class="pre">keys</span></code> is a string, it’s supposed to be the key name of
+the single object to delete.</p>
+<p>When <code class="docutils literal notranslate"><span class="pre">keys</span></code> is a list, it’s supposed to be the list of the
+keys to delete.</p>
+</li>
+</ul>
+</td>
+</tr>
+</tbody>
+</table>
+</dd></dl>
+
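+<p>A short sketch combining <cite>create_bucket</cite> and <cite>delete_objects</cite>
+(hypothetical bucket and key names):</p>
+<div class="highlight-default notranslate"><div class="highlight"><pre><span></span>from airflow.hooks.S3_hook import S3Hook
+
+hook = S3Hook(aws_conn_id='aws_default')
+hook.create_bucket('my-scratch-bucket', region_name='us-east-1')
+
+# keys accepts a single key name or a list of key names.
+hook.delete_objects(bucket='my-scratch-bucket',
+                    keys=['tmp/a.csv', 'tmp/b.csv'])
+</pre></div>
+</div>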
+<dl class="method">
 <dt id="airflow.hooks.S3_hook.S3Hook.get_bucket">
 <code class="descname">get_bucket</code><span class="sig-paren">(</span><em>bucket_name</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/hooks/S3_hook.html#S3Hook.get_bucket"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.hooks.S3_hook.S3Hook.get_bucket" title="Permalink to this definition">¶</a></dt>
 <dd><p>Returns a boto3.S3.Bucket object</p>
@@ -1268,6 +1419,29 @@ by S3 and will be stored in an encrypted form while at rest in S3.</li>
 </dd></dl>
 
 <dl class="method">
+<dt id="airflow.hooks.S3_hook.S3Hook.load_file_obj">
+<code class="descname">load_file_obj</code><span class="sig-paren">(</span><em>file_obj</em>, <em>key</em>, <em>bucket_name=None</em>, <em>replace=False</em>, <em>encrypt=False</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/hooks/S3_hook.html#S3Hook.load_file_obj"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.hooks.S3_hook.S3Hook.load_file_obj" title="Permalink to this definition">¶</a></dt>
+<dd><p>Loads a file object to S3</p>
+<table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
+<li><strong>file_obj</strong> (<em>file-like object</em>) – The file-like object to set as the content for the S3 key.</li>
+<li><strong>key</strong> (<em>str</em>) – S3 key that will point to the file</li>
+<li><strong>bucket_name</strong> (<em>str</em>) – Name of the bucket in which to store the file</li>
+<li><strong>replace</strong> (<em>bool</em>) – A flag that indicates whether to overwrite the key
+if it already exists.</li>
+<li><strong>encrypt</strong> (<em>bool</em>) – If True, S3 encrypts the file on the server,
+and the file is stored in encrypted form at rest in S3.</li>
+</ul>
+</td>
+</tr>
+</tbody>
+</table>
+</dd></dl>
+
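+<p>A minimal usage sketch for <cite>load_file_obj</cite> (hypothetical file path,
+key and bucket names):</p>
+<div class="highlight-default notranslate"><div class="highlight"><pre><span></span>from airflow.hooks.S3_hook import S3Hook
+
+hook = S3Hook(aws_conn_id='aws_default')
+with open('/tmp/report.csv', 'rb') as f:
+    hook.load_file_obj(f, key='reports/report.csv',
+                       bucket_name='my-bucket', replace=True)
+</pre></div>
+</div>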
+<dl class="method">
 <dt id="airflow.hooks.S3_hook.S3Hook.load_string">
 <code class="descname">load_string</code><span class="sig-paren">(</span><em>string_data</em>, <em>key</em>, <em>bucket_name=None</em>, <em>replace=False</em>, <em>encrypt=False</em>, <em>encoding='utf-8'</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/hooks/S3_hook.html#S3Hook.load_string"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.hooks.S3_hook.S3Hook.load_string" title="Permalink to this definition">¶</a></dt>
 <dd><p>Loads a string to S3</p>
@@ -1381,6 +1555,14 @@ omit the transformation script if S3 Select expression is specified.</p>
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.operators.s3_file_transform_operator.S3FileTransformOperator.execute">
+<code class="descname">execute</code><span class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/operators/s3_file_transform_operator.html#S3FileTransformOperator.execute"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.operators.s3_file_transform_operator.S3FileTransformOperator.execute" title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
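+<p>A minimal DAG snippet for the operator (hypothetical keys and script path;
+an existing <cite>dag</cite> object is assumed):</p>
+<div class="highlight-default notranslate"><div class="highlight"><pre><span></span>from airflow.operators.s3_file_transform_operator import S3FileTransformOperator
+
+transform = S3FileTransformOperator(
+    task_id='transform_file',
+    source_s3_key='s3://my-bucket/input/data.csv',
+    dest_s3_key='s3://my-bucket/output/data.csv',
+    transform_script='/usr/local/bin/transform.py',
+    replace=True,
+    dag=dag)
+</pre></div>
+</div>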
 </dd></dl>
 
 </div>
@@ -1424,6 +1606,14 @@ such prefix. (templated)</li>
 </div>
 </dd>
 </dl>
+<dl class="method">
+<dt id="airflow.contrib.operators.s3_list_operator.S3ListOperator.execute">
+<code class="descname">execute</code><span class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/s3_list_operator.html#S3ListOperator.execute"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.s3_list_operator.S3ListOperator.execute" title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 </div>
@@ -1475,6 +1665,14 @@ dag=my-dag)</dd>
 </div></blockquote>
 <p>Note that <code class="docutils literal notranslate"><span class="pre">bucket</span></code>, <code class="docutils literal notranslate"><span class="pre">prefix</span></code>, <code class="docutils literal notranslate"><span class="pre">delimiter</span></code> and <code class="docutils literal notranslate"><span class="pre">dest_gcs</span></code> are
 templated, so you can use variables in them if you wish.</p>
+<dl class="method">
+<dt id="airflow.contrib.operators.s3_to_gcs_operator.S3ToGoogleCloudStorageOperator.execute">
+<code class="descname">execute</code><span class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/s3_to_gcs_operator.html#S3ToGoogleCloudStorageOperator.execute"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.s3_to_gcs_operator.S3ToGoogleCloudStorageOperator.execute" title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 </div>
@@ -1528,6 +1726,14 @@ required to process headers</li>
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.operators.s3_to_hive_operator.S3ToHiveTransfer.execute">
+<code class="descname">execute</code><span class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/operators/s3_to_hive_operator.html#S3ToHiveTransfer.execute"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.operators.s3_to_hive_operator.S3ToHiveTransfer.execute" title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 </div>
@@ -1548,30 +1754,39 @@ required to process headers</li>
 <col class="field-name" />
 <col class="field-body" />
 <tbody valign="top">
-<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first simple">
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
 <li><strong>task_definition</strong> (<em>str</em>) – the task definition name on EC2 Container Service</li>
 <li><strong>cluster</strong> (<em>str</em>) – the cluster name on EC2 Container Service</li>
+<li><strong>overrides</strong> (<em>dict</em>) – the same parameter that boto3 will receive (templated):
+<a class="reference external" href="http://boto3.readthedocs.org/en/latest/reference/services/ecs.html#ECS.Client.run_task">http://boto3.readthedocs.org/en/latest/reference/services/ecs.html#ECS.Client.run_task</a></li>
 <li><strong>aws_conn_id</strong> (<em>str</em>) – connection id of AWS credentials / region name. If None,
 the default boto3 credential strategy will be used
 (<a class="reference external" href="http://boto3.readthedocs.io/en/latest/guide/configuration.html">http://boto3.readthedocs.io/en/latest/guide/configuration.html</a>).</li>
-<li><strong>region_name</strong> – region name to use in AWS Hook.
+<li><strong>region_name</strong> (<em>str</em>) – region name to use in AWS Hook.
 Override the region_name in connection (if provided)</li>
-<li><strong>launch_type</strong> – the launch type on which to run your task (‘EC2’ or ‘FARGATE’)</li>
+<li><strong>launch_type</strong> (<em>str</em>) – the launch type on which to run your task (‘EC2’ or ‘FARGATE’)</li>
 </ul>
 </td>
 </tr>
-<tr class="field-even field"><th class="field-name">Param:</th><td class="field-body"><p class="first">overrides: the same parameter that boto3 will receive (templated):
-<a class="reference external" href="http://boto3.readthedocs.org/en/latest/reference/services/ecs.html#ECS.Client.run_task">http://boto3.readthedocs.org/en/latest/reference/services/ecs.html#ECS.Client.run_task</a></p>
-</td>
-</tr>
-<tr class="field-odd field"><th class="field-name">Type:</th><td class="field-body"><p class="first">overrides: dict</p>
-</td>
-</tr>
-<tr class="field-even field"><th class="field-name">Type:</th><td class="field-body"><p class="first last">launch_type: str</p>
-</td>
-</tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.contrib.operators.ecs_operator.ECSOperator.execute">
+<code class="descname">execute</code><span class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/ecs_operator.html#ECSOperator.execute"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.ecs_operator.ECSOperator.execute" title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
+<dl class="method">
+<dt id="airflow.contrib.operators.ecs_operator.ECSOperator.on_kill">
+<code class="descname">on_kill</code><span class="sig-paren">(</span><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/ecs_operator.html#ECSOperator.on_kill"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.ecs_operator.ECSOperator.on_kill" title="Permalink to this definition">¶</a></dt>
+<dd><p>Override this method to clean up subprocesses when a task instance
+gets killed. Any use of the threading, subprocess or multiprocessing
+module within an operator needs to be cleaned up or it will leave
+ghost processes behind.</p>
+</dd></dl>
+
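+<p>A minimal DAG snippet showing a templated <cite>overrides</cite> dict (the task
+definition, cluster and container names are hypothetical, and an existing
+<cite>dag</cite> object is assumed):</p>
+<div class="highlight-default notranslate"><div class="highlight"><pre><span></span>from airflow.contrib.operators.ecs_operator import ECSOperator
+
+run_task = ECSOperator(
+    task_id='run_ecs_task',
+    task_definition='my-task-def',
+    cluster='my-cluster',
+    overrides={'containerOverrides': [
+        {'name': 'my-container', 'command': ['echo', '{{ ds }}']}]},
+    aws_conn_id='aws_default',
+    region_name='us-east-1',
+    launch_type='EC2',
+    dag=dag)
+</pre></div>
+</div>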
 </dd></dl>
 
 </div>
@@ -1592,29 +1807,42 @@ Override the region_name in connection (if provided)</li>
 <col class="field-name" />
 <col class="field-body" />
 <tbody valign="top">
-<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first simple">
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
 <li><strong>job_name</strong> (<em>str</em>) – the name for the job that will run on AWS Batch</li>
 <li><strong>job_definition</strong> (<em>str</em>) – the job definition name on AWS Batch</li>
 <li><strong>job_queue</strong> (<em>str</em>) – the queue name on AWS Batch</li>
-<li><strong>max_retries</strong> (<em>int</em>) – exponential backoff retries while waiter is not merged, 4200 = 48 hours</li>
+<li><strong>overrides</strong> (<em>dict</em>) – the same parameter that boto3 will receive on
+containerOverrides (templated).
+<a class="reference external" href="http://boto3.readthedocs.io/en/latest/reference/services/batch.html#submit_job">http://boto3.readthedocs.io/en/latest/reference/services/batch.html#submit_job</a></li>
+<li><strong>max_retries</strong> (<em>int</em>) – exponential backoff retries while waiter is not merged,
+4200 = 48 hours</li>
 <li><strong>aws_conn_id</strong> (<em>str</em>) – connection id of AWS credentials / region name. If None,
 the default boto3 credential strategy will be used
 (<a class="reference external" href="http://boto3.readthedocs.io/en/latest/guide/configuration.html">http://boto3.readthedocs.io/en/latest/guide/configuration.html</a>).</li>
-<li><strong>region_name</strong> – region name to use in AWS Hook.
+<li><strong>region_name</strong> (<em>str</em>) – region name to use in AWS Hook.
 Override the region_name in connection (if provided)</li>
 </ul>
 </td>
 </tr>
-<tr class="field-even field"><th class="field-name">Param:</th><td class="field-body"><p class="first">overrides: the same parameter that boto3 will receive on
-containerOverrides (templated):
-<a class="reference external" href="http://boto3.readthedocs.io/en/latest/reference/services/batch.html#submit_job">http://boto3.readthedocs.io/en/latest/reference/services/batch.html#submit_job</a></p>
-</td>
-</tr>
-<tr class="field-odd field"><th class="field-name">Type:</th><td class="field-body"><p class="first last">overrides: dict</p>
-</td>
-</tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.contrib.operators.awsbatch_operator.AWSBatchOperator.execute">
+<code class="descname">execute</code><span class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/awsbatch_operator.html#AWSBatchOperator.execute"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.awsbatch_operator.AWSBatchOperator.execute" title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
+<dl class="method">
+<dt id="airflow.contrib.operators.awsbatch_operator.AWSBatchOperator.on_kill">
+<code class="descname">on_kill</code><span class="sig-paren">(</span><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/awsbatch_operator.html#AWSBatchOperator.on_kill"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.awsbatch_operator.AWSBatchOperator.on_kill" title="Permalink to this definition">¶</a></dt>
+<dd><p>Override this method to clean up subprocesses when a task instance
+gets killed. Any use of the threading, subprocess or multiprocessing
+module within an operator needs to be cleaned up or it will leave
+ghost processes behind.</p>
+</dd></dl>
+
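+<p>A minimal DAG snippet (job, definition and queue names are hypothetical,
+the <cite>overrides</cite> payload follows the containerOverrides shape, and an
+existing <cite>dag</cite> object is assumed):</p>
+<div class="highlight-default notranslate"><div class="highlight"><pre><span></span>from airflow.contrib.operators.awsbatch_operator import AWSBatchOperator
+
+submit_job = AWSBatchOperator(
+    task_id='submit_batch_job',
+    job_name='my-job',
+    job_definition='my-job-definition',
+    job_queue='my-job-queue',
+    overrides={'command': ['echo', 'hello']},
+    aws_conn_id='aws_default',
+    region_name='us-east-1',
+    dag=dag)
+</pre></div>
+</div>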
 </dd></dl>
 
 </div>
@@ -1773,6 +2001,14 @@ override.</p>
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.operators.redshift_to_s3_operator.RedshiftToS3Transfer.execute">
+<code class="descname">execute</code><span class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/operators/redshift_to_s3_operator.html#RedshiftToS3Transfer.execute"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.operators.redshift_to_s3_operator.RedshiftToS3Transfer.execute" title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
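+<p>A minimal DAG snippet (schema, table and bucket names are hypothetical,
+and an existing <cite>dag</cite> object is assumed):</p>
+<div class="highlight-default notranslate"><div class="highlight"><pre><span></span>from airflow.operators.redshift_to_s3_operator import RedshiftToS3Transfer
+
+unload = RedshiftToS3Transfer(
+    task_id='redshift_to_s3',
+    schema='public',
+    table='my_table',
+    s3_bucket='my-bucket',
+    s3_key='exports/my_table',
+    redshift_conn_id='redshift_default',
+    aws_conn_id='aws_default',
+    dag=dag)
+</pre></div>
+</div>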
 </dd></dl>
 
 </div>
@@ -1800,6 +2036,14 @@ override.</p>
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.operators.s3_to_redshift_operator.S3ToRedshiftTransfer.execute">
+<code class="descname">execute</code><span class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/operators/s3_to_redshift_operator.html#S3ToRedshiftTransfer.execute"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.operators.s3_to_redshift_operator.S3ToRedshiftTransfer.execute" title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 </div>
@@ -1814,7 +2058,7 @@ submitting runs to the Databricks platform. Internally the operator talks to the
 <h3>DatabricksSubmitRunOperator<a class="headerlink" href="#databrickssubmitrunoperator" title="Permalink to this headline">¶</a></h3>
 <dl class="class">
 <dt id="airflow.contrib.operators.databricks_operator.DatabricksSubmitRunOperator">
-<em class="property">class </em><code class="descclassname">airflow.contrib.operators.databricks_operator.</code><code class="descname">DatabricksSubmitRunOperator</code><span class="sig-paren">(</span><em>json=None</em>, <em>spark_jar_task=None</em>, <em>notebook_task=None</em>, <em>new_cluster=None</em>, <em>existing_cluster_id=None</em>, <em>libraries=None</em>, <em>run_name=None</em>, <em>timeout_seconds=None</em>, <em>databricks_conn_id='databricks_default'</em>, <em>polling_period_seconds=30</em>, <em>databricks_retry_limit=3</em>, <em>do_xcom_push=False</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/databricks_operator.html#DatabricksSubmitRunOperator"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.databricks_operator.DatabricksSubmitRunOperator" title="Permalink to this definition">¶</a></dt>
+<em class="property">class </em><code class="descclassname">airflow.contrib.operators.databricks_operator.</code><code class="descname">DatabricksSubmitRunOperator</code><span class="sig-paren">(</span><em>json=None</em>, <em>spark_jar_task=None</em>, <em>notebook_task=None</em>, <em>new_cluster=None</em>, <em>existing_cluster_id=None</em>, <em>libraries=None</em>, <em>run_name=None</em>, <em>timeout_seconds=None</em>, <em>databricks_conn_id='databricks_default'</em>, <em>polling_period_seconds=30</em>, <em>databricks_retry_limit=3</em>, <em>databricks_retry_delay=1</em>, <em>do_xcom_push=False</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/databricks_operator.html#DatabricksSubmitRunOperator"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.databricks_operator.DatabricksSubmitRunOperator" title="Permalink to this definition">¶</a></dt>
 <dd><p>Bases: <a class="reference internal" href="code.html#airflow.models.BaseOperator" title="airflow.models.BaseOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">airflow.models.BaseOperator</span></code></a></p>
 <p>Submits a Spark job run to Databricks using the
 <a class="reference external" href="https://docs.databricks.com/api/latest/jobs.html#runs-submit">api/2.0/jobs/runs/submit</a>
@@ -1936,12 +2180,31 @@ connection.</li>
 this run. By default the operator will poll every 30 seconds.</li>
 <li><strong>databricks_retry_limit</strong> (<em>int</em>) – Amount of times to retry if the Databricks backend is
 unreachable. Its value must be greater than or equal to 1.</li>
+<li><strong>databricks_retry_delay</strong> (<em>float</em>) – Number of seconds to wait between retries (it
+might be a floating point number).</li>
 <li><strong>do_xcom_push</strong> (<em>boolean</em>) – Whether we should push run_id and run_page_url to xcom.</li>
 </ul>
 </td>
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.contrib.operators.databricks_operator.DatabricksSubmitRunOperator.execute">
+<code class="descname">execute</code><span class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/databricks_operator.html#DatabricksSubmitRunOperator.execute"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.databricks_operator.DatabricksSubmitRunOperator.execute" title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
+<dl class="method">
+<dt id="airflow.contrib.operators.databricks_operator.DatabricksSubmitRunOperator.on_kill">
+<code class="descname">on_kill</code><span class="sig-paren">(</span><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/databricks_operator.html#DatabricksSubmitRunOperator.on_kill"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.databricks_operator.DatabricksSubmitRunOperator.on_kill" title="Permalink to this definition">¶</a></dt>
+<dd><p>Override this method to clean up subprocesses when a task instance
+gets killed. Any use of the threading, subprocess or multiprocessing
+module within an operator needs to be cleaned up or it will leave
+ghost processes behind.</p>
+</dd></dl>
+
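+<p>A minimal DAG snippet running a notebook task on a new cluster, including
+the new <cite>databricks_retry_delay</cite> parameter (notebook path and cluster
+spec are hypothetical, and an existing <cite>dag</cite> object is assumed):</p>
+<div class="highlight-default notranslate"><div class="highlight"><pre><span></span>from airflow.contrib.operators.databricks_operator import DatabricksSubmitRunOperator
+
+notebook_run = DatabricksSubmitRunOperator(
+    task_id='notebook_run',
+    notebook_task={'notebook_path': '/Users/me/MyNotebook'},
+    new_cluster={'spark_version': '2.1.0-db3-scala2.11', 'num_workers': 2},
+    databricks_retry_limit=3,
+    databricks_retry_delay=10,
+    dag=dag)
+</pre></div>
+</div>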
 </dd></dl>
 
 </div>
@@ -1977,7 +2240,7 @@ See <a class="reference internal" href="howto/write-logs.html#write-logs-gcp"><s
 <span id="id23"></span><h5>BigQueryCheckOperator<a class="headerlink" href="#bigquerycheckoperator" title="Permalink to this headline">¶</a></h5>
 <dl class="class">
 <dt id="airflow.contrib.operators.bigquery_check_operator.BigQueryCheckOperator">
-<em class="property">class </em><code class="descclassname">airflow.contrib.operators.bigquery_check_operator.</code><code class="descname">BigQueryCheckOperator</code><span class="sig-paren">(</span><em>sql</em>, <em>bigquery_conn_id='bigquery_default'</em>, <em>*args</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/bigquery_check_operator.html#BigQueryCheckOperator"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.bigquery_check_operator.BigQueryCheckOperator" title="Permalink to this definition">¶</a></dt>
+<em class="property">class </em><code class="descclassname">airflow.contrib.operators.bigquery_check_operator.</code><code class="descname">BigQueryCheckOperator</code><span class="sig-paren">(</span><em>sql</em>, <em>bigquery_conn_id='bigquery_default'</em>, <em>use_legacy_sql=True</em>, <em>*args</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/bigquery_check_operator.html#BigQueryCheckOperator"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.bigquery_check_operator.BigQueryCheckOperator" title="Permalink to this definition">¶</a></dt>
 <dd><p>Bases: <a class="reference internal" href="code.html#airflow.operators.check_operator.CheckOperator" title="airflow.operators.check_operator.CheckOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">airflow.operators.check_operator.CheckOperator</span></code></a></p>
 <p>Performs checks against BigQuery. The <code class="docutils literal notranslate"><span class="pre">BigQueryCheckOperator</span></code> expects
 a sql query that will return a single row. Each value on that
@@ -2009,6 +2272,8 @@ without stopping the progress of the DAG.</p>
 <tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
 <li><strong>sql</strong> (<em>string</em>) – the sql to be executed</li>
 <li><strong>bigquery_conn_id</strong> (<em>string</em>) – reference to the BigQuery database</li>
+<li><strong>use_legacy_sql</strong> (<em>boolean</em>) – Whether to use legacy SQL (true)
+or standard SQL (false).</li>
 </ul>
 </td>
 </tr>
@@ -2021,14 +2286,19 @@ without stopping the progress of the DAG.</p>
 <span id="id24"></span><h5>BigQueryValueCheckOperator<a class="headerlink" href="#bigqueryvaluecheckoperator" title="Permalink to this headline">¶</a></h5>
 <dl class="class">
 <dt id="airflow.contrib.operators.bigquery_check_operator.BigQueryValueCheckOperator">
-<em class="property">class </em><code class="descclassname">airflow.contrib.operators.bigquery_check_operator.</code><code class="descname">BigQueryValueCheckOperator</code><span class="sig-paren">(</span><em>sql</em>, <em>pass_value</em>, <em>tolerance=None</em>, <em>bigquery_conn_id='bigquery_default'</em>, <em>*args</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/bigquery_check_operator.html#BigQueryValueCheckOperator"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.bigquery_check_operator.BigQueryValueCheckOperator" title="Permalink to this definition">¶</a></dt>
+<em class="property">class </em><code class="descclassname">airflow.contrib.operators.bigquery_check_operator.</code><code class="descname">BigQueryValueCheckOperator</code><span class="sig-paren">(</span><em>sql</em>, <em>pass_value</em>, <em>tolerance=None</em>, <em>bigquery_conn_id='bigquery_default'</em>, <em>use_legacy_sql=True</em>, <em>*args</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/bigquery_check_operator.html#BigQueryValueCheckOperator"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.bigquery_check_operator.BigQueryValueCheckOperator" title="Permalink to this definition">¶</a></dt>
 <dd><p>Bases: <a class="reference internal" href="code.html#airflow.operators.check_operator.ValueCheckOperator" title="airflow.operators.check_operator.ValueCheckOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">airflow.operators.check_operator.ValueCheckOperator</span></code></a></p>
 <p>Performs a simple value check using sql code.</p>
 <table class="docutils field-list" frame="void" rules="none">
 <col class="field-name" />
 <col class="field-body" />
 <tbody valign="top">
-<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><strong>sql</strong> (<em>string</em>) – the sql to be executed</td>
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
+<li><strong>sql</strong> (<em>string</em>) – the sql to be executed</li>
+<li><strong>use_legacy_sql</strong> (<em>boolean</em>) – Whether to use legacy SQL (true)
+or standard SQL (false).</li>
+</ul>
+</td>
 </tr>
 </tbody>
 </table>
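+<p>A minimal sketch checking a row count with standard SQL (the table name is
+hypothetical, and an existing <cite>dag</cite> object is assumed):</p>
+<div class="highlight-default notranslate"><div class="highlight"><pre><span></span>from airflow.contrib.operators.bigquery_check_operator import BigQueryValueCheckOperator
+
+check_count = BigQueryValueCheckOperator(
+    task_id='check_count',
+    sql='SELECT COUNT(*) FROM `my-project.my_dataset.my_table`',
+    pass_value=100,
+    use_legacy_sql=False,
+    dag=dag)
+</pre></div>
+</div>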
@@ -2039,7 +2309,7 @@ without stopping the progress of the DAG.</p>
 <span id="id25"></span><h5>BigQueryIntervalCheckOperator<a class="headerlink" href="#bigqueryintervalcheckoperator" title="Permalink to this headline">¶</a></h5>
 <dl class="class">
 <dt id="airflow.contrib.operators.bigquery_check_operator.BigQueryIntervalCheckOperator">
-<em class="property">class </em><code class="descclassname">airflow.contrib.operators.bigquery_check_operator.</code><code class="descname">BigQueryIntervalCheckOperator</code><span class="sig-paren">(</span><em>table</em>, <em>metrics_thresholds</em>, <em>date_filter_column='ds'</em>, <em>days_back=-7</em>, <em>bigquery_conn_id='bigquery_default'</em>, <em>*args</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/bigquery_check_operator.html#BigQueryIntervalCheckOperator"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.bigquery_check_operator.BigQueryIntervalCheckOperator" title="Permalink to this definition">¶</a></dt>
+<em class="property">class </em><code class="descclassname">airflow.contrib.operators.bigquery_check_operator.</code><code class="descname">BigQueryIntervalCheckOperator</code><span class="sig-paren">(</span><em>table</em>, <em>metrics_thresholds</em>, <em>date_filter_column='ds'</em>, <em>days_back=-7</em>, <em>bigquery_conn_id='bigquery_default'</em>, <em>use_legacy_sql=True</em>, <em>*args</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/bigquery_check_operator.html#BigQueryIntervalCheckOperator"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.bigquery_check_operator.BigQueryIntervalCheckOperator" title="Permalink to this definition">¶</a></dt>
 <dd><p>Bases: <a class="reference internal" href="code.html#airflow.operators.check_operator.IntervalCheckOperator" title="airflow.operators.check_operator.IntervalCheckOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">airflow.operators.check_operator.IntervalCheckOperator</span></code></a></p>
 <p>Checks that the values of metrics given as SQL expressions are within
 a certain tolerance of the ones from days_back before.</p>
@@ -2059,6 +2329,8 @@ against. Defaults to 7 days</li>
 <li><strong>metrics_threshold</strong> (<em>dict</em>) – a dictionary of ratios indexed by metrics, for
 example ‘COUNT(*)’: 1.5 would require a 50 percent or less difference
 between the current day, and the prior days_back.</li>
+<li><strong>use_legacy_sql</strong> (<em>boolean</em>) – Whether to use legacy SQL (true)
+or standard SQL (false).</li>
 </ul>
 </td>
 </tr>
@@ -2118,6 +2390,14 @@ delegation enabled.</li>
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.contrib.operators.bigquery_get_data.BigQueryGetDataOperator.execute">
+<code class="descname">execute</code><span class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/bigquery_get_data.html#BigQueryGetDataOperator.execute"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.bigquery_get_data.BigQueryGetDataOperator.execute" title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 </div>
@@ -2125,7 +2405,7 @@ delegation enabled.</li>
 <span id="id27"></span><h5>BigQueryCreateEmptyTableOperator<a class="headerlink" href="#bigquerycreateemptytableoperator" title="Permalink to this headline">¶</a></h5>
 <dl class="class">
 <dt id="airflow.contrib.operators.bigquery_operator.BigQueryCreateEmptyTableOperator">
-<em class="property">class </em><code class="descclassname">airflow.contrib.operators.bigquery_operator.</code><code class="descname">BigQueryCreateEmptyTableOperator</code><span class="sig-paren">(</span><em>dataset_id</em>, <em>table_id</em>, <em>project_id=None</em>, <em>schema_fields=None</em>, <em>gcs_schema_object=None</em>, <em>time_partitioning={}</em>, <em>bigquery_conn_id='bigquery_default'</em>, <em>google_cloud_storage_conn_id='google_cloud_default'</em>, <em>delegate_to=None</em>, <em>*args</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/bigquery_operator.html#BigQueryCreateEmptyTableOperator"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.bigquery_operator.BigQueryCreateEmptyTableOperator" title="Permalink to this definition">¶</a></dt>
+<em class="property">class </em><code class="descclassname">airflow.contrib.operators.bigquery_operator.</code><code class="descname">BigQueryCreateEmptyTableOperator</code><span class="sig-paren">(</span><em>dataset_id</em>, <em>table_id</em>, <em>project_id=None</em>, <em>schema_fields=None</em>, <em>gcs_schema_object=None</em>, <em>time_partitioning={}</em>, <em>bigquery_conn_id='bigquery_default'</em>, <em>google_cloud_storage_conn_id='google_cloud_default'</em>, <em>delegate_to=None</em>, <em>labels=None</em>, <em>*args</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/bigquery_operator.html#BigQueryCreateEmptyTableOperator"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.bigquery_operator.BigQueryCreateEmptyTableOperator" title="Permalink to this definition">¶</a></dt>
 <dd><p>Bases: <a class="reference internal" href="code.html#airflow.models.BaseOperator" title="airflow.models.BaseOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">airflow.models.BaseOperator</span></code></a></p>
 <p>Creates a new, empty table in the specified BigQuery dataset,
 optionally with schema.</p>
@@ -2166,11 +2446,7 @@ cloud storage hook.</li>
 <li><strong>delegate_to</strong> (<em>string</em>) – The account to impersonate, if any. For this to
 work, the service account making the request must have domain-wide
 delegation enabled.</li>
-</ul>
-</td>
-</tr>
-</tbody>
-</table>
+<li><strong>labels</strong> (<em>dict</em>) – <p>a dictionary containing labels for the table, passed to BigQuery</p>
 <p><strong>Example (with schema JSON in GCS)</strong>:</p>
 <div class="highlight-default notranslate"><div class="highlight"><pre><span></span><span class="n">CreateTable</span> <span class="o">=</span> <span class="n">BigQueryCreateEmptyTableOperator</span><span class="p">(</span>
     <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;BigQueryCreateEmptyTableOperator_task&#39;</span><span class="p">,</span>
@@ -2211,6 +2487,20 @@ delegation enabled.</li>
 <span class="p">)</span>
 </pre></div>
 </div>
+</li>
+</ul>
+</td>
+</tr>
+</tbody>
+</table>
+<dl class="method">
+<dt id="airflow.contrib.operators.bigquery_operator.BigQueryCreateEmptyTableOperator.execute">
+<code class="descname">execute</code><span class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/bigquery_operator.html#BigQueryCreateEmptyTableOperator.execute"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.bigquery_operator.BigQueryCreateEmptyTableOperator.execute" title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 </div>
@@ -2218,7 +2508,7 @@ delegation enabled.</li>
 <span id="id28"></span><h5>BigQueryCreateExternalTableOperator<a class="headerlink" href="#bigquerycreateexternaltableoperator" title="Permalink to this headline">¶</a></h5>
 <dl class="class">
 <dt id="airflow.contrib.operators.bigquery_operator.BigQueryCreateExternalTableOperator">
-<em class="property">class </em><code class="descclassname">airflow.contrib.operators.bigquery_operator.</code><code class="descname">BigQueryCreateExternalTableOperator</code><span class="sig-paren">(</span><em>bucket</em>, <em>source_objects</em>, <em>destination_project_dataset_table</em>, <em>schema_fields=None</em>, <em>schema_object=None</em>, <em>source_format='CSV'</em>, <em>compression='NONE'</em>, <em>skip_leading_rows=0</em>, <em>field_delimiter='</em>, <em>'</em>, <em>max_bad_records=0</em>, <em>quote_character=None</em>, <em>allow_quoted_newlines=False</em>, <em>allow_jagged_rows=False</em>, <em>bigquery_conn_id='bigquery_default'</em>, <em>google_cloud_storage_conn_id='google_cloud_default'</em>, <em>delegate_to=None</em>, <em>src_fmt_configs={}</em>, <em>*args</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/bigquery_operator.html#BigQueryCreateExternalTableOperator"><span class="viewcode-lin
 k">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.bigquery_operator.BigQueryCreateExternalTableOperator" title="Permalink to this definition">¶</a></dt>
+<em class="property">class </em><code class="descclassname">airflow.contrib.operators.bigquery_operator.</code><code class="descname">BigQueryCreateExternalTableOperator</code><span class="sig-paren">(</span><em>bucket</em>, <em>source_objects</em>, <em>destination_project_dataset_table</em>, <em>schema_fields=None</em>, <em>schema_object=None</em>, <em>source_format='CSV'</em>, <em>compression='NONE'</em>, <em>skip_leading_rows=0</em>, <em>field_delimiter='</em>, <em>'</em>, <em>max_bad_records=0</em>, <em>quote_character=None</em>, <em>allow_quoted_newlines=False</em>, <em>allow_jagged_rows=False</em>, <em>bigquery_conn_id='bigquery_default'</em>, <em>google_cloud_storage_conn_id='google_cloud_default'</em>, <em>delegate_to=None</em>, <em>src_fmt_configs={}</em>, <em>labels=None</em>, <em>*args</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/bigquery_operator.html#BigQueryCreateExternalTableOperator"><sp
 an class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.bigquery_operator.BigQueryCreateExternalTableOperator" title="Permalink to this definition">¶</a></dt>
 <dd><p>Bases: <a class="reference internal" href="code.html#airflow.models.BaseOperator" title="airflow.models.BaseOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">airflow.models.BaseOperator</span></code></a></p>
 <p>Creates a new external table in the dataset with the data in Google Cloud
 Storage.</p>
@@ -2279,17 +2569,57 @@ delegation enabled.</li>
 </tr>
 </tbody>
 </table>
+<p><strong>labels</strong> (<em>dict</em>) – a dictionary containing labels for the table, passed to BigQuery.</p>
+<dl class="method">
+<dt id="airflow.contrib.operators.bigquery_operator.BigQueryCreateExternalTableOperator.execute">
+<code class="descname">execute</code><span class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/bigquery_operator.html#BigQueryCreateExternalTableOperator.execute"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.bigquery_operator.BigQueryCreateExternalTableOperator.execute" title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 </div>
 <div class="section" id="bigquerydeletedatasetoperator">
 <span id="id29"></span><h5>BigQueryDeleteDatasetOperator<a class="headerlink" href="#bigquerydeletedatasetoperator" title="Permalink to this headline">¶</a></h5>
+<dl class="class">
+<dt id="airflow.contrib.operators.bigquery_operator.BigQueryDeleteDatasetOperator">
+<em class="property">class </em><code class="descclassname">airflow.contrib.operators.bigquery_operator.</code><code class="descname">BigQueryDeleteDatasetOperator</code><span class="sig-paren">(</span><em>dataset_id</em>, <em>project_id=None</em>, <em>bigquery_conn_id='bigquery_default'</em>, <em>delegate_to=None</em>, <em>*args</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/bigquery_operator.html#BigQueryDeleteDatasetOperator"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.bigquery_operator.BigQueryDeleteDatasetOperator" title="Permalink to this definition">¶</a></dt>
+<dd><p>Bases: <a class="reference internal" href="code.html#airflow.models.BaseOperator" title="airflow.models.BaseOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">airflow.models.BaseOperator</span></code></a></p>
+<p>This operator deletes an existing dataset from your project in BigQuery.
+<a class="reference external" href="https://cloud.google.com/bigquery/docs/reference/rest/v2/datasets/delete">https://cloud.google.com/bigquery/docs/reference/rest/v2/datasets/delete</a></p>
+<table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
+<li><strong>project_id</strong> (<em>string</em>) – The project id of the dataset.</li>
+<li><strong>dataset_id</strong> (<em>string</em>) – The dataset to be deleted.</li>
+</ul>
+</td>
+</tr>
+</tbody>
+</table>
+<p><strong>Example</strong>:</p>
+<div class="highlight-default notranslate"><div class="highlight"><pre><span></span><span class="n">delete_temp_data</span> <span class="o">=</span> <span class="n">BigQueryDeleteDatasetOperator</span><span class="p">(</span>
+                                <span class="n">dataset_id</span> <span class="o">=</span> <span class="s1">&#39;temp-dataset&#39;</span><span class="p">,</span>
+                                <span class="n">project_id</span> <span class="o">=</span> <span class="s1">&#39;temp-project&#39;</span><span class="p">,</span>
+                                <span class="n">bigquery_conn_id</span><span class="o">=</span><span class="s1">&#39;_my_gcp_conn_&#39;</span><span class="p">,</span>
+                                <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;Deletetemp&#39;</span><span class="p">,</span>
+                                <span class="n">dag</span><span class="o">=</span><span class="n">dag</span><span class="p">)</span>
+</pre></div>
+</div>
+<dl class="method">
+<dt id="airflow.contrib.operators.bigquery_operator.BigQueryDeleteDatasetOperator.execute">
+<code class="descname">execute</code><span class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/bigquery_operator.html#BigQueryDeleteDatasetOperator.execute"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.bigquery_operator.BigQueryDeleteDatasetOperator.execute" title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
+</dd></dl>
+
 </div>
 <div class="section" id="bigqueryoperator">
 <span id="id30"></span><h5>BigQueryOperator<a class="headerlink" href="#bigqueryoperator" title="Permalink to this headline">¶</a></h5>
 <dl class="class">
 <dt id="airflow.contrib.operators.bigquery_operator.BigQueryOperator">
-<em class="property">class </em><code class="descclassname">airflow.contrib.operators.bigquery_operator.</code><code class="descname">BigQueryOperator</code><span class="sig-paren">(</span><em>bql=None</em>, <em>sql=None</em>, <em>destination_dataset_table=False</em>, <em>write_disposition='WRITE_EMPTY'</em>, <em>allow_large_results=False</em>, <em>flatten_results=False</em>, <em>bigquery_conn_id='bigquery_default'</em>, <em>delegate_to=None</em>, <em>udf_config=False</em>, <em>use_legacy_sql=True</em>, <em>maximum_billing_tier=None</em>, <em>maximum_bytes_billed=None</em>, <em>create_disposition='CREATE_IF_NEEDED'</em>, <em>schema_update_options=()</em>, <em>query_params=None</em>, <em>priority='INTERACTIVE'</em>, <em>time_partitioning={}</em>, <em>*args</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/bigquery_operator.html#BigQueryOperator"><span class="viewcode-link">[source]</span></a><a class="headerl
 ink" href="#airflow.contrib.operators.bigquery_operator.BigQueryOperator" title="Permalink to this definition">¶</a></dt>
+<em class="property">class </em><code class="descclassname">airflow.contrib.operators.bigquery_operator.</code><code class="descname">BigQueryOperator</code><span class="sig-paren">(</span><em>bql=None</em>, <em>sql=None</em>, <em>destination_dataset_table=False</em>, <em>write_disposition='WRITE_EMPTY'</em>, <em>allow_large_results=False</em>, <em>flatten_results=None</em>, <em>bigquery_conn_id='bigquery_default'</em>, <em>delegate_to=None</em>, <em>udf_config=False</em>, <em>use_legacy_sql=True</em>, <em>maximum_billing_tier=None</em>, <em>maximum_bytes_billed=None</em>, <em>create_disposition='CREATE_IF_NEEDED'</em>, <em>schema_update_options=()</em>, <em>query_params=None</em>, <em>labels=None</em>, <em>priority='INTERACTIVE'</em>, <em>time_partitioning={}</em>, <em>*args</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/bigquery_operator.html#BigQueryOperator"><span class="viewcode-link">[source]</span>
 </a><a class="headerlink" href="#airflow.contrib.operators.bigquery_operator.BigQueryOperator" title="Permalink to this definition">¶</a></dt>
 <dd><p>Bases: <a class="reference internal" href="code.html#airflow.models.BaseOperator" title="airflow.models.BaseOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">airflow.models.BaseOperator</span></code></a></p>
 <p>Executes BigQuery SQL queries in a specific BigQuery database</p>
 <table class="docutils field-list" frame="void" rules="none">
@@ -2334,18 +2664,35 @@ set to your project default.</li>
 table to be updated as a side effect of the load job.</li>
 <li><strong>query_params</strong> (<em>dict</em>) – a dictionary containing query parameter types and
 values, passed to BigQuery.</li>
+<li><strong>labels</strong> (<em>dict</em>) – a dictionary containing labels for the job/query,
+passed to BigQuery.</li>
 <li><strong>priority</strong> (<em>string</em>) – Specifies a priority for the query.
 Possible values include INTERACTIVE and BATCH.
 The default value is INTERACTIVE.</li>
 <li><strong>time_partitioning</strong> (<em>dict</em>) – configure optional time partitioning fields i.e.
-partition by field, type and
-expiration as per API specifications. Note that ‘field’ is not available in
-conjunction with dataset.table$partition.</li>
+partition by field, type and expiration as per API specifications.</li>
 </ul>
 </td>
 </tr>
 </tbody>
 </table>
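+<p>A minimal usage sketch (the query, connection ID, table names and labels
+below are illustrative, not defaults):</p>
+<div class="highlight-python notranslate"><div class="highlight"><pre>from airflow.contrib.operators.bigquery_operator import BigQueryOperator
+
+# Run a legacy-SQL query and write the result to a destination table.
+bq_query = BigQueryOperator(
+    task_id='bq_example_query',
+    sql='SELECT owner, COUNT(*) FROM [my-project:my_dataset.my_table] GROUP BY owner',
+    destination_dataset_table='my-project.my_dataset.my_results',
+    write_disposition='WRITE_TRUNCATE',
+    use_legacy_sql=True,
+    labels={'team': 'analytics'},
+    bigquery_conn_id='bigquery_default',
+    dag=dag)
+</pre></div>
+</div>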
+<dl class="method">
+<dt id="airflow.contrib.operators.bigquery_operator.BigQueryOperator.execute">
+<code class="descname">execute</code><span class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/bigquery_operator.html#BigQueryOperator.execute"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.bigquery_operator.BigQueryOperator.execute" title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to override when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
+<dl class="method">
+<dt id="airflow.contrib.operators.bigquery_operator.BigQueryOperator.on_kill">
+<code class="descname">on_kill</code><span class="sig-paren">(</span><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/bigquery_operator.html#BigQueryOperator.on_kill"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.bigquery_operator.BigQueryOperator.on_kill" title="Permalink to this definition">¶</a></dt>
+<dd><p>Override this method to clean up subprocesses when a task instance
+gets killed. Any use of the threading, subprocess or multiprocessing
+modules within an operator needs to be cleaned up, or it will leave
+ghost processes behind.</p>
+</dd></dl>
+
 </dd></dl>
 
 </div>
@@ -2375,6 +2722,14 @@ requested table does not exist.</li>
 </tr>
 </tbody>
 </table>
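+<p>A hypothetical sketch, assuming the <cite>deletion_dataset_table</cite> and
+<cite>ignore_if_missing</cite> arguments of this operator; all IDs are illustrative:</p>
+<div class="highlight-python notranslate"><div class="highlight"><pre>from airflow.contrib.operators.bigquery_table_delete_operator import (
+    BigQueryTableDeleteOperator,
+)
+
+# Drop the table; ignore_if_missing avoids failing when it is already gone.
+delete_table = BigQueryTableDeleteOperator(
+    task_id='delete_my_table',
+    deletion_dataset_table='my-project.my_dataset.my_table',
+    ignore_if_missing=True,
+    dag=dag)
+</pre></div>
+</div>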
+<dl class="method">
+<dt id="airflow.contrib.operators.bigquery_table_delete_operator.BigQueryTableDeleteOperator.execute">
+<code class="descname">execute</code><span class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/bigquery_table_delete_operator.html#BigQueryTableDeleteOperator.execute"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.bigquery_table_delete_operator.BigQueryTableDeleteOperator.execute" title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to override when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 </div>
@@ -2382,7 +2737,7 @@ requested table does not exist.</li>
 <span id="id32"></span><h5>BigQueryToBigQueryOperator<a class="headerlink" href="#bigquerytobigqueryoperator" title="Permalink to this headline">¶</a></h5>
 <dl class="class">
 <dt id="airflow.contrib.operators.bigquery_to_bigquery.BigQueryToBigQueryOperator">
-<em class="property">class </em><code class="descclassname">airflow.contrib.operators.bigquery_to_bigquery.</code><code class="descname">BigQueryToBigQueryOperator</code><span class="sig-paren">(</span><em>source_project_dataset_tables</em>, <em>destination_project_dataset_table</em>, <em>write_disposition='WRITE_EMPTY'</em>, <em>create_disposition='CREATE_IF_NEEDED'</em>, <em>bigquery_conn_id='bigquery_default'</em>, <em>delegate_to=None</em>, <em>*args</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/bigquery_to_bigquery.html#BigQueryToBigQueryOperator"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.bigquery_to_bigquery.BigQueryToBigQueryOperator" title="Permalink to this definition">¶</a></dt>
+<em class="property">class </em><code class="descclassname">airflow.contrib.operators.bigquery_to_bigquery.</code><code class="descname">BigQueryToBigQueryOperator</code><span class="sig-paren">(</span><em>source_project_dataset_tables</em>, <em>destination_project_dataset_table</em>, <em>write_disposition='WRITE_EMPTY'</em>, <em>create_disposition='CREATE_IF_NEEDED'</em>, <em>bigquery_conn_id='bigquery_default'</em>, <em>delegate_to=None</em>, <em>labels=None</em>, <em>*args</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/bigquery_to_bigquery.html#BigQueryToBigQueryOperator"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.bigquery_to_bigquery.BigQueryToBigQueryOperator" title="Permalink to this definition">¶</a></dt>
 <dd><p>Bases: <a class="reference internal" href="code.html#airflow.models.BaseOperator" title="airflow.models.BaseOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">airflow.models.BaseOperator</span></code></a></p>
 <p>Copies data from one BigQuery table to another.</p>
 <div class="admonition seealso">
@@ -2408,11 +2763,21 @@ table. Format is: (project:<a href="#id35"><span class="problematic" id="id36">|
 <li><strong>delegate_to</strong> (<em>string</em>) – The account to impersonate, if any.
 For this to work, the service account making the request must have domain-wide
 delegation enabled.</li>
+<li><strong>labels</strong> (<em>dict</em>) – a dictionary containing labels for the job/query,
+passed to BigQuery.</li>
 </ul>
 </td>
 </tr>
 </tbody>
 </table>
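+<p>A minimal sketch of an intra-BigQuery copy (project, dataset and table names
+are illustrative):</p>
+<div class="highlight-python notranslate"><div class="highlight"><pre>from airflow.contrib.operators.bigquery_to_bigquery import BigQueryToBigQueryOperator
+
+# Copy a table into a backup dataset, overwriting any previous copy.
+copy_table = BigQueryToBigQueryOperator(
+    task_id='copy_my_table',
+    source_project_dataset_tables='my-project.my_dataset.my_table',
+    destination_project_dataset_table='my-project.my_backup.my_table',
+    write_disposition='WRITE_TRUNCATE',
+    labels={'env': 'prod'},
+    dag=dag)
+</pre></div>
+</div>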
+<dl class="method">
+<dt id="airflow.contrib.operators.bigquery_to_bigquery.BigQueryToBigQueryOperator.execute">
+<code class="descname">execute</code><span class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/bigquery_to_bigquery.html#BigQueryToBigQueryOperator.execute"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.bigquery_to_bigquery.BigQueryToBigQueryOperator.execute" title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to override when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 </div>
@@ -2420,7 +2785,7 @@ delegation enabled.</li>
 <span id="id37"></span><h5>BigQueryToCloudStorageOperator<a class="headerlink" href="#bigquerytocloudstorageoperator" title="Permalink to this headline">¶</a></h5>
 <dl class="class">
 <dt id="airflow.contrib.operators.bigquery_to_gcs.BigQueryToCloudStorageOperator">
-<em class="property">class </em><code class="descclassname">airflow.contrib.operators.bigquery_to_gcs.</code><code class="descname">BigQueryToCloudStorageOperator</code><span class="sig-paren">(</span><em>source_project_dataset_table</em>, <em>destination_cloud_storage_uris</em>, <em>compression='NONE'</em>, <em>export_format='CSV'</em>, <em>field_delimiter='</em>, <em>'</em>, <em>print_header=True</em>, <em>bigquery_conn_id='bigquery_default'</em>, <em>delegate_to=None</em>, <em>*args</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/bigquery_to_gcs.html#BigQueryToCloudStorageOperator"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.bigquery_to_gcs.BigQueryToCloudStorageOperator" title="Permalink to this definition">¶</a></dt>
+<em class="property">class </em><code class="descclassname">airflow.contrib.operators.bigquery_to_gcs.</code><code class="descname">BigQueryToCloudStorageOperator</code><span class="sig-paren">(</span><em>source_project_dataset_table</em>, <em>destination_cloud_storage_uris</em>, <em>compression='NONE'</em>, <em>export_format='CSV'</em>, <em>field_delimiter='</em>, <em>'</em>, <em>print_header=True</em>, <em>bigquery_conn_id='bigquery_default'</em>, <em>delegate_to=None</em>, <em>labels=None</em>, <em>*args</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/bigquery_to_gcs.html#BigQueryToCloudStorageOperator"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.bigquery_to_gcs.BigQueryToCloudStorageOperator" title="Permalink to this definition">¶</a></dt>
 <dd><p>Bases: <a class="reference internal" href="code.html#airflow.models.BaseOperator" title="airflow.models.BaseOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">airflow.models.BaseOperator</span></code></a></p>
 <p>Transfers a BigQuery table to a Google Cloud Storage bucket.</p>
 <div class="admonition seealso">
@@ -2449,11 +2814,21 @@ https://cloud.google.com/bigquery/exporting-data-from-bigquery#exportingmultiple
 <li><strong>delegate_to</strong> (<em>string</em>) – The account to impersonate, if any.
 For this to work, the service account making the request must have domain-wide
 delegation enabled.</li>
+<li><strong>labels</strong> (<em>dict</em>) – a dictionary containing labels for the job/query,
+passed to BigQuery.</li>
 </ul>
 </td>
 </tr>
 </tbody>
 </table>
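+<p>A minimal sketch of an export to Google Cloud Storage (bucket and table names
+are illustrative; the wildcard lets BigQuery shard large exports):</p>
+<div class="highlight-python notranslate"><div class="highlight"><pre>from airflow.contrib.operators.bigquery_to_gcs import BigQueryToCloudStorageOperator
+
+# Export the table as gzipped CSV files with a header row.
+export_table = BigQueryToCloudStorageOperator(
+    task_id='export_my_table',
+    source_project_dataset_table='my-project.my_dataset.my_table',
+    destination_cloud_storage_uris=['gs://my-bucket/exports/my_table_*.csv.gz'],
+    compression='GZIP',
+    export_format='CSV',
+    print_header=True,
+    dag=dag)
+</pre></div>
+</div>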
+<dl class="method">
+<dt id="airflow.contrib.operators.bigquery_to_gcs.BigQueryToCloudStorageOperator.execute">
+<code class="descname">execute</code><span class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/bigquery_to_gcs.html#BigQueryToCloudStorageOperator.execute"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.bigquery_to_gcs.BigQueryToCloudStorageOperator.execute" title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to override when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 </div>
@@ -2537,60 +2912,889 @@ table.</li>
 
 </div>
 </div>
-<div class="section" id="cloud-dataflow">
-<h3>Cloud DataFlow<a class="headerlink" href="#cloud-dataflow" title="Permalink to this headline">¶</a></h3>
-<div class="section" id="dataflow-operators">
-<h4>DataFlow Operators<a class="headerlink" href="#dataflow-operators" title="Permalink to this headline">¶</a></h4>
+<div class="section" id="cloud-sql">
+<h3>Cloud SQL<a class="headerlink" href="#cloud-sql" title="Permalink to this headline">¶</a></h3>
+<div class="section" id="cloud-sql-operators">
+<h4>Cloud SQL Operators<a class="headerlink" href="#cloud-sql-operators" title="Permalink to this headline">¶</a></h4>
 <ul class="simple">
-<li><a class="reference internal" href="#dataflowjavaoperator"><span class="std std-ref">DataFlowJavaOperator</span></a> : launching Cloud Dataflow jobs written in Java.</li>
-<li><a class="reference internal" href="#dataflowtemplateoperator"><span class="std std-ref">DataflowTemplateOperator</span></a> : launching a templated Cloud DataFlow batch job.</li>
-<li><a class="reference internal" href="#dataflowpythonoperator"><span class="std std-ref">DataFlowPythonOperator</span></a> : launching Cloud Dataflow jobs written in python.</li>
-</ul>
-<div class="section" id="dataflowjavaoperator">
-<span id="id38"></span><h5>DataFlowJavaOperator<a class="headerlink" href="#dataflowjavaoperator" title="Permalink to this headline">¶</a></h5>
+<li><span class="xref std std-ref">CloudSqlInstanceDatabaseDeleteOperator</span> : deletes a database from a Cloud SQL</li>
+</ul>
+<p>instance.
+- <span class="xref std std-ref">CloudSqlInstanceDatabaseCreateOperator</span> : creates a new database inside a Cloud
+SQL instance.
+- <span class="xref std std-ref">CloudSqlInstanceDatabasePatchOperator</span> : updates a database inside a Cloud
+SQL instance.
+- <span class="xref std std-ref">CloudSqlInstanceDeleteOperator</span> : delete a Cloud SQL instance.
+- <a class="reference internal" href="howto/operator.html#cloudsqlinstancecreateoperator"><span class="std std-ref">CloudSqlInstanceCreateOperator</span></a> : create a new Cloud SQL instance.
+- <a class="reference internal" href="howto/operator.html#cloudsqlinstancepatchoperator"><span class="std std-ref">CloudSqlInstancePatchOperator</span></a> : patch a Cloud SQL instance.</p>
+<div class="section" id="cloudsqlinstancedatabasedeleteoperator">
+<h5>CloudSqlInstanceDatabaseDeleteOperator<a class="headerlink" href="#cloudsqlinstancedatabasedeleteoperator" title="Permalink to this headline">¶</a></h5>
 <dl class="class">
-<dt id="airflow.contrib.operators.dataflow_operator.DataFlowJavaOperator">
-<em class="property">class </em><code class="descclassname">airflow.contrib.operators.dataflow_operator.</code><code class="descname">DataFlowJavaOperator</code><span class="sig-paren">(</span><em>jar</em>, <em>dataflow_default_options=None</em>, <em>options=None</em>, <em>gcp_conn_id='google_cloud_default'</em>, <em>delegate_to=None</em>, <em>poll_sleep=10</em>, <em>job_class=None</em>, <em>*args</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/dataflow_operator.html#DataFlowJavaOperator"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.dataflow_operator.DataFlowJavaOperator" title="Permalink to this definition">¶</a></dt>
-<dd><p>Bases: <a class="reference internal" href="code.html#airflow.models.BaseOperator" title="airflow.models.BaseOperator"><code class="xref py py-class docutils literal notranslate"><span class="pre">airflow.models.BaseOperator</span></code></a></p>
-<p>Start a Java Cloud DataFlow batch job. The parameters of the operation
-will be passed to the job.</p>
-<p>It’s a good practice to define dataflow_* parameters in the default_args of the dag
-like the project, zone and staging location.</p>
-<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">default_args</span> <span class="o">=</span> <span class="p">{</span>
-    <span class="s1">&#39;dataflow_default_options&#39;</span><span class="p">:</span> <span class="p">{</span>
-        <span class="s1">&#39;project&#39;</span><span class="p">:</span> <span class="s1">&#39;my-gcp-project&#39;</span><span class="p">,</span>
-        <span class="s1">&#39;zone&#39;</span><span class="p">:</span> <span class="s1">&#39;europe-west1-d&#39;</span><span class="p">,</span>
-        <span class="s1">&#39;stagingLocation&#39;</span><span class="p">:</span> <span class="s1">&#39;gs://my-staging-bucket/staging/&#39;</span>
-    <span class="p">}</span>
-<span class="p">}</span>
-</pre></div>
-</div>
-<p>You need to pass the path to your dataflow as a file reference with the <code class="docutils literal notranslate"><span class="pre">jar</span></code>
-parameter, the jar needs to be a self executing jar (see documentation here:
-<a class="reference external" href="https://beam.apache.org/documentation/runners/dataflow/#self-executing-jar">https://beam.apache.org/documentation/runners/dataflow/#self-executing-jar</a>).
-Use <code class="docutils literal notranslate"><span class="pre">options</span></code> to pass on options to your job.</p>
-<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">t1</span> <span class="o">=</span> <span class="n">DataFlowOperation</span><span class="p">(</span>
-    <span class="n">task_id</span><span class="o">=</span><span class="s1">&#39;datapflow_example&#39;</span><span class="p">,</span>
-    <span class="n">jar</span><span class="o">=</span><span class="s1">&#39;{{var.value.gcp_dataflow_base}}pipeline/build/libs/pipeline-example-1.0.jar&#39;</span><span class="p">,</span>
-    <span class="n">options</span><span class="o">=</span><span class="p">{</span>
-        <span class="s1">&#39;autoscalingAlgorithm&#39;</span><span class="p">:</span> <span class="s1">&#39;BASIC&#39;</span><span class="p">,</span>
-        <span class="s1">&#39;maxNumWorkers&#39;</span><span class="p">:</span> <span class="s1">&#39;50&#39;</span><span class="p">,</span>
-        <span class="s1">&#39;start&#39;</span><span class="p">:</span> <span class="s1">&#39;{{ds}}&#39;</span><span class="p">,</span>
-        <span class="s1">&#39;partitionType&#39;</span><span class="p">:</span> <span class="s1">&#39;DAY&#39;</span><span class="p">,</span>
-        <span class="s1">&#39;labels&#39;</span><span class="p">:</span> <span class="p">{</span><span class="s1">&#39;foo&#39;</span> <span class="p">:</span> <span class="s1">&#39;bar&#39;</span><span class="p">}</span>
-    <span class="p">},</span>
-    <span class="n">gcp_conn_id</span><span class="o">=</span><span class="s1">&#39;gcp-airflow-service-account&#39;</span><span class="p">,</span>
-    <span class="n">dag</span><span class="o">=</span><span class="n">my</span><span class="o">-</span><span class="n">dag</span><span class="p">)</span>
-</pre></div>
-</div>
-<p>Both <code class="docutils literal notranslate"><span class="pre">jar</span></code> and <code class="docutils literal notranslate"><span class="pre">options</span></code> are templated so you can use variables in them.</p>
+<dt id="airflow.contrib.operators.gcp_sql_operator.CloudSqlInstanceDatabaseDeleteOperator">
+<em class="property">class </em><code class="descclassname">airflow.contrib.operators.gcp_sql_operator.</code><code class="descname">CloudSqlInstanceDatabaseDeleteOperator</code><span class="sig-paren">(</span><em>project_id</em>, <em>instance</em>, <em>database</em>, <em>gcp_conn_id='google_cloud_default'</em>, <em>api_version='v1beta4'</em>, <em>*args</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/gcp_sql_operator.html#CloudSqlInstanceDatabaseDeleteOperator"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.gcp_sql_operator.CloudSqlInstanceDatabaseDeleteOperator" title="Permalink to this definition">¶</a></dt>
+<dd><p>Bases: <code class="xref py py-class docutils literal notranslate"><span class="pre">airflow.contrib.operators.gcp_sql_operator.CloudSqlBaseOperator</span></code></p>
+<p>Deletes a database from a Cloud SQL instance.</p>
+<table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
+<li><strong>project_id</strong> (<em>str</em>) – Project ID of the project that contains the instance.</li>
+<li><strong>instance</strong> (<em>str</em>) – Database instance ID. This does not include the project ID.</li>
+<li><strong>database</strong> (<em>str</em>) – Name of the database to be deleted in the instance.</li>
+<li><strong>gcp_conn_id</strong> (<em>str</em>) – The connection ID used to connect to Google Cloud Platform.</li>
+<li><strong>api_version</strong> (<em>str</em>) – API version used (e.g. v1beta4).</li>
+</ul>
+</td>
+</tr>
+</tbody>
+</table>
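+<p>A minimal sketch (the project, instance and database IDs are illustrative):</p>
+<div class="highlight-python notranslate"><div class="highlight"><pre>from airflow.contrib.operators.gcp_sql_operator import (
+    CloudSqlInstanceDatabaseDeleteOperator,
+)
+
+# Remove a single database from an existing Cloud SQL instance.
+delete_db = CloudSqlInstanceDatabaseDeleteOperator(
+    task_id='delete_db',
+    project_id='my-project',
+    instance='my-instance',
+    database='my_database',
+    dag=dag)
+</pre></div>
+</div>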
+<dl class="method">
+<dt id="airflow.contrib.operators.gcp_sql_operator.CloudSqlInstanceDatabaseDeleteOperator.execute">
+<code class="descname">execute</code><span class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/gcp_sql_operator.html#CloudSqlInstanceDatabaseDeleteOperator.execute"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.gcp_sql_operator.CloudSqlInstanceDatabaseDeleteOperator.execute" title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to override when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
 </dd></dl>
 
-<div class="code python highlight-default notranslate"><div class="highlight"><pre><span></span><span class="n">default_args</span> <span class="o">=</span> <span class="p">{</span>
-    <span class="s1">&#39;owner&#39;</span><span class="p">:</span> <span class="s1">&#39;airflow&#39;</span><span class="p">,</span>
-    <span class="s1">&#39;depends_on_past&#39;</span><span class="p">:</span> <span class="kc">False</span><span class="p">,</span>
-    <span class="s1">&#39;start_date&#39;</span><span class="p">:</span>
-        <span class="p">(</span><span class="mi">2016</span><span class="p">,</span> <span class="mi">8</span><span class="p">,</span> <span class="mi">1</span><span class="p">),</span>
+</dd></dl>
+
+</div>
+<div class="section" id="cloudsqlinstancedatabasecreateoperator">
+<h5>CloudSqlInstanceDatabaseCreateOperator<a class="headerlink" href="#cloudsqlinstancedatabasecreateoperator" title="Permalink to this headline">¶</a></h5>
+<dl class="class">
+<dt id="airflow.contrib.operators.gcp_sql_operator.CloudSqlInstanceDatabaseCreateOperator">
+<em class="property">class </em><code class="descclassname">airflow.contrib.operators.gcp_sql_operator.</code><code class="descname">CloudSqlInstanceDatabaseCreateOperator</code><span class="sig-paren">(</span><em>project_id</em>, <em>instance</em>, <em>body</em>, <em>gcp_conn_id='google_cloud_default'</em>, <em>api_version='v1beta4'</em>, <em>validate_body=True</em>, <em>*args</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/gcp_sql_operator.html#CloudSqlInstanceDatabaseCreateOperator"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.gcp_sql_operator.CloudSqlInstanceDatabaseCreateOperator" title="Permalink to this definition">¶</a></dt>
+<dd><p>Bases: <code class="xref py py-class docutils literal notranslate"><span class="pre">airflow.contrib.operators.gcp_sql_operator.CloudSqlBaseOperator</span></code></p>
+<p>Creates a new database inside a Cloud SQL instance.</p>
+<table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
+<li><strong>project_id</strong> (<em>str</em>) – Project ID of the project that contains the instance.</li>
+<li><strong>instance</strong> (<em>str</em>) – Database instance ID. This does not include the project ID.</li>
+<li><strong>body</strong> (<em>dict</em>) – The request body, as described in
+<a class="reference external" href="https://cloud.google.com/sql/docs/mysql/admin-api/v1beta4/databases/insert#request-body">https://cloud.google.com/sql/docs/mysql/admin-api/v1beta4/databases/insert#request-body</a></li>
+<li><strong>gcp_conn_id</strong> (<em>str</em>) – The connection ID used to connect to Google Cloud Platform.</li>
+<li><strong>api_version</strong> (<em>str</em>) – API version used (e.g. v1beta4).</li>
+<li><strong>validate_body</strong> (<em>bool</em>) – Whether the body should be validated. Defaults to True.</li>
+</ul>
+</td>
+</tr>
+</tbody>
+</table>
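+<p>A minimal sketch; the <cite>body</cite> fields follow the databases insert request body
+linked above, and all IDs and names are illustrative:</p>
+<div class="highlight-python notranslate"><div class="highlight"><pre>from airflow.contrib.operators.gcp_sql_operator import (
+    CloudSqlInstanceDatabaseCreateOperator,
+)
+
+# Create a database inside an existing Cloud SQL instance.
+create_db = CloudSqlInstanceDatabaseCreateOperator(
+    task_id='create_db',
+    project_id='my-project',
+    instance='my-instance',
+    body={
+        'project': 'my-project',
+        'instance': 'my-instance',
+        'name': 'my_database',
+    },
+    dag=dag)
+</pre></div>
+</div>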
+<dl class="method">
+<dt id="airflow.contrib.operators.gcp_sql_operator.CloudSqlInstanceDatabaseCreateOperator.execute">
+<code class="descname">execute</code><span class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/gcp_sql_operator.html#CloudSqlInstanceDatabaseCreateOperator.execute"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.gcp_sql_operator.CloudSqlInstanceDatabaseCreateOperator.execute" title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to override when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
+</dd></dl>
+
+</div>
+<div class="section" id="cloudsqlinstancedatabasepatchoperator">
+<h5>CloudSqlInstanceDatabasePatchOperator<a class="headerlink" href="#cloudsqlinstancedatabasepatchoperator" title="Permalink to this headline">¶</a></h5>
+<dl class="class">
+<dt id="airflow.contrib.operators.gcp_sql_operator.CloudSqlInstanceDatabasePatchOperator">
+<em class="property">class </em><code class="descclassname">airflow.contrib.operators.gcp_sql_operator.</code><code class="descname">CloudSqlInstanceDatabasePatchOperator</code><span class="sig-paren">(</span><em>project_id</em>, <em>instance</em>, <em>database</em>, <em>body</em>, <em>gcp_conn_id='google_cloud_default'</em>, <em>api_version='v1beta4'</em>, <em>validate_body=True</em>, <em>*args</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/gcp_sql_operator.html#CloudSqlInstanceDatabasePatchOperator"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.gcp_sql_operator.CloudSqlInstanceDatabasePatchOperator" title="Permalink to this definition">¶</a></dt>
+<dd><p>Bases: <code class="xref py py-class docutils literal notranslate"><span class="pre">airflow.contrib.operators.gcp_sql_operator.CloudSqlBaseOperator</span></code></p>
+<p>Updates a resource containing information about a database inside a Cloud SQL
+instance using patch semantics.
+See: <a class="reference external" href="https://cloud.google.com/sql/docs/mysql/admin-api/how-tos/performance#patch">https://cloud.google.com/sql/docs/mysql/admin-api/how-tos/performance#patch</a></p>
+<table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
+<li><strong>project_id</strong> (<em>str</em>) – Project ID of the project that contains the instance.</li>
+<li><strong>instance</strong> (<em>str</em>) – Database instance ID. This does not include the project ID.</li>
+<li><strong>database</strong> (<em>str</em>) – Name of the database to be updated in the instance.</li>
+<li><strong>body</strong> (<em>dict</em>) – The request body, as described in
+<a class="reference external" href="https://cloud.google.com/sql/docs/mysql/admin-api/v1beta4/databases/patch#request-body">https://cloud.google.com/sql/docs/mysql/admin-api/v1beta4/databases/patch#request-body</a></li>
+<li><strong>gcp_conn_id</strong> (<em>str</em>) – The connection ID used to connect to Google Cloud Platform.</li>
+<li><strong>api_version</strong> (<em>str</em>) – API version used (e.g. v1beta4).</li>
+<li><strong>validate_body</strong> (<em>bool</em>) – Whether the body should be validated. Defaults to True.</li>
+</ul>
+</td>
+</tr>
+</tbody>
+</table>
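+<p>A minimal sketch; with patch semantics the <cite>body</cite> carries only the fields to
+change, per the databases patch request body linked above (values are
+illustrative):</p>
+<div class="highlight-python notranslate"><div class="highlight"><pre>from airflow.contrib.operators.gcp_sql_operator import (
+    CloudSqlInstanceDatabasePatchOperator,
+)
+
+# Change only the charset of an existing database; other fields are untouched.
+patch_db = CloudSqlInstanceDatabasePatchOperator(
+    task_id='patch_db',
+    project_id='my-project',
+    instance='my-instance',
+    database='my_database',
+    body={'charset': 'utf8mb4'},
+    dag=dag)
+</pre></div>
+</div>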
+<dl class="method">
+<dt id="airflow.contrib.operators.gcp_sql_operator.CloudSqlInstanceDatabasePatchOperator.execute">
+<code class="descname">execute</code><span class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/gcp_sql_operator.html#CloudSqlInstanceDatabasePatchOperator.execute"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.gcp_sql_operator.CloudSqlInstanceDatabasePatchOperator.execute" title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to override when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
+</dd></dl>
+
+</div>
+<div class="section" id="cloudsqlinstancedeleteoperator">
+<h5>CloudSqlInstanceDeleteOperator<a class="headerlink" href="#cloudsqlinstancedeleteoperator" title="Permalink to this headline">¶</a></h5>
+<dl class="class">
+<dt id="airflow.contrib.operators.gcp_sql_operator.CloudSqlInstanceDeleteOperator">
+<em class="property">class </em><code class="descclassname">airflow.contrib.operators.gcp_sql_operator.</code><code class="descname">CloudSqlInstanceDeleteOperator</code><span class="sig-paren">(</span><em>project_id</em>, <em>instance</em>, <em>gcp_conn_id='google_cloud_default'</em>, <em>api_version='v1beta4'</em>, <em>*args</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/gcp_sql_operator.html#CloudSqlInstanceDeleteOperator"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.gcp_sql_operator.CloudSqlInstanceDeleteOperator" title="Permalink to this definition">¶</a></dt>
+<dd><p>Bases: <code class="xref py py-class docutils literal notranslate"><span class="pre">airflow.contrib.operators.gcp_sql_operator.CloudSqlBaseOperator</span></code></p>
+<p>Deletes a Cloud SQL instance.</p>
+<table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
+<li><strong>project_id</strong> (<em>str</em>) – Project ID of the project that contains the instance to be deleted.</li>
+<li><strong>instance</strong> (<em>str</em>) – Cloud SQL instance ID. This does not include the project ID.</li>
+<li><strong>gcp_conn_id</strong> (<em>str</em>) – The connection ID used to connect to Google Cloud Platform.</li>
+<li><strong>api_version</strong> (<em>str</em>) – API version used (e.g. v1beta4).</li>
+</ul>
+</td>
+</tr>
+</tbody>
+</table>
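+<p>A minimal sketch (project and instance IDs are illustrative):</p>
+<div class="highlight-python notranslate"><div class="highlight"><pre>from airflow.contrib.operators.gcp_sql_operator import CloudSqlInstanceDeleteOperator
+
+# Tear down the whole Cloud SQL instance, including all its databases.
+delete_instance = CloudSqlInstanceDeleteOperator(
+    task_id='delete_instance',
+    project_id='my-project',
+    instance='my-instance',
+    dag=dag)
+</pre></div>
+</div>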
+<dl class="method">
+<dt id="airflow.contrib.operators.gcp_sql_operator.CloudSqlInstanceDeleteOperator.execute">
+<code class="descname">execute</code><span class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/gcp_sql_operator.html#CloudSqlInstanceDeleteOperator.execute"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.gcp_sql_operator.CloudSqlInstanceDeleteOperator.execute" title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to override when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
+</dd></dl>
+
+</div>
+<div class="section" id="cloudsqlinstancecreateoperator">
+<h5>CloudSqlInstanceCreateOperator<a class="headerlink" href="#cloudsqlinstancecreateoperator" title="Permalink to this headline">¶</a></h5>
+<dl class="class">
+<dt id="airflow.contrib.operators.gcp_sql_operator.CloudSqlInstanceCreateOperator">
+<em class="property">class </em><code class="descclassname">airflow.contrib.operators.gcp_sql_operator.</code><code class="descname">CloudSqlInstanceCreateOperator</code><span class="sig-paren">(</span><em>project_id</em>, <em>body</em>, <em>instance</em>, <em>gcp_conn_id='google_cloud_default'</em>, <em>api_version='v1beta4'</em>, <em>validate_body=True</em>, <em>*args</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="_modules/airflow/contrib/operators/gcp_sql_operator.html#CloudSqlInstanceCreateOperator"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#airflow.contrib.operators.gcp_sql_operator.CloudSqlInstanc

<TRUNCATED>

