airflow-commits mailing list archives

From bo...@apache.org
Subject [1/2] incubator-airflow git commit: [AIRFLOW-489] Add API Framework
Date Sun, 27 Nov 2016 19:13:58 GMT
Repository: incubator-airflow
Updated Branches:
  refs/heads/master 7f0bf577b -> 6fb94630c


[AIRFLOW-489] Add API Framework

This implements a framework for API calls to Airflow. Currently,
all access goes through the CLI or the web UI. Especially in the
context of the CLI this raises security concerns, which can be
alleviated with a secured API call over the wire.

Secondly, integration with other systems is harder if you have
to invoke a CLI. For public-facing endpoints, JSON is used.

As an example, the trigger_dag functionality is now exposed as
an API call.

Backwards compatibility is retained by switching to a LocalClient.


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/d5ac6bd9
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/d5ac6bd9
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/d5ac6bd9

Branch: refs/heads/master
Commit: d5ac6bd9d0257683d8b2d0b200a513b2f43ce5dc
Parents: dedc54e
Author: Bolke de Bruin <bolke@xs4all.nl>
Authored: Mon Nov 21 19:29:35 2016 +0100
Committer: Bolke de Bruin <bolke@xs4all.nl>
Committed: Sun Nov 27 19:44:31 2016 +0100

----------------------------------------------------------------------
 .rat-excludes                                   |   5 +
 .travis.yml                                     |   5 +
 NOTICE                                          |   1 +
 airflow/api/__init__.py                         |  38 ++++++
 airflow/api/auth/__init__.py                    |  13 ++
 airflow/api/auth/backend/__init__.py            |  13 ++
 airflow/api/auth/backend/default.py             |  29 +++++
 airflow/api/auth/backend/kerberos_auth.py       | 128 +++++++++++++++++++
 airflow/api/client/__init__.py                  |  14 ++
 airflow/api/client/api_client.py                |  30 +++++
 airflow/api/client/json_client.py               |  39 ++++++
 airflow/api/client/local_client.py              |  22 ++++
 airflow/api/common/__init__.py                  |  13 ++
 airflow/api/common/experimental/__init__.py     |  13 ++
 airflow/api/common/experimental/trigger_dag.py  |  57 +++++++++
 airflow/bin/cli.py                              |  50 ++++----
 airflow/configuration.py                        |  17 +++
 airflow/settings.py                             |   1 +
 airflow/www/api/experimental/endpoints.py       |  60 ++++++++-
 airflow/www/app.py                              |  28 +++-
 docs/api.rst                                    |  43 +++++++
 docs/index.rst                                  |   1 +
 scripts/ci/kadm5.acl                            |  13 ++
 scripts/ci/kdc.conf                             |  24 ++++
 scripts/ci/krb5.conf                            |  31 +++++
 scripts/ci/minikdc.properties                   |   5 +-
 scripts/ci/requirements.txt                     |   1 +
 scripts/ci/run_tests.sh                         |   3 +
 scripts/ci/setup_kdc.sh                         |  66 +++++-----
 setup.py                                        |   7 +-
 tests/www/api/experimental/endpoints.py         |  83 ------------
 tests/www/api/experimental/test_endpoints.py    |  62 +++++++++
 .../api/experimental/test_kerberos_endpoints.py |  97 ++++++++++++++
 tox.ini                                         |  10 +-
 34 files changed, 862 insertions(+), 160 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/.rat-excludes
----------------------------------------------------------------------
diff --git a/.rat-excludes b/.rat-excludes
index 971b16d..1363766 100644
--- a/.rat-excludes
+++ b/.rat-excludes
@@ -19,3 +19,8 @@ metastore_db
 .*csv
 CHANGELOG.txt
 .*zip
+# Apache Rat does not detect BSD-2 clause properly
+# it is compatible according to http://www.apache.org/legal/resolved.html#category-a
+kerberos_auth.py
+airflow_api_auth_backend_kerberos_auth_py.html
+

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/.travis.yml
----------------------------------------------------------------------
diff --git a/.travis.yml b/.travis.yml
index 6e844d7..08145a6 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -28,6 +28,9 @@ addons:
       - mysql-client-core-5.6
       - mysql-client-5.6
       - krb5-user
+      - krb5-kdc
+      - krb5-admin-server
+      - python-selinux
   postgresql: "9.2"
 python:
   - "2.7"
@@ -35,6 +38,8 @@ python:
 env:
   global:
     - TRAVIS_CACHE=$HOME/.travis_cache/
+    - KRB5_CONFIG=/etc/krb5.conf
+    - KRB5_KTNAME=/etc/airflow.keytab
     # Travis on google cloud engine has a global /etc/boto.cfg that
     # does not work with python 3
     - BOTO_CONFIG=/tmp/bogusvalue

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/NOTICE
----------------------------------------------------------------------
diff --git a/NOTICE b/NOTICE
index 1f71240..79f43d9 100644
--- a/NOTICE
+++ b/NOTICE
@@ -15,3 +15,4 @@ This product includes Underscore.js (http://underscorejs.org - MIT license), Cop
 This product includes FooTable (http://fooplugins.com/plugins/footable-jquery/ - MIT license), Copyright 2013 Steven Usher & Brad Vincent.
 This product includes dagre (https://github.com/cpettitt/dagre - MIT license), Copyright (c) 2012-2014 Chris Pettitt.
 This product includes d3js (https://d3js.org/ - https://github.com/mbostock/d3/blob/master/LICENSE), Copyright (c) 2010-2016, Michael Bostock.
+This product includes flask-kerberos (https://github.com/mkomitee/flask-kerberos - BSD License), Copyright (c) 2013, Michael Komitee

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/airflow/api/__init__.py
----------------------------------------------------------------------
diff --git a/airflow/api/__init__.py b/airflow/api/__init__.py
new file mode 100644
index 0000000..ae47abf
--- /dev/null
+++ b/airflow/api/__init__.py
@@ -0,0 +1,38 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+from __future__ import print_function
+
+import logging
+
+from airflow.exceptions import AirflowException
+from airflow import configuration as conf
+from importlib import import_module
+
+api_auth = None
+
+
+def load_auth():
+    auth_backend = 'airflow.api.auth.backend.default'
+    try:
+        auth_backend = conf.get("api", "auth_backend")
+    except conf.AirflowConfigException:
+        pass
+
+    try:
+        global api_auth
+        api_auth = import_module(auth_backend)
+    except ImportError as err:
+        logging.critical("Cannot import {} for API authentication due to: {}"
+                         .format(auth_backend, err))
+        raise AirflowException(err)
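The loader above resolves the auth backend from a dotted module path in the configuration, defaulting to the built-in backend when the key is absent, and raising if the configured module cannot be imported. The import pattern can be sketched in isolation; this simplification falls back on import failure instead of raising, and the module names are stand-ins, not Airflow's:

```python
from importlib import import_module


def load_backend(dotted_path, default="json"):
    """Import the module at dotted_path, falling back to a default path."""
    try:
        return import_module(dotted_path)
    except ImportError:
        # The real load_auth() raises AirflowException here; we fall back
        # to keep the sketch self-contained
        return import_module(default)


# Resolves a dotted path the way auth_backend is resolved
backend = load_backend("os.path")
fallback = load_backend("no_such_backend_module")
```

Storing the result in a module-level global (as `api_auth` above) lets every caller share the resolved backend without re-reading the configuration.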

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/airflow/api/auth/__init__.py
----------------------------------------------------------------------
diff --git a/airflow/api/auth/__init__.py b/airflow/api/auth/__init__.py
new file mode 100644
index 0000000..9d7677a
--- /dev/null
+++ b/airflow/api/auth/__init__.py
@@ -0,0 +1,13 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/airflow/api/auth/backend/__init__.py
----------------------------------------------------------------------
diff --git a/airflow/api/auth/backend/__init__.py b/airflow/api/auth/backend/__init__.py
new file mode 100644
index 0000000..9d7677a
--- /dev/null
+++ b/airflow/api/auth/backend/__init__.py
@@ -0,0 +1,13 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/airflow/api/auth/backend/default.py
----------------------------------------------------------------------
diff --git a/airflow/api/auth/backend/default.py b/airflow/api/auth/backend/default.py
new file mode 100644
index 0000000..64cae86
--- /dev/null
+++ b/airflow/api/auth/backend/default.py
@@ -0,0 +1,29 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from functools import wraps
+
+client_auth = None
+
+
+def init_app(app):
+    pass
+
+
+def requires_authentication(function):
+    @wraps(function)
+    def decorated(*args, **kwargs):
+        return function(*args, **kwargs)
+
+    return decorated
\ No newline at end of file
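The default backend's decorator is a transparent pass-through: no authentication is enforced. `functools.wraps` preserves the wrapped view's name and docstring, which Flask relies on for endpoint naming. A minimal standalone sketch of the same pattern:

```python
from functools import wraps


def requires_authentication(function):
    # No checks are performed: the default backend leaves the API open
    @wraps(function)
    def decorated(*args, **kwargs):
        return function(*args, **kwargs)

    return decorated


@requires_authentication
def greet(name):
    """Return a greeting."""
    return "Hello, {}".format(name)
```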

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/airflow/api/auth/backend/kerberos_auth.py
----------------------------------------------------------------------
diff --git a/airflow/api/auth/backend/kerberos_auth.py b/airflow/api/auth/backend/kerberos_auth.py
new file mode 100644
index 0000000..7fdef12
--- /dev/null
+++ b/airflow/api/auth/backend/kerberos_auth.py
@@ -0,0 +1,128 @@
+# Copyright (c) 2013, Michael Komitee
+# All rights reserved.
+#
+# Redistribution and use in source and binary forms, with or without modification,
+# are permitted provided that the following conditions are met:
+#
+# 1. Redistributions of source code must retain the above copyright notice,
+# this list of conditions and the following disclaimer.
+#
+# 2. Redistributions in binary form must reproduce the above copyright notice,
+# this list of conditions and the following disclaimer in the documentation
+# and/or other materials provided with the distribution.
+#
+# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
+# ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
+# WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
+# DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR
+# ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
+# (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
+# LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
+# ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
+# SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+from future.standard_library import install_aliases
+install_aliases()
+
+import kerberos
+import logging
+import os
+
+from airflow import configuration as conf
+
+from flask import Response
+from flask import _request_ctx_stack as stack
+from flask import make_response
+from flask import request
+from flask import g
+from functools import wraps
+
+from requests_kerberos import HTTPKerberosAuth
+from socket import getfqdn
+
+client_auth = HTTPKerberosAuth(service='airflow')
+
+_SERVICE_NAME = None
+
+
+def init_app(app):
+    global _SERVICE_NAME
+
+    hostname = app.config.get('SERVER_NAME')
+    if not hostname:
+        hostname = getfqdn()
+    logging.info("Kerberos: hostname {}".format(hostname))
+
+    service = 'airflow'
+
+    _SERVICE_NAME = "{}@{}".format(service, hostname)
+
+    if 'KRB5_KTNAME' not in os.environ:
+        os.environ['KRB5_KTNAME'] = conf.get('kerberos', 'keytab')
+
+    try:
+        logging.info("Kerberos init: {} {}".format(service, hostname))
+        principal = kerberos.getServerPrincipalDetails(service, hostname)
+    except kerberos.KrbError as err:
+        logging.warn("Kerberos: {}".format(err))
+    else:
+        logging.info("Kerberos API: server is {}".format(principal))
+
+
+def _unauthorized():
+    """
+    Indicate that authorization is required
+    :return:
+    """
+    return Response("Unauthorized", 401, {"WWW-Authenticate": "Negotiate"})
+
+
+def _forbidden():
+    return Response("Forbidden", 403)
+
+
+def _gssapi_authenticate(token):
+    state = None
+    ctx = stack.top
+    try:
+        rc, state = kerberos.authGSSServerInit(_SERVICE_NAME)
+        if rc != kerberos.AUTH_GSS_COMPLETE:
+            return None
+        rc = kerberos.authGSSServerStep(state, token)
+        if rc == kerberos.AUTH_GSS_COMPLETE:
+            ctx.kerberos_token = kerberos.authGSSServerResponse(state)
+            ctx.kerberos_user = kerberos.authGSSServerUserName(state)
+            return rc
+        elif rc == kerberos.AUTH_GSS_CONTINUE:
+            return kerberos.AUTH_GSS_CONTINUE
+        else:
+            return None
+    except kerberos.GSSError:
+        return None
+    finally:
+        if state:
+            kerberos.authGSSServerClean(state)
+
+
+def requires_authentication(function):
+    @wraps(function)
+    def decorated(*args, **kwargs):
+        header = request.headers.get("Authorization")
+        if header:
+            ctx = stack.top
+            token = ''.join(header.split()[1:])
+            rc = _gssapi_authenticate(token)
+            if rc == kerberos.AUTH_GSS_COMPLETE:
+                g.user = ctx.kerberos_user
+                response = function(*args, **kwargs)
+                response = make_response(response)
+                if ctx.kerberos_token is not None:
+                    response.headers['WWW-Authenticate'] = ' '.join(['negotiate',
+                                                                     ctx.kerberos_token])
+
+                return response
+            elif rc != kerberos.AUTH_GSS_CONTINUE:
+                return _forbidden()
+        return _unauthorized()
+    return decorated
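The decorator above pulls the SPNEGO token out of the `Authorization` header with `''.join(header.split()[1:])`. That parsing step can be exercised on its own; the helper name is illustrative and an explicit scheme check is added here for clarity (the patch itself only joins the tail):

```python
def extract_negotiate_token(header):
    """Return the token portion of a 'Negotiate <token>' header, or None."""
    if not header:
        return None
    parts = header.split()
    if len(parts) < 2 or parts[0].lower() != "negotiate":
        return None
    # Joining the tail tolerates tokens folded across whitespace
    return "".join(parts[1:])
```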

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/airflow/api/client/__init__.py
----------------------------------------------------------------------
diff --git a/airflow/api/client/__init__.py b/airflow/api/client/__init__.py
new file mode 100644
index 0000000..c82f579
--- /dev/null
+++ b/airflow/api/client/__init__.py
@@ -0,0 +1,14 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/airflow/api/client/api_client.py
----------------------------------------------------------------------
diff --git a/airflow/api/client/api_client.py b/airflow/api/client/api_client.py
new file mode 100644
index 0000000..bdb9a61
--- /dev/null
+++ b/airflow/api/client/api_client.py
@@ -0,0 +1,30 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+
+class Client:
+    def __init__(self, api_base_url, auth):
+        self._api_base_url = api_base_url
+        self._auth = auth
+
+    def trigger_dag(self, dag_id, run_id=None, conf=None):
+        """
+        Creates a dag run for the specified dag
+        :param dag_id:
+        :param run_id:
+        :param conf:
+        :return:
+        """
+        raise NotImplementedError()

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/airflow/api/client/json_client.py
----------------------------------------------------------------------
diff --git a/airflow/api/client/json_client.py b/airflow/api/client/json_client.py
new file mode 100644
index 0000000..4d8e87a
--- /dev/null
+++ b/airflow/api/client/json_client.py
@@ -0,0 +1,39 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from future.moves.urllib.parse import urljoin
+
+from airflow.api.client import api_client
+
+import requests
+
+
+class Client(api_client.Client):
+    def trigger_dag(self, dag_id, run_id=None, conf=None):
+        endpoint = '/api/experimental/dags/{}/dag_runs'.format(dag_id)
+        url = urljoin(self._api_base_url, endpoint)
+
+        resp = requests.post(url,
+                             auth=self._auth,
+                             json={
+                                 "run_id": run_id,
+                                 "conf": conf,
+                             })
+
+        if not resp.ok:
+            raise IOError()
+
+        data = resp.json()
+
+        return data['message']
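The JSON client builds its URL with `urljoin`. Because the endpoint path starts with a slash, it replaces any path component on the base URL rather than appending to it, which is worth keeping in mind when setting `endpoint_url`:

```python
from urllib.parse import urljoin  # future.moves maps here on Python 3

endpoint = "/api/experimental/dags/{}/dag_runs".format("example_dag")

# An absolute-path endpoint replaces the base URL's path entirely
url = urljoin("http://localhost:8080", endpoint)
```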

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/airflow/api/client/local_client.py
----------------------------------------------------------------------
diff --git a/airflow/api/client/local_client.py b/airflow/api/client/local_client.py
new file mode 100644
index 0000000..7c2435b
--- /dev/null
+++ b/airflow/api/client/local_client.py
@@ -0,0 +1,22 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from airflow.api.client import api_client
+from airflow.api.common.experimental import trigger_dag
+
+
+class Client(api_client.Client):
+    def trigger_dag(self, dag_id, run_id=None, conf=None):
+        dr = trigger_dag.trigger_dag(dag_id=dag_id, run_id=run_id, conf=conf)
+        return "Created {}".format(dr)

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/airflow/api/common/__init__.py
----------------------------------------------------------------------
diff --git a/airflow/api/common/__init__.py b/airflow/api/common/__init__.py
new file mode 100644
index 0000000..9d7677a
--- /dev/null
+++ b/airflow/api/common/__init__.py
@@ -0,0 +1,13 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/airflow/api/common/experimental/__init__.py
----------------------------------------------------------------------
diff --git a/airflow/api/common/experimental/__init__.py b/airflow/api/common/experimental/__init__.py
new file mode 100644
index 0000000..9d7677a
--- /dev/null
+++ b/airflow/api/common/experimental/__init__.py
@@ -0,0 +1,13 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/airflow/api/common/experimental/trigger_dag.py
----------------------------------------------------------------------
diff --git a/airflow/api/common/experimental/trigger_dag.py b/airflow/api/common/experimental/trigger_dag.py
new file mode 100644
index 0000000..36abc99
--- /dev/null
+++ b/airflow/api/common/experimental/trigger_dag.py
@@ -0,0 +1,57 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from datetime import datetime
+import json
+
+from airflow.exceptions import AirflowException
+from airflow.models import DagRun, DagBag
+from airflow.utils.state import State
+
+import logging
+
+
+def trigger_dag(dag_id, run_id=None, conf=None):
+    dagbag = DagBag()
+
+    if dag_id not in dagbag.dags:
+        raise AirflowException("Dag id {} not found".format(dag_id))
+
+    dag = dagbag.get_dag(dag_id)
+
+    execution_date = datetime.now()
+
+    if not run_id:
+        run_id = "manual__{0}".format(execution_date.isoformat())
+
+    dr = DagRun.find(dag_id=dag_id, run_id=run_id)
+    if dr:
+        raise AirflowException("Run id {} already exists for dag id {}".format(
+            run_id,
+            dag_id
+        ))
+
+    run_conf = None
+    if conf:
+        run_conf = json.loads(conf)
+
+    trigger = dag.create_dagrun(
+        run_id=run_id,
+        execution_date=execution_date,
+        state=State.RUNNING,
+        conf=run_conf,
+        external_trigger=True
+    )
+
+    return trigger
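When no `run_id` is supplied, one is derived from the execution date. The format can be checked directly; a fixed date keeps the example deterministic, whereas `trigger_dag` uses `datetime.now()`:

```python
from datetime import datetime

execution_date = datetime(2016, 11, 27, 19, 13, 58)
run_id = "manual__{0}".format(execution_date.isoformat())
```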

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/airflow/bin/cli.py
----------------------------------------------------------------------
diff --git a/airflow/bin/cli.py b/airflow/bin/cli.py
index a821aad..21e1d23 100755
--- a/airflow/bin/cli.py
+++ b/airflow/bin/cli.py
@@ -15,12 +15,15 @@
 
 from __future__ import print_function
 import logging
+
 import reprlib
+
 import os
 import subprocess
 import textwrap
 import warnings
 from datetime import datetime
+from importlib import import_module
 
 import argparse
 from builtins import input
@@ -39,6 +42,7 @@ import time
 import psutil
 
 import airflow
+from airflow import api
 from airflow import jobs, settings
 from airflow import configuration as conf
 from airflow.exceptions import AirflowException
@@ -57,6 +61,12 @@ from sqlalchemy.orm import exc
 
 DAGS_FOLDER = os.path.expanduser(conf.get('core', 'DAGS_FOLDER'))
 
+api.load_auth()
+
+api_module = import_module(conf.get('cli', 'api_client'))
+api_client = api_module.Client(api_base_url=conf.get('cli', 'endpoint_url'),
+                               auth=api.api_auth.client_auth)
+
 
 def sigint_handler(sig, frame):
     sys.exit(0)
@@ -164,32 +174,20 @@ def backfill(args, dag=None):
 
 
 def trigger_dag(args):
-    dag = get_dag(args)
-
-    if not dag:
-        logging.error("Cannot find dag {}".format(args.dag_id))
-        sys.exit(1)
-
-    execution_date = datetime.now()
-    run_id = args.run_id or "manual__{0}".format(execution_date.isoformat())
-
-    dr = DagRun.find(dag_id=args.dag_id, run_id=run_id)
-    if dr:
-        logging.error("This run_id {} already exists".format(run_id))
-        raise AirflowException()
-
-    run_conf = {}
-    if args.conf:
-        run_conf = json.loads(args.conf)
-
-    trigger = dag.create_dagrun(
-        run_id=run_id,
-        execution_date=execution_date,
-        state=State.RUNNING,
-        conf=run_conf,
-        external_trigger=True
-    )
-    logging.info("Created {}".format(trigger))
+    """
+    Creates a dag run for the specified dag
+    :param args:
+    :return:
+    """
+    try:
+        message = api_client.trigger_dag(dag_id=args.dag_id,
+                                         run_id=args.run_id,
+                                         conf=args.conf)
+    except IOError as err:
+        logging.error(err)
+        raise AirflowException(err)
+
+    logging.info(message)
 
 
 def pool(args):

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/airflow/configuration.py
----------------------------------------------------------------------
diff --git a/airflow/configuration.py b/airflow/configuration.py
index ad8b3e3..78236ee 100644
--- a/airflow/configuration.py
+++ b/airflow/configuration.py
@@ -173,6 +173,16 @@ security =
 # values at runtime)
 unit_test_mode = False
 
+[cli]
+# In what way should the cli access the API. The LocalClient will use the
+# database directly, while the json_client will use the api running on the
+# webserver
+api_client = airflow.api.client.local_client
+endpoint_url = http://localhost:8080
+
+[api]
+# How to authenticate users of the API
+auth_backend = airflow.api.auth.backend.default
 
 [operators]
 # The default owner assigned to each new operator, unless
@@ -423,6 +433,13 @@ dags_are_paused_at_creation = False
 fernet_key = {FERNET_KEY}
 non_pooled_task_slot_count = 128
 
+[cli]
+api_client = airflow.api.client.local_client
+endpoint_url = http://localhost:8080
+
+[api]
+auth_backend = airflow.api.auth.backend.default
+
 [operators]
 default_owner = airflow
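To route the CLI through the webserver instead of the local database, the `[cli]` section can point at the JSON client and the `[api]` section at a stronger backend. A hypothetical airflow.cfg fragment based on the defaults added above:

```ini
[cli]
# Use the HTTP/JSON client instead of direct database access
api_client = airflow.api.client.json_client
endpoint_url = http://localhost:8080

[api]
# Require Kerberos/SPNEGO authentication for API endpoints
auth_backend = airflow.api.auth.backend.kerberos_auth
```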
 

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/airflow/settings.py
----------------------------------------------------------------------
diff --git a/airflow/settings.py b/airflow/settings.py
index 35928ec..60efb4a 100644
--- a/airflow/settings.py
+++ b/airflow/settings.py
@@ -130,6 +130,7 @@ def configure_orm():
         engine_args['pool_size'] = conf.getint('core', 'SQL_ALCHEMY_POOL_SIZE')
         engine_args['pool_recycle'] = conf.getint('core',
                                                   'SQL_ALCHEMY_POOL_RECYCLE')
+        #engine_args['echo'] = True
 
     engine = create_engine(SQL_ALCHEMY_CONN, **engine_args)
     Session = scoped_session(

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/airflow/www/api/experimental/endpoints.py
----------------------------------------------------------------------
diff --git a/airflow/www/api/experimental/endpoints.py b/airflow/www/api/experimental/endpoints.py
index d70e67b..bccbed6 100644
--- a/airflow/www/api/experimental/endpoints.py
+++ b/airflow/www/api/experimental/endpoints.py
@@ -11,27 +11,75 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
-#
+import logging
+
+import airflow.api
 
-from airflow.www.views import dagbag
+from airflow.api.common.experimental import trigger_dag as trigger
+from airflow.exceptions import AirflowException
+from airflow.www.app import csrf
 
-from flask import Blueprint, jsonify
+from flask import (
+    g, Markup, Blueprint, redirect, jsonify, abort, request, current_app, send_file
+)
+
+requires_authentication = airflow.api.api_auth.requires_authentication
 
 api_experimental = Blueprint('api_experimental', __name__)
 
+@csrf.exempt
+@api_experimental.route('/dags/<string:dag_id>/dag_runs', methods=['POST'])
+@requires_authentication
+def trigger_dag(dag_id):
+    """
+    Trigger a new dag run for a Dag
+    """
+    data = request.get_json(force=True)
+
+    run_id = None
+    if 'run_id' in data:
+        run_id = data['run_id']
+
+    conf = None
+    if 'conf' in data:
+        conf = data['conf']
+
+    try:
+        dr = trigger.trigger_dag(dag_id, run_id, conf)
+    except AirflowException as err:
+        logging.error(err)
+        response = jsonify(error="{}".format(err))
+        response.status_code = 404
+        return response
+
+    if getattr(g, 'user', None):
+        logging.info("User {} created {}".format(g.user, dr))
+
+    response = jsonify(message="Created {}".format(dr))
+    return response
+
+
+@api_experimental.route('/test', methods=['GET'])
+@requires_authentication
+def test():
+    return jsonify(status='OK')
+
 
 @api_experimental.route('/dags/<string:dag_id>/tasks/<string:task_id>', methods=['GET'])
+@requires_authentication
 def task_info(dag_id, task_id):
     """Returns a JSON with a task's public instance variables. """
+    from airflow.www.views import dagbag
+
     if dag_id not in dagbag.dags:
-        response = jsonify({'error': 'Dag {} not found'.format(dag_id)})
+        response = jsonify(error='Dag {} not found'.format(dag_id))
         response.status_code = 404
         return response
 
     dag = dagbag.dags[dag_id]
     if not dag.has_task(task_id):
-        response = (jsonify({'error': 'Task {} not found in dag {}'
-                    .format(task_id, dag_id)}))
+        response = (jsonify(error='Task {} not found in dag {}'
+                    .format(task_id, dag_id)))
         response.status_code = 404
         return response
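The trigger endpoint treats `run_id` and `conf` as optional JSON fields, defaulting each to `None` when absent. The membership checks in the diff are equivalent to `dict.get`; the helper name here is illustrative:

```python
def extract_trigger_args(data):
    """Optional JSON fields default to None, matching the endpoint's checks."""
    run_id = data.get("run_id")
    conf = data.get("conf")
    return run_id, conf
```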
 

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/airflow/www/app.py
----------------------------------------------------------------------
diff --git a/airflow/www/app.py b/airflow/www/app.py
index 43c6314..c2c180a 100644
--- a/airflow/www/app.py
+++ b/airflow/www/app.py
@@ -14,11 +14,13 @@
 #
 import logging
 import socket
+import six
 
 from flask import Flask
 from flask_admin import Admin, base
 from flask_cache import Cache
 from flask_wtf.csrf import CsrfProtect
+csrf = CsrfProtect()
 
 import airflow
 from airflow import models
@@ -29,25 +31,28 @@ from airflow import jobs
 from airflow import settings
 from airflow import configuration
 
-csrf = CsrfProtect()
-
 
-def create_app(config=None):
+def create_app(config=None, testing=False):
     app = Flask(__name__)
     app.secret_key = configuration.get('webserver', 'SECRET_KEY')
     app.config['LOGIN_DISABLED'] = not configuration.getboolean('webserver', 'AUTHENTICATE')
 
     csrf.init_app(app)
 
-    #app.config = config
+    app.config['TESTING'] = testing
+
     airflow.load_login()
     airflow.login.login_manager.init_app(app)
 
-    app.register_blueprint(routes)
+    from airflow import api
+    api.load_auth()
+    api.api_auth.init_app(app)
 
     cache = Cache(
         app=app, config={'CACHE_TYPE': 'filesystem', 'CACHE_DIR': '/tmp'})
 
+    app.register_blueprint(routes)
+
     log_format = airflow.settings.LOG_FORMAT_WITH_PID
     airflow.settings.configure_logging(log_format=log_format)
 
@@ -123,8 +128,17 @@ def create_app(config=None):
 
         integrate_plugins()
 
-        from airflow.www.api.experimental.endpoints import api_experimental
-        app.register_blueprint(api_experimental, url_prefix='/api/experimental')
+        import airflow.www.api.experimental.endpoints as e
+        # required for testing purposes otherwise the module retains
+        # a link to the default_auth
+        if app.config['TESTING']:
+            if six.PY2:
+                reload(e)
+            else:
+                import importlib
+                importlib.reload(e)
+
+        app.register_blueprint(e.api_experimental, url_prefix='/api/experimental')
 
         @app.context_processor
         def jinja_globals():
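The reload in `create_app` above exists because a name bound at import time (such as `requires_authentication` in the endpoints module) keeps pointing at whatever auth backend was loaded first. A minimal sketch of that behaviour, using a throwaway on-disk module (the module name and contents are made up for illustration):

```python
import importlib
import os
import sys
import tempfile

sys.dont_write_bytecode = True  # make reload re-read the source file

# Write a tiny module to disk whose top-level binding we then change.
tmpdir = tempfile.mkdtemp()
with open(os.path.join(tmpdir, 'demo_auth.py'), 'w') as f:
    f.write("backend = 'default'\n")
sys.path.insert(0, tmpdir)

import demo_auth
frozen = demo_auth.backend  # captured once, like a decorator reference

with open(os.path.join(tmpdir, 'demo_auth.py'), 'w') as f:
    f.write("backend = 'kerberos'\n")
importlib.reload(demo_auth)

# demo_auth.backend now reflects the new source, but `frozen` is stale.
```

This is why tests that swap the configured `auth_backend` must reload the endpoints module before registering the blueprint: references captured at the first import would otherwise keep the previous backend alive.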

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/docs/api.rst
----------------------------------------------------------------------
diff --git a/docs/api.rst b/docs/api.rst
new file mode 100644
index 0000000..eef671c
--- /dev/null
+++ b/docs/api.rst
@@ -0,0 +1,43 @@
+Experimental REST API
+=====================
+
+Airflow exposes an experimental REST API through the webserver. Endpoints are
+available at /api/experimental/. Note that the endpoint definitions are expected to change.
+
+Endpoints
+---------
+
+This is a placeholder until the Swagger definitions are active.
+
+* /api/experimental/dags/<DAG_ID>/tasks/<TASK_ID> returns info for a task (GET).
+* /api/experimental/dags/<DAG_ID>/dag_runs creates a dag_run for a given dag id (POST).
+
+CLI
+-----
+
+For some functions the CLI can use the API. To configure the CLI to use the API when
+available, set the following in the configuration:
+
+.. code-block:: ini
+
+    [cli]
+    api_client = airflow.api.client.json_client
+    endpoint_url = http://<WEBSERVER>:<PORT>
+
+
+Authentication
+--------------
+
+Only Kerberos authentication is currently supported for the API. To enable it, set the
+following in the configuration:
+
+.. code-block:: ini
+
+    [api]
+    auth_backend = airflow.api.auth.backend.kerberos_auth
+
+    [kerberos]
+    keytab = <KEYTAB>
+
+The Kerberos service is configured as `airflow/fully.qualified.domainname@REALM`. Make sure this
+principal exists in the keytab file.
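As a rough client-side sketch of the endpoints documented above (the host and port are assumptions; an HTTP library such as `requests` would carry the actual call):

```python
import json

# Host and port are assumptions; substitute your own webserver address.
base = 'http://localhost:8080/api/experimental'

# GET: public instance variables of a task.
task_info_url = '{}/dags/{}/tasks/{}'.format(
    base, 'example_bash_operator', 'runme_0')

# POST: create a dag_run for a given dag id.
trigger_url = '{}/dags/{}/dag_runs'.format(base, 'example_bash_operator')

# Body for the trigger POST; run_id and conf are both optional.
body = json.dumps({'run_id': 'manual_run_1'})
```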

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/docs/index.rst
----------------------------------------------------------------------
diff --git a/docs/index.rst b/docs/index.rst
index 9c5a14f..2a1f1c1 100644
--- a/docs/index.rst
+++ b/docs/index.rst
@@ -83,6 +83,7 @@ Content
     scheduler
     plugins
     security
+    api
     integration
     faq
     code

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/scripts/ci/kadm5.acl
----------------------------------------------------------------------
diff --git a/scripts/ci/kadm5.acl b/scripts/ci/kadm5.acl
new file mode 100644
index 0000000..414b52f
--- /dev/null
+++ b/scripts/ci/kadm5.acl
@@ -0,0 +1,13 @@
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+*/admin@TEST.LOCAL    *

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/scripts/ci/kdc.conf
----------------------------------------------------------------------
diff --git a/scripts/ci/kdc.conf b/scripts/ci/kdc.conf
new file mode 100644
index 0000000..30e1d47
--- /dev/null
+++ b/scripts/ci/kdc.conf
@@ -0,0 +1,24 @@
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+[kdcdefaults]
+kdc_ports = 88
+kdc_tcp_ports = 88
+
+[realms]
+TEST.LOCAL = {
+  #master_key_type = aes256-cts
+  acl_file = /var/kerberos/krb5kdc/kadm5.acl
+  dict_file = /usr/share/dict/words
+  admin_keytab = /var/kerberos/krb5kdc/kadm5.keytab
+  supported_enctypes = aes256-cts:normal aes128-cts:normal des3-hmac-sha1:normal arcfour-hmac:normal des-hmac-sha1:normal des-cbc-md5:normal des-cbc-crc:normal
+}

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/scripts/ci/krb5.conf
----------------------------------------------------------------------
diff --git a/scripts/ci/krb5.conf b/scripts/ci/krb5.conf
new file mode 100644
index 0000000..a971bf4
--- /dev/null
+++ b/scripts/ci/krb5.conf
@@ -0,0 +1,31 @@
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+[logging]
+default = FILE:/var/log/krb5libs.log
+kdc = FILE:/var/log/krb5kdc.log
+admin_server = FILE:/var/log/kadmind.log
+
+[libdefaults]
+default_realm = TEST.LOCAL
+dns_lookup_realm = false
+dns_lookup_kdc = false
+ticket_lifetime = 24h
+renew_lifetime = 7d
+forwardable = true
+
+[realms]
+ TEST.LOCAL = {
+   kdc = localhost:88
+ }

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/scripts/ci/minikdc.properties
----------------------------------------------------------------------
diff --git a/scripts/ci/minikdc.properties b/scripts/ci/minikdc.properties
index e025498..282f9fd 100644
--- a/scripts/ci/minikdc.properties
+++ b/scripts/ci/minikdc.properties
@@ -18,5 +18,6 @@ kdc.port=8888
 instance=DefaultKrbServer
 max.ticket.lifetime=86400000
 max.renewable.lifetime=604800000
-transport=UDP
-debug=false
\ No newline at end of file
+transport=TCP
+debug=true
+

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/scripts/ci/requirements.txt
----------------------------------------------------------------------
diff --git a/scripts/ci/requirements.txt b/scripts/ci/requirements.txt
index 8dac10c..7ea2b26 100644
--- a/scripts/ci/requirements.txt
+++ b/scripts/ci/requirements.txt
@@ -49,6 +49,7 @@ python-daemon
 python-dateutil
 redis
 requests
+requests-kerberos
 setproctitle
 slackclient
 sphinx

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/scripts/ci/run_tests.sh
----------------------------------------------------------------------
diff --git a/scripts/ci/run_tests.sh b/scripts/ci/run_tests.sh
index caa5c7e..0ae22ef 100755
--- a/scripts/ci/run_tests.sh
+++ b/scripts/ci/run_tests.sh
@@ -33,6 +33,9 @@ if [ "${TRAVIS}" ]; then
 
     ROOTDIR="$(dirname $(dirname $DIR))"
     export AIRFLOW__CORE__DAGS_FOLDER="$ROOTDIR/tests/dags"
+
+    # kdc init happens in setup_kdc.sh
+    kinit -kt ${KRB5_KTNAME} airflow
 fi
 
 echo Backend: $AIRFLOW__CORE__SQL_ALCHEMY_CONN

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/scripts/ci/setup_kdc.sh
----------------------------------------------------------------------
diff --git a/scripts/ci/setup_kdc.sh b/scripts/ci/setup_kdc.sh
index 6b0ba00..c4f5d8c 100755
--- a/scripts/ci/setup_kdc.sh
+++ b/scripts/ci/setup_kdc.sh
@@ -13,32 +13,40 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-MINIKDC_VERSION=2.7.1
-MINIKDC_HOME=/tmp/minikdc
-MINIKDC_CACHE=${CACHE}/minikdc
-# Setup MiniKDC environment
-mkdir -p ${MINIKDC_HOME}
-mkdir -p ${MINIKDC_CACHE}
-
-URL=http://search.maven.org/remotecontent?filepath=org/apache/ivy/ivy/2.3.0/ivy-2.3.0.jar
-echo "Downloading ivy"
-curl -z ${MINIKDC_CACHE}/ivy.jar -o ${MINIKDC_CACHE}/ivy.jar -L ${URL}
-
-if [ $? != 0 ]; then
-    echo "Failed to download ivy"
-    exit 1
-fi
-
-echo "Getting minikdc dependencies"
-java -jar ${MINIKDC_CACHE}/ivy.jar -dependency org.apache.hadoop hadoop-minikdc ${MINIKDC_VERSION} \
-           -cache ${MINIKDC_CACHE} \
-           -retrieve "${MINIKDC_CACHE}/lib/[artifact]-[revision](-[classifier]).[ext]"
-
-if [ $? != 0 ]; then
-    echo "Failed to download dependencies for minikdc"
-    exit 1
-fi
-
-cp -r ${MINIKDC_CACHE}/* ${MINIKDC_HOME}/
-
-mkdir -p ${MINIKDC_HOME}/work
+cat /etc/hosts
+
+FQDN=`hostname`
+
+echo "hostname: ${FQDN}"
+
+ADMIN="admin"
+PASS="airflow"
+
+DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+
+cp ${DIR}/kdc.conf /etc/krb5kdc/kdc.conf
+
+ln -sf /dev/urandom /dev/random
+
+cp ${DIR}/kadm5.acl /etc/krb5kdc/kadm5.acl
+
+cp ${DIR}/krb5.conf /etc/krb5.conf
+
+# create admin
+echo -e "${PASS}\n${PASS}" | kdb5_util create -s
+
+echo -e "${PASS}\n${PASS}" | kadmin.local -q "addprinc ${ADMIN}/admin"
+echo -e "${PASS}\n${PASS}" | kadmin.local -q "addprinc -randkey airflow"
+echo -e "${PASS}\n${PASS}" | kadmin.local -q "addprinc -randkey airflow/${FQDN}"
+kadmin.local -q "ktadd -k ${KRB5_KTNAME} airflow"
+kadmin.local -q "ktadd -k ${KRB5_KTNAME} airflow/${FQDN}"
+
+service krb5-kdc restart
+
+# make sure the keytab is readable to anyone
+chmod 664 ${KRB5_KTNAME}
+
+# don't do a kinit here as this happens under super user privileges
+# on travis
+# kinit -kt ${KRB5_KTNAME} airflow
+

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/setup.py
----------------------------------------------------------------------
diff --git a/setup.py b/setup.py
index cf76a3e..877b887 100644
--- a/setup.py
+++ b/setup.py
@@ -149,9 +149,11 @@ slack = ['slackclient>=1.0.0']
 statsd = ['statsd>=3.0.1, <4.0']
 vertica = ['vertica-python>=0.5.1']
 ldap = ['ldap3>=0.9.9.1']
-kerberos = ['pykerberos>=1.1.8',
+kerberos = ['pykerberos>=1.1.13',
+            'requests_kerberos>=0.10.0',
             'thrift_sasl>=0.2.0',
-            'snakebite[kerberos]>=2.7.8']
+            'snakebite[kerberos]>=2.7.8',
+            'kerberos>=1.2.5']
 password = [
     'bcrypt>=2.0.0',
     'flask-bcrypt>=0.7.1',
@@ -197,6 +199,7 @@ def do_setup():
             'flask-admin==1.4.1',
             'flask-cache>=0.13.1, <0.14',
             'flask-login==0.2.11',
+            'flask-swagger==0.2.13',
             'flask-wtf==0.12',
             'funcsigs>=1.0.2, <1.1',
             'future>=0.15.0, <0.16',

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/tests/www/api/experimental/endpoints.py
----------------------------------------------------------------------
diff --git a/tests/www/api/experimental/endpoints.py b/tests/www/api/experimental/endpoints.py
deleted file mode 100644
index 32da137..0000000
--- a/tests/www/api/experimental/endpoints.py
+++ /dev/null
@@ -1,83 +0,0 @@
-# -*- coding: utf-8 -*-
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-from __future__ import print_function
-
-import os
-import unittest
-from datetime import datetime, time, timedelta
-
-from airflow import configuration
-
-configuration.load_test_config()
-from airflow import models, settings
-from airflow.www import app as application
-from airflow.settings import Session
-
-NUM_EXAMPLE_DAGS = 16
-DEV_NULL = '/dev/null'
-TEST_DAG_FOLDER = os.path.join(
-    os.path.dirname(os.path.realpath(__file__)), 'dags')
-DEFAULT_DATE = datetime(2015, 1, 1)
-DEFAULT_DATE_ISO = DEFAULT_DATE.isoformat()
-DEFAULT_DATE_DS = DEFAULT_DATE_ISO[:10]
-TEST_DAG_ID = 'unit_tests'
-
-
-try:
-    import cPickle as pickle
-except ImportError:
-    # Python 3
-    import pickle
-
-
-def reset(dag_id=TEST_DAG_ID):
-    session = Session()
-    tis = session.query(models.TaskInstance).filter_by(dag_id=dag_id)
-    tis.delete()
-    session.commit()
-    session.close()
-
-
-class ApiExperimentalTests(unittest.TestCase):
-    def setUp(self):
-        reset()
-        configuration.load_test_config()
-        app = application.create_app()
-        app.config['TESTING'] = True
-        self.app = app.test_client()
-
-        self.dagbag = models.DagBag(
-            dag_folder=DEV_NULL, include_examples=True)
-        self.dag_bash = self.dagbag.dags['example_bash_operator']
-        self.runme_0 = self.dag_bash.get_task('runme_0')
-
-    def test_task_info(self):
-        url_template = '/api/experimental/dags/{}/tasks/{}'
-
-        response = self.app.get(url_template.format('example_bash_operator', 'runme_0'))
-        assert '"email"' in response.data.decode('utf-8')
-        assert 'error' not in response.data.decode('utf-8')
-        self.assertEqual(200, response.status_code)
-       
-        response = self.app.get(url_template.format('example_bash_operator', 'DNE'))
-        assert 'error' in response.data.decode('utf-8')
-        self.assertEqual(404, response.status_code)
-
-        response = self.app.get(url_template.format('DNE', 'DNE'))
-        assert 'error' in response.data.decode('utf-8')
-        self.assertEqual(404, response.status_code)
-
-    def tearDown(self):
-        self.dag_bash.clear(start_date=DEFAULT_DATE, end_date=datetime.now())

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/tests/www/api/experimental/test_endpoints.py
----------------------------------------------------------------------
diff --git a/tests/www/api/experimental/test_endpoints.py b/tests/www/api/experimental/test_endpoints.py
new file mode 100644
index 0000000..3046584
--- /dev/null
+++ b/tests/www/api/experimental/test_endpoints.py
@@ -0,0 +1,62 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+import unittest
+
+from datetime import datetime
+
+import json
+
+
+class ApiExperimentalTests(unittest.TestCase):
+    def setUp(self):
+        from airflow import configuration
+        configuration.load_test_config()
+        from airflow.www import app as application
+        app = application.create_app(testing=True)
+        self.app = app.test_client()
+
+    def test_task_info(self):
+        url_template = '/api/experimental/dags/{}/tasks/{}'
+
+        response = self.app.get(url_template.format('example_bash_operator', 'runme_0'))
+        assert '"email"' in response.data.decode('utf-8')
+        assert 'error' not in response.data.decode('utf-8')
+        self.assertEqual(200, response.status_code)
+       
+        response = self.app.get(url_template.format('example_bash_operator', 'DNE'))
+        assert 'error' in response.data.decode('utf-8')
+        self.assertEqual(404, response.status_code)
+
+        response = self.app.get(url_template.format('DNE', 'DNE'))
+        assert 'error' in response.data.decode('utf-8')
+        self.assertEqual(404, response.status_code)
+
+    def test_trigger_dag(self):
+        url_template = '/api/experimental/dags/{}/dag_runs'
+        response = self.app.post(
+            url_template.format('example_bash_operator'),
+            data=json.dumps(dict(run_id='my_run' + datetime.now().isoformat())),
+            content_type="application/json"
+        )
+
+        self.assertEqual(200, response.status_code)
+
+        response = self.app.post(
+            url_template.format('does_not_exist_dag'),
+            data=json.dumps(dict()),
+            content_type="application/json"
+        )
+        self.assertEqual(404, response.status_code)
+
+

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/tests/www/api/experimental/test_kerberos_endpoints.py
----------------------------------------------------------------------
diff --git a/tests/www/api/experimental/test_kerberos_endpoints.py b/tests/www/api/experimental/test_kerberos_endpoints.py
new file mode 100644
index 0000000..2fce019
--- /dev/null
+++ b/tests/www/api/experimental/test_kerberos_endpoints.py
@@ -0,0 +1,97 @@
+# -*- coding: utf-8 -*-
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import json
+import mock
+import os
+import socket
+import unittest
+
+from datetime import datetime
+
+from airflow import configuration
+from airflow.api.auth.backend.kerberos_auth import client_auth
+from airflow.www import app as application
+
+
+@unittest.skipIf('KRB5_KTNAME' not in os.environ,
+                 'Skipping Kerberos API tests due to missing KRB5_KTNAME')
+class ApiKerberosTests(unittest.TestCase):
+    def setUp(self):
+        configuration.load_test_config()
+        try:
+            configuration.conf.add_section("api")
+        except Exception:
+            pass
+        configuration.conf.set("api",
+                               "auth_backend",
+                               "airflow.api.auth.backend.kerberos_auth")
+        try:
+            configuration.conf.add_section("kerberos")
+        except Exception:
+            pass
+        configuration.conf.set("kerberos",
+                               "keytab",
+                               os.environ['KRB5_KTNAME'])
+
+        self.app = application.create_app(testing=True)
+
+    def test_trigger_dag(self):
+        with self.app.test_client() as c:
+            url_template = '/api/experimental/dags/{}/dag_runs'
+            response = c.post(
+                url_template.format('example_bash_operator'),
+                data=json.dumps(dict(run_id='my_run' + datetime.now().isoformat())),
+                content_type="application/json"
+            )
+            self.assertEqual(401, response.status_code)
+
+            response.url = 'http://{}'.format(socket.getfqdn())
+
+            class Request():
+                headers = {}
+
+            response.request = Request()
+            response.content = ''
+            response.raw = mock.MagicMock()
+            response.connection = mock.MagicMock()
+            response.connection.send = mock.MagicMock()
+
+            # disable mutual authentication for testing
+            client_auth.mutual_authentication = 3
+
+            # case can influence the results
+            client_auth.hostname_override = socket.getfqdn()
+
+            client_auth.handle_response(response)
+            self.assertIn('Authorization', response.request.headers)
+
+            response2 = c.post(
+                url_template.format('example_bash_operator'),
+                data=json.dumps(dict(run_id='my_run' + datetime.now().isoformat())),
+                content_type="application/json",
+                headers=response.request.headers
+            )
+            self.assertEqual(200, response2.status_code)
+
+    def test_unauthorized(self):
+        with self.app.test_client() as c:
+            url_template = '/api/experimental/dags/{}/dag_runs'
+            response = c.post(
+                url_template.format('example_bash_operator'),
+                data=json.dumps(dict(run_id='my_run' + datetime.now().isoformat())),
+                content_type="application/json"
+            )
+
+            self.assertEqual(401, response.status_code)

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/d5ac6bd9/tox.ini
----------------------------------------------------------------------
diff --git a/tox.ini b/tox.ini
index f447659..ae7f5f5 100644
--- a/tox.ini
+++ b/tox.ini
@@ -32,14 +32,12 @@ setenv =
   COVERALLS_REPO_TOKEN=ic8IH7CrUrtweVbmY3VZQ7ncEGe1XJA5E
   cdh: HADOOP_DISTRO=cdh
   cdh: HADOOP_HOME=/tmp/hadoop-cdh
-  cdh: KRB5_CONFIG=/tmp/minikdc/work
-  cdh: HADOOP_OPTS=-D/tmp/minikdc/work/krb5.conf
+  cdh: HADOOP_OPTS=-D/tmp/krb5.conf
   cdh: MINICLUSTER_HOME=/tmp/minicluster-1.1-SNAPSHOT
   cdh: HIVE_HOME=/tmp/hive
   hdp: HADOOP_DISTRO=hdp
   hdp: HADOOP_HOME=/tmp/hadoop-hdp
-  hdp: KRB5_CONFIG=/tmp/minikdc/work
-  hdp: HADOOP_OPTS=-D/tmp/minikdc/work/krb5.conf
+  hdp: HADOOP_OPTS=-D/tmp/krb5.conf
   hdp: MINICLUSTER_HOME=/tmp/minicluster-1.1-SNAPSHOT
   hdp: HIVE_HOME=/tmp/hive
   airflow_backend_mysql: AIRFLOW__CORE__SQL_ALCHEMY_CONN=mysql://root@localhost/airflow
@@ -58,10 +56,12 @@ passenv =
     TRAVIS_PULL_REQUEST
     PATH
     BOTO_CONFIG
+    KRB5_CONFIG
+    KRB5_KTNAME
 commands =
   pip wheel -w {homedir}/.wheelhouse -f {homedir}/.wheelhouse -r scripts/ci/requirements.txt
   pip install --find-links={homedir}/.wheelhouse --no-index -r scripts/ci/requirements.txt
-  #{toxinidir}/scripts/ci/setup_kdc.sh
+  sudo {toxinidir}/scripts/ci/setup_kdc.sh
   {toxinidir}/scripts/ci/setup_env.sh
   {toxinidir}/scripts/ci/ldap.sh
   {toxinidir}/scripts/ci/load_fixtures.sh

