airflow-commits mailing list archives

From GitBox <>
Subject [GitHub] kaxil closed pull request #3845: [AIRFLOW-3005] Replace 'Airbnb Airflow' with 'Apache Airflow'
Date Wed, 05 Sep 2018 00:04:44 GMT

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:


diff --git a/airflow/contrib/example_dags/ b/airflow/contrib/example_dags/
index d7218bbfc2..0f3aededd4 100644
--- a/airflow/contrib/example_dags/
+++ b/airflow/contrib/example_dags/
@@ -33,6 +33,6 @@ CREATE TABLE toTwitter_A(id BIGINT, id_str STRING
 When you review the code for the DAG, you will notice that these tasks are generated using
a for loop. The two for loops could be combined into one. However, in most cases you
will be running different analyses on your incoming and outgoing tweets, so
they are kept separate in this example.
 The final step is running the broker script, which will run queries in Hive and
store the summarized data in MySQL in our case. To connect to Hive, the pyhs2 library is extremely
useful and easy to use. To insert data into MySQL from Python, sqlalchemy is also a good
choice.
-I hope you find this tutorial useful. If you have question feel free to ask me on [Twitter](
or via the live Airflow chatroom room in [Gitter](
+I hope you find this tutorial useful. If you have question feel free to ask me on [Twitter](
or via the live Airflow chatroom room in [Gitter](
 -Ekhtiar Syed
 Last Update: 8-April-2016
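
The tutorial prose in this hunk describes two patterns: generating similar tasks in a for loop, and a broker step that queries one store and writes summaries to another. As a minimal stand-in sketch (plain Python, with the standard-library sqlite3 standing in for both Hive/pyhs2 and MySQL/sqlalchemy; all table and task names here are hypothetical):

```python
import sqlite3

# Hypothetical per-channel tables, echoing toTwitter_A in the hunk header;
# tasks for each channel are generated in a single loop, as described above.
channels = ["toTwitter_A", "fromTwitter_A"]
task_ids = ["summarize_" + c for c in channels]

# Broker step: query the "warehouse" and store a summary in the "serving" DB.
# sqlite3 stands in for Hive (via pyhs2) and MySQL (via sqlalchemy) here.
warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE toTwitter_A (id INTEGER, id_str TEXT)")
warehouse.executemany("INSERT INTO toTwitter_A VALUES (?, ?)",
                      [(1, "a"), (2, "b")])

serving = sqlite3.connect(":memory:")
serving.execute("CREATE TABLE tweet_summary (channel TEXT, n INTEGER)")
(n,) = warehouse.execute("SELECT COUNT(*) FROM toTwitter_A").fetchone()
serving.execute("INSERT INTO tweet_summary VALUES (?, ?)", ("toTwitter_A", n))
```

The shape is the point, not the backends: one loop builds the task list, and the broker reads from the analytics store and writes the summary to the serving store.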
diff --git a/airflow/ b/airflow/
index 02ad0a24c9..be38a58207 100755
--- a/airflow/
+++ b/airflow/
@@ -3644,8 +3644,7 @@ def subdags(self):
         Returns a list of the subdag objects associated to this DAG
-        # Check SubDag for class but don't check class directly, see
-        #
+        # Check SubDag for class but don't check class directly
         from airflow.operators.subdag_operator import SubDagOperator
         subdag_lst = []
         for task in self.tasks:
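
The hunk above keeps the comment about checking for SubDagOperator without checking the class directly. The underlying pattern is a scan of the DAG's tasks collecting each subdag; a hedged stand-in sketch (stub classes replace the real Airflow operators, which are not assumed available here):

```python
# Stubs standing in for Airflow's BaseOperator / SubDagOperator; the real
# subdags property iterates self.tasks in the same way.
class BaseOperator:
    pass

class SubDagOperator(BaseOperator):
    def __init__(self, subdag):
        self.subdag = subdag

tasks = [BaseOperator(), SubDagOperator(subdag="child_dag")]
# Mirrors the loop in the hunk: collect each SubDagOperator's subdag. The
# name-based fallback reflects the "don't check class directly" comment.
subdag_lst = [t.subdag for t in tasks
              if isinstance(t, SubDagOperator)
              or type(t).__name__ == "SubDagOperator"]
```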
diff --git a/airflow/operators/ b/airflow/operators/
index c5a69456fb..3382bc2788 100644
--- a/airflow/operators/
+++ b/airflow/operators/
@@ -115,8 +115,8 @@ def __init__(self,
                  text='No message has been set.\n'
                       'Here is a cat video instead\n'
-                 icon_url=''
-                          '/airbnb/airflow/master/airflow/www/static/pin_100.png',
+                 icon_url=''
+                          'incubator-airflow/master/airflow/www/static/pin_100.jpg',
                  *args, **kwargs):
         self.method = 'chat.postMessage'
diff --git a/docs/scheduler.rst b/docs/scheduler.rst
index 3e895898fc..4d40270aee 100644
--- a/docs/scheduler.rst
+++ b/docs/scheduler.rst
@@ -94,7 +94,7 @@ interval series.
     Code that goes along with the Airflow tutorial located at:
     from airflow import DAG
     from airflow.operators.bash_operator import BashOperator
diff --git a/tests/ b/tests/
index f9c07b96c9..dc3381e8e0 100644
--- a/tests/
+++ b/tests/
@@ -425,8 +425,6 @@ def test_backfill_ordered_concurrent_execute(self):
     def test_backfill_pooled_tasks(self):
         Test that queued tasks are executed by BackfillJob
-        Test for
         session = settings.Session()
         pool = Pool(pool='test_backfill_pooled_task_pool', slots=1)
diff --git a/tests/sensors/ b/tests/sensors/
index de95137244..5e55aa56e9 100644
--- a/tests/sensors/
+++ b/tests/sensors/
@@ -140,7 +140,7 @@ class FakeSession(object):
     def __init__(self):
         self.response = requests.Response()
         self.response.status_code = 200
-        self.response._content = 'airbnb/airflow'.encode('ascii', 'ignore')
+        self.response._content = 'apache/incubator-airflow'.encode('ascii', 'ignore')
     def send(self, request, **kwargs):
         return self.response
@@ -178,7 +178,7 @@ def test_get_response_check(self):
             data={"client": "ubuntu", "q": "airflow"},
-            response_check=lambda response: ("airbnb/airflow" in response.text),
+            response_check=lambda response: ("apache/incubator-airflow" in response.text),
             dag=self.dag), end_date=DEFAULT_DATE, ignore_ti_state=True)
@@ -192,7 +192,7 @@ def test_sensor(self):
             request_params={"client": "ubuntu", "q": "airflow", 'date': '{{ds}}'},
             response_check=lambda response: (
-                "airbnb/airflow/" + DEFAULT_DATE.strftime('%Y-%m-%d')
+                "apache/incubator-airflow/" + DEFAULT_DATE.strftime('%Y-%m-%d')
                 in response.text),
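
The test hunks above center on the sensor's response_check parameter: any callable that takes the HTTP response and returns a boolean. A stand-in sketch without requests or Airflow installed (FakeResponse here is hypothetical, mirroring the FakeSession stub in the diff):

```python
class FakeResponse:
    """Stands in for requests.Response; only .text is needed here."""
    text = "apache/incubator-airflow"

def response_check(response):
    # Same shape as the lambdas in the diff: inspect the body, return a bool.
    return "apache/incubator-airflow" in response.text

ok = response_check(FakeResponse())
```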


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:

With regards,
Apache Git Services
