From commits-return-13082-archive-asf-public=cust-asf.ponee.io@airflow.incubator.apache.org Wed Mar 7 10:28:15 2018
Mailing-List: contact commits-help@airflow.incubator.apache.org; run by ezmlm
Reply-To: dev@airflow.incubator.apache.org
Delivered-To: mailing list commits@airflow.incubator.apache.org
From: fokko@apache.org
To: commits@airflow.incubator.apache.org
Message-Id: <549fb14bedf74195bae341b1f91c148a@git.apache.org>
Subject: incubator-airflow git commit: [AIRFLOW-2186] Change the way logging is carried out in few ops
Date: Wed, 7 Mar 2018 09:28:09 +0000 (UTC)

Repository: incubator-airflow
Updated Branches:
  refs/heads/master 7cba83333 -> 0f9f4605f


[AIRFLOW-2186] Change the way logging is carried out in few ops

- Changed the way logging is implemented in
  `PostgresToGoogleCloudStorageOperator` and
  `HiveToDynamoDBTransferOperator`. Changed
  `logging.info` to `self.log.info`

Closes #3106 from kaxil/AIRFLOW-2186

Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/0f9f4605
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/0f9f4605
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/0f9f4605

Branch: refs/heads/master
Commit: 0f9f4605f6dffb1722447156b7dab6d875e4eac2
Parents: 7cba833
Author: Kaxil Naik
Authored: Wed Mar 7 10:28:03 2018 +0100
Committer: Fokko Driesprong
Committed: Wed Mar 7 10:28:03 2018 +0100

----------------------------------------------------------------------
 airflow/contrib/operators/hive_to_dynamodb.py       | 17 ++++++++++-------
 .../contrib/operators/postgres_to_gcs_operator.py   |  3 +--
 2 files changed, 11 insertions(+), 9 deletions(-)
----------------------------------------------------------------------
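The commit replaces module-level `logging.info(...)` calls with `self.log.info(...)`, which Airflow operators get from the `LoggingMixin` base class. A minimal sketch of the idea, using a simplified stand-in for Airflow's mixin (the implementation below is illustrative, not Airflow's actual code):

```python
import logging


class LoggingMixin:
    """Simplified stand-in for Airflow's LoggingMixin: exposes a
    logger named after the concrete class via ``self.log``."""

    @property
    def log(self):
        # Naming the logger after the subclass lets each log line be
        # traced back to the operator that emitted it, and lets
        # handlers and levels be configured per operator class --
        # unlike logging.info(), which goes to the anonymous root logger.
        return logging.getLogger(
            "{}.{}".format(self.__class__.__module__,
                           self.__class__.__name__))


class HiveToDynamoDBTransferOperator(LoggingMixin):
    def execute(self, context):
        # Preferred form after this commit:
        self.log.info('Extracting data from Hive')


op = HiveToDynamoDBTransferOperator()
print(op.log.name)
```

The usage at the end prints a logger name ending in the operator's class name, which is what makes per-operator log routing possible.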
http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/0f9f4605/airflow/contrib/operators/hive_to_dynamodb.py
----------------------------------------------------------------------
diff --git a/airflow/contrib/operators/hive_to_dynamodb.py b/airflow/contrib/operators/hive_to_dynamodb.py
index 55eca45..5c7bb8e 100644
--- a/airflow/contrib/operators/hive_to_dynamodb.py
+++ b/airflow/contrib/operators/hive_to_dynamodb.py
@@ -13,7 +13,6 @@
 # limitations under the License.
 
 import json
-import logging
 
 from airflow.contrib.hooks.aws_dynamodb_hook import AwsDynamoDBHook
 from airflow.hooks.hive_hooks import HiveServer2Hook
@@ -82,20 +81,24 @@ class HiveToDynamoDBTransferOperator(BaseOperator):
 
     def execute(self, context):
         hive = HiveServer2Hook(hiveserver2_conn_id=self.hiveserver2_conn_id)
-        logging.info('Extracting data from Hive')
-        logging.info(self.sql)
+        self.log.info('Extracting data from Hive')
+        self.log.info(self.sql)
 
         data = hive.get_pandas_df(self.sql, schema=self.schema)
         dynamodb = AwsDynamoDBHook(aws_conn_id=self.aws_conn_id,
-                                   table_name=self.table_name, table_keys=self.table_keys, region_name=self.region_name)
+                                   table_name=self.table_name,
+                                   table_keys=self.table_keys,
+                                   region_name=self.region_name)
 
-        logging.info('Inserting rows into dynamodb')
+        self.log.info('Inserting rows into dynamodb')
         if self.pre_process is None:
             dynamodb.write_batch_data(
                 json.loads(data.to_json(orient='records')))
         else:
             dynamodb.write_batch_data(
-                self.pre_process(data=data, args=self.pre_process_args, kwargs=self.pre_process_kwargs))
+                self.pre_process(data=data,
+                                 args=self.pre_process_args,
+                                 kwargs=self.pre_process_kwargs))
 
-        logging.info('Done.')
+        self.log.info('Done.')

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/0f9f4605/airflow/contrib/operators/postgres_to_gcs_operator.py
----------------------------------------------------------------------
diff --git a/airflow/contrib/operators/postgres_to_gcs_operator.py b/airflow/contrib/operators/postgres_to_gcs_operator.py
index 441ccf5..ab6fdf4 100644
--- a/airflow/contrib/operators/postgres_to_gcs_operator.py
+++ b/airflow/contrib/operators/postgres_to_gcs_operator.py
@@ -14,7 +14,6 @@
 
 import sys
 import json
-import logging
 import time
 import datetime
@@ -176,7 +175,7 @@ class PostgresToGoogleCloudStorageOperator(BaseOperator):
                 'mode': field_mode,
             })
 
-        logging.info('Using schema for %s: %s', self.schema_filename, schema)
+        self.log.info('Using schema for %s: %s', self.schema_filename, schema)
         tmp_schema_file_handle = NamedTemporaryFile(delete=True)
         s = json.dumps(schema, sort_keys=True)
         if PY3:
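For context on the `postgres_to_gcs_operator.py` hunk: around the changed log call, the operator serializes its BigQuery-style schema list with `json.dumps(..., sort_keys=True)` and writes it to a `NamedTemporaryFile`, encoding to bytes first on Python 3. A hedged sketch of that step in isolation (the sample `schema` values below are illustrative, not taken from the commit):

```python
import json
from tempfile import NamedTemporaryFile

# Illustrative BigQuery-style schema, shaped like the entries the
# operator appends ('name' / 'type' / 'mode' per field).
schema = [
    {'name': 'id', 'type': 'INTEGER', 'mode': 'NULLABLE'},
    {'name': 'name', 'type': 'STRING', 'mode': 'NULLABLE'},
]

tmp_schema_file_handle = NamedTemporaryFile(delete=True)
s = json.dumps(schema, sort_keys=True)
# NamedTemporaryFile opens in binary mode by default, so on Python 3
# the JSON string must be encoded to bytes before writing.
tmp_schema_file_handle.write(s.encode('utf-8'))
tmp_schema_file_handle.flush()

# Read back to confirm what was written.
tmp_schema_file_handle.seek(0)
print(tmp_schema_file_handle.read().decode('utf-8'))
```

With `delete=True` the file disappears when the handle is closed, which suits a scratch file that is uploaded to GCS before the operator finishes.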