Date: Thu, 7 Sep 2017 15:36:00 +0000 (UTC)
From: "Siddharth (JIRA)"
To: commits@airflow.incubator.apache.org
Reply-To: dev@airflow.incubator.apache.org
Subject: [jira] [Updated] (AIRFLOW-1575) Add AWS Kinesis Firehose hook for inserting batch records

    [ https://issues.apache.org/jira/browse/AIRFLOW-1575?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Siddharth updated AIRFLOW-1575:
-------------------------------
    Description: 
One of the key components of ETL is data ingestion into multiple systems. Airflow provides a great platform for multiple data sources to integrate with each other and transfer data (Hive to Druid, Hive to S3, etc.). In the AWS ecosystem, Kinesis Firehose is an important component that transfers data to other systems within AWS.
Data can be read directly in Airflow from any system (Druid, Hive, S3, MySQL, or CSV) and pushed to Firehose. This PR creates a Firehose hook for inserting batch records. Next, we can build an Airflow operator to transfer data from Hive to Firehose.


> Add AWS Kinesis Firehose hook for inserting batch records
> ----------------------------------------------------------
>
>                 Key: AIRFLOW-1575
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-1575
>             Project: Apache Airflow
>          Issue Type: New Feature
>            Reporter: Siddharth
>            Assignee: Siddharth
>
> One of the key components of ETL is data ingestion into multiple systems. Airflow provides a great platform for multiple data sources to integrate with each other and transfer data (Hive to Druid, Hive to S3, etc.). In the AWS ecosystem, Kinesis Firehose is an important component that transfers data to other systems within AWS. Data can be read directly in Airflow from any system (Druid, Hive, S3, MySQL, or CSV) and pushed to Firehose. This PR creates a Firehose hook for inserting batch records. Next, we can build an Airflow operator to transfer data from Hive to Firehose.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
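
[Editor's note] For illustration only, below is a minimal sketch of what a Firehose batch-insert hook could look like. It is not the implementation from the PR referenced above: the class name FirehoseHookSketch, its constructor parameters, and the example delivery stream name are assumptions made here for clarity; only boto3's put_record_batch call is taken from the public AWS SDK.

import boto3


class FirehoseHookSketch:
    """Illustrative hook that pushes a batch of records to an AWS Kinesis
    Firehose delivery stream. In a real Airflow hook, credentials and the
    stream name would typically be resolved from an Airflow connection."""

    def __init__(self, delivery_stream, region_name=None):
        self.delivery_stream = delivery_stream
        self.region_name = region_name

    def get_conn(self):
        # Relies on the default boto3 credential chain; an Airflow hook would
        # normally obtain credentials through its configured AWS connection.
        return boto3.client('firehose', region_name=self.region_name)

    def put_records(self, records):
        # Each record must be a dict of the form {'Data': b'...'}, matching
        # the shape expected by Firehose's PutRecordBatch API.
        response = self.get_conn().put_record_batch(
            DeliveryStreamName=self.delivery_stream,
            Records=list(records),
        )
        # A non-zero FailedPutCount in the response means some records were
        # rejected and may need to be retried by the caller.
        return response


# Example usage: serialize rows pulled from any source (Hive, MySQL, CSV, ...)
# and push them to the delivery stream in one batch call. The stream name is
# hypothetical.
if __name__ == '__main__':
    hook = FirehoseHookSketch(delivery_stream='my-delivery-stream')
    rows = [{'id': 1, 'value': 'a'}, {'id': 2, 'value': 'b'}]
    records = [{'Data': (str(row) + '\n').encode('utf-8')} for row in rows]
    hook.put_records(records)

An operator built on top of such a hook, as the description suggests, would read rows from a source like Hive and hand the serialized payloads to put_records.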