airflow-commits mailing list archives

From "Michael Crawford (JIRA)" <>
Subject [jira] [Commented] (AIRFLOW-115) Migrate and Refactor AWS integration to use boto3 and better structured hooks
Date Mon, 09 Oct 2017 13:24:02 GMT


Michael Crawford commented on AIRFLOW-115:

AIRFLOW-1114 is also proposing to redo these hooks.

One issue I have run into is that the connections that back the hooks are not consistent.
For EMR connections you can specify the key and secret in the login and password fields,
but for S3 you have to supply them as JSON in the extra field.
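To illustrate the inconsistency, here is a minimal sketch of a resolver that accepts credentials from either place. The `Connection` class and `resolve_aws_credentials` helper are hypothetical stand-ins, not Airflow's actual API; the `aws_access_key_id`/`aws_secret_access_key` extra-field keys mirror what the S3 hook expects.

```python
import json

# Hypothetical Connection-like object; names are illustrative, not Airflow's.
class Connection:
    def __init__(self, login=None, password=None, extra=None):
        self.login = login
        self.password = password
        self.extra = extra

def resolve_aws_credentials(conn):
    """Return (key, secret) from login/password, falling back to extra JSON."""
    if conn.login and conn.password:
        # EMR-style connection: credentials live in login/password
        return conn.login, conn.password
    # S3-style connection: credentials buried in the extra JSON blob
    extra = json.loads(conn.extra or "{}")
    return extra.get("aws_access_key_id"), extra.get("aws_secret_access_key")

# Both styles resolve to the same pair:
emr = Connection(login="AKIAEXAMPLE", password="shhh")
s3 = Connection(extra='{"aws_access_key_id": "AKIAEXAMPLE", '
                      '"aws_secret_access_key": "shhh"}')
```

Making key, secret, and region first-class connection parameters would let a single resolver like this go away entirely.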

I would suggest being able to specify key, secret, and region as proper params rather than
relying on the extra field.
This would make things clearer and allow the secret key to stay hidden.

We could just reuse the existing functionality for overriding field names, or something similar.

> Migrate and Refactor AWS integration to use boto3 and better structured hooks
> -----------------------------------------------------------------------------
>                 Key: AIRFLOW-115
>                 URL:
>             Project: Apache Airflow
>          Issue Type: Improvement
>          Components: aws, boto3, hooks
>            Reporter: Arthur Wiedmer
>            Assignee: Arthur Wiedmer
>            Priority: Minor
> h2. Current State
> The current AWS integration is mostly done through the S3Hook, which uses non-standard
credentials parsing and is built on boto rather than boto3, the currently supported AWS
SDK for Python.
> h2. Proposal
> An AWSHook should be provided that maps Airflow connections to the boto3 API. Operators
working with S3, as well as other AWS services, would then inherit from this hook but extend
the functionality with service-specific methods like get_key for S3, start_cluster for EMR,
enqueue for SQS, send_email for SES, etc.
> * AWSHook
> ** S3Hook
> ** EMRHook
> ** SQSHook
> ** SESHook
> ...
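The hierarchy quoted above could be sketched roughly as follows. This is a minimal illustration assuming boto3's client API; the class names, constructor parameters, and method names (get_client, get_key, start_cluster, enqueue) are placeholders, since the ticket does not define the actual AWSHook interface.

```python
class AWSHook:
    """Base hook: maps connection credentials to a boto3 client."""
    def __init__(self, aws_access_key_id, aws_secret_access_key, region_name):
        self.aws_access_key_id = aws_access_key_id
        self.aws_secret_access_key = aws_secret_access_key
        self.region_name = region_name

    def get_client(self, service_name):
        # Deferred import so the hierarchy is importable without boto3 installed.
        import boto3
        return boto3.client(
            service_name,
            aws_access_key_id=self.aws_access_key_id,
            aws_secret_access_key=self.aws_secret_access_key,
            region_name=self.region_name,
        )

class S3Hook(AWSHook):
    def get_key(self, bucket, key):
        return self.get_client("s3").get_object(Bucket=bucket, Key=key)

class EMRHook(AWSHook):
    def start_cluster(self, **job_flow_kwargs):
        return self.get_client("emr").run_job_flow(**job_flow_kwargs)

class SQSHook(AWSHook):
    def enqueue(self, queue_url, body):
        return self.get_client("sqs").send_message(QueueUrl=queue_url,
                                                   MessageBody=body)
```

Credential and region handling then lives in one place, and each service hook only adds its own methods.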

This message was sent by Atlassian JIRA
