hadoop-common-issues mailing list archives

From "Steve Loughran (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HADOOP-15267) S3A multipart upload fails when SSE-C encryption is enabled
Date Wed, 07 Mar 2018 19:22:00 GMT

    [ https://issues.apache.org/jira/browse/HADOOP-15267?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16390061#comment-16390061 ]

Steve Loughran commented on HADOOP-15267:
-----------------------------------------

Backported to 3.0.x; I haven't got time right now to look at & retest branch-2... the 3.0
commit should be the one to pick up if anyone wants to.

> S3A multipart upload fails when SSE-C encryption is enabled
> -----------------------------------------------------------
>
>                 Key: HADOOP-15267
>                 URL: https://issues.apache.org/jira/browse/HADOOP-15267
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: fs/s3
>    Affects Versions: 3.1.0
>         Environment: Hadoop 3.1 Snapshot
>            Reporter: Anis Elleuch
>            Assignee: Anis Elleuch
>            Priority: Critical
>             Fix For: 3.1.0, 3.0.2
>
>         Attachments: HADOOP-15267-001.patch, HADOOP-15267-002.patch, HADOOP-15267-003.patch
>
>
> When I enable SSE-C encryption in Hadoop 3.1 and set fs.s3a.multipart.size to 5 MB, storing data in AWS no longer works (the configuration assumed for reproduction is sketched after the exception below). For example, running the following code:
> {code}
> >>> df1 = spark.read.json('/home/user/people.json')
> >>> df1.write.mode("overwrite").json("s3a://testbucket/people.json")
> {code}
> shows the following exception:
> {code:java}
> com.amazonaws.services.s3.model.AmazonS3Exception: The multipart upload initiate requested
encryption. Subsequent part requests must include the appropriate encryption parameters.
> {code}
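>
> A minimal sketch of the configuration assumed for the reproduction above; the property names are the standard S3A options in Hadoop 3.1, and the key value is a placeholder:
> {code:java}
> import org.apache.hadoop.conf.Configuration;
>
> public class SseCRepro {
>   public static void main(String[] args) {
>     // Enable SSE-C and force a small multipart threshold, as described above.
>     Configuration conf = new Configuration();
>     conf.set("fs.s3a.server-side-encryption-algorithm", "SSE-C");
>     // Placeholder: supply your own base64-encoded 256-bit AES key.
>     conf.set("fs.s3a.server-side-encryption.key", "<base64-encoded-256-bit-key>");
>     conf.setLong("fs.s3a.multipart.size", 5 * 1024 * 1024); // 5 MB parts
>     // Any FileSystem created from this configuration hits the failure
>     // on writes large enough to trigger a multipart upload.
>   }
> }
> {code}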
> After some investigation, I discovered that hadoop-aws doesn't send the SSE-C headers in the Upload Part requests, as required by the AWS documentation: [https://docs.aws.amazon.com/AmazonS3/latest/API/mpUploadUploadPart.html] (a sketch of the required per-part call follows the quote below)
> {code:java}
> If you requested server-side encryption using a customer-provided encryption key in your
initiate multipart upload request, you must provide identical encryption information in each
part upload using the following headers.
> {code}
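>
> For illustration only (this is not the S3A code path, just the AWS SDK for Java calls that the linked page describes): the SSECustomerKey supplied when initiating the multipart upload must be repeated on every part upload. Bucket, object key and file names below are placeholders.
> {code:java}
> import com.amazonaws.services.s3.AmazonS3;
> import com.amazonaws.services.s3.AmazonS3ClientBuilder;
> import com.amazonaws.services.s3.model.InitiateMultipartUploadRequest;
> import com.amazonaws.services.s3.model.InitiateMultipartUploadResult;
> import com.amazonaws.services.s3.model.SSECustomerKey;
> import com.amazonaws.services.s3.model.UploadPartRequest;
>
> import java.io.File;
>
> public class SseCPartUpload {
>   public static void main(String[] args) {
>     AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
>     SSECustomerKey sseKey = new SSECustomerKey("<base64-encoded-256-bit-key>");
>
>     // Initiate the multipart upload with the customer-provided key.
>     InitiateMultipartUploadResult init = s3.initiateMultipartUpload(
>         new InitiateMultipartUploadRequest("testbucket", "people.json")
>             .withSSECustomerKey(sseKey));
>
>     File part = new File("/tmp/part-0001");
>     // Each part must repeat the same SSE-C key; omitting it triggers the
>     // "Subsequent part requests must include the appropriate encryption
>     // parameters" error quoted above.
>     s3.uploadPart(new UploadPartRequest()
>         .withBucketName("testbucket")
>         .withKey("people.json")
>         .withUploadId(init.getUploadId())
>         .withPartNumber(1)
>         .withFile(part)
>         .withPartSize(part.length())
>         .withSSECustomerKey(sseKey));
>     // ... remaining parts and CompleteMultipartUploadRequest omitted.
>   }
> }
> {code}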
>  
> A patch is attached to this issue to further clarify the problem.



