hadoop-common-issues mailing list archives

From "Genmao Yu (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HADOOP-13498) the number of multi-part upload part should not bigger than 10000
Date Wed, 17 Aug 2016 08:48:21 GMT

    [ https://issues.apache.org/jira/browse/HADOOP-13498?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15424134#comment-15424134 ]

Genmao Yu commented on HADOOP-13498:

As noted in my last comment, 1 GB is still too large for a practical test. So I chose to test only the logic that calculates the size of each multipart piece, i.e. to skip testing against the Aliyun OSS service itself.
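
For reference, a minimal sketch (in Java, not taken from the attached patches) of how a part size can be derived from the object size so that the part count stays within the 10000-part limit. The class and method names, the MAX_PARTS constant, and the example sizes are illustrative assumptions, not the actual patch code:

public final class MultipartSizeCalculator {
  // Maximum number of parts allowed per multipart upload (assumed limit from the issue title).
  private static final long MAX_PARTS = 10000;

  /**
   * Return a part size (in bytes) that keeps the part count at or below
   * MAX_PARTS for the given object, never going below the configured size.
   */
  public static long calculatePartSize(long objectSize, long configuredPartSize) {
    long partSize = configuredPartSize;
    if (objectSize / partSize >= MAX_PARTS) {
      // Round up so that objectSize / partSize stays within the limit.
      partSize = (objectSize / MAX_PARTS) + 1;
    }
    return partSize;
  }

  public static void main(String[] args) {
    // Example: a 100 GB object with a 5 MB configured part size would need
    // 20480 parts, so the part size is raised to keep the count within 10000.
    long hundredGB = 100L * 1024 * 1024 * 1024;
    long fiveMB = 5L * 1024 * 1024;
    System.out.println(calculatePartSize(hundredGB, fiveMB));
  }
}

Testing this calculation directly avoids uploading a multi-gigabyte object in the unit test while still covering the sizing logic.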

> the number of multi-part upload part should not bigger than 10000
> -----------------------------------------------------------------
>                 Key: HADOOP-13498
>                 URL: https://issues.apache.org/jira/browse/HADOOP-13498
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: fs
>    Affects Versions: HADOOP-12756
>            Reporter: Genmao Yu
>            Assignee: Genmao Yu
>             Fix For: HADOOP-12756
>         Attachments: HADOOP-13498-HADOOP-12756.001.patch, HADOOP-13498-HADOOP-12756.002.patch
> We should not only throw an exception when the 10000-part limit of a multipart upload is exceeded, but should also guarantee that an object of any size can be uploaded.

