hadoop-common-issues mailing list archives

From "Steve Loughran (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HADOOP-12420) While trying to access Amazon S3 through hadoop-aws(Spark basically) I was getting Exception in thread "main" java.lang.NoSuchMethodError: com.amazonaws.services.s3.transfer.TransferManagerConfiguration.setMultipartUploadThreshold(I)V
Date Thu, 22 Oct 2015 17:14:27 GMT

    [ https://issues.apache.org/jira/browse/HADOOP-12420?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14969483#comment-14969483 ]

Steve Loughran commented on HADOOP-12420:
-----------------------------------------

No, that uses JetS3t.

> While trying to access Amazon S3 through hadoop-aws(Spark basically) I was getting Exception in thread "main" java.lang.NoSuchMethodError: com.amazonaws.services.s3.transfer.TransferManagerConfiguration.setMultipartUploadThreshold(I)V
> ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: HADOOP-12420
>                 URL: https://issues.apache.org/jira/browse/HADOOP-12420
>             Project: Hadoop Common
>          Issue Type: Improvement
>          Components: fs/s3
>    Affects Versions: 2.7.1
>            Reporter: Tariq Mohammad
>            Assignee: Tariq Mohammad
>            Priority: Minor
>             Fix For: 2.8.0
>
>
> While trying to access data stored in Amazon S3 through Apache Spark, which internally uses the hadoop-aws jar, I was getting the following exception:
> Exception in thread "main" java.lang.NoSuchMethodError: com.amazonaws.services.s3.transfer.TransferManagerConfiguration.setMultipartUploadThreshold(I)V
> The probable reason is that the AWS Java SDK expects a long parameter for its setMultipartUploadThreshold(long multiPartThreshold) method, whereas hadoop-aws was passing an int (multiPartThreshold).
> I tried both the downloaded hadoop-aws jar and a build pulled in through its Maven dependency, and in both cases I encountered the same exception. Although I can see private long multiPartThreshold; in the hadoop-aws GitHub repo, that change is not reflected in the downloaded jar or in the jar resolved from the Maven dependency.
> The following lines in the S3AFileSystem class show the difference (a sketch of the long-based call follows the quoted description):
> Build from trunk:
> private long multiPartThreshold;
> this.multiPartThreshold = conf.getLong("fs.s3a.multipart.threshold", 2147483647L); => Line 267
> Build through the Maven dependency:
> private int multiPartThreshold;
> multiPartThreshold = conf.getInt(MIN_MULTIPART_THRESHOLD, DEFAULT_MIN_MULTIPART_THRESHOLD); => Line 249
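
For context, a minimal, self-contained sketch of the long-based call that trunk makes, assuming a newer AWS Java SDK on the classpath (where setMultipartUploadThreshold takes a long) together with hadoop-common; the class name MultipartThresholdSketch and the standalone wiring are illustrative, not Hadoop source:

    import com.amazonaws.services.s3.transfer.TransferManagerConfiguration;
    import org.apache.hadoop.conf.Configuration;

    public class MultipartThresholdSketch {
        public static void main(String[] args) {
            Configuration conf = new Configuration();
            // Trunk reads the threshold as a long (default 2147483647L == Integer.MAX_VALUE),
            // matching the newer SDK signature setMultipartUploadThreshold(long).
            long multiPartThreshold = conf.getLong("fs.s3a.multipart.threshold", 2147483647L);

            TransferManagerConfiguration transferConfiguration = new TransferManagerConfiguration();
            // Resolves to the (J)V descriptor. The Hadoop 2.7.1 jar was compiled against
            // the int variant, descriptor (I)V, which newer SDKs no longer declare,
            // hence the NoSuchMethodError at runtime.
            transferConfiguration.setMultipartUploadThreshold(multiPartThreshold);

            System.out.println("multipart threshold = " + multiPartThreshold);
        }
    }

Keeping the aws-java-sdk version on the classpath aligned with the one hadoop-aws 2.7.1 was built against (1.7.4), or moving to a release where the field is a long, avoids the descriptor mismatch.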



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
