hadoop-common-issues mailing list archives

From "Steve Loughran (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HADOOP-12420) While trying to access Amazon S3 through hadoop-aws(Spark basically) I was getting Exception in thread "main" java.lang.NoSuchMethodError: com.amazonaws.services.s3.transfer.TransferManagerConfiguration.setMultipartUploadThreshold(I)V
Date Wed, 16 Sep 2015 19:01:47 GMT

    [ https://issues.apache.org/jira/browse/HADOOP-12420?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14790949#comment-14790949 ]

Steve Loughran commented on HADOOP-12420:
-----------------------------------------

OK, this looks like an AWS library version change, the one handled in HADOOP-12269.

Summary: Hadoop 2.7.1 needs aws-java-sdk version 1.7.4; the newer AWS release, aws-java-sdk-s3 v1.10.6, has changed the method signature and doesn't work.

Right now there's not a lot we can do except wait for people to type the stack trace into their browser and end up here; for 2.7.2 we could think about backporting the library change.
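
To make the failure mode concrete, here is a minimal sketch (not Hadoop source; the class and variable names are invented for illustration) of why the call only breaks at runtime. hadoop-aws 2.7.1 was compiled against aws-java-sdk 1.7.4, where the method takes an int, so its bytecode references the (I)V descriptor; aws-java-sdk-s3 1.10.x only provides the long variant, (J)V, so the JVM throws NoSuchMethodError when it links the call.

import com.amazonaws.services.s3.transfer.TransferManagerConfiguration;

public class ThresholdBinding {
    public static void main(String[] args) {
        TransferManagerConfiguration transferConfiguration = new TransferManagerConfiguration();
        int threshold = 2147483647; // the int default used by S3AFileSystem in 2.7.1
        // Compiled against SDK 1.7.4, this call binds to setMultipartUploadThreshold(I)V.
        // Run the same class with aws-java-sdk-s3 1.10.6 on the classpath: only the
        // (J)V signature exists, so the JVM throws java.lang.NoSuchMethodError here.
        transferConfiguration.setMultipartUploadThreshold(threshold);
    }
}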

> While trying to access Amazon S3 through hadoop-aws(Spark basically) I was getting Exception in thread "main" java.lang.NoSuchMethodError: com.amazonaws.services.s3.transfer.TransferManagerConfiguration.setMultipartUploadThreshold(I)V
> ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: HADOOP-12420
>                 URL: https://issues.apache.org/jira/browse/HADOOP-12420
>             Project: Hadoop Common
>          Issue Type: Improvement
>          Components: fs/s3
>    Affects Versions: 2.7.1
>            Reporter: Tariq Mohammad
>            Assignee: Tariq Mohammad
>            Priority: Minor
>
> While trying to access data stored in Amazon S3 through Apache Spark, which internally uses the hadoop-aws jar, I was getting the following exception:
> Exception in thread "main" java.lang.NoSuchMethodError: com.amazonaws.services.s3.transfer.TransferManagerConfiguration.setMultipartUploadThreshold(I)V
> The probable reason is that the AWS Java SDK expects a long parameter for the setMultipartUploadThreshold(long multiPartThreshold) method, but hadoop-aws was passing a parameter of type int (multiPartThreshold).
> I tried both the downloaded hadoop-aws jar and the build through its Maven dependency, but in both cases I encountered the same exception. Although I can see private long multiPartThreshold; in the hadoop-aws GitHub repo, it is not reflected in the downloaded jar or in the jar created from the Maven dependency.
> The following lines in the S3AFileSystem class show this difference:
> Build from trunk:
> private long multiPartThreshold;
> this.multiPartThreshold = conf.getLong("fs.s3a.multipart.threshold", 2147483647L); => Line 267
> Build through Maven dependency:
> private int multiPartThreshold;
> multiPartThreshold = conf.getInt(MIN_MULTIPART_THRESHOLD, DEFAULT_MIN_MULTIPART_THRESHOLD); => Line 249
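
For comparison, here is a hedged sketch of the trunk-side arrangement quoted above: holding the threshold as a long makes the call bind to the (J)V signature that aws-java-sdk-s3 1.10.x actually provides. The class name is invented for illustration; the configuration key and default value are taken from the quoted snippet, not verified against the exact Hadoop source.

import org.apache.hadoop.conf.Configuration;
import com.amazonaws.services.s3.transfer.TransferManagerConfiguration;

public class TrunkStyleThreshold {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        // Same key and default as the trunk snippet quoted above.
        long multiPartThreshold = conf.getLong("fs.s3a.multipart.threshold", 2147483647L);

        TransferManagerConfiguration transferConfiguration = new TransferManagerConfiguration();
        // A long argument compiles to setMultipartUploadThreshold(J)V, matching the
        // 1.10.x SDK, so the NoSuchMethodError above does not occur.
        transferConfiguration.setMultipartUploadThreshold(multiPartThreshold);
    }
}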



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
