Date: Wed, 16 Sep 2015 15:22:47 +0000 (UTC)
From: "Tariq Mohammad (JIRA)"
To: common-dev@hadoop.apache.org
Reply-To: common-dev@hadoop.apache.org
Subject: [jira] [Created] (HADOOP-12420) Accessing Amazon S3 through hadoop-aws (via Spark) fails with java.lang.NoSuchMethodError: com.amazonaws.services.s3.transfer.TransferManagerConfiguration.setMultipartUploadThreshold(I)V

Tariq Mohammad created HADOOP-12420:
---------------------------------------

             Summary: Accessing Amazon S3 through hadoop-aws (via Spark) fails with java.lang.NoSuchMethodError: com.amazonaws.services.s3.transfer.TransferManagerConfiguration.setMultipartUploadThreshold(I)V
                 Key: HADOOP-12420
                 URL: https://issues.apache.org/jira/browse/HADOOP-12420
             Project: Hadoop Common
          Issue Type: Improvement
          Components: fs/s3
    Affects Versions: 2.7.1
            Reporter: Tariq Mohammad
            Assignee: Tariq Mohammad
            Priority: Minor

While trying to access data stored in Amazon S3 through Apache Spark, which internally uses the hadoop-aws jar, I got the following exception:

Exception in thread "main" java.lang.NoSuchMethodError: com.amazonaws.services.s3.transfer.TransferManagerConfiguration.setMultipartUploadThreshold(I)V

The probable cause is that the AWS Java SDK expects a long parameter for setMultipartUploadThreshold(long multiPartThreshold), while hadoop-aws calls it with a parameter of type int; (I)V is the JVM descriptor for a method that takes an int and returns void. I tried both the downloaded hadoop-aws jar and the jar resolved through its Maven dependency, and in both cases I hit the same exception. Although I can see "private long multiPartThreshold;" in the hadoop-aws GitHub repo, that change is not reflected in the downloaded jar or in the jar built from the Maven dependency. The following lines in the S3AFileSystem class show the difference:

Build from trunk (line 267):

    private long multiPartThreshold;
    this.multiPartThreshold = conf.getLong("fs.s3a.multipart.threshold", 2147483647L);

Build through the Maven dependency (line 249):

    private int multiPartThreshold;
    multiPartThreshold = conf.getInt(MIN_MULTIPART_THRESHOLD, DEFAULT_MIN_MULTIPART_THRESHOLD);

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
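
To make the failure mode concrete, here is an illustrative sketch (not hadoop-aws source; the class name ThresholdCaller is made up for the example). A caller compiled against an SDK where the method took an int bakes the descriptor (I)V into its call site, and the JVM resolves methods by exact descriptor, so running against an SDK that only declares the long overload fails at link time, even though the same source would recompile cleanly via int-to-long widening:

    import com.amazonaws.services.s3.transfer.TransferManagerConfiguration;

    public class ThresholdCaller {
        public static void main(String[] args) {
            TransferManagerConfiguration cfg = new TransferManagerConfiguration();
            // Compiled against an SDK whose signature is
            // setMultipartUploadThreshold(int), this call site records the
            // descriptor (I)V. Run against a newer SDK jar that only
            // declares setMultipartUploadThreshold(long), it throws
            // java.lang.NoSuchMethodError on the first invocation: link-time
            // method resolution uses the exact descriptor and performs no
            // int-to-long widening.
            cfg.setMultipartUploadThreshold(5 * 1024 * 1024);
        }
    }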
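
Not part of the original report, but a quick way to confirm which overload the AWS SDK jar on the classpath actually provides is a reflection check along these lines (a minimal sketch; it assumes the AWS SDK S3 jar is on the classpath and uses only the class and method named above):

    import com.amazonaws.services.s3.transfer.TransferManagerConfiguration;
    import java.lang.reflect.Method;

    public class CheckThresholdSignature {
        public static void main(String[] args) {
            // Print every setMultipartUploadThreshold overload present in
            // whichever AWS SDK jar the classpath resolves to, making the
            // int-vs-long mismatch visible.
            for (Method m : TransferManagerConfiguration.class.getMethods()) {
                if (m.getName().equals("setMultipartUploadThreshold")) {
                    System.out.println(m);
                }
            }
        }
    }

If this prints only the long variant while the hadoop-aws jar in use was compiled against the int variant, the NoSuchMethodError above is the expected outcome.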