Date: Mon, 23 Oct 2017 14:50:00 +0000 (UTC)
From: "Chesnay Schepler (JIRA)"
To: issues@flink.apache.org
Reply-To: dev@flink.apache.org
Subject: [jira] [Updated] (FLINK-7905) HadoopS3FileSystemITCase failed on travis
MIME-Version: 1.0
Content-Type: text/plain; charset=utf-8
Content-Transfer-Encoding: 7bit
X-JIRA-FingerPrint: 30527f35849b9dde25b450d4833f0394
archived-at: Mon, 23 Oct 2017 14:50:10 -0000

     [ https://issues.apache.org/jira/browse/FLINK-7905?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Chesnay Schepler updated FLINK-7905:
------------------------------------
    Environment: 
https://travis-ci.org/zentol/flink/jobs/291550295
https://travis-ci.org/tillrohrmann/flink/jobs/291491026

  was:
https://travis-ci.org/zentol/flink/jobs/291550295
https://travis-ci.org/tillrohrmann/flink/jobs/291491026


> HadoopS3FileSystemITCase failed on travis
> -----------------------------------------
>
>                 Key: FLINK-7905
>                 URL: https://issues.apache.org/jira/browse/FLINK-7905
>             Project: Flink
>          Issue Type: Bug
>          Components: FileSystem, Tests
>    Affects Versions: 1.4.0
>         Environment: https://travis-ci.org/zentol/flink/jobs/291550295
> https://travis-ci.org/tillrohrmann/flink/jobs/291491026
>            Reporter: Chesnay Schepler
>              Labels: test-stability
>
> {code}
> -------------------------------------------------------
>  T E S T S
> -------------------------------------------------------
> Running org.apache.flink.fs.s3hadoop.HadoopS3FileSystemITCase
> Tests run: 3, Failures: 0, Errors: 2, Skipped: 0, Time elapsed: 3.354 sec <<< FAILURE! - in org.apache.flink.fs.s3hadoop.HadoopS3FileSystemITCase
> testDirectoryListing(org.apache.flink.fs.s3hadoop.HadoopS3FileSystemITCase) Time elapsed: 0.208 sec <<< ERROR!
> java.nio.file.AccessDeniedException: s3://[secure]/tests-9273972a-70c2-4f06-862e-d02936313fea/testdir: getFileStatus on s3://[secure]/tests-9273972a-70c2-4f06-862e-d02936313fea/testdir: com.amazonaws.services.s3.model.AmazonS3Exception: Forbidden (Service: Amazon S3; Status Code: 403; Error Code: 403 Forbidden; Request ID: 9094999D7456C589), S3 Extended Request ID: fVIcROQh4E1/GjWYYV6dFp851rjiKtFgNSCO8KkoTmxWbuxz67aDGqRiA/a09q7KS6Mz1Tnyab4=
> 	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleErrorResponse(AmazonHttpClient.java:1579)
> 	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1249)
> 	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1030)
> 	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:742)
> 	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:716)
> 	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:699)
> 	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:667)
> 	at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:649)
> 	at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:513)
> 	at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4194)
> 	at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4141)
> 	at com.amazonaws.services.s3.AmazonS3Client.getObjectMetadata(AmazonS3Client.java:1256)
> 	at com.amazonaws.services.s3.AmazonS3Client.getObjectMetadata(AmazonS3Client.java:1232)
> 	at org.apache.hadoop.fs.s3a.S3AFileSystem.getObjectMetadata(S3AFileSystem.java:904)
> 	at org.apache.hadoop.fs.s3a.S3AFileSystem.getFileStatus(S3AFileSystem.java:1553)
> 	at org.apache.hadoop.fs.s3a.S3AFileSystem.getFileStatus(S3AFileSystem.java:117)
> 	at org.apache.flink.runtime.fs.hdfs.HadoopFileSystem.getFileStatus(HadoopFileSystem.java:77)
> 	at org.apache.flink.core.fs.FileSystem.exists(FileSystem.java:509)
> 	at org.apache.flink.fs.s3hadoop.HadoopS3FileSystemITCase.testDirectoryListing(HadoopS3FileSystemITCase.java:163)
> testSimpleFileWriteAndRead(org.apache.flink.fs.s3hadoop.HadoopS3FileSystemITCase) Time elapsed: 0.275 sec <<< ERROR!
> java.nio.file.AccessDeniedException: s3://[secure]/tests-9273972a-70c2-4f06-862e-d02936313fea/test.txt: getFileStatus on s3://[secure]/tests-9273972a-70c2-4f06-862e-d02936313fea/test.txt: com.amazonaws.services.s3.model.AmazonS3Exception: Forbidden (Service: Amazon S3; Status Code: 403; Error Code: 403 Forbidden; Request ID: B3D8126BE6CF169F), S3 Extended Request ID: T34sn+a/CcCFv+kFR/UbfozAkXXtiLDu2N31Ok5EydgKeJF5I2qXRCC/MkxSi4ymiiVWeSyb8FY=
> 	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleErrorResponse(AmazonHttpClient.java:1579)
> 	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1249)
> 	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1030)
> 	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:742)
> 	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:716)
> 	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:699)
> 	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:667)
> 	at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:649)
> 	at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:513)
> 	at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4194)
> 	at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4141)
> 	at com.amazonaws.services.s3.AmazonS3Client.getObjectMetadata(AmazonS3Client.java:1256)
> 	at com.amazonaws.services.s3.AmazonS3Client.getObjectMetadata(AmazonS3Client.java:1232)
> 	at org.apache.hadoop.fs.s3a.S3AFileSystem.getObjectMetadata(S3AFileSystem.java:904)
> 	at org.apache.hadoop.fs.s3a.S3AFileSystem.getFileStatus(S3AFileSystem.java:1553)
> 	at org.apache.hadoop.fs.s3a.S3AFileSystem.delete(S3AFileSystem.java:1234)
> 	at org.apache.flink.runtime.fs.hdfs.HadoopFileSystem.delete(HadoopFileSystem.java:134)
> 	at org.apache.flink.fs.s3hadoop.HadoopS3FileSystemITCase.testSimpleFileWriteAndRead(HadoopS3FileSystemITCase.java:147)
> {code}



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
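
For reference, both stack traces above go through the same path: the test exercises Flink's {{FileSystem}} abstraction, which delegates to the shaded Hadoop {{S3AFileSystem}} from {{flink-s3-fs-hadoop}}, whose {{getFileStatus}} issues a {{getObjectMetadata}} (HEAD) request that S3 answers with 403 Forbidden. A minimal sketch of an equivalent existence check, assuming the flink-s3-fs-hadoop jar is on the classpath and using a hypothetical bucket/path (the ITCase itself uses a randomized "tests-<uuid>" directory in a bucket configured via secure environment variables):

{code}
import org.apache.flink.core.fs.FileSystem;
import org.apache.flink.core.fs.Path;

public class S3ExistsSketch {

    public static void main(String[] args) throws Exception {
        // Hypothetical bucket and key, used here only for illustration.
        Path path = new Path("s3://my-test-bucket/testdir");

        // Resolves the s3:// scheme to the bundled s3a-based file system
        // (wrapped in Flink's HadoopFileSystem adapter).
        FileSystem fs = FileSystem.get(path.toUri());

        // exists() calls getFileStatus(), which issues a getObjectMetadata
        // request against S3; when the credentials are missing or not
        // authorized, S3 returns 403 and s3a surfaces it as
        // java.nio.file.AccessDeniedException, as seen in both failures.
        System.out.println("exists: " + fs.exists(path));
    }
}
{code}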