flink-user mailing list archives

From Mustafa AKIN <mustaf...@gmail.com>
Subject Using Hadoop 2.8.0 in Flink Project for S3A Path Style Access
Date Tue, 11 Jul 2017 08:47:27 GMT
Hi all,

I am trying to use an S3 backend with a custom endpoint. However, this is not
supported in hadoop-aws@2.7.3; I need at least version 2.8.0. The
underlying reason is that requests are being sent as follows:

DEBUG [main] (AmazonHttpClient.java:337) - Sending Request: HEAD
http://mustafa.localhost:9000 / Headers:

Because "fs.s3a.path.style.access" is not recognized in the old version. I want
the domain to remain the same and the bucket name to be appended to the path (
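For reference, with hadoop-aws 2.8.0+ path-style access can be enabled in the
Hadoop configuration that Flink picks up. This is only a sketch; the endpoint
value is an assumption based on the request above and must match your actual
S3-compatible service:

```xml
<!-- core-site.xml (or the Hadoop configuration Flink is pointed at) -->
<property>
  <!-- Assumed endpoint; adjust to your S3-compatible service -->
  <name>fs.s3a.endpoint</name>
  <value>http://localhost:9000</value>
</property>
<property>
  <!-- Recognized only by hadoop-aws 2.8.0 and later -->
  <name>fs.s3a.path.style.access</name>
  <value>true</value>
</property>
```

With path-style access enabled, the bucket name moves from the hostname into
the request path (e.g. HEAD http://localhost:9000/mustafa instead of HEAD
http://mustafa.localhost:9000/).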

I cannot blindly bump the aws-java-sdk version to the latest; it causes:

Caused by: java.lang.NoClassDefFoundError: Could not initialize class
at org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:182)

So, if I increase hadoop-aws to 2.8.0 with the latest client, it causes the
following error:

According to, I need hadoop-aws@2.7.2 and

Caused by: java.lang.IllegalAccessError: tried to access method
from class org.apache.hadoop.fs.s3a.S3AInstrumentation

Should I be excluding hadoop-common from Flink somehow? Building Flink from
source with mvn clean install -DskipTests -Dhadoop.version=2.8.0 works, but
I want to manage it via Maven as much as possible.
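One thing to try in the project's pom.xml (a sketch, not verified against your
Flink version): pin hadoop-aws to 2.8.0 and exclude its transitive
hadoop-common, so it doesn't clash with the hadoop-common that Flink already
provides. The IllegalAccessError above suggests mixed Hadoop versions on the
classpath, so this may still fail if Flink's bundled Hadoop classes differ:

```xml
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-aws</artifactId>
  <version>2.8.0</version>
  <exclusions>
    <!-- Avoid pulling a second hadoop-common that conflicts with Flink's -->
    <exclusion>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-common</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```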
