flink-user mailing list archives

From Eron Wright <eronwri...@gmail.com>
Subject Re: Using Hadoop 2.8.0 in Flink Project for S3A Path Style Access
Date Wed, 09 Aug 2017 18:00:19 GMT
For reference: [FLINK-6466] Build Hadoop 2.8.0 convenience binaries

On Wed, Aug 9, 2017 at 6:41 AM, Aljoscha Krettek <aljoscha@apache.org>
wrote:

> So you're saying that this works if you manually compile Flink for Hadoop
> 2.8.0? If yes, I think the solution is that we have to provide binaries for
> Hadoop 2.8.0. If we did that with a possible Flink 1.3.3 release and
> starting from Flink 1.4.0, would this be an option for you?
>
> Best,
> Aljoscha
>
> On 11. Jul 2017, at 10:47, Mustafa AKIN <mustafa91@gmail.com> wrote:
>
> Hi all,
>
> I am trying to use the S3 backend with a custom endpoint. However, this is
> not supported in hadoop-aws@2.7.3; I need at least version 2.8.0. The
> underlying reason is that the requests are being sent as follows:
>
> DEBUG [main] (AmazonHttpClient.java:337) - Sending Request: HEAD
> http://mustafa.localhost:9000 / Headers:
>
> This is because "fs.s3a.path.style.access" is not recognized in the old
> version. I want the domain to remain the same and the bucket name to be
> appended to the path (http://localhost:9000/mustafa/...).
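>
> Assuming Hadoop 2.8.0+ is on the classpath, path-style access is normally
> switched on through the Hadoop configuration (e.g. core-site.xml); the
> endpoint below is a placeholder for the local S3-compatible service, not a
> value from this thread:
>
> ```xml
> <!-- core-site.xml sketch: S3-compatible endpoint with path-style access -->
> <configuration>
>   <property>
>     <name>fs.s3a.endpoint</name>
>     <!-- placeholder: the local S3-compatible service -->
>     <value>http://localhost:9000</value>
>   </property>
>   <property>
>     <!-- available from hadoop-aws 2.8.0; requests become http://host/bucket/key -->
>     <name>fs.s3a.path.style.access</name>
>     <value>true</value>
>   </property>
> </configuration>
> ```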
>
> I cannot blindly bump the aws-java-sdk version to the latest; it causes:
>
> Caused by: java.lang.NoClassDefFoundError: Could not initialize class
> com.amazonaws.ClientConfiguration
> at org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(
> S3AFileSystem.java:182)
>
> So, if I increase hadoop-aws to 2.8.0 with the latest client, it causes
> the following error:
>
> Caused by: java.lang.IllegalAccessError: tried to access method
> org.apache.hadoop.metrics2.lib.MutableCounterLong.<init>(
> Lorg/apache/hadoop/metrics2/MetricsInfo;J)V from class
> org.apache.hadoop.fs.s3a.S3AInstrumentation
> at org.apache.hadoop.fs.s3a.S3AInstrumentation.streamCounter(
> S3AInstrumentation.java:194)
>
> According to
> https://ci.apache.org/projects/flink/flink-docs-
> release-1.3/setup/aws.html#provide-s3-filesystem-dependency
> I need hadoop-aws@2.7.2.
>
>
> Should I be excluding hadoop-common from Flink somehow? Building Flink
> from source with "mvn clean install -DskipTests -Dhadoop.version=2.8.0"
> works, but I want to manage it via Maven as much as possible.
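>
> One way to attempt this via Maven (a sketch, not verified against this
> exact setup): exclude the transitive hadoop-common pulled in elsewhere and
> pin hadoop-aws and hadoop-common to the same 2.8.0 version, since the
> IllegalAccessError above comes from mixing hadoop-aws 2.8.0 classes with
> an older hadoop-common:
>
> ```xml
> <!-- pom.xml fragment: keep both Hadoop artifacts on one version (2.8.0 here) -->
> <dependency>
>   <groupId>org.apache.hadoop</groupId>
>   <artifactId>hadoop-aws</artifactId>
>   <version>2.8.0</version>
>   <exclusions>
>     <!-- avoid dragging in a second, mismatched hadoop-common -->
>     <exclusion>
>       <groupId>org.apache.hadoop</groupId>
>       <artifactId>hadoop-common</artifactId>
>     </exclusion>
>   </exclusions>
> </dependency>
> <dependency>
>   <groupId>org.apache.hadoop</groupId>
>   <artifactId>hadoop-common</artifactId>
>   <version>2.8.0</version>
> </dependency>
> ```
>
> Whether this coexists with the Hadoop version bundled by the Flink
> binaries is exactly the open question here; it is safest with a Flink
> build compiled against Hadoop 2.8.0.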
>
>
>
