flink-issues mailing list archives

From "ASF GitHub Bot (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (FLINK-8439) Document using a custom AWS Credentials Provider with flink-s3-fs-hadoop
Date Fri, 27 Jul 2018 09:07:00 GMT

    [ https://issues.apache.org/jira/browse/FLINK-8439?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16559442#comment-16559442
] 

ASF GitHub Bot commented on FLINK-8439:
---------------------------------------

azagrebin commented on a change in pull request #6405: [FLINK-8439] Add Flink shading to AWS
credential provider s3 hadoop c…
URL: https://github.com/apache/flink/pull/6405#discussion_r205711773
 
 

 ##########
 File path: flink-filesystems/flink-hadoop-fs/src/main/java/org/apache/flink/runtime/fs/hdfs/HadoopConfigLoader.java
 ##########
 @@ -0,0 +1,134 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.runtime.fs.hdfs;
+
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.runtime.util.HadoopUtils;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import javax.annotation.Nonnull;
+
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.Set;
+
+/** Lazily loads the Hadoop configuration from a resettable Flink configuration. */
+public class HadoopConfigLoader {
+	private static final Logger LOG = LoggerFactory.getLogger(HadoopConfigLoader.class);
+
+	private static final Set<String> PACKAGE_PREFIXES_TO_SHADE =
+		new HashSet<>(Collections.singletonList("com.amazonaws."));
+
+	/** The prefixes that Flink adds to the Hadoop fs config. */
+	private final String[] flinkConfigPrefixes;
+
+	/** Keys that are replaced (after prefix replacement) to give a more uniform experience
+	 * across different file system implementations. */
+	private final String[][] mirroredConfigKeys;
+
+	/** Hadoop config prefix to replace Flink prefix. */
+	private final String hadoopConfigPrefix;
+
+	private final Set<String> configKeysToShade;
+	private final String flinkShadingPrefix;
+
+	/** Flink's configuration object. */
+	private Configuration flinkConfig;
+
+	/** Hadoop's configuration for the file systems, lazily initialized. */
+	private org.apache.hadoop.conf.Configuration hadoopConfig;
+
+	public HadoopConfigLoader(
+		@Nonnull String[] flinkConfigPrefixes,
+		@Nonnull String[][] mirroredConfigKeys,
+		@Nonnull String hadoopConfigPrefix,
+		@Nonnull Set<String> configKeysToShade,
+		@Nonnull String flinkShadingPrefix) {
+		this.flinkConfigPrefixes = flinkConfigPrefixes;
+		this.mirroredConfigKeys = mirroredConfigKeys;
+		this.hadoopConfigPrefix = hadoopConfigPrefix;
+		this.configKeysToShade = configKeysToShade;
+		this.flinkShadingPrefix = flinkShadingPrefix;
+	}
+
+	public void setFlinkConfig(Configuration config) {
+		flinkConfig = config;
+		hadoopConfig = null;
+	}
+
+	/** Get the loaded Hadoop config (or fall back to one loaded from the classpath). */
+	public org.apache.hadoop.conf.Configuration getOrLoadHadoopConfig() {
+		org.apache.hadoop.conf.Configuration hadoopConfig = this.hadoopConfig;
+		if (hadoopConfig == null) {
+			if (flinkConfig != null) {
+				hadoopConfig = mirrorCertainHadoopConfig(loadHadoopConfigFromFlink());
+			}
+			else {
+				LOG.warn("The factory has not been configured prior to loading the S3 file system."
+					+ " Using Hadoop configuration from the classpath.");
+				hadoopConfig = new org.apache.hadoop.conf.Configuration();
+			}
+		}
+		this.hadoopConfig = hadoopConfig;
+		return hadoopConfig;
+	}
+
+	// add additional config entries from the Flink config to the Hadoop config
+	private org.apache.hadoop.conf.Configuration loadHadoopConfigFromFlink() {
+		org.apache.hadoop.conf.Configuration hadoopConfig = HadoopUtils.getHadoopConfiguration(flinkConfig);
+		for (String key : flinkConfig.keySet()) {
+			for (String prefix : flinkConfigPrefixes) {
+				if (key.startsWith(prefix)) {
+					String value = flinkConfig.getString(key, null);
+					String newKey = hadoopConfigPrefix + key.substring(prefix.length());
+					String newValue = fixHadoopConfig(key, value);
+					hadoopConfig.set(newKey, newValue);
+
+					LOG.debug("Adding Flink config entry for {} as {}={} to Hadoop config",
+						key, newKey, newValue);
+				}
+			}
+		}
+		return hadoopConfig;
+	}
+
+	// mirror certain keys to make use more uniform across implementations
+	// with different keys
+	private org.apache.hadoop.conf.Configuration mirrorCertainHadoopConfig(
+		org.apache.hadoop.conf.Configuration hadoopConfig) {
+		for (String[] mirrored : mirroredConfigKeys) {
+			String value = hadoopConfig.get(mirrored[0], null);
+			if (value != null) {
+				hadoopConfig.set(mirrored[1], value);
+			}
+		}
+		return hadoopConfig;
+	}
+
+	private String fixHadoopConfig(String key, String value) {
+		return key != null && configKeysToShade.contains(key) ?
+			shadeAwsClassConfig(value) : value;
+	}
+
+	private String shadeAwsClassConfig(String awsCredProvider) {
 
 Review comment:
   true, overlooked

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


> Document using a custom AWS Credentials Provider with flink-s3-fs-hadoop
> ------------------------------------------------------------------------
>
>                 Key: FLINK-8439
>                 URL: https://issues.apache.org/jira/browse/FLINK-8439
>             Project: Flink
>          Issue Type: Improvement
>          Components: Documentation
>            Reporter: Dyana Rose
>            Assignee: Andrey Zagrebin
>            Priority: Critical
>              Labels: pull-request-available
>             Fix For: 1.4.3, 1.5.3
>
>
> This came up when using S3 for the file system backend and running under ECS.
> With no credentials in the container, hadoop-aws will default to EC2 instance-level
> credentials when accessing S3. However, when running under ECS, you will generally want
> to default to the task definition's IAM role.
> In this case you need to set the hadoop property
> {code:java}
> fs.s3a.aws.credentials.provider{code}
> to one or more fully qualified class names; see the [hadoop-aws docs|https://github.com/apache/hadoop/blob/1ba491ff907fc5d2618add980734a3534e2be098/hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/index.md]
> This works as expected when you add this setting to flink-conf.yaml, but there is a further
> 'gotcha.' Because the AWS SDK is shaded, the actual fully qualified class name for, in this
> case, the ContainerCredentialsProvider is
> {code:java}
> org.apache.flink.fs.s3hadoop.shaded.com.amazonaws.auth.ContainerCredentialsProvider{code}
>  
> meaning the full setting is:
> {code:java}
> fs.s3a.aws.credentials.provider: org.apache.flink.fs.s3hadoop.shaded.com.amazonaws.auth.ContainerCredentialsProvider{code}
> If you instead set it to the unshaded class name, you will see a very confusing error
> stating that the ContainerCredentialsProvider doesn't implement AWSCredentialsProvider
> (which it most certainly does).
> Adding this information (how to specify alternate credential providers, and the namespace
> gotcha) to the [AWS deployment docs|https://ci.apache.org/projects/flink/flink-docs-release-1.4/ops/deployment/aws.html]
> would be useful to anyone else using S3.
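
The two transformations discussed above (Flink-to-Hadoop key translation and AWS class-name shading) can be sketched standalone. This is only an illustration of the idea, not code from the PR: the prefix constants below (`s3.`, `fs.s3a.`, and the shading prefix) are assumptions chosen to match the examples in this issue, and `ShadingSketch`, `toHadoopKey`, and `shadeAwsClass` are hypothetical names.

```java
// Illustrative sketch of the key translation and class-name shading
// described in this issue. Prefix values are assumptions, not copied
// verbatim from the PR under review.
public class ShadingSketch {
	static final String FLINK_PREFIX = "s3.";
	static final String HADOOP_PREFIX = "fs.s3a.";
	static final String SHADING_PREFIX = "org.apache.flink.fs.s3hadoop.shaded.";

	// Translate a Flink config key to its Hadoop counterpart,
	// e.g. "s3.aws.credentials.provider" -> "fs.s3a.aws.credentials.provider".
	static String toHadoopKey(String flinkKey) {
		return HADOOP_PREFIX + flinkKey.substring(FLINK_PREFIX.length());
	}

	// Prepend the shading prefix to an AWS class name so the shaded SDK
	// can resolve it; leave non-AWS class names untouched.
	static String shadeAwsClass(String className) {
		return className.startsWith("com.amazonaws.")
			? SHADING_PREFIX + className
			: className;
	}

	public static void main(String[] args) {
		System.out.println(toHadoopKey("s3.aws.credentials.provider"));
		System.out.println(shadeAwsClass("com.amazonaws.auth.ContainerCredentialsProvider"));
	}
}
```

With these assumed prefixes, the second call would produce the shaded provider name shown in the description above, which is why the PR proposes doing the rewrite automatically instead of requiring users to write the shaded name into flink-conf.yaml.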



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
