hadoop-common-dev mailing list archives

From John Zhuge <jzh...@cloudera.com>
Subject Re: Bug in SSLFactory to use CredentialProvider API?
Date Fri, 13 Jan 2017 17:43:12 GMT
Created https://issues.apache.org/jira/browse/HADOOP-13987

John Zhuge
Software Engineer, Cloudera

On Thu, Jan 12, 2017 at 11:34 PM, John Zhuge <jzhuge@cloudera.com> wrote:

> Hi gurus,
>
> I am testing CredentialProvider with KMS: I populated the credentials file
> and added "hadoop.security.credential.provider.path" to core-site.xml, but
> "hadoop key list" failed due to an incorrect password. When I instead added
> "hadoop.security.credential.provider.path" to ssl-client.xml, "hadoop key
> list" worked! Really strange.
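>
> For reference, the workaround looks like this in ssl-client.xml (the
> jceks path below is only an example value, not my actual path):
>
>   <property>
>     <name>hadoop.security.credential.provider.path</name>
>     <value>jceks://file/etc/hadoop/conf/ssl.jceks</value>
>   </property>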
>
> In the SSLFactory constructor, a new Configuration "sslConf" that reads
> only "ssl-client.xml" or "ssl-server.xml" is passed to
> FileBasedKeyStoresFactory, which calls Configuration.getPassword() during
> initialization. However, "sslConf" does not contain the property
> "hadoop.security.credential.provider.path", because that property is
> usually set in "core-site.xml" or a component's site XML. Is this a known
> bug? JIRA is down :( Did I miss something?
>
>   public SSLFactory(Mode mode, Configuration conf) {
>     ...
>     Configuration sslConf = readSSLConfiguration(mode);
>     Class<? extends KeyStoresFactory> klass =
>         conf.getClass(KEYSTORES_FACTORY_CLASS_KEY,
>                       FileBasedKeyStoresFactory.class,
>                       KeyStoresFactory.class);
>     keystoresFactory = ReflectionUtils.newInstance(klass, sslConf);
>
>
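> A minimal sketch of a possible fix (my assumption, not a committed patch)
> would be to copy the provider path from the caller's conf into sslConf
> before the keystores factory is constructed, e.g. in the SSLFactory
> constructor:
>
>   Configuration sslConf = readSSLConfiguration(mode);
>   // Propagate the credential provider path (normally set in
>   // core-site.xml) so Configuration.getPassword() can resolve
>   // keystore passwords from the credential provider.
>   String credProviderPath =
>       conf.get(CredentialProviderFactory.CREDENTIAL_PROVIDER_PATH);
>   if (credProviderPath != null) {
>     sslConf.set(CredentialProviderFactory.CREDENTIAL_PROVIDER_PATH,
>         credProviderPath);
>   }
>   keystoresFactory = ReflectionUtils.newInstance(klass, sslConf);
>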
> Backtrace for "hadoop key list":
> * getProviders:76, CredentialProviderFactory {org.apache.hadoop.security.alias}
> * getPasswordFromCredentialProviders:2048, Configuration {org.apache.hadoop.conf}
> * getPassword:2027, Configuration {org.apache.hadoop.conf}
> * getPassword:240, FileBasedKeyStoresFactory {org.apache.hadoop.security.ssl}
> * init:203, FileBasedKeyStoresFactory {org.apache.hadoop.security.ssl}
> * init:187, SSLFactory {org.apache.hadoop.security.ssl}
> * <init>:442, KMSClientProvider {org.apache.hadoop.crypto.key.kms}
> * createProvider:350, KMSClientProvider$Factory {org.apache.hadoop.crypto.key.kms}
> * createProvider:341, KMSClientProvider$Factory {org.apache.hadoop.crypto.key.kms}
> * get:96, KeyProviderFactory {org.apache.hadoop.crypto.key}
> * getProviders:68, KeyProviderFactory {org.apache.hadoop.crypto.key}
> * getKeyProvider:181, KeyShell$Command {org.apache.hadoop.crypto.key}
> * validate:230, KeyShell$ListCommand {org.apache.hadoop.crypto.key}
> * run:71, CommandShell {org.apache.hadoop.tools}
> * run:76, ToolRunner {org.apache.hadoop.util}
> * main:478, KeyShell {org.apache.hadoop.crypto.key}
>
> SSLFactory is created by:
> * LogLevel
> * Fetcher
> * KMSClientProvider (used by "hadoop key" command)
> * URLConnectionFactory
> * ShuffleHandler
> * TimelineClientImpl
> * DatanodeHttpServer
> So, if this is a real issue, many commands and servers may be affected.
>
> Thanks,
> John Zhuge
> Software Engineer, Cloudera
>
