accumulo-dev mailing list archives

From Jorge Machado <jom...@me.com>
Subject Re: Spark and Accumulo Delegation tokens
Date Fri, 23 Mar 2018 07:10:26 GMT
Hi Jerry,

Where do you see that class in Spark? I only found HadoopDelegationTokenManager, and I don't
see any way to add my provider to it.

private def getDelegationTokenProviders: Map[String, HadoopDelegationTokenProvider] = {
  val providers = List(new HadoopFSDelegationTokenProvider(fileSystems),
    new HiveDelegationTokenProvider,
    new HBaseDelegationTokenProvider)

  // Filter out providers for which spark.security.credentials.{service}.enabled is false.
  providers
    .filter { p => isServiceEnabled(p.serviceName) }
    .map { p => (p.serviceName, p) }
    .toMap
}

If you could give me a tip, that would be great.
Thanks 

Jorge Machado





> On 23 Mar 2018, at 07:38, Saisai Shao <sai.sai.shao@gmail.com> wrote:
> 
> I think you can build your own Accumulo credential provider, similar to
> HadoopDelegationTokenProvider, outside of Spark. Spark already provides an
> interface, "ServiceCredentialProvider", for users to plug in a customized
> credential provider.
> 
> Thanks
> Jerry
> 
> 2018-03-23 14:29 GMT+08:00 Jorge Machado <jomach@me.com>:
> 
>> Hi Guys,
>> 
>> I’m in the middle of writing a Spark DataSource connector for Apache Spark
>> to connect to Accumulo tablets. Because we have Kerberos, it gets a little
>> tricky, since Spark only handles the delegation tokens for HBase, Hive and
>> HDFS.
>> 
>> Would a PR for an implementation of HadoopDelegationTokenProvider for
>> Accumulo be accepted?
>> 
>> 
>> Jorge Machado
>> 
>> 
>> 
>> 
>> 
>> 
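The plug-in path Saisai describes can be sketched as follows. The trait below is a local stand-in mirroring the shape of Spark 2.x's `org.apache.spark.deploy.yarn.security.ServiceCredentialProvider` (reproduced here as an assumption so the sketch is self-contained; in a real build you would implement Spark's own trait, whose methods take Hadoop `Configuration` and `Credentials` objects rather than plain maps). The `AccumuloCredentialProvider` class, its config key check, and the placeholder token are all hypothetical illustrations, not Spark or Accumulo API.

```scala
import scala.collection.mutable

// Stand-in for Spark's ServiceCredentialProvider trait (assumption: the real
// Spark 2.x trait uses org.apache.hadoop.conf.Configuration and
// org.apache.hadoop.security.Credentials instead of these simple maps).
trait ServiceCredentialProvider {
  def serviceName: String
  def credentialsRequired(hadoopConf: Map[String, String]): Boolean
  def obtainCredentials(hadoopConf: Map[String, String],
                        creds: mutable.Map[String, Array[Byte]]): Option[Long]
}

// Hypothetical Accumulo provider sketch.
class AccumuloCredentialProvider extends ServiceCredentialProvider {
  override def serviceName: String = "accumulo"

  // Only fetch tokens when Kerberos is enabled.
  override def credentialsRequired(hadoopConf: Map[String, String]): Boolean =
    hadoopConf.get("hadoop.security.authentication").contains("kerberos")

  // A real provider would obtain an Accumulo delegation token here and add it
  // to the credentials; this sketch stores a placeholder and reports no
  // renewal time (None).
  override def obtainCredentials(hadoopConf: Map[String, String],
                                 creds: mutable.Map[String, Array[Byte]]): Option[Long] = {
    creds.put(serviceName, "placeholder-token".getBytes("UTF-8"))
    None
  }
}
```

Spark discovers implementations via `java.util.ServiceLoader`, so a real provider would be registered by listing its class name in a `META-INF/services` resource file named after the trait. Note that, per the filter shown in the quoted `getDelegationTokenProviders` snippet, a provider with `serviceName = "accumulo"` would be gated by a `spark.security.credentials.accumulo.enabled` setting.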

