spark-issues mailing list archives

From "Gabor Somogyi (Jira)" <>
Subject [jira] [Commented] (SPARK-31817) Pass-through of Kerberos credentials from Spark SQL to a jdbc source
Date Fri, 29 May 2020 13:36:00 GMT


Gabor Somogyi commented on SPARK-31817:

Thanks [~hyukjin.kwon] for pinging me! Interesting topic, and I'm happy to take a look at the approach.
I'm about to have a week off, so this probably won't be super fast :)
From my perspective it would be good to have some sort of document describing the
use-case(s) we would like to solve; it would be easier to speak the same language. Once
we have that, we'll hopefully see clearly whether it's possible or not.

As an initial thought: a JDBC source needs keytab files deployed on all nodes to do authentication
(databases don't support delegation tokens), and I personally don't see any other option at
the moment.
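The keytab-based approach described above can be sketched roughly as follows. This is a minimal illustration only: it assumes the JDBC `keytab` and `principal` data source options that were being added to Spark 3.x around the time of this thread, and the URL, table name, keytab path, and principal are all hypothetical placeholders.

```scala
// Sketch only: assumes Spark 3.x JDBC "keytab"/"principal" options;
// all connection details below are hypothetical placeholders.
val df = spark.read
  .format("jdbc")
  .option("url", "jdbc:postgresql://db.example.com:5432/mydb")
  .option("dbtable", "accounts")
  // The keytab file must be readable at this path on every executor
  // node, e.g. pre-deployed by an admin or shipped via --files.
  .option("keytab", "/etc/security/keytabs/dbuser.keytab")
  .option("principal", "dbuser@EXAMPLE.COM")
  .load()
```

Note that each executor authenticates with the same service principal from the deployed keytab; this is exactly why the comment above points out that, without database-side delegation-token support, a true per-user credential pass-through is hard to achieve.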

> Pass-through of Kerberos credentials from Spark SQL to a jdbc source
> --------------------------------------------------------------------
>                 Key: SPARK-31817
>                 URL:
>             Project: Spark
>          Issue Type: New Feature
>          Components: SQL
>    Affects Versions: 2.4.5
>            Reporter: Luis Lozano Coira
>            Priority: Major
> I am connecting to Spark SQL through the Thrift JDBC/ODBC server using Kerberos. From
Spark SQL I have connected to a JDBC source using basic authentication, but I am interested
in doing a pass-through of Kerberos credentials to this JDBC source.
> Would it be possible to do something like that? If it is not possible, could you consider adding
this functionality?
> In any case, I would like to start testing this pass-through and try to develop an approach
myself. How could this functionality be added? Could you give me any pointers to start
this development?

This message was sent by Atlassian Jira

