spark-issues mailing list archives

From "Zhan Zhang (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-5159) Thrift server does not respect hive.server2.enable.doAs=true
Date Thu, 14 Jan 2016 19:50:40 GMT

    [ https://issues.apache.org/jira/browse/SPARK-5159?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15098734#comment-15098734 ]

Zhan Zhang commented on SPARK-5159:
-----------------------------------

This is definitely broken, but fixing it requires a complete design review first.


For example, if we enable impersonation (doAs) at runtime, how do we solve RDD sharing
between different users?

We could propagate the user to the executors, piggybacked on the TaskDescription. But what
happens if two users operate on two RDDs which share the same parent, whose cache was created
by another user? (A sketch of this situation follows below.) Currently, an RDD is scoped to the
SparkContext without any user information, which means that even if we do impersonation, it is
meaningless, per my understanding.
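Below is a minimal sketch of the propagation idea and of why it is not enough on its own. It
uses Spark's public local-property mechanism (SparkContext.setLocalProperty and
TaskContext.getLocalProperty, the latter available in later Spark releases) as a stand-in for
carrying the user in the TaskDescription; the "user" key and the user names are hypothetical,
and the cache-sharing problem is only illustrated, not solved.

    import org.apache.spark.{SparkConf, SparkContext, TaskContext}

    object DoAsSketch {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(
          new SparkConf().setAppName("doas-sketch").setMaster("local[2]"))

        // One parent RDD, cached once; the cache belongs to the
        // SparkContext, not to any particular user.
        val parent = sc.parallelize(1 to 100).cache()

        // Tag jobs from this thread with a (hypothetical) user; local
        // properties are shipped to the tasks of each job.
        sc.setLocalProperty("user", "alice")
        parent.map { x =>
          // The executor can read the propagated user here; a per-task
          // doAs would have to happen at this point.
          val user = TaskContext.get().getLocalProperty("user")
          (user, x)
        }.count()

        // A second "user" reuses the SAME cached blocks that were
        // materialized under alice: cache scope carries no user identity.
        sc.setLocalProperty("user", "bob")
        parent.map(_ * 2).count()

        sc.stop()
      }
    }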

> Thrift server does not respect hive.server2.enable.doAs=true
> ------------------------------------------------------------
>
>                 Key: SPARK-5159
>                 URL: https://issues.apache.org/jira/browse/SPARK-5159
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.2.0
>            Reporter: Andrew Ray
>         Attachments: spark_thrift_server_log.txt
>
>
> I'm currently testing the Spark SQL Thrift server on a Kerberos-secured cluster in YARN
> mode. Currently any user can access any table regardless of HDFS permissions, as all data is
> read as the hive user. In HiveServer2 the property hive.server2.enable.doAs=true causes all
> access to be done as the submitting user. We should do the same.
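
For comparison, HiveServer2's doAs is built on Hadoop proxy users. A minimal sketch of what
respecting the flag could mean on the Thrift-server side, assuming the service principal is
whitelisted via hadoop.proxyuser.* in core-site.xml; runAsSessionUser and the way the session
user is obtained are hypothetical, not Spark's actual implementation:

    import java.security.PrivilegedExceptionAction
    import org.apache.hadoop.security.UserGroupInformation

    object ProxyUserSketch {
      // Run `body` as the connecting session user instead of the shared
      // service (hive) user. Requires the service to hold Kerberos
      // credentials and to be allowed as a proxy user in Hadoop.
      def runAsSessionUser[T](sessionUser: String)(body: => T): T = {
        val proxy = UserGroupInformation.createProxyUser(
          sessionUser, UserGroupInformation.getLoginUser)
        proxy.doAs(new PrivilegedExceptionAction[T] {
          override def run(): T = body
        })
      }
    }

With hive.server2.enable.doAs=true set in hive-site.xml, each session's statements would be
wrapped in such a call, so HDFS permission checks apply to the submitting user rather than to
the hive user.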



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
