spark-issues mailing list archives

From "Chaoran Yu (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (SPARK-24338) Spark SQL fails to create a table in Hive when running in a Apache Sentry-secured Environment
Date Tue, 22 May 2018 03:44:00 GMT

     [ https://issues.apache.org/jira/browse/SPARK-24338?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]

Chaoran Yu updated SPARK-24338:
-------------------------------
    Description: 
This [commit|https://github.com/apache/spark/commit/ce13c2672318242748f7520ed4ce6bcfad4fb428] introduced
a bug that causes the Spark SQL "CREATE TABLE" statement to fail in Hive when Apache Sentry is
used to control cluster authorization. The bug exists in Spark 2.1.0 and all later releases.
The error message thrown is in the attached file [^exception.txt].

Cloudera fixed this bug in their fork of Spark, as shown [here|https://github.com/cloudera/spark/blob/spark2-2.2.0-cloudera2/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala#L229].
It would make sense to merge this fix back upstream.
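A minimal reproduction sketch of the failure described above. It assumes a spark-shell session with Hive support on a cluster where Apache Sentry governs Hive metastore authorization; the table name is hypothetical, and the statement cannot fail this way outside such an environment:

```scala
// Run in spark-shell (Spark 2.1.0+) with Hive support enabled, on a
// cluster where Apache Sentry controls Hive metastore authorization.
// "repro_table" is a hypothetical table name.
spark.sql("CREATE TABLE repro_table (id INT) STORED AS PARQUET")

// Expected: the table is created for a user whose Sentry role grants
// CREATE on the target database.
// Observed: the statement fails with
//   org.apache.spark.sql.AnalysisException:
//   org.apache.hadoop.hive.ql.metadata.HiveException:
//   MetaException(message:User <user> does not have privileges for CREATETABLE)
// even though the user holds the required Sentry privilege.
```

The stack trace in [^exception.txt] shows the MetaException surfacing through HiveExternalCatalog.withClient, which is where the Cloudera fork applies its fix.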

  was:
This [commit|https://github.com/apache/spark/commit/ce13c2672318242748f7520ed4ce6bcfad4fb428] introduced
a bug that causes the Spark SQL "CREATE TABLE" statement to fail in Hive when Apache Sentry is
used to control cluster authorization. The bug exists in Spark 2.1.0 and all later releases.
The error message thrown is the following:

org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException:
MetaException(message:User soadusr does not have privileges for CREATETABLE);
 at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:106)
 at org.apache.spark.sql.hive.HiveExternalCatalog.doCreateTable(HiveExternalCatalog.scala:215)
 at org.apache.spark.sql.catalyst.catalog.ExternalCatalog.createTable(ExternalCatalog.scala:110)
 at org.apache.spark.sql.catalyst.catalog.SessionCatalog.createTable(SessionCatalog.scala:316)
 at org.apache.spark.sql.execution.command.CreateTableCommand.run(tables.scala:127)
 at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)
 at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56)
 at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:67)
 at org.apache.spark.sql.Dataset.<init>(Dataset.scala:182)
 at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:67)
 at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:623)
 at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:691)
 at test.HiveTestJob$.runJob(HiveTestJob.scala:86)
 at test.HiveTestJob$.runJob(HiveTestJob.scala:73)
 at spark.jobserver.JobManagerActor$$anonfun$spark$jobserver$JobManagerActor$$getJobFuture$8.apply(JobManagerActor.scala:407)
 at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
 at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
 at monitoring.MdcPropagatingExecutionContext$$anon$1.run(MdcPropagatingExecutionContext.scala:24)
 at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
 at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
 at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:User soadusr
does not have privileges for CREATETABLE)
 at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:720)
 at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$createTable$1.apply$mcV$sp(HiveClientImpl.scala:446)
 at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$createTable$1.apply(HiveClientImpl.scala:446)
 at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$createTable$1.apply(HiveClientImpl.scala:446)
 at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$withHiveState$1.apply(HiveClientImpl.scala:290)
 at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:231)
 at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:230)
 at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:273)
 at org.apache.spark.sql.hive.client.HiveClientImpl.createTable(HiveClientImpl.scala:445)
 at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$doCreateTable$1.apply$mcV$sp(HiveExternalCatalog.scala:256)
 at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$doCreateTable$1.apply(HiveExternalCatalog.scala:215)
 at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$doCreateTable$1.apply(HiveExternalCatalog.scala:215)
 at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
 ... 20 more
Caused by: MetaException(message:User soadusr does not have privileges for CREATETABLE)
 at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_result$create_table_with_environment_context_resultStandardScheme.read(ThriftHiveMetastore.java:29983)
 at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_result$create_table_with_environment_context_resultStandardScheme.read(ThriftHiveMetastore.java:29951)
 at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_result.read(ThriftHiveMetastore.java:29877)
 at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:86)
 at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_create_table_with_environment_context(ThriftHiveMetastore.java:1075)
 at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.create_table_with_environment_context(ThriftHiveMetastore.java:1061)
 at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.create_table_with_environment_context(HiveMetaStoreClient.java:2050)
 at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.create_table_with_environment_context(SessionHiveMetaStoreClient.java:97)
 at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:669)
 at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:657)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
 at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:498)
 at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:156)
 at com.sun.proxy.$Proxy21.createTable(Unknown Source)
 at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:714)
 ... 32 more 


> Spark SQL fails to create a table in Hive when running in a Apache Sentry-secured Environment
> ---------------------------------------------------------------------------------------------
>
>                 Key: SPARK-24338
>                 URL: https://issues.apache.org/jira/browse/SPARK-24338
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.3.0
>            Reporter: Chaoran Yu
>            Priority: Critical
>         Attachments: exception.txt
>
>
> This [commit|https://github.com/apache/spark/commit/ce13c2672318242748f7520ed4ce6bcfad4fb428] introduced
a bug that causes the Spark SQL "CREATE TABLE" statement to fail in Hive when Apache Sentry is
used to control cluster authorization. The bug exists in Spark 2.1.0 and all later releases.
The error message thrown is in the attached file [^exception.txt].
> Cloudera fixed this bug in their fork of Spark, as shown [here|https://github.com/cloudera/spark/blob/spark2-2.2.0-cloudera2/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala#L229].
It would make sense to merge this fix back upstream.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

