spark-issues mailing list archives

From "Sunil Rangwani (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-14492) Spark SQL 1.6.0 does not work with Hive version lower than 1.2.0; it's not backwards compatible with earlier versions
Date Thu, 30 Mar 2017 22:17:41 GMT

    [ https://issues.apache.org/jira/browse/SPARK-14492?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15949965#comment-15949965 ]

Sunil Rangwani commented on SPARK-14492:
----------------------------------------

Ok, I raised this a year ago and no longer have access to the same environment. I was trying
to use Spark with an external Hive metastore version 0.14, and that did not work with the
various config options (see the sketch below). The way I got around it was upgrading Hive and
its metastore database to version 1.2.1.
I progressively upgraded from 0.14 to 1.0.0 to 1.2.0, and until I upgraded to 1.2.1 I kept
getting a java.lang.NoSuchFieldError for one Hive config field or another. I just remember
that it was a bit messy.
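For concreteness, a minimal sketch of the kind of configuration meant by "the various config
options": the spark.sql.hive.metastore.version and spark.sql.hive.metastore.jars settings
documented for Spark 1.6. The object name, app name, and jar path below are placeholders,
not the original environment.

{code:scala}
// Hedged sketch: pointing Spark SQL 1.6 at an external Hive 0.14 metastore
// via the documented config options. The jar path is a placeholder for
// wherever the Hive 0.14 client jars actually live.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object MetastoreSmokeTest {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("hive-0.14-metastore-test")
      .set("spark.sql.hive.metastore.version", "0.14.0")            // external metastore version
      .set("spark.sql.hive.metastore.jars", "/opt/hive-0.14/lib/*") // placeholder classpath
    val sc = new SparkContext(conf)
    // Constructing the HiveContext is where the reported failure surfaced.
    val hiveContext = new HiveContext(sc)
    hiveContext.sql("SHOW TABLES").show()
  }
}
{code}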


> Spark SQL 1.6.0 does not work with Hive version lower than 1.2.0; it's not backwards compatible with earlier versions
> -------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-14492
>                 URL: https://issues.apache.org/jira/browse/SPARK-14492
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.6.0
>            Reporter: Sunil Rangwani
>            Priority: Critical
>
> Spark SQL, when configured with a Hive version lower than 1.2.0, throws a java.lang.NoSuchFieldError
> for the field METASTORE_CLIENT_SOCKET_LIFETIME because this field was introduced in Hive 1.2.0,
> so it is not possible to use a Hive metastore version lower than 1.2.0 with Spark. The details
> of the Hive changes can be found here: https://issues.apache.org/jira/browse/HIVE-9508
> {code:java}
> Exception in thread "main" java.lang.NoSuchFieldError: METASTORE_CLIENT_SOCKET_LIFETIME
> 	at org.apache.spark.sql.hive.HiveContext.configure(HiveContext.scala:500)
> 	at org.apache.spark.sql.hive.HiveContext.metadataHive$lzycompute(HiveContext.scala:250)
> 	at org.apache.spark.sql.hive.HiveContext.metadataHive(HiveContext.scala:237)
> 	at org.apache.spark.sql.hive.HiveContext.setConf(HiveContext.scala:441)
> 	at org.apache.spark.sql.SQLContext$$anonfun$4.apply(SQLContext.scala:272)
> 	at org.apache.spark.sql.SQLContext$$anonfun$4.apply(SQLContext.scala:271)
> 	at scala.collection.Iterator$class.foreach(Iterator.scala:727)
> 	at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
> 	at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
> 	at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
> 	at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:271)
> 	at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:90)
> 	at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
> 	at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:58)
> 	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.<init>(SparkSQLCLIDriver.scala:267)
> 	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:139)
> 	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
> 	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
> 	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
> 	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
> 	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> {code}
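> For illustration, a minimal standalone sketch (not Spark source code) of how this kind of
> linkage error arises. The object name is hypothetical; ConfVars.METASTORE_CLIENT_SOCKET_LIFETIME
> is the real field added by HIVE-9508. Compile against hive-common 1.2.x, then run with Hive
> 0.14 jars on the classpath:
> {code:scala}
> // The field exists only in Hive >= 1.2.0, so this reference compiles fine
> // against 1.2.x but cannot be resolved at runtime against 0.14 jars.
> import org.apache.hadoop.hive.conf.HiveConf.ConfVars
> 
> object LinkageRepro {
>   def main(args: Array[String]): Unit = {
>     // With Hive 0.14 on the classpath, this line throws
>     // java.lang.NoSuchFieldError: METASTORE_CLIENT_SOCKET_LIFETIME
>     println(ConfVars.METASTORE_CLIENT_SOCKET_LIFETIME.varname)
>   }
> }
> {code}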



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
