spark-issues mailing list archives

From "Sean Owen (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (SPARK-11702) Guava ClassLoading Issue When Using Different Hive Metastore Version
Date Sat, 14 Nov 2015 13:38:11 GMT

     [ https://issues.apache.org/jira/browse/SPARK-11702?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen updated SPARK-11702:
------------------------------
    Component/s: Spark Core

Got it, makes more sense now.

> Guava ClassLoading Issue When Using Different Hive Metastore Version
> --------------------------------------------------------------------
>
>                 Key: SPARK-11702
>                 URL: https://issues.apache.org/jira/browse/SPARK-11702
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.5.1
>            Reporter: Joey Paskhay
>
> A Guava classloading error can occur when using a different version of the Hive metastore.
> We are running the latest Spark release at this time (1.5.1) with patched versions of Hadoop 2.2.0 and Hive 1.0.0. We set "spark.sql.hive.metastore.version" to "1.0.0" and "spark.sql.hive.metastore.jars" to "<path_to_hive>/lib/*:<output_of_hadoop_classpath_cmd>". When launching the spark-shell, the sqlContext fails to initialize with:
> {code}
> java.lang.ClassNotFoundException: java.lang.NoClassDefFoundError: com/google/common/base/Predicate when creating Hive client using classpath: <all the jars>
> Please make sure that jars for your version of hive and hadoop are included in the paths passed to SQLConfEntry(key = spark.sql.hive.metastore.jars, defaultValue=builtin, doc=...
> {code}
> We verified that the Guava libraries are in the huge list of included jars, but we saw that the org.apache.spark.sql.hive.client.IsolatedClientLoader.isSharedClass method seems to assume that *all* "com.google" classes (excluding "com.google.cloud") should be loaded from the base class loader. The Spark libraries appear to have *some* "com.google.common.base" classes shaded in, but not all of them.
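The sharing rule described above can be sketched as follows. This is a simplified illustration, not Spark's actual source: the real isSharedClass also shares other prefixes (java., scala., etc.), and the object and value names here are made up for the example.

```scala
// Sketch of the class-sharing predicate described in this issue: classes whose
// names match a "shared" prefix are loaded from the base (Spark) class loader
// rather than the isolated Hive-metastore class loader.
object IsSharedClassSketch {
  // Illustrative prefixes only; the real method's lists differ.
  private val sharedPrefixes = Seq("java.", "scala.", "org.apache.spark.", "com.google")
  private val barrierPrefixes = Seq("com.google.cloud")

  def isSharedClass(name: String): Boolean =
    sharedPrefixes.exists(name.startsWith) && !barrierPrefixes.exists(name.startsWith)
}
```

Under this rule, "com.google.common.base.Predicate" is treated as shared and resolved against Spark's own classpath, where some Guava classes were shaded away, which produces the NoClassDefFoundError above.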
> See [https://mail-archives.apache.org/mod_mbox/spark-user/201511.mbox/%3CCAB51Vx4ipV34e=EiSHLg7BZLdm0uefD_MpyqfE4dodbnbv9MKg@mail.gmail.com%3E] and its replies.
> The work-around is to add the guava JAR to the "spark.driver.extraClassPath" property.
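Concretely, the work-around might look like the following entry in conf/spark-defaults.conf. The jar path is illustrative (reusing the <path_to_hive> placeholder from above); point it at whatever Guava jar your Hive installation ships.

```properties
spark.driver.extraClassPath  <path_to_hive>/lib/guava-<version>.jar
```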



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


