hive-issues mailing list archives

From "Sergey Shelukhin (JIRA)" <>
Subject [jira] [Commented] (HIVE-12880) spark-assembly causes Hive class version problems
Date Mon, 18 Jan 2016 19:37:40 GMT


Sergey Shelukhin commented on HIVE-12880:

It seems like the default spark-assembly built from Spark itself includes Hive.
This is what I'd expect most independent users will have...
If I am correct about this (I'm not very familiar with the Spark build), I wonder if it makes sense
to either (1) add a new published jar to Spark that excludes this spurious Hive version, and
use that, or (2) disable the assembly being added by default with this in mind? On a higher level,
we don't add e.g. Tez jars unless they are added explicitly (and they don't even package Hive

> spark-assembly causes Hive class version problems
> -------------------------------------------------
>                 Key: HIVE-12880
>                 URL:
>             Project: Hive
>          Issue Type: Bug
>            Reporter: Hui Zheng
> It looks like spark-assembly contains versions of Hive classes (e.g. HiveConf), and these
> sometimes (always?) come from older versions of Hive.
> We've seen problems where, depending on classpath perturbations, NoSuchFieldError may
> be thrown for recently added ConfVars because the HiveConf class comes from spark-assembly.
> Would making sure spark-assembly comes last in the classpath solve the problem?
> Otherwise, can we depend on something that does not package older Hive classes?
> Currently, HIVE-12179 provides a workaround (in the non-Spark use case, at least; I am
> assuming this issue can also affect Hive-on-Spark).
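A quick way to diagnose this kind of shadowing (a debugging sketch, not something from this thread) is to ask the JVM which jar a class was actually loaded from; if HiveConf resolves to spark-assembly rather than hive-common, the classpath ordering is the culprit. The class name defaulted below is an illustrative choice:

```java
// Diagnostic sketch (assumption: not part of this JIRA discussion):
// print which jar or directory a class was loaded from, to spot
// spark-assembly shadowing Hive's own HiveConf on the classpath.
public class WhichJar {
    // Returns the jar/directory a class was loaded from, or
    // "(bootstrap)" for JDK classes that have no code source.
    static String locationOf(Class<?> c) {
        java.security.CodeSource cs = c.getProtectionDomain().getCodeSource();
        return cs == null ? "(bootstrap)" : cs.getLocation().toString();
    }

    public static void main(String[] args) {
        // Default to the class at issue in this report; any FQCN can be passed.
        String name = args.length > 0 ? args[0]
                                      : "org.apache.hadoop.hive.conf.HiveConf";
        try {
            System.out.println(name + " -> " + locationOf(Class.forName(name)));
        } catch (ClassNotFoundException e) {
            System.out.println(name + " is not on the classpath");
        }
    }
}
```

Running this on the same classpath the Hive service uses would show immediately whether HiveConf comes from spark-assembly or from the intended Hive jar.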

This message was sent by Atlassian JIRA
