hive-user mailing list archives

From Boris Lublinsky <>
Subject Re: setup spark engine to hive ,the hive version and spark build problem
Date Sat, 17 Jun 2017 16:33:39 GMT
You need to explicitly build Spark without the Hive profile. Look at the Hive on Spark getting-started doc.
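For reference, the Hive on Spark getting-started doc shows a build along these lines for Spark 2.x; this is a sketch, and the exact profiles (Hadoop version, `parquet-provided`, etc.) depend on your cluster, so adjust accordingly:

```shell
# From the Spark source root: build a distribution WITHOUT the -Phive profile,
# so the resulting assembly does not bundle its own Hive classes
# (which would conflict with the Hive installation driving the job).
./dev/make-distribution.sh \
  --name "hadoop2-without-hive" \
  --tgz \
  "-Pyarn,hadoop-provided,hadoop-2.7,parquet-provided"
```

Note that a distribution built this way is only meant to serve as the execution engine for Hive; it will not include the Hive-backed `spark-sql` CLI.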


On Sat, Jun 17, 2017 at 5:26 AM -0400, "wuchang" <> wrote:

I want to build Hive and Spark so that my Hive runs on the Spark engine. I chose Hive 2.3.0 and Spark 2.0.0, which the official Hive documentation claims are compatible. According to that documentation, I have to build Spark without the Hive profile to avoid a conflict between the original Hive and the Spark-integrated Hive. The build succeeds, but then the problem comes: I can no longer use spark-sql, because spark-sql relies on the Hive libraries and my Spark is a no-Hive build.

[appuser@ab-10-11-22-209 spark]$ spark-sql
java.lang.ClassNotFoundException: org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver
        at java.lang.ClassLoader.loadClass(
        at java.lang.ClassLoader.loadClass(
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(
        at org.apache.spark.util.Utils$.classForName(Utils.scala:225)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:686)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Failed to load main class org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.
You need to build Spark with -Phive and -Phive-thriftserver.
How can I build and set up Spark so that Hive on Spark works properly, and my spark-sql, pyspark,
and spark-shell work properly as well?

I don’t know the relationship between the Spark-integrated Hive and the original Hive. Below are
the Spark-integrated Hive jars [jar listing not preserved in the archive].
It seems that Spark 2.0.0 relies on Hive 1.2.1.
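As a point of comparison, the Hive on Spark getting-started doc configures the engine from the Hive side rather than through a Hive-enabled Spark build. A minimal sketch of the documented properties (values such as the master and executor memory are placeholders you would tune for your cluster):

```sql
-- In the Hive CLI (or persisted in hive-site.xml): switch the execution
-- engine from MapReduce to Spark and point Hive at the Spark cluster.
set hive.execution.engine=spark;
set spark.master=yarn;
set spark.eventLog.enabled=true;
set spark.executor.memory=512m;          -- placeholder; size for your workload
set spark.serializer=org.apache.spark.serializer.KryoSerializer;
```

With this setup, the no-Hive Spark build serves only as Hive's execution engine; a separate, stock Spark build (compiled with `-Phive` and `-Phive-thriftserver`) would be needed if you also want a working `spark-sql` CLI.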
