spark-dev mailing list archives

From StanZhai <m...@zhaishidan.cn>
Subject Re: [SparkSQL]Could not alter table in Spark 1.5 use HiveContext
Date Fri, 11 Sep 2015 06:29:25 GMT
Thanks a lot! I've fixed this issue by setting:
spark.sql.hive.metastore.version = 0.13.1
spark.sql.hive.metastore.jars = maven
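For the archives: the same two settings can also be passed per job at submit time instead of in spark-defaults.conf. A minimal sketch (the application jar name is a placeholder, not from this thread):

```shell
# Point Spark SQL at a Hive 0.13.1 metastore client.
# "maven" tells Spark to download the matching client jars at startup
# (convenient for testing; for production, point at local jars instead).
spark-submit \
  --conf spark.sql.hive.metastore.version=0.13.1 \
  --conf spark.sql.hive.metastore.jars=maven \
  my-app.jar   # placeholder: your application jar
```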


Yin Huai-2 wrote
> Yes, Spark 1.5 uses Hive 1.2's metastore client by default. You can change
> it by putting the following settings in your spark conf.
> 
> spark.sql.hive.metastore.version = 0.13.1
> spark.sql.hive.metastore.jars = maven, or the path of your Hive 0.13 jars
> and Hadoop jars
> 
> For spark.sql.hive.metastore.jars: basically, it tells Spark SQL where to
> find the metastore client classes for Hive 0.13.1. If you set it to maven,
> Spark will download the needed jars directly (an easy way to do testing).
> 
> On Thu, Sep 10, 2015 at 7:45 PM, StanZhai <mail@...> wrote:
> 
>> Thank you for the swift reply!
>>
>> The version of my Hive metastore server is 0.13.1. I've built Spark with
>> sbt like this:
>> build/sbt -Pyarn -Phadoop-2.4 -Phive -Phive-thriftserver assembly
>>
>> Does Spark 1.5 bind to the Hive 1.2 client by default?
>>
>>
>>





--
View this message in context: http://apache-spark-developers-list.1001551.n3.nabble.com/SparkSQL-Could-not-alter-table-in-Spark-1-5-use-HiveContext-tp14029p14047.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org

