hive-user mailing list archives

From Xuefu Zhang <xzh...@cloudera.com>
Subject Re: Hive on Spark
Date Fri, 23 Oct 2015 11:40:05 GMT
Yeah. For that, you cannot really cache anything through Hive on Spark.
Could you explain in more detail what you want to achieve?

When needed, Hive on Spark uses memory+disk as the storage level.
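For reference, the earlier point in this thread about passing Spark settings through Hive can be sketched as below. This is an illustrative example only; the property values are assumptions for the sketch, not recommendations:

```sql
-- Sketch: configuring Spark from the Hive CLI with set commands.
-- Values below are illustrative assumptions, not tuning advice.
set hive.execution.engine=spark;   -- run this session's queries on Spark
set spark.executor.memory=4g;      -- any spark.* property can be set this way
set spark.executor.cores=2;
```

Such settings apply to the current Hive session and take effect for subsequently submitted queries.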

On Fri, Oct 23, 2015 at 4:29 AM, Jone Zhang <joyoungzhang@gmail.com> wrote:

> 1. But there is no way to set the storage level through a properties file
> in Spark; Spark provides only the "def persist(newLevel: StorageLevel)"
> API...
>
> 2015-10-23 19:03 GMT+08:00 Xuefu Zhang <xzhang@cloudera.com>:
>
>> Quick answers:
>> 1. You can set pretty much any Spark configuration in Hive using the set
>> command.
>> 2. No, you have to make the call yourself.
>>
>>
>>
>> On Thu, Oct 22, 2015 at 10:32 PM, Jone Zhang <joyoungzhang@gmail.com>
>> wrote:
>>
>>> 1. How can I set the storage level when I use Hive on Spark?
>>> 2. Is there any plan to dynamically choose between Hive on
>>> MapReduce and Hive on Spark, based on SQL features?
>>>
>>> Thanks in advance
>>> Best regards
>>>
>>
>>
>
