hive-issues mailing list archives

From "Rui Li (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HIVE-15302) Relax the requirement that HoS needs Spark built w/o Hive
Date Thu, 01 Dec 2016 07:40:59 GMT

    [ https://issues.apache.org/jira/browse/HIVE-15302?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15711207#comment-15711207 ]

Rui Li commented on HIVE-15302:
-------------------------------

[~kellyzly], you're right about the ideas. But the needed Spark jars may not be the same as
those currently listed in the wiki. The listed jars are needed when linking Spark on the Hive side, while spark.yarn.archive
and spark.yarn.jars are intended for the containers on the YARN side. That said, I'd guess the needed jars
are quite similar to those listed for local mode in our current wiki.
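For illustration, here is a sketch of how those YARN-side properties might be set in spark-defaults.conf. The HDFS paths below are hypothetical examples, not a verified required jar set; the two properties are alternatives, with spark.yarn.archive taking precedence if both are set:

```properties
# Point YARN containers at a pre-staged archive of Spark jars on HDFS
# (hypothetical path -- you would stage your own archive first)
spark.yarn.archive  hdfs:///spark/spark-libs.zip

# Or list the jar files individually instead of using an archive
# spark.yarn.jars   hdfs:///spark/jars/*.jar
```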

bq. because user has already set spark.yarn.jars so they can directly download a spark tarball
from the website
I'm not sure what you mean here. We still need users to have Spark installed in their cluster,
either downloaded or built by themselves. But in some cases we can relax the limitation that Spark must
be built w/o Hive.

> Relax the requirement that HoS needs Spark built w/o Hive
> ---------------------------------------------------------
>
>                 Key: HIVE-15302
>                 URL: https://issues.apache.org/jira/browse/HIVE-15302
>             Project: Hive
>          Issue Type: Improvement
>            Reporter: Rui Li
>            Assignee: Rui Li
>
> This requirement becomes more and more unacceptable as SparkSQL becomes widely adopted.
> Let's use this JIRA to find out how we can relax the limitation.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
