hive-dev mailing list archives

From "Szehon Ho (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HIVE-8836) Enable automatic tests with remote spark client.[Spark Branch]
Date Tue, 25 Nov 2014 05:55:13 GMT

    [ https://issues.apache.org/jira/browse/HIVE-8836?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14224098#comment-14224098
] 

Szehon Ho commented on HIVE-8836:
---------------------------------

bq. Spark assembly dependency can be fetched from public maven repository

Hi Chengxiang, I don't understand this part: how are you downloading this from a public repo?
When I try your patch I get the error:
{noformat}
Could not resolve dependencies for project org.apache.hive:hive-it-qfile-spark:jar:0.15.0-SNAPSHOT:
Could not find artifact org.apache.spark:spark-assembly_2.10:jar:1.2.0-SNAPSHOT in spark-snapshot
(http://ec2-50-18-79-139.us-west-1.compute.amazonaws.com/data/spark_2.10-1.2-SNAPSHOT/
{noformat}

And we can't find it in any public repo.  That's why we assumed we had to build it and upload
it to a hosted location.
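For reference, this is roughly the pom snippet such a setup implies. The dependency coordinates and the {{spark-snapshot}} repository id/URL are taken from the resolution error above; nothing here is from the actual patch, it's just a sketch of what Maven would need:

{noformat}
<!-- Sketch only: repository id and URL copied from the error message above;
     this assumes the assembly jar has actually been deployed there. -->
<repositories>
  <repository>
    <id>spark-snapshot</id>
    <url>http://ec2-50-18-79-139.us-west-1.compute.amazonaws.com/data/spark_2.10-1.2-SNAPSHOT/</url>
    <snapshots>
      <enabled>true</enabled>
    </snapshots>
  </repository>
</repositories>

<dependencies>
  <!-- The artifact Maven failed to find in any public repo. -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-assembly_2.10</artifactId>
    <version>1.2.0-SNAPSHOT</version>
  </dependency>
</dependencies>
{noformat}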

Another question: we were trying to set spark.home, which looks for bin/spark-submit, which
in turn pulled in scripts like compute-classpath.sh, load-spark-env.sh, and spark-class, and
finally spark-assembly itself.  I see you are using another way (spark.test.home, spark.testing);
how does that avoid looking for these artifacts to start the Spark process?
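To make the two setups concrete, here is what we had versus what the patch appears to set. The paths are placeholders, and I'm only listing what spark.home required in our testing, not claiming how spark.testing works internally:

{noformat}
# What we were setting: requires a full Spark distribution, since spark.home
# resolves bin/spark-submit and its helper scripts (compute-classpath.sh,
# load-spark-env.sh, spark-class) plus the spark-assembly jar.
spark.home=/path/to/spark-dist

# What the patch appears to set instead (paths are placeholders):
spark.test.home=/path/to/spark-dist
spark.testing=true
{noformat}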

> Enable automatic tests with remote spark client.[Spark Branch]
> --------------------------------------------------------------
>
>                 Key: HIVE-8836
>                 URL: https://issues.apache.org/jira/browse/HIVE-8836
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>            Reporter: Chengxiang Li
>            Assignee: Rui Li
>              Labels: Spark-M3
>         Attachments: HIVE-8836-brock-1.patch, HIVE-8836-brock-2.patch, HIVE-8836-brock-3.patch,
>                      HIVE-8836.1-spark.patch, HIVE-8836.2-spark.patch
>
>
> In a real production environment, the remote spark client will mostly be used to submit
> Spark jobs for Hive, so we should enable automatic tests with the remote spark client to
> make sure Hive features work with it.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
