hive-dev mailing list archives

From "Brock Noland (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (HIVE-8835) identify dependency scope for Remote Spark Context.[Spark Branch]
Date Tue, 18 Nov 2014 17:47:34 GMT

     [ https://issues.apache.org/jira/browse/HIVE-8835?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Brock Noland updated HIVE-8835:
-------------------------------
       Resolution: Fixed
    Fix Version/s: spark-branch
           Status: Resolved  (was: Patch Available)

Thank you! I have committed this to spark.

> identify dependency scope for Remote Spark Context.[Spark Branch]
> -----------------------------------------------------------------
>
>                 Key: HIVE-8835
>                 URL: https://issues.apache.org/jira/browse/HIVE-8835
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>            Reporter: Chengxiang Li
>            Assignee: Chengxiang Li
>              Labels: Spark-M3
>             Fix For: spark-branch
>
>         Attachments: HIVE-8835.1-spark.patch
>
>
> When a job is submitted through the Remote Spark Context, Spark RDD graph generation
> and job submission are executed on the remote side, so we have to add Hive-related
> dependencies to its classpath via spark.driver.extraClassPath. Instead of adding all
> Hive/Hadoop dependencies, we should narrow the scope and identify which dependencies
> the Remote Spark Context actually requires.
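
The mechanism discussed above can be sketched as a spark-defaults.conf entry. This is illustrative only: the specific jar names and paths below are assumptions for the example, not the narrowed dependency list that the patch actually identifies.

```properties
# Illustrative sketch: extend the remote driver's classpath with a narrowed
# set of Hive jars, rather than the full Hive/Hadoop dependency tree.
# The jars and paths here are assumptions, not the patch's actual list.
spark.driver.extraClassPath=/opt/hive/lib/hive-exec.jar:/opt/hive/lib/hive-common.jar
```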



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
