phoenix-dev mailing list archives

From "Nick Dimiduk (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (PHOENIX-2040) Mark spark/scala dependencies as 'provided'
Date Mon, 15 Jun 2015 19:37:00 GMT

    [ https://issues.apache.org/jira/browse/PHOENIX-2040?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14586539#comment-14586539 ]

Nick Dimiduk commented on PHOENIX-2040:
---------------------------------------

I think this looks alright to me, so long as we're in agreement that we want to be able to
launch Spark jobs from the client jar.

[~jmahonin] have you had a chance to test that sqlline.py and psql.py both work after this
patch is applied? As we've seen, these uberjar assemblies are pretty fragile. I've done my
testing against a local-mode HBase instance; basically it's enough to make sure everything
is running in different JVMs. If that testing passes, +1 from me as well.

> Mark spark/scala dependencies as 'provided'
> -------------------------------------------
>
>                 Key: PHOENIX-2040
>                 URL: https://issues.apache.org/jira/browse/PHOENIX-2040
>             Project: Phoenix
>          Issue Type: Bug
>            Reporter: Josh Mahonin
>            Assignee: Josh Mahonin
>             Fix For: 5.0.0, 4.5.0
>
>         Attachments: PHOENIX-2040.patch
>
>
> The Spark runtime provides both the Scala library and the Spark dependencies, so these
> should be marked as 'provided' in the phoenix-spark module. This greatly reduces the size
> of the resulting client JAR.
> This patch also adds phoenix-spark back to the list of modules in the assembly, so that
> it is included in the client JAR.
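
For reference, the kind of change described above would look roughly like the following in
the phoenix-spark pom.xml. This is a minimal sketch, not the actual attached patch: the
exact dependency list and the property names (spark.version, scala.version,
scala.binary.version) are assumptions based on common Spark/Scala Maven conventions.

    <!-- phoenix-spark/pom.xml (sketch): Spark and Scala are supplied by the Spark
         runtime at execution time, so they are kept off the client JAR by giving
         them 'provided' scope. Property names below are illustrative. -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_${scala.binary.version}</artifactId>
      <version>${spark.version}</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_${scala.binary.version}</artifactId>
      <version>${spark.version}</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>${scala.version}</version>
      <scope>provided</scope>
    </dependency>

With 'provided' scope, these artifacts are still on the compile and test classpaths but are
not bundled into the shaded client assembly, which is what keeps its size down.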



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
