spark-issues mailing list archives

From "Mike Sukmanowsky (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-13587) Support virtualenv in PySpark
Date Wed, 30 Mar 2016 19:29:25 GMT

    [ https://issues.apache.org/jira/browse/SPARK-13587?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15218659#comment-15218659 ]

Mike Sukmanowsky commented on SPARK-13587:
------------------------------------------

That's the (hopefully) beautiful thing about pex.

{noformat}
$ pex numpy pandas -o c_requirements.pex
$ ./c_requirements.pex
Python 2.7.10 (default, Aug  4 2015, 19:54:05)
[GCC 4.2.1 Compatible Apple LLVM 6.1.0 (clang-602.0.53)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
(InteractiveConsole)
>>> import numpy
>>> import pandas
>>> print numpy.__version__
1.11.0
>>> print pandas.__version__
0.18.0
>>>
{noformat}

The catch, of course, is that the pex file itself has to be built on a machine with the same platform/architecture as the Spark workers (i.e. you can't build a pex on Mac OS and ship it to a Linux cluster unless all dependencies are pure Python). To build a platform-agnostic environment, we'd have to look at conda.
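
For the non-pure-Python case, one workaround (just a sketch, and it assumes Docker is available and the workers run a Linux distro that's binary-compatible with the image used) would be to build the pex inside a container that matches the cluster:

{noformat}
# Illustrative only -- adjust the image to match whatever the workers actually run.
$ docker run --rm -v "$PWD":/build -w /build python:2.7 \
    sh -c "pip install pex && pex numpy pandas -o c_requirements.pex"
{noformat}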

I know there was an effort to support pex with PySpark (https://github.com/URXtech/spex), but it hasn't seen much activity recently. I tried reaching out to the author but got no response.

I could take a shot at adding support for this unless @zjffdu already has plans.
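
If it helps, here's roughly what I'd picture usage looking like. This is only a rough sketch: shipping the pex to executors with --files and pointing PYSPARK_PYTHON at it via spark.executorEnv is my assumption about how the wiring could work, not something PySpark does for you today, and c_requirements.pex / my_job.py are placeholder names.

{noformat}
# Hypothetical wiring -- the pex is executable and acts as the Python interpreter
# on both the driver and (after being shipped via --files) the executors.
$ PYSPARK_DRIVER_PYTHON=./c_requirements.pex \
  spark-submit \
    --master yarn \
    --files c_requirements.pex \
    --conf spark.executorEnv.PYSPARK_PYTHON=./c_requirements.pex \
    my_job.py
{noformat}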

> Support virtualenv in PySpark
> -----------------------------
>
>                 Key: SPARK-13587
>                 URL: https://issues.apache.org/jira/browse/SPARK-13587
>             Project: Spark
>          Issue Type: New Feature
>          Components: PySpark
>            Reporter: Jeff Zhang
>
> Currently, it's not easy for users to add third-party Python packages in PySpark.
> * One way is to use --py-files (suitable for simple dependencies, but not for complicated ones, especially those with transitive dependencies)
> * Another way is to install packages manually on each node (time-consuming, and not easy to switch between different environments)
> Python now has 2 different virtualenv implementations: one is the native virtualenv, the other is through conda. This JIRA aims to bring these 2 tools to the distributed environment.


