spark-issues mailing list archives

From "Felix Cheung (JIRA)" <>
Subject [jira] [Commented] (SPARK-22406) pyspark version tag is wrong on PyPi
Date Sat, 11 Nov 2017 06:40:00 GMT


Felix Cheung commented on SPARK-22406:

Is this still being targeted for 2.2.1?

> pyspark version tag is wrong on PyPi
> ------------------------------------
>                 Key: SPARK-22406
>                 URL:
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 2.2.0
>            Reporter: Kerrick Staley
>            Assignee: holdenk
>            Priority: Minor
> On PyPI, the pyspark package is tagged with version {{2.2.0.post0}}.
> However, when you install the package, it has version {{2.2.0}}.
> This has really annoying consequences: if you try {{pip install pyspark==2.2.0}}, it
won't work. Instead you have to do {{pip install pyspark==2.2.0.post0}}. Then, if you later
run the same command ({{pip install pyspark==2.2.0.post0}}), it won't recognize the existing
pyspark installation (because it has version {{2.2.0}}) and instead will reinstall it, which
is very slow because pyspark is a large package.
> This can happen if you add a new package to a {{requirements.txt}} file; you end up waiting
a lot longer than necessary because every time you run {{pip install -r requirements.txt}}
it reinstalls pyspark.
> Can you please change the package on PyPi to have the version {{2.2.0}}?
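
The mismatch described above can be sketched with the third-party {{packaging}} library (the same PEP 440 implementation that pip vendors internally); the two version strings come from the report, while the specifier checks are illustrative of pip's matching rules:

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

pypi_tag = Version("2.2.0.post0")  # version tag published on PyPI
installed = Version("2.2.0")       # version the installed package reports

# The two are distinct versions under PEP 440, so pip never sees the
# installed package as satisfying the PyPI release.
assert pypi_tag != installed

# Symptom 1: "pip install pyspark==2.2.0" fails, because an exact "=="
# clause does not match post-releases, so 2.2.0.post0 is not a candidate.
assert not SpecifierSet("==2.2.0").contains(pypi_tag)

# Symptom 2: "pip install pyspark==2.2.0.post0" reinstalls every time,
# because the installed distribution reports 2.2.0, which does not
# satisfy the ==2.2.0.post0 requirement.
assert not SpecifierSet("==2.2.0.post0").contains(installed)

# A wildcard clause would accept either form.
assert SpecifierSet("==2.2.0.*").contains(pypi_tag)
```

This is why aligning the PyPI tag with the package's own reported version (as the reporter requests) resolves both symptoms at once.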

This message was sent by Atlassian JIRA

