spark-issues mailing list archives

From "holdenk (JIRA)" <>
Subject [jira] [Updated] (SPARK-22406) pyspark version tag is wrong on PyPi
Date Sat, 06 Jan 2018 00:48:00 GMT


holdenk updated SPARK-22406:
    Fix Version/s: 2.1.2

> pyspark version tag is wrong on PyPi
> ------------------------------------
>                 Key: SPARK-22406
>                 URL:
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 2.2.0
>            Reporter: Kerrick Staley
>            Assignee: holdenk
>            Priority: Minor
>             Fix For: 2.1.2, 2.2.1
> On PyPI, the pyspark package is tagged with version {{2.2.0.post0}}.
> However, when you install the package, it reports version {{2.2.0}}.
> This has really annoying consequences: if you try {{pip install pyspark==2.2.0}}, it
> won't work. Instead you have to do {{pip install pyspark==2.2.0.post0}}. Then, if you later
> run the same command ({{pip install pyspark==2.2.0.post0}}), it won't recognize the existing
> pyspark installation (because it has version {{2.2.0}}) and will instead reinstall it, which
> is very slow because pyspark is a large package.
> This can happen if you add a new package to a {{requirements.txt}} file; you end up waiting
> a lot longer than necessary because every time you run {{pip install -r requirements.txt}}
> it reinstalls pyspark.
> Can you please change the package on PyPi to have the version {{2.2.0}}?
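The mismatch described above can be reproduced outside of pip with the third-party {{packaging}} library, which implements the same PEP 440 version semantics that pip applies; a minimal sketch (the version strings are taken from the report, the rest is illustrative):

```python
# Sketch of why pip treats 2.2.0.post0 and 2.2.0 as different releases.
# Uses the third-party `packaging` library, whose PEP 440 comparison
# logic matches what pip applies when resolving `==` specifiers.
from packaging.version import Version

published = Version("2.2.0.post0")  # version tag on PyPI
installed = Version("2.2.0")        # version the installed package reports

# Not equal: `pip install pyspark==2.2.0` finds no matching release,
# and `pip install pyspark==2.2.0.post0` sees the installed 2.2.0 as
# not satisfying the requirement, so it downloads and reinstalls.
print(published == installed)  # False
print(published > installed)   # True: a post-release sorts after its base version
```

This is why aligning the PyPI tag with the package's reported version (both {{2.2.0}}) makes repeated {{pip install}} runs a no-op instead of a reinstall.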

This message was sent by Atlassian JIRA

