spark-reviews mailing list archives

From nchammas <>
Subject [GitHub] spark issue #15659: [SPARK-1267][SPARK-18129] Allow PySpark to be pip instal...
Date Sun, 06 Nov 2016 16:51:48 GMT
GitHub user nchammas commented on the issue:
    I tested this out with Python 3 on my system with the following commands:
    # Inside ./spark/.
    python3 -m venv venv
    source venv/bin/activate
    ./dev/ --pip
    pip install -e ./python/
    which pyspark
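
    After the `pip install -e ./python/` step above, pyspark should be importable directly from the source tree. A quick, hypothetical sanity check (not part of the original report) that shows where the editable install resolves the package:

    ```python
    # Hypothetical check: confirm pyspark is importable and see where an
    # editable install points. With "pip install -e", spec.origin should
    # resolve into the source tree (e.g. .../spark/python/pyspark/...).
    import importlib.util

    spec = importlib.util.find_spec("pyspark")
    if spec is not None:
        print("pyspark resolves to:", spec.origin)
    else:
        print("pyspark is not importable in this environment")
    ```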
    Seems there is a bug with how `SPARK_HOME` is computed:
    [output snipped]
    $ pip install -e ./python/
    Obtaining file:///.../apache/spark/python
    Collecting py4j==0.10.4 (from pyspark==2.1.0.dev1)
      Downloading py4j-0.10.4-py2.py3-none-any.whl (186kB)
        100% |████████████████████████████████| 194kB 2.0MB/s
    Installing collected packages: py4j, pyspark
      Running develop for pyspark
    Successfully installed py4j-0.10.4 pyspark
    $ which pyspark
    $ pyspark
    Could not find valid SPARK_HOME while searching <map object at 0x102bc15f8>
    .../apache/spark/venv/bin/pyspark: line 24: None/bin/ No such file or directory
    .../apache/spark/venv/bin/pyspark: line 77: .../apache/spark/None/bin/spark-submit: No such file or directory
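
    The `<map object at 0x102bc15f8>` in the error message is a Python 3 lazy-iterator repr leaking into the output. A minimal sketch of how that can happen and how to make the message readable; `find_spark_home` here is a hypothetical helper for illustration, not Spark's actual `find_spark_home.py`:

    ```python
    import os

    # Hypothetical sketch of a SPARK_HOME search, illustrating why the
    # error above prints "<map object at 0x...>": in Python 3, map()
    # returns a lazy iterator whose repr does not show its contents.
    def find_spark_home(candidates):
        """Return the first candidate dir containing bin/spark-submit."""
        paths = map(os.path.abspath, candidates)
        for path in paths:
            if os.path.isfile(os.path.join(path, "bin", "spark-submit")):
                return path
        # Interpolating `paths` here would yield "<map object at 0x...>",
        # and the iterator is already exhausted by the loop anyway.
        # Rebuild and materialize the list for a readable message.
        searched = [os.path.abspath(p) for p in candidates]
        raise ValueError(
            "Could not find valid SPARK_HOME while searching %s" % searched)
    ```

    Materializing the iterator (or using a list comprehension from the start) makes the error report the actual paths searched instead of the iterator's repr.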
