spark-dev mailing list archives

From Uri Laserson <>
Subject Installing PySpark on a local machine
Date Mon, 23 Dec 2013 00:29:37 GMT
Is there a documented/preferred method for installing PySpark on a local
machine?  I want to be able to run a Python interpreter on my local
machine, point it to my Spark cluster, and go.  There doesn't appear to be a setup.py file anywhere, nor is pyspark registered with PyPI.  I'm happy to
contribute these, but want to hear what the preferred method is first.
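
For reference, a minimal sketch of the workflow I have in mind, assuming pyspark were pip-installable and launchable against a running standalone cluster (the cluster URL below is hypothetical):

```shell
# Install PySpark into a local Python environment
# (assumes a PyPI release of pyspark exists)
pip install pyspark

# Start a local Python interpreter connected to a standalone cluster
# (spark://cluster-host:7077 is a hypothetical master URL)
pyspark --master spark://cluster-host:7077
```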


Uri Laserson, PhD
Data Scientist, Cloudera
Twitter/GitHub: @laserson
+1 617 910 0447
