spark-dev mailing list archives

From Uri Laserson <laserson@cloudera.com>
Subject Installing PySpark on a local machine
Date Mon, 23 Dec 2013 00:29:37 GMT
Is there a documented/preferred method for installing PySpark on a local
machine?  I want to be able to run a Python interpreter on my local
machine, point it at my Spark cluster, and go.  There doesn't appear to be a
setup.py file anywhere, nor is pyspark registered on PyPI.  I'm happy to
contribute these, but want to hear what the preferred method is first.

Uri

-- 
Uri Laserson, PhD
Data Scientist, Cloudera
Twitter/GitHub: @laserson
+1 617 910 0447
laserson@cloudera.com
