spark-commits mailing list archives

Subject [02/13] git commit: Update Python API features
Date Wed, 11 Sep 2013 17:26:41 GMT
Update Python API features


Branch: refs/heads/branch-0.8
Commit: 2425eb85ca709273c48958f81a81c8a04657ea1f
Parents: 8c14f4b
Author: Matei Zaharia <>
Authored: Tue Sep 10 11:12:59 2013 -0700
Committer: Matei Zaharia <>
Committed: Tue Sep 10 11:12:59 2013 -0700

 docs/ | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/docs/ b/docs/
index 5662e7d..f67a1cc 100644
--- a/docs/
+++ b/docs/
@@ -16,7 +16,7 @@ This guide will show how to use the Spark features described there in Python.
 There are a few key differences between the Python and Scala APIs:
 * Python is dynamically typed, so RDDs can hold objects of multiple types.
-* PySpark does not yet support a few API calls, such as `lookup`, `sort`, and `persist` at
-  custom storage levels. See the [API docs](api/pyspark/index.html) for details.
+* PySpark does not yet support a few API calls, such as `lookup`, `sort`, and non-text input
+  files, though these will be added in future releases.
 In PySpark, RDDs support the same methods as their Scala counterparts but take Python functions
 and return Python collection types.
 Short functions can be passed to RDD methods using Python's [`lambda`](
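
The guide text touched by this diff describes the PySpark calling convention: RDD methods accept ordinary Python functions (often lambdas), and actions return plain Python collection types. As a rough, self-contained sketch of that calling style, using a toy stand-in class rather than the real `pyspark` RDD (which distributes work across a cluster):

```python
# Toy stand-in for a PySpark RDD, illustrating only the calling style
# described in the guide; not the real pyspark.RDD class.
class FakeRDD:
    def __init__(self, data):
        self.data = list(data)

    def map(self, f):
        # RDD transformations take ordinary Python functions (often lambdas)
        return FakeRDD(f(x) for x in self.data)

    def filter(self, f):
        return FakeRDD(x for x in self.data if f(x))

    def collect(self):
        # Actions return plain Python collection types
        return self.data

nums = FakeRDD([1, 2, 3, 4])
squares = nums.map(lambda x: x * x).collect()        # [1, 4, 9, 16]
evens = nums.filter(lambda x: x % 2 == 0).collect()  # [2, 4]
```

With the real API, the equivalent would be `sc.parallelize([1, 2, 3, 4]).map(lambda x: x * x).collect()` on a live `SparkContext`.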