spark-commits mailing list archives

From marmb...@apache.org
Subject spark git commit: [SPARK-5944] [PySpark] fix version in Python API docs
Date Wed, 25 Feb 2015 23:13:51 GMT
Repository: spark
Updated Branches:
  refs/heads/master 838a48036 -> f3f4c87b3


[SPARK-5944] [PySpark] fix version in Python API docs

Use RELEASE_VERSION when building the Python API docs.

Author: Davies Liu <davies@databricks.com>

Closes #4731 from davies/api_version and squashes the following commits:

c9744c9 [Davies Liu] Update create-release.sh
08cbc3f [Davies Liu] fix python docs


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/f3f4c87b
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/f3f4c87b
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/f3f4c87b

Branch: refs/heads/master
Commit: f3f4c87b3d944c10d1200dfe49091ebb2a149be6
Parents: 838a480
Author: Davies Liu <davies@databricks.com>
Authored: Wed Feb 25 15:13:34 2015 -0800
Committer: Michael Armbrust <michael@databricks.com>
Committed: Wed Feb 25 15:13:34 2015 -0800

----------------------------------------------------------------------
 dev/create-release/create-release.sh | 2 +-
 python/docs/conf.py                  | 6 +++---
 python/docs/pyspark.sql.rst          | 2 +-
 python/pyspark/rdd.py                | 4 ++++
 4 files changed, 9 insertions(+), 5 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/f3f4c87b/dev/create-release/create-release.sh
----------------------------------------------------------------------
diff --git a/dev/create-release/create-release.sh b/dev/create-release/create-release.sh
index 607ce1c..da15ce3 100755
--- a/dev/create-release/create-release.sh
+++ b/dev/create-release/create-release.sh
@@ -237,7 +237,7 @@ if [[ ! "$@" =~ --skip-package ]]; then
   sbt/sbt clean
   cd docs
   # Compile docs with Java 7 to use nicer format
-  JAVA_HOME=$JAVA_7_HOME PRODUCTION=1 jekyll build
+  JAVA_HOME="$JAVA_7_HOME" PRODUCTION=1 RELEASE_VERSION="$RELEASE_VERSION" jekyll build
   echo "Copying release documentation"
   rc_docs_folder=${rc_folder}-docs
   ssh $ASF_USERNAME@people.apache.org \
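The one-line shell change above passes RELEASE_VERSION into the doc build's environment so that Sphinx's conf.py (patched below) can read it back. A minimal sketch of that hand-off, using a child Python process in place of the jekyll build (the "1.3.0" value is illustrative, not from the patch):

```python
import os
import subprocess
import sys

# The release script sets RELEASE_VERSION inline for the build command;
# the child process then sees it in its environment, exactly as the
# Sphinx conf.py does during the real docs build.
env = dict(os.environ, RELEASE_VERSION="1.3.0")
out = subprocess.run(
    [sys.executable, "-c", "import os; print(os.environ['RELEASE_VERSION'])"],
    env=env, capture_output=True, text=True,
)
print(out.stdout.strip())  # 1.3.0
```

Note that a `VAR=value command` assignment scopes the variable to that one command's environment; it does not leak into the surrounding shell.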

http://git-wip-us.apache.org/repos/asf/spark/blob/f3f4c87b/python/docs/conf.py
----------------------------------------------------------------------
diff --git a/python/docs/conf.py b/python/docs/conf.py
index cbbf7ff..163987d 100644
--- a/python/docs/conf.py
+++ b/python/docs/conf.py
@@ -48,16 +48,16 @@ master_doc = 'index'
 
 # General information about the project.
 project = u'PySpark'
-copyright = u'2014, Author'
+copyright = u''
 
 # The version info for the project you're documenting, acts as replacement for
 # |version| and |release|, also used in various other places throughout the
 # built documents.
 #
 # The short X.Y version.
-version = '1.3-SNAPSHOT'
+version = 'master'
 # The full version, including alpha/beta/rc tags.
-release = '1.3-SNAPSHOT'
+release = os.environ.get('RELEASE_VERSION', version)
 
 # The language for content autogenerated by Sphinx. Refer to documentation
 # for a list of supported languages.
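The conf.py change above can be exercised on its own: the short version is pinned to a fixed string, and the full release string falls back to it unless the release script exports RELEASE_VERSION. A small sketch of that fallback (the "1.3.0" below is an illustrative value):

```python
import os

# Mirrors the fallback in conf.py: during normal development no
# RELEASE_VERSION is set, so the docs build as 'master'; the release
# script overrides it with the real version string.
version = 'master'
release = os.environ.get('RELEASE_VERSION', version)

print(release)  # 'master' unless RELEASE_VERSION is set in the environment
```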

http://git-wip-us.apache.org/repos/asf/spark/blob/f3f4c87b/python/docs/pyspark.sql.rst
----------------------------------------------------------------------
diff --git a/python/docs/pyspark.sql.rst b/python/docs/pyspark.sql.rst
index 2e3f69b..6259379 100644
--- a/python/docs/pyspark.sql.rst
+++ b/python/docs/pyspark.sql.rst
@@ -17,7 +17,7 @@ pyspark.sql.types module
 
 
 pyspark.sql.functions module
-------------------------
+----------------------------
 .. automodule:: pyspark.sql.functions
     :members:
     :undoc-members:
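The one-character-class fix above reflects a reStructuredText rule: a section underline must be at least as long as its title, or Sphinx warns "Title underline too short" and may mis-render the heading. A quick sanity check for such headings (an illustrative helper, not part of the patch):

```python
def underline_ok(title, underline):
    """Return True if a reST section underline is long enough for its title
    and made of a single repeated character."""
    return (len(underline) >= len(title)
            and len(set(underline)) == 1)

title = "pyspark.sql.functions module"
print(underline_ok(title, "-" * 25))         # False: the old, too-short underline
print(underline_ok(title, "-" * len(title))) # True: the corrected one
```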

http://git-wip-us.apache.org/repos/asf/spark/blob/f3f4c87b/python/pyspark/rdd.py
----------------------------------------------------------------------
diff --git a/python/pyspark/rdd.py b/python/pyspark/rdd.py
index d3148de..cb12fed 100644
--- a/python/pyspark/rdd.py
+++ b/python/pyspark/rdd.py
@@ -2111,6 +2111,7 @@ class RDD(object):
     def countApprox(self, timeout, confidence=0.95):
         """
         .. note:: Experimental
+
         Approximate version of count() that returns a potentially incomplete
         result within a timeout, even if not all tasks have finished.
 
@@ -2124,6 +2125,7 @@ class RDD(object):
     def sumApprox(self, timeout, confidence=0.95):
         """
         .. note:: Experimental
+
         Approximate operation to return the sum within a timeout
         or meet the confidence.
 
@@ -2140,6 +2142,7 @@ class RDD(object):
     def meanApprox(self, timeout, confidence=0.95):
         """
         .. note:: Experimental
+
         Approximate operation to return the mean within a timeout
         or meet the confidence.
 
@@ -2156,6 +2159,7 @@ class RDD(object):
     def countApproxDistinct(self, relativeSD=0.05):
         """
         .. note:: Experimental
+
         Return approximate number of distinct elements in the RDD.
 
         The algorithm used is based on streamlib's implementation of
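The blank lines added throughout rdd.py matter to Sphinx: `.. note::` is a reST directive, and without a blank line after it the following paragraph is parsed as part of the note's body rather than as the method description. A minimal sketch of the corrected docstring shape (a hypothetical stub, not code from the patch):

```python
def count_approx_sketch(timeout, confidence=0.95):
    """
    .. note:: Experimental

    Approximate version of count() that returns a potentially incomplete
    result within a timeout, even if not all tasks have finished.
    """
    # Illustrative stub only; the real method lives on pyspark.rdd.RDD.
    raise NotImplementedError

# The blank line after the directive ends the note, so the summary
# paragraph renders as ordinary method documentation.
```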

