spark-commits mailing list archives

From: andrewo...@apache.org
Subject: svn commit: r1642013 - in /spark: documentation.md downloads.md js/downloads.js news/_posts/2014-11-26-spark-1-1-1-released.md releases/_posts/2014-11-26-spark-release-1-1-1.md
Date: Thu, 27 Nov 2014 03:02:49 GMT
Author: andrewor14
Date: Thu Nov 27 03:02:48 2014
New Revision: 1642013

URL: http://svn.apache.org/r1642013
Log:
Add release notes and update website for 1.1.1

Added:
    spark/news/_posts/2014-11-26-spark-1-1-1-released.md
    spark/releases/_posts/2014-11-26-spark-release-1-1-1.md
Modified:
    spark/documentation.md
    spark/downloads.md
    spark/js/downloads.js

Modified: spark/documentation.md
URL: http://svn.apache.org/viewvc/spark/documentation.md?rev=1642013&r1=1642012&r2=1642013&view=diff
==============================================================================
--- spark/documentation.md (original)
+++ spark/documentation.md Thu Nov 27 03:02:48 2014
@@ -12,7 +12,7 @@ navigation:
 <p>Setup instructions, programming guides, and other documentation are available for each version of Spark below:</p>
 
 <ul>
-  <li><a href="{{site.url}}docs/latest/">Spark 1.1.0 (latest release)</a></li>
+  <li><a href="{{site.url}}docs/latest/">Spark 1.1.1 (latest release)</a></li>
   <li><a href="{{site.url}}docs/1.0.2/">Spark 1.0.2</a></li>
   <li><a href="{{site.url}}docs/0.9.2/">Spark 0.9.2</a></li>
   <li><a href="{{site.url}}docs/0.8.1/">Spark 0.8.1</a></li>

Modified: spark/downloads.md
URL: http://svn.apache.org/viewvc/spark/downloads.md?rev=1642013&r1=1642012&r2=1642013&view=diff
==============================================================================
--- spark/downloads.md (original)
+++ spark/downloads.md Thu Nov 27 03:02:48 2014
@@ -16,9 +16,9 @@ $(document).ready(function() {
 
 ## Download Spark
 
-The latest release of Spark is Spark 1.1.0, released on September 11, 2014
-<a href="{{site.url}}releases/spark-release-1-1-0.html">(release notes)</a>
-<a href="https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=2f9b2bd7844ee8393dc9c319f4fefedf95f5e460">(git
tag)</a><br/>
+The latest release of Spark is Spark 1.1.1, released on November 26, 2014
+<a href="{{site.url}}releases/spark-release-1-1-1.html">(release notes)</a>
+<a href="https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=3693ae5d3c01861557e06edbc32a8112683f3d86">(git
tag)</a><br/>
 
 1. Choose a Spark release:
   <select id="sparkVersionSelect" onChange="javascript:onVersionSelect();"></select><br>
@@ -38,7 +38,7 @@ Spark artifacts are [hosted in Maven Cen
 
     groupId: org.apache.spark
     artifactId: spark-core_2.10
-    version: 1.1.0
+    version: 1.1.1
 
 ### Development and Maintenance Branches
 If you are interested in working with the newest under-development code or contributing to Spark development, you can also check out the master branch from Git:
@@ -46,7 +46,7 @@ If you are interested in working with th
     # Master development branch
     git clone git://github.com/apache/spark.git
 
-    # 1.1 maintenance branch with stability fixes on top of Spark 1.1.0
+    # 1.1 maintenance branch with stability fixes on top of Spark 1.1.1
     git clone git://github.com/apache/spark.git -b branch-1.1
 
 Once you've downloaded Spark, you can find instructions for installing and building it on the <a href="{{site.url}}documentation.html">documentation page</a>.
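
Not part of this commit, but for reference: the Maven coordinates updated in the downloads.md diff above correspond to the sbt dependency below. This is a minimal sketch assuming a Scala 2.10 build (the binary version targeted by the spark-core_2.10 artifact); the file name build.sbt and the exact Scala patch version are illustrative.

    // build.sbt -- illustrative sketch, not part of this commit
    // "%%" appends the Scala binary version, resolving to spark-core_2.10
    scalaVersion := "2.10.4"

    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.1"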

Modified: spark/js/downloads.js
URL: http://svn.apache.org/viewvc/spark/js/downloads.js?rev=1642013&r1=1642012&r2=1642013&view=diff
==============================================================================
--- spark/js/downloads.js (original)
+++ spark/js/downloads.js Thu Nov 27 03:02:48 2014
@@ -26,6 +26,7 @@ var packagesV3 = packagesV2.concat([mapr
 // 1.1.0+
 var packagesV4 = packagesV1.concat([hadoop2p3, hadoop2p4, mapr3, mapr4]);
 
+addRelease("1.1.1", new Date("11/26/2014"), packagesV4, true);
 addRelease("1.1.0", new Date("9/11/2014"), packagesV4, true);
 addRelease("1.0.2", new Date("8/5/2014"), packagesV3, true);
 addRelease("1.0.1", new Date("7/11/2014"), packagesV3);

Added: spark/news/_posts/2014-11-26-spark-1-1-1-released.md
URL: http://svn.apache.org/viewvc/spark/news/_posts/2014-11-26-spark-1-1-1-released.md?rev=1642013&view=auto
==============================================================================
--- spark/news/_posts/2014-11-26-spark-1-1-1-released.md (added)
+++ spark/news/_posts/2014-11-26-spark-1-1-1-released.md Thu Nov 27 03:02:48 2014
@@ -0,0 +1,16 @@
+---
+layout: post
+title: Spark 1.1.1 released
+categories:
+- News
+tags: []
+status: publish
+type: post
+published: true
+meta:
+  _edit_last: '4'
+  _wpas_done_all: '1'
+---
+We are happy to announce the availability of <a href="{{site.url}}releases/spark-release-1-1-1.html" title="Spark Release 1.1.1">Spark 1.1.1</a>! This is a maintenance release that includes contributions from 55 developers. Spark 1.1.1 includes fixes across several areas of Spark, including the core API, Streaming, PySpark, SQL, GraphX, and MLlib.
+
+Visit the <a href="{{site.url}}releases/spark-release-1-1-1.html" title="Spark Release 1.1.1">release notes</a> to read about this release or <a href="{{site.url}}downloads.html">download</a> the release today.

Added: spark/releases/_posts/2014-11-26-spark-release-1-1-1.md
URL: http://svn.apache.org/viewvc/spark/releases/_posts/2014-11-26-spark-release-1-1-1.md?rev=1642013&view=auto
==============================================================================
--- spark/releases/_posts/2014-11-26-spark-release-1-1-1.md (added)
+++ spark/releases/_posts/2014-11-26-spark-release-1-1-1.md Thu Nov 27 03:02:48 2014
@@ -0,0 +1,109 @@
+---
+layout: post
+title: Spark Release 1.1.1
+categories: []
+tags: []
+status: publish
+type: post
+published: true
+meta:
+  _edit_last: '4'
+  _wpas_done_all: '1'
+---
+
+Spark 1.1.1 is a maintenance release with bug fixes. This release is based on the [branch-1.1](https://github.com/apache/spark/tree/branch-1.1) maintenance branch of Spark. We recommend that all 1.1.0 users upgrade to this stable release. Contributions to this release came from 55 developers.
+
+To download Spark 1.1.1, visit the <a href="{{site.url}}downloads.html">downloads</a> page.
+
+### Fixes
+Spark 1.1.1 contains bug fixes in several components. Some of the more important fixes are highlighted below. You can visit the [Spark issue tracker](http://s.apache.org/z9h) for the full list of fixes.
+
+#### Spark Core
+- Avoid many small spills in external data structures ([SPARK-4480](https://issues.apache.org/jira/browse/SPARK-4480))
+- Memory leak in connection manager timeout thread ([SPARK-4393](https://issues.apache.org/jira/browse/SPARK-4393))
+- Incorrect handling of channel read return value may lead to data truncation ([SPARK-4107](https://issues.apache.org/jira/browse/SPARK-4107))
+- Stream corruption exceptions observed in sort-based shuffle ([SPARK-3948](https://issues.apache.org/jira/browse/SPARK-3948))
+- Integer overflow in sort-based shuffle key comparison ([SPARK-3032](https://issues.apache.org/jira/browse/SPARK-3032))
+- Lack of thread safety in Hadoop configuration usage in Spark ([SPARK-2546](https://issues.apache.org/jira/browse/SPARK-2546))
+
+#### SQL
+- Wrong Parquet filters are created for all inequality predicates with literals on the left hand side ([SPARK-4468](https://issues.apache.org/jira/browse/SPARK-4468))
+- Support backticks in aliases ([SPARK-3708](https://issues.apache.org/jira/browse/SPARK-3708) and [SPARK-3834](https://issues.apache.org/jira/browse/SPARK-3834))
+- ColumnValue types do not match in Spark rows vs Hive rows ([SPARK-3704](https://issues.apache.org/jira/browse/SPARK-3704))
+
+#### PySpark
+- Fix sortByKey on empty RDD ([SPARK-4304](https://issues.apache.org/jira/browse/SPARK-4304))
+- Avoid using the same random seed for all partitions ([SPARK-4148](https://issues.apache.org/jira/browse/SPARK-4148))
+- Avoid OOMs when take() is run on empty partitions ([SPARK-3211](https://issues.apache.org/jira/browse/SPARK-3211))
+
+#### MLlib
+- KryoException caused by ALS.trainImplicit in PySpark ([SPARK-3990](https://issues.apache.org/jira/browse/SPARK-3990))
+
+#### Streaming
+- Block replication continuously fails if target is down ([SPARK-3495](https://issues.apache.org/jira/browse/SPARK-3495))
+- Block replication may choose driver as target ([SPARK-3496](https://issues.apache.org/jira/browse/SPARK-3496))
+
+#### GraphX
+- Ensure VertexRDD.apply uses mergeFunc ([SPARK-2062](https://issues.apache.org/jira/browse/SPARK-2062))
+
+### Contributors
+The following developers contributed to this release:
+
+* Andrew Ash - Documentation and bug fixes in Core
+* Andrew Or - Improvements in Core; bug fixes in Windows, Core, Block Manager, and Shuffle
+* Aniket Bhatnagar - Bug fixes in Core and Streaming
+* Benjamin Piering - Improvements in GraphX
+* Bertrand Bossy - Bug fixes in Core
+* Brenden Matthews - Bug fixes in Mesos
+* Chao Chen - Documentation in Core
+* Cheng Hao - Test in SQL
+* Cheng Lian - Bug fixes in PySpark, MLlib, and SQL
+* Chirag Aggarwal - Bug fixes in SQL
+* Chris Cope - Bug fixes in YARN
+* Davies Liu - Improvements in PySpark; bug fixes in Core, SQL, and PySpark
+* Eric Eijkelenboom - Bug fixes in Core
+* Eric Liang - Bug fixes in Core and SQL
+* Eugen Cepoi - Improvements in Core
+* Fei Wang - Improvements in Core and SQL; bug fixes in Core; documentation in Streaming
+* Grega Kespret - Documentation in Core
+* Guoqiang Li - Bug fixes in Web UI
+* Henry Cook - Documentation in Core
+* Hossein Falaki - Bug fixes in Web UI
+* Ian Hummel - Improvements in Core
+* Jakub Dubovsky - Bug fixes in Core
+* Jerry Shao - Bug fixes in Shuffle
+* Jongyoul Lee - Bug fixes in Core and Mesos
+* Josh Rosen - Improvements in Core; bug fixes in Streaming and Core
+* Kousuke Saruta - Improvements in Core and Web UI; bug fixes in Core, Web UI, and PySpark
+* Larry Xiao - Bug fixes in GraphX
+* Lianhui Wang - Bug fixes in GraphX
+* Liang-Chi Hsieh - Bug fixes in Core
+* Lu Lu - Improvements in GraphX
+* Ma Ji - Bug fixes in Streaming
+* Marcelo Vanzin - Bug fixes in YARN
+* Mark Hamstra - Bug fixes in Core
+* Masayoshi Tsuzuki - Improvements in Core, Shell, and PySpark; bug fixes in Windows and PySpark
+* Michael Armbrust - Documentation in Core
+* Michael Griffiths - Bug fixes in PySpark
+* Min Shen - Bug fixes in YARN
+* Mubarak Seyed - Improvements in Streaming
+* Nicholas Chammas - Documentation in Core
+* Niklas Wilcke - Bug fixes in Core
+* Oded Zimerman - Bug fixes in GraphX
+* Reynold Xin - New features in Core; bug fixes in Core and SQL
+* Rongquan Su - Improvements in Streaming
+* Sandy Ryza - Bug fixes in Core
+* Sean Owen - Bug fixes in Java API, Core, and Streaming
+* Shane Knapp - Bug fixes in Core
+* Shixiong Zhu - Improvements in Web UI; bug fixes in Core and YARN
+* Shuo Xiang - Bug fixes in MLlib
+* Tal Sliwowicz - Bug fixes in Core and Block Manager
+* Tao Wang - Improvements and bug fixes in Core
+* Tathagata Das - Improvements in Streaming; bug fixes in Core, Block Manager, and Streaming
+* Xiangrui Meng - Improvements in Web UI and PySpark; bug fixes in Core, MLlib, and PySpark
+* Yantang Zhai - Bug fixes in Core and Web UI
+* Yash Datta - Improvements in SQL
+* Yin Huai - Documentation in Core
+
+_Thanks to everyone who contributed!_
+



