spark-dev mailing list archives

From Olivier Girardot <ssab...@gmail.com>
Subject Re: [VOTE] Release Apache Spark 1.4.0 (RC2)
Date Mon, 25 May 2015 17:42:48 GMT
I've just tested the new window functions using PySpark in the Spark 1.4.0
rc2 distribution for hadoop 2.4 with and without hive support.
It works well with the Hive-support-enabled distribution and fails as
expected on the other one (with an explicit error: "Could not resolve
window function 'lead'. Note that, using window functions currently
requires a HiveContext").

Thank you for your work.

Regards,

Olivier.

On Mon, May 25, 2015 at 11:25 AM, Wang, Daoyuan <daoyuan.wang@intel.com>
wrote:

> Good catch! BTW, SPARK-6784 is a duplicate of SPARK-7790; I didn't notice we
> had changed the title of SPARK-7853.
>
>
> -----Original Message-----
> From: Cheng, Hao [mailto:hao.cheng@intel.com]
> Sent: Monday, May 25, 2015 4:47 PM
> To: Sean Owen; Patrick Wendell
> Cc: dev@spark.apache.org
> Subject: RE: [VOTE] Release Apache Spark 1.4.0 (RC2)
>
> Adding another blocker issue, just created! It seems to be a regression.
>
> https://issues.apache.org/jira/browse/SPARK-7853
>
>
> -----Original Message-----
> From: Sean Owen [mailto:sowen@cloudera.com]
> Sent: Monday, May 25, 2015 3:37 PM
> To: Patrick Wendell
> Cc: dev@spark.apache.org
> Subject: Re: [VOTE] Release Apache Spark 1.4.0 (RC2)
>
> We still have 1 blocker for 1.4:
>
> SPARK-6784 Make sure values of partitioning columns are correctly
> converted based on their data types
>
> CC Davies Liu / Adrian Wang to check on the status of this.
>
> There are still 50 Critical issues tagged for 1.4, and 183 issues targeted
> for 1.4 in general. Obviously almost all of those won't make it into 1.4.
> How do people want to deal with those? The field can be cleared, but do
> people want to take a pass at bumping to 1.4.1 the few that really are
> supposed to go into 1.4.1?
>
>
> On Sun, May 24, 2015 at 8:22 AM, Patrick Wendell <pwendell@gmail.com>
> wrote:
> > Please vote on releasing the following candidate as Apache Spark version
> 1.4.0!
> >
> > The tag to be voted on is v1.4.0-rc2 (commit 03fb26a3):
> > https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=03fb26a
> > 3e50e00739cc815ba4e2e82d71d003168
> >
> > The release files, including signatures, digests, etc. can be found at:
> > http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc2-bin/
> >
> > Release artifacts are signed with the following key:
> > https://people.apache.org/keys/committer/pwendell.asc
> >
> > The staging repository for this release can be found at:
> > [published as version: 1.4.0]
> > https://repository.apache.org/content/repositories/orgapachespark-1103
> > /
> > [published as version: 1.4.0-rc2]
> > https://repository.apache.org/content/repositories/orgapachespark-1104
> > /
> >
> > The documentation corresponding to this release can be found at:
> > http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc2-docs
> > /
> >
> > Please vote on releasing this package as Apache Spark 1.4.0!
> >
> > The vote is open until Wednesday, May 27, at 08:12 UTC and passes if a
> > majority of at least 3 +1 PMC votes are cast.
> >
> > [ ] +1 Release this package as Apache Spark 1.4.0
> > [ ] -1 Do not release this package because ...
> >
> > To learn more about Apache Spark, please see http://spark.apache.org/
> >
> > == What has changed since RC1 ==
> > Below is a list of bug fixes that went into this RC:
> > http://s.apache.org/U1M
> >
> > == How can I help test this release? ==
> > If you are a Spark user, you can help us test this release by taking a
> > Spark 1.3 workload and running it on this release candidate, then
> > reporting any regressions.
> >
> > == What justifies a -1 vote for this release? ==
> > This vote is happening towards the end of the 1.4 QA period, so -1 votes
> > should only occur for significant regressions from 1.3.1.
> > Bugs already present in 1.3.X, minor regressions, or bugs related to
> > new features will not block this release.
> >
> > ---------------------------------------------------------------------
> > To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> > For additional commands, e-mail: dev-help@spark.apache.org
> >
>
>
>
