spark-dev mailing list archives

From Steve Loughran <ste...@hortonworks.com>
Subject Re: Straw poll: dropping support for things like Scala 2.10
Date Thu, 27 Oct 2016 08:19:00 GMT

On 27 Oct 2016, at 10:03, Sean Owen <sowen@cloudera.com> wrote:

Seems OK by me.
How about Hadoop < 2.6 and Python 2.6? Those seem more removable. I'd like to add them to
a list of things that will begin to be unsupported 6 months from now.


If you go to Java 8 only, then Hadoop 2.6+ is mandatory.


On Wed, Oct 26, 2016 at 8:49 PM Koert Kuipers <koert@tresata.com> wrote:
that sounds good to me

On Wed, Oct 26, 2016 at 2:26 PM, Reynold Xin <rxin@databricks.com> wrote:
We can do the following concrete proposal:

1. Plan to remove support for Java 7 / Scala 2.10 in Spark 2.2.0 (Mar/Apr 2017).

2. In Spark 2.1.0 release, aggressively and explicitly announce the deprecation of Java 7
/ Scala 2.10 support.

(a) It should appear in the release notes and in any documentation that describes how to build Spark,

(b) and a warning should be shown every time SparkContext is started using Scala 2.10 or Java 7.
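
For illustration, a minimal sketch of how such a startup warning could be emitted. This is an assumption, not Spark's actual code: the object name, the log callback, and the exact message wording are hypothetical, and the real check Spark ships may look different.

object DeprecationCheck {
  // Hypothetical helper: warn when the driver runs on a deprecated
  // platform version (Java 7 or Scala 2.10).
  def warnIfDeprecated(logWarning: String => Unit): Unit = {
    // "java.specification.version" is "1.7" on Java 7, "1.8" on Java 8.
    val javaVersion = System.getProperty("java.specification.version")
    if (javaVersion == "1.7") {
      logWarning("Support for Java 7 is deprecated as of Spark 2.1.0 " +
        "and will be removed in Spark 2.2.0.")
    }
    // scala.util.Properties.versionNumberString is e.g. "2.10.6".
    if (scala.util.Properties.versionNumberString.startsWith("2.10")) {
      logWarning("Support for Scala 2.10 is deprecated as of Spark 2.1.0 " +
        "and will be removed in Spark 2.2.0.")
    }
  }
}

Calling something like DeprecationCheck.warnIfDeprecated(...) once during SparkContext initialization would put the message in the driver logs on every application start, which is what (b) asks for.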


