systemml-dev mailing list archives

From "Matthias Boehm1" <Matthias.Boe...@ibm.com>
Subject Re: Minimum required Spark version
Date Mon, 20 Feb 2017 22:29:40 GMT

That's a good catch, Felix! I would recommend converting this exception
into a warning and moving the check to a central place like
SparkExecutionContext to ensure consistency across all APIs and
deployments.
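
For illustration, a centralized warning-only check could look roughly
like this (a sketch under assumptions: the class name, method name, and
logger setup are hypothetical, and the 2.0.0 minimum is taken from the
proposal quoted below, not from the actual SystemML code):

    import org.apache.commons.logging.Log;
    import org.apache.commons.logging.LogFactory;

    public class SparkVersionCheck {
      private static final Log LOG =
        LogFactory.getLog(SparkVersionCheck.class);
      private static final String MIN_SUPPORTED = "2.0.0"; // assumed

      // One entry point shared by all APIs (MLContext, command line,
      // ...), so every deployment warns consistently instead of
      // failing hard on an unexpected Spark version.
      public static void verifySparkVersion(String sparkVersion) {
        if (compareVersions(sparkVersion, MIN_SUPPORTED) < 0) {
          LOG.warn("Spark " + sparkVersion + " is below the minimum"
            + " supported version " + MIN_SUPPORTED
            + "; some features may not work as expected.");
        }
      }

      // Numeric comparison of dotted versions, e.g. "2.0.2" < "2.1.0";
      // assumes plain numeric components (no "-SNAPSHOT" qualifiers).
      private static int compareVersions(String a, String b) {
        String[] pa = a.split("\\."), pb = b.split("\\.");
        int n = Math.max(pa.length, pb.length);
        for (int i = 0; i < n; i++) {
          int va = i < pa.length ? Integer.parseInt(pa[i]) : 0;
          int vb = i < pb.length ? Integer.parseInt(pb[i]) : 0;
          if (va != vb)
            return Integer.compare(va, vb);
        }
        return 0;
      }
    }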

Regards,
Matthias




From:	Deron Eriksson <deroneriksson@gmail.com>
To:	dev@systemml.incubator.apache.org
Date:	02/20/2017 02:14 PM
Subject:	Re: Minimum required Spark version



Hi Felix,

I agree that the hard 2.1 requirement is a bit restrictive. If someone can
validate that the 2.0.x Spark releases (versions at or above 2.0 but below
2.1) work, this seems like a great idea to me.
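
A minimal way to express that validation as a version-range check (a
sketch only; the class and method names are hypothetical, not the actual
MLContext code, and plain numeric versions like "2.0.2" are assumed):

    public class SparkVersionRange {
      // Accepts any 2.0.x or 2.1.x release.
      public static boolean inSupportedRange(String version) {
        String[] p = version.split("\\.");
        int major = Integer.parseInt(p[0]);
        int minor = Integer.parseInt(p[1]);
        return major == 2 && (minor == 0 || minor == 1);
      }

      public static void main(String[] args) {
        System.out.println(inSupportedRange("2.0.2")); // true
        System.out.println(inSupportedRange("2.1.0")); // true
        System.out.println(inSupportedRange("1.6.3")); // false
      }
    }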

Deron


On Mon, Feb 20, 2017 at 1:43 PM, <fschueler@posteo.de> wrote:

> Hi,
>
> The current master and the 0.13 release have a hard requirement in
> MLContext for Spark 2.1. Is this really necessary, or could we set it
> to >= 2.0? Only supporting the latest Spark release seems a little
> restrictive to me.
>
>
> -Felix
>



--
Deron Eriksson
Spark Technology Center
http://www.spark.tc/


