mahout-user mailing list archives

From Trevor Grant <trevor.d.gr...@gmail.com>
Subject Re: [DISCUSS] How many binary combos do we want to release?
Date Mon, 10 Jul 2017 23:56:02 GMT
From the Spark website:

"Note: Starting version 2.0, Spark is built with Scala 2.11 by default.
Scala 2.10 users should download the Spark source package and build with
Scala 2.10 support."

Given that, the minimum set (imho) would be:

Spark-1.6, Scala-2.10, viennacl, viennacl-omp
Spark-2.0, Scala-2.11, viennacl, viennacl-omp
Spark-2.1, Scala-2.11, viennacl, viennacl-omp

It has been pointed out that our Spark-2.0 build may cover all of Spark
2.x, but I haven't tested that.
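For what it's worth, producing that minimum set could be scripted as a simple loop over the combos. This is only a sketch: the profile names (`spark-1.6`, `scala-2.10`, `viennacl`, `viennacl-omp`) are assumptions and would need to match whatever the POM actually defines, so the `mvn` invocation is left commented out.

```shell
#!/bin/sh
# Hypothetical release loop: one binary tarball per Spark/Scala combo,
# each with both ViennaCL variants. Profile names are placeholders;
# check the real Mahout POM before using this.
set -e

build_combo() {
  spark_profile="$1"   # e.g. spark-2.1
  scala_profile="$2"   # e.g. scala-2.11
  echo "Building Mahout with ${spark_profile} / ${scala_profile} + ViennaCL..."
  # mvn clean package -P"${spark_profile}" -P"${scala_profile}" \
  #     -Pviennacl -Pviennacl-omp -DskipTests
}

build_combo spark-1.6 scala-2.10
build_combo spark-2.0 scala-2.11
build_combo spark-2.1 scala-2.11
```

The point being that three (or six) combos is still manageable by hand or in CI; twenty-four starts to argue for automation either way.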




On Mon, Jul 10, 2017 at 5:51 PM, Andrew Palumbo <ap.dev@outlook.com> wrote:

> Awesome!
>
>
> One point:
>
>
> INFRA may have an issue here. And we may need to move some of the older
> releases to the archives...
>
>
> We have a waiver for the standard 200MB cap, which should still be in
> place. But if you start to notice that you're having trouble uploading
> artifacts to the staging area, it may be that we've blown the cap.
> Please let me know if this happens, and I'll figure out what needs to be
> done.
>
>
> Thanks
>
>
> --andy
>
> ________________________________
> From: Trevor Grant <trevor.d.grant@gmail.com>
> Sent: Monday, July 10, 2017 1:30:46 PM
> To: Mahout Dev List
> Subject: [DISCUSS] How many binary combos do we want to release?
>
> In 0.13.1 we had one binary tarball.
>
> A full spread would look something like this in 0.13.2-
>
> Spark-1.6, Scala-2.10
> Spark-2.0, Scala-2.10
> Spark-2.1, Scala-2.10
> Spark-1.6, Scala-2.11
> Spark-2.0, Scala-2.11
> Spark-2.1, Scala-2.11
>
> Spark-1.6, Scala-2.10, viennacl
> Spark-2.0, Scala-2.10, viennacl
> Spark-2.1, Scala-2.10, viennacl
> Spark-1.6, Scala-2.11, viennacl
> Spark-2.0, Scala-2.11, viennacl
> Spark-2.1, Scala-2.11, viennacl
>
> Spark-1.6, Scala-2.10, viennacl-omp
> Spark-2.0, Scala-2.10, viennacl-omp
> Spark-2.1, Scala-2.10, viennacl-omp
> Spark-1.6, Scala-2.11, viennacl-omp
> Spark-2.0, Scala-2.11, viennacl-omp
> Spark-2.1, Scala-2.11, viennacl-omp
>
> Spark-1.6, Scala-2.10, viennacl, viennacl-omp
> Spark-2.0, Scala-2.10, viennacl, viennacl-omp
> Spark-2.1, Scala-2.10, viennacl, viennacl-omp
> Spark-1.6, Scala-2.11, viennacl, viennacl-omp
> Spark-2.0, Scala-2.11, viennacl, viennacl-omp
> Spark-2.1, Scala-2.11, viennacl, viennacl-omp
>
> That's 24 tarballs of pre-compiled binaries.
>
> The main thing I'm concerned about is getting all combos of spark/scala,
> viennacl/scala, viennacl-omp/scala into Maven repositories.  This can be
> accomplished with 6 tarballs:
>
> Spark-1.6, Scala-2.10, viennacl, viennacl-omp
> Spark-2.0, Scala-2.10, viennacl, viennacl-omp
> Spark-2.1, Scala-2.10, viennacl, viennacl-omp
> Spark-1.6, Scala-2.11, viennacl, viennacl-omp
> Spark-2.0, Scala-2.11, viennacl, viennacl-omp
> Spark-2.1, Scala-2.11, viennacl, viennacl-omp
>
>
> Not all users want ViennaCL (I would imagine). A compromise might be the
> first and last 6 combinations:
>
> Spark-1.6, Scala-2.10
> Spark-2.0, Scala-2.10
> Spark-2.1, Scala-2.10
> Spark-1.6, Scala-2.11
> Spark-2.0, Scala-2.11
> Spark-2.1, Scala-2.11
>
> Spark-1.6, Scala-2.10, viennacl, viennacl-omp
> Spark-2.0, Scala-2.10, viennacl, viennacl-omp
> Spark-2.1, Scala-2.10, viennacl, viennacl-omp
> Spark-1.6, Scala-2.11, viennacl, viennacl-omp
> Spark-2.0, Scala-2.11, viennacl, viennacl-omp
> Spark-2.1, Scala-2.11, viennacl, viennacl-omp
>
> Thoughts?
>
