mahout-dev mailing list archives

From Pat Ferrel <>
Subject Question about Spark versions
Date Thu, 26 Feb 2015 18:23:07 GMT
Spark releases every few weeks. In the meantime some users will have chosen a version to stay
with for a while. Now that we are moving to 1.2.1, what does that mean for users who are working
with the version of Mahout that uses Spark 1.1.0?

Should we be releasing or tagging builds to sync with Spark versions? Otherwise we may be
creating a headache for users. I say this because one of my clients is on Spark 1.1.0 and
is hesitant to upgrade. Since there has been no release or tag, we are giving no guidance about
which point in Mahout to use.
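
A sketch of what the lightweight tagging option could look like (tag name and message are made up here, not an agreed convention; for illustration this sets up a throwaway repo, but in practice the tag would go in the Mahout tree itself):

```shell
# Hypothetical example of an annotated tag recording which Spark
# version a given point of the tree was built against.
# Demo setup only: create a disposable repo with one commit.
demo=$(mktemp -d) && cd "$demo"
git init -q
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "move build to Spark 1.2.1"

# The actual suggestion: tag the move and annotate with the Spark version.
git tag -a spark-1.2.1 -m "Mahout head as of the move to Spark 1.2.1"

# Users on the older Spark could then check out the tag preceding the move.
git tag -n1 spark-1.2.1
```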

I guess a lightweight thing to do would be to tag every time we move to a new build of Spark
and annotate the tag with the Spark version. The harder thing would be to support multiple
versions in the poms, as we do for Hadoop. This is probably going to be required at some
point, right?
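
For the harder option, a pom fragment along the lines of the Hadoop version switching might look like this (profile id, property name, and versions here are hypothetical, not existing Mahout build options):

```xml
<!-- Hypothetical sketch: drive the Spark dependency from a property,
     with a profile to pin the older release, as the Hadoop versions
     are switched today. -->
<properties>
  <spark.version>1.2.1</spark.version>
</properties>

<profiles>
  <profile>
    <id>spark-1.1</id>
    <properties>
      <spark.version>1.1.0</spark.version>
    </properties>
  </profile>
</profiles>

<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>${spark.version}</version>
  </dependency>
</dependencies>
```

A user stuck on the older Spark would build with `mvn -Pspark-1.1 install`, while the default build tracks the current version.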