crunch-dev mailing list archives

From Josh Wills <>
Subject spark/scala version changes for 0.10.0/0.8.3
Date Fri, 25 Apr 2014 20:45:54 GMT
Hey all,

I've been working on a new release candidate, and I think we're going to
need to upgrade to Spark 0.9.1 in order for Crunch-on-Spark to work in
Spark standalone/Spark-on-YARN on top of Hadoop, as opposed to just the
local mode that works now. There are a couple of implications to this, but
the big one is that Spark 0.9.1 is only built against Scala 2.10, which
means we would need to switch over to 2.10 for at least crunch-spark, and
it seems much simpler to me to upgrade Scrunch to 2.10 as well.

A couple of options here, off the top of my head:
1) Do the upgrade to Scala 2.10/Spark 0.9.1 in 0.10.0, but not 0.8.3.
2) Do the upgrade in both 0.8.3 and 0.10.0.
3) Only upgrade crunch-spark to Scala 2.10/Spark 0.9.1 (in either 0.8.3 or
0.10.0), but still have Scrunch build against 2.9.3.
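For options 1 and 2, the change mostly amounts to bumping the Scala and Spark versions in the build. A minimal sketch of what that could look like in a Maven pom.xml (the property names here are illustrative, not necessarily the ones Crunch's actual pom uses):

```
<properties>
  <!-- Scala 2.10 is binary-incompatible with 2.9.x, so both the
       compiler version and the artifact suffix have to change -->
  <scala.base.version>2.10</scala.base.version>
  <scala.version>2.10.3</scala.version>
  <spark.version>0.9.1</spark.version>
</properties>

<dependencies>
  <!-- Spark artifacts carry the Scala binary version in the artifactId -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_${scala.base.version}</artifactId>
    <version>${spark.version}</version>
  </dependency>
</dependencies>
```

The catch with option 3 is exactly that artifact-suffix issue: a 2.9.3-built Scrunch and a 2.10-built crunch-spark couldn't share a Scala runtime on the same classpath.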

I'm in favor of #2 myself, but I'm open to suggestions here, and would
especially like to know if anyone feels strongly about staying on 2.9.3 for
any reason.


Director of Data Science
Cloudera <>
Twitter: @josh_wills <>
