spark-user mailing list archives

From Milos Nikolic <milos.nikoli...@gmail.com>
Subject Spark build error
Date Wed, 11 Dec 2013 16:57:36 GMT
Hello,

I'm facing the following problem when trying to compile Spark 0.8.0 (sbt/sbt assembly) on
Solaris.

[info] Compiling 247 Scala sources and 11 Java sources to /export/home/mnikolic/spark-0.8.0-incubating/core/target/scala-2.9.3/classes...
...
[error] /export/home/mnikolic/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/SparkContext.scala:58:
StandaloneSchedulerBackend is not a member of org.apache.spark.scheduler.cluster
[error] import org.apache.spark.scheduler.cluster.{StandaloneSchedulerBackend, SparkDeploySchedulerBackend,
[error]        ^
[error] /export/home/mnikolic/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/SparkContext.scala:61:
object mesos is not a member of package org.apache.spark.scheduler
[error] import org.apache.spark.scheduler.mesos.{CoarseMesosSchedulerBackend, MesosSchedulerBackend}
[error]                                   ^

Both sbt and Maven fail at the same point. I tried compiling with Java 1.6 and 1.7, and also with the newest version of sbt; it failed in all cases.
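In case it helps with diagnosis, here is a small sketch of how the first compiler errors can be pulled out of a captured build log for sharing (the log file and its contents below are stand-ins reconstructed from the output above, not an actual capture; substitute your own, e.g. `sbt/sbt assembly 2>&1 | tee build.log`):

```shell
# Write a stand-in build log with the error lines reported above.
cat > build.log <<'EOF'
[info] Compiling 247 Scala sources and 11 Java sources...
[error] SparkContext.scala:58: StandaloneSchedulerBackend is not a member of org.apache.spark.scheduler.cluster
[error] SparkContext.scala:61: object mesos is not a member of package org.apache.spark.scheduler
EOF

# Show only the first few compiler errors from the log.
grep '^\[error\]' build.log | head -n 5
```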


Thanks in advance,
Milos

