spark-user mailing list archives

From Adrian Tanase
Subject Re: Building with SBT and Scala 2.11
Date Wed, 14 Oct 2015 17:09:29 GMT
You are correct, of course. I gave up on sbt for Spark long ago; I never managed to get it working,
while mvn works great.

Sent from my iPhone

On 14 Oct 2015, at 16:52, Ted Yu wrote:

Likely you were using maven.

Jakob's report was with sbt.


On Tue, Oct 13, 2015 at 10:05 PM, Adrian Tanase wrote:
Do you mean hadoop-2.4 or 2.6? Not sure if this is the issue, but I'm also compiling the 1.5.1
version with Scala 2.11 and Hadoop 2.6, and it works.
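For reference, here is a sketch of the invocation that works for me. The `-Phadoop-2.11` in your command looks like a mix-up between the Hadoop and Scala versions: Spark's build profiles are named after Hadoop releases (`hadoop-2.4`, `hadoop-2.6`, ...), not Scala ones. Exact profile names and scripts are from my 1.5.x checkout, so treat this as a sketch rather than gospel:

```shell
# Sketch of a Scala 2.11 / Hadoop 2.6 build from a Spark 1.5.x source tree.
# Adjust the hadoop-* profile to match your cluster.

# Switch the build to Scala 2.11 first (this rewrites the POMs):
./dev/change-scala-version.sh 2.11

# Build with Maven -- note the profile is hadoop-2.6, not hadoop-2.11:
build/mvn -Pyarn -Phadoop-2.6 -Dscala-2.11 -DskipTests clean package
```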


Sent from my iPhone

On 14 Oct 2015, at 03:53, Jakob Odersky wrote:

I'm having trouble compiling Spark with SBT for Scala 2.11. The command I use is:

    build/sbt -Pyarn -Phadoop-2.11 -Dscala-2.11

followed by


in the sbt shell.

The error I get specifically is:

spark/core/src/main/scala/org/apache/spark/rpc/netty/NettyRpcEnv.scala:308: no valid targets
for annotation on value conf - it is discarded unused. You may specify targets with meta-annotations,
e.g. @(transient @param)
[error] private[netty] class NettyRpcEndpointRef(@transient conf: SparkConf)
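For what it's worth, the error can be reproduced on a tiny class outside Spark (this is a hypothetical minimal example, not the Spark source): under Scala 2.11, an annotation like `@transient` placed on a plain constructor parameter is discarded with exactly this warning, because `@transient` targets fields and the parameter is not a field. The meta-annotation form the compiler suggests redirects the annotation to the parameter symbol:

```scala
import scala.annotation.meta.param

// Hypothetical minimal reproduction: in Scala 2.11 this emits
// "no valid targets for annotation on value conf - it is discarded unused",
// since @transient applies to fields and `conf` is only a constructor parameter.
class Endpoint(@transient conf: String)

// The compiler's suggested fix: target the parameter explicitly with the
// @param meta-annotation, so the annotation is no longer discarded.
class FixedEndpoint(@(transient @param) conf: String)
```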

However, I am also getting a large number of deprecation warnings, which makes me wonder whether
I am supplying incompatible or unsupported options to sbt. I am using Java 1.8 and the latest
Spark master sources.
Does anyone know if I am doing something wrong, or is the sbt build broken?

thanks for your help,
