[Repost to mailing list]
Sorry about the typo, I of course meant hadoop-2.6, not 2.11.
I suspect something bad happened with my Ivy cache: when reverting back to Scala 2.10, I got a very strange IllegalStateException (something about an IvyNode, I can't remember the details).
Killing the cache made 2.10 work at least; I'll retry with 2.11.
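(Concretely, by "killing the cache" I just mean wiping the local Ivy cache that sbt resolves into — on a default setup something along the lines of:

    rm -rf ~/.ivy2/cache    # default local Ivy cache; sbt re-downloads all dependencies on the next build

The exact path is an assumption and may differ if you've relocated your Ivy home.)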
Thx for your help
Adrian: likely you were using Maven. Jakob's report was with sbt.
Cheers

On Tue, Oct 13, 2015 at 10:05 PM, Adrian Tanase <email@example.com> wrote:

Do you mean hadoop-2.4 or 2.6? Not sure if this is the issue, but I'm also compiling the 1.5.1 version with Scala 2.11 and Hadoop 2.6 and it works.
On 14 Oct 2015, at 03:53, Jakob Odersky <firstname.lastname@example.org> wrote:
I'm having trouble compiling Spark with SBT for Scala 2.11. The command I use is:

followed by

in the sbt shell.
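(The commands themselves got mangled in the repost. For context, the documented sbt route for a Scala 2.11 build of Spark master at the time was roughly the following — an assumed reconstruction, not a quote from the original message:

    # assumed reconstruction of the build commands, not the original text
    ./dev/change-scala-version.sh 2.11
    ./build/sbt -Pyarn -Phadoop-2.6 -Dscala-2.11

with compile then run from the sbt shell.)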
The error I get specifically is:
spark/core/src/main/scala/org/apache/spark/rpc/netty/NettyRpcEnv.scala:308: no valid targets for annotation on value conf - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[error] private[netty] class NettyRpcEndpointRef(@transient conf: SparkConf)
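(For reference, the change the compiler is hinting at would look roughly like the sketch below — a hypothetical illustration of its @(transient @param) suggestion, not the actual Spark patch:

    package org.apache.spark.rpc.netty

    import scala.annotation.meta.param
    import org.apache.spark.SparkConf

    // Retarget @transient at the constructor parameter via a meta-annotation,
    // so scalac no longer discards the annotation as having no valid target.
    private[netty] class NettyRpcEndpointRef(@(transient @param) conf: SparkConf)

)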
However, I am also getting a large number of deprecation warnings, making me wonder if I am supplying some incompatible/unsupported options to sbt. I am using Java 1.8 and the latest Spark master sources. Does anyone know if I am doing anything wrong, or is the sbt build broken?

thanks for your help,
--Jakob