spark-user mailing list archives

From Jean-Charles RISCH <>
Subject Difference between library dependency versions
Date Thu, 04 Jun 2015 09:53:51 GMT
*(Context: I use IntelliJ IDEA 14.0.1, SBT and Scala 2.11.6.)*

This morning, I was trying to resolve the "Failed to locate the winutils
binary in the hadoop binary path" error.

I noticed that I can solve it by configuring my build.sbt as follows:


libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "1.0.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1" excludeAll(
  ExclusionRule(organization = "org.apache.hadoop")
)

libraryDependencies += "org.apache.spark" %% "spark-mllib" % "1.3.1" excludeAll(
  ExclusionRule(organization = "org.apache.hadoop")
)

but if I change the line

libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "1.0.4"

to

libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.7.0"

the error comes back.
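For reference, a workaround often suggested for this error (this is an assumption on my part, not something confirmed in this thread) is to point hadoop.home.dir at a directory that contains bin\winutils.exe before the SparkContext is created. The path "C:\\hadoop" below is purely illustrative:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object WinutilsWorkaround {
  def main(args: Array[String]): Unit = {
    // Assumed layout: C:\hadoop\bin\winutils.exe must exist on disk.
    // Setting this property before SparkContext creation is the usual
    // workaround; the equivalent HADOOP_HOME environment variable also works.
    System.setProperty("hadoop.home.dir", "C:\\hadoop")

    val sc = new SparkContext(
      new SparkConf().setAppName("winutils-sketch").setMaster("local[*]"))
    // ... Spark code that would otherwise trigger the winutils error ...
    sc.stop()
  }
}
```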

What does this mean? Is Spark built against an old version of Hadoop? I really
want to understand.

*Also, a bonus question:*
As you can see, I am using Spark 1.3.1 and the spark-mllib APIs. I am using the
latest version, but my APIs do not correspond to the latest official APIs.

For example, to run a KMeans algorithm, I have to use KMeans.train(), which
does not exist in the latest API.
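To make the question concrete, here is a minimal sketch of how KMeans.train() is called in the RDD-based org.apache.spark.mllib API of spark-mllib 1.3.1; the toy data and parameter values are my own illustrative assumptions:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.clustering.KMeans
import org.apache.spark.mllib.linalg.Vectors

object KMeansSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("kmeans-sketch").setMaster("local[*]"))

    // Toy data: four 2-D points forming two obvious clusters.
    val data = sc.parallelize(Seq(
      Vectors.dense(0.0, 0.0), Vectors.dense(0.1, 0.1),
      Vectors.dense(9.0, 9.0), Vectors.dense(9.1, 9.1)))

    // KMeans.train(data, k, maxIterations) returns a KMeansModel.
    val model = KMeans.train(data, k = 2, maxIterations = 20)

    model.clusterCenters.foreach(println)
    sc.stop()
  }
}
```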

This is the first time I have asked something on the mailing list; I hope I am
using it correctly. Sorry for my bad English.

Thank you and have a good day,

