spark-user mailing list archives

From SK <>
Subject correct upgrade process
Date Fri, 01 Aug 2014 18:59:42 GMT


I upgraded from 1.0 to 1.0.1 a couple of weeks ago and have been able to use
some of the features advertised in 1.0.1. However, I still get compilation
errors in some cases, even though, based on user responses, these errors were
fixed in 1.0.1, so I should not be seeing them. I therefore want to make sure
I followed the correct upgrade process, as below (I am running Spark on a
single machine in standalone mode, so no cluster):

- set SPARK_HOME to the new version

- run "sbt assembly" in SPARK_HOME to build the new Spark jars

- in the project's sbt build file, point the libraryDependencies for
spark-core and the other Spark libraries to the 1.0.1 version, and run "sbt
assembly" to build the project jar.
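For reference, the dependency change in the last step would look something
like the sketch below. This is only an illustration of what I did, not a
complete build file; the project name, Scala version, and the "provided"
scope are specific to my setup and may differ for others.

```scala
// build.sbt -- minimal sketch of bumping the Spark dependencies to 1.0.1.
// "provided" keeps the Spark classes out of the assembled jar, since the
// standalone Spark installation supplies them at runtime.
name := "my-spark-app"

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  // spark-core and any other Spark modules the project uses,
  // all pointed at the same 1.0.1 version
  "org.apache.spark" %% "spark-core"      % "1.0.1" % "provided",
  "org.apache.spark" %% "spark-streaming" % "1.0.1" % "provided"
)
```

After editing this file I ran "sbt assembly" again to rebuild the project jar
against the new Spark version.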

Is there anything else I need to do to ensure that no old jars are being
used? For example do I need to manually delete any old jars?

