spark-user mailing list archives

From Sanjay Subramanian <sanjaysubraman...@yahoo.com.INVALID>
Subject Re: Spark inside Eclipse
Date Sat, 04 Oct 2014 01:11:47 GMT
So some progress but still errors 

import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]) {
    if (args.length < 1) {
      System.err.println("Usage: WordCount <file>")
      System.exit(1)
    }
    val conf = new SparkConf().setMaster("local").setAppName("Whatever")
    val sc = new SparkContext(conf)
    val file = args(0)
    // split each line on non-word characters and count occurrences of each word
    val counts = sc.textFile(file)
      .flatMap(line => line.split("\\W"))
      .map(word => (word, 1))
      .reduceByKey((v1, v2) => v1 + v2)
    counts.take(10).foreach(println)
  }
}

The errors I am getting are:

14/10/03 18:09:17 INFO spark.SecurityManager: Changing view acls to: sansub01
14/10/03 18:09:17 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(sansub01)
Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class
    at akka.util.Collections$EmptyImmutableSeq$.<init>(Collections.scala:15)
    at akka.util.Collections$EmptyImmutableSeq$.<clinit>(Collections.scala)
    at akka.japi.Util$.immutableSeq(JavaAPI.scala:209)
    at akka.actor.ActorSystem$Settings.<init>(ActorSystem.scala:150)
    at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:470)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
    at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:104)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:152)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:202)
    at com.roberthalf.common.utils.WordCount$.main(WordCount.scala:14)
    at com.roberthalf.common.utils.WordCount.main(WordCount.scala)
Caused by: java.lang.ClassNotFoundException: scala.collection.GenTraversableOnce$class
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 12 more
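A NoClassDefFoundError on scala.collection.GenTraversableOnce$class typically indicates a Scala binary version mismatch: the Scala library on the Eclipse build path is a different major version than the one the Spark and Akka jars were compiled against (Spark 1.x artifacts named spark-core_2.10 expect Scala 2.10.x). That diagnosis is an inference from the stack trace, not something confirmed in this thread. A minimal sbt sketch of a consistent dependency setup, assuming sbt is the build tool (the actual build file is not shown here):

// hypothetical build.sbt -- the key point is that scalaVersion matches
// the Scala version in the spark-core artifact name (_2.10)
name := "wordcount"

scalaVersion := "2.10.4"

// %% appends the Scala binary version, resolving spark-core_2.10
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0"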
I am gonna keep working to solve this. Meanwhile if you can provide some guidance that would be cool.

sanjay

From: Daniel Siegmann <daniel.siegmann@velos.io>
To: Ashish Jain <ashish.sdr@gmail.com>
Cc: Sanjay Subramanian <sanjaysubramanian@yahoo.com>; "user@spark.apache.org" <user@spark.apache.org>
Sent: Thursday, October 2, 2014 6:52 AM
Subject: Re: Spark inside Eclipse

You don't need to do anything special to run in local mode from within Eclipse. Just create
a simple SparkConf and create a SparkContext from that. I have unit tests which execute on
a local SparkContext, and they work from inside Eclipse as well as SBT.

val conf = new SparkConf().setMaster("local").setAppName("Whatever")
val sc = new SparkContext(conf)

Keep in mind you can only have one local SparkContext at a time, otherwise you will get some
weird errors. If you have tests running sequentially, make sure to close the SparkContext
in your tear down method. If tests run in parallel you'll need to share the SparkContext between
tests.

For unit testing, you can make use of SparkContext.parallelize to set up your test inputs
and RDD.collect to retrieve the outputs.
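A minimal ScalaTest-style sketch of that pattern (assuming scalatest is on the test classpath; the suite name, input, and assertion are illustrative, not from this thread):

import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.{BeforeAndAfterAll, FunSuite}

class WordCountSuite extends FunSuite with BeforeAndAfterAll {

  private var sc: SparkContext = _

  override def beforeAll() {
    sc = new SparkContext(new SparkConf().setMaster("local").setAppName("test"))
  }

  override def afterAll() {
    sc.stop() // only one local SparkContext may be live at a time
  }

  test("word count") {
    val counts = sc.parallelize(Seq("a b a")) // set up the test input
      .flatMap(_.split("\\W+"))
      .map((_, 1))
      .reduceByKey(_ + _)
      .collect()                              // retrieve the outputs
      .toMap
    assert(counts("a") === 2)
  }
}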




On Wed, Oct 1, 2014 at 7:43 PM, Ashish Jain <ashish.sdr@gmail.com> wrote:

Hello Sanjay,

This can be done, and is a very effective way to debug.

1) Compile and package your project to get a fat jar
2) In your SparkConf use setJars and give the location of this jar. Also set your master here as local in SparkConf (see the sketch after this list)
3) Use this SparkConf when creating JavaSparkContext
4) Debug your program like you would any normal program.
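A sketch of steps 2 and 3 in Scala (the thread mentions JavaSparkContext, the Java equivalent; the jar path below is hypothetical):

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setMaster("local")                                 // step 2: master set to local
  .setAppName("WordCount")
  .setJars(Seq("target/wordcount-assembly-0.1.jar"))  // step 2: location of the fat jar
val sc = new SparkContext(conf)                       // step 3: context created from this conf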
Hope this helps.
Thanks
Ashish

On Oct 1, 2014 4:35 PM, "Sanjay Subramanian" <sanjaysubramanian@yahoo.com.invalid> wrote:

hey guys

Is there a way to run Spark in local mode from within Eclipse? I am running Eclipse Kepler
on a MacBook Pro with Mavericks. Like one can run hadoop map/reduce applications from within
Eclipse and debug and learn.

thanks
sanjay




-- 
Daniel Siegmann, Software Developer
Velos
Accelerating Machine Learning

440 NINTH AVENUE, 11TH FLOOR, NEW YORK, NY 10001
E: daniel.siegmann@velos.io W: www.velos.io

  