mahout-dev mailing list archives

From Andrew Musselman <andrew.mussel...@gmail.com>
Subject Re: Spark shell broken
Date Tue, 24 Feb 2015 19:04:11 GMT
I'll try that out, make the branch, and push the last commit to master.

On Tue, Feb 24, 2015 at 10:55 AM, Pat Ferrel <pat@occamsmachete.com> wrote:

> To be safe I’d “git reset --hard xyz” to the commit previous to the 1.2.1 upgrade.
>
> I merged a big commit with this and upgraded my cluster to 1.2.1, so I’ll
> stick with this for a bit.
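The branch-and-reset recovery described above can be sketched end to end in a throwaway repo; the branch name and commit messages below are illustrative, not the real Mahout history.

```shell
# Keep the Spark 1.2.1 work on its own branch, then move the main branch
# back to the commit before the upgrade.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q .
git config user.email "dev@example.com" && git config user.name "dev"
git commit -q --allow-empty -m "pre-1.2.1 state"
git commit -q --allow-empty -m "upgrade to spark 1.2.1"
git branch spark-1.2          # preserve the 1.2.1 upgrade on a branch
git reset --hard -q HEAD~1    # roll the current branch back one commit
git log --oneline             # history now ends at "pre-1.2.1 state"
```

On a shared repo, the rewritten master would then need a force push, which is why resetting before others pull the 1.2.1 commit is the safer moment to do this.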
>
> If anyone has a clue, please speak up. It seems related to starting a
> Spark context. The error from spark-itemsimilarity is much simpler than
> the shell one.
>
> 15/02/24 10:17:57 INFO spark.SecurityManager: Changing view acls to: pat,
> 15/02/24 10:17:57 INFO spark.SecurityManager: Changing modify acls to: pat,
> 15/02/24 10:17:57 INFO spark.SecurityManager: SecurityManager:
> authentication disabled; ui acls disabled; users with view permissions:
> Set(pat, ); users with modify permissions: Set(pat, )
> Exception in thread "main" com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'akka.event-handlers'
>     at com.typesafe.config.impl.SimpleConfig.findKey(SimpleConfig.java:115)
>     at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:136)
>     at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:142)
>     at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:150)
>     at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:155)
>     at com.typesafe.config.impl.SimpleConfig.getList(SimpleConfig.java:203)
>     at com.typesafe.config.impl.SimpleConfig.getHomogeneousUnwrappedList(SimpleConfig.java:260)
>     at com.typesafe.config.impl.SimpleConfig.getStringList(SimpleConfig.java:318)
>     at akka.actor.ActorSystem$Settings.<init>(ActorSystem.scala:150)
>     at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:470)
>     at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
>     at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
>     at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
>     at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
>     at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
>     at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
>     at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
>     at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
>     at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
>     at org.apache.spark.SparkEnv$.create(SparkEnv.scala:153)
>     at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
>     at org.apache.mahout.sparkbindings.package$.mahoutSparkContext(package.scala:95)
>     at org.apache.mahout.drivers.MahoutSparkDriver.start(MahoutSparkDriver.scala:81)
>     at org.apache.mahout.drivers.ItemSimilarityDriver$.start(ItemSimilarityDriver.scala:118)
>
>
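The missing 'akka.event-handlers' key above appears to be the pre-2.2 Akka name for the logger setting; Akka 2.2+ renamed it to 'akka.loggers', so a failed lookup like this usually means code from an older akka jar is reading a newer akka's reference.conf, i.e. mixed akka versions on the classpath. A sketch of the two spellings, assuming that rename is what's in play:

```
# Old name, read by Akka <= 2.1 code paths (the lookup that fails above):
akka.event-handlers = ["akka.event.slf4j.Slf4jEventHandler"]

# New name, the only one a 2.2+ reference.conf defines:
akka.loggers = ["akka.event.slf4j.Slf4jLogger"]
```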
> On Feb 24, 2015, at 10:48 AM, Andrew Musselman <andrew.musselman@gmail.com> wrote:
>
> Roll back meaning just the entry in the pom?
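Assuming the Spark dependency is pinned by a version property in the top-level pom.xml (the property name below is a guess; check the actual file), rolling back "just the entry in the pom" would look like:

```
<properties>
  <!-- hypothetical property name; the real pom may differ -->
  <spark.version>1.1.0</spark.version>
</properties>
```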
>
> On Tue, Feb 24, 2015 at 10:31 AM, Pat Ferrel <pat@occamsmachete.com> wrote:
>
> > 1.2? I thought the previous version was Spark 1.1.0?
> >
> > I need 1.2 so I’m up for trying to fix this. It was a contribution;
> > maybe the originator has a clue.
> >
> > BTW I can’t run spark-itemsimilarity on the cluster either, though all
> > unit tests pass and the cluster seems to be working with Spark’s own
> > shell and examples.
> >
> > I get:
> >
> > Exception in thread "main" com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'akka.event-handlers'
> >        at com.typesafe.config.impl.SimpleConfig.findKey(SimpleConfig.java:115)
> >        at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:136)
> >        at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:142)
> >        at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:150)
> >        at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:155)
> >        at com.typesafe.config.impl.SimpleConfig.getList(SimpleConfig.java:203)
> >        at com.typesafe.config.impl.SimpleConfig.getHomogeneousUnwrappedList(SimpleConfig.java:260)
> >        at com.typesafe.config.impl.SimpleConfig.getStringList(SimpleConfig.java:318)
> >        at akka.actor.ActorSystem$Settings.<init>(ActorSystem.scala:150)
> >        at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:470)
> >        ...
> >
> > On Feb 24, 2015, at 10:22 AM, Dmitriy Lyubimov <dlieu.7@gmail.com> wrote:
> >
> > As a remedy, I'd suggest branching out the Spark 1.2 work and rolling
> > back the 1.2.1 commit on master until the 1.2 branch is fixed.
> >
> > On Tue, Feb 24, 2015 at 10:19 AM, Dmitriy Lyubimov <dlieu.7@gmail.com> wrote:
> >
> >> oops.
> >>
> >> The tests don't test shell startup.
> >>
> >> Apparently stuff got out of sync with 1.2.
> >>
> >> On Tue, Feb 24, 2015 at 10:02 AM, Pat Ferrel <pat@occamsmachete.com> wrote:
> >>
> >>> Me too, and I built with 1.2.1.
> >>>
> >>> On Feb 24, 2015, at 9:50 AM, Andrew Musselman <andrew.musselman@gmail.com> wrote:
> >>>
> >>> I've just rebuilt mahout master and spark v1.2.1-rc2 and am getting this
> >>> error when I try out the spark-shell; am I missing something?
> >>>
> >>> $ bin/mahout spark-shell
> >>> SLF4J: Class path contains multiple SLF4J bindings.
> >>> SLF4J: Found binding in [jar:file:/home/akm/mahout/mrlegacy/target/mahout-mrlegacy-1.0-SNAPSHOT-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >>> SLF4J: Found binding in [jar:file:/home/akm/mahout/spark/target/mahout-spark_2.10-1.0-SNAPSHOT-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >>> SLF4J: Found binding in [jar:file:/home/akm/spark/assembly/target/scala-2.10/spark-assembly-1.2.1-hadoop1.1.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> >>> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> >>> error:
> >>>   while compiling: <init>
> >>>      during phase: typer
> >>>   library version: version 2.10.4
> >>>  compiler version: version 2.10.0
> >>> reconstructed args:
> >>>
> >>> last tree to typer: Literal(Constant(()))
> >>>            symbol: null
> >>> symbol definition: null
> >>>               tpe: Unit
> >>>     symbol owners:
> >>>    context owners: package <empty>
> >>>
> >>> == Enclosing template or block ==
> >>>
> >>> Block( // tree.tpe=Unit
> >>> {}
> >>> ()
> >>> )
> >>>
> >>> == Expanded type of tree ==
> >>>
> >>> TypeRef(TypeSymbol(final abstract class Unit extends AnyVal))
> >>>
> >>> uncaught exception during compilation: java.lang.NoSuchMethodError
> >>>
> >>> Failed to initialize compiler: NoSuchMethodError.
> >>> This is most often remedied by a full clean and recompile.
> >>> Otherwise, your classpath may continue bytecode compiled by
> >>> different and incompatible versions of scala.
> >>>
> >>> java.lang.NoSuchMethodError: scala.reflect.internal.TreeInfo.firstArgument(Lscala/reflect/internal/Trees$Tree;)Lscala/reflect/internal/Trees$Tree;
> >>>  at scala.tools.nsc.typechecker.Typers$Typer.parentTypes(Typers.scala:1550)
> >>>  at scala.tools.nsc.typechecker.Namers$Namer.templateSig(Namers.scala:861)
> >>>  at scala.tools.nsc.typechecker.Namers$Namer.classSig(Namers.scala:907)
> >>>  at scala.tools.nsc.typechecker.Namers$Namer.getSig$1(Namers.scala:1289)
> >>>  at scala.tools.nsc.typechecker.Namers$Namer.typeSig(Namers.scala:1347)
> >>>  at scala.tools.nsc.typechecker.Namers$Namer$$anonfun$monoTypeCompleter$1$$anonfun$apply$1.apply$mcV$sp(Namers.scala:709)
> >>>  at scala.tools.nsc.typechecker.Namers$Namer$$anonfun$monoTypeCompleter$1$$anonfun$apply$1.apply(Namers.scala:708)
> >>>  at scala.tools.nsc.typechecker.Namers$Namer$$anonfun$monoTypeCompleter$1$$anonfun$apply$1.apply(Namers.scala:708)
> >>>  at scala.tools.nsc.typechecker.Namers$Namer.scala$tools$nsc$typechecker$Namers$Namer$$logAndValidate(Namers.scala:1385)
> >>>  at scala.tools.nsc.typechecker.Namers$Namer$$anonfun$monoTypeCompleter$1.apply(Namers.scala:708)
> >>>  at scala.tools.nsc.typechecker.Namers$Namer$$anonfun$monoTypeCompleter$1.apply(Namers.scala:707)
> >>>  at scala.tools.nsc.typechecker.Namers$$anon$1.completeImpl(Namers.scala:1496)
> >>>  at scala.tools.nsc.typechecker.Namers$LockingTypeCompleter$class.complete(Namers.scala:1504)
> >>>  at scala.tools.nsc.typechecker.Namers$$anon$1.complete(Namers.scala:1494)
> >>>  at scala.reflect.internal.Symbols$Symbol.info(Symbols.scala:1231)
> >>>  at scala.reflect.internal.Symbols$Symbol.initialize(Symbols.scala:1374)
> >>>  at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5119)
> >>>  at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5458)
> >>>  at scala.tools.nsc.interpreter.ReplGlobal$$anon$1$$anon$2.typed(ReplGlobal.scala:29)
> >>>  at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedStat$1(Typers.scala:2770)
> >>>  at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$55.apply(Typers.scala:2870)
> >>>  at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$55.apply(Typers.scala:2870)
> >>>  at scala.collection.immutable.List.loop$1(List.scala:170)
> >>>  at scala.collection.immutable.List.mapConserve(List.scala:186)
> >>>  at scala.tools.nsc.typechecker.Typers$Typer.typedStats(Typers.scala:2870)
> >>>  at scala.tools.nsc.typechecker.Typers$Typer.typedPackageDef$1(Typers.scala:5127)
> >>>  at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5404)
> >>>  at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5458)
> >>>  at scala.tools.nsc.interpreter.ReplGlobal$$anon$1$$anon$2.typed(ReplGlobal.scala:29)
> >>>  at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5509)
> >>>  at scala.tools.nsc.typechecker.Analyzer$typerFactory$$anon$3.apply(Analyzer.scala:98)
> >>>  at scala.tools.nsc.Global$GlobalPhase.applyPhase(Global.scala:461)
> >>>  at scala.tools.nsc.typechecker.Analyzer$typerFactory$$anon$3$$anonfun$run$1.apply(Analyzer.scala:90)
> >>>  at scala.tools.nsc.typechecker.Analyzer$typerFactory$$anon$3$$anonfun$run$1.apply(Analyzer.scala:90)
> >>>  at scala.collection.Iterator$class.foreach(Iterator.scala:727)
> >>>  at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
> >>>  at scala.tools.nsc.typechecker.Analyzer$typerFactory$$anon$3.run(Analyzer.scala:90)
> >>>  at scala.tools.nsc.Global$Run.compileUnitsInternal(Global.scala:1574)
> >>>  at scala.tools.nsc.Global$Run.compileUnits(Global.scala:1548)
> >>>  at scala.tools.nsc.Global$Run.compileSources(Global.scala:1544)
> >>>  at org.apache.spark.repl.SparkIMain.org$apache$spark$repl$SparkIMain$$_initialize(SparkIMain.scala:187)
> >>>  at org.apache.spark.repl.SparkIMain.initializeSynchronous(SparkIMain.scala:208)
> >>>  at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:961)
> >>>  at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
> >>>  at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
> >>>  at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
> >>>  at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:916)
> >>>  at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1011)
> >>>  at org.apache.mahout.sparkbindings.shell.Main$.main(Main.scala:39)
> >>>  at org.apache.mahout.sparkbindings.shell.Main.main(Main.scala)
> >>> 15/02/24 09:42:52 WARN SparkILoop$SparkILoopInterpreter: Warning: compiler accessed before init set up.  Assuming no postInit code.
> >>> error:
> >>>   while compiling: <console>
> >>>      during phase: typer
> >>>   library version: version 2.10.4
> >>>  compiler version: version 2.10.0
> >>> reconstructed args:
> >>>
> >>> last tree to typer: Literal(Constant(()))
> >>>            symbol: null
> >>> symbol definition: null
> >>>               tpe: Unit
> >>>     symbol owners:
> >>>    context owners: package $line1
> >>>
> >>> == Enclosing template or block ==
> >>>
> >>> Block( // tree.tpe=Unit
> >>> {}
> >>> ()
> >>> )
> >>>
> >>> == Expanded type of tree ==
> >>>
> >>> TypeRef(TypeSymbol(final abstract class Unit extends AnyVal))
> >>>
> >>> uncaught exception during compilation: java.lang.NoSuchMethodError
> >>> Exception in thread "main" java.lang.NoSuchMethodError: scala.reflect.internal.TreeInfo.firstArgument(Lscala/reflect/internal/Trees$Tree;)Lscala/reflect/internal/Trees$Tree;
> >>>  at scala.tools.nsc.typechecker.Typers$Typer.parentTypes(Typers.scala:1550)
> >>>  at scala.tools.nsc.typechecker.Namers$Namer.templateSig(Namers.scala:861)
> >>>  at scala.tools.nsc.typechecker.Namers$Namer.getSig$1(Namers.scala:1300)
> >>>  at scala.tools.nsc.typechecker.Namers$Namer.typeSig(Namers.scala:1347)
> >>>  at scala.tools.nsc.typechecker.Namers$Namer$$anonfun$monoTypeCompleter$1$$anonfun$apply$1.apply$mcV$sp(Namers.scala:709)
> >>>  at scala.tools.nsc.typechecker.Namers$Namer$$anonfun$monoTypeCompleter$1$$anonfun$apply$1.apply(Namers.scala:708)
> >>>  at scala.tools.nsc.typechecker.Namers$Namer$$anonfun$monoTypeCompleter$1$$anonfun$apply$1.apply(Namers.scala:708)
> >>>  at scala.tools.nsc.typechecker.Namers$Namer.scala$tools$nsc$typechecker$Namers$Namer$$logAndValidate(Namers.scala:1385)
> >>>  at scala.tools.nsc.typechecker.Namers$Namer$$anonfun$monoTypeCompleter$1.apply(Namers.scala:708)
> >>>  at scala.tools.nsc.typechecker.Namers$Namer$$anonfun$monoTypeCompleter$1.apply(Namers.scala:707)
> >>>  at scala.tools.nsc.typechecker.Namers$$anon$1.completeImpl(Namers.scala:1496)
> >>>  at scala.tools.nsc.typechecker.Namers$LockingTypeCompleter$class.complete(Namers.scala:1504)
> >>>  at scala.tools.nsc.typechecker.Namers$$anon$1.complete(Namers.scala:1494)
> >>>  at scala.reflect.internal.Symbols$Symbol.info(Symbols.scala:1231)
> >>>  at scala.reflect.internal.Symbols$Symbol.initialize(Symbols.scala:1374)
> >>>  at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5119)
> >>>  at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5458)
> >>>  at scala.tools.nsc.interpreter.ReplGlobal$$anon$1$$anon$2.typed(ReplGlobal.scala:29)
> >>>  at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedStat$1(Typers.scala:2770)
> >>>  at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$55.apply(Typers.scala:2870)
> >>>  at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$55.apply(Typers.scala:2870)
> >>>  at scala.collection.immutable.List.loop$1(List.scala:170)
> >>>  at scala.collection.immutable.List.mapConserve(List.scala:186)
> >>>  at scala.tools.nsc.typechecker.Typers$Typer.typedStats(Typers.scala:2870)
> >>>  at scala.tools.nsc.typechecker.Typers$Typer.typedPackageDef$1(Typers.scala:5127)
> >>>  at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5404)
> >>>  at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5458)
> >>>  at scala.tools.nsc.interpreter.ReplGlobal$$anon$1$$anon$2.typed(ReplGlobal.scala:29)
> >>>  at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5509)
> >>>  at scala.tools.nsc.typechecker.Analyzer$typerFactory$$anon$3.apply(Analyzer.scala:98)
> >>>  at scala.tools.nsc.Global$GlobalPhase.applyPhase(Global.scala:461)
> >>>  at scala.tools.nsc.typechecker.Analyzer$typerFactory$$anon$3$$anonfun$run$1.apply(Analyzer.scala:90)
> >>>  at scala.tools.nsc.typechecker.Analyzer$typerFactory$$anon$3$$anonfun$run$1.apply(Analyzer.scala:90)
> >>>  at scala.collection.Iterator$class.foreach(Iterator.scala:727)
> >>>  at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
> >>>  at scala.tools.nsc.typechecker.Analyzer$typerFactory$$anon$3.run(Analyzer.scala:90)
> >>>  at scala.tools.nsc.Global$Run.compileUnitsInternal(Global.scala:1574)
> >>>  at scala.tools.nsc.Global$Run.compileUnits(Global.scala:1548)
> >>>  at scala.tools.nsc.Global$Run.compileSources(Global.scala:1544)
> >>>  at org.apache.spark.repl.SparkIMain.compileSourcesKeepingRun(SparkIMain.scala:528)
> >>>  at org.apache.spark.repl.SparkIMain$ReadEvalPrint.compileAndSaveRun(SparkIMain.scala:923)
> >>>  at org.apache.spark.repl.SparkIMain$ReadEvalPrint.compile(SparkIMain.scala:879)
> >>>  at org.apache.spark.repl.SparkIMain.bind(SparkIMain.scala:719)
> >>>  at org.apache.spark.repl.SparkIMain.bind(SparkIMain.scala:762)
> >>>  at org.apache.spark.repl.SparkIMain$$anonfun$quietBind$1.apply(SparkIMain.scala:761)
> >>>  at org.apache.spark.repl.SparkIMain$$anonfun$quietBind$1.apply(SparkIMain.scala:761)
> >>>  at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:270)
> >>>  at org.apache.spark.repl.SparkIMain.quietBind(SparkIMain.scala:761)
> >>>  at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$apply$mcZ$sp$2.apply$mcV$sp(SparkILoop.scala:935)
> >>>  at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:147)
> >>>  at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:60)
> >>>  at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:106)
> >>>  at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop.postInitialization(MahoutSparkILoop.scala:24)
> >>>  at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:962)
> >>>  at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
> >>>  at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
> >>>  at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
> >>>  at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:916)
> >>>  at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1011)
> >>>  at org.apache.mahout.sparkbindings.shell.Main$.main(Main.scala:39)
> >>>  at org.apache.mahout.sparkbindings.shell.Main.main(Main.scala)
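The "library version: version 2.10.4" vs "compiler version: version 2.10.0" lines, together with a NoSuchMethodError on an internal compiler class, point at two different Scala releases on the shell's classpath. One quick way to spot that is to list the scala toolchain jars the launcher assembles; the classpath below is made up for illustration.

```shell
# Hypothetical classpath mixing scala-library 2.10.4 with scala-compiler
# 2.10.0 -- the combination the error output above reports.
CLASSPATH="/opt/scala/lib/scala-library-2.10.4.jar:/opt/spark/lib/scala-compiler-2.10.0.jar:/opt/mahout/mahout-spark_2.10-1.0-SNAPSHOT.jar"

# Print one entry per line and keep only the scala toolchain jars, so a
# version mismatch is visible at a glance.
echo "$CLASSPATH" | tr ':' '\n' | grep -E 'scala-(library|compiler)'
```

If the two jars report different versions, a full clean rebuild against a single Scala release (as the compiler's own message suggests) is the usual fix.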
> >>>
> >>>
> >>
> >
> >
>
>
