spark-user mailing list archives

From Christophe Billiard <christophe.billi...@gmail.com>
Subject Re: NoSuchMethodError: com.typesafe.config.Config.getDuration with akka-http/akka-stream
Date Fri, 02 Jan 2015 11:16:06 GMT
Thank you Akhil for your idea.

In spark-env.sh, I set
export SPARK_CLASSPATH=/home/christophe/Development/spark-streaming3/config-1.2.1.jar

When I run  bin/compute-classpath.sh
I get Spark's classpath:
/home/christophe/Development/spark-streaming3/config-1.2.1.jar::/home/christophe/Development/spark-streaming3/conf:/home/christophe/Development/spark-streaming3/lib/spark-assembly-1.1.1-hadoop2.4.0.jar:/home/christophe/Development/spark-streaming3/lib/datanucleus-api-jdo-3.2.1.jar:/home/christophe/Development/spark-streaming3/lib/datanucleus-rdbms-3.2.1.jar:/home/christophe/Development/spark-streaming3/lib/datanucleus-core-3.2.2.jar

A few errors later, Spark's classpath looks like:
/home/christophe/Development/spark-streaming3/config-1.2.1.jar:akka-stream-experimental_2.10-1.0-M2.jar:reactive-streams-1.0.0.M3.jar::/home/christophe/Development/spark-streaming3/conf:/home/christophe/Development/spark-streaming3/lib/spark-assembly-1.1.1-hadoop2.4.0.jar:/home/christophe/Development/spark-streaming3/lib/datanucleus-api-jdo-3.2.1.jar:/home/christophe/Development/spark-streaming3/lib/datanucleus-rdbms-3.2.1.jar:/home/christophe/Development/spark-streaming3/lib/datanucleus-core-3.2.2.jar
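
To double-check this kind of setup, a small diagnostic (a hypothetical helper, not part of the original app) can print the JVM classpath entries in the order the class loader will search them; the first matching jar wins, which is why config-1.2.1.jar must come before the Spark assembly:

```scala
// Minimal sketch: dump the runtime classpath in search order.
// Running this inside the Spark app confirms which copy of a
// conflicting jar actually comes first.
object PrintClasspath {
  def main(args: Array[String]): Unit = {
    val entries = sys.props("java.class.path").split(java.io.File.pathSeparator)
    entries.zipWithIndex.foreach { case (e, i) => println(s"$i: $e") }
  }
}
```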

And the error is now:
Exception in thread "main" java.lang.NoSuchMethodError: akka.actor.ExtendedActorSystem.systemActorOf(Lakka/actor/Props;Ljava/lang/String;)Lakka/actor/ActorRef;
    at akka.stream.scaladsl.StreamTcp.<init>(StreamTcp.scala:147)
    at akka.stream.scaladsl.StreamTcp$.createExtension(StreamTcp.scala:140)
    at akka.stream.scaladsl.StreamTcp$.createExtension(StreamTcp.scala:32)
    at akka.actor.ActorSystemImpl.registerExtension(ActorSystem.scala:654)
    at akka.actor.ExtensionId$class.apply(Extension.scala:79)
    at akka.stream.scaladsl.StreamTcp$.apply(StreamTcp.scala:134)
    at akka.http.HttpExt.bind(Http.scala:33)
    at SimpleAppStreaming3$.main(SimpleAppStreaming3.scala:250)
    at SimpleAppStreaming3.main(SimpleAppStreaming3.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:329)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
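
A NoSuchMethodError like this usually means the class resolved at runtime comes from a different jar version than the one compiled against. A small diagnostic (a hypothetical helper, not part of the original app) can show which jar a class was actually loaded from; calling it with "akka.actor.ExtendedActorSystem" inside the Spark app would reveal whether the Spark assembly's Akka or a newer akka-actor jar won:

```scala
// Sketch: report the jar (code source) a class was loaded from.
// Returns None for JDK bootstrap classes, whose code source is null.
object WhichJar {
  def locate(className: String): Option[String] =
    for {
      cls    <- Option(Class.forName(className))
      domain <- Option(cls.getProtectionDomain)
      source <- Option(domain.getCodeSource) // null for bootstrap classes
    } yield source.getLocation.toString

  def main(args: Array[String]): Unit =
    println(locate(args.headOption.getOrElse("scala.Option")))
}
```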

Next I tried adding akka-actor_2.10-2.3.7.jar to Spark's classpath,
and the error gets worse (Spark can no longer even start):
[ERROR] [01/02/2015 12:08:14.807] [sparkDriver-akka.actor.default-dispatcher-4] [ActorSystem(sparkDriver)] Uncaught fatal error from thread [sparkDriver-akka.actor.default-dispatcher-4] shutting down ActorSystem [sparkDriver]
java.lang.AbstractMethodError
    at akka.actor.ActorCell.create(ActorCell.scala:580)
    at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:456)
    at akka.actor.ActorCell.systemInvoke(ActorCell.scala:478)
    at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:279)
    at akka.dispatch.Mailbox.run(Mailbox.scala:220)
    at akka.dispatch.Mailbox.exec(Mailbox.scala:231)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

[ERROR] [01/02/2015 12:08:14.811] [sparkDriver-akka.actor.default-dispatcher-3] [ActorSystem(sparkDriver)] Uncaught fatal error from thread [sparkDriver-akka.actor.default-dispatcher-3] shutting down ActorSystem [sparkDriver]
java.lang.AbstractMethodError: akka.event.slf4j.Slf4jLogger.aroundReceive(Lscala/PartialFunction;Ljava/lang/Object;)V
    at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
    at akka.actor.ActorCell.invoke(ActorCell.scala:487)
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:254)
    at akka.dispatch.Mailbox.run(Mailbox.scala:221)
    at akka.dispatch.Mailbox.exec(Mailbox.scala:231)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

My guess is that akka-actor_2.10-2.3.7.jar is overriding Spark's own
akka-actor version. But if I don't override it, I can't use
akka-http/akka-stream.
Is there a way to work around this problem?
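
One workaround often suggested for this class of conflict is to stop relying on --jars ordering and build a single fat jar with the sbt-assembly plugin, marking Spark itself as "provided". The fragment below is only a sketch: exact setting names vary across sbt-assembly versions, and the merge strategy is an assumption, not taken from the original build.

```scala
// build.sbt fragment (sketch; assumes sbt-assembly is added in
// project/plugins.sbt). Spark is marked "provided" so it is not
// bundled; Akka's reference.conf files must be concatenated, not
// overwritten, for Akka to work from a merged jar.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.1" % "provided"

mergeStrategy in assembly := {
  case "reference.conf" => MergeStrategy.concat // Akka config fragments
  case _                => MergeStrategy.first  // otherwise first jar wins
}
```

Submitting the resulting assembly jar removes the need for a --jars list, though it does not by itself resolve a true binary incompatibility between the Akka version Spark 1.1.x was built against and the Akka 2.3.x that akka-stream/akka-http require.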

Thanks,
Best regards


On Thu, Jan 1, 2015 at 9:28 AM, Akhil Das <akhil@sigmoidanalytics.com>
wrote:

> It's a Typesafe Config jar conflict; you will need to put the jar that
> contains the getDuration method first on your classpath.
>
> Thanks
> Best Regards
>
> On Wed, Dec 31, 2014 at 4:38 PM, Christophe Billiard <
> christophe.billiard@gmail.com> wrote:
>
>> Hi all,
>>
>> I am currently trying to combine DataStax's "spark-cassandra-connector"
>> and Typesafe's "akka-http-experimental" on Spark 1.1.1 (the
>> spark-cassandra-connector for Spark 1.2.0 is not out yet) with Scala 2.10.4.
>> I am using the Hadoop 2.4 pre-built package (build.sbt file at the end).
>>
>> To solve the java.lang.NoClassDefFoundError:
>> com/datastax/spark/connector/mapper/ColumnMapper
>> and other NoClassDefFoundErrors, I have to pass some jars to Spark
>> directly (build.sbt alone is not enough).
>> The connector then works fine.
>>
>> My spark submit looks like:
>> sbt clean package; bin/spark-submit --class "SimpleAppStreaming3" --master local[*] --jars "spark-cassandra-connector_2.10-1.1.0.jar","cassandra-driver-core-2.1.3.jar","cassandra-thrift-2.0.5.jar","joda-time-2.6.jar" target/scala-2.10/simple-project_2.10-1.0.jar
>>
>> Then I am trying to add some akka-http/akka-stream features.
>> As before, I get a java.lang.NoClassDefFoundError:
>> akka/stream/FlowMaterializer$
>> so I apply the same solution and start adding jars.
>>
>> Now my spark submit looks like:
>> sbt clean package; bin/spark-submit --class "SimpleAppStreaming3" --master local[*] --jars "spark-cassandra-connector_2.10-1.1.0.jar","cassandra-driver-core-2.1.3.jar","cassandra-thrift-2.0.5.jar","joda-time-2.6.jar","akka-stream-experimental_2.10-1.0-M2.jar" target/scala-2.10/simple-project_2.10-1.0.jar
>>
>> Then I have a new kind of error:
>> Exception in thread "main" java.lang.NoSuchMethodError: com.typesafe.config.Config.getDuration(Ljava/lang/String;Ljava/util/concurrent/TimeUnit;)J
>>         at akka.stream.StreamSubscriptionTimeoutSettings$.apply(FlowMaterializer.scala:256)
>>         at akka.stream.MaterializerSettings$.apply(FlowMaterializer.scala:185)
>>         at akka.stream.MaterializerSettings$.apply(FlowMaterializer.scala:172)
>>         at akka.stream.FlowMaterializer$$anonfun$1.apply(FlowMaterializer.scala:42)
>>         at akka.stream.FlowMaterializer$$anonfun$1.apply(FlowMaterializer.scala:42)
>>         at scala.Option.getOrElse(Option.scala:120)
>>         at akka.stream.FlowMaterializer$.apply(FlowMaterializer.scala:42)
>>         at SimpleAppStreaming3$.main(SimpleAppStreaming3.scala:240)
>>         at SimpleAppStreaming3.main(SimpleAppStreaming3.scala)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>         at java.lang.reflect.Method.invoke(Method.java:606)
>>         at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:329)
>>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
>>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>
>> I can't get rid of this error.
>> I tried:
>> 1) adding several jars (including "config-1.2.1.jar")
>> 2) studying the dependency tree (with
>> https://github.com/jrudolph/sbt-dependency-graph)
>> 3) overriding dependency versions (with dependencyOverrides)
>>
>> Any ideas?
>>
>> Bonus question: Is there a way to avoid adding all these jars with --jars?
>>
>> *My build.sbt file*
>>
>> name := "Simple Project"
>>
>> version := "1.0"
>>
>> scalaVersion := "2.10.4"
>>
>> libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.1" //exclude("com.typesafe", "config")
>>
>> libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.1.1"
>>
>> libraryDependencies += "com.datastax.cassandra" % "cassandra-driver-core" % "2.1.3"
>>
>> libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.1.0" withSources() withJavadoc()
>>
>> libraryDependencies += "org.apache.cassandra" % "cassandra-thrift" % "2.0.5"
>>
>> libraryDependencies += "joda-time" % "joda-time" % "2.6"
>>
>>
>>
>> libraryDependencies += "com.typesafe.akka" %% "akka-actor"      % "2.3.8"
>>
>> libraryDependencies += "com.typesafe.akka" %% "akka-testkit"    % "2.3.8"
>>
>> libraryDependencies += "org.apache.hadoop" %  "hadoop-client"   % "2.4.0"
>>
>> libraryDependencies += "ch.qos.logback"    %  "logback-classic" % "1.1.2"
>>
>> libraryDependencies += "org.mockito"       %  "mockito-all"     % "1.10.17"
>>
>> libraryDependencies += "org.scalatest"     %% "scalatest"       % "2.2.3"
>>
>> libraryDependencies += "org.slf4j"         %  "slf4j-api"       % "1.7.5"
>>
>> libraryDependencies += "org.apache.spark"  %% "spark-streaming" % "1.1.1"
>>
>>
>> libraryDependencies += "com.typesafe.akka" %% "akka-stream-experimental" % "1.0-M2"
>>
>> libraryDependencies += "com.typesafe.akka" %% "akka-http-experimental" % "1.0-M2"
>>
>> libraryDependencies += "com.typesafe.akka" %% "akka-http-core-experimental" % "1.0-M2"
>>
>>
>> libraryDependencies += "com.typesafe" % "config" % "1.2.1"
>>
>> dependencyOverrides += "com.typesafe" % "config" % "1.2.1"
>>
>>
>>
>>
>> --
>> View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-com-typesafe-config-Config-getDuration-with-akka-http-akka-stream-tp20926.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
>> For additional commands, e-mail: user-help@spark.apache.org
>>
>>
>
