spark-user mailing list archives

From Sean Owen <so...@cloudera.com>
Subject Re: Got java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s when running job from intellij Idea
Date Tue, 04 Nov 2014 18:33:49 GMT
Hadoop is certainly bringing them back in. You should mark all Hadoop
and Spark deps as "provided" so they aren't built into your app at all.
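A minimal sketch of the suggested change to the build.sbt quoted below. The "provided" scoping is what Sean recommends here; the exclude rule is an extra safeguard that assumes hadoop-client is the dependency dragging in the unsigned javax.servlet:servlet-api jar (worth verifying with `sbt dependencyTree` in your own build):

```scala
scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  // Spark and Hadoop are supplied by the runtime environment, so scope
  // them "provided" to keep their jars (and their transitive copies of
  // the servlet API) out of the packaged app.
  "org.apache.spark"  %% "spark-core"  % "1.1.0" % "provided",
  "org.apache.spark"  %% "spark-sql"   % "1.1.0" % "provided",
  "org.apache.spark"  %% "spark-mllib" % "1.1.0" % "provided",
  // hadoop-client transitively pulls in an unsigned javax.servlet;
  // excluding it avoids the signer-information clash with the signed
  // copy loaded alongside Jetty.
  "org.apache.hadoop" % "hadoop-client" % "2.4.0" % "provided"
    exclude("javax.servlet", "servlet-api"),
  "org.apache.commons" % "commons-math3" % "3.3",
  "org.scalatest" % "scalatest_2.10" % "2.2.0" % "test"
)
```

Note that `exclude(org, name)` is standard sbt ModuleID syntax for dropping a single transitive dependency.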

On Tue, Nov 4, 2014 at 4:49 PM, Jaonary Rabarisoa <jaonary@gmail.com> wrote:
> I don't understand why, since there's no javax.servlet in my build.sbt:
>
>
> scalaVersion := "2.10.4"
>
> libraryDependencies ++= Seq(
>   "org.apache.spark" %% "spark-core" % "1.1.0",
>   "org.apache.spark" %% "spark-sql" % "1.1.0",
>   "org.apache.spark" %% "spark-mllib" % "1.1.0",
>   "org.apache.hadoop" % "hadoop-client" % "2.4.0",
>   "org.apache.commons" % "commons-math3" % "3.3",
>   "org.scalatest" % "scalatest_2.10" % "2.2.0" % "test"
> )
>
> resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
>
>
> On Tue, Nov 4, 2014 at 11:00 AM, Sean Owen <sowen@cloudera.com> wrote:
>>
>> Generally this means you included some javax.servlet dependency in
>> your project deps. You should exclude any of these, as they conflict
>> in this bad way with the other copies of the servlet API that Spark
>> brings in.
>>
>> On Tue, Nov 4, 2014 at 7:55 AM, Jaonary Rabarisoa <jaonary@gmail.com>
>> wrote:
>> > Hi all,
>> >
>> > I have a Spark job that I build with sbt and can run without any
>> > problem via sbt run. But when I run it inside IntelliJ IDEA I get
>> > the following error:
>> >
>> > Exception encountered when invoking run on a nested suite - class
>> > "javax.servlet.FilterRegistration"'s signer information does not match
>> > signer information of other classes in the same package
>> > java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s
>> > signer information does not match signer information of other classes
>> > in the same package
>> >   at java.lang.ClassLoader.checkCerts(ClassLoader.java:952)
>> >   at java.lang.ClassLoader.preDefineClass(ClassLoader.java:666)
>> >   at java.lang.ClassLoader.defineClass(ClassLoader.java:794)
>> >   at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>> >   at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
>> >   at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
>> >   at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
>> >   at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>> >   at java.security.AccessController.doPrivileged(Native Method)
>> >   at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>> >   at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>> >   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>> >   at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>> >   at org.eclipse.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:136)
>> >   at org.eclipse.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:129)
>> >   at org.eclipse.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:98)
>> >   at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:98)
>> >   at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:89)
>> >   at org.apache.spark.ui.WebUI.attachPage(WebUI.scala:67)
>> >   at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:60)
>> >   at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:60)
>> >   at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
>> >   at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
>> >   at org.apache.spark.ui.WebUI.attachTab(WebUI.scala:60)
>> >   at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:66)
>> >   at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:60)
>> >   at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:42)
>> >   at org.apache.spark.SparkContext.<init>(SparkContext.scala:223)
>> >   at org.apache.spark.SparkContext.<init>(SparkContext.scala:98)
>> >
>> >
>> > How can I solve this?
>> >
>> >
>> > Cheers,
>> >
>> >
>> > Jao
>> >
>
>

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org

