flink-user mailing list archives

From Till Rohrmann <trohrm...@apache.org>
Subject Re: Spark and Flink
Date Tue, 19 May 2015 13:15:03 GMT
I guess it's a typo: "eu.stratosphere" should be replaced by
"org.apache.flink"

On Tue, May 19, 2015 at 1:13 PM, Alexander Alexandrov <
alexander.s.alexandrov@gmail.com> wrote:

> We managed to do this with the following config:
>
>             // properties
>             <!-- Hadoop -->
>             <hadoop.version>2.2.0</hadoop.version>
>             <!-- Flink -->
>             <flink.version>0.9-SNAPSHOT</flink.version>
>             <!-- Spark -->
>             <spark.version>1.2.1</spark.version>
>
>             // from the dependency management
>             <!-- Hadoop -->
>             <dependency>
>                 <groupId>org.apache.hadoop</groupId>
>                 <artifactId>hadoop-common</artifactId>
>                 <version>${hadoop.version}</version>
>                 <scope>provided</scope>
>             </dependency>
>             <dependency>
>                 <groupId>org.apache.hadoop</groupId>
>                 <artifactId>hadoop-hdfs</artifactId>
>                 <version>${hadoop.version}</version>
>                 <scope>provided</scope>
>             </dependency>
>
>             <!-- Flink -->
>             <dependency>
>                 <groupId>eu.stratosphere</groupId>
>                 <artifactId>flink-scala</artifactId>
>                 <version>${flink.version}</version>
>                 <scope>provided</scope>
>             </dependency>
>             <dependency>
>                 <groupId>eu.stratosphere</groupId>
>                 <artifactId>flink-java</artifactId>
>                 <version>${flink.version}</version>
>                 <scope>provided</scope>
>             </dependency>
>             <dependency>
>                 <groupId>eu.stratosphere</groupId>
>                 <artifactId>flink-clients</artifactId>
>                 <version>${flink.version}</version>
>                 <scope>provided</scope>
>             </dependency>
>
>             <!-- Spark -->
>             <dependency>
>                 <groupId>org.apache.spark</groupId>
>                 <artifactId>spark-core_${scala.tools.version}</artifactId>
>                 <version>${spark.version}</version>
>                 <scope>provided</scope>
>             </dependency>
>
>             <!-- Jetty -->
>             <dependency>
>                 <groupId>org.eclipse.jetty</groupId>
>                 <artifactId>jetty-util</artifactId>
>                 <version>${jetty.version}</version>
>             </dependency>
>             <dependency>
>                 <groupId>org.eclipse.jetty</groupId>
>                 <artifactId>jetty-servlet</artifactId>
>                 <version>${jetty.version}</version>
>             </dependency>
>
>         // actual dependencies
>         <!-- Spark -->
>         <dependency>
>             <groupId>org.apache.spark</groupId>
>             <artifactId>spark-core_${scala.tools.version}</artifactId>
>         </dependency>
>
>         <!-- Flink -->
>         <dependency>
>             <groupId>eu.stratosphere</groupId>
>             <artifactId>flink-scala</artifactId>
>         </dependency>
>         <dependency>
>             <groupId>eu.stratosphere</groupId>
>             <artifactId>flink-java</artifactId>
>         </dependency>
>         <dependency>
>             <groupId>eu.stratosphere</groupId>
>             <artifactId>flink-clients</artifactId>
>         </dependency>
>         <!-- FIXME: this is a hacky solution for a Flink issue with the
> Jackson deps-->
>         <dependency>
>             <groupId>com.fasterxml.jackson.core</groupId>
>             <artifactId>jackson-core</artifactId>
>             <version>2.2.1</version>
>             <scope>provided</scope>
>         </dependency>
>         <dependency>
>             <groupId>com.fasterxml.jackson.core</groupId>
>             <artifactId>jackson-databind</artifactId>
>             <version>2.2.1</version>
>             <scope>provided</scope>
>         </dependency>
>         <dependency>
>             <groupId>com.fasterxml.jackson.core</groupId>
>             <artifactId>jackson-annotations</artifactId>
>             <version>2.2.1</version>
>             <scope>provided</scope>
>         </dependency>
>
>
> 2015-05-19 10:06 GMT+02:00 Pa Rö <paul.roewer1990@googlemail.com>:
>
>> That sounds good. Maybe you can send me a pseudo structure; this is my
>> first Maven project.
>>
>> best regards,
>> paul
>>
>> 2015-05-18 14:05 GMT+02:00 Robert Metzger <rmetzger@apache.org>:
>>
>>> Hi,
>>> I would really recommend putting your Flink and Spark dependencies
>>> into different Maven modules.
>>> Having both in the same project will be very hard, if not
>>> impossible:
>>> both projects depend on similar libraries with slightly different
>>> versions.
>>>
>>> I would suggest a maven module structure like this:
>>> yourproject-parent (a pom module)
>>> --> yourproject-common
>>> --> yourproject-flink
>>> --> yourproject-spark
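A parent POM for the layout above could look roughly like this. This is a sketch, not from the thread: the `com.example` group ID and module names are placeholders.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- yourproject-parent/pom.xml: an aggregator POM so that the Flink and
     Spark dependency trees live in separate child modules and never end
     up on the same classpath -->
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
                             http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>yourproject-parent</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <packaging>pom</packaging>

  <modules>
    <module>yourproject-common</module>  <!-- shared code, no engine deps -->
    <module>yourproject-flink</module>   <!-- Flink dependencies only -->
    <module>yourproject-spark</module>   <!-- Spark dependencies only -->
  </modules>
</project>
```

Each engine module then depends only on yourproject-common, so the conflicting transitive dependencies of Flink and Spark are isolated from each other.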
>>>
>>>
>>>
>>> On Mon, May 18, 2015 at 10:00 AM, Pa Rö <paul.roewer1990@googlemail.com>
>>> wrote:
>>>
>>>> hi,
>>>> if I add your dependency I get over 100 errors, so now I changed the
>>>> version number:
>>>> <dependencies>
>>>>     <dependency>
>>>>         <groupId>com.fasterxml.jackson.module</groupId>
>>>>         <artifactId>jackson-module-scala_2.10</artifactId>
>>>>         <version>2.4.4</version>
>>>>         <exclusions>
>>>>             <exclusion>
>>>>                 <groupId>com.google.guava</groupId>
>>>>                 <artifactId>guava</artifactId>
>>>>             </exclusion>
>>>>         </exclusions>
>>>>     </dependency>
>>>> </dependencies>
>>>>
>>>> now the POM is fine, but I get the same error when running Spark:
>>>> WARN component.AbstractLifeCycle: FAILED
>>>> org.eclipse.jetty.servlet.DefaultServlet-608411067:
>>>> java.lang.NoSuchMethodError:
>>>> org.eclipse.jetty.server.ResourceCache.<init>(Lorg/eclipse/jetty/http/MimeTypes;)V
>>>>
>>>> java.lang.NoSuchMethodError:
>>>> org.eclipse.jetty.server.ResourceCache.<init>(Lorg/eclipse/jetty/http/MimeTypes;)V
>>>>     at
>>>> org.eclipse.jetty.servlet.NIOResourceCache.<init>(NIOResourceCache.java:41)
>>>>     at
>>>> org.eclipse.jetty.servlet.DefaultServlet.init(DefaultServlet.java:223)
>>>>     at javax.servlet.GenericServlet.init(GenericServlet.java:244)
>>>>     at
>>>> org.eclipse.jetty.servlet.ServletHolder.initServlet(ServletHolder.java:442)
>>>>     at
>>>> org.eclipse.jetty.servlet.ServletHolder.doStart(ServletHolder.java:270)
>>>>     at
>>>> org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
>>>>     at
>>>> org.eclipse.jetty.servlet.ServletHandler.initialize(ServletHandler.java:721)
>>>>     at
>>>> org.eclipse.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:279)
>>>>     at
>>>> org.eclipse.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:717)
>>>>     at
>>>> org.eclipse.jetty.servlet.ServletContextHandler.doStart(ServletContextHandler.java:155)
>>>>     at
>>>> org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
>>>>     at
>>>> org.eclipse.jetty.server.handler.HandlerCollection.doStart(HandlerCollection.java:229)
>>>>     at
>>>> org.eclipse.jetty.server.handler.ContextHandlerCollection.doStart(ContextHandlerCollection.java:172)
>>>>     at
>>>> org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
>>>>     at
>>>> org.eclipse.jetty.server.handler.HandlerWrapper.doStart(HandlerWrapper.java:95)
>>>>     at org.eclipse.jetty.server.Server.doStart(Server.java:282)
>>>>     at
>>>> org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
>>>>     at
>>>> org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:199)
>>>>     at
>>>> org.apache.spark.ui.JettyUtils$$anonfun$4.apply(JettyUtils.scala:209)
>>>>     at
>>>> org.apache.spark.ui.JettyUtils$$anonfun$4.apply(JettyUtils.scala:209)
>>>>     at
>>>> org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1454)
>>>>     at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
>>>>     at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1450)
>>>>     at
>>>> org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:209)
>>>>     at org.apache.spark.ui.WebUI.bind(WebUI.scala:102)
>>>>     at org.apache.spark.SparkContext.<init>(SparkContext.scala:224)
>>>>     at
>>>> org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:53)
>>>>     at
>>>> mgm.tp.bigdata.tempGeoKmeans.Spark.SparkMain.main(SparkMain.java:37)
>>>> ...
>>>>
>>>> what am I doing wrong?
>>>>
>>>> best regards
>>>> paul
>>>>
>>>> 2015-05-13 15:43 GMT+02:00 Ted Yu <yuzhihong@gmail.com>:
>>>>
>>>>> You can use exclusion to remove the undesired jetty version.
>>>>> Here is syntax:
>>>>>       <dependency>
>>>>>         <groupId>com.fasterxml.jackson.module</groupId>
>>>>>         <artifactId>jackson-module-scala_2.10</artifactId>
>>>>>         <version>${fasterxml.jackson.version}</version>
>>>>>         <exclusions>
>>>>>           <exclusion>
>>>>>             <groupId>com.google.guava</groupId>
>>>>>             <artifactId>guava</artifactId>
>>>>>           </exclusion>
>>>>>         </exclusions>
>>>>>       </dependency>
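The same exclusion pattern can be aimed at the Jetty clash itself. The sketch below is hypothetical: it assumes flink-clients is the artifact dragging in the conflicting `org.eclipse.jetty` jars, which must be confirmed with `mvn dependency:tree` before copying this.

```xml
<!-- Hypothetical: exclude Eclipse Jetty artifacts from flink-clients so
     that only one Jetty version (the one Spark expects) stays on the
     classpath. Verify the real culprit with `mvn dependency:tree`. -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-clients</artifactId>
    <version>0.8.1</version>
    <exclusions>
        <exclusion>
            <groupId>org.eclipse.jetty</groupId>
            <artifactId>jetty-server</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.eclipse.jetty</groupId>
            <artifactId>jetty-servlet</artifactId>
        </exclusion>
    </exclusions>
</dependency>
```

The `NoSuchMethodError` on `ResourceCache.<init>(MimeTypes)` is the classic symptom of mixed Jetty versions: `jetty-servlet` from one release calling into `jetty-server` from another.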
>>>>>
>>>>> On Wed, May 13, 2015 at 6:41 AM, Paul Röwer <
>>>>> paul.roewer1990@googlemail.com> wrote:
>>>>>
>>>>>> Okay. And how do I get it clean in my Maven project?
>>>>>>
>>>>>>
>>>>>> On 13 May 2015 at 15:15:34 CEST, Ted Yu <yuzhihong@gmail.com> wrote:
>>>>>>>
>>>>>>> You can run the following command:
>>>>>>> mvn dependency:tree
>>>>>>>
>>>>>>> And see what jetty versions are brought in.
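When hunting one specific conflict, the tree output can be narrowed down. Both flags below are standard maven-dependency-plugin options; the `org.eclipse.jetty` filter matches the group ID from the stack trace.

```
# Show only the paths that bring in Eclipse Jetty artifacts
mvn dependency:tree -Dincludes=org.eclipse.jetty

# -Dverbose additionally lists versions that Maven omitted
# due to conflict resolution
mvn dependency:tree -Dverbose -Dincludes=org.eclipse.jetty
```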
>>>>>>>
>>>>>>> Cheers
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On May 13, 2015, at 6:07 AM, Pa Rö <paul.roewer1990@googlemail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> hi,
>>>>>>>
>>>>>>> I use Spark and Flink in the same Maven project;
>>>>>>>
>>>>>>> now I get an exception when working with Spark, while Flink works
>>>>>>> well.
>>>>>>>
>>>>>>> the problem is transitive dependencies.
>>>>>>>
>>>>>>> maybe somebody knows a solution, or versions which work together.
>>>>>>>
>>>>>>> best regards
>>>>>>> paul
>>>>>>>
>>>>>>> ps: a Cloudera Maven repository for Flink would be desirable
>>>>>>> my pom:
>>>>>>>
>>>>>>> <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="
>>>>>>> http://www.w3.org/2001/XMLSchema-instance"
>>>>>>>   xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
>>>>>>> http://maven.apache.org/xsd/maven-4.0.0.xsd">
>>>>>>>   <modelVersion>4.0.0</modelVersion>
>>>>>>>
>>>>>>>   <groupId>mgm.tp.bigdata</groupId>
>>>>>>>   <artifactId>tempGeoKmeans</artifactId>
>>>>>>>   <version>0.0.1-SNAPSHOT</version>
>>>>>>>   <packaging>jar</packaging>
>>>>>>>
>>>>>>>   <name>tempGeoKmeans</name>
>>>>>>>   <url>http://maven.apache.org</url>
>>>>>>>
>>>>>>>   <properties>
>>>>>>>
>>>>>>> <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
>>>>>>>   </properties>
>>>>>>>
>>>>>>>     <repositories>
>>>>>>>         <repository>
>>>>>>>               <id>cloudera</id>
>>>>>>>               <url>
>>>>>>> https://repository.cloudera.com/artifactory/cloudera-repos/</url>
>>>>>>>         </repository>
>>>>>>>       </repositories>
>>>>>>>
>>>>>>>   <dependencies>
>>>>>>>     <dependency>
>>>>>>>         <groupId>org.apache.spark</groupId>
>>>>>>>         <artifactId>spark-core_2.10</artifactId>
>>>>>>>         <version>1.1.0-cdh5.2.5</version>
>>>>>>>     </dependency>
>>>>>>>     <dependency>
>>>>>>>         <groupId>org.apache.hadoop</groupId>
>>>>>>>         <artifactId>hadoop-mapreduce-client-common</artifactId>
>>>>>>>         <version>2.5.0-cdh5.2.5</version>
>>>>>>>     </dependency>
>>>>>>>     <dependency>
>>>>>>>         <groupId>org.apache.hadoop</groupId>
>>>>>>>         <artifactId>hadoop-common</artifactId>
>>>>>>>         <version>2.5.0-cdh5.2.5</version>
>>>>>>>     </dependency>
>>>>>>>     <dependency>
>>>>>>>         <groupId>org.apache.mahout</groupId>
>>>>>>>         <artifactId>mahout-core</artifactId>
>>>>>>>         <version>0.9-cdh5.2.5</version>
>>>>>>>     </dependency>
>>>>>>>     <dependency>
>>>>>>>         <groupId>junit</groupId>
>>>>>>>         <artifactId>junit</artifactId>
>>>>>>>         <version>3.8.1</version>
>>>>>>>         <scope>test</scope>
>>>>>>>     </dependency>
>>>>>>>     <dependency>
>>>>>>>         <groupId>org.apache.flink</groupId>
>>>>>>>         <artifactId>flink-core</artifactId>
>>>>>>>         <version>0.8.1</version>
>>>>>>>     </dependency>
>>>>>>>     <dependency>
>>>>>>>         <groupId>org.apache.flink</groupId>
>>>>>>>         <artifactId>flink-java</artifactId>
>>>>>>>         <version>0.8.1</version>
>>>>>>>     </dependency>
>>>>>>>     <dependency>
>>>>>>>         <groupId>org.apache.flink</groupId>
>>>>>>>         <artifactId>flink-clients</artifactId>
>>>>>>>         <version>0.8.1</version>
>>>>>>>     </dependency>
>>>>>>>   </dependencies>
>>>>>>> </project>
>>>>>>>
>>>>>>> my exception:
>>>>>>>
>>>>>>> 5/05/13 15:00:48 WARN component.AbstractLifeCycle: FAILED
>>>>>>> org.eclipse.jetty.servlet.DefaultServlet-461261579:
>>>>>>> java.lang.NoSuchMethodError:
>>>>>>> org.eclipse.jetty.server.ResourceCache.<init>(Lorg/eclipse/jetty/http/MimeTypes;)V
>>>>>>> java.lang.NoSuchMethodError:
>>>>>>> org.eclipse.jetty.server.ResourceCache.<init>(Lorg/eclipse/jetty/http/MimeTypes;)V
>>>>>>>     at
>>>>>>> org.eclipse.jetty.servlet.NIOResourceCache.<init>(NIOResourceCache.java:41)
>>>>>>>     at
>>>>>>> org.eclipse.jetty.servlet.DefaultServlet.init(DefaultServlet.java:223)
>>>>>>>     at javax.servlet.GenericServlet.init(GenericServlet.java:244)
>>>>>>>     at
>>>>>>> org.eclipse.jetty.servlet.ServletHolder.initServlet(ServletHolder.java:442)
>>>>>>>     at
>>>>>>> org.eclipse.jetty.servlet.ServletHolder.doStart(ServletHolder.java:270)
>>>>>>>     at
>>>>>>> org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
>>>>>>>     at
>>>>>>> org.eclipse.jetty.servlet.ServletHandler.initialize(ServletHandler.java:721)
>>>>>>>     at
>>>>>>> org.eclipse.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:279)
>>>>>>>     at
>>>>>>> org.eclipse.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:717)
>>>>>>>     at
>>>>>>> org.eclipse.jetty.servlet.ServletContextHandler.doStart(ServletContextHandler.java:155)
>>>>>>>     at
>>>>>>> org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
>>>>>>>     at
>>>>>>> org.eclipse.jetty.server.handler.HandlerCollection.doStart(HandlerCollection.java:229)
>>>>>>>     at
>>>>>>> org.eclipse.jetty.server.handler.ContextHandlerCollection.doStart(ContextHandlerCollection.java:172)
>>>>>>>     at
>>>>>>> org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
>>>>>>>     at
>>>>>>> org.eclipse.jetty.server.handler.HandlerWrapper.doStart(HandlerWrapper.java:95)
>>>>>>>     at org.eclipse.jetty.server.Server.doStart(Server.java:282)
>>>>>>>     at
>>>>>>> org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
>>>>>>>     at
>>>>>>> org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:199)
>>>>>>>     at
>>>>>>> org.apache.spark.ui.JettyUtils$$anonfun$4.apply(JettyUtils.scala:209)
>>>>>>>     at
>>>>>>> org.apache.spark.ui.JettyUtils$$anonfun$4.apply(JettyUtils.scala:209)
>>>>>>>     at
>>>>>>> org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1454)
>>>>>>>     at
>>>>>>> scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
>>>>>>>     at
>>>>>>> org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1450)
>>>>>>>     at
>>>>>>> org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:209)
>>>>>>>     at org.apache.spark.ui.WebUI.bind(WebUI.scala:102)
>>>>>>>     at org.apache.spark.SparkContext.<init>(SparkContext.scala:224)
>>>>>>>     at
>>>>>>> org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:53)
>>>>>>>     at
>>>>>>> mgm.tp.bigdata.tempGeoKmeans.Spark.SparkMain.main(SparkMain.java:37)
>>>>>>> 15/05/13 15:00:48 WARN component.AbstractLifeCycle: FAILED
>>>>>>> o.e.j.s.ServletContextHandler{/static,null}: java.lang.NoSuchMethodError:
>>>>>>> org.eclipse.jetty.server.ResourceCache.<init>(Lorg/eclipse/jetty/http/MimeTypes;)V
>>>>>>> [the identical stack trace then repeats for
>>>>>>> org.eclipse.jetty.server.handler.ContextHandlerCollection@48f860cd and
>>>>>>> org.eclipse.jetty.server.Server@2a9b5828, and finally as the fatal
>>>>>>> 'Exception in thread "main" java.lang.NoSuchMethodError', each ending at
>>>>>>> mgm.tp.bigdata.tempGeoKmeans.Spark.SparkMain.main(SparkMain.java:37)]
>>>>>>>
>>>>>>> line 37 in my class SparkMain is: JavaSparkContext sc = new
>>>>>>> JavaSparkContext(conf);
>>>>>>>
>>>>>>>
>>>>>> --
>>>>>> This message was sent from my Android phone with K-9 Mail.
>>>>>>
>>>>>
>>>>>
>>>>
>>>>
>>>
>>
>
