From: Stephan Ewen <ewenstephan@gmail.com>
To: dev@flink.apache.org
Date: Tue, 12 Jan 2016 12:21:46 +0100
Subject: Re: Naive question

@Chiwan: Is this still up to date from your experience?
https://ci.apache.org/projects/flink/flink-docs-release-0.10/internals/ide_setup.html

On Tue, Jan 12, 2016 at 12:04 PM, Chiwan Park wrote:

> Hi Ram,
>
> Because some Scala IDE (Eclipse) plugins are needed, I recommend
> avoiding the `mvn eclipse:eclipse` command. Could you try just running
> `mvn clean install -DskipTests` and importing the project into Scala
> IDE directly? In the middle of the import process, Scala IDE suggests
> the plugins it needs.
>
> And which version of Scala IDE are you using?
>
> > On Jan 12, 2016, at 7:58 PM, Vasudevan, Ramkrishna S <
> > ramkrishna.s.vasudevan@intel.com> wrote:
> >
> > Yes. I added it as a Maven project only. I ran `mvn eclipse:eclipse`
> > to create the project and also built the code using `mvn clean
> > install -DskipTests`.
> >
> > Regards
> > Ram
> >
> > -----Original Message-----
> > From: ewenstephan@gmail.com [mailto:ewenstephan@gmail.com] On Behalf
> > Of Stephan Ewen
> > Sent: Tuesday, January 12, 2016 4:10 PM
> > To: dev@flink.apache.org
> > Subject: Re: Naive question
> >
> > Sorry to hear that it did not work out with Eclipse in the end, even
> > with all the adjustments.
> >
> > Just making sure: you imported Flink as a Maven project, not by
> > manually adding the big Flink dependency JAR?
> >
> > On Tue, Jan 12, 2016 at 5:15 AM, Vasudevan, Ramkrishna S <
> > ramkrishna.s.vasudevan@intel.com> wrote:
> >
> >> Thanks to all. I tried the Scala Eclipse IDE together with
> >> `change-scala-version.sh`, but in vain.
> >>
> >> So I switched over to IntelliJ, and things work fine over there. I
> >> am new to IntelliJ, so I will try using it.
> >>
> >> Once again, thanks for helping me out.
> >>
> >> Regards
> >> Ram
> >>
> >> -----Original Message-----
> >> From: Chiwan Park [mailto:chiwanpark@apache.org]
> >> Sent: Monday, January 11, 2016 4:37 PM
> >> To: dev@flink.apache.org
> >> Subject: Re: Naive question
> >>
> >> Hi Ram,
> >>
> >> If you want to build Flink with Scala 2.10, just check out the Flink
> >> repository from GitHub or download the source code from the
> >> homepage, run `mvn clean install -DskipTests`, and import the
> >> projects into your IDE. If you want to build Flink with Scala 2.11,
> >> you have to run `tools/change-scala-version.sh 2.11` before building
> >> the project. You can revert the Scala version change by running
> >> `tools/change-scala-version.sh 2.10`.
> >>
> >> About the IDE: the Flink community recommends IntelliJ IDEA because
> >> Scala IDE has some problems with Java/Scala mixed projects like
> >> Flink. But I tested importing the Flink project with Scala IDE
> >> 4.3.0, Scala 2.11.7, and the Flink 0.10.0 source code. Note that you
> >> should import the project as a Maven project.
> >>
> >> By the way, the community welcomes any questions. Please feel free
> >> to post questions. :)
> >>
> >>> On Jan 11, 2016, at 7:30 PM, Vasudevan, Ramkrishna S <
> >>> ramkrishna.s.vasudevan@intel.com> wrote:
> >>>
> >>> Thank you very much for the reply.
> >>> I tried different ways. When I tried setting the Scala version
> >>> properties in the root pom.xml to 2.11 (scala.version 2.11.6,
> >>> scala.binary.version 2.11), I got the following error:
> >>>
> >>> [INFO] ------------------------------------------------------------------------
> >>> [ERROR] Failed to execute goal on project flink-scala: Could not
> >>> resolve dependencies for project
> >>> org.apache.flink:flink-scala:jar:1.0-SNAPSHOT: Could not find
> >>> artifact org.scalamacros:quasiquotes_2.11:jar:2.0.1 in central
> >>> (http://repo.maven.apache.org/maven2) -> [Help 1]
> >>>
> >>> If I leave scala.binary.version at 2.10 and scala.version at
> >>> 2.11.6, then I get the following problem:
> >>>
> >>> [INFO] C:\flink\flink\flink-runtime\src\test\scala:-1: info: compiling
> >>> [INFO] Compiling 366 source files to
> >>> C:\flink\flink\flink-runtime\target\test-classes at 1452508064750
> >>> [ERROR] C:\flink\flink\flink-runtime\src\test\scala\org\apache\flink\runtime\jobmanager\JobManagerITCase.scala:700:
> >>> error: can't expand macros compiled by previous versions of Scala
> >>> [ERROR] assert(cachedGraph2.isArchived)
> >>> [ERROR] ^
> >>>
> >>> So I am not sure how to proceed with this. If I try to change the
> >>> Scala version to 2.10 in the IDE, I get a lot of compilation
> >>> issues. Is there any way to overcome this?
> >>>
> >>> Once again, thanks a lot, and apologies for the naïve question.
> >>>
> >>> Regards
> >>> Ram
> >>>
> >>> -----Original Message-----
> >>> From: ewenstephan@gmail.com [mailto:ewenstephan@gmail.com] On
> >>> Behalf Of Stephan Ewen
> >>> Sent: Friday, January 8, 2016 5:01 PM
> >>> To: dev@flink.apache.org
> >>> Subject: Re: Naive question
> >>>
> >>> Hi!
> >>>
> >>> This looks like a mismatch between the Scala dependency in Flink
> >>> and the Scala in your Eclipse. Make sure you use the same for both.
> >>> By default, Flink references Scala 2.10.
> >>>
> >>> If your IDE is set up for Scala 2.11, set the Scala version
> >>> variable in the Flink root pom.xml to 2.11 as well.
> >>>
> >>> Greetings,
> >>> Stephan
> >>>
> >>> On Fri, Jan 8, 2016 at 12:06 PM, Vasudevan, Ramkrishna S <
> >>> ramkrishna.s.vasudevan@intel.com> wrote:
> >>>
> >>>> I have been trying to install, learn, and understand Flink. I am
> >>>> using the Scala Eclipse IDE as my IDE.
> >>>>
> >>>> I have downloaded the Flink source code, compiled it, and created
> >>>> the project.
> >>>>
> >>>> My work laptop is Windows-based and I don't have an Eclipse-based
> >>>> workstation, but I do have Linux boxes for running and testing
> >>>> things.
> >>>>
> >>>> Some of the examples in the Flink source code run directly from
> >>>> Eclipse, but when I try to run the WordCount example from Eclipse
> >>>> I get this error:
> >>>>
> >>>> Exception in thread "main" java.lang.NoSuchMethodError:
> >>>> scala.collection.immutable.HashSet$.empty()Lscala/collection/immutable/HashSet;
> >>>>     at akka.actor.ActorCell$.<init>(ActorCell.scala:336)
> >>>>     at akka.actor.ActorCell$.<clinit>(ActorCell.scala)
> >>>>     at akka.actor.RootActorPath.$div(ActorPath.scala:159)
> >>>>     at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:464)
> >>>>     at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:452)
> >>>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> >>>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
> >>>>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
> >>>>     at java.lang.reflect.Constructor.newInstance(Unknown Source)
> >>>>     at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
> >>>>     at scala.util.Try$.apply(Try.scala:191)
> >>>>     at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
> >>>>     at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
> >>>>     at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
> >>>>     at scala.util.Success.flatMap(Try.scala:230)
> >>>>     at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
> >>>>     at akka.actor.ActorSystemImpl.liftedTree1$1(ActorSystem.scala:585)
> >>>>     at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:578)
> >>>>     at akka.actor.ActorSystem$.apply(ActorSystem.scala:142)
> >>>>     at akka.actor.ActorSystem$.apply(ActorSystem.scala:119)
> >>>>     at akka.actor.ActorSystem$.create(ActorSystem.scala:67)
> >>>>     at org.apache.flink.runtime.akka.AkkaUtils$.createActorSystem(AkkaUtils.scala:84)
> >>>>     at org.apache.flink.runtime.minicluster.FlinkMiniCluster.startJobManagerActorSystem(FlinkMiniCluster.scala:196)
> >>>>     at org.apache.flink.runtime.minicluster.FlinkMiniCluster.singleActorSystem$lzycompute$1(FlinkMiniCluster.scala:225)
> >>>>     at org.apache.flink.runtime.minicluster.FlinkMiniCluster.org$apache$flink$runtime$minicluster$FlinkMiniCluster$$singleActorSystem$1(FlinkMiniCluster.scala:225)
> >>>>     at org.apache.flink.runtime.minicluster.FlinkMiniCluster$$anonfun$1.apply(FlinkMiniCluster.scala:230)
> >>>>     at org.apache.flink.runtime.minicluster.FlinkMiniCluster$$anonfun$1.apply(FlinkMiniCluster.scala:228)
> >>>>     at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
> >>>>     at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
> >>>>     at scala.collection.immutable.Range.foreach(Range.scala:166)
> >>>>     at scala.collection.TraversableLike$class.map(TraversableLike.scala:245)
> >>>>     at scala.collection.AbstractTraversable.map(Traversable.scala:104)
> >>>>     at org.apache.flink.runtime.minicluster.FlinkMiniCluster.start(FlinkMiniCluster.scala:228)
> >>>>     at org.apache.flink.runtime.minicluster.FlinkMiniCluster.start(FlinkMiniCluster.scala:219)
> >>>>     at org.apache.flink.streaming.api.environment.LocalStreamEnvironment.execute(LocalStreamEnvironment.java:104)
> >>>>     at org.apache.flink.streaming.examples.wordcount.WordCount.main(WordCount.java:80)
> >>>>
> >>>> I know this is a naïve question, but I would like some help to
> >>>> overcome this issue. I tried various options, like setting
> >>>> scala-2.10 as the compiler for the project (then it shows a
> >>>> completely different error), and many of the projects don't even
> >>>> compile. But with the 2.11 version I get the above stack trace.
> >>>> Any help here is welcome.
> >>>>
> >>>> Regards
> >>>> Ram
> >>>>
> >>
> >> Regards,
> >> Chiwan Park
> >>
>
> Regards,
> Chiwan Park
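The recurring theme in this thread is that `scala.version` and `scala.binary.version` in the POMs must be changed together, which is what `tools/change-scala-version.sh` takes care of. As a rough illustration of that idea only, here is a minimal sketch that rewrites both properties of a toy pom.xml with `sed`; the POM fragment, file layout, and version numbers are made-up assumptions, not Flink's actual script or POM.

```shell
#!/bin/sh
# Sketch: switch the Scala version recorded in a toy pom.xml, keeping
# scala.version and scala.binary.version consistent (changing only one
# of them reproduces the macro/quasiquotes errors seen in this thread).
set -eu

workdir=$(mktemp -d)
cd "$workdir"

# A made-up minimal POM fragment with the two Scala properties.
cat > pom.xml <<'EOF'
<project>
  <properties>
    <scala.version>2.10.4</scala.version>
    <scala.binary.version>2.10</scala.binary.version>
  </properties>
</project>
EOF

# Rewrite both properties from 2.10.x to 2.11 in one pass.
sed -i.bak \
  -e 's|<scala.version>2\.10\.[0-9]*</scala.version>|<scala.version>2.11.6</scala.version>|' \
  -e 's|<scala.binary.version>2\.10</scala.binary.version>|<scala.binary.version>2.11</scala.binary.version>|' \
  pom.xml

# Show the switched properties.
grep '<scala' pom.xml
```

The real script also has to handle the `_2.10`/`_2.11` suffixes on artifact ids across many module POMs, so treat this only as a sketch of the property swap, not a replacement for running `tools/change-scala-version.sh`.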