flink-dev mailing list archives

From Chiwan Park <chiwanp...@apache.org>
Subject Re: Naive question
Date Tue, 12 Jan 2016 11:36:33 GMT
You can import the project directly by using the "File -> Import -> Maven -> Existing Maven Projects" menu.

> On Jan 12, 2016, at 8:32 PM, Vasudevan, Ramkrishna S <ramkrishna.s.vasudevan@intel.com>
wrote:
> 
> 
> This is Scala IDE Release 4.4.0. So without doing mvn eclipse:eclipse, how do you import the project directly?
> 
> Regards
> Ram
> -----Original Message-----
> From: Chiwan Park [mailto:chiwanpark@apache.org] 
> Sent: Tuesday, January 12, 2016 4:54 PM
> To: dev@flink.apache.org
> Subject: Re: Naive question
> 
> Because I tested with Scala IDE 4.3.0 only, the process in the documentation differs slightly from my experience.
> 
>> On Jan 12, 2016, at 8:21 PM, Stephan Ewen <sewen@apache.org> wrote:
>> 
>> @Chiwan: Is this still up to date from your experience?
>> 
>> https://ci.apache.org/projects/flink/flink-docs-release-0.10/internals
>> /ide_setup.html
>> 
>> On Tue, Jan 12, 2016 at 12:04 PM, Chiwan Park <chiwanpark@apache.org> wrote:
>> 
>>> Hi Ram,
>>> 
>>> Because some Scala IDE (Eclipse) plugins are needed, I 
>>> recommend avoiding the `mvn eclipse:eclipse` command. Could you try just 
>>> running `mvn clean install -DskipTests` and importing the project into Scala 
>>> IDE directly? In the middle of the import process, Scala IDE suggests the needed plugins.
>>> 
>>> And which version of Scala IDE are you using?
>>> 
>>>> On Jan 12, 2016, at 7:58 PM, Vasudevan, Ramkrishna S <
>>> ramkrishna.s.vasudevan@intel.com> wrote:
>>>> 
>>>> Yes. I added it as a Maven project only. I did mvn eclipse:eclipse to create the project and also built the code using mvn clean install -DskipTests.
>>>> 
>>>> Regards
>>>> Ram
>>>> 
>>>> -----Original Message-----
>>>> From: ewenstephan@gmail.com [mailto:ewenstephan@gmail.com] On Behalf Of Stephan Ewen
>>>> Sent: Tuesday, January 12, 2016 4:10 PM
>>>> To: dev@flink.apache.org
>>>> Subject: Re: Naive question
>>>> 
>>>> Sorry to hear that it did not work out with Eclipse at all in the end, even with all adjustments.
>>>> 
>>>> Just making sure: you imported Flink as a Maven project, not by manually adding the big Flink dependency JAR?
>>>> 
>>>> On Tue, Jan 12, 2016 at 5:15 AM, Vasudevan, Ramkrishna S <
>>> ramkrishna.s.vasudevan@intel.com> wrote:
>>>> 
>>>>> Thanks to all. I tried the Scala Eclipse IDE with all these 'change-scala-version.sh' steps, but in vain.
>>>>> 
>>>>> So I switched over to IntelliJ and things work fine over there. I am new to IntelliJ so will try using it.
>>>>> 
>>>>> Once again thanks for helping me out.
>>>>> 
>>>>> Regards
>>>>> Ram
>>>>> 
>>>>> -----Original Message-----
>>>>> From: Chiwan Park [mailto:chiwanpark@apache.org]
>>>>> Sent: Monday, January 11, 2016 4:37 PM
>>>>> To: dev@flink.apache.org
>>>>> Subject: Re: Naive question
>>>>> 
>>>>> Hi Ram,
>>>>> 
>>>>> If you want to build Flink with Scala 2.10, just check out the Flink 
>>>>> repository from GitHub or download the source code from the homepage, run 
>>>>> `mvn clean install -DskipTests`, and import the projects into your IDE. If 
>>>>> you want to build Flink with Scala 2.11, you have to run 
>>>>> `tools/change-scala-version.sh 2.11` before building the project. You 
>>>>> can revert the Scala version change by running 
>>>>> `tools/change-scala-version.sh 2.10`.
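
The version-switching workflow described above can be sketched as a shell session. This is only a sketch under assumptions: the GitHub mirror URL is not stated in the thread, and the commands reflect the Flink 0.10/1.0-era source layout discussed here.

```shell
# Sketch of the Scala-version workflow from the thread (assumptions:
# GitHub mirror URL; Flink source layout of the 0.10/1.0 era).
git clone https://github.com/apache/flink.git
cd flink

# Default build uses Scala 2.10:
mvn clean install -DskipTests

# To build against Scala 2.11, rewrite the poms first ...
tools/change-scala-version.sh 2.11
mvn clean install -DskipTests

# ... and revert the pom changes when done:
tools/change-scala-version.sh 2.10
```

After the build finishes, the project can be imported into the IDE as a Maven project, as described elsewhere in the thread.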
>>>>> 
>>>>> About IDEs, the Flink community recommends IntelliJ IDEA because Scala 
>>>>> IDE has some problems with Java/Scala mixed projects like Flink. But 
>>>>> I tested importing the Flink project with Scala IDE 4.3.0, Scala 2.11.7 
>>>>> and Flink 0.10.0 source code. Note that you should import the 
>>>>> project as a Maven project.
>>>>> 
>>>>> By the way, the community welcomes any questions. Please feel free 
>>>>> to post questions. :)
>>>>> 
>>>>>> On Jan 11, 2016, at 7:30 PM, Vasudevan, Ramkrishna S <
>>>>> ramkrishna.s.vasudevan@intel.com> wrote:
>>>>>> 
>>>>>> Thank you very much for the reply.
>>>>>> I tried different ways, and when I tried setting the root pom.xml to 2.11:
>>>>>> 
>>>>>>            <scala.version>2.11.6</scala.version>
>>>>>>            <scala.binary.version>2.11</scala.binary.version>
>>>>>> 
>>>>>> I got the following error
>>>>>> [INFO] ------------------------------------------------------------------------
>>>>>> [ERROR] Failed to execute goal on project flink-scala: Could not resolve dependencies for project org.apache.flink:flink-scala:jar:1.0-SNAPSHOT: Could not find artifact org.scalamacros:quasiquotes_2.11:jar:2.0.1 in central (http://repo.maven.apache.org/maven2) -> [Help 1]
>>>>>> 
>>>>>> If I leave scala.binary.version at 2.10 and scala.version at 2.11.6, then I get the following problem:
>>>>>> [INFO] C:\flink\flink\flink-runtime\src\test\scala:-1: info: compiling
>>>>>> [INFO] Compiling 366 source files to C:\flink\flink\flink-runtime\target\test-classes at 1452508064750
>>>>>> [ERROR] C:\flink\flink\flink-runtime\src\test\scala\org\apache\flink\runtime\jobmanager\JobManagerITCase.scala:700: error: can't expand macros compiled by previous versions of Scala
>>>>>> [ERROR]               assert(cachedGraph2.isArchived)
>>>>>> [ERROR]                                   ^
>>>>>> 
>>>>>> So I am not quite sure how to proceed with this. If I try to change the version of Scala to 2.10 in the IDE, then I get a lot of compilation issues. Is there any way to overcome this?
>>>>>> 
>>>>>> Once again thanks a lot and apologies for the naïve question.
>>>>>> 
>>>>>> Regards
>>>>>> Ram
>>>>>> -----Original Message-----
>>>>>> From: ewenstephan@gmail.com [mailto:ewenstephan@gmail.com] On 
>>>>>> Behalf Of Stephan Ewen
>>>>>> Sent: Friday, January 8, 2016 5:01 PM
>>>>>> To: dev@flink.apache.org
>>>>>> Subject: Re: Naive question
>>>>>> 
>>>>>> Hi!
>>>>>> 
>>>>>> This looks like a mismatch between the Scala dependency in Flink and the Scala in your Eclipse. Make sure you use the same version for both. By default, Flink references Scala 2.10.
>>>>>> 
>>>>>> If your IDE is set up for Scala 2.11, set the Scala version variables in the Flink root pom.xml to 2.11 as well.
>>>>>> 
>>>>>> Greetings,
>>>>>> Stephan
>>>>>> 
>>>>>> 
>>>>>> 
>>>>>> 
>>>>>> On Fri, Jan 8, 2016 at 12:06 PM, Vasudevan, Ramkrishna S <
>>>>> ramkrishna.s.vasudevan@intel.com> wrote:
>>>>>> 
>>>>>>> I have been trying to install, learn and understand Flink. I am using Scala-Eclipse IDE as my IDE.
>>>>>>> 
>>>>>>> I have downloaded the Flink source code, compiled it and created the project.
>>>>>>> 
>>>>>>> My work laptop is Windows based and I don't have an Eclipse-based workstation, but I do have Linux boxes for running and testing things.
>>>>>>> 
>>>>>>> Some of the examples given in the Flink source code do run directly from Eclipse, but when I try to run the WordCount example from Eclipse I get this error:
>>>>>>> 
>>>>>>> Exception in thread "main" java.lang.NoSuchMethodError: scala.collection.immutable.HashSet$.empty()Lscala/collection/immutable/HashSet;
>>>>>>>   at akka.actor.ActorCell$.<init>(ActorCell.scala:336)
>>>>>>>   at akka.actor.ActorCell$.<clinit>(ActorCell.scala)
>>>>>>>   at akka.actor.RootActorPath.$div(ActorPath.scala:159)
>>>>>>>   at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:464)
>>>>>>>   at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:452)
>>>>>>>   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>>>>>>   at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
>>>>>>>   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
>>>>>>>   at java.lang.reflect.Constructor.newInstance(Unknown Source)
>>>>>>>   at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
>>>>>>>   at scala.util.Try$.apply(Try.scala:191)
>>>>>>>   at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
>>>>>>>   at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
>>>>>>>   at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
>>>>>>>   at scala.util.Success.flatMap(Try.scala:230)
>>>>>>>   at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
>>>>>>>   at akka.actor.ActorSystemImpl.liftedTree1$1(ActorSystem.scala:585)
>>>>>>>   at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:578)
>>>>>>>   at akka.actor.ActorSystem$.apply(ActorSystem.scala:142)
>>>>>>>   at akka.actor.ActorSystem$.apply(ActorSystem.scala:119)
>>>>>>>   at akka.actor.ActorSystem$.create(ActorSystem.scala:67)
>>>>>>>   at org.apache.flink.runtime.akka.AkkaUtils$.createActorSystem(AkkaUtils.scala:84)
>>>>>>>   at org.apache.flink.runtime.minicluster.FlinkMiniCluster.startJobManagerActorSystem(FlinkMiniCluster.scala:196)
>>>>>>>   at org.apache.flink.runtime.minicluster.FlinkMiniCluster.singleActorSystem$lzycompute$1(FlinkMiniCluster.scala:225)
>>>>>>>   at org.apache.flink.runtime.minicluster.FlinkMiniCluster.org$apache$flink$runtime$minicluster$FlinkMiniCluster$$singleActorSystem$1(FlinkMiniCluster.scala:225)
>>>>>>>   at org.apache.flink.runtime.minicluster.FlinkMiniCluster$$anonfun$1.apply(FlinkMiniCluster.scala:230)
>>>>>>>   at org.apache.flink.runtime.minicluster.FlinkMiniCluster$$anonfun$1.apply(FlinkMiniCluster.scala:228)
>>>>>>>   at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
>>>>>>>   at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
>>>>>>>   at scala.collection.immutable.Range.foreach(Range.scala:166)
>>>>>>>   at scala.collection.TraversableLike$class.map(TraversableLike.scala:245)
>>>>>>>   at scala.collection.AbstractTraversable.map(Traversable.scala:104)
>>>>>>>   at org.apache.flink.runtime.minicluster.FlinkMiniCluster.start(FlinkMiniCluster.scala:228)
>>>>>>>   at org.apache.flink.runtime.minicluster.FlinkMiniCluster.start(FlinkMiniCluster.scala:219)
>>>>>>>   at org.apache.flink.streaming.api.environment.LocalStreamEnvironment.execute(LocalStreamEnvironment.java:104)
>>>>>>>   at org.apache.flink.streaming.examples.wordcount.WordCount.main(WordCount.java:80)
>>>>>>> 
>>>>>>> I know this is a naïve question, but I would like to get some help in order to overcome this issue. I tried various options, like setting scala-2.10 as the compiler for the project (then it shows a completely different error, and many of the projects don't even compile). But with the 2.11 version I get the above stack trace. Any help here is welcome.
>>>>>>> 
>>>>>>> Regards
>>>>>>> Ram
>>>>>>> 
>>>>> 
>>>>> Regards,
>>>>> Chiwan Park
>>>>> 
>>> 
>>> Regards,
>>> Chiwan Park
>>> 
> 
> 
> Regards,
> Chiwan Park

Regards,
Chiwan Park


