ignite-dev mailing list archives

From "Николай Ижиков" <nizhikov....@gmail.com>
Subject Re: TC issues. IGNITE-3084. Spark Data Frame API
Date Wed, 29 Nov 2017 04:08:04 GMT
Valentin,

For now, the `Ignite RDD` build runs on jdk1.7.
We need to update it to jdk1.8.

Let me spell out all the version numbers to be clear:

1. Current master - Spark version is 2.1.0.
	So both `Ignite RDD` and `Ignite RDD 2.10` run OK on jdk1.7.

2. My branch -
	`Ignite RDD 2.10` - Spark version is 2.1.2 - runs OK on jdk1.7.
	`Ignite RDD` - Spark version is 2.2.0 - fails on jdk1.7, *has to be changed to run on jdk1.8* (see the sketch below).
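
To make that concrete on the build side, here is a minimal sketch of the split I mean (illustrative only; the property and profile names are my assumptions, not the actual ignite pom.xml): the default build pulls Spark 2.2.0 and therefore needs a Java 8 compiler level, while the 2.10 profile pins Spark 2.1.2 and can stay on Java 7.

```
<!-- Illustrative sketch only; property/profile names are assumptions, not the real ignite pom.xml. -->
<properties>
    <!-- Default build: Spark 2.2.0 dropped Java 7 support, so the compiler level must be 1.8. -->
    <spark.version>2.2.0</spark.version>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
</properties>

<profiles>
    <profile>
        <!-- 2.10 build: Spark 2.1.2 still supports Java 7, so jdk1.7 remains possible here. -->
        <id>scala-2.10</id>
        <properties>
            <spark.version>2.1.2</spark.version>
            <maven.compiler.source>1.7</maven.compiler.source>
            <maven.compiler.target>1.7</maven.compiler.target>
        </properties>
    </profile>
</profiles>
```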


On 29.11.2017 03:27, Valentin Kulichenko wrote:
> Nikolay,
> 
> If Spark requires Java 8, then I guess we have no choice. How is TC configured at the
> moment? My understanding is that Spark-related suites are successfully executed there, so
> is there an issue?
> 
> -Val
> 
> On Tue, Nov 28, 2017 at 2:42 AM, Николай Ижиков <nizhikov.dev@gmail.com> wrote:
> 
>     Hello, Valentin.
> 
>         Added '-Dscala-2.10' to the build config. Let me know if it helps.
> 
> 
>     Yes, it helps. Thank you!
>     Now, 'Ignite RDD spark 2_10' succeeds for my branch.
> 
> 
>         Do you mean that IgniteRDD does not compile on JDK7? If yes, do we know the reason?
>         I don't think switching it to JDK8 is a solution as it should work with both.
> 
> 
>     I mean that the latest version of Spark doesn't support jdk7.
> 
>     http://spark.apache.org/docs/latest/
> 
>     "Spark runs on Java 8+..."
>     "For the Scala API, Spark 2.2.0 uses Scala 2.11..."
>     "Note that support for Java 7... were removed as of Spark 2.2.0"
>     "Note that support for Scala 2.10 is deprecated..."
> 
>     Moreover, we can't have IgniteCatalog for Spark 2.1.
>     Please see my explanation in the JIRA ticket:
> 
>     https://issues.apache.org/jira/browse/IGNITE-3084?focusedCommentId=16268523&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-16268523
> 
>     Do you see any options to support jdk7 for the spark module?
> 
>     > I think all tests should be executed on TC. Can you check if they work and add
>     > them to corresponding suites?
> 
>     OK, I filed a ticket and will try to fix it shortly.
> 
>     https://issues.apache.org/jira/browse/IGNITE-7042
> 
>     On 28.11.2017 03:33, Valentin Kulichenko wrote:
> 
>         Hi Nikolay,
> 
>         Please see my responses inline.
> 
>         -Val
> 
>         On Fri, Nov 24, 2017 at 2:55 AM, Николай Ижиков <nizhikov.dev@gmail.com> wrote:
> 
>              Hello, guys.
> 
>              I have some issues on TC with my PR [1] for IGNITE-3084 (Spark Data Frame API).
>              Can you please help me:
> 
> 
>              1. `Ignite RDD spark 2_10` -
> 
>              Currently this build runs with the following profiles:
>              `-Plgpl,examples,scala-2.10,-clean-libs,-release` [2]
>              That means the `scala` profile is activated for `Ignite RDD spark 2_10` as well,
>              because `scala` activation is done like this [3]:
>              ```
>                           <activation>
>                               <property><name>!scala-2.10</name></property>
>                           </activation>
>              ```
> 
>              I think this is a misconfiguration, because scala (2.11) shouldn't be activated
>              for the 2.10 build. Am I missing something?
> 
>              Can someone edit the build properties?
>                       * Add `-scala` to the profiles list,
>                       * or add `-Dscala-2.10` to the JVM properties to turn off the `scala`
>                       profile in this build (see the sketch below).
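> 
>              A quick sketch of why either option works (profile ids as in [3], the rest is
>              illustrative only): the `scala` profile is active only while the `scala-2.10`
>              property is NOT defined, so defining it with `-Dscala-2.10`, or deactivating the
>              profile explicitly with `-scala`, turns it off:
> 
>              ```
>              <!-- Sketch, not the full ignite pom.xml: only the activation logic is shown. -->
>              <profiles>
>                  <profile>
>                      <!-- scala 2.11 build; active only while the property `scala-2.10` is
>                           NOT defined, so passing -Dscala-2.10 (or deactivating it explicitly
>                           via -P ...,-scala) turns it off -->
>                      <id>scala</id>
>                      <activation>
>                          <property><name>!scala-2.10</name></property>
>                      </activation>
>                  </profile>
>                  <profile>
>                      <!-- scala 2.10 build, enabled explicitly via -Pscala-2.10 -->
>                      <id>scala-2.10</id>
>                  </profile>
>              </profiles>
>              ```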
> 
> 
>         Added '-Dscala-2.10' to the build config. Let me know if it helps.
> 
> 
>              2. `Ignite RDD` -
> 
>              Currently this build runs on jvm7 [4].
>              As I wrote in my previous mail [5], the current version of Spark (2.2) runs
>              only on jvm8.
> 
>              Can someone edit the build properties to run it on jvm8?
> 
> 
>         Do you mean that IgniteRDD does not compile on JDK7? If yes, do we know the reason?
>         I don't think switching it to JDK8 is a solution as it should work with both.
> 
> 
>              3. For now, `Ignite RDD` and `Ignite RDD spark 2_10` only run the java tests
>              [6] existing in the `spark` module.
>              There are several existing tests written in scala (i.e. scala-test) that are
>              ignored on TC, IgniteRDDSpec [7] for example.
>              Are they turned off on purpose, or am I missing something?
>              Should we run the scala-tests for the spark and spark_2.10 modules?
>              (One possible wiring is sketched below.)
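> 
>              For reference, one possible way to run the scala-tests in the regular Maven `test`
>              phase (and therefore on TC) would be the scalatest-maven-plugin. This is only a
>              sketch of that idea, not how the `spark` module is currently wired:
> 
>              ```
>              <!-- Sketch only: one possible wiring via scalatest-maven-plugin. -->
>              <plugin>
>                  <groupId>org.scalatest</groupId>
>                  <artifactId>scalatest-maven-plugin</artifactId>
>                  <version>1.0</version>
>                  <executions>
>                      <execution>
>                          <id>scala-test</id>
>                          <!-- run ScalaTest suites (e.g. IgniteRDDSpec) in the regular test phase -->
>                          <phase>test</phase>
>                          <goals>
>                              <goal>test</goal>
>                          </goals>
>                      </execution>
>                  </executions>
>              </plugin>
>              ```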
> 
>         I think all tests should be executed on TC. Can you check if they work and add
>         them to corresponding suites?
> 
> 
>              [1] https://github.com/apache/ignite/pull/2742
>              [2] https://ci.ignite.apache.org/viewLog.html?buildId=960220&buildTypeId=Ignite20Tests_IgniteRddSpark210&tab=buildLog&_focus=379#_state=371
>              [3] https://github.com/apache/ignite/blob/master/pom.xml#L533
>              [4] https://ci.ignite.apache.org/viewLog.html?buildId=960221&buildTypeId=Ignite20Tests_IgniteRdd&tab=buildParameters
>              [5] http://apache-ignite-developers.2346864.n4.nabble.com/Integration-of-Spark-and-Ignite-Prototype-tp22649p23099.html
>              [6] https://ci.ignite.apache.org/viewLog.html?buildId=960220&buildTypeId=Ignite20Tests_IgniteRddSpark210&tab=testsInfo
>              [7] https://github.com/apache/ignite/blob/master/modules/spark/src/test/scala/org/apache/ignite/spark/IgniteRDDSpec.scala
> 
> 
> 
