ignite-dev mailing list archives

From Richard Siebeling <rsiebel...@gmail.com>
Subject Re: Support for Spark 2.0
Date Mon, 19 Sep 2016 19:20:33 GMT
I wish someone could fix this; unfortunately I can't do anything about it
in the near future...
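
For anyone who needs an interim workaround: the compile failures below all trace back to org.apache.spark.Logging, which Spark 2.0 removed from its public API. A rough, untested sketch of a local stand-in trait (assuming slf4j-api on the classpath, which spark-core already pulls in) would be something like:

```scala
// Hypothetical stand-in for the removed org.apache.spark.Logging trait.
// Untested sketch, not the official fix for IGNITE-3596; it only covers
// the members the compiler errors actually mention (logInfo, logError).
import org.slf4j.{Logger, LoggerFactory}

trait Logging {
  // Lazy + transient so the trait stays serializable-friendly,
  // mirroring how the old Spark trait behaved.
  @transient private lazy val log: Logger =
    LoggerFactory.getLogger(getClass.getName.stripSuffix("$"))

  protected def logInfo(msg: => String): Unit =
    if (log.isInfoEnabled) log.info(msg)

  protected def logError(msg: => String, e: Throwable): Unit =
    log.error(msg, e)
}
```

IgniteContext.scala would then mix in this local trait instead of importing Logging from org.apache.spark. No idea whether the Ignite committers would want this approach upstream, so treat it as a local patch only.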

On Mon, Sep 19, 2016 at 3:20 PM, Phadnis, Varun <phadnis@sky.optymyze.com>
wrote:

> Hello,
>
> Just to be clear, I am not explicitly building the Hadoop edition. I get
> the described error when I simply specify the Spark version:
>
>   mvn clean package -Dspark.version=2.0.0 -DskipTests
>
> Thanks for the quick response!
>
>
> -----Original Message-----
> From: Sergey Kozlov [mailto:skozlov@gridgain.com]
> Sent: 19 September 2016 05:56
> To: dev@ignite.apache.org
> Subject: Re: Support for Spark 2.0
>
> Hi
>
> It's a known issue IGNITE-3596 Hadoop edition can't be compiled against
> spark 2.0.0 <https://issues.apache.org/jira/browse/IGNITE-3596>
>
> Unfortunately, there's been no progress yet.
>
> On Mon, Sep 19, 2016 at 1:36 PM, Phadnis, Varun <phadnis@sky.optymyze.com>
> wrote:
>
> > Hello,
> >
> > Can someone please tell me when support for Spark 2.0 is planned?
> >
> > Currently Ignite cannot be built against Spark 2.0. Attempting this
> > yields the following error:
> >
> >
> > ------------------------------------------------------------------------
> > [INFO] Building ignite-spark 1.8.0-SNAPSHOT
> > [INFO]
> > ------------------------------------------------------------------------
> > [INFO]
> > [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ ignite-spark ---
> > [INFO] Deleting /home/spark/code/ignite/modules/spark/target
> > [INFO]
> > [INFO] --- flatten-maven-plugin:1.0.0-beta-3:clean (flatten.clean.before) @ ignite-spark ---
> > [INFO] Deleting /home/spark/code/ignite/modules/spark/pom-installed.xml
> > [INFO]
> > [INFO] --- maven-enforcer-plugin:1.4:enforce (default) @ ignite-spark ---
> > [INFO]
> > [INFO] --- maven-remote-resources-plugin:1.5:process (default) @ ignite-spark ---
> > [INFO]
> > [INFO] --- maven-remote-resources-plugin:1.5:process (ignite-dependencies) @ ignite-spark ---
> > [INFO]
> > [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ ignite-spark ---
> > [INFO] Using 'UTF-8' encoding to copy filtered resources.
> > [INFO] skip non existing resourceDirectory /home/spark/code/ignite/modules/spark/src/main/resources
> > [INFO] Copying 4 resources
> > [INFO] Copying 4 resources
> > [INFO]
> > [INFO] --- flatten-maven-plugin:1.0.0-beta-3:flatten (flatten) @ ignite-spark ---
> > [INFO] Generating flattened POM of project org.apache.ignite:ignite-spark:jar:1.8.0-SNAPSHOT...
> > [WARNING] Ignoring multiple XML header comment!
> > [INFO]
> > [INFO] --- scala-maven-plugin:3.2.0:add-source (scala-compile-first) @ ignite-spark ---
> > [INFO] Add Source directory: /home/spark/code/ignite/modules/spark/src/main/scala
> > [INFO] Add Test Source directory: /home/spark/code/ignite/modules/spark/src/test/scala
> > [INFO]
> > [INFO] --- scala-maven-plugin:3.2.0:compile (scala-compile-first) @ ignite-spark ---
> > [WARNING]  Expected all dependencies to require Scala version: 2.11.7
> > [WARNING]  org.apache.ignite:ignite-spark:1.8.0-SNAPSHOT requires scala version: 2.11.7
> > [WARNING]  com.twitter:chill_2.11:0.8.0 requires scala version: 2.11.7
> > [WARNING]  org.apache.spark:spark-core_2.11:2.0.0 requires scala version: 2.11.8
> > [WARNING] Multiple versions of scala libraries detected!
> > [INFO] /home/spark/code/ignite/modules/spark/src/main/scala:-1: info: compiling
> > [INFO] Compiling 8 source files to /home/spark/code/ignite/modules/spark/target/classes at 1474028222935
> > [ERROR] /home/spark/code/ignite/modules/spark/src/main/scala/org/apache/ignite/spark/IgniteContext.scala:25: error: object Logging is not a member of package org.apache.spark
> > [ERROR] import org.apache.spark.{Logging, SparkContext}
> > [ERROR]        ^
> > [ERROR] /home/spark/code/ignite/modules/spark/src/main/scala/org/apache/ignite/spark/IgniteContext.scala:37: error: not found: type Logging
> > [ERROR]     ) extends Serializable with Logging {
> > [ERROR]                                 ^
> > [ERROR] /home/spark/code/ignite/modules/spark/src/main/scala/org/apache/ignite/spark/IgniteContext.scala:50: error: not found: value logInfo
> > [ERROR]         logInfo("Will start Ignite nodes on " + workers + " workers")
> > [ERROR]         ^
> > [ERROR] /home/spark/code/ignite/modules/spark/src/main/scala/org/apache/ignite/spark/IgniteContext.scala:129: error: not found: value logInfo
> > [ERROR]             logInfo("Setting IGNITE_HOME from driver not as it is not available on this worker: " + igniteHome)
> > [ERROR]             ^
> > [ERROR] /home/spark/code/ignite/modules/spark/src/main/scala/org/apache/ignite/spark/IgniteContext.scala:146: error: not found: value logError
> > [ERROR]                 logError("Failed to start Ignite.", e)
> > [ERROR]                 ^
> > [ERROR] /home/spark/code/ignite/modules/spark/src/main/scala/org/apache/ignite/spark/IgniteContext.scala:164: error: not found: value logInfo
> > [ERROR]                 logInfo("Will stop Ignite nodes on " + workers + " workers")
> > [ERROR]                 ^
> > [ERROR] 6 errors found
> > [INFO]
> > ------------------------------------------------------------------------
> >
> > Thanks!
> >
>
>
>
> --
> Sergey Kozlov
> GridGain Systems
> www.gridgain.com
>
