ignite-user mailing list archives

From Richard Pelavin <rich.pela...@gmail.com>
Subject Error using spark (sharedRDD.savePairs(rdd)) with Ignite 1.5.0, but not ignite 1.4.0
Date Tue, 17 Nov 2015 04:23:23 GMT
I am running spark-shell (Spark 1.5.1) against a standalone Spark cluster, and I see
the error at the bottom of this post on the Spark workers when I execute
'sharedRDD.savePairs(rdd)' in the shell.
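
For context, the session boils down to something like the sketch below (the cache
name, the default IgniteConfiguration, and the toy data are placeholders rather than
the exact code; the launch flags are omitted here and shown further down):

# Feed a short session into spark-shell; the Ignite jars must already be on the classpath.
./bin/spark-shell <<'SCALA'
import org.apache.ignite.configuration.IgniteConfiguration
import org.apache.ignite.spark.IgniteContext

// IgniteContext wraps the SparkContext (sc) that spark-shell provides.
val ic = new IgniteContext[Int, Int](sc, () => new IgniteConfiguration())

// IgniteRDD view over an Ignite cache; "partitioned" is a placeholder cache name.
val sharedRDD = ic.fromCache("partitioned")

// Build a small pair RDD and write it into the cache -- this is the call that fails.
val rdd = sc.parallelize(1 to 1000).map(i => (i, i))
sharedRDD.savePairs(rdd)
SCALA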


It works fine when I use Ignite 1.4.0.

This might be due to how I start things. For Ignite 1.4.0 I use:

./bin/spark-shell \
  --packages org.apache.ignite:ignite-spark:1.4.0 \
  --repositories http://www.gridgainsystems.com/nexus/content/repositories/external \
  --jars /usr/lib/ignite/libs/ignite-query-objects.jar

while for 1.5.0, which I built from source, I use:

BASE="/usr/lib/ignite/libs"
VER="1.5.0-SNAPSHOT"
IGNITE_JARS="${BASE}/ignite-query-objects.jar"
IGNITE_JARS="${IGNITE_JARS}:${BASE}/ignite-core-${VER}.jar"
IGNITE_JARS="${IGNITE_JARS}:${BASE}/cache-api-1.0.0.jar"
IGNITE_JARS="${IGNITE_JARS}:${BASE}/ignite-shmem-1.0.0.jar"
IGNITE_JARS="${IGNITE_JARS}:${BASE}/ignite-spring/*"
IGNITE_JARS="${IGNITE_JARS}:${BASE}/ignite-indexing/*"
IGNITE_JARS="${IGNITE_JARS}:${BASE}/optional/ignite-log4j/*"
IGNITE_JARS="${IGNITE_JARS}:${BASE}/optional/ignite-spark/*"
./bin/spark-shell --jars $IGNITE_JARS

and I get the error below.
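
One thing I have not been able to rule out: spark-shell's --jars option expects a
comma-separated list of jar paths, so the colon-joined string above (and the
unexpanded wildcard entries) may never make it onto the executor classpath, which
would fit the missing IgniteRDD$$anonfun$savePairs$1 class. A rough sketch of the
comma-joined form -- the exact jar names under the wildcard directories are
assumptions, not verified:

# Sketch only: same jars as above, joined with commas (what --jars expects) and
# with the wildcard entries expanded to explicit jar paths.
shopt -s nullglob
BASE="/usr/lib/ignite/libs"
VER="1.5.0-SNAPSHOT"
IGNITE_JARS="${BASE}/ignite-query-objects.jar"
IGNITE_JARS="${IGNITE_JARS},${BASE}/ignite-core-${VER}.jar"
IGNITE_JARS="${IGNITE_JARS},${BASE}/cache-api-1.0.0.jar"
IGNITE_JARS="${IGNITE_JARS},${BASE}/ignite-shmem-1.0.0.jar"
# Expand each wildcard directory into explicit jar paths, comma-joined.
for d in ignite-spring ignite-indexing optional/ignite-log4j optional/ignite-spark; do
  for j in "${BASE}/${d}"/*.jar; do
    IGNITE_JARS="${IGNITE_JARS},${j}"
  done
done
./bin/spark-shell --jars "$IGNITE_JARS"

(--packages also searches the local Maven repository, so installing the snapshot
build with mvn install and reusing the --packages form from the 1.4.0 command might
be another route -- untested here.)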

Any pointers would be appreciated.
Thanks,
Rich
---

Error stack:

15/11/16 19:54:34 ERROR TaskSetManager: Task 0 in stage 2.0 failed 4 times; aborting job
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 2.0 failed 4 times, most recent failure: Lost task 0.3 in stage 2.0 (TID 7, 10.0.0.206): java.lang.ClassNotFoundException: org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1
        at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:69)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:278)
        at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)
        at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
        at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:72)
        at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:98)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
        at org.apache.spark.scheduler.Task.run(Task.scala:88)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.apache.ignite.spark.IgniteRDD$$anonfun$savePairs$1
        at java.lang.ClassLoader.findClass(ClassLoader.java:531)
        at org.apache.spark.util.ParentClassLoader.findClass(ParentClassLoader.scala:26)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.scala:34)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.scala:30)
        at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:64)
        ... 30 more
