ignite-user mailing list archives

From F7753 <mabiao...@foxmail.com>
Subject Re: Should Ignition.start() method to be called in a spark-igniteRDD app?
Date Sat, 02 Apr 2016 03:38:26 GMT
I added 'Ignition.start()' as the first line of my 'main' method, but I still
got the same error. I also noticed that the way I start Ignite in the Spark
cluster seems wrong: there are more Ignite instances than I expected. My
cluster has 3 worker nodes, and I already start 3 Ignite workers with
'${IGNITE_HOME}/bin/ignite.sh', but after adding 'Ignition.start()' the
instance count became 5, which doesn't seem right:
------------------------------------------------------------------------------------------------------------------
16/04/02 11:31:42 INFO AppClient$ClientEndpoint: Executor updated: app-20160402113142-0000/2 is now RUNNING
16/04/02 11:31:42 INFO AppClient$ClientEndpoint: Executor updated: app-20160402113142-0000/1 is now RUNNING
16/04/02 11:31:42 INFO AppClient$ClientEndpoint: Executor updated: app-20160402113142-0000/0 is now RUNNING
16/04/02 11:31:42 INFO SparkDeploySchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
[11:31:51] Topology snapshot [ver=5, servers=5, clients=0, CPUs=96, heap=100.0GB]
[11:31:51] Topology snapshot [ver=6, servers=4, clients=0, CPUs=96, heap=53.0GB]
[11:31:57] Topology snapshot [ver=7, servers=5, clients=0, CPUs=96, heap=100.0GB]
[11:31:57] Topology snapshot [ver=8, servers=4, clients=0, CPUs=96, heap=53.0GB]
[11:32:12] Topology snapshot [ver=9, servers=5, clients=0, CPUs=96, heap=100.0GB]
[11:32:12] Topology snapshot [ver=10, servers=4, clients=0, CPUs=96, heap=53.0GB]
[11:32:14] Topology snapshot [ver=11, servers=5, clients=0, CPUs=96, heap=100.0GB]
[11:32:14] Topology snapshot [ver=12, servers=4, clients=0, CPUs=96, heap=53.0GB]
-------------------------------------------------------------------------------------------------------------------
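If I understand the docs right, constructing an IgniteContext without client mode starts an embedded Ignite server node inside each Spark executor, which would explain why the server count climbs past my 3 standalone workers. Here is a minimal sketch of what I think the client-mode variant looks like (the variable names are placeholders, not my real code, and I am going from my reading of the Ignite 1.x spark module API):

```scala
import org.apache.ignite.configuration.IgniteConfiguration
import org.apache.ignite.spark.IgniteContext

// Sketch only: have the executor-side nodes join the topology as clients
// so they don't add to the server count in the topology snapshots.
val igniteContext = new IgniteContext[String, String](
  sc,                                        // the app's SparkContext (assumed in scope)
  () => new IgniteConfiguration().setClientMode(true)
)
```

Please correct me if this is not how IgniteContext is supposed to be used here.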
The full stack trace is listed below; this doesn't seem easy to solve:
-------------------------------------------------------------------------------------------------------------------
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, nobida145): class org.apache.ignite.IgniteIllegalStateException: Ignite instance with provided name doesn't exist. Did you call Ignition.start(..) to start an Ignite instance? [name=null]
	at org.apache.ignite.internal.IgnitionEx.grid(IgnitionEx.java:1235)
	at org.apache.ignite.Ignition.ignite(Ignition.java:516)
	at org.apache.ignite.spark.IgniteContext.ignite(IgniteContext.scala:150)
	at org.apache.ignite.spark.IgniteContext$$anonfun$1.apply$mcVI$sp(IgniteContext.scala:58)
	at org.apache.ignite.spark.IgniteContext$$anonfun$1.apply(IgniteContext.scala:58)
	at org.apache.ignite.spark.IgniteContext$$anonfun$1.apply(IgniteContext.scala:58)
	at scala.collection.Iterator$class.foreach(Iterator.scala:727)
	at org.apache.spark.InterruptibleIterator.foreach(InterruptibleIterator.scala:28)
	at org.apache.spark.rdd.RDD$$anonfun$foreach$1$$anonfun$apply$32.apply(RDD.scala:912)
	at org.apache.spark.rdd.RDD$$anonfun$foreach$1$$anonfun$apply$32.apply(RDD.scala:912)
	at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
	at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
	at org.apache.spark.scheduler.Task.run(Task.scala:89)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)

Driver stacktrace:
	at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1431)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1419)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1418)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
	at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1418)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
	at scala.Option.foreach(Option.scala:236)
	at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:799)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1640)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1599)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1588)
	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:620)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1832)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1845)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1858)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1929)
	at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:912)
	at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:910)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
	at org.apache.spark.rdd.RDD.foreach(RDD.scala:910)
	at org.apache.ignite.spark.IgniteContext.<init>(IgniteContext.scala:58)
	at main.scala.StreamingJoin$.main(StreamingJoin.scala:183)
	at main.scala.StreamingJoin.main(StreamingJoin.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: class org.apache.ignite.IgniteIllegalStateException: Ignite instance with provided name doesn't exist. Did you call Ignition.start(..) to start an Ignite instance? [name=null]
	at org.apache.ignite.internal.IgnitionEx.grid(IgnitionEx.java:1235)
	at org.apache.ignite.Ignition.ignite(Ignition.java:516)
	at org.apache.ignite.spark.IgniteContext.ignite(IgniteContext.scala:150)
	at org.apache.ignite.spark.IgniteContext$$anonfun$1.apply$mcVI$sp(IgniteContext.scala:58)
	at org.apache.ignite.spark.IgniteContext$$anonfun$1.apply(IgniteContext.scala:58)
	at org.apache.ignite.spark.IgniteContext$$anonfun$1.apply(IgniteContext.scala:58)
	at scala.collection.Iterator$class.foreach(Iterator.scala:727)
	at org.apache.spark.InterruptibleIterator.foreach(InterruptibleIterator.scala:28)
	at org.apache.spark.rdd.RDD$$anonfun$foreach$1$$anonfun$apply$32.apply(RDD.scala:912)
	at org.apache.spark.rdd.RDD$$anonfun$foreach$1$$anonfun$apply$32.apply(RDD.scala:912)
	at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
	at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
	at org.apache.spark.scheduler.Task.run(Task.scala:89)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
-----------------------------------------------------------------------------------------------------------------

Here is my Ignite config file's content:
-----------------------------------------------------------------------------------------------------------------
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
                           http://www.springframework.org/schema/beans/spring-beans.xsd">
    <bean class="org.apache.ignite.configuration.IgniteConfiguration"/>
</beans>
-----------------------------------------------------------------------------------------------------------------
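If the extra server nodes really are the embedded ones started on the executors, then (as far as I can tell from the docs, so this is an assumption on my part) the configuration above could mark the node as a client via the clientMode property, something like:

```xml
<bean class="org.apache.ignite.configuration.IgniteConfiguration">
    <!-- Assumption: join the topology as a client so this node
         is not counted among the servers. -->
    <property name="clientMode" value="true"/>
</bean>
```

Though I am not sure whether the standalone workers and the executor-side nodes should then use separate config files.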
I created *two caches*:
-----------------------------------------------------------------------------------------------------------------
   
    Ignition.start()
    ... 
    // fetch the small table data
    ...
    // ignite small table cache
    val smallTableCache = smallTableContext.fromCache(smallTableCacheCfg)
    ...
    smallTableCache.savePairs(smallTableCols_rdd)
    ...
    // fetch the source table data
    ...
    // ignite source table cache
    val sourceTableCache = streamTableContext.fromCache(sourceTableCacheCfg)
    ...
    // sourceTableCols rdd save to cache
    sourceTableCache.savePairs(sourceTableCols_rdd)
    ...
    val res = sourceTableCache.sql(queryString)
-------------------------------------------------------------------------------------------------------------------
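Reading the [name=null] message again, my understanding is that Ignition.ignite() on an executor looks up the default (unnamed) instance, which was never started in that JVM, since Ignition.start() in main() only runs on the driver. So maybe IgniteContext is meant to start the node itself, e.g. by pointing it at the same Spring XML file the standalone nodes use (sketch only; the path and variable name are placeholders, not my real code):

```scala
import org.apache.ignite.spark.IgniteContext

// Sketch: let IgniteContext start/locate the Ignite node on each executor
// from a shared Spring config, instead of calling Ignition.start() myself.
val igniteContext = new IgniteContext[String, String](sc, "path/to/ignite-config.xml")
```

Is that the intended usage, or is an explicit Ignition.start() still needed somewhere?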



--
View this message in context: http://apache-ignite-users.70518.x6.nabble.com/Should-Ignition-start-method-to-be-called-in-a-spark-igniteRDD-app-tp3854p3873.html
Sent from the Apache Ignite Users mailing list archive at Nabble.com.
