ignite-user mailing list archives

From: what0124 <j.mendoza0...@gmail.com>
Subject: Re: Running Spark app in cluster!
Date: Wed, 18 Jan 2017 13:21:28 GMT
Yes, I pass the example XML, like this:

import org.apache.ignite.spark.IgniteContext
import org.apache.spark.{SparkConf, SparkContext}

object RDDProducer extends App {
  val conf = new SparkConf().setAppName("SparkIgnitePro")
  val sc = new SparkContext(conf)

  // IgniteContext started from the example Spring XML configuration.
  val ic = new IgniteContext(sc, "../examples/config/example-cache.xml")

  // Write (i, 1) pairs into the "a" cache.
  val sharedRDD = ic.fromCache[Integer, Integer]("a")
  val data = Array(1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 2, 3, 4, 5, 6, 7, 8, 9, 0)
  sharedRDD.savePairs(sc.parallelize(data, 10).map(i => (i, 1)))

  // Count the separate "test" cache.
  val testRDD = ic.fromCache[Integer, Integer]("test")
  println("First COUNT is:::::::" + testRDD.count())
}

object RDDConsumer extends App {
  val conf = new SparkConf().setAppName("SparkIgniteCon")
  val sc = new SparkContext(conf)
  val ic = new IgniteContext(sc, "../examples/config/example-cache.xml")

  // Read the "test" cache back and count its entries.
  val sharedRDD = ic.fromCache[Integer, Integer]("test")
  println("The count is:::::::::::: " + sharedRDD.count())
}
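
For comparison, IgniteContext can also be constructed with a () => IgniteConfiguration closure instead of a Spring XML path, so the cache can be defined in code. The sketch below is only illustrative and is not the code from this thread; the object name RDDProducerProgrammatic, the "test" cache name, and the app name are assumptions carried over from the snippets above.

import org.apache.ignite.configuration.{CacheConfiguration, IgniteConfiguration}
import org.apache.ignite.spark.IgniteContext
import org.apache.spark.{SparkConf, SparkContext}

// Illustrative sketch: build the Ignite configuration in code instead of
// loading ../examples/config/example-cache.xml. The "test" cache name is
// assumed so it matches the cache read by RDDConsumer above.
object RDDProducerProgrammatic extends App {
  val sc = new SparkContext(new SparkConf().setAppName("SparkIgnitePro"))

  val ic = new IgniteContext(sc, () => {
    val cacheCfg = new CacheConfiguration[Integer, Integer]("test")
    new IgniteConfiguration().setCacheConfiguration(cacheCfg)
  })

  // Write into and count the same cache, so the count reflects the writes.
  val sharedRDD = ic.fromCache[Integer, Integer]("test")
  sharedRDD.savePairs(sc.parallelize(1 to 9, 3).map(i => (Integer.valueOf(i), Integer.valueOf(1))))
  println("COUNT is: " + sharedRDD.count())
}

Whichever way the configuration is supplied, the producer and consumer only see the same data if they call fromCache with the same cache name.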

--
View this message in context: http://apache-ignite-users.70518.x6.nabble.com/Running-Spark-app-in-cluster-tp10073p10117.html
Sent from the Apache Ignite Users mailing list archive at Nabble.com.
