spark-user mailing list archives

From Ali Gouta <ali.go...@gmail.com>
Subject Re: How do I link JavaEsSpark.saveToEs() to a sparkConf?
Date Mon, 14 Dec 2015 13:03:13 GMT
You don't need an explicit association between JavaEsSpark and the
SparkConf.
Actually, once you apply transformations/filters/... through your "sc", you
can store the final RDD in your Elasticsearch cluster. Example:

JavaRDD<Map<String, ?>> generatedRDD = sc.parallelize(SOME_STUFF);
JavaEsSpark.saveToEs(generatedRDD, "foo");
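
For illustration, here is a minimal end-to-end sketch in Java (assuming the
elasticsearch-hadoop artifact is on the classpath; the "foo/docs" resource and
the document fields are made up for this example). The point is that saveToEs
reads the ES settings through the RDD's own SparkContext, so no explicit
wiring is needed:

import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.elasticsearch.spark.rdd.api.java.JavaEsSpark;

public class Indexer {
    public static void main(String[] args) {
        // the ES settings live on the SparkConf; saveToEs picks them up
        // from the SparkContext behind the RDD
        SparkConf conf = new SparkConf().setMaster("local").setAppName("Indexer");
        conf.set("es.index.auto.create", "true");
        conf.set("es.nodes", "localhost"); // use the node's real address if ES is remote
        conf.set("es.port", "9200");

        JavaSparkContext sc = new JavaSparkContext(conf);

        // an RDD of Maps (or JavaBeans) is what JavaEsSpark serializes to documents
        Map<String, Object> doc = new HashMap<>();
        doc.put("message", "hello");
        JavaRDD<Map<String, Object>> rdd = sc.parallelize(Arrays.asList(doc));

        // "foo/docs" follows the usual <index>/<type> resource format
        JavaEsSpark.saveToEs(rdd, "foo/docs");

        sc.stop();
    }
}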

That's it...
Lastly, be careful when defining the settings in your "conf". For instance,
you may need to replace "localhost" with the real IP address of your
Elasticsearch node...
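
For example (a sketch; the address below is a placeholder, not a real host):

// "localhost" only reaches the ES node when Elasticsearch runs on the same
// machine as the Spark driver; otherwise point es.nodes at a reachable address
conf.set("es.nodes", "10.0.0.12"); // placeholder for your node's actual address
conf.set("es.port", "9200");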

Ali Gouta.

On Mon, Dec 14, 2015 at 1:52 PM, Spark Enthusiast <sparkenthusiast@yahoo.in>
wrote:

> Folks,
>
> I have the following program :
>
> SparkConf conf = new SparkConf().setMaster("local")
> .setAppName("Indexer").set("spark.driver.maxResultSize", "2g");
> conf.set("es.index.auto.create", "true");
> conf.set("es.nodes", "localhost");
> conf.set("es.port", "9200");
> conf.set("es.write.operation", "index");
> JavaSparkContext sc = new JavaSparkContext(conf);
>
>           .
>           .
>
> JavaEsSpark.saveToEs(filteredFields, "foo");
>
> I get an error saying cannot find storage. It looks like the driver program
> cannot reach the Elasticsearch server. Looking at the program, I have not
> associated JavaEsSpark with the SparkConf.
>
> Question: How do I associate JavaEsSpark to SparkConf?
>
