flink-user mailing list archives

From "Lopez, Javier" <javier.lo...@zalando.de>
Subject Exception using flink-connector-elasticsearch
Date Tue, 12 Jan 2016 10:55:51 GMT
Hi,

We are using the Elasticsearch sink, and when we try to run our job we
get the following exception:

java.lang.ExceptionInInitializerError Caused by:
java.lang.IllegalArgumentException: An SPI class of type
org.apache.lucene.codecs.Codec with name 'Lucene410' does not exist.  You
need to add the corresponding JAR file supporting this SPI to your
classpath.  The current classpath supports the following names: []

We are using embedded nodes, and we don't know if we are missing some
configuration for the Elasticsearch client. This is the code we are using:

Map<String, String> config = Maps.newHashMap();
// Flush after every element so results are visible immediately (testing only).
config.put("bulk.flush.max.actions", "1");
// Must match the cluster.name of the running Elasticsearch node.
config.put("cluster.name", "flink-test");

result.addSink(new ElasticsearchSink<>(config,
        new IndexRequestBuilder<Tuple4<String, Double, Long, Double>>() {
    @Override
    public org.elasticsearch.action.index.IndexRequest createIndexRequest(
            Tuple4<String, Double, Long, Double> element, RuntimeContext ctx) {
        Map<String, Object> json = new HashMap<>();
        json.put("data", element);
        return org.elasticsearch.client.Requests.indexRequest()
                .index("stream_test_1")
                .type("aggregation_test")
                .source(json);
    }
}));

Both the Flink server and the Elasticsearch server run on the same local
machine.
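In case the packaging matters: the connector is pulled into the job along
these lines (a sketch only; the exact version in our build may differ, and
the artifact name is the one the Flink documentation gives for the
Elasticsearch 1.x connector):

```xml
<!-- Hypothetical dependency declaration for the ES 1.x connector;
     version 0.10.1 is an assumption, not necessarily what we use. -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-elasticsearch</artifactId>
    <version>0.10.1</version>
</dependency>
```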

Thanks for your help
