flink-user mailing list archives

From "Lopez, Javier" <javier.lo...@zalando.de>
Subject Re: Exception using flink-connector-elasticsearch
Date Thu, 14 Jan 2016 13:58:16 GMT
Hi,

Thanks Aljoscha, adding those libraries solved the problem. It worked perfectly!

On 12 January 2016 at 14:03, Aljoscha Krettek <aljoscha.krettek@gmail.com>
wrote:

> Hi,
> could you please try adding the lucene-core-4.10.4.jar file to the lib
> folder of your Flink installation? (
> https://repo1.maven.org/maven2/org/apache/lucene/lucene-core/4.10.4/)
> Elasticsearch uses dependency injection to resolve these classes, and Maven
> is not really aware of this.
>
> You could also add lucene-codecs-4.10.4.jar (
> https://repo1.maven.org/maven2/org/apache/lucene/lucene-codecs/4.10.4/).
>
> Cheers,
> Aljoscha
> > On 12 Jan 2016, at 11:55, Lopez, Javier <javier.lopez@zalando.de> wrote:
> >
> > Hi,
> >
> > We are using the Elasticsearch sink, and when we try to run our job we
> > get the following exception:
> >
> > java.lang.ExceptionInInitializerError
> > Caused by: java.lang.IllegalArgumentException: An SPI class of type
> > org.apache.lucene.codecs.Codec with name 'Lucene410' does not exist.  You
> > need to add the corresponding JAR file supporting this SPI to your
> > classpath.  The current classpath supports the following names: []
> >
> > We are using embedded nodes, and we don't know if we are missing some
> > configuration for the Elasticsearch client. This is the code we are using:
> >
> > Map<String, String> config = Maps.newHashMap();
> > config.put("bulk.flush.max.actions", "1");
> > config.put("cluster.name", "flink-test");
> >
> > result.addSink(new ElasticsearchSink<>(config,
> >         new IndexRequestBuilder<Tuple4<String, Double, Long, Double>>() {
> >     @Override
> >     public org.elasticsearch.action.index.IndexRequest createIndexRequest(
> >             Tuple4<String, Double, Long, Double> element, RuntimeContext ctx) {
> >         Map<String, Object> json = new HashMap<>();
> >         json.put("data", element);
> >         return org.elasticsearch.client.Requests.indexRequest()
> >                 .index("stream_test_1")
> >                 .type("aggregation_test")
> >                 .source(json);
> >     }
> > }));
> >
> > The Flink server and the Elasticsearch server run on the same local
> > machine.
> >
> > Thanks for your help
>
>
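For readers hitting the same error: the "SPI class ... does not exist" message comes from Java's service-provider lookup. Lucene discovers codecs through META-INF/services entries on the classpath, and those entries can be missing when a jar is left out of the lib folder (or dropped during shading). A minimal java.util.ServiceLoader sketch (the Codec interface here is a hypothetical stand-in, not Lucene's class) reproduces the same "nothing registered" condition:

```java
import java.util.ServiceLoader;

public class SpiDemo {
    // Stand-in for org.apache.lucene.codecs.Codec; purely illustrative.
    public interface Codec {}

    public static void main(String[] args) {
        // ServiceLoader scans META-INF/services/<interface-name> files on the
        // classpath. With no such file present, iteration yields nothing --
        // the same condition that makes Lucene report
        // "The current classpath supports the following names: []".
        ServiceLoader<Codec> loader = ServiceLoader.load(Codec.class);
        int count = 0;
        for (Codec codec : loader) {
            count++;
        }
        System.out.println("codecs found: " + count);
    }
}
```

Putting lucene-core-4.10.4.jar in Flink's lib folder adds the jar (and its service registrations) to the classpath of every task manager, which is why the fix above works.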
