ignite-user mailing list archives

From Alexey Kukushkin <kukushkinale...@gmail.com>
Subject Re: How to do 'stream processing' and more questions of a Ignite newbie
Date Wed, 06 Dec 2017 10:56:52 GMT

What you described sounds like an Event Processing architecture: a
never-ending stream of input data, a limited time window, analysis of the
data within that window, and taking action when necessary.

Ignite supports Event Processing architecture with the following components:

   - Data Streamers <https://apacheignite.readme.io/docs/data-streamers> -
   to stream endless data into Ignite. One of them is the Kafka streamer
   that you are already familiar with.
   - Data Expiry Policy
   <https://apacheignite.readme.io/docs/expiry-policies> - to define a
   limited time window on the endless stream of data. Thus, answering your
   question "4": you do not need to remove data manually. Define the time
   window using an expiry policy and Ignite will take care of removing it.
   - Continuous Query
      - Remote Filter - analyses events on the server side to decide
      whether you want to act on them.
      - Local Listener - implements the action to be called when an event
      passes the remote filter.
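A minimal sketch of how these pieces might fit together. The cache name, the threshold filter and the generated values are made up for illustration; this assumes Ignite is on the classpath and a node can be started locally:

```java
import java.util.concurrent.TimeUnit;

import javax.cache.event.CacheEntryEventFilter;
import javax.cache.expiry.CreatedExpiryPolicy;
import javax.cache.expiry.Duration;

import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.IgniteDataStreamer;
import org.apache.ignite.Ignition;
import org.apache.ignite.cache.query.ContinuousQuery;
import org.apache.ignite.configuration.CacheConfiguration;

public class EventProcessingSketch {
    public static void main(String[] args) {
        try (Ignite ignite = Ignition.start()) {
            // Expiry policy defines the time window: entries are removed
            // automatically 60 seconds after creation.
            CacheConfiguration<Long, Double> cfg =
                new CacheConfiguration<Long, Double>("measurements")
                    .setExpiryPolicyFactory(CreatedExpiryPolicy.factoryOf(
                        new Duration(TimeUnit.SECONDS, 60)));

            IgniteCache<Long, Double> cache = ignite.getOrCreateCache(cfg);

            ContinuousQuery<Long, Double> qry = new ContinuousQuery<>();

            // Remote filter: runs on the server nodes; only events it
            // accepts are sent to the local listener.
            qry.setRemoteFilterFactory(() ->
                (CacheEntryEventFilter<Long, Double>) evt -> evt.getValue() > 100.0);

            // Local listener: your action, invoked for events that
            // passed the remote filter.
            qry.setLocalListener(evts -> evts.forEach(evt ->
                System.out.println("Threshold exceeded: " + evt.getValue())));

            cache.query(qry);

            // Data streamer: feeds the endless input into the cache.
            try (IgniteDataStreamer<Long, Double> streamer =
                     ignite.dataStreamer(cache.getName())) {
                for (long i = 0; i < 1000; i++)
                    streamer.addData(i, Math.random() * 200);
            }
        }
    }
}
```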

Answering your question "3": you have to collocate data. There are two data
collocation APIs - using AffinityKey as a key type, or using
@AffinityKeyMapped if you prefer the annotation (declarative) style. You
want "GPS and Acceleration points that share the same measurementId and
deviceId located on the same node". Thus, you could create a type
MeasurementKey { deviceId, measurementId } and use that type as an affinity
key for both GPS and AccelerationPoint. See examples here
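For instance, a sketch using the annotation style - the field and class names (MeasurementKey, GpsPointKey) are hypothetical, based on your description:

```java
import java.io.Serializable;
import java.util.Objects;

import org.apache.ignite.cache.affinity.AffinityKeyMapped;

// Shared affinity key: entries with equal MeasurementKey values are
// stored in the same partition, and therefore on the same node.
class MeasurementKey implements Serializable {
    private final String deviceId;
    private final String measurementId;

    MeasurementKey(String deviceId, String measurementId) {
        this.deviceId = deviceId;
        this.measurementId = measurementId;
    }

    @Override public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof MeasurementKey)) return false;
        MeasurementKey k = (MeasurementKey)o;
        return Objects.equals(deviceId, k.deviceId)
            && Objects.equals(measurementId, k.measurementId);
    }

    @Override public int hashCode() {
        return Objects.hash(deviceId, measurementId);
    }
}

// Cache key for a GPS point; AccelerationPoint would use the same
// pattern. The annotated field drives partition assignment, so GPS and
// acceleration points of one measurement end up on one node.
class GpsPointKey implements Serializable {
    private final long pointId;

    @AffinityKeyMapped
    private final MeasurementKey measurement;

    GpsPointKey(long pointId, MeasurementKey measurement) {
        this.pointId = pointId;
        this.measurement = measurement;
    }

    @Override public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof GpsPointKey)) return false;
        GpsPointKey k = (GpsPointKey)o;
        return pointId == k.pointId
            && Objects.equals(measurement, k.measurement);
    }

    @Override public int hashCode() {
        return Objects.hash(pointId, measurement);
    }
}
```

With such keys, affinity-aware operations (collocated compute, collocated SQL joins) can work on one measurement's data without moving it across the network.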
