polygene-dev mailing list archives

From "zhuangmz08" <zhuangm...@qq.com>
Subject Re: event streaming data feed?
Date Wed, 20 Apr 2016 10:04:02 GMT
OK.
"We can query an application's state to find out the current state of the world, and this
answers many questions. However there are times when we don't just want to see where we are,
we also want to know how we got there.

Event Sourcing ensures that all changes to application state are stored as a sequence of events.
Not just can we query these events, we can also use the event log to reconstruct past states,
and as a foundation to automatically adjust the state to cope with retroactive changes."
 So, event sourcing means storing all changes to an Entity's state as an ordered sequence of events.
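For concreteness, here is a minimal, library-independent sketch of that idea; the PriceChanged class and the replay loop below are only an illustration, not the eventsourcing library's API:

import java.time.LocalDateTime;
import java.util.ArrayList;
import java.util.List;

// Illustration only: an event log of price changes, replayed to rebuild state.
public class ReplayExample {

    static class PriceChanged {
        final double newPrice;
        final LocalDateTime at;

        PriceChanged(double newPrice, LocalDateTime at) {
            this.newPrice = newPrice;
            this.at = at;
        }
    }

    public static void main(String[] args) {
        List<PriceChanged> eventLog = new ArrayList<>();
        eventLog.add(new PriceChanged(10.0, LocalDateTime.now()));
        eventLog.add(new PriceChanged(10.5, LocalDateTime.now()));

        // Replaying = applying every stored event in order; the current state
        // is whatever the last event left behind.
        double currentPrice = 0.0;
        for (PriceChanged event : eventLog) {
            currentPrice = event.newPrice;
        }
        System.out.println("Reconstructed price: " + currentPrice);
    }
}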
My use case is stock price event streaming. If I use the concept of event sourcing,
can I regard StockPrice as an Entity, with <price, datetime> as the state?
import java.time.LocalDateTime;
import org.qi4j.api.entity.EntityComposite;
import org.qi4j.api.property.Property;

public interface StockPrice extends EntityComposite {
    Property<Double> price();
    Property<LocalDateTime> datetime();
}
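For reference, a minimal sketch of how that entity's state would be changed, assuming the plain Qi4j 2.x UnitOfWork API (the identity handling and application assembly are omitted):

import java.time.LocalDateTime;

import org.qi4j.api.structure.Module;
import org.qi4j.api.unitofwork.UnitOfWork;
import org.qi4j.api.unitofwork.UnitOfWorkCompletionException;

// Sketch only: each state change happens inside its own UnitOfWork.
public class StockPriceUpdater {

    private final Module module;

    public StockPriceUpdater(Module module) {
        this.module = module;
    }

    public void update(String identity, double newPrice)
        throws UnitOfWorkCompletionException {
        UnitOfWork uow = module.newUnitOfWork();
        try {
            StockPrice stock = uow.get(StockPrice.class, identity);
            stock.price().set(newPrice);
            stock.datetime().set(LocalDateTime.now());
            uow.complete();   // the change is persisted when the UoW completes
        } finally {
            if (uow.isOpen()) {
                uow.discard();  // roll back if complete() was not reached
            }
        }
    }
}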
1. Every time the state changes, will it store the change and notify the listeners at the same time?
Notification should happen with high priority, while storage could happen at a lower priority.
2. StockPrice data might be very large, many gigabytes every day. Can the [eventsourcing-jdbm]
library handle that volume of data?
3. I'm using the test code [org.qi4j.library.eventsourcing.domain.DomainEventTest]. Are only
change events stored, and not the creation event? It seems that each change stores only the
delta (a DomainEventValue, with the parameters in JSON format) instead of the whole new state.
How can I use the replayer? The tutorial is too short to follow...


Thanks a lot.


------------------ Original Message ------------------
From: "Niclas Hedhman" <hedhman@gmail.com>
Sent: Wednesday, 20 April 2016, 4:02 PM
To: "dev" <dev@zest.apache.org>

Subject: Re: event streaming data feed?



I don't know. library-eventsourcing was contributed from a downstream
project, and I haven't worked with it. And perhaps it is not in scope of
what you want to do... See Martin Fowler's and Greg Young's definition of
Event Sourcing.

For general event streaming, we are planning to have more explicit support
in 3.x, similar to the persistence support. But it is currently unclear
what core features and SPI are needed for this, and use cases are most
welcome.

There are many ways you could integrate a Kafka consumer or producer into
Zest. My guess would be that a service listens to Kafka system events and
the service creates/destroys Zest resources when needed.

Note that entities will not be able to be Kafka listeners, as you will not
be able to maintain a valid UnitOfWork while waiting.
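For illustration only, a rough sketch of that idea, assuming Qi4j 2.x services and the 0.9-era Kafka consumer API; the topic name, configuration and class names below are made up:

import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.qi4j.api.injection.scope.Structure;
import org.qi4j.api.mixin.Mixins;
import org.qi4j.api.service.ServiceActivation;
import org.qi4j.api.service.ServiceComposite;
import org.qi4j.api.structure.Module;
import org.qi4j.api.unitofwork.UnitOfWork;

// Sketch: a Zest service (not an entity) polls Kafka and opens a short
// UnitOfWork per record to create or update Zest resources.
@Mixins(StockFeedService.Mixin.class)
public interface StockFeedService extends ServiceActivation, ServiceComposite {

    class Mixin implements ServiceActivation {

        @Structure
        Module module;

        private volatile boolean running;
        private Thread poller;

        @Override
        public void activateService() {
            running = true;
            poller = new Thread(this::poll, "stock-feed-poller");
            poller.start();
        }

        @Override
        public void passivateService() throws Exception {
            running = false;
            poller.join();
        }

        private void poll() {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");   // assumption
            props.put("group.id", "stock-feed");                 // assumption
            props.put("key.deserializer",
                      "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer",
                      "org.apache.kafka.common.serialization.StringDeserializer");
            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Collections.singletonList("stock-prices")); // assumption
                while (running) {
                    ConsumerRecords<String, String> records = consumer.poll(1000);
                    for (ConsumerRecord<String, String> rec : records) {
                        UnitOfWork uow = module.newUnitOfWork();
                        try {
                            // turn rec.value() into a Zest entity update (elided)
                            uow.complete();
                        } catch (Exception e) {
                            if (uow.isOpen()) {
                                uow.discard();   // error handling elided
                            }
                        }
                    }
                }
            }
        }
    }
}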

Hope that helps
On Apr 20, 2016 13:06, "zhuangmz08" <zhuangmz08@qq.com> wrote:

> Hi,
> Is there any mechanism to use event streaming / data feeding with Zest?
> I'd like something publish-subscribe like Apache Kafka. I have a data
> feed server and a set of data feed subscribers.
> Each time a feed client subscribes to a new topic (a new Value/Entity
> Composite type), the data feed server will add this subscriber to its
> listener list. As soon as the topic event takes place, it will notify all
> the related listeners.
> 1. Is event sourcing able to subscribe/unsubscribe listeners on demand?
> 2. Does event sourcing have to store every event? I don't need to persist
> every event. In other words, one of the listeners will handle event
> storage, while the other listeners will focus on real-time event stream
> handling.
>
>
> Thanks a lot.