distributedlog-dev mailing list archives

From Leigh Stewart <lstew...@twitter.com.INVALID>
Subject Re: hundreds of millions of streams?
Date Fri, 28 Oct 2016 14:11:39 GMT
DL is not able to handle hundreds of millions of streams. 10^5-10^6 is probably
ok.
ZK is probably the biggest challenge (we are looking at ways to eliminate it,
since we would like to scale to 10^6-10^7 in the not-too-distant future),
but hundreds of millions is so far beyond what we've worked with that there
would likely be other scaling challenges on the way to that point.
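To make the ZK concern concrete: ZooKeeper keeps its entire data tree in memory, so per-stream metadata znodes put a ceiling on stream count. A back-of-envelope sketch, using an assumed (not measured) figure of ~1 KB of znode data per stream:

```python
# Back-of-envelope sketch: ZooKeeper holds its whole data tree in memory,
# so per-stream metadata bounds the stream count. The 1 KB/stream figure
# below is an assumption for illustration, not a measured number.
BYTES_PER_STREAM_METADATA = 1024  # assumed ~1 KB of znodes per stream

for streams in (10**5, 10**6, 10**8):
    gib = streams * BYTES_PER_STREAM_METADATA / 2**30
    print(f"{streams:>11,} streams ~ {gib:,.1f} GiB of ZK heap")
```

Under that assumption, 10^5-10^6 streams costs well under 1 GiB of ZK heap, while 10^8 streams would need on the order of 95 GiB — far past what a ZooKeeper ensemble is normally run with.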

On Fri, Oct 28, 2016 at 5:56 AM, Poule Dodue <pouledodue@hotmail.com> wrote:

> In Event Sourcing, we need to have 1 stream per entity/aggregate so for
> a typical prod system it means we need hundreds of millions of streams.
>
> Is DL able to handle that, or is it limited to, say, a few hundred thousand
> streams?
>
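The stream-per-aggregate convention the question refers to can be sketched like this (a minimal in-memory illustration of the pattern, not DistributedLog code; all names are made up):

```python
# Hypothetical sketch of event sourcing's stream-per-aggregate convention:
# each aggregate gets its own append-only stream, so the stream count grows
# linearly with the number of entities.
from collections import defaultdict


class EventStore:
    def __init__(self):
        # stream name -> ordered list of events
        self.streams = defaultdict(list)

    @staticmethod
    def stream_name(agg_type, agg_id):
        # conventional naming: "<aggregateType>-<aggregateId>"
        return f"{agg_type}-{agg_id}"

    def append(self, agg_type, agg_id, event):
        self.streams[self.stream_name(agg_type, agg_id)].append(event)


store = EventStore()
store.append("account", 1, "AccountOpened")
store.append("account", 1, "MoneyDeposited")  # same stream as above
store.append("account", 2, "AccountOpened")
store.append("order", 7, "OrderPlaced")
print(len(store.streams))  # → 3 (one stream per aggregate)
```

Three aggregates yield three streams, which is why a system with hundreds of millions of entities implies hundreds of millions of streams.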
