nifi-dev mailing list archives

From "Peter Wicks (pwicks)" <pwi...@micron.com>
Subject RE: [EXT] Re: [DISCUSS] Time based release cycles
Date Tue, 05 Nov 2019 20:06:19 GMT
I feel like most users ask, "When is version x coming out?" because they don't want to, or can't,
do a build themselves, and they really want to use the new features.

I know it's a completely different direction from where I think your question was pointing
Pierre, but I wonder how many users would be OK with a nightly build binary? Many other Apache
projects provide nightly builds, including JMeter, Ignite, Ant, Cordova, Solr, and OpenOffice.
This would also make it easier for users to provide feedback sooner on changes, as they could
just grab a pre-built binary.

Thanks,
  Peter

-----Original Message-----
From: Russell Bateman <russ@windofkeltia.com> 
Sent: Tuesday, November 5, 2019 8:39 AM
To: dev@nifi.apache.org
Subject: [EXT] Re: [DISCUSS] Time based release cycles

Kafka is first-rate, rock-star technology, just as is NiFi.

It would be nice to find something from Kafka elaborating on how this regular and accelerated
release cadence is working out for them, how much more work it's been, what problems they've
experienced, etc.

I show their releases over the last couple of years as below[1]. The cadence appears to be
settling into the 4-month cycle proposed. It's possible to discern a maintenance schedule.
It doesn't exactly match NiFi's 0.x and 1.x efforts (which were simultaneous for some time
too), but it's clear they've faced similar complexity (maybe a little more, though over a
shorter time). And, of course, there's no meaningful way to compare the effort going into,
and the features implemented in, Kafka with those of NiFi.

2019
2.3.1    24 October
2.3.0    25 June
2.2.1     1 June
2.2.0    22 March
2.1.1    15 February

2018
2.1.0    20 November
2.0.1     9 November
2.0.0    30 July
1.1.1    19 July
1.0.2     8 July
0.11.0.3  2 July
0.10.2.2  2 July
1.1.0    28 March
1.0.1     5 March

2017
1.0.0     1 November
0.11.0.1 13 September
0.11.0.0 28 June
.
.
.

[1] https://kafka.apache.org/downloads

On 11/5/19 8:02 AM, Pierre Villard wrote:
> Hi NiFi dev community,
>
> We just released NiFi 1.10 and that's an amazing release with a LOT of 
> great new features. Congrats to everyone!
>
> I wanted to take this opportunity to bring a discussion around how 
> often we're doing releases.
>
> We released 1.10.0 yesterday and we released 1.9.0 in February, that's 
> around 8 months between the two releases. And if we take 1.9.2, 
> released early April, that's about 7 months.
>
> I acknowledge that doing releases is really up to the committers and 
> anyone can take the lead to perform this process. However, we often 
> have people asking (on the mailing lists or elsewhere) when 
> the next release will be. I'm wondering if it would make sense to 
> think about something a bit more "planned" by doing time based releases.
>
> The Apache Kafka community wrote a nice summary of the pros/cons about 
> such an approach [1] and it definitely adds more work to the 
> committers with more frequent releases. I do, however, think that it'd 
> ease the adoption of NiFi, its deployment and the dynamism in PR/code review.
>
> I'm just throwing the idea here and I'm genuinely curious about what 
> you think about this approach.
>
> [1]
> https://cwiki.apache.org/confluence/display/KAFKA/Time+Based+Release+Plan
>
> Thanks,
> Pierre
>
