mina-dev mailing list archives

From Trustin Lee (이희승) <trus...@gmail.com>
Subject Fwd: Some benchmarks ...
Date Sun, 12 Oct 2008 15:17:33 GMT
I don't know why this guy is making noise on the Grizzly mailing list
while staying silent on mina-dev.  This thread doesn't belong there,
IMO:

http://www.nabble.com/Some-benchmarks-...-td19839605.html

>> A performance test report is always annoying to some extent,
>> regardless of whether the names of the frameworks are revealed or not.
>
> So either you do a benchmark and you do it providing all the needed
> information, or you don't.  I don't see any problem in giving the
> names when it comes to OSS projects.  And I don't see how annoying
> that can be.

Well, some people find an anonymized test report a good idea, but you
don't seem to.  You can start the test yourself in less than 30
minutes.  I wouldn't call it annoying without sweating a single drop
myself.

If you really want to know which one MINA is, it's D, the poorest
performer.  I could also tell you how poorly MINA does in the
asynchronous echo test, but I'll leave that as homework for you.
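For readers who want to take up that homework: an echo test simply measures how fast a server can write received bytes back to the client.  Here is a minimal blocking sketch using plain java.net sockets as a stand-in — the class and method names are my own illustration, not part of the actual benchmark harness or any of the frameworks discussed:

```java
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;

// Minimal synchronous echo round-trip timer (illustrative only).
public class EchoBench {

    // Runs `messages` round trips of `size`-byte payloads against an
    // in-process echo server and returns the elapsed time in milliseconds.
    static long runBench(int messages, int size) throws Exception {
        ServerSocket server = new ServerSocket(0); // bind to any free port
        Thread echo = new Thread(() -> {
            // Echo loop: write every byte read straight back to the client.
            try (Socket s = server.accept();
                 InputStream in = s.getInputStream();
                 OutputStream out = s.getOutputStream()) {
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);
                    out.flush();
                }
            } catch (IOException ignored) {
            }
        });
        echo.start();

        byte[] payload = new byte[size];
        byte[] reply = new byte[size];
        long start = System.nanoTime();
        try (Socket s = new Socket("127.0.0.1", server.getLocalPort());
             OutputStream out = s.getOutputStream();
             DataInputStream in = new DataInputStream(s.getInputStream())) {
            for (int i = 0; i < messages; i++) {
                out.write(payload);
                out.flush();
                in.readFully(reply); // wait for the complete echo each time
            }
        }
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        server.close();
        echo.join();
        return elapsedMs;
    }

    public static void main(String[] args) throws Exception {
        long ms = runBench(1000, 128);
        System.out.println("1000 round trips of 128 bytes in " + ms + " ms");
    }
}
```

The frameworks under test replace the blocking echo loop above with their own asynchronous I/O machinery; the measurement idea stays the same.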

I'd rather stay calm and run the test to see what the bottleneck in
MINA is, rather than complain.  The xSocket and Grizzly teams already
contacted me, and only after reviewing the source code or running the
test themselves.  So why is it such a big deal for you?

>> It's JBoss policy to anonymize the names of competing
>> technologies.
>
> This is a stupid policy when it comes to OSS projects.  It's pretty
> understandable in the WebApp business, as Oracle, Sun, and some other
> big players forbid you from publishing benchmarks, but that is not
> the case here.

OK.  Let's say it's a stupid policy and I published an unanonymized
report.  I bet you would still be saying that I am telling people not
to bother with other frameworks, just as you do in the paragraph
below.

>> On top of that, any performance test can contain mistakes, and I
>> don't want a certain framework to get hurt by my (potentially
>> serious) mistake.
>
> Again, either you are confident that your benchmark is OK and you
> provide the names, or you don't.  Of course you will have mistakes in
> your benchmarks, but that's not such a big deal.  By publishing a
> benchmark without providing the names, you are basically telling the
> world: "Hey, look: my product is the fastest!  Don't bother using any
> other!"

I am, of course, more than 90% confident that my benchmark is accurate
for the particular test scenario I described.  I spent more than a
week reliably reproducing the test results for all five frameworks,
and that is why I finally published the report.

Moreover, as a former major contributor to MINA, I am 100% confident
that my benchmark is accurate for MINA.

I don't think it matters whether I publish the names or not, either.
You're just complaining pointlessly.  Yes, Netty performs very well,
but I didn't tell anyone not to bother with other frameworks, and I
didn't claim that Netty would perform better than all the others in
test scenarios I didn't cover.

What you are saying here is purely your own perception, and I can't do
anything about that.  Some people might share that perception with
you; others will not.

So... shall I tell you my perception?  It is that you will complain
about whatever performance test report I publish, regardless of
whether it is anonymized and regardless of how accurate it is.  For a
contributor (?) to a very healthy community, what you are doing now is
pretty silly.
