logging-log4j-dev mailing list archives

From Ralph Goers <ralph.go...@dslextreme.com>
Subject Re: Performance tests log4j 1, logback, log4j 2
Date Mon, 05 Nov 2012 18:38:27 GMT
I didn't say "longer path name" but "longer path length".

Logback doesn't separate configuration from the loggers (which is why it can't be reconfigured
without losing events).  Log4j 2 adds LoggerConfigs, and each Logger holds a reference to its
LoggerConfig.  This means at least one more method call, or at least one more variable
reference, per event, resulting in a longer path length.

I should also point out that Log4j 2 has a slightly longer path length because it supports
filters on Loggers and appender-refs.  Typically, that is just a test of whether a reference
is null.
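To make the indirection concrete, here is a minimal sketch of the idea described above. The class names are illustrative only, not the actual Log4j 2 types: the Logger holds a volatile reference to its LoggerConfig, so a reconfiguration can swap in a fully built config without the Logger dropping events, at the cost of one extra reference and method call per event.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Predicate;

// Hypothetical sketch, not the real Log4j 2 classes.
class LoggerConfig {
    final Predicate<String> filter;        // may be null; the per-event check is just a null test
    final List<String> out = new ArrayList<>();

    LoggerConfig(Predicate<String> filter) { this.filter = filter; }

    void log(String msg) {
        if (filter != null && !filter.test(msg)) return;
        out.add(msg);
    }
}

class Logger {
    private volatile LoggerConfig config;  // the extra indirection = longer path length

    Logger(LoggerConfig config) { this.config = config; }

    void log(String msg) { config.log(msg); }               // one extra method call per event
    void reconfigure(LoggerConfig next) { config = next; }  // atomic reference swap, no lost events
}
```

Because the swap is a single volatile reference assignment, every event sees either the old config or the new one, never a partially built one.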

Ralph  


On Nov 5, 2012, at 10:29 AM, Christian Grobmeier wrote:

> Thanks Ralph for all the very interesting insight.
> 
> I agree, one cannot really see what's "better". One can just say they
> all do a pretty good job.
> 
> One thing I didn't get was the "long path name". What do you mean by that?
> 
> Thanks!
> Christian
> 
> On Mon, Nov 5, 2012 at 3:53 PM, Ralph Goers <ralph.goers@dslextreme.com> wrote:
>> One thing I should clarify: the slightly longer path length in Log4j 2 that I
>> mentioned should not really be a factor in the PerformanceComparison test.
>> It shows up in the SimplePerfTest, which for me results in 2 ns per
>> event when logging is disabled and no filters are present.
>> PerformanceComparison is the opposite kind of test: it measures the
>> cost of actually having logging enabled and so is impacted by many other
>> factors.
>> 
>> Ralph
>> 
>> 
>> 
>> On Nov 5, 2012, at 6:14 AM, Ralph Goers wrote:
>> 
>> You will notice on the web site how I am careful to mention that in most
>> cases the performance results should not be used to draw much of a
>> conclusion.
>> 
>> The path length of Log4j 2 has to be slightly longer than Logback's in order
>> to allow the configuration to be updated in a semi-atomic manner.  Some time
>> ago I spent a bit of time in YourKit looking to see how I could improve
>> performance. I would find what looked like a hot spot, make a change to
>> eliminate a method call, then find the next hot spot and change it, etc.
>> But then when I would rerun the tests it wouldn't make any actual
>> difference. Then someone reported that the performance unit test was failing
>> on Windows and what I found was that Windows, by default runs in "client"
>> mode while most other environments run in "server" mode.  The difference is
>> striking. What I believe is happening is that the hot spot compiler is
>> implementing all those performance improvements I was trying to do by hand.
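A quick way to see which mode a given JVM is running in is to inspect the `java.vm.name` system property (the exact string varies by vendor and version, so treat the values in the comment as examples only):

```java
// Historically, JVMs on Windows defaulted to the "client" HotSpot compiler,
// while most other environments defaulted to "server", which optimizes far
// more aggressively (e.g. inlining away small method calls).
class VmModeCheck {
    public static void main(String[] args) {
        // Prints something like "Java HotSpot(TM) 64-Bit Server VM"
        // or "... Client VM" depending on the mode.
        System.out.println(System.getProperty("java.vm.name"));
    }
}
```

Where the JVM supports both modes, `java -server` or `java -client` forces the choice, which is useful for reproducing the kind of benchmark difference described above.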
>> 
>> Recently Ceki posted about the tests being unfair because Logback was being
>> compared with buffering turned off while Log4j 1 and 2 had it turned on.
>> That was simply because Logback didn't support buffering when the test was
>> first run. As part of fixing that I noticed that Log4j 2 was fully 25%
>> slower than Logback, so I started looking into why.  What I found was that the
>> primary culprit was that PatternLayout was invoking StrSubstitutor over the
>> resolved pattern on every event.  I changed that so that only Converters
>> that can contain variables now do substitution and the pattern is checked
>> when the converter is created to determine if it contains a variable. This
>> single change brought the times down to where you see them now.
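An illustrative sketch of that fix (not the actual Log4j 2 PatternLayout code): decide once, when the converter is created, whether its pattern can contain a variable, and only run substitution per event when it can.

```java
import java.util.Map;

// Hypothetical converter showing the fast path. The "${...}" syntax and the
// naive replace loop stand in for the real StrSubstitutor logic.
class LiteralConverter {
    private final String literal;
    private final boolean containsVariable;  // computed once, at creation

    LiteralConverter(String literal) {
        this.literal = literal;
        this.containsVariable = literal.contains("${");
    }

    String format(Map<String, String> vars) {
        if (!containsVariable) {
            return literal;  // fast path: no per-event substitution at all
        }
        String result = literal;
        for (Map.Entry<String, String> e : vars.entrySet()) {
            result = result.replace("${" + e.getKey() + "}", e.getValue());
        }
        return result;
    }
}
```

Since most patterns contain no variables, the vast majority of events take the fast path and pay only the cost of a boolean check.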
>> 
>> I also had a suspicion that the way PatternLayout uses a StringBuilder might
>> be a problem, so I modified all the Converters to return a String instead of
>> updating a StringBuilder.  While I actually liked that code better it made
>> no difference at all in performance so I didn't make the change.
>> 
>> Note that PerformanceComparison only does a single threaded comparison.
>> When I run FilterPerformanceComparison I get
>> 
>> Starting Log4j 2.0
>> Starting Logback
>> ###############################################
>> Logback: 22
>> Log4j 2.0: 19
>> ###############################################
>> Testing multithreading
>> Testing 1 threads
>> ###############################################
>> Logback: 20
>> Log4j 2.0: 16
>> ###############################################
>> Testing 2 threads
>> ###############################################
>> Logback: 41
>> Log4j 2.0: 19
>> ###############################################
>> Testing 5 threads
>> ###############################################
>> Logback: 56
>> Log4j 2.0: 32
>> ###############################################
>> Testing 10 threads
>> ###############################################
>> Logback: 109
>> Log4j 2.0: 56
>> ###############################################
>> Testing 20 threads
>> ###############################################
>> Logback: 202
>> Log4j 2.0: 112
>> ###############################################
>> Testing 50 threads
>> ###############################################
>> Logback: 565
>> Log4j 2.0: 320
>> ###############################################
>> 
>> I've not recently run the test with the MarkerFilter but the last results
>> are at the bottom of http://logging.apache.org/log4j/2.x/performance.html.
>> Since the overall times for both Logback and Log4j are faster on the
>> computer I am using now I would also expect the times to be lower on the
>> Marker test, but the relative differences shouldn't change.
>> 
>> In general, I agree with your assessment that it is the overall average that
>> matters. My experience with running these is that the results will vary
>> quite a bit from run to run, sometimes yielding surprising results. I would
>> also argue that when you are talking in the range of 100 nanoseconds over
>> many, many log events that the difference is not particularly significant.
>> 
>> Ralph
>> 
>> 
>> On Nov 5, 2012, at 4:06 AM, Christian Grobmeier wrote:
>> 
>> Hi all,
>> 
>> I just ran the PerformanceComparison test and it showed me that log4j
>> 2 was the clear winner. Then I ran it again... and
>> again... and again. The results were so different that I suddenly
>> realized that no meaningful comparison is possible.
>> 
>> In total I ran it 10 times and wrote all the results in this doc:
>> https://docs.google.com/spreadsheet/ccc?key=0Aqmq9ym36L2DdHpsek1vN2xzVjdfS29idjcyTkFjVUE
>> 
>> I then summed up all the results, divided by 15, and got an average.
>> The averages are:
>> 
>> Log4j 1.x : 2314 ns
>> logback: 2116 ns
>> log4j 2.x: 2386 ns
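The averaging step above can be sketched as follows; the run times here are made up purely for illustration, since the real per-run numbers live in the linked spreadsheet.

```java
// Average a series of benchmark run times (in nanoseconds).
class RunAverage {
    static double average(long[] runTimesNs) {
        long sum = 0;
        for (long t : runTimesNs) sum += t;
        return (double) sum / runTimesNs.length;
    }
}
```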
>> 
>> The difference is pretty small. In the first runs logback did not
>> perform as well, but in the later runs it was always better than the
>> rest.
>> 
>> That said, I think it is safe to say that all of the logging frameworks
>> perform excellently. logback performs slightly better, by about 200 ns
>> on average. But given that we are speaking of 10,000,000 debug calls, I
>> would say this does not matter so much. With log4j 2 still in beta, I
>> would think that we can improve its performance a little more over time.
>> 
>> Any comments to this observation?
>> 
>> Cheers
>> Christian
>> 
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: log4j-dev-unsubscribe@logging.apache.org
>> For additional commands, e-mail: log4j-dev-help@logging.apache.org
>> 
>> 
>> 
> 
> 
> 
> -- 
> http://www.grobmeier.de
> https://www.timeandbill.de
> 
> 



