harmony-dev mailing list archives

From Mikhail Loenko <mloe...@gmail.com>
Subject Re: [classlib] Unit and performance testing
Date Tue, 24 Jan 2006 07:40:06 GMT
Hello

We have figured out that one of the approaches that was discussed earlier, and that
I originally opposed, would work for us.

That is: get the PerformanceTest class out of the hierarchy and replace log() with calls
to java.util.logging.Logger.
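To illustrate, here is a minimal sketch of what a migrated test might look like. The class name, logger name, and log message are hypothetical, not taken from the actual Harmony tree; the point is that java.util.logging is silent below INFO by default, so test chatter only appears when a developer enables it via logging.properties or system properties.

```java
import java.util.logging.Level;
import java.util.logging.Logger;

// Hypothetical migrated test: extends TestCase in a real build,
// omitted here so the sketch is self-contained.
public class SomeSecurityTest {

    // With the default JDK logging configuration, the effective level
    // is INFO, so FINE-level messages are not printed anywhere.
    private static final Logger log =
            Logger.getLogger(SomeSecurityTest.class.getName());

    public void testSomething() {
        log.fine("intermediate state: ..."); // silent by default
        // assertions would go here
    }

    public static void main(String[] args) {
        // FINE is below the default INFO threshold, so nothing is logged.
        System.out.println("FINE loggable by default: "
                + log.isLoggable(Level.FINE));
        new SomeSecurityTest().testSomething();
    }
}
```

The console stays quiet during normal runs, which avoids the i/o cost Geir mentioned, while anyone chasing a failure can flip the level on without touching the test source.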

Please let me know what you think.

Thanks,
Mikhail

On 1/20/06, Mikhail Loenko <mloenko@gmail.com> wrote:
> Formally, option #2 from my mail that was
>
> > > 2. Remove PerformanceTest. Introduce a simple Logger that does not print by
> > > default.
>
> does not mix any performance *infrastructure* with junit testing.
>
> I think that we do not have to find the final solution right now; we may see
> various ideas as people develop and contribute their tests and as we
> investigate more options.
> Since the #1 task for now is the integration of security2, any reasonably good
> short-term solution would be acceptable.
>
> I have not seen any other solution that is well studied and does not cut
> existing functionality.
>
> Thanks,
> Mikhail
>
>
> On 1/20/06, Geir Magnusson Jr <geir@pobox.com> wrote:
> > [I got sick of the thread subject - it blended into every other JIRA
> > thread... ]
> >
> > There is a 4th option - not mix performance infrastructure with unit
> > testing.
> >
> > I'm all for getting "PerformanceTest" out of the class hierarchy, and
> > not having unit tests yammer out to the console if we can avoid it. (I do
> > testing in the console and don't really care about the output, but it will
> > skew the performance numbers, as console i/o is relatively expensive...)
> >
> > That said, I do believe in the importance of having performance numbers
> > to help detect regressions.
> >
> > George outlined how to use standard JUnit mechanisms to do this.  IMO,
> > they are good because they are the canonical JUnit way, but they
> > are also a bit invasive.
> >
> > Some other options :
> >
> > 1) This problem seems to be one of the three use cases in the universe
> > for using aspects (the other two being logging and caching, of
> > course...).  So that's one area we might investigate - we would add an
> > interceptor for each test/suite/whatever to do the perf work we need
> > done.  We might also be able to use it to turn debug logging on and off
> > in a cheap and unobtrusive way.
> >
> > 2) TestNG - I do want to give this a hard look, as it's annotations
> > based, and see if there's something in there (or coming in there) for
> > this.  TestNG will also run JUnit tests as is, so playing with it is
> > going to be easy.
> >
> > geir
> >
> >
> > Mikhail Loenko wrote:
> > > To summarize, we have 3 options:
> > >
> > > 1. Keep PerformanceTest as a super class. Set printAllowed to false by default.
> > > 2. Remove PerformanceTest. Introduce a simple Logger that does not print by
> > > default.
> > > 3. Move performance functionality to Decorator.
> > >
> > > #1 is the most disliked. #3, as I wrote before, does not work.
> > >
> > > So I can submit a script that goes through the tests, replacing
> > > "extends PerformanceTest" with "extends TestCase" and
> > > "import PerformanceTest" with "import Logger",
> > > and prefixing logln() and the other log functions with "Logger."
> > >
> > > Thanks,
> > > Mikhail
> > >
> > >
> > > On 1/19/06, Geir Magnusson Jr <geir@pobox.com> wrote:
> > >>
> > >> Mikhail Loenko wrote:
> > >>> On 1/19/06, Geir Magnusson Jr <geir@pobox.com> wrote:
> > >>>> Mikhail Loenko wrote:
> > >>>>> The problem is unstable execution time of java programs:
> > >>>>>
> > >>>>> If you repeatedly run the same java program on the same computer
> > >>>>> under the same conditions, execution time may vary by 20% or even more
> > >>>> Why?  Given that computers are pretty deterministic, I'd argue that you
> > >>>> don't have the same conditions from run to run.
> > >>> Did you run experiments, or is that a theoretical conclusion :) ?
> > >> I have done experiments.  I never claimed that the conditions are the same
> > >> every run.  That's the issue, I think.
> > >>
> > >> geir
> > >>
> > >>> Try creating an application that runs for 20 seconds and run it several times.
> > >>>
> > >>> Frankly, I do not know exactly why. But I know of a number of factors that
> > >>> could cause this variation. For example, service threads and GC both
> > >>> affect execution time.
> > >>> Thanks,
> > >>> Mikhail
> > >>>
> > >>>
> > >>>> geir
> > >>>>
> > >>>
> > >
> > >
> >
>
