harmony-dev mailing list archives

From "Spark Shen" <smallsmallor...@gmail.com>
Subject Re: [buildtest] pass rate definition
Date Tue, 23 Oct 2007 00:43:33 GMT
2007/10/19, Alexei Fedotov <alexei.fedotov@gmail.com>:
>
> Sean, Leo, Spark,
>
> Thank you for your replies. I especially like the following one:
> > I understand passing rate as an indicator to show the progress of
> > harmony.
>
> To my understanding this is the reason why pass rate discussions are
> sometimes so painful between people who drive the community and want
> to progress in a specific direction. :-)


Hi Alexei,

I am really happy that you like the statement :-)
If showing the status is the aim, we may redefine the pass rate with a
finer formula, and generate a colorful chart from it.

As far as the formula is concerned, we can assign a factor to each category
of test cases. For example, BuggyTest could have a factor of 5,
IntermittentlyFailingTest a factor of 4, and so on. Then we can compute a
score according to the formula; the lower (or higher) the score, the better.

We could also generate a pie chart to show clearly which categories of test
failures contribute most to the total incompleteness; a rough sketch of what
I mean follows below.
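
To make the idea concrete, here is a very rough Java sketch of such a
weighted score, assuming each failing test is tagged with one of the
categories Alexei listed. The class name, weights and counts are only
examples, not a proposal for real values; with the six example tests the
plain pass rate is 1/6 (the 16.66% figure from Alexei's mail), and the
per-category shares are exactly the numbers a pie chart would need.

    import java.util.EnumMap;
    import java.util.Map;

    // Sketch only: category names and weights are illustrative.
    public class PassRateSketch {

        enum Category {
            BUGGY(5),                  // fails due to a bug in the test itself
            INTERMITTENTLY_FAILING(4), // fails intermittently due to an HDK bug
            FAILING_REFERENCE(3),      // fails on Harmony, passes on RI
            UNSUPPORTED(2),            // expected failure, not yet implemented
            FAILING(1);                // fails due to an HDK bug

            final int weight;
            Category(int weight) { this.weight = weight; }
        }

        public static void main(String[] args) {
            // Hypothetical failure counts per category for one test run
            // (one failure of each kind, as in Alexei's example suite).
            Map<Category, Integer> failures =
                    new EnumMap<Category, Integer>(Category.class);
            for (Category c : Category.values()) {
                failures.put(c, 1);
            }
            int totalTests = 6; // the five failing tests plus PassingTest

            // Plain pass rate and weighted score (0 means everything passes;
            // with these weights, lower is better).
            int failed = 0;
            int score = 0;
            for (Map.Entry<Category, Integer> e : failures.entrySet()) {
                failed += e.getValue();
                score += e.getKey().weight * e.getValue();
            }
            double passRate = 100.0 * (totalTests - failed) / totalTests;

            System.out.printf("pass rate: %.2f%%, weighted score: %d%n",
                    passRate, score);

            // Per-category share of the score: the data for a pie chart
            // showing which failure categories contribute most to the
            // incompleteness.
            for (Map.Entry<Category, Integer> e : failures.entrySet()) {
                double share = 100.0 * e.getKey().weight * e.getValue() / score;
                System.out.printf("  %-22s %5.1f%%%n", e.getKey(), share);
            }
        }
    }

The exact weights do not matter much; the point is that they make the
argument about the numerator and the denominator explicit instead of hiding
it in how each person counts.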

On 10/19/07, Leo Li <liyilei1979@gmail.com> wrote:
> > On 10/19/07, Spark Shen <smallsmallorgan@gmail.com> wrote:
> > > 2007/10/19, Leo Li <liyilei1979@gmail.com>:
> > > >
> > > > On 10/19/07, Spark Shen <smallsmallorgan@gmail.com> wrote:
> > > > > 2007/10/18, Leo Li <liyilei1979@gmail.com>:
> > > > > >
> > > > > > On 10/18/07, Alexei Fedotov <alexei.fedotov@gmail.com> wrote:
> > > > > > > Hello,
> > > > > > >
> > > > > > > I'm involved in numerous discussions on the subject, and want to
> > > > > > > make these discussions transparent to those community members who
> > > > > > > are interested. Imagine we have a test suite which contains the
> > > > > > > six following tests:
> > > > > > >
> > > > > > > BuggyTest.java
> > > > > > >    The test fails due to a bug in the test itself.
> > > > > > >
> > > > > > > FailingReferenceTest.java
> > > > > > >    The test fails on Harmony and passes on RI. The test design
> > > > > > > does not imply that the test should pass.
> > > > > > >
> > > > > > > IntermittentlyFailingTest.java
> > > > > > >    The test fails intermittently due to an HDK bug.
> > > > > > >
> > > > > > > UnsupportedTest.java
> > > > > > >    The test produces an expected failure due to unimplemented
> > > > > > > functionality in the HDK.
> > > > > > >
> > > > > > > FailingTest.java
> > > > > > >    The test fails due to an HDK bug.
> > > > > > >
> > > > > > > PassingTest.java
> > > > > > >    This one prints PASSED and completes successfully.
> > > > > > >
> > > > > > > What would be the correct formula to define a pass rate? All agree
> > > > > > > that the rate is the number of passed tests divided by the total
> > > > > > > number of tests. Then people start to argue about what the
> > > > > > > numerator and the denominator should be.
> > > > > > >
> > > > > > > One may say that he counts any failure as a bug. Then he gets a
> > > > > > > 16.66% pass rate. Others get 50%, ignoring all failure reasons
> > > > > > > except the one which produces a fixable HDK failure.
> > > > > > >
> > > > > > > If anyone could share common sense knowledge or Apache practices
> > > > > > > on the subject, this would be interesting.
> > > > > > >
> > > > > > >
> > > > > >
> > > > > > I think how to define the passing rate is related not only to the
> > > > > > reason for the failing tests, but also to the scope of the tests
> > > > > > themselves and what the passing rate means.
> > > > > >
> > > > > > Current tests can be separated into two categories:
> > > > > > 1. Tests provided by harmony developers.
> > > > > > 2. Tests provided by applications.
> > > > >
> > > > >
> > > > > I like your classification. :-)
> > > > >
> > > > > > For 1, I do not think the passing rate can prove much. The current
> > > > > > process requires that a test is not checked in to the source code
> > > > > > until it passes on a Harmony build, if I am not missing something.
> > > > > > Although there are some exceptions, we try to achieve this goal.
> > > > > > Thus Harmony is assumed to pass all these tests, and if there is
> > > > > > missing functionality or a known bug to be fixed, the test is not
> > > > > > supposed to exist in Harmony's code base and cannot be counted in
> > > > > > the passing rate.
> > > > >
> > > > >
> > > > > And I have a different opinion from you about UnsupportedTest.java
> > > > > and tests that fail on known issues (can they be categorized as
> > > > > FailingReferenceTest, IntermittentlyFailingTest, or FailingTest
> > > > > according to Alexei?).
> > > > >
> > > > > I understand the passing rate as an indicator to show the progress
> > > > > of Harmony. And these test failures indeed remind us that there is
> > > > > more to be improved. Why should they be excluded?
> > > > >
> > > >
> > > > But in Harmony's practice, we do not first check in (and exclude) a
> > > > test case which is not supported in the current code base and then
> > > > complete the feature later.
> > >
> > >
> > > I think we have an exclude list.
> > >
> > > > And we have not yet provided a covering test suite for all Java
> > > > features. If there are some UnsupportedTests, I do not think they
> > > > can represent the overall status of Harmony's progress.
> > >
> > >
> > > And the covering test suite is on the way? For example, Sean has
> > > integrated an Emma coverage report [1]. And a coverage suite for Java
> > > features can be committed with an exclude list first?
> >
> >   Excuse me if I am misleading you. I do not mean the coverage
> > report. :)  What I want to cover here is not Harmony's code, but
> > the functionality that the Java spec requires.
> >
> > >
> > > Correct me if I am wrong.
> > > [1] http://article.gmane.org/gmane.comp.java.harmony.devel/29585
> > >
> > > > > Of course, if there is some common sense or best practice, I will
> > > > > stick to them.
> > > > >
> > > > >
> > > > > > For 2, the passing rate can to some degree reflect Harmony's
> > > > > > maturity, and I think normally we need not differentiate why the
> > > > > > tests do not pass. Except for a few failures which are due to an
> > > > > > application's improper dependency on Sun's behavior (for example,
> > > > > > Sean has discovered that Jython assumes a specific order of the
> > > > > > entries stored in a HashMap; it is actually a bug but just happens
> > > > > > to pass on RI), a failure among the majority of the tests provided
> > > > > > by an application can reveal a bug or an incompatibility that
> > > > > > Harmony should try to resolve, since we are trying to provide a
> > > > > > product compatible with RI so that users can switch seamlessly.
> > > > > >
> > > > > > > --
> > > > > > > With best regards,
> > > > > > > Alexei,
> > > > > > > ESSD, Intel
> > > > > > >
> > > > > >
> > > > > >
> > > > > > --
> > > > > > Leo Li
> > > > > > China Software Development Lab, IBM
> > > > > >
> > > > >
> > > > >
> > > > >
> > > > > --
> > > > > Spark Shen
> > > > > China Software Development Lab, IBM
> > > > >
> > > >
> > > >
> > > > --
> > > > Leo Li
> > > > China Software Development Lab, IBM
> > > >
> > >
> > >
> > >
> > > --
> > > Spark Shen
> > > China Software Development Lab, IBM
> > >
> >
> >
> > --
> > Leo Li
> > China Software Development Lab, IBM
> >
>
>
> --
> With best regards,
> Alexei,
> ESSD, Intel
>



-- 
Spark Shen
China Software Development Lab, IBM
