harmony-dev mailing list archives

From Egor Pasko <egor.pa...@gmail.com>
Subject [drlvm][testing][jit] regular testing for JIT regressions on HUT
Date Wed, 27 Dec 2006 11:07:33 GMT
Folks,

I have some news for you.

Preface:

Today I ran the Harmony Unit Tests on pure Jitrino.OPT and pure
Jitrino.JET again to compare. The results are not surprising)) So much to
fix))

Should I wake up regularly to run the tests and ring the bell?  Maybe. That
makes me think of some automation/improvements to "the process". This item should
be interesting to the JIT guys and especially the 'build-test' and CC guys.

In brief:

    * JET is more stable than OPT: 24 tests passed on JET but failed on OPT
      (see below for the list of failed tests)

    * I encountered some problems running the awt tests from the command line
      without ant
      (wanna help? see below)

    * I am starting the new iteration of OPT fixing on HUT 
      (congratulate below)

    * Got some ideas on how JIT bugs can be detected fast/automatically on
      HUT, aiming at "no regressions"
      (participate .. below)

In depth:

--------------------------------------------------------------------------------
*24 tests passed on JET, but failed on OPT*
--------------------------------------------------------------------------------
      java.awt.BasicStrokeTest
      java.awt.geom.AffineTransformTest
      java.awt.geom.Arc2DTest
      java.awt.geom.CubicCurve2DTest
      java.awt.geom.Ellipse2DTest
      java.awt.geom.QuadCurve2DTest
      java.awt.geom.RoundRectangle2DTest
      javax.swing.plaf.UIResourceTest
      javax.swing.plaf.metal.MetalFileChooserUITest
      org.apache.harmony.luni.tests.java.lang.DoubleTest
      org.apache.harmony.luni.tests.java.lang.FloatTest
      org.apache.harmony.luni.tests.java.lang.LongTest
      org.apache.harmony.luni.tests.java.lang.MathTest
      org.apache.harmony.luni.tests.java.lang.StrictMathTest
      org.apache.harmony.luni.tests.java.lang.StringBuilderTest
      org.apache.harmony.luni.tests.java.util.ArraysTest
      org.apache.harmony.luni.tests.java.util.UUIDTest
      org.apache.harmony.sound.tests.javax.sound.midi.SequenceTest
      org.apache.harmony.tests.java.math.BigDecimalArithmeticTest
      org.apache.harmony.tests.java.math.BigDecimalCompareTest
      org.apache.harmony.tests.java.math.BigDecimalConstructorsTest
      org.apache.harmony.tests.java.math.BigDecimalScaleOperationsTest
      org.apache.harmony.tests.java.math.BigIntegerConvertTest
      org.apache.harmony.text.tests.java.text.ChoiceFormatTest
each of these means an almost certain *bug in OPT*, which is most likely not
reproduced in the everyday runs, but is dangerous enough

--------------------------------------------------------------------------------
*I encountered some problems running awt tests from command-line*
--------------------------------------------------------------------------------
    To run the big pack of tests I use 'ant test'. To run in some specific
    mode, say -Xem:opt, I put "-Xem:opt" into
    deploy/jdk/jre/bin/default/harmonyvm.properties, but I do not like
    doing that: it may influence other tests running at the same time
    (see the command-line sketch after the luni example below).

    What I usually do to investigate a failure in a classlib test:

    $HARMONY -Xbootclasspath/p:$classlib/depends/jars/junit_3.8.2/junit.jar:$classlib/deploy/build/test/support.jar:$classlib/modules/luni/bin/test
junit.textui.TestRunner org.apache.harmony.luni.tests.java.lang.DoubleTest

    For luni it is OK:
    ..................................
    Time: 0.446

    OK (34 tests)
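
    To force a specific JIT without editing harmonyvm.properties, I believe
    the EM config can also be passed directly on the command line (a sketch
    of the same DoubleTest run forced onto OPT, assuming -Xem is accepted as
    a launch option):

    $HARMONY -Xem:opt -Xbootclasspath/p:$classlib/depends/jars/junit_3.8.2/junit.jar:$classlib/deploy/build/test/support.jar:$classlib/modules/luni/bin/test
junit.textui.TestRunner org.apache.harmony.luni.tests.java.lang.DoubleTest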

    For awt the picture is different:
    $HARMONY -Xbootclasspath/p:$classlib/depends/jars/junit_3.8.2/junit.jar:$classlib/deploy/build/test/support.jar:$classlib/modules/awt/bin/test
junit.textui.TestRunner java.awt.geom.AffineTransformTest

    .F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.F.
    F.F.F.F.F.F.F.F.F.F.F.F.F.F.F
    Time: 0.023
    There were 55 failures:
    1) warning(junit.framework.TestSuite$1)junit.framework.AssertionFailedError:
       Exception in constructor: testCreate1 (junit.framework.AssertionFailedError:
       Variable TEST_SRC_DIR not defined
            at java.awt.SerializeTestCase.getSerializePath(SerializeTestCase.java:46)
            at java.awt.geom.AffineTransformTest.<init>(AffineTransformTest.java:105)
            at java.lang.reflect.VMReflection.newClassInstance(Native Method)
            at java.lang.reflect.Constructor.newInstance(Unknown Source)
            at junit.runner.BaseTestRunner.getTest(BaseTestRunner.java:118)
    (... and a lot of stuff like that)

    What should I set to get rid of "Variable TEST_SRC_DIR not defined"?

--------------------------------------------------------------------------------
*I am starting the new iteration of OPT fixing on HUT*
--------------------------------------------------------------------------------

    It is more fun to fix this crap together. George? Mikhail Fursov? Pavel
    Ozhdikhin?

    I am too lazy to report them officially now. Some tests are obviously
    failing for the same reason. I will report each test to JIRA and
    [1] as soon as I start investigating it. Suggestions welcome.

--------------------------------------------------------------------------------
*Got some ideas on how JIT bugs can be detected*
--------------------------------------------------------------------------------

    Knowing about bugs sooner is obviously better: the cost of a bug grows
    quickly the longer it stays undetected. To detect bugs faster we have
    pre-commit tests, CC runs, ongoing test extension/exclusion activity,
    etc. Which is good.

    The bad news for Jitrino.OPT is that it lacks regular testing with the
    tests we already have. Most test runs use the default mode, which is not
    OPT-active; JET is used most of the time.

    The good news for Jitrino.OPT is that many of its bugs can be detected
    easily and flagged as OPT-specific almost 100% *automatically*. To detect
    whether a bug is OPT-specific, a script can:
    * run all tests (for example, HUT) on JET
    * run the same on OPT
    * filter out intermittent failures (by rerunning failed tests several times)
    * if some test fails on OPT and passes on JET, report it

    That is an almost 100% precise Jitrino.OPT bug searcher, which is a
    promising tool to detect Jitrino.OPT bugs early and *automatically*.
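
    A rough sketch of such a script (shell pseudocode; TEST_LIST, TEST_ARGS
    and the rerun count of 3 are hypothetical placeholders, and it relies on
    junit.textui.TestRunner returning a non-zero exit code when a test fails):

    #!/bin/sh
    # TEST_LIST - a file with one HUT test class name per line (placeholder)
    # TEST_ARGS - the -Xbootclasspath/... options shown above (placeholder)

    run_suite() {   # $1 = em config (jet|opt), $2 = file collecting failures
        : > $2
        while read test; do
            $HARMONY -Xem:$1 $TEST_ARGS junit.textui.TestRunner $test \
                < /dev/null > /dev/null 2>&1 || echo $test >> $2
        done < $TEST_LIST
    }

    run_suite jet failed_jet.txt
    run_suite opt failed_opt.txt

    # report tests that fail on OPT but pass on JET; rerun each OPT failure
    # a few times to filter out intermittent failures
    while read test; do
        grep -qx "$test" failed_jet.txt && continue   # fails on JET too
        stable=yes
        for i in 1 2 3; do
            $HARMONY -Xem:opt $TEST_ARGS junit.textui.TestRunner $test \
                < /dev/null > /dev/null 2>&1 && stable=no
        done
        [ "$stable" = yes ] && echo "OPT-specific failure: $test"
    done < failed_opt.txt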

    Opinions?

-- 
Egor Pasko

