harmony-dev mailing list archives

From Mark Hindess <mark.hind...@googlemail.com>
Subject Re: [classlib] Testing conventions - a proposal
Date Fri, 07 Jul 2006 08:28:01 GMT

On 6 July 2006 at 21:02, "Nathan Beyer" <nbeyer@kc.rr.com> wrote:
>
> I think Tim has a valid point, or at least the point I'm inferring
> seems valid: the testing technology is not the real issue. This
> problem can be solved by either JUnit or TestNG. More specifically,
> this problem can be solved utilizing the grouping of arbitrary tests.

I'm happy with either JUnit or TestNG.  My only concerns about TestNG
are non-technical.  (Can we use TestNG without adding to the burden for
the developer looking at Harmony for the first time?  I think we can
automate the download as we do for other dependencies.  TestNG is under
the Apache License but I'm not sure what the licenses are like for the
third party code it includes.  This may not be immediately important,
but it might be if we wanted to include tests - ready to run - in the
hdk.)

> I've been playing with reorganizing the 'luni' module using the
> suggested directory layout and it really doesn't seem to provide
> much value.

> Also, I'm a bit partial to the concept of one source directory
> (src/main/java), one test source directory (src/test/java) and
> any number of resource (src/main/resources/*) and test resource
> directories (src/test/resources/*) as defined by the Maven 2 POM.

+1
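
(For reference, the Maven 2 layout mentioned above would give each module
something like the following -- module name here is just an example:)

```
modules/luni/
    src/main/java/          product source
    src/main/resources/     product resources
    src/test/java/          test source
    src/test/resources/     test resources
```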

> The only practical value I saw in the directory layout was that in
> Eclipse I could just select a single folder and run all of the API
> tests against an RI. The same can be said for any of the other test
> folders, but this same feature can also be achieved via TestSuites.
>
> As such, I'm in alignment with Tim's thoughts on just using TestSuites
> to define the major groupings. I think the proposed naming conventions
> of 'o.a.h.test.<module>.java.package' are fine. The only addition
> I would make is to add guidelines on class names, so that pure
> API tests, Harmony tests and failing tests can live in the same
> package. Something as trivial as XXXAPITest, XXXImplTest and
> XXXFailingTest would work. Perhaps a similar approach can be used for
> platform-specific tests. These tests would then be grouped, per-module
> into an APITestSuite, an ImplTestSuite, a FailingTestSuite and
> Platform-specificTestSuites.
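
As a sketch of how that suffix convention could drive the suites (class
names below are hypothetical, and with JUnit 3 each group would really be
collected into a junit.framework.TestSuite -- plain strings are used here
only to keep the sketch self-contained):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of suffix-based grouping for the proposed convention:
// XXXAPITest, XXXImplTest and XXXFailingTest share one package, and a
// per-module suite selects classes by name.  With JUnit 3 each group
// would become a junit.framework.TestSuite.
public class SuiteGrouping {

    // Hypothetical test classes for a module, as they might appear in
    // a package like o.a.h.test.luni.java.lang.
    static final String[] TEST_CLASSES = {
        "StringAPITest", "StringImplTest", "IntegerAPITest", "IntegerFailingTest"
    };

    // Select the classes belonging to one group, e.g. "APITest".
    static List<String> group(String suffix) {
        List<String> suite = new ArrayList<>();
        for (String name : TEST_CLASSES) {
            if (name.endsWith(suffix)) {
                suite.add(name);
            }
        }
        return suite;
    }

    public static void main(String[] args) {
        System.out.println("APITestSuite:     " + group("APITest"));
        System.out.println("ImplTestSuite:    " + group("ImplTest"));
        System.out.println("FailingTestSuite: " + group("FailingTest"));
    }
}
```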

This is where I think TestNG has the edge.  XXXFailingTest could contain
both failing API and failing Impl tests?  With TestNG these failing
tests would not have to be moved out of the code base, but could simply
be annotated in-place.  I also like the idea of being able to write
tests (API tests) for code that we don't have yet, put them in place, and
annotate them appropriately until the code is written.  For instance,
when someone raises a JIRA with a test (that passes on the RI) but no fix,
we can add the test right away.
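
As a rough sketch of the in-place idea (this uses a hand-rolled annotation
as a stand-in so the example is self-contained -- the real thing would be
TestNG's @Test(groups = ...) plus an exclude entry in testng.xml; the test
class and method names are made up):

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.reflect.Method;
import java.util.ArrayList;
import java.util.List;

// Stand-in for TestNG's group mechanism: real TestNG would use
// @Test(groups = {"failing"}) and an <exclude name="failing"/> entry in
// testng.xml.  This only illustrates how a known-failing test method can
// stay in the code base and simply be skipped by the runner.
public class GroupRunner {

    @Retention(RetentionPolicy.RUNTIME)
    @interface Groups { String[] value(); }

    // Hypothetical test case: one healthy method, one known-failing
    // method annotated in place instead of being moved elsewhere.
    public static class StringTest {
        @Groups({"api"})
        public void testLength() { assert "abc".length() == 3; }

        @Groups({"api", "failing"})
        public void testKnownBug() { assert false : "tracked in a JIRA"; }
    }

    // Invoke every annotated method except those in the excluded group.
    static List<String> run(Class<?> testClass, String excludedGroup) {
        List<String> executed = new ArrayList<>();
        try {
            Object instance = testClass.getDeclaredConstructor().newInstance();
            for (Method m : testClass.getDeclaredMethods()) {
                Groups groups = m.getAnnotation(Groups.class);
                if (groups == null || List.of(groups.value()).contains(excludedGroup)) {
                    continue; // annotated in place, skipped here
                }
                m.invoke(instance);
                executed.add(m.getName());
            }
        } catch (ReflectiveOperationException e) {
            throw new RuntimeException(e);
        }
        return executed;
    }

    public static void main(String[] args) {
        System.out.println(run(StringTest.class, "failing"));
    }
}
```

The point being that excluding (or later re-enabling) a test is a one-line
annotation change rather than a file move.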

> In regards to tests that must be on the bootclasspath, I would say
> either just put everything on the bootclasspath (is there any real harm?)

Testing this way means we are doing API testing in a way that is
different from the typical way a user will use the API.  This seems
wrong to me.

> or use pattern sets for bootclasspath tests (80% of the time the
> classes will be java*/*).

That might work.
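
Something along these lines, perhaps (a hypothetical Ant fragment, not
taken from the actual Harmony build files -- the pattern-set ids are made
up):

```xml
<!-- Hypothetical sketch: select the tests that must run on the
     bootclasspath by package pattern; everything else stays on the
     ordinary classpath. -->
<patternset id="bootclasspath.tests">
    <include name="java*/**/*Test.class"/>
</patternset>

<patternset id="classpath.tests">
    <exclude name="java*/**/*Test.class"/>
</patternset>
```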

> In regards to stress tests, performance tests and integration tests, I
> believe these are patently different and should be developed in their
> own projects.

I'm inclined to agree.

Regards,
 Mark.

> > -----Original Message-----
> > From: Tim Ellison [mailto:t.p.ellison@gmail.com]
> > <snip/>
> > 
> > Considering just the JUnit tests that we have at the moment...
> > 
> > Do I understand you correctly that you agree with the idea of creating
> > 'suites of tests' using metadata (such as TestNG's annotations or
> > whatever) and not by using the file system layout currently being
> > proposed?
> > 
> > I know that you are also thinking about integration tests, stress tests,
> > performance tests, etc. as well but just leaving those aside at the
> > moment.
> > 
> > Regards,
> > Tim
> > 
> > 
> > > Thanks
> > > Mikhail
> > >
> > >
> > >> Stress
> > >> ...and so on...
> > >>
> > >>
> > >> If you weigh up all of the different possible permutations and then
> > >> consider that the above list is highly likely to be extended as things
> > >> progress it is obvious that we are eventually heading for large amounts
> > >> of related test code scattered or possibly duplicated across numerous
> > >> "hard wired" source directories. How maintainable is that going to be?
> > >>
> > >> If we want to run different tests in different configurations then IMHO
> > >> we need to be thinking a whole lot smarter. We need to be thinking about
> > >> keeping tests for specific areas of functionality together (thus easing
> > >> maintenance); we need something quick and simple to re-configure if
> > >> necessary (pushing whole directories of files around the place does not
> > >> seem a particularly lightweight approach); and something that is not
> > >> going to potentially mess up contributed patches when the file they
> > >> patch is found to have been recently pushed from source folder A to B.
> > >>
> > >> To connect into another recent thread, there have been some posts lately
> > >> about handling some test methods that fail on Harmony and have meant
> > >> that entire test case classes have been excluded from our test runs. I
> > >> have also been noticing some API test methods that pass fine on Harmony
> > >> but fail when run against the RI. Are the different behaviours down to
> > >> errors in the Harmony implementation? An error in the RI implementation?
> > >> A bug in the RI Javadoc? Only after some investigation has been
> > >> carried out do we know for sure. That takes time. What do we do with the
> > >> test methods in the meantime? Do we push them round the file system
> > >> into yet another new source folder ? IMHO we need a testing strategy
> > >> that enables such "problem" methods to be tracked easily without
> > >> disruption to the rest of the other tests.
> > >>
> > >> A couple of weeks ago I mentioned that the TestNG framework [2] seemed
> > >> like a reasonably good way of allowing us to both group together
> > >> different kinds of tests and permit the exclusion of individual
> > >> tests/groups of tests [3]. I would like to strongly propose that we
> > >> consider using TestNG as a means of providing the different test
> > >> configurations required by Harmony. Using a combination of annotations
> > >> and XML to capture the kinds of sophisticated test configurations that
> > >> people need, and that allows us to specify down to the individual
> > >> method, has got to be more scalable and flexible than where we are
> > >> headed now.
> > >>
> > >> Thanks for reading this far.
> > >>
> > >> Best regards,
> > >> George
> > >>
> > >>
> > >> [1]
> > >> http://incubator.apache.org/harmony/subcomponents/classlibrary/testing.html
> > >>
> > >> [2] http://testng.org
> > >> [3]
> > >> http://mail-archives.apache.org/mod_mbox/incubator-harmony-dev/200606.mbox/%3c44A163B3.6080005@googlemail.com%3e
> > >>
> > >>
> 
> 
> ---------------------------------------------------------------------
> Terms of use : http://incubator.apache.org/harmony/mailing.html
> To unsubscribe, e-mail: harmony-dev-unsubscribe@incubator.apache.org
> For additional commands, e-mail: harmony-dev-help@incubator.apache.org


