harmony-dev mailing list archives

From "Anton Luht" <anton.l...@gmail.com>
Subject Re: [DRLVM] Integration checks (was building from svn on FC5)
Date Tue, 27 Jun 2006 13:13:41 GMT
Vladimir,

Regarding which tool will be used - I don't know yet. I'm studying
Eclipse plugins right now. It seems the situation with standard
macro recording and playback in Eclipse is not very good at the
moment [1], so the emulation will be based either on one of the
Eclipse plugins or on a hand-written tool.

I plan to start with 1) and, if it succeeds, add the other scenarios later.

[1] https://bugs.eclipse.org/bugs/show_bug.cgi?id=80140
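
To make the idea a bit more concrete, below is a very rough Java sketch
of the "launcher" half of scenario 1): start Eclipse on the VM under test
and check that it stays up. The in-IDE steps (creating a project,
compiling and running the small application) would still have to be driven
by a recorded macro or a plugin, and the class name, paths, timeout and
pass/fail criterion are only placeholder assumptions - nothing like this
exists in the build yet.

    import java.io.File;

    public class EclipseScenarioDriver {
        public static void main(String[] args) throws Exception {
            // Placeholder locations - adjust to the real Eclipse install
            // and to the JRE produced by the DRLVM build.
            String eclipseExe = args.length > 0 ? args[0] : "/opt/eclipse/eclipse";
            String testedJava = args.length > 1 ? args[1] : "deploy/jre/bin/java";
            File workspace = new File("scenario1.ws");

            // -vm makes Eclipse run on the JRE under test,
            // -data points it at a throw-away workspace.
            ProcessBuilder pb = new ProcessBuilder(
                    eclipseExe, "-nosplash",
                    "-vm", testedJava,
                    "-data", workspace.getAbsolutePath());
            Process eclipse = pb.start();

            // Crude health check: the IDE must still be alive after a minute.
            // A real scenario would also scan <workspace>/.metadata/.log for errors.
            Thread.sleep(60000);
            boolean alive;
            try {
                eclipse.exitValue();   // throws if the process is still running
                alive = false;
            } catch (IllegalThreadStateException e) {
                alive = true;
            }
            System.out.println(alive ? "SCENARIO 1 PASSED" : "SCENARIO 1 FAILED");
            eclipse.destroy();
        }
    }

Something along these lines could then be wired into the build as a
separate target next to the existing smoke tests, once the JIRA issue
with the patch is filed.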

On 6/27/06, Vladimir Ivanov <ivavladimir@gmail.com> wrote:
> Don't you think these should be different Eclipse scenarios?
> 1) a simple scenario that runs Eclipse, compiles a small application and runs it
> 2) a simple debug scenario
> 3) a scenario that runs the Ant builder
> 4-N) other scenarios that emulate user work
>
> Am I correct that you are going to use record-and-playback tools?
> Or will these be tests written using the Eclipse API?
>
>  Thanks, Vladimir
>
> On 6/27/06, Anton Luht <anton.luht@gmail.com> wrote:
> >
> > Good day,
> >
> > Anyway, it seems that creating an automated Eclipse scenario as
> > described by Salikh will be useful. I'll try to create the scenario
> > and a corresponding build target, and file a JIRA issue with a patch
> > for the new functionality.
> >
> > On 6/27/06, Vladimir Ivanov <ivavladimir@gmail.com> wrote:
> > > It seems it is not so easy to define the proper test set...
> > > Let's define the goal of running the integration tests. It may be:
> > >  1) we want to be sure that everything was integrated correctly?
> > >  2) or we want to guarantee the 'product quality' of the build?
> > >  3) something else?
> > >
> > > If the goal is 1), then we should run a minimal test set (it seems
> > > the classlib unit tests over the tested VM will be enough) on one
> > > platform.
> > > If the goal is 2), then each developer should run all known/defined
> > > tests on all platforms. It seems there would be no time left for
> > > development any more - everyone would be doing the release
> > > engineering (RE) work.
> > >
> > > So we have two questions here:
> > > 1) The small list of integration tests should be defined. It may be
> > > a subset of the API unit tests, with one or two tests taken from
> > > each API area, just to be sure that everything was integrated
> > > successfully.
> > > 2) The RE procedure should be defined. Who is responsible for
> > > building the HDK and placing it on the download page? What tests
> > > should be run before that? How often should it be done?
> > > This is not as obvious as 1). The procedure may be defined, for
> > > example, as:
> > >  - one of the committers prepares a binary form of the HDK and
> > > tests it on one platform;
> > >  - if all tests pass, he places it somewhere for download;
> > >  - other people test it on other platforms;
> > >  - if all tests pass, the binaries are promoted and placed on the
> > > 'official' download page.
> > >
> > >  Thanks, Vladimir
> > >
> > > PS. Running some scenario tests is actually not integration but
> > > functional testing.
> > >
> > > On 6/26/06, Salikh Zakirov <Salikh.Zakirov@intel.com> wrote:
> > > >
> > > > Alexey Petrenko wrote:
> > > > > Some checks before commits are definitely good...
> > > > >
> > > > > 2006/6/23, Andrey Chernyshev <a.y.chernyshev@gmail.com>:
> > > > >> We may probably also need to define the list of
> > > > >> platforms/configurations covered by this procedure.
> > > > > I'm not sure that I got your idea correctly.
> > > > > Do you suggest asking every developer to make some checks on
> > > > > different platforms and software configurations?
> > > > > If so... Yes, it would be good for product stability.
> > > > > But it would be nearly impossible, because only a very small
> > > > > number of developers have access to different platforms and
> > > > > software configurations...
> > > >
> > > > The first and foremost question is *what* to run as integration
> > > > tests, rather than on which platforms. I think we need to define
> > > > what use cases we care about in the form of integration tests.
> > > > The more conveniently the integration tests are packaged, the
> > > > higher the probability that anyone will actually run them.
> > > > A good example is the "smoke tests" included with DRLVM: they can
> > > > be built and run with a single command, 'build.bat test'
> > > > ('build.sh test' on Linux).
> > > >
> > > > Once the integration test set is defined, we can think about
> > > > platform coverage. BuildBot [1] could be a way for interested
> > > > parties to contribute CPU cycles to verify Harmony quality.
> > > >
> > > > [1] http://buildbot.sourceforge.net/
> > > >
> > > > ---------------------------------------------------------------------
> > > > Terms of use : http://incubator.apache.org/harmony/mailing.html
> > > > To unsubscribe, e-mail: harmony-dev-unsubscribe@incubator.apache.org
> > > > For additional commands, e-mail: harmony-dev-help@incubator.apache.org
> > > >
> > > >
> > >
> > >
> >
> >
> > --
> > Regards,
> > Anton Luht,
> > Intel Middleware Products Division
> >
> > ---------------------------------------------------------------------
> > Terms of use : http://incubator.apache.org/harmony/mailing.html
> > To unsubscribe, e-mail: harmony-dev-unsubscribe@incubator.apache.org
> > For additional commands, e-mail: harmony-dev-help@incubator.apache.org
> >
> >
>
>


-- 
Regards,
Anton Luht,
Intel Middleware Products Division

---------------------------------------------------------------------
Terms of use : http://incubator.apache.org/harmony/mailing.html
To unsubscribe, e-mail: harmony-dev-unsubscribe@incubator.apache.org
For additional commands, e-mail: harmony-dev-help@incubator.apache.org

