harmony-dev mailing list archives

From Geir Magnusson Jr <g...@pobox.com>
Subject Re: [testing] code for exotic configurations
Date Fri, 27 Jan 2006 14:00:08 GMT


Mikhail Loenko wrote:
> Well, I like things clean and simple too :)
> 
> We could develop our own implementations for unit tests, but it would take
> the same effort as developing our own provider. It's a huge amount of work.
> Now we use the BouncyCastle provider for the purpose of unit testing:
> it's open source, everyone can download it, etc.
> 
> But our unit tests, without any modification, could serve as the 'system'
> tests that Anton described: different providers might be installed on the
> system, and the tests could be run in that configuration.
> 
> Does that mean we are going to copy the unit tests into a parallel
> 'system' test suite?

No - if you reuse something, that's cool.

What I'm trying to avoid is having a big bag of tests that require extra 
things to be maintained in order for simple, common things to happen. A 
case in point is our current security2 testbed, where there's no naming 
convention (yet) and we have to maintain exclusion lists of tests by 
package so that non-test code living in the same packages doesn't get run.

(the solution is a *Test.java naming convention that gives full freedom 
for co-located support resources without the need for unnecessary 
maintenance...)
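
For illustration, such a convention lets a harness pick up tests purely by 
file name. A rough, non-recursive sketch in Java; the directory path is an 
assumption:

import java.io.File;
import java.io.FilenameFilter;

// Sketch: with a *Test.java convention, tests are selected by name alone,
// so no per-package exclusion lists are needed for co-located support code.
public class TestCollector {
    public static void main(String[] args) {
        File dir = new File(args.length > 0 ? args[0] : "modules/security2/test");
        String[] tests = dir.list(new FilenameFilter() {
            public boolean accept(File d, String name) {
                return name.endsWith("Test.java"); // support classes don't match
            }
        });
        for (int i = 0; tests != null && i < tests.length; i++) {
            System.out.println(tests[i]);
        }
    }
}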

Similarly, for things like system tests, where there's going to be 
special setup and configuration anyway (modifying the classpath, setting 
system properties, etc.), defining test cases from wherever is less of a 
burden.
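
A minimal sketch of that kind of one-off setup, assuming the BouncyCastle 
provider is on the classpath; the property name is invented for 
illustration:

import java.security.Security;

import org.bouncycastle.jce.provider.BouncyCastleProvider;

// Sketch of per-run system-test setup: install a real provider and set a
// system property before any test cases execute.
public class SystemTestSetup {
    public static void configure() {
        // Make the real provider the most-preferred one for this run.
        Security.insertProviderAt(new BouncyCastleProvider(), 1);
        // Invented property name, purely illustrative of per-run config.
        System.setProperty("harmony.test.config", "system");
    }
}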

See what I'm getting at?

geir

> 
> Thanks,
> Mikhail
> 
> On 1/27/06, Geir Magnusson Jr <geir@pobox.com> wrote:
>>
>> Mikhail Loenko wrote:
>>> On 1/27/06, George Harley1 <GHARLEY@uk.ibm.com> wrote:
>>>> But because we live in a less than ideal world there will, no doubt, be
>>>> some tests that will demand an
>>>> environment that is impossible or at the very least difficult to mock up
>>>> for the majority of developers/testers.
>>> I absolutely agree that we are neither living in an ideal world nor
>>> trying to make it ideal :)
>>>
>>> So until we get a 'system' test suite, why should we weaken the
>>> existing tests?
>>>
>>>> One solution could be to segregate those tests into a separate test suite
>>>> (available for all but primarily
>>>> for those working in the niche area that demands the special environment).
>>> Moving this kind of test would affect many people: they will see
>>> separate suites, try them, ask questions...
>>>
>>> If the test can be configured only by the few people who work on that
>>> specific area, and those people are aware of those tests, why not just
>>> print a log message when the test is skipped?
>> Because the same set of people that will be bothered by separate suites
>> will have the same reaction to skipped tests.
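
For concreteness, the skip-and-log approach being discussed would look 
roughly like this in a JUnit 3-style test; the provider name is invented:

import java.security.Security;

import junit.framework.TestCase;

// Sketch: the test passes trivially when the special environment is
// absent, but leaves a trace in the log for those who care to grep for it.
public class ExoticConfigTest extends TestCase {
    public void testAgainstRealProvider() {
        if (Security.getProvider("SomeExoticProvider") == null) {
            System.err.println("SKIPPED testAgainstRealProvider: "
                    + "provider not installed");
            return; // passes in a 'bad' environment
        }
        // ... assertions against the real provider would go here ...
    }
}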
>>
>> This is why I advocate making a "separate tree" for the system tests -
>> make it clear that they are not the general unit tests...
>>
>>> It would not disturb most people, because the test will pass in a 'bad'
>>> environment. But those who know about these tests can grep the logs from
>>> time to time to validate the configuration.
>> IMO, there's too much special information there, too much config.  I'm a
>> simple person, and like things clean and simple.  I don't like to mix
>> concerns when I can avoid it, and here's a place where it's definitely
>> possible to separate them cleanly.
>>
>> I don't see the downside.
>>
>> geir
>>
>>> Thanks,
>>> Mikhail
>>>
>>>
>>>> Alternatively, they could be
>>>> included as part of a general test suite but be purposely skipped over at
>>>> test execution time using a
>>>> test exclusion list understood by the test runner.
>>>>
>>>>
>>>> Best regards,
>>>> George
>>>> ________________________________________
>>>> George C. Harley
>>>>
>>>> Tim Ellison <t.p.ellison@gmail.com>
>>>> 27/01/2006 08:53
>>>> Please respond to: harmony-dev@incubator.apache.org
>>>> To: harmony-dev@incubator.apache.org
>>>> Subject: Re: [testing] code for exotic configurations
>>>>
>>>> Anton Avtamonov wrote:
>>>>>> Note that I could create my own provider and test with it, but what I
>>>>>> would really want is to test how my EncryptedPrivateKeyInfo works with
>>>>>> AlgorithmParameters from a real provider, as well as how my other
>>>>>> classes work with real implementations of crypto engines.
>>>>>>
>>>>>> Thanks,
>>>>>> Mikhail.
>>>>>>
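
The scenario Mikhail describes might look like the sketch below; the 
algorithm name and ciphertext bytes are placeholders, and whether 
"PBEWithMD5AndDES" resolves depends on the providers actually installed:

import java.security.AlgorithmParameters;

import javax.crypto.EncryptedPrivateKeyInfo;
import javax.crypto.spec.PBEParameterSpec;

// Sketch: exercise EncryptedPrivateKeyInfo against AlgorithmParameters
// produced by whatever real provider is installed, rather than a mock.
public class RealProviderCheck {
    public static void main(String[] args) throws Exception {
        AlgorithmParameters params =
                AlgorithmParameters.getInstance("PBEWithMD5AndDES");
        params.init(new PBEParameterSpec(new byte[8], 20)); // salt, iterations
        byte[] ciphertext = { 0x01, 0x02, 0x03 }; // stand-in for real output
        EncryptedPrivateKeyInfo epki =
                new EncryptedPrivateKeyInfo(params, ciphertext);
        System.out.println(epki.getAlgName()); // resolved via the real provider
    }
}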
>>>>> Hi Mikhail,
>>>>> There are 'system' and 'unit' tests. Traditionally, unit tests are
>>>>> developer-level. Each unit test is intended to test just a limited
>>>>> piece of functionality separately from other sub-systems (a test for
>>>>> one function, a test for one class, etc.). Such tests must create the
>>>>> desired environment around the functionality under test and run the
>>>>> scenario under predefined conditions. Unit tests are usually able to
>>>>> cover all scenarios (execution paths) for the tested functionality.
>>>>>
>>>>> What you are talking about looks like 'system' testing. Such tests
>>>>> usually run in the real environment and exercise the most common
>>>>> scenarios (a reduced set; all scenarios usually cannot be covered).
>>>>> Such testing is not concentrated on one particular piece of
>>>>> functionality, but covers the working of the whole system.
>>>>> An example is: "run some demo application on some particular platform,
>>>>> with some particular providers installed, and perform some operations".
>>>>>
>>>>> I think currently we should focus on the 'unit' test approach, since
>>>>> it is more applicable during development (so my advice is to revert
>>>>> your tests to installing 'test' providers with the desired behavior,
>>>>> as George proposed).
>>>>> However, we should think about 'system' scenarios which can be run at
>>>>> a later stage and act as 'verification' that the entire system works
>>>>> properly.
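
A minimal sketch of the 'test provider' idea Anton and George refer to; 
the service mapping and class names are made up:

import java.security.Provider;
import java.security.Security;

// Sketch: a provider with fully predictable behavior that unit tests
// install themselves, instead of depending on a real one.
public class TestProvider extends Provider {
    public TestProvider() {
        super("HarmonyTest", 1.0, "Predictable provider for unit tests");
        // Map a fake algorithm to a hypothetical stub engine class.
        put("MessageDigest.TestHash", "org.example.tests.StubMessageDigest");
    }

    public static void install() {
        // Most-preferred position, so tests resolve to the stub first.
        Security.insertProviderAt(new TestProvider(), 1);
    }
}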
>>>> I agree with all this.  The unit tests are one style of test for
>>>> establishing the correctness of the code.  As you point out the unit
>>>> tests typically require a well-defined environment in which to run, and
>>>> it becomes a judgment-call as to whether a particular test's
>>>> environmental requirements are 'reasonable' or not.
>>>>
>>>> For example, you can reasonably expect all developers to have an
>>>> environment to run unit tests that has enough RAM and a writable disk,
>>>> etc., such that if those things do not exist the tests will simply fail.
>>>> However, you may decide it is unreasonable to expect the environment to
>>>> include a populated LDAP server, or a carefully configured RMI server.
>>>> If you were to call that environment unreasonable, then testing JNDI and
>>>> RMI would likely involve mock objects etc. to get good unit tests.
>>>>
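
A hand-rolled mock along those lines might look like this; the 
NameResolver seam is invented for illustration, not a real javax.naming 
type:

import java.util.HashMap;
import java.util.Map;

// Sketch: tests bind canned lookup results instead of requiring a
// populated LDAP server behind a real JNDI context.
interface NameResolver {
    Object lookup(String name);
}

class MockNameResolver implements NameResolver {
    private final Map bindings = new HashMap();

    public void bind(String name, Object value) {
        bindings.put(name, value);
    }

    public Object lookup(String name) {
        return bindings.get(name); // returns exactly what the test bound
    }
}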
>>>> Of course, as you point out, once you are passing the unit tests you
>>>> also need the 'system' tests to ensure the code works in a real
>>>> environment.  Usage scenarios based on the bigger system are good, as is
>>>> running the bigger system's test suite on our runtime.
>>>>
>>>> Regards,
>>>> Tim
>>>>
>>>>
>>>>> --
>>>>> Anton Avtamonov,
>>>>> Intel Middleware Products Division
>>>>>
>>>> --
>>>>
>>>> Tim Ellison (t.p.ellison@gmail.com)
>>>> IBM Java technology centre, UK.
>>>>
>>>>
>>>>
>>>
> 
> 

