harmony-dev mailing list archives

From Mikhail Loenko <mloe...@gmail.com>
Subject Re: [testing] code for exotic configurations
Date Fri, 27 Jan 2006 05:58:49 GMT
Hi George

In my tests I usually either do not specify a provider or loop over all of
them. An example of the first kind is below (a sketch of the looping variant
follows it):

public void testEncryptedPrivateKeyInfo () {
   try {
       AlgorithmParameters a = AlgorithmParameters.getInstance(...);

       EncryptedPrivateKeyInfo e = new EncryptedPrivateKeyInfo(a, myData);

       //do some check over e
   } catch (NoSuchAlgorithmException e) {
       logln("unable to verify EncryptedPrivateKeyInfo(...)  with
*real-life* provider");
   }
}
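
The looping variant would look roughly like this (just a sketch, assuming
"DH" as the algorithm name and some stand-in encrypted bytes; it needs
java.security.Provider, java.security.Security and
javax.crypto.EncryptedPrivateKeyInfo imported):

public void testAllInstalledProviders() throws Exception {
   byte[] myData = new byte[] {1, 2, 3};   // stand-in for real encrypted data

   for (Provider p : Security.getProviders()) {
       try {
           AlgorithmParameters a =
                   AlgorithmParameters.getInstance("DH", p.getName());

           EncryptedPrivateKeyInfo e = new EncryptedPrivateKeyInfo(a, myData);

           //do some check over e
       } catch (NoSuchAlgorithmException skipped) {
           // this provider has no AlgorithmParameters for "DH", try the next
       }
   }
}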

In most cases a test like this would work. But if a user does not have a
provider with AlgorithmParameters then the test would be skipped and the
constructor of EncryptedPrivateKeyInfo would remain untested.

Note that I could create my own provider and test with it, but what I really
want is to test how my EncryptedPrivateKeyInfo works with AlgorithmParameters
from a real provider, as well as how my other classes work with real
implementations of crypto engines.
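
For reference, such a test-only provider could be tiny, something like the
sketch below (the SPI class name here is made up, it would have to extend
AlgorithmParametersSpi):

public class MyTestProvider extends Provider {
    public MyTestProvider() {
        super("MyTestProvider", 1.0, "test-only provider sketch");
        // map the AlgorithmParameters engine to a stub SPI implementation
        put("AlgorithmParameters.DH", "my.tests.StubDHParametersSpi");
    }
}

// registered once in the test set-up with:
//     Security.addProvider(new MyTestProvider());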

Thanks,
Mikhail.

On 1/26/06, George Harley1 <GHARLEY@uk.ibm.com> wrote:
> Hi Mikhail,
>
> Thanks for your example.
>
> If I understand correctly (always a dangerous start to a sentence <g>),
> you would want a test case method to verify that a
> NoSuchAlgorithmException gets thrown from method
> AlgorithmParameters.getInstance() when the named provider does not have a
> concrete implementation of AlgorithmParameters available?
> e.g.
>
> public void testGetInstanceThrowsNSAException() {
>    try {
>        AlgorithmParameters a = AlgorithmParameters.getInstance(...);
>        fail("Expected a NoSuchAlgorithmException");
>    } catch (java.security.NoSuchAlgorithmException expected) {
>        // expected: the named provider cannot supply AlgorithmParameters
>    }
> }
>
>
> You wouldn't want a test like this to run normally (i.e. with a provider
> that *was* able to supply a concrete AlgorithmParameters) because the test
> would always fail. Instead, you would only want a test like this to be run
> where you have a provider incapable of returning the desired object.
>
> If you agree with the above interpretation of your "exotic configuration"
> then isn't it the case that you could simply augment your set of "good"
> providers with a provider (let's call it "BadProvider") that has been
> deliberately made incomplete and could be referred to by name in the above
> test method? Then the getInstance call becomes
>
> AlgorithmParameters.getInstance("<algorithm name>", "BadProvider");
>
> This would mean that the test method *could* be included in the normal run
> of tests without needing to be skipped over. The "BadProvider" provider
> would be just another artifact in the test configuration, one whose role is
> to intentionally create failure scenarios so that the error handling can be
> exercised.
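>
> Such a "BadProvider" could be as small as a provider class with no engine
> mappings at all, e.g. (just a sketch):
>
> public class BadProvider extends Provider {
>     public BadProvider() {
>         super("BadProvider", 1.0, "deliberately incomplete provider");
>         // no put(...) calls, so it supplies no AlgorithmParameters
>         // (or anything else)
>     }
> }
>
> // installed once in the test set-up with:
> //     Security.addProvider(new BadProvider());
>
> With that registered, the getInstance call above is guaranteed to throw
> NoSuchAlgorithmException rather than NoSuchProviderException.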
>
> I'm not quite sure where the EncryptedPrivateKeyInfo constructor comes
> into your scenario, since if your provider is incapable of actually
> returning a valid AlgorithmParameters instance in the first place then how
> does the test code acquire one to test the EncryptedPrivateKeyInfo
> constructor?
>
>
> Best regards,
> George
> ________________________________________
> George C. Harley
>
> Mikhail Loenko <mloenko@gmail.com>
> 26/01/2006 13:13
> Please respond to
> harmony-dev@incubator.apache.org
>
>
> To
> harmony-dev@incubator.apache.org
> cc
>
> Subject
> Re: [testing] code for exotic configurations
>
> We have tests that verify that our security framework works with the
> users' providers installed in the system.
>
> Consider the EncryptedPrivateKeyInfo class. To call one of its constructors
> you need an AlgorithmParameters instance from a provider. If the providers
> installed in your system do not have an implementation of
> AlgorithmParameters (or if you misconfigured your build) you'll be unable
> to test that constructor of EncryptedPrivateKeyInfo.
>
> So the exotic config is: the user's providers do not have an
> implementation of AlgorithmParameters.
>
> A test that fails when no AlgorithmParameters exists can scare a user,
> while a test that silently skips itself may keep being skipped forever due
> to misconfiguration.
>
> That is an example.
>
> Thanks,
> Mikhail
>
> On 1/26/06, George Harley1 <GHARLEY@uk.ibm.com> wrote:
> > Hi Mikhail,
> >
> > Just to help clarify things, could you give us all some examples of what
> > you define as an "exotic configuration" ?
> >
> > Thanks in advance,
> > George
> > ________________________________________
> > George C. Harley
> >
> > Mikhail Loenko <mloenko@gmail.com>
> > 26/01/2006 12:11
> > Please respond to
> > harmony-dev@incubator.apache.org
> >
> >
> > To
> > harmony-dev@incubator.apache.org, geir@pobox.com
> > cc
> >
> > Subject
> > Re: [testing] code for exotic configurations
> >
> > Do you mean that for a single test that verifies 10 lines of code
> > working on a very specific configuration I have to create a parallel test
> > tree?
> >
> > What about tests that work in two different exotic configurations?
> > Should we duplicate them?
> >
> > Thanks,
> > Mikhail
> >
> > On 1/26/06, Geir Magnusson Jr <geir@pobox.com> wrote:
> > > one solution is to simply group the "exotic" tests separately from the
> > > main tests, so they can be run optionally when you are in that exotic
> > > configuration.
> > >
> > > You can do this in several ways, including a naming convention, or
> > > another parallel code tree of the tests...
> > >
> > > I like the latter, as it makes it easier to "see"
> > >
> > > geir
> > >
> > >
> > > Mikhail Loenko wrote:
> > > > Well, let's start a new thread as this is a more general problem.
> > > >
> > > > Suppose we have some code designed for some exotic configurations,
> > > > and we have tests that verify that exotic code.
> > > >
> > > > A test, when run in a usual configuration (not an exotic one), should
> > > > report something that would not scare people. But if one wants to
> > > > test that specific exotic configuration, then he should be able to
> > > > easily verify that he successfully set up the required configuration
> > > > and that the test worked well.
> > > >
> > > > I see the following options here:
> > > > 1) introduce a new test status (like "skipped") to mark those tests
> > > >    that did not actually run
> > > > 2) agree on exact wording that the skipped tests would print, to
> > > >    allow grepping the logs later
> > > > 3) introduce test indicators that would fail when the current
> > > >    configuration disallows running certain tests
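> > > >
> > > > A rough sketch of option 3 (assuming JUnit-style tests and "DH" as
> > > > the algorithm name) could be:
> > > >
> > > >     public void testConfigSupportsAlgorithmParameters() {
> > > >         try {
> > > >             AlgorithmParameters.getInstance("DH");
> > > >         } catch (NoSuchAlgorithmException e) {
> > > >             fail("no installed provider supplies AlgorithmParameters;"
> > > >                     + " the tests that need it cannot run here");
> > > >         }
> > > >     }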
> > > >
> > > > Please let me know what you think
> > > >
> > > > Thanks,
> > > > Mikhail Loenko
> > > > Intel Middleware Products Division
> > > >
> > > >
> > >
> >
> >
> >
>
>
>
