commons-dev mailing list archives

From "Brett Henderson" <>
Subject RE: [codec] multipart encoders/decoders
Date Tue, 13 Jan 2004 02:34:37 GMT
> > There are obviously advantages to having a single unified framework
> > and if possible it would be the ideal result.  Unfortunately I have
> > run into performance disadvantages so far.  I haven't tried it for a
> > while but in the past my base 64 conversion has not been as fast as
> > the existing codec implementation for small conversions.
> > For common algorithms such as base 64 it may make sense to have
> > two implementations optimised for different purposes.
> That does not seem justified at first. Optimize last if at all... ;-)

Hehe, you're right.

I guess it just feels wrong pushing for stream support in codec when
its introduction will incur overhead for non-streamed cases.  Of
course in 99.9% of those cases the performance difference will be
immeasurable in the overall application :-)
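The one-shot vs. streamed distinction being discussed can be sketched with the modern JDK's java.util.Base64 (purely an illustration, not the codec code in question): the one-shot path is a single array conversion, while the streamed path wraps an OutputStream and must carry buffering state across writes, which is exactly the overhead that only matters for small inputs.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.util.Base64;

public class Base64Paths {
    // One-shot path: a single array-in/string-out call, no per-chunk state.
    public static String encodeOneShot(byte[] data) {
        return Base64.getEncoder().encodeToString(data);
    }

    // Streamed path: the wrapping encoder keeps partial-block state
    // between writes -- the initialisation/bookkeeping overhead that a
    // small one-shot conversion never pays.
    public static String encodeStreamed(byte[] data) throws IOException {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        OutputStream wrapped = Base64.getEncoder().wrap(sink);
        for (byte b : data) {          // deliberately byte-at-a-time
            wrapped.write(b);
        }
        wrapped.close();               // emits any trailing partial block + padding
        return sink.toString("US-ASCII");
    }
}
```

Both paths produce identical output; they differ only in how the input arrives.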

> > In addition, I'm not familiar with language codecs but you mentioned
> > it makes no sense to use these in streams.
> One of the things to keep in mind is that, for simple cases,
> the f/w should be invisible to the client code. For example:
> DigestUtil.md5Hex(new FileInputStream("boo.txt"));
> Gary

Hmm, that is definitely worth remembering.  The more generic I made
the design, the more coding was required in order to use it :-(
Perhaps a symptom of over-engineering; I hope not.
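For what it's worth, a one-line entry point can hide any amount of streaming machinery behind it.  A minimal sketch using only java.security.MessageDigest (the class and method names mirror Gary's example; this is not the actual codec implementation):

```java
import java.io.IOException;
import java.io.InputStream;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

// Hypothetical sketch of the "invisible framework" idea: the caller
// sees one static method, while the body streams the input in chunks.
public class DigestUtilSketch {
    public static String md5Hex(InputStream in) throws IOException {
        try {
            MessageDigest md5 = MessageDigest.getInstance("MD5");
            byte[] buffer = new byte[4096];
            int read;
            // Feed the digest chunk by chunk so the whole input
            // never has to sit in memory at once.
            while ((read = in.read(buffer)) != -1) {
                md5.update(buffer, 0, read);
            }
            StringBuilder hex = new StringBuilder();
            for (byte b : md5.digest()) {
                hex.append(String.format("%02x", b));
            }
            return hex.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException("MD5 unavailable", e);
        }
    }
}
```

Client code then reads exactly as Gary wrote it: `DigestUtilSketch.md5Hex(new FileInputStream("boo.txt"))`, with no visible framework.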

There are a few ways I can think of to deal with this:
1. Do nothing.  Force people to learn a new and more complicated API.
2. Create a new API that supports streaming, leaving the existing
API in place for the existing functionality and common use cases
not requiring stream support.
3. Add stream support to the existing API.
4. Create an API supporting stream processing and re-implement the
existing API using it.

Of these I think:
1. A non-starter, but I had to list it.  Backwards compatibility and
usability are two reasons against it.
2. This is a valid approach but leaves two distinct code-bases to support.
I hope there are other options available.
3. In most projects this tends to be the way things are done.  In this
case I'm not sure that it's practical; it may get fairly messy and create
an unmaintainable codebase.  I really need to spend more time looking at
the existing APIs in detail though.
4. I think a variation on this idea could work well in practice.
Codec could be conceptually designed in layers.  It could have a
low-level API that is modular and supports stream-based processing;
my library or some equivalent would fit this purpose.  A second
layer could then provide simplified access to the library for the
most common use cases, implementing the existing API and adding
new functionality as desired.

To give an example, Gary's call above could be implemented as follows:

  // DigestUtil.md5Hex(new FileInputStream("boo.txt"));
  public class DigestUtil {
    public static String md5Hex(InputStream inputStream)
        throws CodecException {
      BufferByteConsumer result = new BufferByteConsumer();
      ChainByteEngine chain = new ChainByteEngine(result);
      chain.append(new MD5());
      chain.append(new AsciiHexEncode());
      new InputStreamProducer(chain, inputStream).pump();
      // Assuming BufferByteConsumer exposes its collected bytes:
      return new String(result.getBytes());
    }
  }

There's some overhead in initialisation but most classes are fairly
lightweight.  All of the above classes have been implemented if you
wish to have a look.

I updated some of the classes last night; a copy can be found at:

