cocoon-users mailing list archives

From "Derek Hohls" <>
Subject Re: Aggregation - or is it? How do I combine multiple streams of SAX events?
Date Thu, 03 Jun 2004 06:13:53 GMT

Apologies in advance for the trivial advice... probably others will give
you better and more complete answers, but if I had to tackle this
problem and solve it right now, I would try to break the problem
up into logical pipelines.  The "mini" pipeline approach, where each
pipeline produces a particular type of sub-document, is efficient
(as it allows for caching at the lowest possible level).

You would also need a "master" document creation pipeline, which
will call each of the other sub-document pipelines at the
appropriate places, in the form:

<cinclude:include src="cocoon:/sub-doc-1-pipeline"/>

As you stated, this document can be created dynamically, so that
the particular set of cincludes will vary from one instance to the next.

The final result can then be transformed to the desired output.
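For illustration, the sitemap wiring described above might look roughly like this; pattern names, XSP file names, and the final stylesheet are placeholders, not from the original post:

```xml
<map:pipeline>
  <!-- One "mini" pipeline per sub-document type; each is
       independently cacheable at the lowest possible level -->
  <map:match pattern="sub-doc-1-pipeline">
    <map:generate type="serverpages" src="xsp/sub-doc-1.xsp"/>
    <map:serialize type="xml"/>
  </map:match>

  <!-- The "master" pipeline: the CIncludeTransformer resolves the
       cinclude:include calls to the sub-document pipelines, then
       the combined result is transformed to the desired output -->
  <map:match pattern="master">
    <map:generate type="serverpages" src="xsp/master.xsp"/>
    <map:transform type="cinclude"/>
    <map:transform src="stylesheets/final-output.xsl"/>
    <map:serialize/>
  </map:match>
</map:pipeline>
```

The cinclude transformation happens at request time, so each `cocoon:/` call hits its own (cacheable) pipeline rather than a serialized file on disk.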

I guess there are fancier approaches to this, such as writing
pre-created versions out to disk and reading them back based on some
condition... but I have successfully used the above approach in a
working app.
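To make the "created dynamically" part concrete, the master document could itself be an XSP page that emits one cinclude element per sub-document. A minimal sketch, assuming the key arrives as a request parameter named "code" and a hypothetical application helper `lookupSubDocs` that returns the pipeline names for that key:

```xml
<?xml version="1.0"?>
<xsp:page language="java"
          xmlns:xsp="http://apache.org/xsp"
          xmlns:cinclude="http://apache.org/cocoon/include/1.0">
  <document>
    <xsp:logic>
      // "request" is the implicit request object available in XSP logic
      String code = request.getParameter("code");
      // lookupSubDocs is an assumed application helper, not a Cocoon API
      String[] subDocs = lookupSubDocs(code);
      for (int i = 0; i &lt; subDocs.length; i++) {
    </xsp:logic>
        <cinclude:include>
          <xsp:attribute name="src">cocoon:/<xsp:expr>subDocs[i]</xsp:expr></xsp:attribute>
        </cinclude:include>
    <xsp:logic>
      }
    </xsp:logic>
  </document>
</xsp:page>
```

Because the include list is computed in the XSP, the frequently changing request-to-sub-document mapping stays out of the sitemap.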


>>> 2004/06/02 11:44:18 PM >>>
I have a set of XSP/ESQL based pipelines that generate XML documents
from an Oracle database. Essentially I'm using them as a sort of
customized bridge, and I've been very happy with Cocoon for that
purpose so far. (I'm using Cocoon 2.1.4 on Linux, but will upgrade to
2.1.5 at some point.)

Now I'd like to combine several of those individual XML documents into
a single document. I know there are many possible approaches -
map:aggregate, transformer, XInclude, taglibs, etc, etc. I've read
through the online docs and the Cocoon Wiki, but I haven't found
anything that makes sense yet.

What makes this more challenging are the following restrictions:

1. There's not one static list of sub-documents to include - it's
based on the request, with a mapping that changes fairly frequently,
so I'd rather not put it in my sitemap if I can easily avoid it.

2. I get a single key as a query parameter in the request; it's
basically a ZIP code/postal code. That's sufficient for me to query
the database for the first sub-document in the list, which contains
information about that postal code, including several mappings to
primary keys of other tables. I then need to read those mapped values
from the first sub-document in order to run the queries to get the
other sub-documents.

3. I know that I will have to worry about server capacity, given the
load that this application will see, so I need to design efficiently.
Because of this, I want to avoid any unnecessary steps I can. In
particular, I want to include the "raw" output of my XSP/ESQL (i.e.
the streams of SAX events) rather than serializing each document to
XML only to immediately reparse it. And yet, I'd like to cache as many
of the sub-documents as possible.

So, basically, I need to generate the first sub-document and send it
down the pipeline, but also read values out of it in order to generate
and "merge in" other streams of dynamically generated XML. I need to
be able to easily change what pieces are included for a given request
URI, and I need to do it without a lot of extra steps. (And while I'm
wishing, I'd like a
What would be the best approach to accomplish all of this? I would
very much appreciate some suggestions, with pointers to documentation
or source I could look at. A nudge in the right direction would be a
great help.

Thanks in advance!

Jeff Jones
jajones -at-
TWC Interactive

To unsubscribe, e-mail: 
For additional commands, e-mail: 


