cocoon-users mailing list archives

From "Robin Green" <>
Subject Aha! Got it! 64k limit (was: new version of the sql logicsheet under development)
Date Mon, 28 Aug 2000 16:28:20 GMT
No wait - I have a better idea!

There is no need to write out .ser files. It is inefficient to write them 
out and read them back in again. And declaring extra methods for common 
tasks will not help that much.

Even if you're in C2 and using SAX, the overhead of SAX->DOM->SAX is 
probably less than SAX->filesystem->SAX. Just pass in an array or Vector of 
literal XML fragments as DOM objects (Elements, DocumentFragments, Text 
nodes, and/or Attrs) to a one-time initialization method in the 
XSPPage class, which then stores them as a field (array or Vector). This is 
thread safe because it is only called once, before first execution. Then the 
populateDocument method can use <xsp:expr>-type code to insert these DOM 
objects directly into the output (cloning them first).
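A rough sketch of what I mean, assuming a DOM Level 2 parser (everything here except populateDocument is an invented name, just to show the shape):

```java
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.Node;
import java.util.Vector;

// Hypothetical sketch of the scheme: literal XML fragments are built
// once, stored in a Vector field, and deep-cloned into the output
// document on every request.
public class LiteralFragmentPage {
    private final Vector literals = new Vector();  // DOM Nodes, built once

    // One-time initialization: thread safe because it runs once,
    // before the first execution of the page.
    public void initLiterals(Document factoryDoc) {
        // Build the literal <a><b><c>hello</c></b></a> once.
        Element a = factoryDoc.createElement("a");
        Element b = factoryDoc.createElement("b");
        Element c = factoryDoc.createElement("c");
        c.appendChild(factoryDoc.createTextNode("hello"));
        b.appendChild(c);
        a.appendChild(b);
        literals.addElement(a);
    }

    // Per-request: clone the stored literal into the output document,
    // the way <xsp:expr>-generated code would.
    public void populateDocument(Document out, Node currentNode) {
        Node literal = (Node) literals.elementAt(0);
        // importNode(..., true) gives a deep clone owned by `out`.
        currentNode.appendChild(out.importNode(literal, true));
    }

    public static void main(String[] args) throws Exception {
        DocumentBuilder db =
            DocumentBuilderFactory.newInstance().newDocumentBuilder();
        LiteralFragmentPage page = new LiteralFragmentPage();
        page.initLiterals(db.newDocument());

        Document out = db.newDocument();
        Element root = out.createElement("root");
        out.appendChild(root);
        page.populateDocument(out, root);
        System.out.println(out.getDocumentElement()
                              .getFirstChild().getNodeName());  // a
    }
}
```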

You'd have an optimization process that would attempt to group literal 
XML into the largest possible clumps that could be inserted at once. 
For example, with this fragment:

  <a><b><c>hello</c> <xsp:logic>... </xsp:logic> </b></a>

you could have just one Element in the literal-vector, like this:

  <a><b><c>hello</c></b></a>

with an associated "instruction" to move the currentNode marker to just 
after </c> for the xsp:logic, and then another instruction to pop it back 
out to after </a> after the xsp:logic block had completed.

- so thinking about it a bit more, the literal-vector would not just contain 
DOM Objects, but also little instructions to move the currentNode. But this 
would be easy enough to implement - it's just the optimisation that would 
require a bit of thought.
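Concretely, the literal-vector might look something like this (all the opcode and step names are invented, purely to show the shape of the instruction stream):

```java
import java.util.Vector;

// Hypothetical sketch of the mixed literal-vector: entries are either
// DOM literals to clone in, or tiny instructions that move the
// currentNode marker around an xsp:logic block.
public class LiteralPlan {
    static class Step {
        final String op;   // "insert", "moveTo", "logic", or "popTo"
        final String arg;
        Step(String op, String arg) { this.op = op; this.arg = arg; }
        public String toString() { return op + "(" + arg + ")"; }
    }

    public static void main(String[] args) {
        Vector plan = new Vector();
        // For <a><b><c>hello</c> {xsp:logic} </b></a>:
        plan.addElement(new Step("insert", "a"));        // clone the whole literal <a>...</a>
        plan.addElement(new Step("moveTo", "b"));        // currentNode := <b>, i.e. just after </c>
        plan.addElement(new Step("logic",  "block-1"));  // run the generated xsp:logic code here
        plan.addElement(new Step("popTo",  "a.parent")); // pop back out to after </a>
        for (int i = 0; i < plan.size(); i++)
            System.out.println(plan.elementAt(i));
    }
}
```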

Unfortunately it wouldn't be this easy with SAX, which doesn't allow random 
access - with SAX I think you could only have well-formed sequential 
"chunks", not overlapping chunks as in the DOM example above.
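With SAX, the best you could do is replay a stored well-formed chunk as a fixed run of events, something like this (the recording/serializing classes are invented for illustration; only the SAX interfaces are real):

```java
import org.xml.sax.Attributes;
import org.xml.sax.ContentHandler;
import org.xml.sax.SAXException;
import org.xml.sax.helpers.AttributesImpl;
import org.xml.sax.helpers.DefaultHandler;

// Hypothetical sketch: a pre-recorded, well-formed SAX chunk for
// <c>hello</c>, replayed in order into any ContentHandler. There is
// no currentNode to move, so chunks cannot overlap as in the DOM case.
public class SaxChunk {
    static void replay(ContentHandler h) throws SAXException {
        AttributesImpl none = new AttributesImpl();
        h.startElement("", "c", "c", none);
        char[] text = "hello".toCharArray();
        h.characters(text, 0, text.length);
        h.endElement("", "c", "c");
    }

    public static void main(String[] args) throws Exception {
        // A throwaway handler that serializes the events back to text,
        // just to show the chunk is well-formed on its own.
        final StringBuffer out = new StringBuffer();
        replay(new DefaultHandler() {
            public void startElement(String u, String l, String q,
                                     Attributes a) {
                out.append("<").append(q).append(">");
            }
            public void characters(char[] ch, int s, int len) {
                out.append(ch, s, len);
            }
            public void endElement(String u, String l, String q) {
                out.append("</").append(q).append(">");
            }
        });
        System.out.println(out);  // <c>hello</c>
    }
}
```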

This will help a lot with large chunks of literal data. But there is still a 
theoretical limit even then - it's just that, with this scheme, I would have 
thought that limit would be far less likely to be reached.

What do y'all think? I tapped this out a bit hastily, so it might not be the 
clearest explanation.
