cocoon-users mailing list archives

From "Olivier Lange" <w...@petit-atelier.ch>
Subject RE: Large documents and fragments?
Date Tue, 09 Sep 2003 19:09:58 GMT
> This might be a bit outside of the normal cocoon usage. Has anyone else
> had any experience with this approach? Am I missing something obvious?
> Is there a better way?

Have you seen that Cocoon can be run from the command line? In that mode it
produces a static file for each matched URI in the sitemap, and it can follow
links between the generated files. The Cocoon documentation is built this way;
offline generation was in fact Cocoon's original purpose. Apache Forrest uses
Cocoon the same way, producing both static HTML and PDF documents. I'm doing
exactly this at the moment to generate a website offline.
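
To make the idea concrete, here is a minimal sketch (the URI pattern, file
layout and stylesheet name are invented for illustration, not taken from your
setup): an ordinary match in the sitemap is all the command-line mode needs,
and every URI that hits it is rendered through the pipeline and written to the
destination directory as a static file.

  <map:pipeline>
    <!-- hypothetical pattern: {1} is the article id captured by the wildcard -->
    <map:match pattern="articles/*/abstract.html">
      <map:generate src="content/articles/{1}.xml"/>
      <map:transform src="stylesheets/abstract2html.xsl"/>
      <map:serialize type="html"/>
    </map:match>
  </map:pipeline>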

> I'm wondering if the document should be split up into fragments. How would
> something like this be done with cocoon? Can you serialize to a disk file?

Yes. When you run Cocoon from the command line, a file is automatically created
for each matched URI, containing the serialized content.

The generation process could be split across several matchers, each one
composing some part of the document. You could even shield the inner processing
from external requests by putting it in so-called "internal" pipelines. The
"external" pipelines would then drive the processing: aggregate the content
built by the internal pipelines, transform it further, and serialize it.
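
For example (again just a sketch, with made-up patterns and stylesheet names,
assuming Cocoon 2.1 sitemap syntax): the fragment pipelines are marked
internal-only, so they can only be reached through the cocoon:/ protocol, and
the external pipeline aggregates their output before the final transform and
serialization.

  <map:pipeline internal-only="true">
    <!-- fragment pipelines, reachable only via cocoon:/ -->
    <map:match pattern="fragment/*/abstract">
      <map:generate src="content/articles/{1}.xml"/>
      <map:transform src="stylesheets/extract-abstract.xsl"/>
      <map:serialize type="xml"/>
    </map:match>
    <map:match pattern="fragment/*/biblio">
      <map:generate src="content/articles/{1}.xml"/>
      <map:transform src="stylesheets/extract-biblio.xsl"/>
      <map:serialize type="xml"/>
    </map:match>
  </map:pipeline>

  <map:pipeline>
    <!-- external pipeline: aggregates the fragments and renders the page -->
    <map:match pattern="articles/*/fulltext.html">
      <map:aggregate element="article">
        <map:part src="cocoon:/fragment/{1}/abstract"/>
        <map:part src="cocoon:/fragment/{1}/biblio"/>
      </map:aggregate>
      <map:transform src="stylesheets/article2html.xsl"/>
      <map:serialize type="html"/>
    </map:match>
  </map:pipeline>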

Is that of any help?

Olivier


-----Original Message-----
From: Fred Toth [mailto:ftoth@synernet.com]
Sent: Tuesday, 9 September 2003 16:01
To: users@cocoon.apache.org
Subject: Large documents and fragments?


Hi,

We work in the scientific publishing industry, and our typical source
materials are fairly large XML files, each containing a journal article with
all the usual pieces: abstract, bibliographic references, figures, tables, etc.

One of these documents typically yields multiple individual pages. For
example, we will have an abstract page, a full text page, a figure 1 page,
etc. Further, we will aggregate bits of 50 documents or so to produce a table
of contents.

I am looking for the best way to approach this with Cocoon. It seems
impractical to have a single source document drive all of these pages. I'm
wondering if the document should be split up into fragments. How would
something like this be done with Cocoon? Can you serialize to a disk file?

Also note that we are likely to be generating HTML offline rather than using
Cocoon to serve pages, but we still want to take advantage of sitemaps,
pipelines and all the other goodies to get the job done.

This might be a bit outside of the normal cocoon usage. Has anyone else
had any experience with this approach? Am I missing something obvious?
Is there a better way?

Many thanks!

Fred


---------------------------------------------------------------------
To unsubscribe, e-mail: users-unsubscribe@cocoon.apache.org
For additional commands, e-mail: users-help@cocoon.apache.org


