cocoon-dev mailing list archives

From "Antonio Gallardo" <agalla...@agsoftware.dnsalias.com>
Subject Re: [RT] Moving towards a new documentation system
Date Sat, 11 Oct 2003 16:31:02 GMT
Stefano Mazzocchi wrote:
>
> On Saturday, Oct 11, 2003, at 14:25 Europe/Rome, Nicola Ken Barozzi
> wrote:
>
>> Please don't forget Forrest.
>
> we are not.
>
> I thought about it and I totally resonate with Bertrand: we need to
> outline an incremental transition to our own CMS reusing as much dog
> food as possible (which is also good for a community perspective).
>
> Here is my proposal:
>
>   1) the system gets divided into three parts
>
>       - frontend -> mapped to http://cocoon.apache.org -> static
>       - repository -> a WebDAV server
>       - backend -> mapped to http://edit.cocoon.apache.org -> dynamic
>
>    2) each page is rendered with a wiki-like "edit this page" link
> that points to the exact same URI on the backend virtual host.
>
>    3) everybody is able to edit or to enter new pages, but they need
> to be approved to get published.
>
>    4) a committer can publish directly (through a login)
>
>    5) each page is considered an atomic learning object (LO) and is
> identified by a numerical URI
>
>    6) the process of creating a learning object (editing) and the
> process of linking some together, are kept independent.
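The URI scheme in items 1-5 could be sketched as a simple host swap plus a numerical learning-object id. This is only an illustration: the two hostnames come from the proposal, but the function names and the `/lo/<id>` path layout are assumptions, since the thread does not specify the numbering scheme.

```python
# Sketch of proposal items 1-5: the backend serves the exact same path,
# so the "edit this page" link only changes the virtual host.
# Hostnames come from the proposal; everything else is hypothetical.

FRONTEND = "http://cocoon.apache.org"        # static, Forrest-generated
BACKEND = "http://edit.cocoon.apache.org"    # dynamic (Lenya)

def edit_link(page_uri: str) -> str:
    """Map a published page URI to its 'edit this page' counterpart."""
    if not page_uri.startswith(FRONTEND):
        raise ValueError("not a frontend URI: " + page_uri)
    return BACKEND + page_uri[len(FRONTEND):]

def lo_uri(lo_id: int) -> str:
    """Each page is an atomic learning object identified by a numerical
    URI; the '/lo/' prefix here is an assumed layout, not specified."""
    return f"{FRONTEND}/lo/{lo_id}"
```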

I agree with this point, but this single point is complex. What if many
users post new versions of the same document? How will we handle that:
partial publications, or Bugzilla-style (overwriting the committed
change, if you wish)?
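One possible answer to this question is optimistic concurrency: a submission carries the revision it was edited from, and the backend rejects it if the stored page has moved on, instead of silently overwriting a committed change. A minimal sketch, assuming a simple revision counter (all names hypothetical, not anything Lenya actually does):

```python
# Hypothetical optimistic-concurrency check for concurrent submissions:
# a stale edit must be merged by its author rather than overwriting the
# committed change.

class Conflict(Exception):
    pass

pages = {}  # lo_id -> (revision, text)

def submit(lo_id, base_revision, text):
    """Accept a new version only if it was edited from the latest revision."""
    current_rev, _ = pages.get(lo_id, (0, ""))
    if base_revision != current_rev:
        raise Conflict(f"page {lo_id} is at r{current_rev}, "
                       f"you edited r{base_revision}")
    pages[lo_id] = (current_rev + 1, text)
    return current_rev + 1
```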

> The above is really important, IMO, because it separates concerns between:
>
>     - writers -> those who know about something and want to write
> about it
>     - editors -> those who know what users want to know and assemble
> knowledge for them
>
> The lack of separation between these two types of people is, IMO, what
> is missing from our documentation infrastructure. Note that wikis
> concentrate on the first part and leave the second part up to the
> collective task of content refactoring. I think this is weak and it's
> the worst part of the wiki pattern: we need something better.
>
> The future idea is to use indirect linking, where lookup is a sort of
> "what's related" derived from the analysis of user patterns, but this
> is far ahead in the future.
>
> For now, I think direct linking would be enough for our needs... we
> just need a good "lookup and discovery" of learning objects integrated
> in the backend.
>
>                                    - o -
>
> As for the implementation:
>
>   1) forrest will be used to generate the site from the contents of the
> repository
>
>   2) the repository can be either plain vanilla Subversion or a WebDAV
> server implemented by Cocoon on top of another repository (Subversion,
> Catacomb, or JSR 170). Even CVS would work, but we might want to stay
> away from it.
>
>   3) lenya will be used as the backend.
>
> Missing things:
>
>   1) is forrest already capable of doing what we ask?
>
>   2) what's the best repository? where do we install it?
>
>   3) is lenya enough for what we need? (Michi says so)
>
>   4) how much work is the integration of linotype with lenya? (I'll
> investigate this soon)
>
>   5) how do we get the wiki into the repository? (I plan to write a
> wiki-syntax editing pane for linotype, would this be enough?)
>
>   6) how do we get the rest of the data into the repository?
>
>   7) how do we make it simple to edit linkmaps? what searching and
> proximity tools can we provide for this?
>
> Enough of a braindump for now.
>
> Fire at will.
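The writer/editor separation in point 6 can be modeled as two independent stores: learning objects keyed by numerical id (written by writers), and a linkmap that editors assemble separately. A toy sketch, with all data and names invented for illustration:

```python
# Learning objects (written by writers) and the linkmap (assembled by
# editors) live in separate structures: editing one never touches the
# other, which is the separation of concerns the proposal asks for.

learning_objects = {
    101: "How the sitemap works",
    102: "Installing Cocoon",
}

# Direct linking for now: an ordered mapping of section -> LO ids.
linkmap = {
    "Getting Started": [102, 101],
}

def render_toc():
    """Assemble a table of contents from the linkmap without
    modifying any learning object."""
    toc = []
    for section, ids in linkmap.items():
        toc.append(section)
        toc.extend("  " + learning_objects[i] for i in ids)
    return toc
```

Because the two structures are independent, an editor can reorder or regroup pages in the linkmap while writers keep revising the learning objects themselves.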

Based on the latest posts, I would like to see a fusion of Lenya and
Forrest. Is this possible, or are they divorced?

Best Regards,

Antonio Gallardo


