Subject: Re: [RT] Moving towards a new documentation system
From: "Antonio Gallardo" <agallardo@agsoftware.dnsalias.com>
To: dev@cocoon.apache.org
Date: Sat, 11 Oct 2003 10:31:02 -0600 (CST)

Stefano Mazzocchi said:
>
> On Saturday, Oct 11, 2003, at 14:25 Europe/Rome, Nicola Ken Barozzi
> wrote:
>
>> Please don't forget Forrest.
>
> We are not.
>
> I thought about it and I totally resonate with Bertrand: we need to
> outline an incremental transition to our own CMS, reusing as much dog
> food as possible (which is also good from a community perspective).
>
> Here is my proposal:
>
> 1) The system gets divided into three parts:
>
>    - frontend   -> mapped to http://cocoon.apache.org      -> static
>    - repository -> a WebDAV server
>    - backend    -> mapped to http://edit.cocoon.apache.org -> dynamic
>
> 2) Each page is rendered with a wiki-like "edit this page" link that
> points to the exact same URI on the backend virtual host.
>
> 3) Everybody is able to edit or to enter new pages, but they need to
> be approved to get published.
>
> 4) A committer can publish directly (through a login).
>
> 5) Each page is considered an atomic learning object (LO) and is
> identified by a numerical URI.
>
> 6) The process of creating a learning object (editing) and the
> process of linking some together are kept independent.

I agree with this point, but this single point is complex. What if many
users post new versions of the same document? How will we handle this?
With partial publications, or Bugzilla-style (overwriting the committed
change, if you wish)?
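To make points 1, 2 and 5 concrete, here is a minimal sketch (in Java,
since Cocoon itself is Java) of how the "edit this page" link could be
derived from a published page URI. The two host names come from the
proposal; the class name, method name and example path are hypothetical.

public final class EditLinkResolver {

    private static final String FRONTEND = "http://cocoon.apache.org";
    private static final String BACKEND  = "http://edit.cocoon.apache.org";

    /** Returns the "edit this page" URI for a published frontend URI. */
    static String editUriFor(String pageUri) {
        if (!pageUri.startsWith(FRONTEND)) {
            throw new IllegalArgumentException("not a frontend URI: " + pageUri);
        }
        // Same path, different virtual host: only the authority changes,
        // so the static site and the editing backend share one URI space.
        return BACKEND + pageUri.substring(FRONTEND.length());
    }

    public static void main(String[] args) {
        // Prints: http://edit.cocoon.apache.org/docs/1234.html
        // (the numerical name stands in for a learning-object id, point 5)
        System.out.println(editUriFor("http://cocoon.apache.org/docs/1234.html"));
    }
}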
> The above is really important, IMO, because it separates concerns
> between:
>
>  - writers -> those who know about something and want to write about
>    it
>  - editors -> those who know what users want to know and assemble
>    knowledge for them
>
> The lack of separation between these two types of people is, IMO, what
> is missing from our documentation infrastructure. Note that wikis
> concentrate on the first part and leave the second part up to the
> collective task of content refactoring. I think this is weak and it's
> the worst part of the wiki pattern: we need something better.
>
> The future idea is to use indirect linking, where lookup is a sort of
> "what's related" derived from the analysis of user patterns, but this
> is far ahead in the future.
>
> For now, I think direct linking would be enough for our needs... we
> just need a good "lookup and discovery" of learning objects integrated
> in the backend.
>
> - o -
>
> As for the implementation:
>
> 1) Forrest will be used to generate the site from the contents of the
> repository.
>
> 2) The repository can be either plain vanilla Subversion or a WebDAV
> server implemented by Cocoon on top of another repository (either
> Subversion or Catacomb or JSR 170). Even CVS, but we might want to
> stay away from it.
>
> 3) Lenya will be used as the backend.
>
> Missing things:
>
> 1) Is Forrest already capable of doing what we ask?
>
> 2) What's the best repository? Where do we install it?
>
> 3) Is Lenya enough for what we need? (Michi says so.)
>
> 4) How much work is the integration of Linotype with Lenya? (I'll
> investigate this soon.)
>
> 5) How do we get the wiki into the repository? (I plan to write a
> wiki-syntax editing pane for Linotype; would this be enough?)
>
> 6) How do we get the rest of the data into the repository?
>
> 7) How do we make it simple to edit linkmaps? What searching and
> proximity tools can we provide for this?
>
> Enough of a braindump for now.
>
> Fire at will.

Based on the latest posts, I would like to see a fusion of Lenya and
Forrest. Is this possible, or are they divorced?

Best Regards,

Antonio Gallardo