forrest-dev mailing list archives

From David Crossley <>
Subject Re: [Proposal] forrestbot at
Date Wed, 26 Jan 2005 04:04:11 GMT
Dave Brondsema wrote:
> David Crossley wrote:
> >I suddenly realised the issue with "cocoon-trunk". It needs
> >to run its 'build docs' before running 'forrest'. It generates
> >some extra source documentation before forrest starts.
> >
> >There is a global parameter "forrest-exec" which can call
> >a shell script to do other things, then call forrest.
> A better option would be to have the forrestbot buildfile for 
> cocoon-trunk run its 'build docs', then you don't affect any other projects.

That is okay; my forrest-exec shell wrapper has a case statement
that switches on siteName to handle various special "pre-forrest"
operations. However, the length of time it takes causes the issue
that i mention below: a forrestbot run via the webapp will not
know that a forrestbot is already running via cron, and vice versa.
I think that can be fixed in my wrapper by checking/setting the
date on the cocoon-trunk forrestbot log.
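To make the idea concrete, here is a minimal sketch of such a wrapper. This is not the actual forrest-exec script; the variable names, log directory, and the 60-minute staleness threshold are all assumptions. It dispatches per-site "pre-forrest" steps via a case statement and skips the run when a fresh lock file (standing in for the forrestbot log date) suggests another instance is active:

```shell
#!/bin/sh
# Hypothetical forrest-exec wrapper: all paths and names are assumptions.

LOG_DIR="${LOG_DIR:-/tmp/forrestbot-logs}"   # assumed log/lock location
MAX_AGE_MINUTES=60                           # treat older locks as stale

acquire_lock() {
  site="$1"
  lock="$LOG_DIR/$site.lock"
  mkdir -p "$LOG_DIR"
  # a lock newer than MAX_AGE_MINUTES means a run (cron or webapp) is active
  if [ -n "$(find "$LOG_DIR" -name "$site.lock" -mmin "-$MAX_AGE_MINUTES" 2>/dev/null)" ]; then
    echo "forrestbot already running for $site; skipping"
    return 1
  fi
  touch "$lock"
}

pre_forrest() {
  # special "pre-forrest" operations, switched on siteName
  case "$1" in
    cocoon-trunk) echo "would run: cocoon's 'build docs'" ;;
    *)            echo "no pre-forrest step for $1" ;;
  esac
}

site="cocoon-trunk"
if acquire_lock "$site"; then
  pre_forrest "$site"
  echo "would run: forrest"
  rm -f "$LOG_DIR/$site.lock"
fi
```

The real wrapper would invoke the actual build commands instead of echoing them; the point is only the case dispatch plus the freshness check on a per-site file.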

Anyway, thanks, it is good to know that the forrestbot buildfile
has that capability - d'oh, i should have realised that.
Perhaps i can move some of the pre-forrest functionality there.

Youch, just tried it. I presume that you mean adding an <import>
for the cocoon-trunk/build.xml and then calling the forrestbot
with targets "docs" and then "build". I get errors from cocoon's
"docs" target because forrestbot is not running in the top-level
of cocoon's source tree.
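One way around the basedir problem might be to call cocoon's buildfile with the <ant> task instead of <import>, since <ant> takes an explicit dir attribute that becomes the basedir for the called targets. The paths below are assumptions about the local checkout layout, not forrestbot's actual configuration:

```
<!-- hedged sketch for the forrestbot buildfile -->
<target name="pre-forrest">
  <!-- run cocoon's 'docs' target with cocoon-trunk as its basedir -->
  <ant dir="${basedir}/work/svn/cocoon-trunk"
       antfile="build.xml"
       target="docs"
       inheritAll="false"/>
</target>
```

inheritAll="false" keeps forrestbot's own properties from leaking into cocoon's build.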

Also, because forrestbot only does an 'svn up' for the exact parts
of the sources that it requires, i need to do an 'svn up'
for the full cocoon-trunk. This needs to happen over in the
work/svn/cocoon-trunk space; then do cocoon's 'build docs' there,
then let forrestbot copy the sources to its ${}
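The sequence just described can be sketched as a short script. The work-area path and build commands are assumptions (echoed rather than executed here), standing in for whatever the real checkout layout uses:

```shell
#!/bin/sh
# Hypothetical pre-forrest sequence; paths are assumptions.
set -e

WORK="${WORK:-/tmp/forrestbot-work}"   # assumed forrestbot work area
SRC="$WORK/svn/cocoon-trunk"

mkdir -p "$SRC"

# 1. update the *whole* cocoon-trunk tree, not just the parts
#    forrestbot would check out on its own
echo "would run: svn up $SRC"

# 2. run cocoon's doc generation from its own top level, since
#    its 'docs' target assumes cocoon-trunk is the basedir
( cd "$SRC" && echo "would run: ./build.sh docs" )

# 3. hand off to forrestbot, which copies the sources from $SRC
#    into its build area as usual
echo "would run: forrestbot getsrc/build targets"
```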


> >I have the forrestbot working now for cocoon-trunk.
> >There is potential to have multiple cocoon-trunk builds
> >running, so i am currently finding a way to avoid that.
> >
> >--David
