forrest-dev mailing list archives

From "Gav...." <>
Subject Re: speeding up the static build (Was: Roadmap for v2)
Date Fri, 28 Oct 2005 11:35:04 GMT

----- Original Message ----- 
From: "David Crossley" <>
To: <>
Sent: Friday, October 28, 2005 3:32 PM
Subject: speeding up the static build (Was: Roadmap for v2)

| Gav.... wrote:
| >
| > Which shows, as I think I mentioned before, the only downside to static
| > site generation.
| > To make that one correction and add one word to the index.xml file meant
| > I had to rebuild the entire site with 'forrest site' and then re-upload.
| > I chose to override my editor's complaint that all files had been changed
| > and did I really want to upload the entire site again - no, just index.*
| > please!
| What do you mean by "editors complaints"?
| Are you using an editor to do the upload to
| your website? That is not Forrest's fault then.

Sometimes Dreamweaver MX, other times a dedicated FTP client; either
way, the files are changed as far as they are concerned. No, not Forrest's
fault, and apologies if it sounded that way.

| For the project website we use Subversion to store
| the generated content. In that way only the changed
| files get uploaded. We use forrestbot too of course.

I think I am going to give Forrestbot a try. This will not
solve the above, but it is one less thing to do myself.

| You could use 'scp' to specifically copy certain files.
| Someone suggested forrestbot by ftp too.

That's the method I will try.
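For what it's worth, here is a rough sketch of that 'scp' approach: only copy the files that changed since the last upload. The host, remote path, and marker-file convention are all hypothetical examples, not anything Forrest provides.

```shell
# Sketch only: print an scp command for every file under $1 that is
# newer than the marker file $2, then refresh the marker so the next
# run uploads only files changed since this one.
upload_changed() {
  site_dir=$1
  marker=$2
  if [ -f "$marker" ]; then
    # Only files modified since the last run
    find "$site_dir" -type f -newer "$marker"
  else
    # First run: everything
    find "$site_dir" -type f
  fi | while read -r f; do
    # "echo" makes this a dry run; remove it to actually copy.
    # user@example.org and /www/site are placeholder values.
    echo scp "$f" "user@example.org:/www/site/${f#"$site_dir"/}"
  done
  touch "$marker"
}
```

So after fixing index.xml and re-running 'forrest site', something like `upload_changed build/site .last-upload` would push only index.* instead of the whole tree.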

| Sure, we know that there are ways to speed the site
| build process. Cocoon CLI checksums. There is probably
| a Jira issue registered for that.

Nothing recent that I can see. I will uncomment the line
that reads <checksums-uri>build/work/checksums</checksums-uri>
in cli.xconf (site-author ?). I guess I don't need to do anything else.
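For clarity, the change being described is just uncommenting that one element in cli.xconf (its exact placement and surroundings may differ between Forrest versions, so check your own copy):

```xml
<!-- Uncommented, this tells the Cocoon CLI to record page checksums
     in build/work/checksums and skip regenerating pages whose
     checksums have not changed since the last build. -->
<checksums-uri>build/work/checksums</checksums-uri>
```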

| There is a trick that can cut down your turnaround time
| with building. In ...
| # The URL to start crawling from
| #project.start-uri=linkmap.html
| Uncomment that and set it to the specific page that
| you want. That will build that page, then of course
| it will keep crawling links from there. It may be
| confined to a sub-directory, but depending on links
| could end up generating the whole site.
| The main thing is that your page of interest is built
| first.

Thanks for that. I may be way off here as I don't know the
specifics, but I wonder if this process could be copied and then enhanced
somehow to create a 'build one file' tool.
Where you say 'that will build that page, then of course it will keep
crawling links from there' - stop right there, don't crawl, we are done.
Is that even feasible, and do you think it is worth it?
Of course, if the CLI checksums thing works then there is no need.
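To spell out the trick David describes, it is a one-line edit in forrest.properties (index.html here is just an example; set it to whatever page you are working on):

```
# The URL to start crawling from
#project.start-uri=linkmap.html
project.start-uri=index.html
```

The page named there is built first, and crawling then continues from its links.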

