incubator-allura-dev mailing list archives

From Igor Bondarenko <>
Subject Re: ideas for caching wiki pages, etc
Date Mon, 21 Oct 2013 17:59:07 GMT
Another solution could be: cache only the static content of the page and
always re-render macros. Something like this would do:

Before putting a page into the cache, strip all macros out of the source
markdown and replace each with a placeholder (e.g. MACRO:<macro-hash>).
Render the resulting markdown and put it in the cache. Before displaying
a page from the cache, find all macros in the source markdown, render
each separately, and replace the corresponding MACRO:<macro-hash> with
the rendered html.

It's trickier than Dave's option, though.
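A minimal sketch of that two-pass idea, assuming a generic `render_markdown` callable and a simplified `[[...]]` macro pattern (the helper names are illustrative, not Allura's actual API):

```python
import hashlib
import re

# Simplified macro pattern; Allura's real macro syntax is richer than this.
MACRO_RE = re.compile(r"\[\[.*?\]\]")


def _macro_hash(macro_text):
    return hashlib.sha1(macro_text.encode("utf-8")).hexdigest()


def cache_page(source, render_markdown):
    """Replace each macro with a MACRO:<hash> placeholder, render the
    remaining static markdown, and return the html to store in cache."""
    static_md = MACRO_RE.sub(lambda m: "MACRO:%s" % _macro_hash(m.group(0)),
                             source)
    return render_markdown(static_md)


def display_cached(source, cached_html, render_macro):
    """Re-render each macro found in the source markdown and substitute
    the result for its MACRO:<hash> placeholder in the cached html."""
    html = cached_html
    for macro in MACRO_RE.findall(source):
        html = html.replace("MACRO:%s" % _macro_hash(macro),
                            render_macro(macro))
    return html
```

The placeholder hash ties each spot in the cached html back to the macro text it came from, so the static rendering is done once while macro output stays fresh on every request.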

On Mon, Oct 21, 2013 at 8:07 PM, Dave Brondsema <> wrote:

> I'd like to address this soon. The summary is that we currently have a
> max char size for rendering markdown, since sometimes it can be
> extremely slow to render (and we've tried to improve that with no
> luck). A max char size is ugly, though, and we don't want it. We
> recently added caching for markdown rendering, but have only applied
> it to comments ("posts") so far. If we expand it to wiki pages,
> tickets, etc., then the max char limit can be removed or raised much
> higher. But a macro (e.g. include another page) is more likely to be
> used in wiki pages and tickets, and then our simple caching strategy
> won't work well because the macro won't be re-run. Anyone have ideas
> for how to do cache invalidation in that situation? One idea I have is
> pretty crude, but might work: check whether there are any macros in
> the markdown (search for '[[') and never cache those pages.
> --
> Dave Brondsema
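Dave's crude option amounts to a guard in front of the cache lookup; a minimal sketch, assuming a dict-like cache and a generic `render_markdown` callable (both hypothetical, not Allura's actual interfaces):

```python
def render_with_cache(source, cache, render_markdown):
    """Cache rendered html only for pages with no macros; '[[' marks a
    macro, so any page containing it is re-rendered on every request."""
    if "[[" in source:
        # Crude but safe: macro pages bypass the cache entirely.
        return render_markdown(source)
    html = cache.get(source)
    if html is None:
        html = render_markdown(source)
        cache[source] = html
    return html
```

The trade-off is that pages with macros get no caching at all, even for their static parts, which is what the strip-and-re-render scheme above avoids.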

Igor Bondarenko
