jackrabbit-users mailing list archives

From "Jukka Zitting" <jukka.zitt...@gmail.com>
Subject Re: Efficient versioning of nodes containing large binary data
Date Wed, 12 Sep 2007 22:51:40 GMT

On 9/12/07, Alexander Nesterov <alex.maddriver@gmail.com> wrote:
> The question is how to version nodes containing large binary data
> without making the repository size grow rapidly. I created a node
> representing a wiki page attachment, with a 'name' property that holds
> the name of the attachment and a 'data' property that stores the
> attachment content. This node is versionable. When changing the 'name'
> property I noticed that the whole node is copied to the version
> history, which causes the rapid growth of the repository size. Are
> there any means to make Jackrabbit act like Subversion, which uses a
> cheap-copy algorithm for copying and versioning? Or are there some
> programming patterns to version binary content efficiently in
> Jackrabbit?

Jackrabbit 1.4 will have a very nice solution to this issue. See
https://issues.apache.org/jira/browse/JCR-926 for the details.

Currently the only workaround to get cheap copies and versioning
operations for large binaries is to store the binaries somewhere else
and just refer to them in the content being copied or versioned.
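To see why that workaround keeps versions cheap, here is a minimal, self-contained sketch of the pattern. It is not Jackrabbit API: the BinaryStore class and the property layout are assumptions made up for illustration. The binary lives once in a content-addressed external store, and the versionable content carries only a small digest reference, so each new version costs tens of bytes instead of another copy of the payload.

```java
import java.security.MessageDigest;
import java.util.HashMap;
import java.util.Map;

// Hypothetical external binary store (not Jackrabbit API): binaries are
// keyed by their SHA-1 digest, so identical content is stored only once.
public class BinaryStore {
    private final Map<String, byte[]> blobs = new HashMap<>();

    // Store the binary once; the returned digest is the cheap reference.
    public String put(byte[] data) throws Exception {
        MessageDigest md = MessageDigest.getInstance("SHA-1");
        StringBuilder sb = new StringBuilder();
        for (byte b : md.digest(data)) {
            sb.append(String.format("%02x", b));
        }
        String key = sb.toString();
        blobs.putIfAbsent(key, data);
        return key;
    }

    public byte[] get(String key) {
        return blobs.get(key);
    }

    public static void main(String[] args) throws Exception {
        BinaryStore store = new BinaryStore();
        byte[] attachment = new byte[1024 * 1024]; // stand-in for a large file
        String ref = store.put(attachment);

        // The versionable node would carry only 'name' and this reference;
        // renaming the attachment produces a new version of a ~70-byte
        // string rather than another copy of the megabyte payload.
        String v1 = "name=report.pdf;dataRef=" + ref;
        String v2 = "name=report-final.pdf;dataRef=" + ref;
        System.out.println(v1.length() + " " + v2.length() + " "
                + store.get(ref).length);
    }
}
```

In real Jackrabbit terms this corresponds to keeping the 'data' property out of the versionable node (for example, storing the file on disk or in a non-versionable part of the tree) and versioning only a path, URI, or digest string that points to it.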


Jukka Zitting
