jackrabbit-users mailing list archives

From "Alexander Nesterov" <alex.maddri...@gmail.com>
Subject Efficient versioning of nodes containing large binary data
Date Wed, 12 Sep 2007 20:50:35 GMT

I'm quite new to Jackrabbit and my question may look pretty simple,
but I haven't found a good answer on the Internet or in the Jackrabbit
mailing list archives.

The question is: how can I version nodes containing large binary data
without making the repository size grow rapidly? I created a node
representing a wiki page attachment, with a 'name' property that holds
the name of the attachment and a 'data' property that stores the
attachment content. This node is versionable. When I changed the
'name' property, I noticed that the whole node is copied into the
version history, which causes rapid growth of the repository size. Are
there any means to make Jackrabbit act like Subversion, which uses a
cheap-copy algorithm for copying and versioning? Or maybe there are
some programming patterns to version binary content efficiently in
Jackrabbit?
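For reference, the setup I describe above looks roughly like the
following sketch against the JCR 1.0 (JSR-170) API. Node and file
names here are made up for illustration, and it assumes an already
configured repository and an open Session:

```java
import javax.jcr.Node;
import javax.jcr.Session;
import java.io.FileInputStream;

public class AttachmentVersioningSketch {

    // Illustrates the versioning behavior described above; the node
    // path and file name "diagram.png" are hypothetical.
    static void demonstrate(Session session) throws Exception {
        Node attachment = session.getRootNode().addNode("attachment");
        attachment.addMixin("mix:versionable");
        attachment.setProperty("name", "diagram.png");
        attachment.setProperty("data", session.getValueFactory()
                .createValue(new FileInputStream("diagram.png")));
        session.save();
        attachment.checkin(); // version 1.0 freezes 'name' AND 'data'

        // Changing only the small 'name' property still produces a new
        // frozen copy of the whole node, binary 'data' included.
        attachment.checkout();
        attachment.setProperty("name", "diagram-v2.png");
        session.save();
        attachment.checkin(); // version 1.1 duplicates the binary
    }
}
```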

Alexander Nesterov
