jackrabbit-users mailing list archives

From: woolly <p.b...@lbs-ltd.com>
Subject: Re: JackRabbit Relationships and Efficiency
Date: Fri, 10 Aug 2007 14:45:04 GMT

Thanks for all your responses. I'm having a little difficulty getting my head
around some of these JCR concepts, which eventually led to the "let's have a
look at the schema" exercise - which didn't help at all, and raised some
further questions.

For anyone in a similar predicament, I'm finding this page useful:

Thomas Mueller wrote:
> Hi,
>> I thought that databases optimised lookups based on the defined
>> relationships. Is this not the case?
> No, databases optimize lookups based on indexes, not on declared
> relationships. The Jackrabbit schema already has enough indexes, and the
> right ones.
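
To check my understanding: the kind of query those indexes serve would be a
point lookup by node id, something like the sketch below. The table and
column names are my guesses from the list further down, not the actual
schema.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    // Guessed sketch: a primary-key index on NODE_ID is what makes this
    // lookup fast - no declared relationships between tables needed.
    public class NodeLookup {
        static byte[] loadNodeData(Connection con, String nodeId)
                throws Exception {
            String sql = "SELECT NODE_DATA FROM DEFAULT_NODE WHERE NODE_ID = ?";
            try (PreparedStatement ps = con.prepareStatement(sql)) {
                ps.setString(1, nodeId);
                try (ResultSet rs = ps.executeQuery()) {
                    return rs.next() ? rs.getBytes(1) : null;
                }
            }
        }
    }
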
>> What about data integrity? Are we relying on JackRabbit to manage that
>> for us? But if so, surely it would want some help from the db?
> The problem is that adding referential integrity constraints slows the
> database down. Data integrity is enforced by using transactions instead:
> if a node is deleted, first the record in the _NODE table is removed, then
> the records in the _PROP table (or the other way round), all within a
> single transaction.
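
Again, to make that concrete for myself: a rough sketch of the
delete-in-one-transaction idea using plain JDBC. Table and column names are
my assumptions; the real persistence manager surely differs.

    import java.sql.Connection;
    import java.sql.PreparedStatement;

    public class NodeDelete {
        static void deleteNode(Connection con, String nodeId) throws Exception {
            con.setAutoCommit(false);           // start a transaction
            try (PreparedStatement delProps = con.prepareStatement(
                        "DELETE FROM DEFAULT_PROP WHERE NODE_ID = ?");
                    PreparedStatement delNode = con.prepareStatement(
                        "DELETE FROM DEFAULT_NODE WHERE NODE_ID = ?")) {
                delProps.setString(1, nodeId);
                delProps.executeUpdate();       // properties first...
                delNode.setString(1, nodeId);
                delNode.executeUpdate();        // ...then the node row
                con.commit();                   // both deletes or neither
            } catch (Exception e) {
                con.rollback();                 // integrity without FK constraints
                throw e;
            }
        }
    }
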
> I'm not an expert, but I think the schema is:
> DEFAULT_BINVAL: binary values
> DEFAULT_NODE: nodes
> DEFAULT_PROP: properties and values
> DEFAULT_REFS: references
> VERSION_BINVAL: versioned binary values
> VERSION_NODE: versioned nodes
> VERSION_PROP: versioned properties and values
> VERSION_REFS: versioned references
> Not sure about the FS entry tables.
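
The REFS tables make a bit more sense to me now. I assume they hold the
bookkeeping for REFERENCE properties set through the standard JCR API, which
is how the repository enforces referential integrity itself rather than
leaning on the database. A sketch with plain javax.jcr calls (the table
mapping is my assumption):

    import javax.jcr.Node;
    import javax.jcr.Session;

    public class ReferenceExample {
        static void link(Session session) throws Exception {
            Node target = session.getRootNode().addNode("target");
            target.addMixin("mix:referenceable");  // target needs a UUID
            Node source = session.getRootNode().addNode("source");
            source.setProperty("myRef", target);   // creates a REFERENCE property
            session.save();
            // Removing "target" now fails while source/myRef points at it:
            // the repository, not the database, enforces the constraint.
        }
    }
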
>> It just seems to be a very "strange" schema.
> I think it's quite logical. I didn't invent it, but I would have used
> the same architecture. How else would you design it?
> Thomas

View this message in context: http://www.nabble.com/JackRabbit-Relationships-and-Efficiency-tf4247534.html#a12092648
Sent from the Jackrabbit - Users mailing list archive at Nabble.com.
