jackrabbit-users mailing list archives

From "Thomas Mueller" <thomas.tom.muel...@gmail.com>
Subject Re: JackRabbit Relationships and Efficiency
Date Fri, 10 Aug 2007 10:08:28 GMT
Hi,

> i thought that databases optimised lookups
> based on the defined
> relationships. is this not the case?

No, databases optimize lookups based on indexes, not on declared
relationships, and the Jackrabbit schema already defines the right
indexes.

> what about data integrity? are we relying on JackRabbit to manage that
> for us? but if so, surely it would want some help from the db?

The problem is that referential constraints slow the database down.
Data integrity is instead enforced with transactions: when a node is
deleted, the record in the _NODE table and then the records in the
_PROP table (or the other way round) are deleted within a single
transaction.
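A minimal sketch of that idea (not Jackrabbit code, and the table layout here is a guess based on the table names below), using SQLite: two tables with no foreign-key constraint between them, kept consistent purely by deleting from both inside one transaction.

```python
import sqlite3

# Stand-ins for the DEFAULT_NODE and DEFAULT_PROP tables; note there is
# no referential constraint between them, only an index on NODE_ID.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE DEFAULT_NODE (NODE_ID TEXT PRIMARY KEY, DATA BLOB)")
con.execute("CREATE TABLE DEFAULT_PROP (NODE_ID TEXT, NAME TEXT, DATA BLOB)")
con.execute("CREATE INDEX PROP_IDX ON DEFAULT_PROP(NODE_ID)")

con.execute("INSERT INTO DEFAULT_NODE VALUES ('n1', X'00')")
con.execute("INSERT INTO DEFAULT_PROP VALUES ('n1', 'jcr:title', X'00')")
con.commit()

# Delete the node and its properties in one transaction: either both
# deletes are applied or neither, so integrity holds without the
# database enforcing a foreign key.
try:
    con.execute("DELETE FROM DEFAULT_NODE WHERE NODE_ID = 'n1'")
    con.execute("DELETE FROM DEFAULT_PROP WHERE NODE_ID = 'n1'")
    con.commit()
except sqlite3.Error:
    con.rollback()
    raise

print(con.execute("SELECT COUNT(*) FROM DEFAULT_PROP").fetchone()[0])  # 0
```

If the second DELETE failed, the rollback would restore the node row as well, so no orphaned property rows can survive.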

I'm not an expert, but I think the schema is:

> DEFAULT_BINVAL: binary values
> DEFAULT_NODE: nodes
> DEFAULT_PROP: properties and values
> DEFAULT_REFS: references

> VERSION_BINVAL: versioned binary values
> VERSION_NODE: versioned nodes
> VERSION_PROP: versioned properties and values
> VERSION_REFS: versioned references

Not sure about the FS entry tables.

> it just seems to be a very "strange" schema

I think it's quite logical. I didn't invent it, but I would have used
the same architecture. How else would you design it?

Thomas
