cocoon-dev mailing list archives

From Berin Loritsch <>
Subject Re: [RT] Adaptive Caching
Date Wed, 16 Jul 2003 18:14:59 GMT
Stefano Mazzocchi wrote:

> A pretty reasonable cost function could be
>  0.7 * time + 0.2 * memory + 0.1 * disk
> that reflects the real-life costs of the hardware used to operate the 
> machine. In fact, the "cost function" is better the more it mimics real 
> life economical costs.
> Why? well, the above states that 70% of the "cost of update" is in the 
> CPU computational capabilities + RAM access time + disk access time, 
> because it's normally harder to change those values. 20% is the cost of 
> RAM (because it's more expensive) and 10% is the cost of disk memory (in 
> case of huge drives, this cost can well be zero)

Taking the adaptive cache to new levels, we can also explore adaptive
cost functions--nothing in the overall architecture would have to change
for us to do that.

For example, if we have a requirement from our hosting service that we can
only use a certain amount of memory, as we approach that maximum the
efficiency of that parameter becomes much more important.

So the weighting constants would linearly adapt to the importance of the
constrained resource.

I.e. assuming we have the basic cost function:

0.7 * time + 0.2 * memory + 0.1 * disk
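As a minimal sketch, that static cost function might look like the
following (the class and method names are illustrative, not part of
Cocoon's API):

```java
public final class CacheCost {
    // Fixed weights reflecting the relative hardware costs from the
    // discussion: CPU/time dominates, then RAM, then disk.
    static final double W_TIME = 0.7, W_MEMORY = 0.2, W_DISK = 0.1;

    /** Weighted cost of (re)generating a cached resource. */
    static double cost(double time, double memory, double disk) {
        return W_TIME * time + W_MEMORY * memory + W_DISK * disk;
    }
}
```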

The closer we get to the administrator-mandated maximum, the more important
the cache efficiency for that resource is.  If the maximum is 65MB
(theoretically speaking) and the rest-state of Cocoon for our application
is 30MB, then the weighting for that particular resource is adjusted
proportionately from 0.2 to 1.0, and the other weightings are
proportionately reduced.

When our application is running at 51MB we are 60% of the way from our
rest state to our maximum.  The new weightings should be:

0.28 * time + 0.68 * memory + 0.04 * disk
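The interpolation above can be sketched as follows: the memory weight
moves linearly from its base value toward 1.0 as usage consumes the
headroom between the rest state and the maximum, and the remaining
weight mass is split between time and disk in their original 0.7:0.1
ratio.  Names and parameters are assumptions for illustration, not
Cocoon API:

```java
public final class AdaptiveWeights {
    // Base weights from the static cost function.
    static final double W_TIME = 0.7, W_MEMORY = 0.2, W_DISK = 0.1;

    /**
     * Returns { timeWeight, memoryWeight, diskWeight }, always summing
     * to 1.  As usedMb moves from restMb toward maxMb, the memory weight
     * rises linearly from W_MEMORY to 1.0 and the others shrink
     * proportionately.
     */
    static double[] weights(double usedMb, double restMb, double maxMb) {
        double u = (usedMb - restMb) / (maxMb - restMb); // headroom consumed
        u = Math.max(0.0, Math.min(1.0, u));             // clamp to [0, 1]
        double wMem = W_MEMORY + u * (1.0 - W_MEMORY);
        double remaining = 1.0 - wMem;                   // mass left for time + disk
        double wTime = remaining * (W_TIME / (W_TIME + W_DISK));
        double wDisk = remaining * (W_DISK / (W_TIME + W_DISK));
        return new double[] { wTime, wMem, wDisk };
    }
}
```

With usedMb = 51, restMb = 30 and maxMb = 65 this yields 0.28, 0.68 and
0.04, matching the example weightings; at 65MB it reaches 0.0, 1.0, 0.0.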

As you can see, an adaptive cost function working in conjunction with an
adaptive cache can add a new level of adaptability to hard requirements
without having to resort to too many explicit rules.

Again, when we reach our maximum of 65MB we are 100% of the way there.
The new weightings would be:

0.0 * time + 1.0 * memory + 0.0 * disk

We can never truly guarantee that we will never cross the 65MB threshold,
but we will have everything working to minimize the amount by which we
exceed it.


"They that give up essential liberty to obtain a little temporary safety
  deserve neither liberty nor safety."
                 - Benjamin Franklin
