cocoon-dev mailing list archives

From "Hunsberger, Peter" <Peter.Hunsber...@stjude.org>
Subject RE: [RT] Adaptive Caching
Date Wed, 16 Jul 2003 19:18:41 GMT
Berin Loritsch <bloritsch@apache.org> writes:

> Taking the adaptive cache to new levels, we can also explore adaptive
> cost functions--nothing will have to change from the overall
> architecture for us to do that.
> 
> For example, if we have a requirement from our hosting service that
> we can only use a certain amount of memory, as we approach that
> maximum the efficiency of that parameter becomes much more important.
> 
> So the weighting constants would linearly adapt to the importance of
> the requirement.
> 
> I.e. assuming we have the basic cost function:
> 
> 0.7 * time + 0.2 * memory + 0.1 * disk
> 
> The closer we get to the administrator-mandated maximum, the more
> important the cache efficiency for that resource is.  If the maximum
> is 65MB (theoretically speaking) and the rest-state of Cocoon for our
> application is 30MB, then the weighting is adjusted proportionately
> from 0.2 to 1.0 for that particular resource, and the other
> weightings are also proportionately adjusted.
> 
> When our application is running at 51MB, we are 60% of the way to our
> maximum.  The new weightings should be:
> 
> 0.28 * time + 0.68 * memory + 0.04 * disk
> 
> As you can see, an adaptive cost function working in conjunction with
> an adaptive cache can add a new level of adaptability to hard
> requirements without having to resort to too many explicit rules.
> 
> Again, when we reach our maximum of 65MB, we are 100% of the way to
> our maximum.  The new weightings would be:
> 
> 0.0 * time + 1.0 * memory + 0.0 * disk
> 
> We can never truly guarantee that we will never cross the 65MB
> threshold, but we will have everything working to minimize the amount
> by which we go over.
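
For concreteness, here is a minimal Java sketch of the proportional
re-weighting described above (the class and method names are made up
for illustration; none of this is Cocoon's actual cache code):

	// Hypothetical illustration of the re-weighting scheme quoted
	// above; not part of Cocoon's cache implementation.
	public class AdaptiveWeights {

	    // Base cost function: 0.7 * time + 0.2 * memory + 0.1 * disk
	    static final double TIME = 0.7, MEMORY = 0.2, DISK = 0.1;

	    // Returns {time, memory, disk} weightings for the current
	    // memory usage.  The memory weighting moves linearly from its
	    // base (0.2) to 1.0 as usage climbs from the rest state to
	    // the mandated maximum; the other two are scaled down in
	    // their original 0.7 : 0.1 ratio so all three still sum to 1.
	    static double[] weights(double usedMB, double restMB,
	                            double maxMB) {
	        double fraction = Math.min(1.0, Math.max(0.0,
	            (usedMB - restMB) / (maxMB - restMB)));
	        double memory = MEMORY + fraction * (1.0 - MEMORY);
	        double remainder = 1.0 - memory;
	        double time = remainder * TIME / (TIME + DISK);
	        double disk = remainder * DISK / (TIME + DISK);
	        return new double[] { time, memory, disk };
	    }

	    public static void main(String[] args) {
	        // At 51MB (60% of the way from 30MB to 65MB) this prints
	        // 0.28 * time + 0.68 * memory + 0.04 * disk, matching
	        // the figures above.
	        double[] w = weights(51, 30, 65);
	        System.out.printf(
	            "%.2f * time + %.2f * memory + %.2f * disk%n",
	            w[0], w[1], w[2]);
	    }
	}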

You'd get the same thing using a non-linear function instead of
variable weightings.  The problem with adaptive weightings is that
you'd in turn need some function to calculate those weightings (ad
infinitum).  Using non-linear combinations in turn allows for the
possibility of local minima (as would adaptive weightings, though in a
less obvious way).  However, if you stick with relatively simple
functions you should be OK.  E.g.:

	0.8 * t + 0.01 * m * m + 0.0001 * d * d * d

would essentially mean that memory only becomes important as it starts
to run out, and disk only becomes important as it runs out (and then
it becomes really important)...
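
A quick sketch of that cost function (hypothetical names; t, m, and d
are assumed to be usage figures in whatever units the cache tracks):

	// Non-linear cost: the quadratic memory term and cubic disk
	// term are negligible at low usage and dominate as usage grows.
	static double cost(double t, double m, double d) {
	    return 0.8 * t + 0.01 * m * m + 0.0001 * d * d * d;
	}

For example, at m = 10 the memory term contributes only 1.0, but at
m = 30 it contributes 9.0 and starts to swamp the time term, with no
separate machinery needed to adjust the weightings.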
