## Sunday, February 25, 2007

### Scalability Parameters

In a recent Guerrilla training class given at Capital Group in Los Angeles, Denny Chen (a GCaP alumnus) suggested a way to prove my Conjecture 4.1 (p. 65) that the two parameters α and β are both necessary and sufficient for the scalability model:

 C(N) = N / (1 + αN + βN(N − 1))

developed in Section 4.4 of the Guerrilla Capacity Planning book.

Basically, Denny observes that two parameters (the coefficients a and b) are needed to define an extremum in a quadratic function (e.g., a parabola passing through the origin, with c = 0), so a similar constraint should hold (somehow) for a rational function with a quadratic denominator. This is both plausible and cool. I don't know why I didn't think of it myself.
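A quick numerical sketch of that point (the parameter values below are made up purely for illustration): because the denominator is quadratic in N, the two parameters jointly fix where the scalability curve peaks.

```python
# Illustrative sketch: alpha (contention) and beta (coherency) together
# determine the location of the maximum of C(N).
# The parameter values are hypothetical, chosen only for demonstration.

def capacity(n, alpha, beta):
    """Relative capacity C(N) = N / (1 + alpha*N + beta*N*(N-1))."""
    return n / (1.0 + alpha * n + beta * n * (n - 1))

alpha, beta = 0.02, 0.001   # hypothetical contention and coherency values

# Scan integer N to locate the maximum numerically.
best_n = max(range(1, 201), key=lambda n: capacity(n, alpha, beta))
print(best_n, round(capacity(best_n, alpha, beta), 3))
```

With these illustrative values the curve rises, peaks at a finite N, and then retrogrades, which is exactly the extremum behavior the two-parameter argument hinges on.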

steve jenkin said...

The physical analogy is "countervailing forces" - to maintain an equilibrium state, a minimum of two forces is required. Remove gravity and the planets fly away from the Sun; remove centripetal force and they fall into the Sun. Two opposing forces are needed to maintain a steady-state condition.

Are there any cases in computer Performance Modelling where there can be three interacting forces of roughly equal magnitudes? Wouldn't that case require three variables to model? The only quick physical example is with gases (PV = nRT) condensing or being created by reactions - all three variables change, but you have to assume one is constant to calculate the others.
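A minimal numeric sketch of that gas-law point (units and values chosen arbitrarily): with three coupled state variables in PV = nRT, you can only solve for one by holding the others fixed.

```python
# Ideal gas law PV = nRT: with three state variables (P, V, T) in play,
# computing any one requires pinning the others.
# The numeric values below are arbitrary illustrations.

R = 8.314  # gas constant, J/(mol*K)

def pressure(n_mol, temp_k, vol_m3):
    """P = nRT / V, assuming amount n and temperature T are held constant."""
    return n_mol * R * temp_k / vol_m3

# One mole at 300 K: halving the volume doubles the pressure.
p1 = pressure(1.0, 300.0, 0.024)
p2 = pressure(1.0, 300.0, 0.012)
print(p2 / p1)
```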

There is another element:
How precise does the model have to be?

I.e., how close to the observed measurements does the predicted value need to be? Are there any cases in Performance Analysis where a high degree of precision is needed?

In Engineering, we'd refer to things being correct "to the first order" - meaning they were usable and pretty much correct.

You make the point in your classes/books that models are only models - not precise simulations. Models show the most important features of what's going on in systems: how things will trend when a system is pushed to its limits, how performance attributes will scale as load/demand is varied, and how you can expect the system to respond in regions where it's never been.

And maybe, most importantly, whether there are any pernicious circumstances/configurations - like Virtual Memory 'thrashing' - where, once it starts, the system will stay thrashing (and produce very little useful work).

Neil Gunther said...

As of today (Fri May 9 15:06:56 PDT 2008), I have the proof of Conjecture 4.1 on p. 65 of the GCaP book.