I was watching the Pivotal Labs video of their luncheon with New Relic, and the speaker, CEO Lewis Cirne, said something that struck me as really odd yet kind of clever.
As he was describing the architecture of their system, Cirne revealed that they run a constant "synthetic" load on their production system. He referred to the faux load agents as their "canaries in the coal mine." The purpose of these canaries, as he explained it, is that when traffic reaches a dangerous level - when performance begins to suffer - they can kill off the fake load, freeing up resources and, in the process, giving them a gauge of how much breathing room they have before they are really in trouble.
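The talk didn't go into implementation details, so here's a minimal sketch of the idea as I understand it - a background agent that issues fake requests and sheds itself when observed latency crosses a threshold. All of the names (`SyntheticCanary`, `probe`, the threshold and rate values) are my own invention, not anything New Relic described:

```python
import threading
import time

class SyntheticCanary:
    """Drives a steady stream of fake requests against the system and
    kills the fake load when observed latency crosses a threshold,
    handing its resources back to real traffic."""

    def __init__(self, latency_threshold_s, probe, rate_hz=10):
        self.latency_threshold_s = latency_threshold_s
        self.probe = probe              # callable standing in for one real request
        self.interval = 1.0 / rate_hz   # pacing between synthetic requests
        self.active = threading.Event()
        self.active.set()
        self._thread = None

    def _run(self):
        while self.active.is_set():
            start = time.monotonic()
            self.probe()                # issue one synthetic request
            latency = time.monotonic() - start
            if latency > self.latency_threshold_s:
                # Performance is suffering: the canary dies, freeing capacity.
                self.active.clear()
            time.sleep(self.interval)

    def start(self):
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def stop(self):
        self.active.clear()
        if self._thread is not None:
            self._thread.join()
```

The nice side effect Cirne described falls out naturally: the moment the canary sheds itself, you know roughly how close real traffic is to the edge, because the synthetic fraction of load is a known quantity.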
Right off the bat this ingenious ruse throws up two warning flags for me. First, is all this fake load unnecessarily shortening the lifetime of the hardware - thrashing the disk drives, generating excess heat, sucking electricity, and so on? And second, how sure are they that this fake load is representative of real load, and hence a true measure of spare capacity once it's removed?
What do you think?
Tuesday, October 14
1 comment:
Great observations.
The answer to #1 is "yes, definitely", although the degree is debatable. It may not matter to them; they may want to upgrade hardware before its failure point anyway.
As far as representativeness: I think it depends on their load profile. If their traffic is bursty enough (think Slashdot-effect-on-a-single-server bursty), then their fake load may not be enough to represent a real load increase.
Load testing is definitely important on high-volume sites, but personally I'd prefer to do occasional load testing and extrapolate the results rather than run fake load all the time.