Historical Note on Computer Capacity Management

In the old mainframe days, IBM computers reported extensive information about themselves in the form of RMF and SMF records. Barry Merrill's classic 1980 book and his related SAS code were widely used to gather and reduce mainframe utilization data. Tools like BEST/1, based on Dr. Jeff Buzen's pioneering application of Kleinrock's analytical queuing models to computer capacity analysis, allowed fairly accurate modeling, analysis, and calibration.
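To give a flavor of the analytical queuing models behind tools like BEST/1, here is a minimal sketch of a single-server M/M/1 queue, the simplest of Kleinrock's models. The function name and the example arrival/service rates are illustrative assumptions, not figures from the text or from BEST/1 itself.

```python
def mm1_metrics(arrival_rate: float, service_rate: float) -> dict:
    """Steady-state metrics for an M/M/1 queue (illustrative sketch).

    arrival_rate: mean arrivals per unit time (lambda)
    service_rate: mean completions per unit time (mu); must exceed arrival_rate
    """
    if arrival_rate >= service_rate:
        raise ValueError("unstable system: arrival rate must be < service rate")
    rho = arrival_rate / service_rate                # server utilization
    response = 1.0 / (service_rate - arrival_rate)   # mean time in system
    queue_len = rho * rho / (1.0 - rho)              # mean number waiting in queue
    return {"utilization": rho,
            "response_time": response,
            "avg_queue_length": queue_len}

# Example: 8 transactions/sec arriving at a server completing 10/sec.
m = mm1_metrics(8.0, 10.0)
# utilization = 0.8, mean response time = 0.5 sec, mean queue length = 3.2
```

Even this toy model captures the key capacity-planning insight: response time degrades sharply as utilization approaches 1, which is why analytical modeling beat "add hardware until it works" on expensive mainframes.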

Then came the personal computer revolution.

In the era of exploding availability of ever more powerful and cheaper personal computers, capacity planning became a largely forgotten art. It also wasn't as easy: the kind of self-reporting embedded into mainframe operating systems just wasn't there. Obtaining utilization data was a major chore, and modeling simply wasn't worth the effort. Computers were cheap, and if there were performance problems, you just added resources until the problem went away.

With cloud computing, it's back-to-the-future time, folks, and capacity planning is once again a much-sought-after skill. Keep in mind that in the cloud, for the most part, you are dealing with two levels of resources: virtual and physical. While getting into the cloud was easy, managing numerous cloud servers can be a challenge. Veterans who had capacity planning skills are being called out of retirement, dusting off their long-dormant tools and brushing away the cobwebs. Folks too young to remember are reinventing the tools of yesteryear for the cloud environment. Both are much needed.
