In 1960, Professor Fano had a vision of the computer utility: the computer system as a repository for a community's knowledge, its data, and its procedures, in a form that could be readily shared and built upon to create ever more powerful procedures, services, and active knowledge. Professor Corbató's goal was to provide the kind of central computer installation and operating system that could make this vision a reality. With funding from ARPA (the Advanced Research Projects Agency, later renamed DARPA), the system he developed became known as MULTICS.
For those who, unlike me, are still under sixty and therefore probably not old enough to remember, MULTICS (an acronym for Multiplexed Information and Computing Service) was an extremely influential early time-sharing operating system. Development began in 1964, and the system went live at MIT in 1969, proving that (mainframe-based) computing could serve many people in remote locations at the same time. It set creative minds to thinking about a generally available computer utility, connected to your house through a cable. MULTICS remained operational right up to the dot-com era: believe it or not, the last MULTICS system was not shut down until October 31, 2000.
Multics inspired far-reaching thoughts. I still have my original copy of Professor Fred Gruenberger's influential book, Computers and Communications: Toward a Computer Utility (Prentice-Hall, 1968), which I read when it first appeared. It was based in part on "The Computers of Tomorrow," a May 1964 article in Atlantic Monthly by Martin Greenberger, another influential computer scientist. Back then, I was an undergraduate, during an era characterized by pot-smoking, bra-burning, and anti-Vietnam War protests, and nearly all computing was still based on mainframes and batch processing. Punch cards were the norm both for programming and for data entry. Despite the prevailing limitations, Gruenberger looked at MULTICS and its teletype data entry terminals and saw far into the future. He imagined a "computing utility" that would operate much like an electrical utility, letting you draw as much or as little as you need while paying only for what you use.
Honeywell H6180 MULTICS computer.
What he articulated in detail didn't exist, except in his imagination; the technology wasn't there. Now it does exist, and we know it as cloud computing. In 1969, Leonard Kleinrock, an expert in queueing theory and one of the chief scientists of the original Advanced Research Projects Agency Network (ARPANET) project, which seeded the Internet, said: "As of now, computer networks are still in their infancy, but as they grow up and become sophisticated, we will probably see the spread of 'computer utilities' which, like present electric and telephone utilities, will service individual homes and offices across the country."
To appreciate the depth of this vision, let's remember where things really were in those days. Nearly all computing was still done with batch processing, punched cards were the primary means of input, and time-sharing was still in its infancy. As a 19-year-old, back when Lyndon Johnson was still president, I read an article in Business Week reporting that all you needed to do to attract venture capital funding was to stroll down Sand Hill Road in Menlo Park, California (then, as now, home to some of the most prominent and successful venture capitalists) and shout "time-sharing"; venture money would pour in. The idea back then was that we all needed slices of computing "on demand," but only for short bursts of activity.
Model 33 Teletype, including paper tape reader and printer (photo by Allison W, on display at The National Museum of Computing, licensed under the Creative Commons Attribution-Share Alike 3.0 Unported License).
Back then the Teletype Model 33 was the "terminal" of choice; its printing speed was limited to about 10 characters per second. It sent data at 110 bps (10 characters per second at 11 bits per character) and often used an acoustic coupler to connect to a telephone handset. Later, faster terminals supported 30–120 characters per second. Like Bill Gates, I gained my first computing experience on the Model 33.
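To get a feel for just how slow that was, here is a back-of-the-envelope sketch of the Model 33's throughput. The 66-line by 72-column page size is an assumption used purely for illustration:

```python
# Rough throughput of a Teletype Model 33.
BITS_PER_CHAR = 11      # 1 start bit + 8 data bits + 2 stop bits
LINE_RATE_BPS = 110     # the Model 33's line rate

chars_per_second = LINE_RATE_BPS / BITS_PER_CHAR   # 10 characters/second

# Time to print one full page of text (66 lines x 72 columns assumed):
page_chars = 66 * 72
seconds = page_chars / chars_per_second
minutes = seconds / 60

print(f"{chars_per_second:.0f} chars/sec; one page takes about {minutes:.1f} minutes")
```

At roughly eight minutes per printed page, it is easy to see why interactive sessions favored terse commands and minimal output.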
Of course, the slow speed limited the use of time-sharing to applications requiring limited data entry and output. Payroll and sales data entry from branch offices, sales force management, light accounting, and modeling were the prime applications.
Ultimately, however, the PC revolution put the kibosh on time-sharing: for a modest, one-time investment, users were no longer tethered to a money-guzzling mainframe over the maddeningly slow communication lines of that era, all the while being charged for it by the minute.
But while time-sharing died a slow and painful death, the concept behind it, "hosted applications," had enduring merit. As the world recognized that personal computers were not always appropriate or powerful enough, client/server applications became the next big thing: running locally what could be processed on a personal computer and using back-end servers for the heavy lifting. This was followed by three-tier solutions. In 2004 (in the infancy of cloud computing), when the world first started talking about "hosted applications," Laurie Sullivan noted in InformationWeek:
“Hosted enterprise applications are nothing new. They first emerged as time-sharing apps in the 1960s, when companies rented hardware and software computing resources because they lacked the money and expertise to run applications internally. Among the first to offer such services were IBM and General Electric. The strategy eventually morphed into the application-service-provider (ASP) model in the late 1990s. The business model for both approaches failed, giving rise to the next iteration of hosted applications. The hosted, on-demand model is the third wave,” says Jim Shepherd, a senior VP at AMR Research. “The difference is this time, heavy hitters like IBM and Oracle are pushing the concept, so there’s no question as to whether it will survive. . . . The question now is, how big will it become?”
Very big indeed.
Utility computing, InfoWeek wrote years later, “is a [type of cloud computing that provides a] way to increase capacity or add capabilities on the fly without investing in new infrastructure, training new personnel, or licensing new software. Cloud computing encompasses any subscription-based or pay-per-use service that, in real time over the Internet, extends IT’s existing capabilities.” That sure sounds like Gruenberger’s computer utility to me.