Think of your electric utility. A generator in a power plant converts mechanical energy to electrical energy and produces a fixed amount of electricity (the source of mechanical energy may be a reciprocating or turbine steam engine, water falling through a turbine or waterwheel, an internal combustion engine, a wind turbine, a hand crank, compressed air, or any other source of mechanical energy). The electricity generated goes out to “the grid” for delivery to customers. If it isn’t consumed, it is largely wasted, because electricity is generally too expensive to store (in a battery) for later use. The idea behind the grid is that many plants feed it, while many, many users draw electricity from it at varying times and in varying amounts.
Inevitably, there are peaks and valleys in demand. At low points in consumption, only the most efficient (least costly) generators run; at peak, the most expensive generators are fired up. Because a generator produces the same amount of electricity even when demand falls, unconsumed (excess) power is sold off to a distant utility where possible. Selling power over longer distances takes advantage of the fact that consumption peaks shift with time zones: at the 8:00 P.M. peak on the West Coast, it is already 11:00 P.M. on the East Coast, where demand has slackened, so sending excess power from East to West helps smooth out local peaks and valleys.
It’s the same concept with computing power, only on steroids. Not only does user demand vary wildly (Oprah Winfrey praises a Web site on her show and demand instantly skyrockets), but individual transactions have their own sharp peaks and valleys in resource consumption, demanding many processing cycles at one moment and almost none the next while waiting on file (database) access. Cloud computing and virtualization allow more effective use of computing resources, smoothing out the peaks and valleys and increasing average equipment utilization. The larger the overall cloud network, the more effectively the smoothing can be done. Thus public clouds enjoy an inherent efficiency advantage over private clouds, which in turn enjoy efficiency advantages over farms of older single-purpose (dedicated) servers.
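The smoothing effect described above can be illustrated with a toy simulation. This is a hypothetical model (not from the original text): each workload's hourly demand is drawn uniformly at random, and we compare the average utilization of capacity sized for the combined peak as the number of pooled workloads grows.

```python
import random

random.seed(42)  # make the illustrative run repeatable

def avg_utilization(num_workloads, hours=24, trials=200):
    """Average utilization when `num_workloads` bursty demands share one
    capacity pool provisioned for their combined peak (statistical
    multiplexing). Illustrative model: hourly demand is uniform in [0, 1)."""
    total = 0.0
    for _ in range(trials):
        # Aggregate hourly demand across all workloads for one simulated day.
        agg = [sum(random.random() for _ in range(num_workloads))
               for _ in range(hours)]
        peak = max(agg)                      # the pool must be sized for its peak
        total += sum(agg) / (hours * peak)   # fraction of peak capacity used
    return total / trials

# Pooling more independent workloads smooths the aggregate demand curve,
# so average utilization of peak-sized capacity rises with pool size.
for n in (1, 10, 100):
    print(f"{n:>3} workloads -> utilization {avg_utilization(n):.2f}")
```

The point of the sketch is only the trend: a single bursty workload leaves peak-sized capacity mostly idle, while a large pool of independent workloads keeps utilization high, which is the efficiency edge the text attributes to larger clouds.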
All rights reserved © 2018 Wisdom IT Services India Pvt. Ltd