One method of achieving interoperability is through a cloud broker. A cloud broker is a cloud that provides services to cloud consumers but might not host any of its own resources. In this example, the broker federates resources from Cloud 1 and Cloud 2, making them available transparently to cloud consumers. Cloud consumers interact only with the broker cloud when requesting services, even though the delivered services come from other clouds.
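The broker pattern described above can be sketched in code. This is a hypothetical illustration, not any standard API: the class and method names (`Cloud`, `CloudBroker`, `provision`) are invented for the example. The consumer calls only the broker, which hosts no resources itself and delegates each request to a federated cloud that offers the service.

```python
# Illustrative sketch of the cloud-broker pattern (names are hypothetical,
# not part of any standard): the broker federates provider clouds and
# transparently delegates consumer requests to them.

class Cloud:
    """A provider cloud offering a set of named services."""
    def __init__(self, name, services):
        self.name = name
        self.services = set(services)

    def provision(self, service):
        return f"{service} provisioned by {self.name}"

class CloudBroker:
    """Federates provider clouds; hosts no resources of its own."""
    def __init__(self, clouds):
        self.clouds = clouds

    def provision(self, service):
        # The consumer talks only to the broker; the broker finds a
        # federated cloud that offers the requested service.
        for cloud in self.clouds:
            if service in cloud.services:
                return cloud.provision(service)
        raise LookupError(f"no federated cloud offers {service!r}")

broker = CloudBroker([
    Cloud("Cloud 1", {"storage"}),
    Cloud("Cloud 2", {"compute"}),
])
print(broker.provision("compute"))  # → compute provisioned by Cloud 2
```

The key point of the pattern is in `CloudBroker.provision`: the consumer's request succeeds even though the broker owns none of the delivered resources.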
As the DMTF stated in its white paper on Interoperable Clouds, the goal of the Cloud Incubator is “to define a set of architectural semantics that unify the interoperable management of enterprise and cloud computing.” The building blocks provided will be used “to specify the cloud provider interfaces, data artifacts, and profiles to achieve interoperable management.” It outlines its deliverables thus:
The Phase 1 deliverables—reference architecture, taxonomy, use cases, priorities, submissions from vendors, and existing standards and initiatives—will be analyzed to deliver one or more cloud provider interface informational specification documents, which will be the basis for the development of future cloud standards. These documents will describe functional interfaces, protocols, operations, security, and data artifacts. These building blocks are being worked on, but have not yet been published. Phase 2 will deliver a recommendation for each standard, which will include the gaps and overlaps as well as abstract use cases (usage scenarios that describe one or more business contexts) applicable to the sub-domain of each standard.
Role of a cloud broker (Courtesy DMTF).
The evolution of cloud standards is coordinated by cloud-standards.org, which maintains a wiki for that purpose.
The Open Cloud Consortium (OCC) is a member-driven organization that supports the development of standards for cloud computing and frameworks for interoperating between clouds, develops benchmarks for cloud computing, and supports reference implementations for cloud computing.
The OCC also manages test-beds for cloud computing, such as the Open Cloud Testbed, and operates cloud computing infrastructures to support scientific research, such as the Open Science Data Cloud.
OCC seems to be an effort driven mostly by Yahoo! and its university research partners. However, its concerns are industry-wide; its Working Group on Standards and Interoperability For Large Data Clouds (one of four working groups at this writing) focuses on developing standards for interoperating large data clouds. For example, what are standard interfaces to storage clouds and compute clouds? What are appropriate benchmarks for large data clouds? The group is concerned with an architecture for clouds popularized by a series of Google technical reports that consists of a storage cloud providing a distributed file system.
The compute cloud supports MapReduce, a patented software framework introduced by Google to support distributed computing on large data sets across clusters of computers, and the data cloud supports table services. The open source Hadoop system follows this architecture. The working group uses the name large data clouds for these types of clouds.
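The MapReduce model mentioned above can be illustrated with a minimal, single-process word count; this is only a sketch of the programming model, whereas real systems such as Hadoop run the map and reduce phases distributed across a cluster, with a shuffle step moving intermediate pairs between machines.

```python
# Minimal single-process sketch of the MapReduce programming model
# (word count). Real frameworks such as Hadoop distribute both phases
# across a cluster; the logic of map and reduce is the same.

def map_phase(documents):
    # map: emit (key, value) pairs -- here (word, 1) for each word
    for doc in documents:
        for word in doc.split():
            yield word, 1

def reduce_phase(pairs):
    # shuffle + reduce: group the emitted pairs by key and sum the values
    counts = {}
    for key, value in pairs:
        counts[key] = counts.get(key, 0) + value
    return counts

docs = ["large data clouds", "data clouds"]
print(reduce_phase(map_phase(docs)))  # → {'large': 1, 'data': 2, 'clouds': 2}
```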
One big challenge with today’s PaaS offerings is that they are all fairly unique and incompatible with one another and with the way that enterprises run their applications. Once you select a PaaS offering, it is easy to become locked into that particular offering, unable to easily move your applications and data to another PaaS provider or back into your own datacenter should the need arise. This lock-in is a challenge for cloud computing as a whole.
Interclouding, DMTF, and OVF
Google’s Vint Cerf says that what we need is an Intercloud, a goal that Google, VMware, and several others are working to address within the DMTF, including through the Open Virtualization Format (OVF) specification. Just as personal computers didn’t really take off until there were just two standards (Apple and IBM-compatible), so too we can expect Interclouding, once supported by all major vendors, to greatly accelerate movement to the cloud.
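For concreteness, the heart of an OVF package is an XML descriptor. The skeleton below is an illustrative, incomplete fragment assembled from the DMTF envelope schema; it is not a valid, deployable package (real descriptors must enumerate disk sizes, hardware items, and checksums), but it shows how a portable appliance is described independently of any one vendor’s hypervisor.

```xml
<!-- Illustrative, incomplete skeleton of an OVF descriptor: one virtual
     machine backed by one disk image. Real packages require far more
     detail (disk capacities, hardware resource items, manifests). -->
<Envelope xmlns="http://schemas.dmtf.org/ovf/envelope/1"
          xmlns:ovf="http://schemas.dmtf.org/ovf/envelope/1">
  <References>
    <File ovf:id="disk1-file" ovf:href="appliance-disk1.vmdk"/>
  </References>
  <DiskSection>
    <Info>Virtual disks used by the appliance</Info>
    <Disk ovf:diskId="disk1" ovf:fileRef="disk1-file"/>
  </DiskSection>
  <VirtualSystem ovf:id="appliance-vm">
    <Info>A single virtual machine</Info>
    <VirtualHardwareSection>
      <Info>CPU, memory, and device requirements go here</Info>
    </VirtualHardwareSection>
  </VirtualSystem>
</Envelope>
```

Because the descriptor, not the hypervisor, defines the appliance, an OVF package can in principle be imported by any conforming platform, which is what makes it relevant to the Intercloud goal.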