Solving the Problem of Cloud Interoperability

There are a number of organizations looking into solving the problem of cloud federation

Reuven Cohen's "Elastic Vapor" Blog

A fundamental challenge in creating and managing a globally decentralized cloud computing environment is maintaining consistent connectivity between various untrusted components that are capable of self-organization while remaining fault tolerant. In the next few years, a key opportunity for the emerging cloud industry will be defining a federated cloud ecosystem by connecting multiple cloud computing providers using an agreed-upon standard or interface. In this post I will examine some of the work being done in cloud federation, ranging from adaptive authentication to modern P2P botnets.

Cloud computing is undoubtedly a hot topic these days; lately it seems just about everyone is claiming to be a cloud of some sort. At Enomaly our focus is on the supposed "cloud enablers": those daring enough to go out and create their very own computing clouds, either privately or publicly. In our work it has become obvious that the real problems are not in building these large clouds, but in maintaining them. Let me put it this way: deploying 50,000 machines is relatively straightforward; updating 50,000 machines, or worse yet taking back control after a security exploit, is not.

There are a number of organizations looking into solving the problem of cloud federation. Traditionally, a lot of this work has been done in the grid space. More recently, a notable research project being conducted by Microsoft, called the "Geneva Framework," has been focusing on some of the issues surrounding cloud federation. Geneva is described as a Claims Based Access Platform and is said to help simplify access to applications and other systems with an open and interoperable claims-based model.

In case you're not familiar with the claims authentication model, the general idea is to use claims about a user, such as age or group membership, that are passed along to obtain access to the cloud environment and to systems integrated with that environment. Claims could be built dynamically, picking up information about users and validating existing claims via a trusted source as the user traverses multiple cloud environments. More simply, the concept allows multiple providers to seamlessly interact with one another. The model enables developers to incorporate various authentication models that work with any corporate identity system, including Active Directory, LDAPv3-based directories, application-specific databases and newer user-centric identity models such as LiveID, OpenID and InfoCard systems, including Microsoft's CardSpace and Novell's Digital Me. For Microsoft, authentication seems to be at the heart of their interoperability focus. For anyone more Microsoft-inclined, Geneva is certainly worth a closer look.
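To make the claims model a little more concrete, here is a minimal sketch in Python of how a relying cloud service might accept a signed set of claims from a trusted issuer instead of authenticating the user itself. The function names, the token format and the shared-secret signing are my own illustrative assumptions; this is not the Geneva API, which in practice relies on certificates and federation trust rather than a shared key.

import hmac, hashlib, json

# Illustrative only: a shared secret stands in for the real trust relationship
# (certificates, federation metadata) between identity provider and cloud service.
ISSUER_SECRET = b"demo-federation-secret"

def issue_token(claims: dict) -> str:
    """Identity provider side: sign a set of claims about the user."""
    payload = json.dumps(claims, sort_keys=True)
    signature = hmac.new(ISSUER_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return payload + "." + signature

def validate_token(token: str, required_group: str) -> bool:
    """Relying cloud side: verify the issuer's signature, then check a claim."""
    payload, _, signature = token.rpartition(".")
    expected = hmac.new(ISSUER_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(signature, expected):
        return False  # not issued by a trusted source
    claims = json.loads(payload)
    return required_group in claims.get("groups", [])

token = issue_token({"sub": "alice", "groups": ["cloud-admins"], "age": 34})
print(validate_token(token, "cloud-admins"))  # True

The point of the pattern is that the relying cloud never needs to see the user's home identity store (Active Directory, an LDAP directory, an OpenID provider); it only needs to trust the issuer and read the claims it vouches for.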

For the more academically focused, I recommend reading a recent paper titled "Decentralized Overlay for Federation of Enterprise Clouds," published by Rajiv Ranjan and Rajkumar Buyya at the University of Melbourne. The team outlines the need for cloud decentralization and federation to create a globalized cloud platform. In the paper they say that a distributed cloud configuration should be considered decentralized if none of the components in the system are more important than the others: if one component fails, it is neither more nor less harmful to the system than the failure of any other component. The paper also outlines the opportunity to use peer-to-peer (P2P) protocols as the basis for these decentralized systems.
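As a rough illustration of that definition of decentralization (a sketch of my own, not code from the Ranjan and Buyya paper), a consistent-hashing overlay treats every node as an equal peer: any node can resolve the owner of a key, and the failure of any single node is no more harmful than the failure of any other.

import hashlib
from bisect import bisect_right

def node_id(name: str) -> int:
    # Hash names onto a circular identifier space.
    return int(hashlib.sha1(name.encode()).hexdigest(), 16)

class Ring:
    """Minimal consistent-hashing overlay: every node is an equal peer."""
    def __init__(self, nodes):
        self.ring = sorted(node_id(n) for n in nodes)
        self.by_id = {node_id(n): n for n in nodes}

    def owner(self, key: str) -> str:
        # A key belongs to the next node clockwise on the ring.
        h = node_id(key)
        idx = bisect_right(self.ring, h) % len(self.ring)
        return self.by_id[self.ring[idx]]

    def remove(self, name: str):
        # When a node fails, only the keys in its arc move to its successor.
        self.ring.remove(node_id(name))
        del self.by_id[node_id(name)]

ring = Ring([f"cloud-node-{i}" for i in range(8)])
print(ring.owner("customer-42"))
ring.remove("cloud-node-3")   # losing any one node is equally (and mildly) harmful
print(ring.owner("customer-42"))

Structured P2P overlays such as Chord build on this same idea with distributed routing, so no directory or broker sits above the peers, which is the kind of protocol the paper points to as a substrate for federation.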

The paper is very relevant given the latest discussions occurring in the cloud interoperability realm. It outlines several key problem areas:

  • Large scale – composed of distributed components (services, nodes, applications, users, virtualized computers) that combine to form a massive environment. These days enterprise clouds consisting of hundreds of thousands of computing nodes are common (Amazon EC2, Google App Engine, Microsoft Live Mesh), and hence federating them together leads to a massive-scale environment;
  • Resource contention – driven by the resource demand pattern and a lack of cooperation among end-users' applications, a particular set of resources can get swamped with excessive workload, which significantly undermines the overall utility delivered by the system;
  • Dynamic – the components can leave and join the system at will.

Another topic of the paper is the challenges involved in the design and development of a decentralized, scalable, self-organizing, federated cloud computing system, as well as applying the characteristics of peer-to-peer resource protocols, which they call Aneka-Federation. (I've tried to find other references to Aneka, but it seems to be a term used solely within the University of Melbourne; interesting nonetheless.)

Also interesting were the problems they outline with earlier distributed computing projects such as SETI@home, saying these systems do not provide any support for multi-application and programming models, a major factor driving some of the more traditional users of grid technologies toward cloud computing.

One of the questions large-scale cloud computing opens is not how to manage a few thousand machines, but how to manage a few hundred thousand. A lot of the work being done in decentralized cloud computing can be traced back to the emergence of modern botnets. A recent paper titled "An Advanced Hybrid Peer-to-Peer Botnet," by Ping Wang, Sherri Sparks, and Cliff C. Zou at the University of Central Florida, outlines some of the "opportunities" by examining the creation of a hybrid P2P botnet.

In the paper the UCF team outlines the problems encountered by P2P botnets, which appear surprisingly similar to the problems being encountered by the cloud computing community. The paper lays out the following practical challenges faced by botmasters: (1) how to generate a robust botnet capable of maintaining control of its remaining bots even after a substantial portion of the botnet population has been removed by defenders; (2) how to prevent significant exposure of the network topology when some bots are captured by defenders; (3) how a botmaster can easily monitor and obtain complete information about its botnet; and (4) how to prevent defenders from detecting bots via their communication traffic patterns, or at least make it harder. In addition, the design should also consider many network-related issues such as dynamic or private IP addresses and the diurnal online/offline pattern of bots. A very interesting read.
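Purely as an architectural illustration (a toy simulation of my own, not code from the UCF paper), the first challenge above is really a question about the overlay graph: if every node keeps even a small random peer list, the network tends to stay largely connected after defenders remove a big slice of the nodes.

import random
from collections import deque

def build_overlay(n_nodes: int, peers_per_node: int) -> dict:
    """Each node keeps a small random peer list (undirected for simplicity)."""
    graph = {i: set() for i in range(n_nodes)}
    for node in graph:
        for peer in random.sample([p for p in graph if p != node], peers_per_node):
            graph[node].add(peer)
            graph[peer].add(node)
    return graph

def largest_component(graph: dict) -> int:
    """Size of the biggest connected cluster that survives."""
    seen, best = set(), 0
    for start in graph:
        if start in seen:
            continue
        queue, size = deque([start]), 0
        seen.add(start)
        while queue:
            node = queue.popleft()
            size += 1
            for peer in graph[node]:
                if peer not in seen and peer in graph:
                    seen.add(peer)
                    queue.append(peer)
        best = max(best, size)
    return best

random.seed(1)
overlay = build_overlay(2000, 5)
removed = set(random.sample(list(overlay), 1000))          # defenders take out 50%
survivors = {n: overlay[n] - removed for n in overlay if n not in removed}
print(largest_component(survivors), "of", len(survivors), "nodes still connected")

Swap "bots" for "cloud nodes" and that same resilience is exactly what you want from a federated, self-organizing cloud.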

I am not condoning the use of botnets, but architecturally speaking we can learn a lot from our more criminally focused colleagues. Don't kid yourselves, they're already looking at ways to take control of your cloud and federation will be a key aspect in how you protect yourself and your users from being taken for a ride.

More Stories By Reuven Cohen

An instigator, part time provocateur, bootstrapper, amateur cloud lexicographer, and purveyor of random thoughts, 140 characters at a time.

Reuven is an early innovator in the cloud computing space as the founder of Enomaly in 2004 (acquired by Virtustream in February 2012). Enomaly was among the first to develop a self-service infrastructure-as-a-service (IaaS) platform (ECP), circa 2005, as well as SpotCloud (2011), the first commodity-style cloud computing spot market.

Reuven is also the co-creator of CloudCamp (held in 100+ cities around the globe), an unconference where early adopters of cloud computing technologies exchange ideas and the largest of the 'barcamp' style of events.


