Cloud Brokerage: The Market Unified

Commodity brokers don’t own the transport system for the market

This is Part III in a series by 6fusion Co-founder and CEO John Cowan on the emerging trend of Cloud Brokerage and the impact it will have on the technology industry and markets. Be sure to check out Part I of the series here and Part II here.

The feedback and fallout from Part II of this post has been quite interesting. I thought for sure the bulk of the flak I would have to take would be from the cloud vendor incumbents I said would be relegated to the world of retail cloud business. But since I posted my perspective I've found myself digging into the nature of the Total Addressable Market (TAM) for the Cloud Brokerage industry.

For those of you keeping score at home, I said the market for cloud brokerage is more than 10 times the market for cloud computing software and related services.

Yes, 10 times.

And it is because this market is so big that cloud brokerage will spawn the next generation of technology innovation.

But before I get to the underlying technologies that are on the horizon and necessary for the future that I, along with my collaborators, envision, let me first spend a few paragraphs to explain why I am not just pulling numbers out of my, um, ‘IaaS’.

On the 6fusion iNode Network the median server in production in the cloud is a quad-core, dual-processor unit with an average of 4TB of available storage. Using this standard configuration, partners and customers yield approximately $42,000 per year in net billable proceeds. I would consider that number, give or take on either side of it, a reasonable annual revenue estimate.

IDC recently reported that 2011 server shipments topped out at 8.3 million units. At a $42K clip, that is a market growing by a healthy $350 billion each year.

But of course, as we all know, server shelf life is not exactly the same as what you’d expect from a box of Krusty-O’s from the Kwik-E-Mart.

A quick trip down the hall to Gary Morris’s office at 6fusion is always an educational adventure.  “Depreciation,” Gary explains, “is a systematic and rational process of distributing the cost of tangible assets over the life of those assets. US GAAP calls for depreciation of servers using the server’s cost, estimated useful life and residual value. Typically, computers, software and equipment are depreciated over a period of 1 to 5 years, with the average useful life being 3 years.”

If we take Gary's GAAP average of three years as a multiplier, it means there is an estimated $1 trillion in billable utility computing presently in use around the world.
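The back-of-the-envelope arithmetic above can be sketched in a few lines. The figures are the estimates quoted in this post (per-server revenue, IDC's 2011 shipment count, the GAAP useful-life average), not independent market data:

```python
# Back-of-the-envelope TAM estimate using the figures cited in the post.
revenue_per_server = 42_000        # ~$42K net billable proceeds per server per year
servers_shipped_2011 = 8_300_000   # IDC: 2011 server shipments (units)
useful_life_years = 3              # GAAP average useful life for servers

# New billable capacity entering the market each year.
annual_market_growth = revenue_per_server * servers_shipped_2011
print(f"Annual growth: ${annual_market_growth / 1e9:.0f}B")    # ~$349B, i.e. "a healthy $350 billion"

# Installed base: roughly the last three years of shipments still in service.
installed_base_value = annual_market_growth * useful_life_years
print(f"Installed base: ${installed_base_value / 1e12:.2f}T")  # ~$1 trillion
```

The multiplier is the rough part of the estimate: it assumes shipment volume was flat over the three-year window, so treat the $1 trillion figure as an order-of-magnitude claim rather than a precise one.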

The point here is that cloud brokerage is underpinned by the availability of both private and public compute, network and storage resources.  And it is this massive untapped market that will drive the next wave of innovation.

If the origins of the cloud business belonged to the innovation of companies like Amazon, Rackspace and VMware, then the future of the cloud brokerage belongs to a new cadre of agnostic intermediaries that will enable a true utility computing marketplace to flourish.

The unification of the market is what I refer to as the point in time at which cloud computing technologies in production today can be used to interface to the commodity market.  In order for that to happen, cloud brokerage as an industry must form and deliver the underlying technologies necessary to make a true market.

Just what are these technologies? Let's take a look at three areas of innovation that will underpin the future of utility computing.

Cloud brokerage technologies are best considered in the context of supply, demand and delivery.

Universal Resource Metering: Quantification of Demand and Supply

I delivered a presentation in Asia a few weeks ago and I opened with a slide that had two simple definitions:  Utility and Commodity.

A Utility, I paraphrased, "is a service provided by organizations and consumed by a public audience."

A Commodity, according to common definition, “is a class of goods or services that is supplied without qualitative differentiation.”

Theoretically, you can have a utility without it necessarily being a commodity. But it rarely works that way, because in order to have a utility in the way we think about the utilities we consume every day, you must have scale. And in order to achieve scale, the utility must be pervasive and uniform. One should not require any special skills in order to use it. It must be simple and consistent to use. Think about your interaction with things like power or water services or subscribing to the Internet.

Utility is a word used quite often to describe the cloud. In a post a couple of months ago Simon Wardley aptly explained the difference between the cloud and a computer utility. The difference, says Wardley, is really only that "cloud was simply a word used by people to explain something that really wasn't well understood to people who were even more confused than they were."

So is the cloud really a computer ‘utility’?  Not yet.

You see, what the cloud is missing is the factor that truly negates qualitative differentiation – common measurement. You simply cannot claim something to be a true utility if every provider measures services differently.  Common utilities all share the characteristic of universal measurement.  Think about it.  Power. Water.  Energy.  The Internet.  Whatever.

A standardized unit of measurement for the computer utility will be one of the greatest innovations to come from the emerging market for cloud brokerage, because it will establish the basis from which a commodity market can emerge.
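As a purely hypothetical illustration of what a common unit buys you, the sketch below normalizes two providers' differently shaped offerings into one blended score. The resource dimensions, weights, baselines, and instance specs are all invented for illustration; this is not the WAC formula or any real provider's metering:

```python
# Hypothetical common metering unit: a weighted blend of resource dimensions.
# All weights, baselines, and instance specs below are invented for illustration.
WEIGHTS = {"cpu_ghz": 0.35, "ram_gb": 0.25, "storage_tb": 0.25, "network_mbps": 0.15}

# Per-dimension capacity that counts as "1.0" of that dimension.
BASELINE = {"cpu_ghz": 8.0, "ram_gb": 16.0, "storage_tb": 1.0, "network_mbps": 100.0}

def common_units(instance: dict) -> float:
    """Score an instance in provider-neutral units."""
    return sum(WEIGHTS[k] * instance[k] / BASELINE[k] for k in WEIGHTS)

# Two providers describing comparable capacity in different shapes:
provider_a = {"cpu_ghz": 9.6, "ram_gb": 32.0, "storage_tb": 2.0, "network_mbps": 100.0}
provider_b = {"cpu_ghz": 12.0, "ram_gb": 24.0, "storage_tb": 1.5, "network_mbps": 200.0}

print(common_units(provider_a))
print(common_units(provider_b))
```

Once both offerings are expressed in the same unit, price-per-unit comparison becomes mechanical, and "qualitative differentiation" between providers drops out of the transaction, which is exactly the property a commodity market needs.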

Cloud Infrastructure Federation: Tapping Global Supply

When you buy corn or wheat or soybeans by contract on a commodity exchange today, you don't buy a brand. You buy a commodity. Cloud brokers of the future will move commodities, not brands. Today, cloud brokers form 'partnerships' with service providers. But for a true brokerage model to blossom, there can be no room for vendor discrimination. Anyone who brings product to market can and should be able to trade it. Interoperability cannot be denied.

With this in mind, true cloud brokers will overcome the interoperability hurdle through collaboration and cooperation. This doesn't mean subscribing to one API framework or another, regardless of how high and mighty the leading retail cloud properties might become. It means removing oneself from the politics of the API game completely.

The Underlying Transport System:  Delivering the Commodity

It doesn’t always happen, but when a commodity contract comes due, something must be delivered.   The party that holds the paper for a hundred thousand units of corn must be able to take possession of it.  Modern commodity markets are supported by an elaborate network of supply chain delivery systems – from tankers to trains and transport trucks.

The equivalent underlying transport system must exist for the cloud infrastructure market.

Commodity brokers don't own the transport system for the market. And for good reason. However, if you subscribe to the early analyst view of cloud brokerage, they do. The analysts see brokers facilitating the transaction and delivering the compute commodity itself. To me, they either don't fully grasp the potential of the broker or they are describing something altogether different.

Cloud interoperability is not a new concept.  It has been bandied about the blogosphere for several years already.  The problem to date is that such movements have been nothing more than thinly veiled product sales pitches.  The cloud brokers of the future will drive the innovation to construct the underlying transport system to “connect the clouds.”

In the final part of this series I will explore the future state of cloud computing: a world where the immovable IT asset becomes movable in a commodity exchange.

More Stories By John Cowan

John Cowan is co-founder and CEO of 6fusion. John is credited as 6fusion's business model visionary, bridging concepts and services behind cloud computing to the IT Service channel. In 2008, he and his 6fusion collaborators successfully launched the industry's first single unit of measurement for x86 computing, known as the Workload Allocation Cube (WAC). John is a 12-year veteran of business and product development within the IT and Telecommunications sectors and a graduate of Queen's University at Kingston.
