
Cloud Brokerage: The Market Unified

Commodity brokers don’t own the transport system for the market

This is Part III in a series by 6fusion Co-founder and CEO John Cowan on the emerging trend of Cloud Brokerage and the impact it will have on the technology industry and markets. Be sure to check out Part I of the series here and Part II here.

The feedback and fallout from Part II of this post have been quite interesting. I thought for sure the bulk of the flak I would have to take would come from the cloud vendor incumbents I said would be relegated to the world of retail cloud business. But since I posted my perspective I’ve found myself digging into the nature of the Total Addressable Market (TAM) for the Cloud Brokerage industry.

For those of you keeping score at home, I said the market for cloud brokerage is more than 10 times the market for cloud computing software and related services.

Yes, 10 times.

And it is because this market is so big that cloud brokerage will spawn the next generation of technology innovation.

But before I get to the underlying technologies that are on the horizon and necessary for the future that I, along with my collaborators, envision, let me first spend a few paragraphs explaining why I am not just pulling numbers out of my, um, ‘IaaS’.

On the 6fusion iNode Network, the median server in production in the cloud is a dual-processor, quad-core unit with an average of 4 TB of available storage. Using this standard configuration, partners and customers yield approximately $42,000 per year in net billable proceeds. I would consider that number, give or take on either side of it, a reasonable annual revenue estimate.

IDC recently reported that 2011 server shipments topped out at 8.3 million units. At a $42K clip, that is a market growing by a healthy $350 billion each year.

But of course, as we all know, server shelf life is not exactly the same as what you’d expect from a box of Krusty-O’s from the Kwik-E-Mart.

A quick trip down the hall to Gary Morris’s office at 6fusion is always an educational adventure.  “Depreciation,” Gary explains, “is a systematic and rational process of distributing the cost of tangible assets over the life of those assets. US GAAP calls for depreciation of servers using the server’s cost, estimated useful life and residual value. Typically, computers, software and equipment are depreciated over a period of 1 to 5 years, with the average useful life being 3 years.”

If we take Gary’s use of the GAAP average as a multiplier, it means there is estimated to be over $1 trillion in billable utility computing presently in use around the world.
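To make the arithmetic explicit, here is a minimal back-of-the-envelope sketch using the figures above: the roughly $42K per-server annual yield, IDC’s 8.3 million 2011 server shipments, and the three-year GAAP average useful life. The variable names are my own shorthand for illustration, not anyone’s actual model.

```python
# Back-of-the-envelope TAM estimate using the figures cited above.
# All names are illustrative shorthand, not 6fusion's actual model.

annual_yield_per_server = 42_000      # USD of net billable proceeds per server per year
servers_shipped_2011 = 8_300_000      # IDC-reported 2011 server shipments
useful_life_years = 3                 # GAAP average useful life for servers

# New billable capacity added by one year of shipments
annual_market_growth = annual_yield_per_server * servers_shipped_2011
print(f"Annual market growth: ${annual_market_growth / 1e9:.0f}B")   # ~$349B, the ~$350B cited above

# Installed base, assuming roughly three vintages of servers remain in service
installed_base_value = annual_market_growth * useful_life_years
print(f"Billable utility computing in use: ${installed_base_value / 1e12:.2f}T")  # ~$1.05T
```

The second number is the one that matters here: a trillion-dollar pool of installed, billable capacity, most of it untapped by any brokerage mechanism today.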

The point here is that cloud brokerage is underpinned by the availability of both private and public compute, network and storage resources.  And it is this massive untapped market that will drive the next wave of innovation.

If the origins of the cloud business belonged to the innovation of companies like Amazon, Rackspace and VMware, then the future of the cloud brokerage belongs to a new cadre of agnostic intermediaries that will enable a true utility computing marketplace to flourish.

The unification of the market is what I refer to as the point in time at which the cloud computing technologies in production today can be used to interface with the commodity market. In order for that to happen, cloud brokerage as an industry must form and deliver the underlying technologies necessary to make a true market.

Just what are these technologies? Let’s take a look at three areas of innovation that will underpin the future of utility computing.

Cloud brokerage technologies are best considered in the context of supply, demand and delivery.

Universal Resource Metering: Quantification of Demand and Supply

I delivered a presentation in Asia a few weeks ago and I opened with a slide that had two simple definitions:  Utility and Commodity.

A Utility, I paraphrased, “is a service provided by an organization and consumed by a public audience.”

A Commodity, according to common definition, “is a class of goods or services that is supplied without qualitative differentiation.”

Theoretically, you can have a utility without it necessarily being a commodity. But it rarely works that way, because in order to have a utility in the way we think about the utilities we consume every day, you must have scale. And in order to achieve scale, the utility must be pervasive and uniform. One should not require any special skills to use it. It must be simple and consistent to use. Think about your interaction with things like power or water service or subscribing to the Internet.

Utility is a word used quite often to describe the cloud. In a post a couple of months ago, Simon Wardley aptly explained the difference between the cloud and a computer utility. The difference, says Wardley, is really only that “cloud was simply a word used by people to explain something that really wasn’t well understood to people who were even more confused than they were.”

So is the cloud really a computer ‘utility’?  Not yet.

You see, what the cloud is missing is the factor that truly negates qualitative differentiation – common measurement. You simply cannot claim something to be a true utility if every provider measures services differently.  Common utilities all share the characteristic of universal measurement.  Think about it.  Power. Water.  Energy.  The Internet.  Whatever.

A standardized unit of measurement for the computer utility will be one of the greatest innovations to come from the emerging market for cloud brokerage because it will establish the basis from which a commodity market can emerge.
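To make the idea concrete, here is a minimal sketch of what such a unit might look like: a hypothetical measure that normalizes CPU, memory, storage, and network consumption against fixed reference capacities, so any provider reporting the same raw usage yields the same number. The reference values and equal weighting are invented for illustration only; they are not the Workload Allocation Cube or any provider’s actual metering formula.

```python
from dataclasses import dataclass

# Hypothetical normalized unit of utility computing. Reference capacities and
# equal weighting are illustrative assumptions, not a real metering standard.

REFERENCE = {
    "cpu_ghz_hours": 10.0,      # GHz-hours of CPU consumed
    "ram_gb_hours": 16.0,       # GB-hours of memory consumed
    "storage_gb_hours": 100.0,  # GB-hours of storage consumed
    "network_gb": 5.0,          # GB transferred
}

@dataclass
class UsageSample:
    cpu_ghz_hours: float
    ram_gb_hours: float
    storage_gb_hours: float
    network_gb: float

def normalized_units(sample: UsageSample) -> float:
    """Average each dimension against its reference capacity, so identical
    raw consumption produces the same number regardless of provider."""
    ratios = [
        sample.cpu_ghz_hours / REFERENCE["cpu_ghz_hours"],
        sample.ram_gb_hours / REFERENCE["ram_gb_hours"],
        sample.storage_gb_hours / REFERENCE["storage_gb_hours"],
        sample.network_gb / REFERENCE["network_gb"],
    ]
    return sum(ratios) / len(ratios)

# The same workload, measured on two different providers, yields one number.
print(normalized_units(UsageSample(20.0, 32.0, 200.0, 10.0)))  # 2.0
```

The particulars don’t matter; what matters is that the unit is defined independently of any vendor’s billing meter, which is exactly what removes qualitative differentiation.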

Cloud Infrastructure Federation: Tapping Global Supply

When you buy corn or wheat or soybeans by contract on a commodity exchange today, you don’t buy a brand. You buy a commodity. Cloud brokers of the future will move commodities, not brands. Today, cloud brokers form ‘partnerships’ with service providers. But for a true brokerage model to blossom, there can be no possibility of vendor discrimination. Anyone who brings product to market can and should be able to trade it. Interoperability cannot be denied.

With this in mind, true cloud brokers will overcome the interoperability hurdle through collaboration and cooperation. This doesn’t mean subscribing to one API framework or another, regardless of how high and mighty the leading retail cloud properties might become. It means removing oneself from the politics of the API game completely.
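In practice, staying out of the API game usually means a thin adapter layer: the broker defines its own neutral contract and maps each provider’s API onto it. Below is a minimal sketch of that idea; the interface, provider classes, and capacity figures are hypothetical placeholders standing in for whatever real provider SDKs a broker would wrap.

```python
from abc import ABC, abstractmethod

# Broker-side abstraction: every supplier is reduced to the same neutral
# contract, so supply can be traded without caring whose API sits underneath.
# Provider classes and their numbers are hypothetical placeholders.

class ComputeSupplier(ABC):
    @abstractmethod
    def available_capacity(self) -> float:
        """Capacity currently offered, in normalized units."""

    @abstractmethod
    def provision(self, units: float) -> str:
        """Reserve capacity and return a broker-side reservation id."""

class ExampleProviderA(ComputeSupplier):
    def available_capacity(self) -> float:
        return 500.0  # would come from the provider's own API in reality

    def provision(self, units: float) -> str:
        return f"provider-a-reservation-{units}"

class ExampleProviderB(ComputeSupplier):
    def available_capacity(self) -> float:
        return 120.0

    def provision(self, units: float) -> str:
        return f"provider-b-reservation-{units}"

def fill_order(units: float, suppliers: list[ComputeSupplier]) -> list[str]:
    """Fill an order from whichever suppliers have capacity, with no
    preference for any particular vendor."""
    reservations, remaining = [], units
    for supplier in suppliers:
        if remaining <= 0:
            break
        take = min(remaining, supplier.available_capacity())
        if take > 0:
            reservations.append(supplier.provision(take))
            remaining -= take
    return reservations

print(fill_order(550.0, [ExampleProviderA(), ExampleProviderB()]))
```

The design point is that the neutral contract, not any one provider’s API, is where the broker’s leverage lives.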

The Underlying Transport System:  Delivering the Commodity

It doesn’t always happen, but when a commodity contract comes due, something must be delivered.   The party that holds the paper for a hundred thousand units of corn must be able to take possession of it.  Modern commodity markets are supported by an elaborate network of supply chain delivery systems – from tankers to trains and transport trucks.

The equivalent underlying transport system must exist for the cloud infrastructure market.

Commodity brokers don’t own the transport system for the market, and for good reason. However, if you subscribe to the early analyst view of cloud brokerage, they do. The analysts see brokers facilitating the transaction and delivering the compute commodity itself. To me, they either don’t fully grasp the potential of the broker or they are describing something altogether different.

Cloud interoperability is not a new concept.  It has been bandied about the blogosphere for several years already.  The problem to date is that such movements have been nothing more than thinly veiled product sales pitches.  The cloud brokers of the future will drive the innovation to construct the underlying transport system to “connect the clouds.”

In the final part of this series I will explore the future state of cloud computing: a world where the immovable IT asset becomes movable on a commodity exchange.

More Stories By John Cowan

John Cowan is co-founder and CEO of 6fusion. John is credited as 6fusion's business model visionary, bridging the concepts and services behind cloud computing to the IT service channel. In 2008, he, along with his 6fusion collaborators, successfully launched the industry's first single unit of measurement for x86 computing, known as the Workload Allocation Cube (WAC). John is a 12-year veteran of business and product development within the IT and telecommunications sectors and a graduate of Queen's University at Kingston.
