
Cloud Brokerage: The Market Unified

Commodity brokers don’t own the transport system for the market

This is Part III in a series by 6fusion Co-founder and CEO John Cowan on the emerging trend of Cloud Brokerage and the impact it will have on the technology industry and markets. Be sure to check out Part I of the series here and Part II here.

The feedback and fallout from Part II of this post have been quite interesting. I thought for sure the bulk of the flak I would have to take would come from the cloud vendor incumbents I said would be relegated to the world of retail cloud business. But since I posted my perspective, I've found myself digging into the nature of the Total Addressable Market (TAM) for the Cloud Brokerage industry.

For those of you keeping score at home, I said the market for cloud brokerage is more than 10 times the market for cloud computing software and related services.

Yes, 10 times.

And it is because this market is so big that cloud brokerage will spawn the next generation of technology innovation.

But before I get to the underlying technologies that are on the horizon and necessary for the future that I, along with my collaborators, envision, let me first spend a few paragraphs explaining why I am not just pulling numbers out of my, um, ‘IaaS’.

On the 6fusion iNode Network, the median server in production in the cloud is a quad-core, dual-processor unit with an average of 4 TB of available storage. Using this standard configuration, partners and customers yield approximately $42,000 per year in net billable proceeds. I would classify that number, give or take on either side, as a reasonable annual revenue estimate.

IDC recently reported that 2011 server shipments topped out at 8.3 million units. At a $42K clip, that is a market growing by a healthy $350 billion each year.

But of course, as we all know, server shelf life is not exactly the same as what you’d expect from a box of Krusty-O’s from the Kwik-E-Mart.

A quick trip down the hall to Gary Morris’s office at 6fusion is always an educational adventure.  “Depreciation,” Gary explains, “is a systematic and rational process of distributing the cost of tangible assets over the life of those assets. US GAAP calls for depreciation of servers using the server’s cost, estimated useful life and residual value. Typically, computers, software and equipment are depreciated over a period of 1 to 5 years, with the average useful life being 3 years.”

If we take Gary’s GAAP average of a three-year useful life as a multiplier, it means there is an estimated $1 trillion or more in billable utility computing presently in use around the world.
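For the record, the back-of-the-envelope arithmetic behind those two figures works out as claimed. Here is a minimal sketch in Python using only the numbers cited above: the $42K per-server figure, IDC's 8.3 million 2011 shipments, and Gary's three-year average useful life.

```python
# Back-of-the-envelope TAM arithmetic from the figures above.
revenue_per_server = 42_000        # net billable proceeds per server per year (USD)
servers_shipped_2011 = 8_300_000   # IDC-reported 2011 server shipments
useful_life_years = 3              # GAAP average useful life, per Gary

annual_market = revenue_per_server * servers_shipped_2011
installed_base_value = annual_market * useful_life_years

print(f"New billable capacity per year: ${annual_market / 1e9:.0f}B")          # ~ $349B
print(f"Billable utility computing in use: ${installed_base_value / 1e12:.2f}T")  # ~ $1.05T
```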

The point here is that cloud brokerage is underpinned by the availability of both private and public compute, network and storage resources.  And it is this massive untapped market that will drive the next wave of innovation.

If the origins of the cloud business belonged to the innovation of companies like Amazon, Rackspace and VMware, then the future of the cloud brokerage belongs to a new cadre of agnostic intermediaries that will enable a true utility computing marketplace to flourish.

The unification of the market is what I refer to as the point in time at which the cloud computing technologies in production today can be used to interface with the commodity market. In order for that to happen, cloud brokerage as an industry must form and deliver the underlying technologies necessary to make a true market.

Just what are these technologies? Let’s take a look at three areas of innovation that will underpin the future of utility computing.

Cloud brokerage technologies are best considered in the context of supply, demand and delivery.

Universal Resource Metering: Quantification of Demand and Supply

I delivered a presentation in Asia a few weeks ago and I opened with a slide that had two simple definitions:  Utility and Commodity.

A Utility, I paraphrased, “is a service provided by an organization and consumed by a public audience.”

A Commodity, according to common definition, “is a class of goods or services that is supplied without qualitative differentiation.”

Theoretically, you can have a utility without it necessarily being a commodity. But it rarely works that way, because in order to have a utility in the way we think about the utilities we consume every day, you must have scale. And in order to achieve scale, the utility must be pervasive and uniform. One should not require any special skills in order to use it. It must be simple and consistent to use. Think about your interaction with things like power or water services, or subscribing to the Internet.

Utility is a word used quite often to describe the cloud. In a post a couple of months ago, Simon Wardley aptly explained the difference between the cloud and a computer utility. The difference, says Wardley, is really only that “cloud was simply a word used by people to explain something that really wasn’t well understood to people who were even more confused than they were.”

So is the cloud really a computer ‘utility’?  Not yet.

You see, what the cloud is missing is the factor that truly negates qualitative differentiation – common measurement. You simply cannot claim something to be a true utility if every provider measures services differently.  Common utilities all share the characteristic of universal measurement.  Think about it.  Power. Water.  Energy.  The Internet.  Whatever.

A standardized unit of measurement for the computer utility will be one of the greatest innovations to come from the emerging market for cloud brokerage, because it will establish the basis from which a commodity market can emerge.
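To make the idea concrete, here is a purely hypothetical sketch of what a common metering unit could look like: heterogeneous resource consumption collapsed into one comparable number. The weights and the normalized_units name are illustrative assumptions for this post, not 6fusion's WAC formula or any existing standard.

```python
# Hypothetical sketch of a common metering unit: normalize heterogeneous
# resource consumption into one comparable number so any provider's
# capacity can be priced on the same basis. Weights are illustrative
# assumptions, not an actual industry formula.
WEIGHTS = {
    "cpu_ghz_hours": 1.0,
    "ram_gb_hours": 0.5,
    "storage_gb_hours": 0.05,
    "network_gb_transferred": 0.2,
}

def normalized_units(usage: dict) -> float:
    """Collapse a provider-reported usage record into a single metered unit."""
    return sum(WEIGHTS[k] * usage.get(k, 0.0) for k in WEIGHTS)

# Two providers reporting in their own native terms still land on one scale.
provider_a = {"cpu_ghz_hours": 120, "ram_gb_hours": 480, "storage_gb_hours": 2000}
provider_b = {"cpu_ghz_hours": 90, "ram_gb_hours": 720,
              "storage_gb_hours": 1500, "network_gb_transferred": 300}
print(normalized_units(provider_a), normalized_units(provider_b))
```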

Cloud Infrastructure Federation: Tapping Global Supply

When you buy corn or wheat or soybeans by contract on a commodity exchange today, you don’t buy a brand. You buy a commodity. Cloud brokers of the future will move commodities, not brands. Today, cloud brokers form ‘partnerships’ with service providers. But for a true brokerage model to blossom, there can be no possibility of vendor discrimination. Anyone who brings product to market can and should trade it. Interoperability cannot be denied.

With this in mind, true cloud brokers will overcome the interoperability hurdle through collaboration and cooperation. This doesn’t mean subscribing to one API framework or another, regardless of how high and mighty the leading retail cloud properties might become. It means removing oneself from the politics of the API game completely.
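One way to picture that API-agnostic posture is a thin, neutral supplier contract that the broker trades against, with each vendor's native API hidden behind an adapter. The sketch below is a hypothetical illustration; the class and method names are assumptions, not any real broker's interface.

```python
# Hypothetical illustration of API-agnostic federation: the broker trades
# against one neutral interface, and each provider's native API is wrapped
# in an adapter behind it. All names here are illustrative.
from abc import ABC, abstractmethod

class ComputeSupplier(ABC):
    """Neutral contract the broker trades against, regardless of vendor."""

    @abstractmethod
    def available_capacity(self) -> float:
        """Capacity on offer, expressed in the common metering unit."""

    @abstractmethod
    def provision(self, units: float) -> str:
        """Reserve capacity and return a delivery reference."""

class ProviderXAdapter(ComputeSupplier):
    """Wraps one vendor's native API; the broker never sees it directly."""
    def available_capacity(self) -> float:
        return 10_000.0  # would call the vendor's own inventory API here
    def provision(self, units: float) -> str:
        return f"providerX-reservation-{units}"

# The broker can source from any supplier that honors the contract.
suppliers: list[ComputeSupplier] = [ProviderXAdapter()]
order = next(s for s in suppliers if s.available_capacity() >= 500).provision(500)
print(order)
```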

The Underlying Transport System:  Delivering the Commodity

It doesn’t always happen, but when a commodity contract comes due, something must be delivered.   The party that holds the paper for a hundred thousand units of corn must be able to take possession of it.  Modern commodity markets are supported by an elaborate network of supply chain delivery systems – from tankers to trains and transport trucks.

The equivalent underlying transport system must exist for the cloud infrastructure market.

Commodity brokers don’t own the transport system for the market. And for good reason. However, if you subscribe to the early analyst view of cloud brokerage, they do. The analysts see brokers facilitating the transaction and delivering the compute commodity itself. To me, they either don’t fully grasp the potential of the broker or they are describing something altogether different.

Cloud interoperability is not a new concept.  It has been bandied about the blogosphere for several years already.  The problem to date is that such movements have been nothing more than thinly veiled product sales pitches.  The cloud brokers of the future will drive the innovation to construct the underlying transport system to “connect the clouds.”

In the final part of this series I will explore the future state of cloud computing: a world where the immovable IT asset becomes movable in a commodity exchange.

More Stories By John Cowan

John Cowan is co-founder and CEO of 6fusion. John is credited as 6fusion's business model visionary, bridging concepts and services behind cloud computing to the IT service channel. In 2008, he and his 6fusion collaborators successfully launched the industry's first single unit of measurement for x86 computing, known as the Workload Allocation Cube (WAC). John is a 12-year veteran of business and product development within the IT and telecommunications sectors and a graduate of Queen's University at Kingston.

