By John Cowan
May 2, 2012 07:30 AM EDT
This is Part III in a series by 6fusion Co-founder and CEO John Cowan on the emerging trend of Cloud Brokerage and the impact it will have on the technology industry and markets. Be sure to check out Part I of the series here and Part II here.
The feedback and fallout from Part II of this post have been quite interesting. I thought for sure the bulk of the flak I would have to take would be from the cloud vendor incumbents I said would be relegated to the world of retail cloud business. But since I posted my perspective I’ve found myself digging into the nature of the Total Addressable Market (TAM) for the Cloud Brokerage industry.
For those of you keeping score at home, I said the market for cloud brokerage is more than 10 times the market for cloud computing software and related services.
Yes, 10 times.
And it is because this market is so big that cloud brokerage will spawn the next generation of technology innovation.
But before I get to the underlying technologies that are on the horizon and necessary for the future that I, along with my collaborators, envision, let me first spend a few paragraphs explaining why I am not just pulling numbers out of my, um, ‘IaaS’.
On the 6fusion iNode Network, the median server in production in the cloud is a quad-core, dual-processor unit with an average of 4 TB of available storage. Using this standard configuration, partners and customers yield approximately $42,000 per year in net billable proceeds. Give or take on either side, I would classify that as a reasonable annual revenue estimate.
IDC recently reported that 2011 server shipments topped out at 8.3 million units. At a $42K clip, that is a market growing by a healthy $350 billion each year.
But of course, as we all know, server shelf life is not exactly the same as what you’d expect from a box of Krusty-O’s from the Kwik-E-Mart.
A quick trip down the hall to Gary Morris’s office at 6fusion is always an educational adventure. “Depreciation,” Gary explains, “is a systematic and rational process of distributing the cost of tangible assets over the life of those assets. US GAAP calls for depreciation of servers using the server’s cost, estimated useful life and residual value. Typically, computers, software and equipment are depreciated over a period of 1 to 5 years, with the average useful life being 3 years.”
If we take Gary’s use of the GAAP average as a multiplier, it means there is estimated to be over $1 trillion in billable utility computing presently in use around the world.
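The arithmetic above is simple enough to sketch in a few lines. The figures are the ones cited in this post ($42K per server per year, 8.3 million units shipped, 3-year average useful life); the rounding to $350 billion and $1 trillion falls out directly:

```python
# Back-of-the-envelope TAM estimate using the figures cited in the post.
ANNUAL_REVENUE_PER_SERVER = 42_000   # USD net billable proceeds per server per year
SERVERS_SHIPPED_2011 = 8_300_000     # units shipped in 2011, per IDC
USEFUL_LIFE_YEARS = 3                # average useful life under US GAAP depreciation

# New billable capacity entering the market each year.
annual_market_growth = ANNUAL_REVENUE_PER_SERVER * SERVERS_SHIPPED_2011

# Installed base: roughly one useful life's worth of shipments in service.
installed_base_value = annual_market_growth * USEFUL_LIFE_YEARS

print(f"New billable capacity per year: ${annual_market_growth / 1e9:.1f}B")   # ~$350B
print(f"Estimated installed base: ${installed_base_value / 1e12:.2f}T")        # over $1T
```

The installed-base figure is the rough one: it treats shipments as flat over the 3-year window, which is conservative given that unit volumes have been growing.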
The point here is that cloud brokerage is underpinned by the availability of both private and public compute, network and storage resources. And it is this massive untapped market that will drive the next wave of innovation.
If the origins of the cloud business belonged to the innovation of companies like Amazon, Rackspace and VMware, then the future of the cloud brokerage belongs to a new cadre of agnostic intermediaries that will enable a true utility computing marketplace to flourish.
The unification of the market is what I refer to as the point in time at which cloud computing technologies in production today can be used to interface to the commodity market. In order for that to happen, cloud brokerage as an industry must form and deliver the underlying technologies necessary to make a true market.
Just what are these technologies? Let’s take a look at three areas of innovation that will underpin the future of utility computing.
Cloud brokerage technologies are best considered in the context of supply, demand and delivery.
Universal Resource Metering: Quantification of Demand and Supply
I delivered a presentation in Asia a few weeks ago and I opened with a slide that had two simple definitions: Utility and Commodity.
A Utility, I paraphrased, “is a service provided by organizations that is consumed by a public audience.”
A Commodity, according to common definition, “is a class of goods or services that is supplied without qualitative differentiation.”
Theoretically, you can have a utility without it necessarily being a commodity. But it rarely works that way, because in order to have a utility in the way we think about the utilities we consume every day, you must have scale. And in order to achieve scale, the utility must be pervasive and uniform. One should not require any special skills to use it. It must be simple and consistent to use. Think about your interaction with things like power or water service, or subscribing to the Internet.
Utility is a word used quite often to describe the cloud. In a post a couple of months ago, Simon Wardley aptly explained the difference between the cloud and a computer utility. The difference, says Wardley, is really only that “cloud was simply a word used by people to explain something that really wasn’t well understood to people who were even more confused than they were.”
So is the cloud really a computer ‘utility’? Not yet.
You see, what the cloud is missing is the factor that truly negates qualitative differentiation – common measurement. You simply cannot claim something to be a true utility if every provider measures services differently. Common utilities all share the characteristic of universal measurement. Think about it. Power. Water. Energy. The Internet. Whatever.
A standardized unit of measurement for the computer utility will be one of the greatest innovations to come from the emerging market for cloud brokerage, because it will establish the basis from which a commodity market can emerge.
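To make the idea concrete, here is a minimal sketch of what a common measurement unit does: it collapses heterogeneous resource usage from any provider into one comparable number. The field names, weights, and formula below are purely illustrative assumptions of mine, not any vendor’s or standard body’s actual specification:

```python
# Hypothetical sketch of a common metering unit. The weights and formula
# are illustrative assumptions, not a real industry specification.
from dataclasses import dataclass

@dataclass
class UsageSample:
    cpu_ghz_hours: float      # CPU consumed, in GHz-hours
    ram_gb_hours: float       # memory consumed, in GB-hours
    storage_gb_hours: float   # storage allocated, in GB-hours
    net_gb: float             # network transferred, in GB

# Illustrative weights: how much each resource contributes to the
# composite unit. A real standard would fix these by industry consensus.
WEIGHTS = {"cpu": 1.0, "ram": 0.5, "storage": 0.01, "net": 0.2}

def compute_units(s: UsageSample) -> float:
    """Collapse heterogeneous resource usage into a single comparable number."""
    return (WEIGHTS["cpu"] * s.cpu_ghz_hours
            + WEIGHTS["ram"] * s.ram_gb_hours
            + WEIGHTS["storage"] * s.storage_gb_hours
            + WEIGHTS["net"] * s.net_gb)

# Two providers reporting in the same unit become directly comparable,
# regardless of how each one meters internally:
provider_a = compute_units(UsageSample(100, 200, 4000, 50))   # 250.0 units
provider_b = compute_units(UsageSample(120, 150, 3000, 80))   # 241.0 units
print(provider_a, provider_b)
```

The point is not the particular weights; it is that once both sides of a trade quote in the same unit, qualitative differentiation between providers disappears from the price, which is exactly the property a commodity market requires.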
Cloud Infrastructure Federation: Tapping Global Supply
When you buy corn or wheat or soybeans by contract on a commodity exchange today, you don’t buy a brand. You buy a commodity. Cloud brokers of the future will move commodities, not brands. Today, cloud brokers form ‘partnerships’ with service providers. But for a true brokerage model to blossom, there can be no possibility of vendor discrimination. Anyone who brings product to market can and should trade it. The denial of interoperability cannot happen.
With this in mind, true cloud brokers will overcome the interoperability hurdle through collaboration and cooperation. This doesn’t mean subscribing to one API framework or another, regardless of how high and mighty the leading retail cloud properties might become. It means removing oneself from the politics of the API game completely.
The Underlying Transport System: Delivering the Commodity
It doesn’t always happen, but when a commodity contract comes due, something must be delivered. The party that holds the paper for a hundred thousand units of corn must be able to take possession of it. Modern commodity markets are supported by an elaborate network of supply chain delivery systems – from tankers to trains and transport trucks.
The equivalent underlying transport system must exist for the cloud infrastructure market.
Commodity brokers don’t own the transport system for the market, and for good reason. However, if you subscribe to the early analyst view of cloud brokerage, they do. The analysts see brokers facilitating the transaction and delivering the compute commodity itself. To me, they either don’t fully grasp the potential of the broker or they are describing something altogether different.
Cloud interoperability is not a new concept. It has been bandied about the blogosphere for several years already. The problem to date is that such movements have been nothing more than thinly veiled product sales pitches. The cloud brokers of the future will drive the innovation to construct the underlying transport system to “connect the clouds.”
In the final part of this series I will explore the future state of cloud computing: a world where the immovable IT asset becomes movable in a commodity exchange.