By John Cowan
May 2, 2012 07:30 AM EDT
This is Part III in a series by 6fusion Co-founder and CEO John Cowan on the emerging trend of Cloud Brokerage and the impact it will have on the technology industry and markets. Be sure to check out Part I of the series here and Part II here.
The feedback and fallout from Part II of this post has been quite interesting. I thought for sure the bulk of the flak I would take would come from the cloud vendor incumbents I said would be relegated to the world of retail cloud business. But since posting my perspective, I've instead found myself digging into the nature of the Total Addressable Market (TAM) for the cloud brokerage industry.
For those of you keeping score at home, I said the market for cloud brokerage is more than 10 times the market for cloud computing software and related services.
Yes, 10 times.
And it is because this market is so big that cloud brokerage will spawn the next generation of technology innovation.
But before I get to the underlying technologies that are on the horizon and necessary for the future that I, along with my collaborators, envision, let me first spend a few paragraphs explaining why I am not just pulling numbers out of my, um, 'IaaS'.
On the 6fusion iNode Network, the median server in production in the cloud is a dual-processor, quad-core unit with an average of 4 TB of available storage. Using this standard configuration, partners and customers yield approximately $42,000 per year in net billable proceeds. I would classify that number, give or take, as a reasonable annual revenue estimate.
IDC recently reported that 2011 server shipments topped out at 8.3 million units. At a $42K clip, that is a market growing by a healthy $350 billion each year.
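For the arithmetic-inclined, here is the back-of-the-envelope math as a minimal sketch; it uses nothing beyond the two figures just quoted:

```python
# Back-of-the-envelope check on the annual market figure above.
REVENUE_PER_SERVER = 42_000       # USD/year, 6fusion iNode Network estimate
SERVERS_SHIPPED_2011 = 8_300_000  # units, per IDC

annual_market = REVENUE_PER_SERVER * SERVERS_SHIPPED_2011
print(f"New billable capacity per year: ${annual_market / 1e9:.0f}B")
# -> New billable capacity per year: $349B (roughly $350 billion)
```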
But of course, as we all know, server shelf life is not exactly the same as what you’d expect from a box of Krusty-O’s from the Kwik-E-Mart.
A quick trip down the hall to Gary Morris’s office at 6fusion is always an educational adventure. “Depreciation,” Gary explains, “is a systematic and rational process of distributing the cost of tangible assets over the life of those assets. US GAAP calls for depreciation of servers using the server’s cost, estimated useful life and residual value. Typically, computers, software and equipment are depreciated over a period of 1 to 5 years, with the average useful life being 3 years.”
If we take Gary's GAAP average as a multiplier, there is an estimated $1 trillion-plus in billable utility computing presently in use around the world.
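Extending the same sketch with Gary's three-year average useful life yields that installed-base figure:

```python
# Under a three-year useful life, roughly three annual shipment
# cohorts are in service at any given time.
ANNUAL_MARKET = 42_000 * 8_300_000  # ~$349B/year, from the sketch above
AVG_USEFUL_LIFE_YEARS = 3           # US GAAP average for servers

installed_base = ANNUAL_MARKET * AVG_USEFUL_LIFE_YEARS
print(f"Billable utility computing in use: ${installed_base / 1e12:.2f}T")
# -> Billable utility computing in use: $1.05T
```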
The point here is that cloud brokerage is underpinned by the availability of both private and public compute, network and storage resources. And it is this massive untapped market that will drive the next wave of innovation.
If the origins of the cloud business belonged to the innovation of companies like Amazon, Rackspace and VMware, then the future of the cloud brokerage belongs to a new cadre of agnostic intermediaries that will enable a true utility computing marketplace to flourish.
The unification of the market is what I call the point in time at which the cloud computing technologies in production today can interface with a commodity market. For that to happen, cloud brokerage as an industry must form and deliver the underlying technologies necessary to make a true market.
Just what are these technologies? Let's take a look at three areas of innovation that will underpin the future of utility computing.
Cloud brokerage technologies are best considered in the context of supply, demand and delivery.
Universal Resource Metering: Quantification of Demand and Supply
I delivered a presentation in Asia a few weeks ago and I opened with a slide that had two simple definitions: Utility and Commodity.
A Utility, I paraphrased, "is a service provided by organizations that is consumed by a public audience."
A Commodity, according to common definition, “is a class of goods or services that is supplied without qualitative differentiation.”
Theoretically, you can have a utility without it necessarily being a commodity. But it rarely works that way, because to have a utility in the sense we think of the utilities we consume every day, you must have scale. And to achieve scale, the utility must be pervasive and uniform. One should not require any special skills to use it. It must be simple and consistent to use. Think about your interaction with things like power or water service, or subscribing to the Internet.
Utility is a word used quite often to describe the cloud. In a post a couple months ago Simon Wardley aptly explained the difference between the cloud and a computer utility. The difference, says Wardley, is really only that “cloud was simply a word used by people to explain something that really wasn’t well understood to people who were even more confused than they were.”
So is the cloud really a computer ‘utility’? Not yet.
You see, what the cloud is missing is the one factor that truly negates qualitative differentiation: common measurement. You simply cannot claim something to be a true utility if every provider measures its services differently. Common utilities all share the characteristic of universal measurement. Think about it. Power. Water. Energy. The Internet. Whatever.
A standardized unit of measurement for the computer utility will be one of the greatest innovations to come out of the emerging market for cloud brokerage, because it will establish the basis from which a commodity market can emerge.
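To make that concrete, here is a hypothetical sketch of what a universal metering function might look like: a single blended unit derived from a provider's raw resource counters. The unit, the resource vectors, and the weights are all illustrative assumptions on my part, not an actual standard.

```python
from dataclasses import dataclass

@dataclass
class UsageSample:
    """Raw resource counters over one metering interval (illustrative)."""
    cpu_ghz_hours: float
    ram_gb_hours: float
    storage_gb_hours: float
    network_gb_transferred: float

# Hypothetical normalization weights. Any real standard would have to fix
# these by industry agreement -- which is precisely the point made above.
WEIGHTS = {
    "cpu_ghz_hours": 1.0,
    "ram_gb_hours": 0.5,
    "storage_gb_hours": 0.05,
    "network_gb_transferred": 0.2,
}

def blended_units(sample: UsageSample) -> float:
    """Collapse heterogeneous counters into one comparable metering unit."""
    return sum(getattr(sample, field) * w for field, w in WEIGHTS.items())

# Two providers reporting different raw mixes become directly comparable:
sample = UsageSample(cpu_ghz_hours=100, ram_gb_hours=400,
                     storage_gb_hours=2000, network_gb_transferred=50)
print(blended_units(sample))  # 410.0
```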
Cloud Infrastructure Federation: Tapping Global Supply
When you buy corn or wheat or soybeans by contract on a commodity exchange today, you don't buy a brand. You buy a commodity. Cloud brokers of the future will move commodities, not brands. Today, cloud brokers form 'partnerships' with service providers. But for a true brokerage model to blossom, there can be no room for vendor discrimination. Anyone who brings product to market can and should be able to trade it. Interoperability cannot be denied.
With this in mind, true cloud brokers will overcome the interoperability hurdle through collaboration and cooperation. This doesn't mean subscribing to one API framework or another, regardless of how high and mighty the leading retail cloud properties might become. It means removing oneself from the politics of the API game completely.
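One plausible way to stay out of those politics is a thin adapter layer: the broker defines its own neutral supply interface and treats every vendor API as a pluggable translation behind it. A minimal sketch, with a made-up provider and made-up methods:

```python
from abc import ABC, abstractmethod

class CloudSupply(ABC):
    """Broker-neutral view of any infrastructure supplier (illustrative)."""

    @abstractmethod
    def available_capacity(self) -> float:
        """Capacity on offer, expressed in the common metering unit."""

    @abstractmethod
    def provision(self, units: float) -> str:
        """Reserve capacity; returns an opaque reservation id."""

class ProviderXAdapter(CloudSupply):
    """Hypothetical adapter translating one vendor's API into the neutral one."""

    def available_capacity(self) -> float:
        # A real adapter would call the vendor's own API here and convert
        # its proprietary units into the common metering unit.
        return 1200.0

    def provision(self, units: float) -> str:
        return f"resv-{units:.0f}"

def fill_demand(suppliers: list, demand: float) -> str:
    """Naive placement: route demand to the supplier with the most headroom.
    The broker's logic never touches a vendor API, only CloudSupply."""
    supplier = max(suppliers, key=lambda s: s.available_capacity())
    return supplier.provision(demand)

print(fill_demand([ProviderXAdapter()], 300))  # resv-300
```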
The Underlying Transport System: Delivering the Commodity
It doesn’t always happen, but when a commodity contract comes due, something must be delivered. The party that holds the paper for a hundred thousand units of corn must be able to take possession of it. Modern commodity markets are supported by an elaborate network of supply chain delivery systems – from tankers to trains and transport trucks.
The equivalent underlying transport system must exist for the cloud infrastructure market.
Commodity brokers don't own the transport system for the market, and for good reason. Yet if you subscribe to the early analyst view of cloud brokerage, they do: the analysts see brokers facilitating the transaction and delivering the compute commodity itself. To me, they either don't fully grasp the potential of the broker or they are describing something altogether different.
Cloud interoperability is not a new concept. It has been bandied about the blogosphere for several years already. The problem to date is that such movements have been nothing more than thinly veiled product sales pitches. The cloud brokers of the future will drive the innovation to construct the underlying transport system to “connect the clouds.”
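In the abstract, that transport layer has one job: when a contract settles, move the workload from the seller's infrastructure to the buyer's and prove it arrived. A deliberately simplified sketch of that flow, with every name and step hypothetical:

```python
import hashlib

def deliver_workload(export_image: bytes, destination) -> dict:
    """Move a settled workload between clouds (illustrative flow only).

    export_image: a portable machine/disk image exported from the source cloud
    destination:  any object exposing an import_image(data) method
    """
    # 1. Fingerprint the image so both parties can verify what changed hands.
    digest = hashlib.sha256(export_image).hexdigest()

    # 2. Ship the bytes. A real transport system would stream, encrypt and
    #    convert image formats between hypervisors along the way.
    receipt = destination.import_image(export_image)

    # 3. Return proof of delivery for the contract's settlement record.
    return {"sha256": digest, "receipt": receipt}
```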
In the final part of this series I will explore the future state of cloud computing: a world where the immovable IT asset becomes movable on a commodity exchange.