By John Cowan
May 2, 2012 07:30 AM EDT
This is Part III in a series by 6fusion Co-founder and CEO John Cowan on the emerging trend of Cloud Brokerage and the impact it will have on the technology industry and markets. Be sure to check out Part I of the series here and Part II here.
The feedback and fallout from Part II of this post has been quite interesting. I thought for sure the bulk of the flak I would take would come from the cloud vendor incumbents I said would be relegated to the world of retail cloud business. But since I posted my perspective I’ve found myself digging into the nature of the Total Addressable Market (TAM) for the Cloud Brokerage industry.
For those of you keeping score at home, I said the market for cloud brokerage is more than 10 times the market for cloud computing software and related services.
Yes, 10 times.
And it is because this market is so big that cloud brokerage will spawn the next generation of technology innovation.
But before I get to the underlying technologies that are on the horizon and necessary for the future that I, along with my collaborators, envision, let me first spend a few paragraphs to explain why I am not just pulling numbers out of my, um, ‘IaaS’.
On the 6fusion iNode Network the median server in production in the cloud is a quad-core, dual-processor unit with an average of 4 TB of available storage. Using this standard configuration, partners and customers yield approximately $42,000 per year in net billable proceeds. Give or take, I would classify that as a reasonable annual revenue estimate.
IDC recently reported that 2011 server shipments topped out at 8.3 million units. At a $42K clip, that is a market growing by a healthy $350 billion each year.
But of course, as we all know, server shelf life is not exactly the same as what you’d expect from a box of Krusty-O’s from the Kwik-E-Mart.
A quick trip down the hall to Gary Morris’s office at 6fusion is always an educational adventure. “Depreciation,” Gary explains, “is a systematic and rational process of distributing the cost of tangible assets over the life of those assets. US GAAP calls for depreciation of servers using the server’s cost, estimated useful life and residual value. Typically, computers, software and equipment are depreciated over a period of 1 to 5 years, with the average useful life being 3 years.”
If we take Gary’s GAAP average as a multiplier, it means there is an estimated $1 trillion in billable utility computing presently in use around the world.
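The back-of-envelope math above can be written out explicitly. The sketch below simply restates the figures from this post (8.3 million units shipped, $42K per server per year, a 3-year average useful life) as a calculation:

```python
# Back-of-envelope TAM estimate using the figures cited in this post.
SERVERS_SHIPPED_2011 = 8_300_000   # IDC: 2011 server shipments (units)
REVENUE_PER_SERVER = 42_000        # net billable proceeds per server per year (USD)
USEFUL_LIFE_YEARS = 3              # average useful life under US GAAP

# One year of shipments at $42K each: the annual growth of the market.
annual_market = SERVERS_SHIPPED_2011 * REVENUE_PER_SERVER

# Servers stay in service ~3 years, so the installed base is roughly
# three shipment-years' worth of billable capacity.
installed_base_market = annual_market * USEFUL_LIFE_YEARS

print(f"Annual market growth: ${annual_market / 1e9:.1f}B")        # ≈ $348.6B
print(f"Installed-base TAM:   ${installed_base_market / 1e12:.2f}T")  # ≈ $1.05T
```

Rounding $348.6 billion to "$350 billion each year" and $1.05 trillion to "over $1 trillion" gives the numbers quoted above.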
The point here is that cloud brokerage is underpinned by the availability of both private and public compute, network and storage resources. And it is this massive untapped market that will drive the next wave of innovation.
If the origins of the cloud business belonged to the innovation of companies like Amazon, Rackspace and VMware, then the future of the cloud brokerage belongs to a new cadre of agnostic intermediaries that will enable a true utility computing marketplace to flourish.
The unification of the market is what I refer to as the point in time at which cloud computing technologies in production today can be used to interface to the commodity market. In order for that to happen, cloud brokerage as an industry must form and deliver the underlying technologies necessary to make a true market.
Just what are these technologies? Let’s take a look at three areas of innovation that will underpin the future of utility computing.
Cloud brokerage technologies are best considered in the context of supply, demand and delivery.
Universal Resource Metering: Quantification of Demand and Supply
I delivered a presentation in Asia a few weeks ago and I opened with a slide that had two simple definitions: Utility and Commodity.
A Utility, I paraphrased, “is a service provided by an organization and consumed by a public audience.”
A Commodity, according to common definition, “is a class of goods or services that is supplied without qualitative differentiation.”
Theoretically, you can have a utility without it being a commodity. But it rarely works that way, because to have a utility in the sense of the utilities we consume every day, you must have scale. And to achieve scale, the utility must be pervasive and uniform. One should not require any special skills to use it. It must be simple and consistent. Think about your interaction with things like power or water service, or subscribing to the Internet.
Utility is a word used quite often to describe the cloud. In a post a couple of months ago Simon Wardley aptly explained the difference between the cloud and a computer utility. The difference, says Wardley, is really only that “cloud was simply a word used by people to explain something that really wasn’t well understood to people who were even more confused than they were.”
So is the cloud really a computer ‘utility’? Not yet.
You see, what the cloud is missing is the factor that truly negates qualitative differentiation – common measurement. You simply cannot claim something to be a true utility if every provider measures services differently. Common utilities all share the characteristic of universal measurement. Think about it. Power. Water. Energy. The Internet. Whatever.
A standardized unit of measurement for the computer utility will be one of the greatest innovations to come from the emerging market for cloud brokerage, because it will establish the basis from which a commodity market can emerge.
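To make the idea concrete, here is a minimal sketch of what common measurement might look like: heterogeneous provider metrics (CPU, memory, storage, network) folded into a single composite billing unit. The weights, metric names, and the notion of a composite unit here are my own illustrative assumptions, not an established standard:

```python
# Hypothetical composite metering unit: reduce each provider's raw
# usage metrics to one normalized number so workloads can be priced
# and compared across providers on a common basis.
# The weights below are illustrative assumptions, not a standard.

WEIGHTS = {
    "cpu_ghz_hours": 1.0,      # normalized CPU consumption
    "ram_gb_hours": 0.5,       # memory allocated over time
    "storage_gb_hours": 0.01,  # persistent storage
    "network_gb": 0.2,         # data transferred
}

def composite_units(metrics: dict) -> float:
    """Convert a provider's raw usage metrics into composite units.

    Metrics outside the common schema are ignored, so a provider may
    report a superset of the expected fields without breaking anything.
    """
    return sum(WEIGHTS[k] * v for k, v in metrics.items() if k in WEIGHTS)

# Two providers reporting usage in their own native mixes of resources,
# reduced to the same unit for apples-to-apples comparison.
provider_a = {"cpu_ghz_hours": 100, "ram_gb_hours": 400, "storage_gb_hours": 2000}
provider_b = {"cpu_ghz_hours": 120, "ram_gb_hours": 300, "network_gb": 50}

print(composite_units(provider_a))  # 100 + 200 + 20 = 320.0
print(composite_units(provider_b))  # 120 + 150 + 10 = 280.0
```

The point is not the particular weights; it is that once every provider’s output collapses to the same unit, qualitative differentiation disappears and a price per unit becomes meaningful.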
Cloud Infrastructure Federation: Tapping Global Supply
When you buy corn or wheat or soybeans by contract on a commodity exchange today, you don’t buy a brand. You buy a commodity. Cloud brokers of the future will move commodities, not brands. Today, cloud brokers form ‘partnerships’ with service providers. But for a true brokerage model to blossom, there can be no possibility of vendor discrimination. Anyone who brings product to market can and should trade it. Denial of interoperability cannot be an option.
With this in mind, true cloud brokers will overcome the interoperability hurdle through collaboration and cooperation. This doesn’t mean subscribing to one API framework or another, regardless of how high and mighty the leading retail cloud properties might become. It means removing oneself from the politics of the API game completely.
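One way to stay out of the API game is the classic adapter pattern: the broker codes against a single neutral interface, and each provider’s native API is wrapped behind it. The sketch below is illustrative only; the class and method names are hypothetical, not any real vendor’s SDK:

```python
# Illustrative adapter sketch: one neutral interface for the broker,
# with each provider's native API wrapped behind it. All names here
# are hypothetical, not real vendor APIs.
from abc import ABC, abstractmethod

class ComputeProvider(ABC):
    """Neutral interface the broker trades against."""

    @abstractmethod
    def provision(self, units: float) -> str:
        """Reserve capacity; return an opaque reservation id."""

class ProviderXAdapter(ComputeProvider):
    def provision(self, units: float) -> str:
        # A real adapter would call Provider X's native API here.
        return f"providerX-reservation-{units}"

class ProviderYAdapter(ComputeProvider):
    def provision(self, units: float) -> str:
        # A real adapter would call Provider Y's native API here.
        return f"providerY-reservation-{units}"

def broker_fill(order_units: float, providers: list) -> list:
    """Split an order evenly across providers -- no vendor discrimination."""
    share = order_units / len(providers)
    return [p.provision(share) for p in providers]

reservations = broker_fill(100.0, [ProviderXAdapter(), ProviderYAdapter()])
print(reservations)
```

The broker never touches a provider-specific API directly, so adding a new supplier means writing one adapter, not re-architecting the marketplace.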
The Underlying Transport System: Delivering the Commodity
It doesn’t always happen, but when a commodity contract comes due, something must be delivered. The party that holds the paper for a hundred thousand units of corn must be able to take possession of it. Modern commodity markets are supported by an elaborate network of supply chain delivery systems – from tankers to trains and transport trucks.
The equivalent underlying transport system must exist for the cloud infrastructure market.
Commodity brokers don’t own the transport system for the market, and for good reason. However, if you subscribe to the early analyst view of cloud brokerage, they do: the analysts see brokers both facilitating the transaction and delivering the compute commodity itself. To me, they either don’t fully grasp the potential of the broker or they are describing something altogether different.
Cloud interoperability is not a new concept. It has been bandied about the blogosphere for several years already. The problem to date is that such movements have been nothing more than thinly veiled product sales pitches. The cloud brokers of the future will drive the innovation to construct the underlying transport system to “connect the clouds.”
In the final part of this series I will explore the future state of cloud computing: a world where the immovable IT asset becomes movable in a commodity exchange.