By John Cowan
September 9, 2012 12:00 PM EDT
This is Part IV in a series by 6fusion Co-founder and CEO John Cowan on the emerging trend of Cloud Brokerage and the impact it will have on the technology industry and markets. Be sure to check out Part I of the series here, Part II here, and Part III here.
The IT industry to me looks a lot like the commercial airline industry did many years ago and I think the latter is rife with lessons about the power of a true commodity market.
For those of you keeping score, late last year American Airlines’ parent AMR declared bankruptcy. The Chapter 11 filing of what was once the largest airline in the world brought to a close the era of disintegration for the legacy commercial airline market. You can argue about the principal cause of the legacy carriers’ demise, but ultimately it came down to the fact that the market leaders refused to adapt to the changes going on around them, while others embraced those changes and found creative ways to rise to the top.
Flying around on airplanes became a mass-market product over the last 40 years. And with a mass-market product come mass-market demands – particularly around pricing. Incumbent airlines had the benefit of established market share but the burden of managing a return on infrastructure investment amid a changing financial dynamic. The signs were there in the 1990s, but the dramatic collapse really took place over the last 10 years or so, which I will come back to later.
The cloud computing industry is structured and organized very much like the airline industry in the years leading up to its rapid disintegration. There are a handful of incumbents that some say are untouchable and then there’s everybody else in the market.
Amazon Web Services (AWS) is one such incumbent.
Competitors to AWS are really not unlike the competitors in the airline industry. Competition centers on squeezing more value into the same dollar in hopes of swaying customer loyalty. This is further compounded by the arrival of new entrants aggressively positioned to challenge the market leaders. I see no difference between the airline that boasts three extra inches of legroom in coach and the cloud operator that boasts an extra nine of availability on its SLA. This is simply the nature of an increasingly competitive market.
Let me get back to the dramatic collapse of the airline industry for a moment.
After careful examination of commodity market demand, Southwest Airlines determined that the answer to challenging its industry peers had little to do with product innovation. Instead, it had much more to do with financial innovation. How did they do this? Among other things, they began a sophisticated program to hedge their projected jet fuel consumption. Hedging jet fuel was nothing new to the industry, but Southwest made it the centerpiece of an operating strategy. If you want, you can read an extended analysis of the Southwest case study here. In short, their strategy allowed them to accurately forecast their future costs, and hence offer aggressive pricing to undercut the market while remaining profitable in a market applying relentless downward pricing pressure. It was a commodity market that allowed Southwest to give customers what they wanted – a low-cost, no-frills flying experience.
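The mechanics of that strategy can be sketched in a few lines. This is a minimal illustration with hypothetical numbers (the prices, volumes, and 80% hedge ratio are invented, not Southwest's actual figures): locking in a fixed swap price on most of projected consumption makes the fuel bill nearly insensitive to spot-price swings, which is what makes costs forecastable.

```python
# A minimal sketch (hypothetical numbers) of how a fixed-price fuel hedge
# stabilizes an airline's costs regardless of where the spot price moves.

def fuel_cost(gallons, spot_price, hedged_fraction=0.0, swap_price=None):
    """Total fuel bill when `hedged_fraction` of consumption is locked in
    at `swap_price` via a fixed-price swap; the rest is bought at spot."""
    hedged = gallons * hedged_fraction
    unhedged = gallons - hedged
    locked = swap_price if swap_price is not None else spot_price
    return hedged * locked + unhedged * spot_price

gallons = 1_000_000
swap = 2.00  # $/gallon locked in ahead of time

for spot in (1.50, 2.00, 3.50):
    unhedged_bill = fuel_cost(gallons, spot)
    hedged_bill = fuel_cost(gallons, spot, hedged_fraction=0.8, swap_price=swap)
    print(f"spot ${spot:.2f}: unhedged ${unhedged_bill:,.0f}, "
          f"80% hedged ${hedged_bill:,.0f}")
```

As the loop shows, the hedged bill moves only on the unhedged 20%, so a forecast of next year's costs becomes a forecast of consumption rather than of volatile prices.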
For an airline, Southwest’s strategy is about as radical as it gets.
It is telling at this point to recall the reaction of the incumbent vendors in the industry when Southwest made its move. “I don’t think any sensible airline believes that by hedging it saves on its fuel bills,” they said.
As history proved, they totally missed the point.
Southwest had worked out that it could employ hedging strategies to make the derivatives market for jet fuel work in its favor. They dramatically reorganized their financial operation in order to turn the process of commoditization into a dangerous market weapon.
Which brings me back to the subject of compute, network and storage resources. If a derivatives market for jet fuel could underpin the upheaval of the airline business, could the same thing be possible in the market for cloud computing? Not only do I think it possible, I believe it is going to happen.
Not too long ago I was chatting with Joe Weinman, author of Cloudonomics, in his Manhattan office. As I explained my perspective on the industry we both spend a lot of our time thinking about, he interrupted me to ask if I had ever heard of Dr. James Mitchell. I hadn’t. James, as it turns out, is a former Morgan Stanley commodities trader who now runs what would appear to be the world’s first “cloud broker-dealer”.
Something James is not is a “technology guy”. Doctorate in physics, yes – IT background, not so much. As far as he is concerned, cloud computing infrastructure as a service is just another commodity like electricity, coal, oil or potatoes, and should be treated as such.
Sound familiar? In Part II: The Cloud Vendor and the Agnostic Intermediary I characterized the evolution of the cloud brokerage model as one that would see two distinct groups playing a role: Those that dealt with the business of compute and those that dealt with the technical organization of compute.
When I met him he expressed his frustration at having to trade compute, network and storage resources in an inefficient manner because each provider’s cloud offering was separated by qualitative differentiation. You trade a “barrel of oil”, a “kWh of electricity” or a “kilogram of coal”. “So what is the unit of cloud computing?” he asked. He made the reverse of the usual analogy between electricity and cloud computing: “Can you imagine letting your electricity supplier bill you for your electricity using a measurement that they have made, using a meter that they invented, and then quoting it to you in a unit that they have pulled out of thin air, that cannot be compared to their competitors’? Ridiculous!”
James and I agree on two fundamental principles on which the future of the cloud industry will be based.
The first is that cloud brokering has little to do with technology. Let’s consider an illustration of my point (techie readers, you might want to tune out for a moment). Provided that there is an independent third party able to measure what gets consumed on various different cloud providers, it is possible to calculate a reference price for a reference quantity. Even if what a customer actually uses differs from the standard measurement, this does not matter: the price differential between a “special” cloud infrastructure and a “standard” cloud infrastructure will vary slowly compared to the price of the “standard” cloud infrastructure itself. This is akin to proxy hedging a particular grade of coal with a financial settlement on an API-2 index price for standard coal delivered in, say, Amsterdam (to give a very specific analogy).
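The proxy-hedging logic above can be made concrete with a small sketch. All numbers here are hypothetical (the unit prices and volumes are invented for illustration): a buyer of a non-standard offering hedges the "standard" reference leg financially and remains exposed only to the slowly drifting basis, exactly as with off-grade coal hedged against a standard index.

```python
# A minimal sketch (hypothetical numbers throughout) of proxy hedging a
# "special" cloud offering against a "standard" reference index, the way
# a specific coal grade is hedged against a standard index price.

def proxy_hedged_cost(reference_now, reference_later, basis_later, units):
    """Effective total cost after hedging the reference leg.

    The buyer locks in reference_now via a financial settlement on the
    standard index, then buys the special product at the later market
    price (reference_later + basis_later). The reference move cancels
    against the hedge payoff; only the basis remains exposed."""
    hedge_payoff = (reference_later - reference_now) * units  # cash-settled
    purchase_cost = (reference_later + basis_later) * units   # actual buy
    return purchase_cost - hedge_payoff

units = 10_000
# The reference index doubles, but the quality basis barely moves:
cost = proxy_hedged_cost(reference_now=0.10, reference_later=0.20,
                         basis_later=0.031, units=units)
# Effective unit cost = locked-in reference (0.10) + new basis (0.031)
print(cost / units)
```

The point of the sketch is the one made in the paragraph: because the basis moves slowly relative to the reference price, hedging the standard index removes most of the risk even for a customer consuming a differentiated product.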
The second is that the capability to trade cloud like a real commodity will create a world where the immovable IT asset becomes movable for the first time in history. IT is arguably the single biggest sunk cost in any modern enterprise, and the albatross that keeps disrupters from challenging market incumbents in the emerging cloud computing industry. As I illustrated in Part III: The Market Unified, more than $1 trillion is spent every year on compute, network and storage resources. Nearly all of this spend sits in a fixed capitalization structure on the balance sheet for years.
Every business that consumes a significant commodity resource speculates on and hedges its overall position. It is clear to me that if the opportunity to establish liquidity in IT becomes real for the modern enterprise and market brokers, compute, network and storage resources will become to the emerging service provider what jet fuel is to an airline operator. And when that happens, you will want to be Southwest, not AMR. This is a reality that few cloud incumbents see coming and, like the once powerful commercial airlines, a dynamic few will choose to embrace.
A commodity exchange cannot exist without transaction velocity and price volatility, and in every other commodity market it is brokers who supply both. This is precisely why the role of “cloud broker” will become so important in the years ahead. This is why, as I’ve stated so often in the past, “the future of the cloud brokerage belongs to a new cadre of agnostic intermediaries that will enable a true utility computing marketplace to flourish.” And when the modern enterprise or resource supplier can apply the principles of financial trading to the IT industry, we are going to see a force capable of completely redefining everything we currently think we know about the business of technology delivery.
Considering that new thinkers like Dr. James Mitchell are already on the scene, I wouldn’t bet that this future is a distant one.