Five Cloud Computing Predictions for 2010

I have always said, the sky’s the limit when it comes to cloud computing

I guess I’m a bit late in the game to throw my predictions into the clouds, but better late than never, right? It has been an exciting year for us here at GoGrid, with many stellar and innovative announcements, which you can read about here. Before I quickly go into some of my thoughts for 2010, it makes sense to see how well I did for 2009. My original predictions were here and were as follows (coupled with a quick analysis).

  1. Cloud Reduces the Effect of the Recession – Businesses definitely did not spend as much last year; however, interest in the cloud as a financially viable way to “survive” gained traction throughout the year. All major cloud providers (including GoGrid) showed growth during the year despite the recession.
  2. Broader Depth of Clouds – the cloud continued to grow throughout the year. More players, both smaller and larger, jumped into the mix, and the relative “unknown” of the cloud became much clearer.
  3. VCs, Money & Long-Term Viability – hard to gauge this one without knowing what VCs were investing in. But given the strength of leaders like AWS, Rackspace and GoGrid and the continued development therein, the long-term viability seemed solid.
  4. Partnerships Galore & Weeding Out of Providers – As is evident from GoGrid’s growing partner list, this is really where the marketplace was expanding. Partners bring subject-matter expertise to the cloud, allowing medium and large providers to focus on their core competencies. There were no big provider failures during 2009, but the shakeout may start in 2010.
  5. Hybrid Solutions – we continued to lead the way with robust hybrid solutions in the form of cloud front-ends coupled with physical back-end servers. Few other providers announced anything similar, but I believe these offerings will materialize this year.
  6. Web 3.0 – the “social web” definitely took off this year, as did the whole (vague) concept of data living “in the cloud”. Web 3.0 wasn’t officially announced, but plenty of companies were mashing up their services and data into unique new offerings.
  7. Standards and Interoperability – many separate groups continued to work towards open standards and interoperability, with definite progress being made. Unfortunately, these groups remain splintered, with individuals and companies pursuing their own agendas.
  8. Staggered Growth within the Cloud – the big players continued to get bigger, leaving some of the smaller or less visionary players a bit behind. More users started looking away from shared hosting and toward the cloud for solutions. Smaller startups continued to advance, using the cloud to power their infrastructure, while the enterprise was still testing the waters or using the cloud only sporadically.
  9. Technology Advances at the Cloud Molecular Level – chip manufacturers and computer/server vendors did announce chipsets and systems optimized for the cloud.
  10. Larger Adoption – still not as fast as I expected, but the fact that the Federal (and State) governments were (and are) putting serious thought and development work into cloud computing shows that adoption is growing across the board.

Just quickly scanning through my “results” shows that I wasn’t too far off track. Some items fared a bit better than others, but for the most part my “predictions” were fairly close. So what about 2010? Here’s what I’m thinking:

  1. Cloud Outages – There will be several cloud outages that get high visibility this year. As complexity and associated infrastructure grow and more users turn toward the cloud, any hiccups will receive quick and broad media coverage, with naysayers quickly stating “I told you so”. Unfortunately, any type of outage may be perceived as a “cloud failure”, leaving the masses increasingly doubtful of the reliability of the cloud. This “F.U.D. Factor” will be a steep hurdle for cloud providers and partners to overcome. Companies with sound IT strategies and best practices in place, including implemented Disaster Recovery (DR) solutions, will be able to weather any outages well.
  2. The Rise of Hybrid Hosting Solutions – While relatively new in 2009, more providers will consider offering “best of all worlds” hosting solutions. Whether it is the combination of physical and cloud environments, cloud bursting, or private and public clouds working in concert, there will definitely be a blurring of the lines around what “hosting” means.
  3. Security Concerns, Vulnerabilities and Malware – this prediction is only logical. As the number of cloud or virtualized environments increases, thanks to their ease of use and lower cost, so does the possibility of environments being created and then left unattended. That same ease of use means “average” users are deploying environments that are never hardened, or even audited from a security standpoint, making it more likely that they unintentionally open their systems up to hackers, malware, botnets or other malicious code (a minimal audit sketch follows this list).
  4. A “Cloud” for Everyone – Toward the end of last year, we started to see a blurring of the definitions of “cloud” and “cloud computing”, and the mainstream media is to blame for much of the confusion. People seem to be ubiquitously interchanging the words “cloud” and “cloud computing” when they are actually quite different: most people simply use “cloud” to describe anything where the data is stored somewhere else, whether it is truly a “cloud computing” environment or just a cluster of servers somewhere. I predict that this confusion will get worse long before it gets better. People will continue to use “cloud” and “cloud computing” interchangeably, forcing those of us in the industry to (re)define what “cloud computing” truly is. However, as the word “cloud” becomes incredibly mainstream, it will grow to mean anything delivered via the web, regardless of whether it is applications, services, infrastructure, data or what have you. (In fact, I used “cloud” interchangeably throughout this post… for me, I’m talking about “cloud computing.”)
  5. Analysts will Shorten their “Coming of Age” Stories – Many of the big-name analyst firms predicted that cloud computing wouldn’t really be adopted by the mainstream for another few years. I believe they will retract or refine those statements to show how much closer to mainstream cloud computing really is. While Fortune 100 companies may still be slow to adopt, the “rest of us” will get on the cloud a lot faster than analysts originally predicted.
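
To make prediction #3 a bit more concrete, here is a minimal sketch of the kind of baseline audit an “average” user could run against a freshly deployed server. This is purely my illustration, not GoGrid tooling: the host address and port list are hypothetical placeholders, and a real hardening pass would go much further (patching, firewall rules, key-only SSH access and so on).

```python
# Minimal audit sketch (hypothetical example): flag commonly exposed TCP
# ports on a newly deployed cloud server. A real security audit would
# cover far more than open ports.
import socket

# Ports that are often left open on fresh, unhardened servers.
COMMON_PORTS = {
    21: "FTP",
    22: "SSH",
    23: "Telnet",
    80: "HTTP",
    3306: "MySQL",
    3389: "RDP",
}

def audit_open_ports(host, ports=COMMON_PORTS, timeout=1.0):
    """Return the subset of `ports` that accept a TCP connection on `host`."""
    open_ports = {}
    for port, service in ports.items():
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            if sock.connect_ex((host, port)) == 0:  # 0 means the connect succeeded
                open_ports[port] = service
    return open_ports

if __name__ == "__main__":
    # 203.0.113.10 is a documentation-range placeholder, not a real host.
    for port, service in sorted(audit_open_ports("203.0.113.10").items()):
        print(f"WARNING: port {port} ({service}) is open - close it or firewall it")
```

Even a trivial check like this catches the “deployed and forgotten” case described above; anything it flags should be closed or put behind a firewall before the environment is left unattended.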

I have always said, the sky’s the limit when it comes to cloud computing. What are your thoughts or predictions? Is 2010 going to be the year of “cloud adoption”? “Cloud expansion”? “Cloud acceleration”? Or just “the year of the cloud”?

More Stories By Michael Sheehan

Michael Sheehan is the Technology Evangelist for Cloud Computing Infrastructure provider GoGrid and ServePath and is an avid technology pundit. GoGrid is the cloud hosting division of ServePath Dedicated Hosting, a company with extensive expertise and experience in web hosting infrastructure. Follow him on Twitter.
