The Evolution of Cloud Computing

Definitions of cloud computing are easy to find, but a single, authoritative definition is hard to come by. Perhaps the best work in this area was done by Böhm et al., who compiled the characteristics of 17 different scholarly and industrial definitions and identified five primary characteristics of cloud computing, allowing a definition such as: "Cloud computing is a service that delivers scalable hardware and/or software solutions via the Internet or other network on a pay-per-usage basis." The essential elements of the definition are that it is a service, that it is scalable, that it covers hardware and/or software, that it is delivered over a network, and that it is paid for per use.

Cloud computing can further be broken down into three common types: SaaS, PaaS, and IaaS. SaaS (Software as a Service) allows users to log into and utilize preprogrammed software that is owned and maintained by the service provider. PaaS (Platform as a Service) gives users tools and languages owned and maintained by the service provider that can be used to build and deploy customized applications. IaaS (Infrastructure as a Service) provides users with storage and processing, allowing users full control over the use of that infrastructure. There are other divisions of cloud computing, but these are the most common.
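
The division of responsibility among the three models can be sketched in code. The layer breakdown below is my own simplification for illustration, not a formal standard:

```python
# Illustrative sketch of the three common cloud service models and the
# division of responsibility they imply. Layer names are simplified
# for illustration and are not a formal taxonomy.
SERVICE_MODELS = {
    "SaaS": {"provider": {"application", "runtime", "servers", "storage"},
             "user": {"data", "configuration"}},
    "PaaS": {"provider": {"runtime", "servers", "storage"},
             "user": {"application", "data"}},
    "IaaS": {"provider": {"servers", "storage", "networking"},
             "user": {"application", "runtime", "data"}},
}

def responsibility(model: str, layer: str) -> str:
    """Return who manages a given layer under a given service model."""
    return "provider" if layer in SERVICE_MODELS[model]["provider"] else "user"
```

The key contrast the sketch captures: the runtime is the provider's problem under SaaS and PaaS, but the user's under IaaS.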

Conceptual Origins of Cloud Computing
Looking back, cloud computing appears to have been the end goal of many computer pioneers of the 1960s, or at least of the early experiments that would eventually become the Internet.

There are three main figures commonly cited as laying the conceptual framework for cloud computing: John McCarthy, JCR Licklider, and Douglas F. Parkhill.

McCarthy first proposed in 1957 that time sharing of computing resources might allow companies to sell excess computation services for maximum utilization of the resource. He even imagined that computation might be organized as a utility.

Licklider, director of the Information Processing Techniques Office at the Advanced Research Projects Agency, highlighted some of the promise and challenges of cloud computing in a 1963 memo to those he described as the "Members and Affiliates of the Intergalactic Computer Network." Specifically, he described the ability to send a problem to a network of computers that could pool their resources to solve it, and the need for a shared language to allow the computers to talk to one another.

In 1966 Parkhill published "The Challenge of the Computer Utility," which identified many of the challenges facing cloud computing, such as scalability and the need for high-bandwidth connections. He also drew an early comparison between computing and electric utilities.

Why We Are in Cloud Computing Time
If cloud computing has been around for so long conceptually, why does it seem like a revolutionary idea at all? Because only now are we in cloud computing time.

Science fiction scholars commonly use the shorthand "steam engine time" to describe the phenomenon in which an idea surfaces repeatedly but doesn't catch on until long afterward. They point out that the Romans knew what steam engines were and could build them, but the technology didn't come to fruition until 1,600 years later. The world just wasn't ready for steam engines. The same is true of cloud computing.

The necessary elements that had to be in place before cloud computing could become a reality were the presence of very large datacenters, high-speed Internet connectivity, and the acceptance of cloud computing as a viable model for supplying IT needs.

The presence of very large datacenters is a crucial piece in the foundation of cloud computing. To offer cloud services at a competitive price, suppliers must have datacenters large enough to capture economies of scale, which can reduce costs by 80-86% compared with the medium-sized datacenters many companies previously ran. Many of these very large datacenters were originally built for internal use by companies that would later become cloud computing providers, such as Amazon, Google, and Microsoft.
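
The cost-savings figure can be sanity-checked with simple arithmetic: a very large datacenter whose per-unit costs are 5x to 7x lower than a medium-sized one translates into exactly this 80-86% range. A minimal sketch, with the cost multiples as the only (illustrative) inputs:

```python
# Back-of-the-envelope check of the economies-of-scale figure: if a very
# large datacenter is 5x to 7x cheaper per unit of computing than a
# medium-sized one, the percent reduction falls in the 80-86% range.
# The multiples are illustrative inputs, not measured data.
def reduction_from_multiple(multiple: float) -> float:
    """Percent cost reduction when unit cost drops by the given multiple."""
    return (1 - 1 / multiple) * 100

low = reduction_from_multiple(5)   # 80% reduction at 5x cheaper
high = reduction_from_multiple(7)  # roughly 86% reduction at 7x cheaper
```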

Almost universal access to high-speed Internet connectivity is crucial to cloud computing. If your data is bottlenecked getting to and from the cloud, it simply can't be a practical solution for your IT needs.

Finally, it is important for potential users to see cloud computing as a viable solution for IT needs. People need to trust that a distant, seemingly ethereal company will meet their urgent IT needs on a daily basis. This cultural work was done by many disparate influences, from MMOs to Google, which expanded acceptance of online resources beyond the IT community. Another crucial but oft-neglected part of this cultural work was performed by peer-to-peer computing, which introduced many people to the notion that they could utilize the resources of other computers via the Internet.

Cloud Computing Timeline: Who, When, and Why
There are many good cloud computing timelines (several are listed in my resources section), but it's still worth giving a basic timeline to show the evolution of cloud computing service offerings:

  • 1999: Salesforce launches its SaaS enterprise applications
  • 2002: Amazon launches Amazon Web Services (AWS), offering both artificial and human intelligence for problem solving via the Internet
  • 2006: Google launches Google Docs, a free, web-based competitor to Microsoft Office
  • 2006: Amazon launches Elastic Compute Cloud (EC2) and Simple Storage Service (S3), sometimes described as the first IaaS
  • 2007: Salesforce launches Force.com, often described as the first PaaS
  • 2008: Google launches Google App Engine
  • 2009: Microsoft launches Windows Azure

Armbrust et al. note many motives that drive companies to launch cloud computing services, including:

  • Profit: By taking advantage of cost savings from very large datacenters, companies can underbid competitors and still make significant profit
  • Leverage existing investment: For example, many of the applications in AWS were developed for internal use first, then sold in slightly altered form for additional revenue
  • Defend a franchise: Microsoft launched Windows Azure to help maintain competitiveness of the Windows brand
  • Attack a competitor: Google Docs was launched partly as an attack on Microsoft's profitable Office products
  • Leverage customer relationships: Windows Azure gives existing clients a branded cloud service that plays up perceived reliability of the brand, constantly emphasizing that it is a "rock-solid" cloud service

These are the motives that bring competitors to offer cloud computing services, but what drives companies and individuals to adopt cloud computing, and what barriers still stand in the way of full cloud implementation?

The Cloud Computing Market: Where It's At, and Where It's Going
According to a study by IT trade group CompTIA, up to 80% of businesses use some form of cloud computing, although the degree of use varies widely. IBM's studies show that while only 8% of businesses believe cloud computing currently has a significant impact on their business, that figure is expected to grow to more than 30% over the next three years.

Cloud computing is often sold on the basis of price, but the primary benefit companies are seeking from cloud computing, according to recent surveys, is flexibility. With the huge swings caused by viral phenomena on the Internet, companies can see demand for their site and services fluctuate wildly in a short period of time. Cloud computing gives companies the flexibility to purchase computing resources on demand. A more conventional benefit of cloud computing's flexibility is the ability to avoid hiring and firing IT personnel for short-term projects.
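
The flexibility argument can be illustrated with back-of-the-envelope numbers (the demand figures and unit price below are invented for illustration): provisioning in-house for peak demand means paying for the peak around the clock, while on-demand purchasing pays only for what is actually used.

```python
# Sketch of why elasticity matters for spiky, viral-driven demand:
# with fixed provisioning you pay for peak capacity every hour, while
# on-demand purchasing pays only for capacity actually consumed.
# Demand figures and unit price are hypothetical.
hourly_demand = [10, 12, 11, 90, 95, 14, 12, 10]  # server-hours needed per hour
unit_price = 0.10                                  # hypothetical $ per server-hour

# Fixed provisioning: enough servers for the peak, running every hour.
peak_provisioned_cost = max(hourly_demand) * len(hourly_demand) * unit_price

# On-demand: pay only for the server-hours actually consumed.
on_demand_cost = sum(hourly_demand) * unit_price
```

With a demand spike like this, the on-demand bill is roughly a third of the peak-provisioned one; the flatter the demand curve, the smaller that advantage becomes.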

One of the major obstacles to full adoption of cloud computing services remains security concerns. Although cloud-based security solutions exist, there is still a perception that cloud computing puts data at risk compared to private datacenters and increases the operational impact of denial-of-service attacks.

Despite these concerns, however, all sectors of the cloud computing market are expected to thrive in the near future, with revenue in nearly all sectors doubling within the next 3-5 years.

More Stories By Matthew Candelaria

Dr. Matthew Candelaria is a professional writer with more than five years' experience writing copy in industries such as law, medicine, technology and computer security. For more information about him and his work, visit www.writermc.com.
