By Matthew Candelaria
January 2, 2013 08:45 AM EST
Definitions of cloud computing are easy to find, but a single, authoritative definition is hard to come by. Perhaps the best work in this area was done by Böhm et al., who compiled the characteristics of 17 different scholarly and industrial definitions and identified five primary characteristics of cloud computing, allowing a definition such as: "Cloud computing is a service that delivers scalable hardware and/or software solutions via the Internet or other network on a pay-per-usage basis." The essential elements here are service delivery, scalability, hardware and/or software solutions, network access, and pay-per-usage pricing.
Cloud computing can further be broken down into three common types: SaaS, PaaS, and IaaS. SaaS (Software as a Service) allows users to log into and utilize preprogrammed software that is owned and maintained by the service provider. PaaS (Platform as a Service) gives users tools and languages owned and maintained by the service provider that can be used to build and deploy customized applications. IaaS (Infrastructure as a Service) provides users with storage and processing, allowing users full control over the use of that infrastructure. There are other divisions of cloud computing, but these are the most common.
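The dividing line between the three service models is which layers of the stack the provider manages and which the customer manages. A minimal sketch of that split (the layer names and cutoffs here are an illustrative simplification, not from the article):

```python
# Illustrative sketch of the SaaS/PaaS/IaaS responsibility split.
# Layer names and cutoff points are a common simplification, not a standard.
LAYERS = ["networking", "storage", "servers", "virtualization",
          "operating_system", "runtime", "application"]

# The provider manages everything up to and including the named layer;
# the customer manages the rest.
PROVIDER_MANAGES_THROUGH = {
    "IaaS": "virtualization",   # user controls OS, runtime, and application
    "PaaS": "runtime",          # user builds and deploys the application
    "SaaS": "application",      # provider runs the entire stack
}

def customer_managed_layers(model: str) -> list:
    """Return the layers the customer is responsible for under a given model."""
    cutoff = LAYERS.index(PROVIDER_MANAGES_THROUGH[model])
    return LAYERS[cutoff + 1:]
```

Under this sketch, a SaaS user manages nothing, a PaaS user manages only the application, and an IaaS user manages everything from the operating system up, which matches the "full control over the use of that infrastructure" described above.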
Conceptual Origins of Cloud Computing
Looking back, it seems that cloud computing was seen as the end goal of many computer pioneers in the 1960s, or, at least, the goal of the early experiments that would eventually become the Internet.
There are three main figures commonly cited as laying the conceptual framework for cloud computing: John McCarthy, J.C.R. Licklider, and Douglas F. Parkhill.
McCarthy first proposed in 1957 that time sharing of computing resources might allow companies to sell excess computation services for maximum utilization of the resource. He even imagined that computation might be organized as a utility.
Licklider, then director of the Information Processing Techniques Office at the Advanced Research Projects Agency (ARPA), highlighted some of the promise and challenges of cloud computing in a 1963 memo to those he described as the "Members and Affiliates of the Intergalactic Computer Network." Specifically, he talked about the ability to send a problem to a network of computers that could then pool their resources to solve it, and the need to establish a shared language to allow the computers to talk to one another.
In 1966 Parkhill published "The Challenge of the Computer Utility," which identified many of the challenges facing cloud computing, such as scalability and the need for large bandwidth connections. He also drew the now-familiar comparison with electric utilities.
Why We Are in Cloud Computing Time
If cloud computing has been around for so long conceptually, why does it seem like a revolutionary idea at all? Because only now are we in cloud computing time.
Science fiction scholars commonly use the shorthand "steam engine time" to describe the phenomenon in which an idea pops up several times but doesn't catch on until much later. They point out that the Romans knew what steam engines were and could make them, but it wasn't until 1,600 years later that the technology came to fruition. The world just wasn't ready for steam engines. The same is true of cloud computing.
The necessary elements that had to be in place before cloud computing could become a reality were the presence of very large datacenters, high-speed Internet connectivity, and the acceptance of cloud computing as a viable model for supplying IT needs.
The presence of very large datacenters is a crucial piece in the foundation of cloud computing. To offer cloud services at a competitive price, suppliers must have datacenters large enough to capture economies of scale, which can reduce costs by 80-86% compared with the medium-sized datacenters many companies previously operated. Many of the companies that would later become cloud computing providers, such as Amazon, Google, and Microsoft, originally built these very large datacenters for their own use.
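The competitive effect of that 80-86% savings can be made concrete with a little arithmetic. Only the percentage range comes from the text; the dollar figures below are hypothetical:

```python
# Hypothetical unit cost in a medium-sized datacenter (dollars per unit-hour).
medium_dc_cost = 1.00

# The article cites an 80-86% cost reduction at very large scale.
for savings in (0.80, 0.86):
    large_dc_cost = medium_dc_cost * (1 - savings)
    print(f"{savings:.0%} savings -> ${large_dc_cost:.2f} per unit-hour")

# A provider whose internal cost is $0.14-$0.20 per unit can price well
# below a competitor paying $1.00 and still keep a substantial margin.
```

This is why datacenter scale, rather than software alone, is the economic foundation of competitive cloud pricing.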
Almost universal access to high-speed Internet connectivity is crucial to cloud computing. If your data is bottlenecked getting to and from the cloud, it simply can't be a practical solution for your IT needs.
Finally, it is important for potential users to see cloud computing as a viable solution for IT needs. People need to be able to trust that some ethereal company is going to be able to provide for their urgent IT needs on a daily basis. This cultural work was done by many disparate influences, from MMOs to Google, which expanded acceptance of online resources beyond the IT community. Another crucial but oft-neglected part of this cultural work was performed by peer-to-peer computing, which introduced many people to the notion that they could utilize the resources of other computers via the Internet.
Cloud Computing Timeline: Who, When, and Why
There are many good timelines about cloud computing available, and several are available in my resources section, but it's still important to give a basic timeline to show the evolution of cloud computing service offerings:
- 1999: Salesforce launches its SaaS enterprise applications
- 2002: Amazon launches Amazon Web Services (AWS), which offers both artificial and human intelligence for problem solving via the Internet
- 2006: Google launches Google Docs, a free, web-based competitor to Microsoft Office
- 2006: Amazon launches Elastic Compute Cloud (EC2) and Simple Storage Service (S3), sometimes described as the first IaaS
- 2007: Salesforce launches Force.com, often described as the first PaaS
- 2008: Google App Engine launched
- 2009: Microsoft launches Windows Azure
Armbrust, et al. note many motives that drive companies to launch cloud computing services, including:
- Profit: By taking advantage of cost savings from very large datacenters, companies can underbid competitors and still make significant profit
- Leverage existing investment: For example, many of the applications in AWS were developed for internal use first, then sold in slightly altered form for additional revenue
- Defend a franchise: Microsoft launched Windows Azure to help maintain competitiveness of the Windows brand
- Attack a competitor: Google Docs was launched partly as an attack on Microsoft's profitable Office products
- Leverage customer relationships: Windows Azure gives existing clients a branded cloud service that plays up perceived reliability of the brand, constantly emphasizing that it is a "rock-solid" cloud service
These are the motives that bring competitors to offer cloud computing services, but what drives companies and individuals to adopt cloud computing, and what barriers still exist to full cloud implementation?
The Cloud Computing Market: Where It's At, and Where It's Going
According to a study by IT trade group CompTIA, up to 80% of businesses use some form of cloud computing, although the degree of use varies widely. IBM's studies show that although only 8% of businesses believe cloud computing currently has a significant impact on their business, that figure is expected to grow to more than 30% within the next three years.
Cloud computing is often sold on the basis of price, but the primary benefit companies are seeking from cloud computing, according to recent surveys, is flexibility. With the huge swings caused by viral phenomena on the Internet, companies can see demand for their site and services fluctuate wildly in a short period of time. Cloud computing gives companies the flexibility to purchase computing resources on demand. A more conventional benefit of cloud computing's flexibility is the ability to avoid hiring and firing IT personnel for short-term projects.
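The flexibility argument can be sketched numerically. Suppose demand for compute is quiet most of the day but spikes when something goes viral; fixed provisioning must cover the peak around the clock, while pay-per-usage buys only what each hour needs, even at a higher unit rate. All numbers below are hypothetical:

```python
# Hourly demand for compute units over one day with a 4-hour viral spike.
demand = [10] * 20 + [200] * 4   # 20 quiet hours, then a spike

# Fixed provisioning: capacity must cover the peak for all 24 hours.
fixed_units = max(demand)
fixed_cost = fixed_units * len(demand) * 0.05   # $0.05 per unit-hour (hypothetical)

# Pay-per-usage: buy only what each hour needs, at a higher unit rate.
on_demand_cost = sum(demand) * 0.08             # $0.08 per unit-hour (hypothetical)

print(f"fixed: ${fixed_cost:.2f}, on-demand: ${on_demand_cost:.2f}")
```

Even though the on-demand unit price is 60% higher in this sketch, the spiky workload makes pay-per-usage far cheaper, which is the core of the flexibility benefit companies report.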
One of the major obstacles to full adoption of cloud computing services remains security concerns. Although cloud-based security solutions exist, there is still a perception that cloud computing puts data at risk compared to private datacenters and increases the operational impact of denial-of-service attacks.
Despite these concerns, however, all sectors of the cloud computing market are expected to thrive in the near future, with revenue in nearly all sectors doubling within the next 3-5 years.