Cloud Computing Service Models

A look at the three different service models for cloud computing as defined by NIST

In this post I will look at the three cloud computing service models as defined by NIST. More specifically, I will look at the management and operations overhead for each of the models and compare it to the traditional on-premises model.

Traditional Model

Let's look at how things have been done in the past. Traditionally, enterprises have been responsible for managing their own IT infrastructure as well as the software stack that runs their applications. For small companies that meant hiring polyglot employees with a wide range of skills, from low-level networking to high-level application support. For larger ones that could afford more staff, it meant creating specialized teams responsible only for the networking infrastructure, only for storage, or only for servers and virtualization. However, for a lot of those enterprises the core business has never been managing IT infrastructure - the only thing they are interested in is managing their line-of-business (LOB) applications.

[Figure: Cloud Service Models - which layers of the application stack are managed by the customer and which by the vendor in the traditional, IaaS, PaaS and SaaS models]

Here are just some of the tasks enterprise IT teams have been required to do in the past:

  • Build racks with servers and wire them into the network
  • Build storage arrays and wire them into the network
  • Configure routers
  • Configure firewalls and DMZ zones
  • Install operating system software on the servers
  • Create virtual machines (if virtualization is utilized)
  • Install operating system software on the virtual machines
  • Install databases, set up replication and backups
  • Install middleware used for hosting the application code
  • Patch and update operating system software
  • Patch and update databases
  • Patch and update middleware
  • Patch and update runtime
  • Install application software
  • Patch and update application software

Although long, this is by no means the complete list of tasks that IT personnel have been responsible for. Of the list above, only the last two (installing application software and updating application software) have been essential to the core business of the enterprise. In addition to the IT operational costs (OpEx), enterprises also incurred significant capital expenditures (CapEx) to procure the necessary hardware.
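To make that operational overhead concrete, here is a small sketch - purely illustrative, with hypothetical host names and an assumed apt-based package manager - of what just one of the chores above (patching and updating the operating system across a handful of servers) can look like when an IT team scripts it:

```python
# Illustrative sketch only: patch the OS on a small fleet of servers over SSH.
# Host names are hypothetical and apt is assumed as the package manager.
import subprocess

SERVERS = ["web-01.example.internal", "web-02.example.internal", "db-01.example.internal"]

def patch_server(host: str) -> None:
    """Run an unattended OS update on a single host via SSH."""
    cmd = ["ssh", host, "sudo apt-get update && sudo apt-get -y upgrade"]
    result = subprocess.run(cmd, capture_output=True, text=True)
    status = "OK" if result.returncode == 0 else "FAILED"
    print(f"[{status}] {host}")

if __name__ == "__main__":
    for server in SERVERS:
        patch_server(server)
```

Every layer in the list above accumulates scripts and procedures like this one, which is exactly the overhead the cloud service models below take off your hands to varying degrees.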

More than a decade ago, hosting providers recognized the need to help businesses with those tasks and allowed them to outsource the build-up of infrastructure and concentrate on just managing their applications. Although hosting providers helped enterprises with OpEx and CapEx, they still lacked some of the essential cloud characteristics like on-demand self-service, rapid elasticity and measured service, as outlined in Essential Cloud Computing Characteristics.

Infrastructure-as-a-Service (IaaS) Model
The IaaS model was the first one to comply with the NIST cloud computing characteristics. In essence it offers a cloud computing environment consisting of virtual machines. It provides a self-service portal where you can start a virtual machine with your preferred operating system on demand; it is broadly accessible; it is elastic (you can easily start an identical virtual machine or shut down an existing one); it draws from a pool of virtual machines collocated on common hardware; and, finally, it measures your usage of those virtual machines.
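As a rough illustration of that self-service, on-demand behavior, here is a minimal sketch using the AWS SDK for Python (boto3) as just one example of an IaaS API; the region, image ID and instance type are placeholders, and other IaaS providers expose equivalent calls:

```python
# Minimal IaaS self-service sketch using boto3 (AWS is only an example provider).
# The region, image ID and instance type below are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# On-demand: start a virtual machine with the preferred operating system image.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder image with the OS of your choice
    InstanceType="t3.micro",          # VM size; usage of the instance is metered
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]
print(f"Started instance {instance_id}")

# Elasticity: shutting an existing instance down is a single call as well.
# ec2.terminate_instances(InstanceIds=[instance_id])
```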

If you look at the picture above you will see that the IaaS model provides automation in the lower layers of the application stack (up to the virtualization layer). That means tasks like starting the virtual machine, adding it to the network, configuring the routing and the firewalls, and attaching storage to it are done automatically by the automation software. The vendor that provides the service is also responsible for handling hardware failures and servicing the underlying hardware.
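Continuing the hedged boto3 sketch above (all identifiers are placeholders), the same self-service API also covers the networking and storage tasks that previously meant wiring racks by hand - you make the request, and the provider's automation carries it out on the underlying hardware:

```python
# Sketch continued: firewall and storage changes requested through the IaaS API.
# The security group, volume and instance IDs are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Open HTTPS in the virtual machine's firewall (security group).
ec2.authorize_security_group_ingress(
    GroupId="sg-0123456789abcdef0",
    IpProtocol="tcp",
    FromPort=443,
    ToPort=443,
    CidrIp="0.0.0.0/0",
)

# Attach a block storage volume to the running virtual machine.
ec2.attach_volume(
    VolumeId="vol-0123456789abcdef0",
    InstanceId="i-0123456789abcdef0",
    Device="/dev/sdf",
)
```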

As you have already noticed, the IaaS model provides cloud services up to the virtualization layer of the application stack. However, as a consumer of the IaaS service you are still responsible for managing the virtual machine. Hence you are still responsible for patching and updating the operating system on the VM, and for installing and maintaining any databases or middleware that your application uses, in addition to maintaining the application itself.

IaaS is very similar to the traditional hosting model with the added benefits of self-service, elasticity and metering.

Platform-as-a-Service (PaaS) Model
With PaaS you have far fewer things to worry about. As you can see from the picture, the whole stack your application needs is managed by the vendor. Your only responsibilities are your application and the data it uses. In addition to the tasks covered in the IaaS case, the vendor (or, in the case of a private PaaS, the platform owner) is also responsible for patching and updating the operating system, and for installing and maintaining the middleware as well as the runtime your application uses.
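To make that division of responsibilities tangible, here is a minimal sketch of the kind of artifact a PaaS consumer typically deploys - nothing but application code. Flask and the PORT environment variable are assumptions used purely for illustration; the exact packaging and conventions differ from platform to platform:

```python
# A minimal PaaS-style application: you supply only the code (and your data);
# the OS, middleware and runtime underneath are the platform's responsibility.
# Flask and the PORT environment variable are illustrative assumptions.
import os
from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    return "Hello from a PaaS-hosted application"

if __name__ == "__main__":
    # The platform tells the application where to listen and handles routing,
    # scaling and patching of everything below it.
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))
```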

One important thing you need to be aware of when using PaaS is that the automatic updates the vendor performs may sometimes have a negative impact on your application. Why is that? Very often OS and middleware vendors make incompatible changes between versions of their software. If your application depends on particular underlying OS or middleware functionality, it may break between platform updates. And because you are not in control of those updates, you may end up with your application being down.
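One modest way to soften that risk - a defensive sketch, not a cure - is to have the application check at startup whether it is running on a runtime version it has actually been tested against, so a breaking platform update at least announces itself; the version numbers below are hypothetical:

```python
# Defensive sketch: warn loudly if a platform update has moved the app onto a
# runtime version it has not been tested against. Versions are hypothetical.
import sys
import warnings

TESTED_RUNTIMES = {(3, 11), (3, 12)}  # versions this application was validated on

if sys.version_info[:2] not in TESTED_RUNTIMES:
    warnings.warn(
        f"Running on Python {sys.version_info[0]}.{sys.version_info[1]}, which this "
        "application has not been tested against; a platform update may have "
        "changed the runtime underneath it."
    )
```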

The premise of PaaS, though, is not only to offer a maintenance-free application stack but also additional services that you can use in your application. Very often PaaS providers expose middleware and databases as services and abstract the connectivity to them through APIs, freeing developers from the need to locate the actual systems. Additional services can include authentication and authorization, video encoding, location-based services and so on. Using the PaaS services allows you to decouple your application from the underlying stack, and as long as the APIs are kept intact it will be protected from failures between platform updates.
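Here is a small sketch of what that abstraction can look like in practice: the application reads the location and credentials of a platform-provided database service from configuration instead of hard-coding a host. The DATABASE_URL variable is an assumed convention (used by several PaaS offerings, but not universal), and SQLAlchemy is only one way to connect through such an abstraction:

```python
# Consuming a platform-provided database service through configuration rather
# than a hard-coded host. DATABASE_URL is an assumed platform convention.
import os
from sqlalchemy import create_engine, text

# The platform injects where the database service lives and how to reach it,
# so the application never needs to know which actual system backs it.
engine = create_engine(os.environ["DATABASE_URL"])

with engine.connect() as conn:
    row = conn.execute(text("SELECT 1")).fetchone()
    print("Database service reachable:", row[0] == 1)
```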

Software-as-a-Service (SaaS) Model
SaaS is the model with the highest level of abstraction and offers the most maintenance-free option. As a SaaS consumer you simply use the software offered by the vendor. As depicted in the picture, the whole stack is maintained by the vendor, including updates to the application itself as well as management of the application data. From the consumer's point of view the SaaS model is similar to the off-the-shelf software model, where you buy the CD, install the software and start using it - except that with SaaS there is nothing to install.

Traditionally, one of the hardest problems application developers have had to deal with is data migration between versions. SaaS vendors are also responsible for migrating your data and keeping it consistent. Similar to the off-the-shelf software model, you can rely on being able to access and read your data once you upgrade to a new version.
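As a toy illustration of the kind of version-to-version migration a SaaS vendor takes off your plate, imagine records written by an older release being upgraded on read; the field names and schema versions here are entirely hypothetical:

```python
# Toy sketch of data migration between application versions (hypothetical schema).
def migrate_record(record: dict) -> dict:
    """Bring a stored record up to the current schema version."""
    version = record.get("schema_version", 1)
    if version == 1:
        # v1 stored a single "name" field; v2 splits it into first/last name.
        first, _, last = record.pop("name", "").partition(" ")
        record["first_name"], record["last_name"] = first, last
        record["schema_version"] = 2
    return record

print(migrate_record({"schema_version": 1, "name": "Ada Lovelace"}))
# {'schema_version': 2, 'first_name': 'Ada', 'last_name': 'Lovelace'}
```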

The SaaS model is the most resource-efficient model because it utilizes application multi-tenancy, meaning the same application instance serves multiple customer organizations (tenants). This is good for both the vendor and the customer, because better resource utilization brings maintenance costs down and hence the price of the service down. On the other hand, tenant data is commingled, which introduces the security risk of one tenant accidentally getting access to another tenant's data.
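A toy sketch of what application multi-tenancy implies for data access: one application instance serves several tenant organizations, and every query is scoped by tenant so that the commingled data cannot leak across tenants. The table, columns and tenants below are hypothetical, with sqlite3 standing in for the shared database:

```python
# Toy multi-tenant data access: a single database serves several tenants, and
# the tenant filter on every query is the isolation boundary.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE invoices (tenant_id TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO invoices VALUES (?, ?)",
    [("acme", 100.0), ("acme", 250.0), ("globex", 75.0)],
)

def invoices_for(tenant_id: str):
    # Forgetting this WHERE clause is exactly the accidental cross-tenant
    # access risk described above.
    return conn.execute(
        "SELECT amount FROM invoices WHERE tenant_id = ?", (tenant_id,)
    ).fetchall()

print(invoices_for("acme"))    # [(100.0,), (250.0,)]
print(invoices_for("globex"))  # [(75.0,)]
```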

Although not exhaustive, the explanation of the cloud computing service models above should be enough to kick-start the initial discussion about your cloud strategy.


More Stories By Toddy Mladenov

Toddy Mladenov has more than 15 years of experience in software development and technology consulting at companies like Microsoft, SAP and 3Com. Currently he is CTO of Agitare Technologies, Inc. - a boutique consulting company that specializes in Cloud Computing and Big Data solutions. Before Agitare Tech, Toddy spent a few years with the PaaS startup Apprenda and more than six years working on Microsoft's cloud computing platform Windows Azure, Windows Client and MSN/Windows Live. During his career at Microsoft he managed different aspects of the software development process for Windows Azure and Windows Services. He also evangelized Microsoft cloud services among open source communities like PHP and Java. In the past he developed enterprise software for the German software giant SAP and several startups in Europe, and managed technical sales for 3Com in the Balkan region.

With his broad industry experience, international background and end-user point of view, Toddy has a unique approach to technology. He believes that technology should be developed to improve people's lives, and he is eager to share his knowledge on topics like cloud computing, mobile and web development.
