Is SaaS Dead? | @CloudExpo #BigData #IoT #IaaS #SaaS #AI #DevOps #FinTech

The rapid growth of hyperscale IaaS platforms such as AWS is changing the SaaS playing field

Is SaaS Dead? Are We Headed to a World of ASP 2.0 or Dedicated SaaS?

The emergence of hyperscale Infrastructure-as-a-Service (IaaS) platforms such as Amazon Web Services (AWS) is challenging the traditional Software-as-a-Service (SaaS) value proposition. SaaS CEOs, investors and SaaS buyers must carefully evaluate the implications of an "all-in" migration to hyperscale IaaS platforms that offer value-added platform services going well beyond simple infrastructure. This article provides a point of view on the changes in the SaaS market, which have three principal drivers: 1) technical, 2) business, and 3) security and compliance.

History of SaaS
SaaS solutions were born in 2000 with the promise of hosted software delivered over the internet. SaaS helped reduce the pain and cost associated with upgrading and maintaining "shrink-wrapped" software. Over the last 15 years the SaaS business has disrupted major software platforms and in many ways set the stage for wider enterprise adoption of cloud services.

The early success of SaaS helped change customer behavior. Customers became comfortable with data and software running outside their firewalls. The concept of management by SLAs was in many ways driven by the initial success of SaaS. In the past few years, SaaS creation and adoption have exploded as SaaS vendors themselves have embraced hyperscale IaaS platforms.

This is the great paradox.

On the one hand, the emergence of public cloud IaaS platforms is fueling the explosive growth of SaaS; on the other, it threatens to kill (or dramatically disrupt) SaaS by eroding its fundamental value proposition.

Will SaaS Work in the Public Cloud Era?
The explosive and constantly accelerating growth of IaaS platforms such as Amazon Elastic Compute Cloud (EC2) is changing the economics of SaaS.

Your Cloud or My Cloud?
The trend is unmistakable - public IaaS has won. Major SaaS providers, as well as the large enterprises that are major SaaS buyers, are embracing the IaaS cloud and going "all-in." Figure 1 shows the implications of this migration and shift.

Figure 1: Implications of SaaS providers and enterprise SaaS buyers migrating to the same IaaS platform

As both SaaS providers and enterprise SaaS consumers start to operate on the same platform, the SaaS "value stack" is significantly diminished. Artificial data silos are created for the enterprise when its "SaaS" data resides right next to its enterprise data, separated only by a contractual boundary. SaaS buyers in many ways "pay" twice for their data: once to store it in the SaaS platform, and a second time to back it up into their own cloud environment for security or analytical purposes. Enterprises often overlook these hidden costs of SaaS, which include hard network and storage costs from copying data back and forth for backup, retention, or analytics.
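To make the "pay twice" point concrete, here is a minimal back-of-the-envelope sketch in Python. Every rate in it (the effective SaaS storage charge, the egress rate, the storage rate in the buyer's own cloud, and the export frequency) is an assumption chosen purely for illustration, not a quoted SaaS or cloud price.

```python
# Hypothetical illustration of "paying twice" for the same data.
# All rates below are assumptions for illustration only, not quoted prices.

DATA_GB = 500                    # data held in the SaaS application
SAAS_STORAGE_RATE = 0.10         # $/GB-month the SaaS vendor effectively charges (assumed)
EGRESS_RATE = 0.09               # $/GB to copy data out of the SaaS platform (assumed)
OWN_CLOUD_STORAGE_RATE = 0.023   # $/GB-month for the duplicate copy in the buyer's cloud (assumed)
EXPORTS_PER_MONTH = 4            # weekly export for backup/analytics (assumed)

saas_storage = DATA_GB * SAAS_STORAGE_RATE
egress = DATA_GB * EGRESS_RATE * EXPORTS_PER_MONTH
duplicate_storage = DATA_GB * OWN_CLOUD_STORAGE_RATE

print(f"Monthly cost inside the SaaS platform : ${saas_storage:,.2f}")
print(f"Monthly egress to copy data out       : ${egress:,.2f}")
print(f"Monthly cost of the duplicate copy    : ${duplicate_storage:,.2f}")
print(f"Hidden cost on top of the SaaS fee    : ${egress + duplicate_storage:,.2f}")
```

Even with modest assumed rates, the egress and duplicate-storage line items are recurring costs that never appear on the SaaS invoice.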

Is Upgrading and Updating Software Really That Hard Now?
A key tenet of SaaS was to help reduce the cost and pain associated with upgrading and maintaining "shrink-wrapped" software. In 2016 and beyond, upgrading and managing software and systems is not as hard as it used to be. Continuous integration/continuous delivery (CI/CD), microservices, serverless computing, and managed cloud-native services are making it relatively easy to upgrade and deliver software.

If both the SaaS provider and the consumer have access to the same set of tools on the same cloud platform, what is the real value-add from the SaaS service? Just the software, right?

Upended Revenue Model
Over the years most SaaS services have been delivered on a "per-user" annual subscription model. If a SaaS license for one user is $100 per year, and the enterprise bought licenses for 100 users, the enterprise would pay $10,000 per year. In the pre-IaaS days, this might have been great economics for both parties - the SaaS vendor and the SaaS buyer. But not anymore.

The customer now wants true pay-as-you-go or consumption-based pricing - paying for transactions only when costs are actually incurred. This is especially true now that SaaS providers are migrating to IaaS platforms that offer transaction-based pricing. The SaaS provider's underlying cost basis for infrastructure and related components is truly hourly and consumption-driven. This has huge implications. Let's see this through a sample scenario.

Assume those 100 users consume 100 GB of storage and five cloud servers. The customer adds another 50 users, but because of limited usage the storage and compute requirements barely change. Why should the customer pay an additional $5,000 for those 50 users if the underlying incremental costs barely changed?
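A rough comparison of the two models for this scenario can be sketched in a few lines of Python. The $100-per-user figure and the 100-user / 100 GB / five-server scenario come from the article; the per-server and per-GB infrastructure rates are hypothetical assumptions used only to illustrate the gap.

```python
# Per-user subscription vs. consumption-based cost for the scenario above.
# Subscription numbers are from the article; infrastructure rates are assumed.

PRICE_PER_USER = 100          # $/user/year (from the article)
SERVER_COST = 1_200           # $/server/year (assumed)
STORAGE_COST_PER_GB = 1.20    # $/GB/year (assumed)

def subscription_cost(users: int) -> float:
    return users * PRICE_PER_USER

def consumption_cost(servers: int, storage_gb: int) -> float:
    return servers * SERVER_COST + storage_gb * STORAGE_COST_PER_GB

# 100 users today: five servers, 100 GB
before_sub = subscription_cost(100)
before_infra = consumption_cost(5, 100)

# 150 users tomorrow: usage barely changes, so infrastructure stays roughly flat
after_sub = subscription_cost(150)
after_infra = consumption_cost(5, 110)

print(f"Subscription bill:   ${before_sub:,.0f} -> ${after_sub:,.0f}  (+${after_sub - before_sub:,.0f})")
print(f"Infrastructure cost: ${before_infra:,.0f} -> ${after_infra:,.0f}  (+${after_infra - before_infra:,.0f})")
```

Under these assumed rates the subscription bill jumps by $5,000 while the underlying infrastructure cost barely moves - exactly the mismatch pushing buyers toward consumption-based pricing.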

SaaS buyers are starting to think hard when deciding between a "per-user" SaaS annual subscription and a "cloud-hosted" managed "shrink-wrapped" solution, given that in both scenarios the "raw material" is the same cloud platform.

Security and Compliance
Most newer entrants to the SaaS market underestimate the cost and complexity of the "Ops" part of the business. Many SaaS providers struggle with vulnerability management, security scanning, compliance reporting, backup and recovery, and service uptime. There are many reasons for this; one is the changed cybersecurity environment. Buyers are demanding evidence of security best practices. Industry-recognized certifications and compliance frameworks such as HIPAA, FFIEC, FedRAMP, and ISO 27001 are increasingly required of SaaS providers by their customers. These come at a heavy price.

Further, as enterprises get better with cloud services, they are increasingly seeking access to their data stored on SaaS platforms. Data residency, access control, and enterprise risk management are difficult questions that must be addressed. Now that the enterprise and the SaaS service operate on the same cloud platform, the argument for "dedicated" SaaS comes up more and more often.

Fast Forward to the Past?
As enterprises get comfortable with hyperscale cloud platforms such as AWS, there is a serious need to carefully evaluate the SaaS value proposition. New distribution models such as cloud marketplaces are making it easier for cloud buyers to find and deploy enterprise software within their own cloud environments. The desire for control over data and the emergence of data science are also causing enterprises to view their data more strategically.

Will we see Application Service Providers (ASP 2.0) that focus on delivering great software hosted within "dedicated" cloud environments or within the client's own cloud environment? Will the SaaS revenue model change from a per-user annual subscription to a pay-per-call model as serverless computing takes shape?
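One way to picture that revenue-model shift is to price the same user base per call instead of per user. The sketch below assumes a hypothetical per-call rate and a range of call volumes; none of these numbers come from the article.

```python
# Sketch of a pay-per-call revenue model versus a per-user subscription.
# The per-call price and call volumes are assumptions for illustration only.

PRICE_PER_USER_YEAR = 100   # $/user/year (per-user model, from the article)
PRICE_PER_CALL = 0.0005     # $/API call (assumed pay-per-call rate)
USERS = 100

def per_user_revenue(users: int) -> float:
    return users * PRICE_PER_USER_YEAR

def per_call_revenue(calls_per_user_per_year: int, users: int) -> float:
    return calls_per_user_per_year * users * PRICE_PER_CALL

for calls in (10_000, 200_000, 1_000_000):   # light, moderate, heavy usage per user
    print(f"{calls:>9,} calls/user/year: "
          f"per-user ${per_user_revenue(USERS):,.0f} vs "
          f"pay-per-call ${per_call_revenue(calls, USERS):,.0f} for {USERS} users")
```

The point is not the specific break-even volume but that revenue tracks actual usage rather than seat count, mirroring the provider's own serverless cost structure.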

Making the right strategic decisions is critical for both cloud services buyers and providers to survive and thrive in the cloud computing age. The emergence of serverless computing, cloud marketplaces, and pay-per-call transaction pricing offers entrepreneurs a rich canvas on which to build the next generation of cloud-native services.

Industry Perspective
"Pricing, governance and security are key for enterprises consuming multiple SaaS services. It becomes challenging to understand the security footprint of the system as a whole as the number of SaaS services consumed increases." - Derek Collison, CEO and Founder, Apcera.

Call to Action
Cloud executives both on the vendor and enterprise buyer sides must carefully evaluate and understand new cloud deployment and consumption patterns. Taking a strategic look at Managed IaaS, PaaS, Container-as-a-Service or Micro Marketplace deployment models is critical.

About the Author

Gaurav “GP” Pal is CEO and Founder of stackArmor. He is an award-winning senior business leader with a successful track record of growing and managing a secure cloud solutions practice with over $30 million in annual revenues, focused on US Federal, Department of Defense, non-profit and financial services clients. Since 2009 he has led and delivered multi-million-dollar Amazon Web Services (AWS) cloud migration and broker programs for US Government customers, including the Department of the Treasury and the Recovery Accountability & Transparency Board (RATB).

GP is the Industry Chair at the University of Maryland’s Center for Digital Innovation, Technology and Strategy (DIGITS). He has strong relationship-based consultative selling experience with C-level executives, providing DevOps, Managed Services, IaaS, Managed IaaS, PaaS and SaaS in compliance with US FedRAMP, FISMA, HIPAA and NIST security frameworks. He has a successful track record of delivering cloud solutions with leading providers including Amazon Web Services (AWS), Microsoft, and Google, among others.
