Managing Data Integrity in SOA and SaaS Based Environments

Techniques for managing transactions in the cloud

Data integrity is one of the most critical properties of any system. In a standalone system with a single database, it is easy to achieve: database constraints and transactions maintain it, and transactions that follow the ACID properties (Atomicity, Consistency, Isolation, Durability) guarantee it. Most databases support ACID transactions and can therefore preserve data integrity on their own.
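As a concrete illustration, here is a minimal sketch of ACID atomicity in a single-database system, using Python's built-in sqlite3 module; the customers table and its constraint are illustrative assumptions:

    import sqlite3

    # A minimal sketch of atomicity in a single database: either both
    # inserts commit or neither does.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT NOT NULL)")

    try:
        with conn:  # opens a transaction; commits on success, rolls back on error
            conn.execute("INSERT INTO customers (name) VALUES (?)", ("Acme Corp",))
            conn.execute("INSERT INTO customers (name) VALUES (?)", (None,))  # violates NOT NULL
    except sqlite3.IntegrityError:
        pass  # the constraint fired and the whole transaction was rolled back

    print(conn.execute("SELECT COUNT(*) FROM customers").fetchone()[0])  # prints 0

Because the second insert violates a constraint, the first insert is rolled back along with it; the database never holds a partial result.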

Next in the complexity chain are distributed systems, which span multiple databases and multiple applications. To maintain data integrity in a distributed system, transactions across multiple data sources must be handled correctly and in a fail-safe manner. This is usually done via a central global transaction manager: each application in the distributed system participates in the global transaction through a resource manager, using the two-phase commit protocol defined by the XA standard. Most databases and custom applications can participate in a global transaction, and many packaged applications can do so via EAI adapters. In reality, though, most environments are mixed: some applications support two-phase commit, some support only single-phase commit, and some have no transaction capability at all.
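To make the two-phase commit pattern concrete, here is a minimal sketch across two databases using the optional DB-API 2.0 (PEP 249) two-phase-commit extension, which drivers such as psycopg2 implement; the connections, table names, and transaction identifiers are assumptions for illustration:

    # A minimal sketch of XA-style two-phase commit across two databases.
    # conn_erp and conn_billing are assumed open connections from a driver
    # that implements the PEP 249 TPC extension (e.g., psycopg2).

    def create_customer_2pc(conn_erp, conn_billing, name):
        xid_erp = conn_erp.xid(42, "new-customer-1001", "erp")
        xid_bil = conn_billing.xid(42, "new-customer-1001", "billing")
        conn_erp.tpc_begin(xid_erp)
        conn_billing.tpc_begin(xid_bil)
        try:
            conn_erp.cursor().execute(
                "INSERT INTO customers (name) VALUES (%s)", (name,))
            conn_billing.cursor().execute(
                "INSERT INTO accounts (customer_name) VALUES (%s)", (name,))
            conn_erp.tpc_prepare()       # phase 1: each resource manager votes
            conn_billing.tpc_prepare()
            conn_erp.tpc_commit()        # phase 2: commit the prepared branches
            conn_billing.tpc_commit()
        except Exception:
            conn_erp.tpc_rollback()
            conn_billing.tpc_rollback()
            raise

In production this coordination is the transaction manager's job, including recovery of in-doubt branches if a crash occurs between the prepare and commit phases.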

Moving further up in complexity are distributed systems that mix on-premise and partner applications. In this case, not all applications in the system are under the organization’s control, and partner application interfaces may not support XA. B2B integration standards such as EDI and ebXML are the primary means of ensuring reliability and data integrity across partner systems.

Enter the world of SOA and cloud computing, and the data integrity problem is magnified even further, because the system is now a mix of on-premise and SaaS applications exposed as services. SaaS applications are multi-tenant applications hosted by a third party. They usually expose their functionality via XML-based APIs over HTTP, most commonly as SOAP or REST web services, and in SOA-based environments many on-premise applications expose their functionality the same way. One of the biggest challenges with web services is transaction management: HTTP itself supports neither transactions nor guaranteed delivery, so both must be implemented at the API level. Standards for managing data integrity with web services do exist, such as WS-Transaction and WS-Reliability, but they are not yet mature and few vendors have implemented them. Most SaaS vendors expose their web services APIs with no transaction support at all. Moreover, each SaaS application may have a different level of availability and a different SLA (Service Level Agreement), which further complicates transaction and data integrity management across multiple SaaS applications. Several techniques can be applied to ensure data integrity in such environments.

Let’s take a simple scenario: new customer creation at a company that uses two SaaS vendors, one for Marketing and one for CRM, plus an on-premise ERP application. When a new customer places an order, the customer information must be sent to the Marketing service (for marketing campaigns), the CRM service (for customer management), and the ERP application (for order fulfillment). Both the Marketing and CRM applications expose their customer-creation APIs as SOAP web services over HTTP, whereas the ERP application exposes customer creation via a database API. Here is the sequence of operations in this transaction:

1. Customer creation in Marketing via SOAP web service (doesn’t support transactions)

2. Customer creation in CRM via SOAP web service (doesn’t support transactions)

3. Customer creation in ERP via database insert (supports transactions)

To maintain data integrity across the three applications, either all of the steps should execute successfully or none of them should. In the above sequence, if step 1 succeeds but step 2 fails, step 1 can’t be rolled back. If steps 1 and 2 succeed but step 3 fails, steps 1 and 2 can’t be rolled back. In each of these failure scenarios we have a data integrity problem: the customer record will exist in some systems but not in others, which is not acceptable in any production environment. So what can be done? Several techniques can be applied in this scenario:

Technique 1: Perform the operations that support transactions before the operations that don’t support transactions

In our example, step 3 should be moved to the beginning as follows:

1. Customer creation in ERP via database insert (supports transactions)

2. Customer creation in Marketing via SOAP web service (doesn’t support transactions)

3. Customer creation in CRM via SOAP web service (doesn’t support transactions)

With this change in the sequence of operations, if step 1 succeeds but step 2 fails, step 1 can simply be rolled back. We still have a problem if steps 1 and 2 succeed but step 3 fails; this is where the following techniques come in handy.
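Here is a minimal sketch of Technique 1 in Python, assuming the ERP database is reachable through an sqlite3 connection; create_customer_in_marketing and create_customer_in_crm are hypothetical stand-ins for the SaaS vendors’ SOAP customer-creation APIs:

    import sqlite3

    def create_customer(conn: sqlite3.Connection, customer: dict) -> None:
        cur = conn.cursor()
        try:
            # Step 1: the ERP insert comes first because it is the only
            # step that can be rolled back.
            cur.execute("INSERT INTO customers (name, email) VALUES (?, ?)",
                        (customer["name"], customer["email"]))
            # Steps 2 and 3: non-transactional SOAP calls.
            create_customer_in_marketing(customer)  # hypothetical client
            create_customer_in_crm(customer)        # hypothetical client
        except Exception:
            # A failed SOAP call undoes the ERP insert; but if Marketing
            # succeeded and CRM failed, the Marketing record remains.
            conn.rollback()
            raise
        conn.commit()  # all three steps succeeded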

Technique 2: Use compensating transactions

In our new sequence as per Technique 1, if steps 1 and 2 succeed but step 3 fails, roll back step 1 and issue a compensating transaction for step 2. The compensating transaction in this case is deleting the customer. Of course, for this to work, the Marketing SaaS application needs to provide a “delete customer” API, a capability that should be confirmed before signing up with this SaaS vendor.
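Building on the previous sketch, here is Technique 2 with the compensation path added; delete_customer_in_marketing is a hypothetical stand-in for the Marketing vendor’s “delete customer” API:

    def create_customer(conn, customer):
        cur = conn.cursor()
        # Step 1: ERP insert inside an open database transaction.
        cur.execute("INSERT INTO customers (name, email) VALUES (?, ?)",
                    (customer["name"], customer["email"]))
        try:
            create_customer_in_marketing(customer)  # step 2, no transaction support
        except Exception:
            conn.rollback()                         # undo step 1
            raise
        try:
            create_customer_in_crm(customer)        # step 3, no transaction support
        except Exception:
            conn.rollback()                         # undo step 1
            delete_customer_in_marketing(customer)  # compensate step 2
            raise
        conn.commit()

Note that the compensating call can itself fail, so in practice it should be retried or flagged for manual cleanup.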

Technique 3: Break the transaction into multiple decoupled transactions

In our example, step 3 can be executed in a separate asynchronous transaction using a queue. The queue can be implemented using a database or a messaging technology such as JMS; in either case, both writing messages to the queue and reading them from it support transactions. Here is the sequence of operations with this change:

First transaction:

1. Customer creation in ERP via database insert (supports transactions)

2. Post a message to a queue for customer creation in CRM (supports transactions)

3. Customer creation in Marketing via SOAP web service (doesn’t support transactions)

In the above sequence, if step 2 fails, step 1 can be rolled back, and if step 3 fails, steps 1 and 2 can be rolled back. Note that posting the message to the queue is done before customer creation in Marketing so that the step that doesn’t support transactions is executed last (as per Technique 1).

Second transaction:

1. Queue listener retrieves a message from the queue (supports transactions)

2. Customer creation in CRM via SOAP web service (doesn’t support transactions)

In the above sequence, if step 2 fails, step 1 can be rolled back.

So by breaking one transaction into multiple smaller transactions separated by queues, we are able to achieve data integrity even though one of the steps has no transaction support.
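Here is a minimal sketch of both transactions, using a database table (crm_outbox) as the queue; the table name and the SOAP client functions are illustrative assumptions:

    import json
    import sqlite3

    def first_transaction(conn: sqlite3.Connection, customer: dict) -> None:
        cur = conn.cursor()
        try:
            cur.execute("INSERT INTO customers (name) VALUES (?)",
                        (customer["name"],))
            cur.execute("INSERT INTO crm_outbox (payload) VALUES (?)",
                        (json.dumps(customer),))
            create_customer_in_marketing(customer)  # non-transactional step, last
        except Exception:
            conn.rollback()  # undoes both the ERP insert and the queued message
            raise
        conn.commit()

    def second_transaction(conn: sqlite3.Connection) -> None:
        cur = conn.cursor()
        row = cur.execute(
            "SELECT id, payload FROM crm_outbox ORDER BY id LIMIT 1").fetchone()
        if row is None:
            return  # nothing queued
        msg_id, payload = row
        try:
            create_customer_in_crm(json.loads(payload))
            cur.execute("DELETE FROM crm_outbox WHERE id = ?", (msg_id,))
        except Exception:
            conn.rollback()  # the message stays queued and will be retried
            raise
        conn.commit()

Because the CRM call can succeed while the subsequent dequeue commit fails, the listener may deliver a message more than once, so the CRM-side operation should be idempotent.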

Technique 4: Execute the transaction as a long-running transaction

If all the steps of the transaction are orchestrated as separate tasks of a long-running process, using a state machine or a BPM (Business Process Management) tool, then a failure at any step simply stops the process from progressing to the next step. Retries can be introduced at every step to ensure each step succeeds before the whole process finishes. This solution adds complexity to the environment and can introduce latency, since the process may take a long time to finish if any application or service is down for an extended period, so it may not be acceptable in all situations; even so, it is the most reliable way to design distributed transactions in services-based environments.
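A minimal sketch of the retry loop such a process would run follows; the per-step task functions are hypothetical, and a real state machine or BPM engine would also persist the current step so the process can resume after a crash:

    import time

    STEPS = [
        ("create_in_erp", create_customer_in_erp),            # hypothetical tasks
        ("create_in_marketing", create_customer_in_marketing),
        ("create_in_crm", create_customer_in_crm),
    ]

    def run_process(customer: dict, max_retries: int = 5,
                    backoff_seconds: float = 30.0) -> None:
        for step_name, step in STEPS:
            for attempt in range(1, max_retries + 1):
                try:
                    step(customer)
                    break  # this step succeeded; advance to the next one
                except Exception:
                    if attempt == max_retries:
                        raise  # give up; surface for alerting or manual action
                    time.sleep(backoff_seconds)  # wait out a temporary outage

For retries to be safe, each step must be idempotent (or the process must record which steps have already completed), since a step may be re-executed after a partial failure.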

By applying the techniques discussed in this article, most failure scenarios can be handled effectively so that data integrity is not compromised. These techniques can be applied to any distributed system, but they are most useful (and almost mandatory) in SOA and SaaS-based environments where interfaces are exposed via web services.

More Stories By Vinay Singla

Vinay Singla is a senior technology professional with extensive experience in the SaaS and SOA space.
