Real SOA - Web Services and Service Oriented Architecture

An overview of SCA and SDO

A challenge facing many organizations is how to quickly and effectively react to frequent changes in business requirements, whilst improving productivity and reducing costs. To achieve this, you need a flexible infrastructure that can meet the demands of a changing marketplace and seize emerging opportunities. To address this challenge, Service Oriented Architecture (SOA) promotes an architectural approach that replaces rigid proprietary systems with heterogeneous, "loosely-coupled" services. The Service Component Architecture (SCA), along with Service Data Objects (SDO), makes this architectural concept a reality and provides the programming model to build SOA solutions for agile businesses.

SCA is a powerful yet simple business-level programming model that extends and complements prior approaches to implementing service-based solutions. SCA defines how services can be described, assembled, and deployed in a metadata-driven fashion, independent of implementation language and deployment platform. The approach is based on the idea that each business function consists of one or more components brought together into a composite application. These, in turn, are composed into a network of services that create specific business solutions.
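To give a flavor of the programming model, here is a minimal sketch of a single component written in Java, using the annotations defined by the SCA Java specifications (the org.osoa.sca.annotations package used by early runtimes such as Apache Tuscany). The HelloService interface and its implementation are invented for this illustration.

```java
import org.osoa.sca.annotations.Remotable;
import org.osoa.sca.annotations.Service;

// A remotable business interface; SCA derives the service contract from it.
@Remotable
interface HelloService {
    String sayHello(String name);
}

// The component implementation simply states which service it provides.
// How the service is bound (WS-*, JMS, and so on) is decided at assembly time.
@Service(HelloService.class)
public class HelloServiceImpl implements HelloService {
    public String sayHello(String name) {
        return "Hello, " + name;
    }
}
```

Nothing in the class ties it to a particular transport or deployment platform; those decisions are made when the component is assembled into a composite.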

This article describes some of the key values of SCA by modeling an SOA-based solution for a fictitious company called MostMortgage. We shall assume a simple business process in which an applicant signs up for a loan and provides his or her identity information and loan requirements. MostMortgage evaluates the new applicant's creditworthiness and searches for an appropriate mortgage rate.

By using the SCA programming model, MostMortgage's developer can build a solution for this problem quickly and effectively, separating the business logic from technology concerns and enabling re-use of existing applications. In this case, there is already a well-understood credit-checking application that can be re-used (CreditCheck), and MostMortgage has a subscription to a Web service that searches for the best loan rates (FindRates).

The solution developer completes the following steps:
     1)  Define the business logic for LoanApproval and AccountVerification;
     2)  Define references for each component (this identifies what other services, if any, the component is dependent on);
     3)  Define the services provided by each component, if any;
     4)  Assemble the components and choose the binding to be used.

The MostMortgage solution (as shown in Figure 1) is then ready for deployment.
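To make steps 1 to 3 concrete, the sketch below shows what the LoanApproval component's implementation might look like in Java with the standard SCA annotations. The business interfaces (LoanApproval, CreditCheck, FindRates) and their method signatures are assumptions made for this illustration; only the component names come from the scenario above.

```java
import org.osoa.sca.annotations.Reference;
import org.osoa.sca.annotations.Remotable;
import org.osoa.sca.annotations.Service;

// Illustrative business interfaces for the scenario (signatures are assumptions).
@Remotable interface CreditCheck  { boolean isApproved(String applicantId); }
@Remotable interface FindRates    { double bestRate(double amount); }
@Remotable interface LoanApproval { String apply(String applicantId, double amount); }

// Step 1: the business logic is a plain Java class.
@Service(LoanApproval.class)
public class LoanApprovalImpl implements LoanApproval {

    // Step 2: references declare which other services this component depends on;
    // the targets (the CreditCheck and FindRates components) are wired in at
    // assembly time, not hard-coded here.
    @Reference
    protected CreditCheck creditCheck;

    @Reference
    protected FindRates findRates;

    // Step 3: the LoanApproval service this component provides (via @Service above).
    public String apply(String applicantId, double amount) {
        if (!creditCheck.isApproved(applicantId)) {
            return "Application declined";
        }
        return "Approved at " + findRates.bestRate(amount) + "%";
    }
}
```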

Components can be implemented in any language supported by an SCA runtime, including BPEL, Java, Ruby, and C++. Outside of any program logic, these components can be assembled or "wired" into a composition using any appropriate binding, such as WS-* or JMS.
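The wiring itself is normally captured in a composite (SCDL) file rather than in code. To keep the examples in Java, the sketch below shows how an assembled composite might be loaded and its LoanApproval service invoked with the embedded SCADomain API of the Apache Tuscany Java runtime (1.x line), reusing the LoanApproval interface from the previous sketch; the composite file name and component name are assumptions for this illustration.

```java
import org.apache.tuscany.sca.host.embedded.SCADomain;

public class MostMortgageClient {
    public static void main(String[] args) {
        // Load the assembly: the .composite file is where components are wired
        // together and where bindings (WS-*, JMS, ...) and policies are chosen.
        SCADomain domain = SCADomain.newInstance("mostmortgage.composite");
        try {
            // Look up the service exposed by the LoanApproval component by name.
            LoanApproval loans = domain.getService(LoanApproval.class, "LoanApprovalComponent");
            System.out.println(loans.apply("applicant-42", 250000.0));
        } finally {
            domain.close();
        }
    }
}
```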

Let's update the MostMortgage application by making one technology domain change and one business domain change.

First, we improve the security of the calls to the CreditCheck component, which happens to run in a remote data center. The MostMortgage developer need not be concerned with these new infrastructure requirements: SCA separates infrastructure capabilities from business logic and allows the security requirements to be defined as policies during assembly. The resulting flexibility enables IT infrastructure policies to change at any time without requiring code to be rewritten.
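The article does not show how such policies are expressed; as one illustration, the SCA specifications define policy intents such as confidentiality that can be required declaratively, either in the composite file or, as sketched below, with the @Requires annotation. Treat the intent name and its placement here as assumptions about how MostMortgage might state the requirement, not as the only way to do it.

```java
import org.osoa.sca.annotations.Reference;
import org.osoa.sca.annotations.Requires;

public class LoanApprovalImpl implements LoanApproval {

    // Require the "confidentiality" intent on calls to the remote CreditCheck
    // service. How the intent is satisfied (for example, WS-Security on a WS-*
    // binding) is a deployment decision made in the assembly, not a code change.
    @Requires("confidentiality")
    @Reference
    protected CreditCheck creditCheck;

    // ... remainder of the implementation is unchanged ...
}
```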

Secondly, it is decided to introduce a bespoke rate optimization layer in front of the FindRates Web service (see Figure 2). MostMortgage can add value to the externally provided FindRates service by combining a mortgage account with in-house financial products. Here, the developer is involved, but can reuse the previous work directly, in a controlled and modular way, simply by extending the assembly to include the new RateOptimizer component.
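One way to picture the change: RateOptimizer offers the same rate-search contract as before, references the external FindRates service, and the assembly is extended so that LoanApproval is wired to RateOptimizer instead. The in-house product lookup and the rate-blending rule below are invented purely for illustration, and FindRates is the interface assumed in the earlier sketch.

```java
import org.osoa.sca.annotations.Reference;
import org.osoa.sca.annotations.Service;

// Hypothetical interface to MostMortgage's in-house product catalogue.
interface InHouseProducts {
    double bestBundledRate(double amount);
}

// Sits in front of the subscribed FindRates Web service and blends its results
// with in-house offerings. LoanApproval's code does not change; only the wiring does.
@Service(FindRates.class)
public class RateOptimizerImpl implements FindRates {

    @Reference
    protected FindRates externalFindRates;     // the subscribed Web service

    @Reference
    protected InHouseProducts inHouseProducts; // internal financial products

    public double bestRate(double amount) {
        double external = externalFindRates.bestRate(amount);
        double bundled = inHouseProducts.bestBundledRate(amount);
        return Math.min(external, bundled);
    }
}
```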

It is important not to forget the complexity introduced by handling data in such a heterogeneous network of services. A technology called Service Data Objects (SDO) addresses this problem. SDO offers a format-neutral API that provides a uniform way to access data, regardless of how it is physically stored. By using SDO, the solution developer will not pollute a business application with code to handle diverse choices of data access, such as JDBC Result Sets, JCA records, DOM, JAXB, and EJB entities.
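SDO's central abstraction is the DataObject (in the commonj.sdo package), which can be used through generated, typed interfaces or, as sketched below, dynamically by property name and path. The Applicant type, its namespace, and its properties are invented for this example and would normally be defined beforehand, for instance from an XML schema via XSDHelper.

```java
import commonj.sdo.DataObject;
import commonj.sdo.helper.DataFactory;

public class SdoSketch {
    public static void main(String[] args) {
        // Assumes an "Applicant" type in the given namespace has already been
        // registered (for example, from an XSD using XSDHelper.INSTANCE.define).
        DataObject applicant = DataFactory.INSTANCE.create("http://mostmortgage/loans", "Applicant");

        // The same API is used whether the data came from XML, a database row,
        // or a Web service payload.
        applicant.setString("name", "Jane Doe");
        applicant.setInt("creditScore", 720);

        // Nested data is reached with simple XPath-like paths.
        DataObject address = applicant.createDataObject("address");
        address.setString("postcode", "SW1A 1AA");
        System.out.println(applicant.getString("address/postcode"));
    }
}
```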

SDO supports a disconnected style of data access and can record a summary of any changes made to data objects. SDO's ability to maintain a summary of the changes allows data transfers to include only the portion of the data that has changed, which improves performance in environments where bandwidth is constrained. The change summary information can also be used to resolve data access conflicts and concurrency issues.
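Where a data object's type (or its enclosing data graph) carries a change summary, logging can be switched on around a disconnected unit of work, as in the hedged sketch below; the applicant object is assumed to come from the previous example and to have a change summary available.

```java
import commonj.sdo.ChangeSummary;
import commonj.sdo.DataObject;

public class ChangeSummarySketch {

    // Records local changes so that only the delta needs to be sent back.
    static void updateScore(DataObject applicant, int newScore) {
        ChangeSummary changes = applicant.getChangeSummary();
        changes.beginLogging();

        applicant.setInt("creditScore", newScore);   // disconnected, local update

        changes.endLogging();

        // The receiving service (or a DAS) can inspect exactly what changed and
        // use the old values to detect conflicting concurrent updates.
        for (Object o : changes.getChangedDataObjects()) {
            DataObject changed = (DataObject) o;
            System.out.println("changed: " + changed.getType().getName());
        }
    }
}
```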

SDO supplies a powerful yet simple programming model for data, with first-class support for XML and the ability to persist data automatically through the use of a Data Access Service (DAS). A DAS allows data to be stored in, and retrieved from, a relational database or other repository, and links the SDO models to enterprise data storage.
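As a sketch of how the pieces fit together, the following follows the general style of the Apache Tuscany RDB DAS samples (org.apache.tuscany.das.rdb). The JDBC URL, table, and column names are invented, and the exact DAS method names have varied between releases, so treat this as an assumption rather than a reference.

```java
import java.sql.Connection;
import java.sql.DriverManager;

import org.apache.tuscany.das.rdb.Command;
import org.apache.tuscany.das.rdb.DAS;

import commonj.sdo.DataObject;

public class RateStoreSketch {
    public static void main(String[] args) throws Exception {
        Connection connection = DriverManager.getConnection("jdbc:derby:mostmortgagedb");

        // The DAS turns relational rows into a graph of SDO DataObjects...
        DAS das = DAS.FACTORY.createDAS(connection);
        Command read = das.createCommand("SELECT * FROM MORTGAGERATE");
        DataObject root = read.executeQuery();

        // ...which are read and modified with the ordinary SDO API...
        DataObject rate = root.getDataObject("MORTGAGERATE[1]");
        rate.setFloat("RATE", 4.75f);

        // ...and written back; the change summary drives the generated UPDATE statements.
        das.applyChanges(root);
        connection.close();
    }
}
```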

SCA and SDO provide technologies that simplify the development of SOA solutions. The two technologies work well both together and independently. More detailed information about SCA and SDO will be available in future articles.

The importance of these technologies, and the customer pain points many vendors have experienced first hand, have led those vendors to collaborate on the SCA and SDO specifications and to crystallize best practices that have evolved over the past few years. More information about the Open SOA collaboration and its many participating vendors can be found at www.osoa.org.

You can try out Java and C++ implementations of the SCA and SDO technologies by visiting the Apache Tuscany open source project at incubator.apache.org/tuscany. Tuscany provides a simple "on ramp" for developers who want to create applications using a service-oriented approach. As an early implementer of the SCA and SDO specifications, the Tuscany project is able to provide timely feedback on the specifications to the Open SOA collaboration. Other implementations of this technology are also beginning to appear, for example, the PHP PECL SDO project at http://pecl.php.net/package/sca_sdo.

In summary, today's organizations must be able to react quickly to change. SCA promotes flexible and reusable solutions by encouraging componentization and by clearly separating business logic from underlying technology concerns. SCA and SDO each increase developer productivity by shielding developers from infrastructure complexity and from the need to acquire deep infrastructure technology skills. Together, SCA and SDO provide IT with a flexible model for building SOA-based solutions and, more importantly, for handling change effectively and efficiently.

More Stories By Andrew Borley

Andrew Borley is an IBMer enjoying life working on the Apache Tuscany project. He's helping to define the Service Component Architecture (SCA) specification and is a committer on Apache Tuscany, developing implementations of SCA and Service Data Objects.

More Stories By Haleh Mahbod

Haleh Mahbod is a program director with IBM, managing the team contributing to Apache Tuscany as well as to the SOA for PHP open source project. She has extensive development experience with database technologies and integration servers.

More Stories By Simon Laws

Simon Laws is a member of the IBM Open Source SOA project team working with the open source Apache and PHP communities to build Java, C++, and PHP implementations of the Service Component Architecture (SCA) and Service Data Object (SDO) specifications. Prior to this role he was working in the distributed computing space building service-oriented solutions for customers with a particular interest in grid computing and virtualization.
