Does a Web Service Make a Service for SOA?

SOA Service & SOA SLA as drivers for renovating legacy applications for SOA

Certainly we have to balance complexity and flexibility: how granular should a Service be to avoid unnecessary communications and function compositions? Is it always reasonable to extract data access from a business service and put it into separate specialized services? In answering these and similar questions, one caution has to be observed - never consider a solution in isolation from its environment. For instance, application developers believe that calling a database directly via JDBC/ODBC guarantees maximum performance. That's true, but if the deployment environment requires security authorization control for a particular database column or row, your application becomes responsible for implementing and supporting that control. Would you prefer coding this yourself or using an existing Security Service? That is, are you sure the performance you can provide would be better than that of a professionally developed Security Service? If you choose the latter, you may well find that a Data Service is already integrated with the Security Service and is probably the way to go. What will happen to overall performance when you use the Data Service? Maybe nothing bad, if you consider the Services and optimize your application design up front.
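
To make this trade-off concrete, here is a minimal Java sketch of the two access paths. The names and the DataService interface are hypothetical, invented for illustration; they don't come from any specific product.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    public class BalanceReader {

        // Option 1 - direct JDBC: the fastest path, but the application now
        // owns any column- or row-level authorization logic the environment
        // requires.
        static double readDirect(Connection con, String accountId) throws SQLException {
            try (PreparedStatement ps =
                    con.prepareStatement("SELECT balance FROM account WHERE id = ?")) {
                ps.setString(1, accountId);
                try (ResultSet rs = ps.executeQuery()) {
                    if (!rs.next()) throw new SQLException("no such account: " + accountId);
                    return rs.getDouble(1);
                }
            }
        }

        // Option 2 - call a Data Service that is already integrated with a
        // Security Service, so authorization is enforced once, in one place.
        interface DataService {
            double readBalance(String callerToken, String accountId);
        }
    }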

Now is the time to ask: can our legacy application act as a service (or an aggregation of services)? The answer may be "yes" if we transform the application according to a service-oriented model. That's a real opportunity if we can build or rebuild the application from scratch. Usually, though, the application is crucial to the business and/or not easily modifiable. Even in that case it's not a lost investment - but it will be if the application can't keep up with the architecture's evolution. For demonstration purposes, we'll describe a gradual transformation of a data store into a SOA Service.

From the Data Store to the Data Service
We'll start with a real-life example from a financial company providing support for a 401(k) plan over the Internet. Its customers complained that they couldn't see some of their investments. The errors appeared occasionally and were reported by the end users. It took several days for the issue to work its way up the management line and for developers to locate the related data store. The data store provider worked on the basis of a regular service agreement that said certain data had to be sent via a Web Service to the Web site component on demand. The data was always sent, but sometimes one data field contained a NULL. For the data store, it wasn't a mandatory field and it could legitimately be NULL. The service agreement said nothing about the quality of the data, while the Web site was developed on the assumption that the data field was always available and had a non-null value (as the data store usually represented it).
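
The failure mode is easy to reproduce on the consumer side. Below is a hypothetical Java sketch (the InvestmentRecord type is invented for illustration): the Web Service always responds, but an optional field may arrive as NULL, and a consumer written against an assumed non-null value fails.

    import java.math.BigDecimal;

    public class PortfolioPage {

        interface InvestmentRecord {        // hypothetical data-transfer type
            BigDecimal getPrice();          // optional on the provider's side
        }

        static String renderPrice(InvestmentRecord rec) {
            // The site was written as if the price were never null:
            //   return rec.getPrice().toPlainString();   // NullPointerException
            // A consumer coded to what the agreement actually promises
            // has to handle the gap:
            return rec.getPrice() == null ? "temporarily unavailable"
                                          : rec.getPrice().toPlainString();
        }
    }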

The request to the data store team to change the field constraint to disallow NULL was denied on the grounds that many other customers considered the field optional. The data store team's attitude was very typical - "We're glad to support SOA and we give you all that we have." Oops, there's the problem: the end users and the Web site weren't interested in what the data store could or couldn't do, but in correct portfolios. It's simple - if you want to wash your favorite Hawaiian shirt and the laundry only has bleach, will you use that service or go to another one?

Let's look at the situation from the data store's side and assume it's serious about providing a SOA Data Service. Figure 1 and Figure 2 demonstrate possible actions the data store's management can take to transition the data store into a data resource for the Service. The transition takes two phases - analysis and execution.

The analysis phase can be organized in two ways, depending on the demand for the Service: a) the Service is really required and its customers can be identified; b) the Service is just a proposal intended to stimulate customer demand. The activities involved in the analysis phase are applicable in both cases with slight modifications. In our example, the data field's requirements are already identified. Further activities can be:

  • Identify all dependent data and estimate the impact if the metadata has to be modified
  • Identify the sources of the data and related SLA, if applicable, i.e., the quality of the available data
  • If the Data Service has to engage other Services, review the relationship with the providers of the potential helper Services and related SLAs
  • Define a basic SLA for the Data Service to meet the requirements; there may be multiple SLAs for different customers but they can't be contradictory
  • Identify the customer community and its dynamics
  • Identify policy-based constraints on the Data Service (security, accessibility, internationalization, etc.)
  • Identify the available or needed software and hardware.
Based on the results of the analysis, we can determine whether a data transformation is needed for a particular SLA. This may mean modifying the data field's metadata, in particular the database constraint that prevents the value from being NULL. It's also very important to recognize all the risks - operational and programming - associated with the data and the Data Service, for adequate Risk Management and compliance with corporate and industry regulations. The findings will drive the execution phase.
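
One way to make such an SLA clause actionable is to express the data-quality rule as an executable check. The sketch below is illustrative only; the class and its fields are invented names, not the vocabulary of any SLA standard such as WSLA.

    import java.time.Duration;
    import java.time.Instant;

    public final class DataQualitySla {
        private final boolean nullAllowed;      // may the value be NULL?
        private final Duration maxStaleness;    // how old may the value be?

        public DataQualitySla(boolean nullAllowed, Duration maxStaleness) {
            this.nullAllowed = nullAllowed;
            this.maxStaleness = maxStaleness;
        }

        // True when a candidate value may be served under this SLA.
        public boolean accepts(Object value, Instant lastUpdated) {
            if (value == null) return nullAllowed;
            return Duration.between(lastUpdated, Instant.now())
                    .compareTo(maxStaleness) <= 0;
        }
    }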

The execution phase defines what has to be done and in which order, when intermediate decisions have to be made, and which controls have to be implemented and preserved during the transition. This phase ends in actually implementing the plan. The common rule is: when implementing a SOA Service or orchestrating a Service execution scenario, it's not always necessary to exclude human intervention. First, it may be too costly to automate everything; second, orchestration standards such as BPEL allow long-running transactions to integrate human actions into the service process.

Well, let's assume that the metadata for the selected data field can't be modified right away: a grace period is required to take care of all existing customers of the data. In that case, a temporary Transition Data Service may be the solution.

We then create an intermediary data field that meets the Data Service requirements identified in the analysis phase. The mechanism for refreshing the intermediary data field is also defined; it may be based on a schedule or on a value-change event initiated by a manual operation. The Transition Data Service binds to the intermediary data field as its data source.
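
Here is a sketch of the schedule-driven refresh variant, assuming the source exposes the original data field as a simple supplier; all names are illustrative.

    import java.time.Instant;
    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;
    import java.util.function.Supplier;

    public class IntermediaryFieldRefresher {
        private final ScheduledExecutorService timer =
                Executors.newSingleThreadScheduledExecutor();
        private volatile Double value;     // last SLA-conformant value
        private volatile Instant asOf;     // when that value was captured

        // Poll the original data field on a schedule; a NULL (or otherwise
        // non-conformant) reading never overwrites the last good value.
        public void start(Supplier<Double> sourceField) {
            timer.scheduleAtFixedRate(() -> {
                Double candidate = sourceField.get();
                if (candidate != null) {   // the SLA rule from the analysis phase
                    value = candidate;
                    asOf = Instant.now();
                }
            }, 0, 1, TimeUnit.HOURS);
        }

        public Double currentValue() { return value; }
        public Instant currentAsOf() { return asOf; }
    }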

The Transition Data Service has one specific feature: it provides the data with an indicator of "freshness," because the intermediary data field isn't updated when the new data doesn't meet the Service SLA. For example, if the Transition Data Service provides a Mutual Fund price and the latter has been set to NULL for today (e.g., the real price wasn't calculated by the deadline), the Transition Data Service will show the Fund price as "yesterday's" price, but not as NULL.
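
A minimal sketch of such a response, assuming a Java consumer; the PriceQuote type is invented for illustration. The response pairs the value with its "as of" date, so a stale-but-valid price is distinguishable from today's price and the consumer never sees a NULL.

    import java.math.BigDecimal;
    import java.time.LocalDate;

    public record PriceQuote(BigDecimal price, LocalDate asOf) {

        public boolean isCurrent(LocalDate today) {
            return asOf.equals(today);   // false means "yesterday's price", not an error
        }
    }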

Then we have to migrate the existing customers of the selected data to the intermediary data field, even those that don't use a Data Service. This is an operational process and it can take some time. The "delay" may even be good for the Data Service because it gets time to demonstrate its advantages. Simultaneously with the migration effort, we have to either find another data source that always has current data or work the issue out with the existing data provider.

When all the customers are using the intermediary data field and we have a proper data source, we promote the intermediary data field to a Master Data store and retire the initially selected data field. The Transition Data Service now becomes a fully scaled Data Service. Unfortunately, it's not always possible to get rid of the initial data field, because some immutable legacy applications can't migrate. But it's not a hopeless situation: those legacy applications can be temporarily buffered with a SQL substitution component (assuming only a few such applications are left). Technology evolution dictates that data source providers improve their services, and eventually the inappropriate data will go away. This will force the legacy applications either to take the new data or to retire.
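
One possible shape of such a SQL substitution buffer is a database view that presents the Master Data under the old names. The sketch below assumes a JDBC-accessible database; all table and column names are illustrative.

    import java.sql.Connection;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class LegacySqlBuffer {

        // Expose the promoted Master Data under the old table/column names so
        // the few remaining legacy applications keep working unchanged.
        static void installCompatibilityView(Connection con) throws SQLException {
            try (Statement st = con.createStatement()) {
                st.execute("CREATE VIEW legacy_fund_price AS "
                         + "SELECT fund_id, price AS old_price "
                         + "FROM master_fund_price");
            }
        }
    }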

Conclusion
The notion of a Service in a Service Oriented Architecture goes far beyond the definition of a new interface, even if it's a Web Service interface. Though the service interface is very important, the service provider has to provide the service itself, not merely access to an application whose architecture and functionality may be inadequate to the service's expected behavior.

We reviewed a few service characteristics and identified Service Level Agreements as an instrument for effective service interoperability and as a driver for service-oriented transformation of a service provider's architecture. The conclusion that a Web Service alone isn't enough to constitute a service in a SOA was demonstrated by a real-life case of transforming a regular data store into a Data Service provider for SOA.

Webopedia - SOA definition: Abbreviated SOA, an application architecture in which all functions, or services, are defined using a description language and have invokable interfaces that are called to perform business processes. Each interaction is independent of each and every other interaction and the interconnect protocols of the communicating devices (i.e., the infrastructure components that determine the communication system do not affect the interfaces). Because interfaces are platform-independent, a client from any device using any operating system in any language can use the service.

Though built on similar principles, SOA is not the same as Web services, which indicates a collection of technologies, such as SOAP and XML. SOA is more than a set of technologies and runs independent of any specific technologies.

Webopedia - Data Warehouse definition: Abbreviated DW, a collection of data designed to support management decision making. Data warehouses contain a wide variety of data that present a coherent picture of business conditions at a single point in time.

Development of a data warehouse includes development of systems to extract data from operating systems plus installation of a warehouse database system that provides managers flexible access to the data.

The term data warehousing generally refers to the combination of many different databases across an entire enterprise. Contrast with data mart.

References

  1. Sutor, Bob. "Something Old, Something New: Integrating Legacy Systems." www.ebizq.net/topics/legacy_integration/features/5229.html?&pp=1
  2. "Web Service Level Agreements." www.research.ibm.com/wsla/WSLASpecV1-20030128.pdf
  3. Schmelzer, Ronald. "What Belongs in a Service Contract?" http://searchwebservices.techtarget.com/tip/1,289483,sid26_gci1120180,00.html
  4. Meehan, Michael. "HP Looks to Give Legacy an SOA Upgrade." http://searchwebservices.techtarget.com/originalContent/0,289142,sid26_gci1144084,00.html
  5. Teubner, Russ. "Integrating CICS Applications as Web Services." SWSJ. http://webservices.sys-con.com/read/39850.htm
  6. "Integrate Existing Assets and Create New Functionality." www.softwareag.com/Corporate/products/cv/leg_int/default.asp
  7. "Application Modernization & Legacy-to-SOA." www.interactive-objects.com/solutions/application-modernization/legacy_modernization_eng.pdf
  8. Poulin, Michael. "Entitlement to Data." JDJ, Vol. 10, Issue 12, 2005. http://java.sys-con.com/author/poulin.htm
  9. "Service Oriented Legacy Architecture - SOA CICS." www.soa.com/index.php/section/products/sola/
  10. "Business Process Execution Language for Web Services." www-128.ibm.com/developerworks/library/specification/ws-bpel/
  11. "Web Services-Interoperability Basic Profile." http://publib.boulder.ibm.com/infocenter/wasinfo/v6r0/index.jsp?topic=/com.ibm.websphere.base.doc/info/aes/ae/cwbs_wsiprofile.html
  12. "Web Services Security (WS-Security)." www-128.ibm.com/developerworks/library/ws-secure/

About the Author

Michael Poulin works as an enterprise-level solution architect in the financial industry in the UK. He is a Sun Certified Architect for Java Technology, certified TOGAF Practitioner, and Licensed ZapThink SOA Architect. Michael specializes in distributed computing, SOA, and application security.
