Best Practices for SOA: Building a Data Services Layer

Choosing best-of-breed data access middleware is an important enabler of SOA

Properly architected SOA presents both business functionality and data as abstracted services. Applied to data access, this means abstracting access as data services and moving the access code into supporting infrastructure, so that problems can be addressed and changes supported throughout the environment in a far more loosely coupled and flexible manner. In essence, the data services layer provides a single abstracted point of entry for all data read, update, and creation operations, offering a holistic view of the data models that the underlying persistence layer relies on. It acts as a bridge between the business services and the underlying persistence layer; business users needn’t concern themselves with whether the data they consume originated in a database, an enterprise application, a file system, another company, or anywhere else for that matter. This promise of ubiquitously accessible data, freed from the constraints of its sources, is liberating for companies struggling with the challenges of systems integration.
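
To make the abstraction concrete, the sketch below (in Java, with hypothetical type and method names throughout) shows a business service consuming customer data strictly through a data-service contract; nothing in the calling code reveals whether the record came from a database, a packaged application, or a file.

```java
// Hypothetical consumer-side view of a data services layer (all names invented).
record Customer(String customerId, String name) {}

record Invoice(String customerId, String billTo) {
    static Invoice draftFor(Customer c) {
        return new Invoice(c.customerId(), c.name());
    }
}

// The contract the business tier codes against; it says nothing about
// where the customer data actually lives.
interface CustomerDataService {
    Customer findByCustomerId(String customerId);
    void update(Customer customer);
}

// A business service that consumes customer data without knowing whether it
// originated in a database, an ERP package, a file, or a partner feed.
class BillingService {
    private final CustomerDataService customers;

    BillingService(CustomerDataService customers) {
        this.customers = customers;
    }

    Invoice prepareInvoice(String customerId) {
        Customer customer = customers.findByCustomerId(customerId);
        return Invoice.draftFor(customer);
    }
}
```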

A data services layer must provide an interface that exposes a standard set of reusable data services for reading and writing data, independent of the underlying data sources. The loose coupling it creates between the applications using the services and the underlying data source providers lets database architects modify, combine, relocate, or even remove underlying data sources without requiring changes to the interfaces that the data services expose. As a result, architects retain control over the structure of their data while providing relevant information to the applications that need it. Over time, this added flexibility eases the maintenance of enterprise applications.
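
Under the same assumptions, the following sketch shows two interchangeable providers behind the contract introduced above: one backed by a relational database through standard JDBC, the other fed from a packaged application. Because consumers compile only against the interface, an architect can move the data from one provider to the other without changing the exposed data services. Table and column names are invented for the example.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import javax.sql.DataSource;

// Provider 1: the CustomerDataService contract fulfilled from a relational
// database through standard JDBC.
class JdbcCustomerDataService implements CustomerDataService {
    private final DataSource dataSource;

    JdbcCustomerDataService(DataSource dataSource) {
        this.dataSource = dataSource;
    }

    @Override
    public Customer findByCustomerId(String customerId) {
        String sql = "SELECT customer_id, name FROM customer WHERE customer_id = ?";
        try (Connection conn = dataSource.getConnection();
             PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setString(1, customerId);
            try (ResultSet rs = ps.executeQuery()) {
                if (!rs.next()) {
                    throw new IllegalArgumentException("No such customer: " + customerId);
                }
                return new Customer(rs.getString("customer_id"), rs.getString("name"));
            }
        } catch (SQLException e) {
            throw new RuntimeException("Customer lookup failed", e);
        }
    }

    @Override
    public void update(Customer customer) {
        // UPDATE statement elided for brevity; it would follow the same pattern.
    }
}

// Provider 2: the same contract fulfilled from an in-memory snapshot refreshed
// from a packaged application. Callers cannot tell the two providers apart.
class CachedErpCustomerDataService implements CustomerDataService {
    private final Map<String, Customer> snapshot = new ConcurrentHashMap<>();

    void refresh(Iterable<Customer> customersFromErp) {
        for (Customer c : customersFromErp) {
            snapshot.put(c.customerId(), c);
        }
    }

    @Override
    public Customer findByCustomerId(String customerId) {
        return snapshot.get(customerId);
    }

    @Override
    public void update(Customer customer) {
        snapshot.put(customer.customerId(), customer);
    }
}
```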

Before the advent of SOA, developers hand-coded the capabilities that an abstracted data services layer could provide, tightly embedding that code into the application under construction. Embedding data access and data abstraction code directly into applications limits the flexibility and reusability of the resulting applications, so enterprises turned to traditional middleware such as Extract, Transform, Load (ETL) and Enterprise Application Integration (EAI) products to provide the capabilities of the data services layer. The ETL approach is best suited to static applications that don’t require the flexibility SOA can provide; it can also be costly and carries high management overhead. The EAI approach centralizes data interaction logic, but still fails to provide the flexibility many organizations require from their data services layer.

Even when an organization is implementing true SOA, however, a poorly architected data services layer can lead to performance issues. In many cases, each application has its own database that contains a duplicate copy of business reference data such as customer information, product information, and inventory levels, as shown in Figure 1.

The organization must synchronize such databases on a regular schedule, which can lead to stale or inconsistent data across applications. In this case, even when an organization has exposed application functionality as loosely coupled services, the persistence tier still limits the flexibility and reusability of those services. Incorporating a data services layer into the SOA implementation addresses this problem. Data services offer transaction and connection management to multiple applications, as shown in Figure 2.
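
As a rough illustration of transaction and connection management living inside the layer, the hypothetical data service below owns the connection pool (a standard javax.sql.DataSource) and the transaction boundary, so no consuming application ever opens a connection or issues a commit of its own. Table and column names are invented for the example.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import javax.sql.DataSource;

// The data service, not the calling applications, decides where the
// transaction begins and ends. Every consumer that reserves inventory goes
// through this one method, so the stock decrement and the reservation record
// always commit or roll back together.
class InventoryDataService {
    private final DataSource dataSource;   // pooled connections owned by the layer

    InventoryDataService(DataSource dataSource) {
        this.dataSource = dataSource;
    }

    void reserveStock(String sku, int quantity, String orderId) {
        try (Connection conn = dataSource.getConnection()) {
            conn.setAutoCommit(false);      // transaction boundary lives here
            try {
                try (PreparedStatement decrement = conn.prepareStatement(
                        "UPDATE inventory SET on_hand = on_hand - ? WHERE sku = ? AND on_hand >= ?")) {
                    decrement.setInt(1, quantity);
                    decrement.setString(2, sku);
                    decrement.setInt(3, quantity);
                    if (decrement.executeUpdate() == 0) {
                        throw new IllegalStateException("Insufficient stock for " + sku);
                    }
                }
                try (PreparedStatement reservation = conn.prepareStatement(
                        "INSERT INTO reservation (order_id, sku, quantity) VALUES (?, ?, ?)")) {
                    reservation.setString(1, orderId);
                    reservation.setString(2, sku);
                    reservation.setInt(3, quantity);
                    reservation.executeUpdate();
                }
                conn.commit();
            } catch (Exception e) {
                conn.rollback();            // callers never see a half-applied change
                throw new RuntimeException("Reservation failed for order " + orderId, e);
            }
        } catch (SQLException e) {
            throw new RuntimeException("Could not obtain a connection", e);
        }
    }
}
```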

The data services layer manages the relationships between the data services and ensures that each application is aware of all data changes, regardless of the cause of the change. Leveraging data services as infrastructure services beneath the business service abstraction increases reusability and flexibility, and shortens development and rollout time for new services. A well-architected data services layer relies on consistent, high-performance data access middleware that leverages industry-standard APIs such as ODBC, JDBC, ADO.NET, and SDO.
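
One simple way to picture how the layer keeps each application aware of data changes is a notification hook wrapped around a data service. The listener mechanism below is a hypothetical sketch layered over the providers shown earlier, not a feature of ODBC, JDBC, ADO.NET, or SDO; because every write funnels through the layer, it is the one place that can announce a change regardless of its cause.

```java
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

// Hypothetical change-notification hook inside the data services layer.
interface CustomerChangeListener {
    void customerChanged(String customerId);
}

class NotifyingCustomerDataService implements CustomerDataService {
    private final CustomerDataService delegate;   // e.g. the JDBC-backed provider above
    private final List<CustomerChangeListener> listeners = new CopyOnWriteArrayList<>();

    NotifyingCustomerDataService(CustomerDataService delegate) {
        this.delegate = delegate;
    }

    void addListener(CustomerChangeListener listener) {
        listeners.add(listener);
    }

    @Override
    public Customer findByCustomerId(String customerId) {
        return delegate.findByCustomerId(customerId);
    }

    @Override
    public void update(Customer customer) {
        delegate.update(customer);
        // Fan the change out to every registered application, whether the
        // update came from the web front end, a batch job, or an admin tool.
        for (CustomerChangeListener listener : listeners) {
            listener.customerChanged(customer.customerId());
        }
    }
}
```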

To sum it up, building a data services layer as part of an SOA implementation provides the infrastructure necessary to reap the full benefits of SOA. Specifically, it provides the following solutions to common data problems:

  • Reduced reliance on hand-coded persistence logic: By building abstracted, shared data services, it’s possible to leverage best-of-breed data access middleware to support the data services as part of an architected plan.
  • More flexible integration: Instead of the point-to-point or hard-coded integration of traditional approaches, the data services layer provides loose coupling at the persistence tier.
  • Removal of data query bottlenecks: The data services layer lets organizations apply content-based routing techniques that steer queries away from bottlenecks, as sketched after this list.
  • Improved data consistency: The data services layer functions as a single locus for data access in the enterprise, helping an organization drive toward a single source of truth for its data. Implementing a data services layer helps ensure that applications always pull data from the correct sources and consistently provide appropriate information back to all applications.
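
As a rough illustration of the content-based routing mentioned above, the sketch below routes order queries by the date range in the request: recent windows go to the operational database, historical windows to a reporting replica, so long-running queries never queue behind transactional traffic. The source names and the 90-day cutover are purely illustrative.

```java
import java.time.LocalDate;
import java.util.List;

// A hypothetical source of order data; two implementations sit behind the layer.
interface OrderSource {
    List<String> findOrderIds(String customerId, LocalDate from, LocalDate to);
}

// Content-based routing inside the data services layer: the routing decision is
// driven by the content of the request (the requested date range), not by which
// application issued it.
class RoutingOrderDataService {
    private final OrderSource operationalDb;     // low-latency current data
    private final OrderSource reportingReplica;  // bulk and historical queries

    RoutingOrderDataService(OrderSource operationalDb, OrderSource reportingReplica) {
        this.operationalDb = operationalDb;
        this.reportingReplica = reportingReplica;
    }

    List<String> findOrderIds(String customerId, LocalDate from, LocalDate to) {
        LocalDate cutover = LocalDate.now().minusDays(90);
        // Historical windows are steered to the replica so they never queue
        // behind the transactional workload on the operational database.
        OrderSource target = from.isBefore(cutover) ? reportingReplica : operationalDb;
        return target.findOrderIds(customerId, from, to);
    }
}
```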

Without doubt, architects focused on SOA must take a hard look at solving their data access challenges in order to plan a successful, loosely coupled architecture. Accessing data in the various stores across the enterprise in the most efficient, flexible manner is the basis for building data services, and is thus critical to building all the services in the SOA implementation.

More Stories By John Goodson

As vice-president of product operations, John Goodson leads the product strategy, direction, and development efforts at DataDirect Technologies. For more than 10 years, he has worked closely with Sun and Microsoft on the development and evolution of database connectivity standards including J2EE, JDBC, .NET, ODBC, and ADO. His active memberships in various standards committees, including the JDBC Expert Group, have helped Goodson's team develop the most technically advanced data connectivity technologies. He holds a BS in computer science from Virginia Tech.

More Stories By Jason Bloomberg

Jason Bloomberg is a leading IT industry analyst, Forbes contributor, keynote speaker, and globally recognized expert on multiple disruptive trends in enterprise technology and digital transformation. He is ranked #5 on Onalytica’s list of top Digital Transformation influencers for 2018 and #15 on Jax’s list of top DevOps influencers for 2017, the only person to appear on both lists.

As founder and president of Agile Digital Transformation analyst firm Intellyx, he advises, writes, and speaks on a diverse set of topics, including digital transformation, artificial intelligence, cloud computing, devops, big data/analytics, cybersecurity, blockchain/bitcoin/cryptocurrency, no-code/low-code platforms and tools, organizational transformation, internet of things, enterprise architecture, SD-WAN/SDX, mainframes, hybrid IT, and legacy transformation, among other topics.

Mr. Bloomberg’s articles in Forbes are often viewed by more than 100,000 readers. During his career, he has published over 1,200 articles (over 200 for Forbes alone), spoken at over 400 conferences and webinars, and he has been quoted in the press and blogosphere over 2,000 times.

Mr. Bloomberg is the author or coauthor of four books: The Agile Architecture Revolution (Wiley, 2013), Service Orient or Be Doomed! How Service Orientation Will Change Your Business (Wiley, 2006), XML and Web Services Unleashed (SAMS Publishing, 2002), and Web Page Scripting Techniques (Hayden Books, 1996). His next book, Agile Digital Transformation, is due within the next year.

At SOA-focused industry analyst firm ZapThink from 2001 to 2013, Mr. Bloomberg created and delivered the Licensed ZapThink Architect (LZA) Service-Oriented Architecture (SOA) course and associated credential, certifying over 1,700 professionals worldwide. He is one of the original Managing Partners of ZapThink LLC, which was acquired by Dovel Technologies in 2011.

Prior to ZapThink, Mr. Bloomberg built a diverse background in eBusiness technology management and industry analysis, including serving as a senior analyst in IDC’s eBusiness Advisory group, as well as holding eBusiness management positions at USWeb/CKS (later marchFIRST) and WaveBend Solutions (now Hitachi Consulting), and several software and web development positions.
