
Juggling Multiple Data Models with Services

Guidelines for Success

Recently, a client approached me with a quandary.  When designing XML schemas for Web services, how do you balance the desire to use industry standards such as UBL (Universal Business Language) or CICA (Context Inspired Component Architecture) to support data interoperability with the unique needs of particular domains and sub-systems within the enterprise?  The client’s service design team is rightfully concerned with the competing interests of internal enterprise standardization, interoperability with external entities, and the unique needs of local domains and process constraints. How can these competing needs be effectively managed when designing the schema for a given service?

The Standard Answer

Naturally, you start by giving the standard answer - “it depends”.  This is an essential and carefully crafted phrase that all consultants are taught to give according to page 17 of How to Win Clients and Influence Budgets. Of course, the standard answer is rarely sufficient, but it is not without merit.  It is certainly true that the decision regarding your data model design depends significantly upon a host of factors:

  • Which use cases are high priority?
  • Which use cases are high impact (to stakeholders and/or to customers)?
  • Which systems and/or processes are mission critical?
  • Are you including industry standard data models in your enterprise out of necessity (i.e., ONLY when you are forced to interact with other entities), or is it part of a broader modernization effort within your organization?
  • Do interoperability challenges or integration challenges currently contribute most significantly to your development and/or maintenance costs?
  • Which interaction scenarios are of higher priority: internal or external service calls?
  • Would aligning more closely to one of the available models lead to a significant reduction in data mapping activities or is the environment simply too fragmented to support this?
  • Is there one particular system or business process that is unique and is skewing perceptions regarding your service data models?

While all of these questions (and many more) are important and help facilitate a thorough examination of the problem, we rarely have the time to examine each problem from all possible angles.  Consequently, we must look toward guidelines and rules of thumb.

Guidelines for Managing Multiple Service Schemas

1) Your business processes should work with only a single data model if at all possible.  Business processes are best designed to be data model agnostic and to operate on an internal, process-centric model.  I explained the importance of process-centric data models in a previous post.

2) Your services should work with as few data models as possible (one model being preferable).  This is just good common sense.  For each data model that is added to the mix, your development time and long-term maintenance costs increase at a non-linear rate, and the pain intensifies rapidly with each new model.

3) If your services are going to work with multiple models, put a taxonomy or categorization scheme in place to distinguish the data models used by services.  For example, outward-facing services might use a data model accepted more broadly by the industry, services used by one LOB might use a certain data model, and services used by another LOB might use another.  Another distinction could be infrastructure services vs. data services vs. application services.  Regardless of the approach, there needs to be a methodology that is objective and governable for deciding when one data model is used vs. when another is used (see the registry sketch after these guidelines).

4) Data model transformation is a necessary evil.  It should be done only when necessary, and you should contain the transformation to a designated component.  Transformation activities should be handled by intermediaries (data services, an ESB, a network appliance, or another mediation framework) when possible rather than built into the internals of a service or process.  This keeps your services clean, provides a reusable transformation mechanism, keeps your integrations more loosely coupled, and provides for agility and extensibility in the future (see the mediation sketch below).
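
To make guideline 3 concrete, here is a minimal sketch of a governable taxonomy expressed as an in-code registry.  Everything in it is an assumption for illustration: the category names, the model names (UBL for outward-facing services, LOB-specific canonical models internally), and the idea of enforcing the scheme with a design-time check rather than a dedicated governance tool.

```java
import java.util.EnumMap;
import java.util.Map;

// A minimal sketch of a governable data-model taxonomy. The categories
// and model names below are illustrative assumptions, not a prescription.
public class DataModelTaxonomy {

    enum ServiceCategory { EXTERNAL_FACING, FINANCE_LOB, LOGISTICS_LOB, INFRASTRUCTURE }

    // One approved model per category keeps the scheme objective and governable.
    private final Map<ServiceCategory, String> approvedModels =
            new EnumMap<>(ServiceCategory.class);

    public DataModelTaxonomy() {
        approvedModels.put(ServiceCategory.EXTERNAL_FACING, "UBL 2.x");
        approvedModels.put(ServiceCategory.FINANCE_LOB, "Finance canonical model v3");
        approvedModels.put(ServiceCategory.LOGISTICS_LOB, "Logistics canonical model v1");
        approvedModels.put(ServiceCategory.INFRASTRUCTURE, "Internal infrastructure schema");
    }

    // Design-time governance check: does a proposed service use the approved model?
    public boolean isApproved(ServiceCategory category, String proposedModel) {
        return proposedModel.equals(approvedModels.get(category));
    }

    public static void main(String[] args) {
        DataModelTaxonomy taxonomy = new DataModelTaxonomy();
        // An outward-facing service proposing a LOB model should be flagged.
        System.out.println(taxonomy.isApproved(
                ServiceCategory.EXTERNAL_FACING, "Finance canonical model v3")); // false
    }
}
```

The exact mechanism matters less than the property it enforces: a service’s category determines its data model, so the decision is never left to individual project teams.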
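
For guidelines 1 and 4, here is a minimal sketch of a standalone mediation component built on the standard Java XSLT API (javax.xml.transform).  The single mediator class and the stylesheet path are assumptions for illustration; in practice this logic would live in your ESB, data services layer, or other intermediary rather than in a hand-rolled class.

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

// A minimal sketch of a mediation component: all external-to-internal
// mapping knowledge lives in one XSLT stylesheet, so neither the service
// nor the business process ever sees the external data model.
public class TransformationMediator {

    private final Transformer transformer;

    public TransformationMediator(String xsltPath) throws Exception {
        // xsltPath (e.g., "external-to-internal.xsl") is hypothetical; the
        // stylesheet is the single, governable transformation artifact.
        transformer = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(xsltPath));
    }

    // Transforms an inbound message in the external model into the internal,
    // process-centric model. Note: Transformer is not thread-safe, so a real
    // intermediary would pool instances or create one per request.
    public String mediate(String externalXml) throws Exception {
        StringWriter internalXml = new StringWriter();
        transformer.transform(new StreamSource(new StringReader(externalXml)),
                new StreamResult(internalXml));
        return internalXml.toString();
    }
}
```

The payoff is placement: the process on the far side of the mediator works with a single internal model (guideline 1), and every mapping rule is contained in one designated component (guideline 4).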

Juggling multiple data models within a service-oriented environment is no one’s idea of fun.  When possible, aim for a more comprehensive and strategic analysis of the environment (see ‘The Standard Answer’ outlined above).  When this is not realistic, try to use the above guidelines and rules of thumb to help you tactically navigate the murky waters of data model incongruity.  Service design isn’t easy, but it doesn’t have to be rocket surgery either.  Best of luck!

More Stories By Kyle Gabhart

Kyle Gabhart is a subject matter expert specializing in strategic planning and tactical delivery of enterprise technology solutions, blending EA, BPM, SOA, Cloud Computing, and other emerging technologies. Kyle currently serves as a director for Web Age Solutions, a premier provider of technology education and mentoring. Since 2001 he has contributed extensively to the IT community as an author, speaker, consultant, and open source contributor.
