Open SOA Collaboration

The birth of another standard

Last month an alliance of leading vendors announced progress on specifications to define a language-neutral programming model for application development in SOA environments. They call this effort Open SOA Collaboration. In essence, they are proposing a new standard for creating and managing services, making the process of integrating different third-party SOA technologies "less onerous," as they put it. Or, we can call this a standard way of delivering services, making it easier to work and play well together.

So, who's in the gang? BEA, IBM, Oracle, and SAP first got together last November to begin work on the common programming model, along with IONA, Sybase, Xcalia, and Zend Technologies. Others have since joined the mix, including Software AG and Red Hat.

This group has concentrated its efforts on two projects: Service Component Architecture (SCA) and Service Data Objects (SDO). If this sounds familiar, it should. We've seen this type of standard before with components, distributed objects, and, more recently, Java.

SCA aims to provide a model for creating service components in a wide range of languages, and a model for assembling those components into a business solution. In essence, it is a standard that defines how services are created so that they can interact with one another without a lot of custom integration work. This will benefit those looking to create composite applications from these services.

SCA encourages organizing business application code as components that implement business logic. A component offers its capabilities through services, and consumes functions offered by other components through references. SCA divides building a service-oriented application into two major parts: implementing components that provide services and consume other services, and assembling sets of components into business applications by wiring references to services.
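
To make that concrete, here is a minimal sketch of an SCA component in Java, assuming the SCA 1.0 Java annotations (org.osoa.sca.annotations); the OrderService and InventoryService interfaces are hypothetical, invented for illustration.

// OrderService.java - the service interface this component offers.
public interface OrderService {
    String placeOrder(String itemId, int quantity);
}

// InventoryService.java - a service this component consumes, provided
// by some other component (hypothetical, for illustration only).
public interface InventoryService {
    boolean isInStock(String itemId, int quantity);
}

// OrderComponent.java - business logic only; no wiring or transport details.
import org.osoa.sca.annotations.Reference;
import org.osoa.sca.annotations.Service;

@Service(OrderService.class)
public class OrderComponent implements OrderService {

    // A reference: the assembly wires this to whichever component
    // provides InventoryService.
    @Reference
    protected InventoryService inventory;

    public String placeOrder(String itemId, int quantity) {
        return inventory.isInStock(itemId, quantity) ? "CONFIRMED" : "BACKORDERED";
    }
}

Note that the class contains only business logic; how the reference is actually reached (local call, Web service, message queue) is decided at assembly time.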

SCA stresses decoupling the service implementation and service assembly from the details of the infrastructure capabilities and the access methods used to invoke services. SCA components operate at a business level, according to the spec.

SDO is looking to provide a consistent way of handling data in applications, whatever its source or format may be. Okay, that would be data abstraction. Moreover, SDO provides a way to unify data handling for databases and services.

It's clear that SDO is designed to unify the way in which SOA applications handle data. Using SDO, application programmers can uniformly access and manipulate data from heterogeneous data sources, including relational databases, XML data sources, Web Services, and enterprise information systems.
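
As a rough sketch of what that uniformity looks like in code, the fragment below uses the generic commonj.sdo.DataObject API; the CustomerMediator interface and the property names are hypothetical, invented for illustration.

import commonj.sdo.DataObject;

// A hypothetical mediator interface; real mediator APIs vary by
// implementation and by backing data source.
interface CustomerMediator {
    DataObject getCustomer(String id);
}

public class SdoAccessSketch {
    static void updateCity(CustomerMediator mediator) {
        // The caller sees only the generic DataObject API, not JDBC,
        // XML parsing, or Web-service plumbing.
        DataObject customer = mediator.getCustomer("C-1001");
        String name = customer.getString("name");            // path-based read
        DataObject address = customer.getDataObject("address");
        address.setString("city", "Boston");                 // same API to write
        System.out.println("Updated city for " + name);
    }
}

The same getString and setString calls work whether the graph behind them was populated from a relational table, an XML document, or a service response.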

SDO is based on the concept of disconnected data graphs: collections of tree-structured or graph-structured data objects. Under this architecture, a client retrieves a data graph from a data source, mutates it, and then applies the changes back to the data source.

Databases are connected to applications by data mediator services. A client application queries a mediator and receives a data graph in response; once it has made its changes, it sends the updated graph back to the mediator, which applies the updates to the original data source. This architecture lets applications deal principally with data graphs and data objects rather than with source-specific APIs.
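
The round trip might look roughly like the sketch below, again using the commonj.sdo API; the OrderMediator interface and the property names are invented for illustration, and it assumes the mediator turned on change logging (ChangeSummary.beginLogging()) before handing the graph out.

import commonj.sdo.ChangeSummary;
import commonj.sdo.DataGraph;
import commonj.sdo.DataObject;

// Hypothetical mediator API, invented for illustration.
interface OrderMediator {
    DataGraph fetchOrders(String customerId); // retrieve a disconnected graph
    void applyChanges(DataGraph graph);       // push the recorded deltas back
}

public class DisconnectedGraphSketch {
    static void reviseOrder(OrderMediator mediator) {
        // 1. Retrieve: get a graph, then work disconnected from the source.
        DataGraph graph = mediator.fetchOrders("C-1001");

        // 2. Mutate: the change summary records edits while logging is on.
        DataObject order = graph.getRootObject().getDataObject("orders[1]");
        order.setInt("quantity", 5);

        // 3. Apply: hand the graph back; the mediator reads the change
        //    summary and writes only the deltas to the original source.
        ChangeSummary changes = graph.getChangeSummary();
        if (!changes.getChangedDataObjects().isEmpty()) {
            mediator.applyChanges(graph);
        }
    }
}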

New? No. Interesting? Sure. We've seen these types of standards before with the rise of client/server, CORBA, and Java, all looking to provide standard mechanisms for binding all of these things together to form applications. The SDO concept especially has been done to death, with some successes and some classic failures.

As always, the real battle to be won here is developers' acceptance of these standards. For that, the vendors have to implement the standards in the very same way...something that's been tough to do in the past. So they'll have to put aside their desire to stand out and focus on being the same...an unnatural act for most.

It will also be interesting to see where this standard goes in the context of BPEL and other standards that address the same solution patterns. At the end of the day, standards are only useful if there's one for each problem pattern. So far in the world of SOA, we have three or more standards for each problem pattern, and those who consume the technology won't commit until that overlap is resolved. Once bitten, twice shy.

More Stories By David Linthicum

David Linthicum is the Chief Cloud Strategy Officer at Deloitte Consulting, and was recently named the #1 cloud influencer in a major report by Apollo Research. He is a cloud computing thought leader, executive, consultant, author, and speaker. He has been a CTO five times, for both public and private companies, and a CEO twice in the last 25 years.

Few individuals are true giants of cloud computing, but David's achievements, reputation, and stellar leadership have earned him a lofty position within the industry. It's not just that he is a top thought leader in the cloud computing universe; he is often the visionary the wider media invite to offer their readers, listeners, and viewers a peek inside the technology that is reshaping businesses every day.

With more than 13 books on computing, more than 5,000 published articles, more than 500 conference presentations and numerous appearances on radio and TV programs, he has spent the last 20 years leading, showing, and teaching businesses how to use resources more productively and innovate constantly. He has expanded the vision of both startups and established corporations as to what is possible and achievable.

David is a Gigaom research analyst and writes prolifically for InfoWorld as a cloud computing blogger. He is also a contributor to IEEE Cloud Computing and TechTarget's SearchCloud and SearchAWS, and is quoted in major business publications including Forbes, Business Week, The Wall Street Journal, and the LA Times. David has appeared on NPR several times as a computing industry commentator and does a weekly podcast on cloud computing.
