Building SOA with Tuscany SCA

A simple service-oriented infrastructure

Many articles have already been written about service-oriented architecture (SOA) and Service Component Architecture (SCA); see, for example, references [1] and [2]. In this article we'll focus on a freely available, open source implementation of the Service Component Architecture that provides a simple way to implement SOA solutions. This SCA implementation is being developed in the Apache Tuscany Incubator project. The project started in 2006 and is being used by many who are looking for a simple SOA infrastructure. Tuscany SCA version 1.0, released in September 2007, supports the Service Component Architecture 1.0 specifications. In addition to implementing the SCA specifications, Tuscany is also a nursery for new ideas. Some of these ideas will find their way into the specifications, and some will remain extensions available in Apache Tuscany Incubator. For example, support for Ruby, JavaScript, XQuery, data binding, and Web 2.0 currently goes beyond the specifications.

This article walks you through what is available in Apache Tuscany Incubator and, in doing so, highlights the benefits of SCA.

Using Tuscany SCA
A Common Approach to Application Construction and Deployment

Enterprise software development is increasingly influenced by technology choices, regulations, competition, and expectations of responsiveness to change. Enterprises need the flexibility to adopt new business practices (such as a bank outsourcing its mortgage handling), enforce new regulations, and extend or downsize without much cost (mergers and acquisitions). In addition, as the complexity of the enterprise grows, a common management paradigm becomes a necessity for managing business applications. Service Component Architecture provides a simple programming model to address these challenges. SCA's simple language maps easily to the business. Suppose we are building a banking application that handles account inquiries. Table 1 maps business-level questions to SCA.

SCA provides a consistent model of distributed applications and of the components from which they are constructed. This model explicitly separates business logic (Component/Services/References) from the details of how a running application is assembled (Composite/Wire) and deployed. This promotes a common terminology and supports a common understanding of the capability of applications and the way those applications work together. This common model also provides the hooks for tooling, governance, monitoring, and management in the service-oriented world.

When it comes to building a solution for real, one of the most important questions is likely to be "how can existing IT infrastructure and skills be used?" Tuscany SCA does not invent new technologies for component implementations (Implementation) or message exchange (Binding). It requires you to learn neither a new programming language nor new communication protocols. You are free to leverage your existing investment in applications, technology, and skills as long as suitable support exists in Tuscany SCA. This is not much of a hurdle: Tuscany SCA has a straightforward extensibility model, so new or proprietary technologies can easily be included.

The following sections describe Tuscany SCA in the context of three familiar scenarios. It should be noted that Tuscany SCA is not restricted to these scenarios. The sample code and configuration used here can be found in the Tuscany SCA Java distribution [3] and are available under the Apache License [4].

Enterprise Applications
In a typical enterprise, business functions are implemented using various technologies, business data is represented in different formats, and business applications communicate using heterogeneous protocols. It is almost impossible to converge all applications onto one technology stack, such as Web services, and so it remains difficult and costly to integrate the different applications in an enterprise. Enterprises face many challenges, including the following:

•  Business applications are tightly coupled to the IT infrastructure, and early design decisions have to be made before real deployment.
•  Application developers are forced to learn and understand many technologies beyond their business domain knowledge.
•  Business logic is polluted with, and coupled to, the technology-specific API calls imposed by the IT infrastructure; it is not easy to write and not easy to change.

SCA separates business services from the concerns related to specific hardware, software, and network protocols by providing a unified programming model that allows the SCA runtime to handle these issues transparently. Let's look at a simple business scenario to see how Tuscany SCA can help with enterprise application integration. The scenario here is the BigBank demo from the Tuscany SCA distribution [5]. As illustrated in Figure 1, the application comprises a number of assembled components and ultimately returns a total account balance in response to account inquiries.

The use of the SCA programming model allows the BigBank developer to decouple the process of designing and creating the scenario from infrastructure concerns. In the BigBank composite, basic units of business logic are modelled as SCA components called AccountComponent, StockQuoteComponent, etc. Their business logic is implemented using Java and various scripting languages. Components are assembled by wiring references to services. Once all business logic is implemented, appropriate bindings are applied to references and services to indicate how the components should communicate.
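
To make this more concrete, here is a minimal sketch of what the Java side of such a component can look like, written with the SCA 1.0 Java annotations (org.osoa.sca.annotations) that Tuscany supports. The implementation class and reference names mirror the AccountServiceComponent configuration shown below; the interfaces, method names, and values are illustrative assumptions rather than code taken verbatim from the BigBank sample.

// AccountServiceImpl.java - condensed into a single file for readability.
package bigbank.account;

import org.osoa.sca.annotations.Reference;
import org.osoa.sca.annotations.Remotable;
import org.osoa.sca.annotations.Service;

// Illustrative business interfaces; the real sample defines its own.
@Remotable
interface AccountService {
    double getTotalValue(String customerID);
}

interface StockQuoteService {
    double getQuote(String symbol);
}

interface CalculatorService {
    double add(double a, double b);
}

// Pure business logic: no binding- or protocol-specific code appears here.
@Service(AccountService.class)
public class AccountServiceImpl implements AccountService {

    // Injected by the SCA runtime according to the wiring in the composite file.
    @Reference
    protected StockQuoteService stockQuoteService;

    @Reference
    protected CalculatorService calculatorService;

    public double getTotalValue(String customerID) {
        double stockValue = stockQuoteService.getQuote("IBM");  // placeholder symbol
        double cashBalance = 1500.0;                            // placeholder account data
        return calculatorService.add(stockValue, cashBalance);
    }
}

The class depends only on plain Java interfaces; which components satisfy those references, and over which protocols they are reached, is decided entirely in the composite configuration.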

The XML-based SCA configuration language describes all of the information about loosely coupled enterprise services and the bindings to be used. Since binding information can be changed in the SCA configuration without changing the business logic, the implementation code is not polluted with protocol handling information and, furthermore, bindings can be changed during deployment without impacting the application.

The following SCA configuration shows the AccountService exposed using JSONRPC (binding.jsonrpc) and Web services (binding.ws). The service can easily be made accessible over RMI by simply adding binding.rmi.

<component name="AccountServiceComponent">
    <implementation.java class="bigbank.account.AccountServiceImpl" />

    <service name="AccountService">
       <tuscany:binding.jsonrpc uri="/AccountJSONService" />
       <binding.ws
           wsdlElement="http://bigbank#wsdl.port(AccountService/AccountServiceSoap)" />
    </service>

    ...
</component>

The following SCA configuration shows bindings applied to component references. Again these bindings can be changed or augmented without changing the business logic.

<component name="AccountServiceComponent">
    ...
    <reference name="calculatorService">
       <tuscany:binding.rmi host="localhost" port="8099" serviceName="CalculatorRMIService" />
    </reference>

    <reference name="stockQuoteService">
       <binding.ws uri="http://localhost:8081/services/StockQuoteWebService" />
    </reference>
    ...
</component>
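
To round out the picture, here is a hedged sketch of how a composite like this is typically started and invoked with Tuscany's embedded Java runtime, using the SCADomain API from the Tuscany SCA 1.0 Java distribution. The component name matches the configuration above; the composite file name, the AccountService interface, and the invoked method are assumptions carried over from the earlier sketch.

// BigBankClient.java - illustrative launcher, not taken from the sample.
package bigbank.account;

import org.apache.tuscany.sca.host.embedded.SCADomain;

public class BigBankClient {

    public static void main(String[] args) {
        // Start an embedded SCA domain from a composite file
        // (file name assumed here; check the BigBank sample for the real one).
        SCADomain domain = SCADomain.newInstance("BigBank.composite");

        // Look up the service by the component name used in the composite.
        AccountService accountService =
                domain.getService(AccountService.class, "AccountServiceComponent");

        // Plain Java invocation; bindings and wiring are handled by the runtime
        // according to the composite configuration, not by this code.
        System.out.println("Total value: " + accountService.getTotalValue("Customer_01"));

        domain.close();
    }
}

Because the client talks to the component through its business interface, the bindings discussed above can be added or swapped in the composite without this client code changing.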

More Stories By Haleh Mahbod

Haleh Mahbod is a program director with IBM, managing the team contributing to Apache Tuscany as well as the SOA for PHP open source project. She has extensive development experience with database technologies and integration servers.

More Stories By Raymond Feng

Raymond Feng is a senior software engineer with IBM. He is now working on the Service Component Architecture (SCA) runtime implementation in the Apache Tuscany project as a committer. Raymond has been developing SOA solutions for more than four years and has been a key developer and team lead for WebSphere Process Server products since 2002.

More Stories By Simon Laws

Simon Laws is a member of the IBM Open Source SOA project team working with the open source Apache and PHP communities to build Java, C++, and PHP implementations of the Service Component Architecture (SCA) and Service Data Object (SDO) specifications. Prior to this role he was working in the distributed computing space building service-oriented solutions for customers with a particular interest in grid computing and virtualization.
