BPEL Unleashed

Putting a modern business process execution standard to work

BPEL (Business Process Execution Language) makes business processes and composite Web services first-class citizens of the Java and .NET platforms, while preventing vendor lock-in. The result is a drastic reduction in the complexity, delivery time, and cost associated with implementing workflow, BPM (business process management), and related business integration projects.

BPEL is a new standard for implementing business processes in an emerging service-oriented architecture world. As such, applying BPEL introduces new considerations, challenges, and pitfalls for delivering process-aware applications based on a service-oriented architecture (SOA).

The Rise of BPEL
There has been a continuous need in the enterprise to integrate systems and applications into end-to-end business processes. Traditional integration solutions are arcane, proprietary, and expensive, and have failed to be widely adopted across the enterprise. This has driven the industry to a new business process execution standard called BPEL. A broad group of technology companies, including all the Java and .NET platform vendors, has committed to supporting BPEL, and even at this early stage, portability across vendor implementations is being demonstrated.

In less than two years since being unveiled, BPEL has become the de facto orchestration language standard, displacing a number of alternative specifications such as BPML and WSCI. As Web services march toward mainstream adoption, focus is now shifting to the delivery of process-centric applications based on a service-oriented architecture. To a large extent, the creation of BPEL was driven by the need to facilitate flexibility, visibility, and ease of management at the process layer. Applying BPEL effectively separates process logic from the rest of the application.

Separating process logic from the rest of the application and making it explicit is similar in pattern to the earlier separation of data representation from application logic. BPEL is a standardized, semantically rich, process language based on the Web services stack. The XML-based nature of BPEL processes enables the editing of process logic using visual design and modeling tools. It also allows for progress tracking, aggregate reporting, and complete end-to-end management of deployed processes. Access to runtime process execution information can in turn be used to analyze and further optimize BPEL processes.
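
To make this concrete, here is a minimal sketch of a BPEL process in BPEL 1.1 syntax; the loan-approval service, partner link, and message names are all hypothetical. The entire process definition is plain XML, which is what makes it amenable to visual tooling, deployment-time introspection, and runtime tracking:

    <process name="loanApproval"
             targetNamespace="http://example.com/loan"
             xmlns="http://schemas.xmlsoap.org/ws/2003/03/business-process/"
             xmlns:lns="http://example.com/loan">
      <partnerLinks>
        <!-- The client that starts the process and receives the reply -->
        <partnerLink name="client" partnerLinkType="lns:loanLT" myRole="approver"/>
      </partnerLinks>
      <variables>
        <variable name="request" messageType="lns:approvalRequestMessage"/>
        <variable name="response" messageType="lns:approvalResponseMessage"/>
      </variables>
      <sequence>
        <!-- A new process instance is created for each incoming request -->
        <receive partnerLink="client" portType="lns:loanApprovalPT"
                 operation="approve" variable="request" createInstance="yes"/>
        <!-- Real process logic would go here; this sketch simply approves -->
        <assign>
          <copy>
            <from expression="'approved'"/>
            <to variable="response" part="accept"/>
          </copy>
        </assign>
        <reply partnerLink="client" portType="lns:loanApprovalPT"
               operation="approve" variable="response"/>
      </sequence>
    </process>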

With BPEL promoting interoperability at the tools level, a more collaborative development process becomes possible over time, allowing developers, business analysts, and other skilled professionals to engage where they fit best at various stages of design, development, deployment, and management. The gap between business analysts and software developers has been a well-known impediment to efficient and successful implementations of process-aware applications. As a standardized process orchestration language, BPEL carries the promise of narrowing that gap and enabling effective collaboration between these otherwise disjointed groups of professionals.

The Scope of BPEL
Although relatively new, BPEL essentially derives its maturity from its predecessor orchestration languages, XLANG and WSFL, which represent vast implementation experience gathered through research and commercial deployment. That background largely mitigates the risk generally associated with betting on a new standard, and it is augmented by the support of 100+ members backing the progress of the BPEL specification through the WSBPEL technical committee at OASIS.

With a design goal of maintaining a clean specification, BPEL focuses on supporting composition and coordination of services into business processes and new compound services. To keep complexity in check, BPEL does not directly support BPM- and workflow-specific constructs, such as tasks or roles. The explicit XML-based representation of BPEL process logic allows BPM functions to be provided independently on top of the existing language capabilities without adding to the complexity of the language. Workflow functions can be similarly provided through the notion of infrastructure services, for example, a task service for handling human interaction with a BPEL process (see Figure 1).
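
As an illustration of that approach, a process can hand a work item to a task service and wait for a callback when a person completes it. The sketch below uses an invented task-service partner link, port types, and operations; nothing here is part of the BPEL language itself, which is exactly the point:

    <sequence>
      <!-- Ask the (hypothetical) task service to create a human work item -->
      <invoke partnerLink="taskService" portType="task:TaskManagerPT"
              operation="initiateTask" inputVariable="taskRequest"/>
      <!-- Block until the task service calls back with the outcome -->
      <receive partnerLink="taskService" portType="task:TaskCallbackPT"
               operation="onTaskCompleted" variable="taskResult"/>
    </sequence>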

In a clean-slate enterprise environment, all IT resources that need to be orchestrated into business processes would already be exposed as Web services. The prevailing reality is that this is seldom the case: existing legacy applications as well as newer applications are commonly exposed through a variety of proprietary interfaces and protocols. Supporting such environments in a pragmatic fashion becomes possible by using a WSDL binding framework (see Figure 2), such as Apache WSIF (Web Services Invocation Framework). Such a binding framework can support an arbitrary set of protocols in addition to SOAP, effectively decoupling resource connectivity from process design. The WSDL interface allows these resources to be viewed as Web services, ready for orchestration into a BPEL process, while native protocols are used to connect to them.
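
The WSDL fragment below sketches this idea, roughly following the Apache WSIF Java binding extension: the same portType that could carry a SOAP binding is bound instead to a local Java class, so the process sees an ordinary Web service while the call is dispatched natively. The class, port type, and operation names are hypothetical, and the exact extension elements should be checked against the WSIF documentation:

    <binding name="InventoryJavaBinding" type="tns:InventoryPT"
             xmlns:java="http://schemas.xmlsoap.org/wsdl/java/"
             xmlns:format="http://schemas.xmlsoap.org/wsdl/formatbinding/">
      <java:binding/>
      <format:typeMapping encoding="Java" style="Java">
        <format:typeMap typeName="xsd:string" formatType="java.lang.String"/>
      </format:typeMapping>
      <operation name="checkStock">
        <!-- Maps the abstract operation onto a method of a local class -->
        <java:operation methodName="checkStock"/>
      </operation>
    </binding>

    <service name="InventoryService">
      <port name="InventoryJavaPort" binding="tns:InventoryJavaBinding">
        <!-- The native "address" is an implementation class, not a URL -->
        <java:address className="com.example.inventory.InventoryImpl"/>
      </port>
    </service>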

The BPEL Environment
Business integration scenarios that are appropriate to implement in BPEL commonly include several of the following technical requirements:

  • Access to heterogeneous systems
  • Multiparty data transformations
  • Asynchronous interactions (state management, correlation)
  • Parallel processing (sophisticated join patterns; see the sketch after this list)
  • Compensation logic ("undo" operations; also sketched below)
  • Exception management
  • Reliability and scalability (high performance)
  • Operations management (auditing, monitoring)
  • Change management (side-by-side versioning)
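
As a brief sketch of two of these requirements in BPEL terms, the fragment below runs two invocations in parallel inside a flow and attaches a compensationHandler to a scope so the invoice step can be undone if the process later faults; the partner links, port types, and variables are hypothetical:

    <flow>
      <!-- Both branches run in parallel; the flow joins when both complete -->
      <invoke partnerLink="creditService" portType="svc:CreditCheckPT"
              operation="checkCredit" inputVariable="creditRequest"
              outputVariable="creditResult"/>
      <scope>
        <compensationHandler>
          <!-- The "undo" for the invoice created below -->
          <invoke partnerLink="billingService" portType="svc:BillingPT"
                  operation="cancelInvoice" inputVariable="invoiceId"/>
        </compensationHandler>
        <invoke partnerLink="billingService" portType="svc:BillingPT"
                operation="createInvoice" inputVariable="order"
                outputVariable="invoiceId"/>
      </scope>
    </flow>
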
BPEL is quickly becoming a mandatory requirement for customers evaluating BPM and workflow solutions, as it inherently addresses many of these technical requirements; the remainder are supported by any mature commercial BPEL server implementation. The benefits of BPEL far outweigh the risks associated with adopting an emerging standard, considering the typical shortcomings of existing BPM and workflow solutions: high cost, complexity, and unavoidable vendor lock-in. BPEL avoids these issues by drastically reducing the skill level required for implementing process logic and by fitting seamlessly into the interoperable Web services stack. Most importantly, BPEL provides a portable process-logic representation with a growing number of vendors to choose from for BPEL process execution.

Unlike existing BPM and workflow solutions, BPEL frees customers from having to choose an all-in-one solution and naturally leads to component choice and the utilization of standards-based architectures. Sun's JSR 208 (aka Java Business Integration, www.jcp.org/en/jsr/detail?id=208) describes one such blueprint that includes a BPEL process engine in a best-of-breed integrated framework for business integration. JSR 208 demonstrates that BPEL can elegantly integrate with existing infrastructure, in particular with the Java/J2EE platform. Finally, BPEL relies on XML as a universal data model for message exchange between a process and its related services. This further enables BPEL to interoperate with a wide variety of value-add application components, most of which now support XML for communication, e.g., rules engines, transformation services, and more.
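
For example, because every message in a BPEL process is XML, moving data between differently shaped messages reduces to an assign activity with XPath queries; the variable, part, and element names below are hypothetical:

    <assign>
      <copy>
        <!-- Pull a field out of one XML message and place it in another -->
        <from variable="orderMessage" part="payload" query="/order/customer/id"/>
        <to variable="crmRequest" part="payload" query="/lookupRequest/customerId"/>
      </copy>
    </assign>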

BPEL and SOA
The introduction of Web services is creating a material change in the integration space, mostly due to the success and ROI of numerous Web services projects. Experience shows that the technologies for implementing a proper service-oriented architecture exist and work. The important aspect of this architecture is that it allows an organization to maintain a focus on solving business problems and not just application problems. From this perspective, solutions to business problems can now be addressed and implemented in a consistent and methodical fashion using this architectural framework.

It's important to note that there is a distinction between Web services and a service-oriented architecture. SOA is truly just an architecture and is independent of any particular set of technologies, including Web services. It is defined as an application architecture where functions are defined as loosely coupled (read: independent) services, with well-defined interfaces that can be called in specified sequences to form business processes. Web services, at the moment, act as the set of standards and technologies (such as SOAP, WSDL, and related XML standards) that collectively serve as the foundation for implementing SOAs.
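
In practice, a well-defined interface in this sense is a WSDL portType. The hypothetical fragment below is the entire contract a caller, including a BPEL process, depends on; the message definitions are omitted for brevity:

    <portType name="QuotePT">
      <!-- One synchronous request/response operation -->
      <operation name="getQuote">
        <input message="tns:getQuoteRequest"/>
        <output message="tns:getQuoteResponse"/>
      </operation>
    </portType>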

With the above definition, it is apparent that SOAs will serve to address integration problems, leveraging services as building blocks. These building blocks will be composed and coordinated to form business processes, which defines the role of BPEL in this architecture. The reusability and flexibility of services leads to approaching integration problems from a top-down perspective rather than the traditional bottom-up one. Consequently, integration based on SOA can be addressed one problem at a time, rather than having to deploy expensive and complex integration infrastructure throughout an enterprise before addressing any integration issue.

Services are then created to address specific integration problems and can later be reused to address new problems. Over time, collections of existing reusable services can be used to fully address an organization's integration problems. In that light, the value of SOAs grows incrementally as services are implemented and added as assets. Furthermore, with composition and coordination of services enabled by BPEL, functionality can be encapsulated at multiple levels to create more valuable services that can be leveraged across a wide range of new applications.

BPEL in Use
Commercial support for and implementations of the BPEL standard are rising at a fast pace, as evidenced by recent product announcements from leading platform vendors, EAI vendors, and startups. BPEL engine implementations span both the Java and .NET platforms, while other vendors have added support for BPEL code generation from visual models created in their tool environments. Customer inquiries indicate that a growing number of end users are evaluating the use of BPEL for mission-critical projects. Such evaluations typically involve a hands-on technical product evaluation, a proof of concept, and sometimes an initial production deployment to prove the maturity and ROI of the approach. At this stage of adoption, many prospects are still discovering the proper criteria for evaluating BPEL tools and engines, and are treating their early BPEL implementations as stepping stones to more mission-critical applications.

However, as is commonly the case, certain industries are quicker than others to adopt new standards and technologies; these include financial services, telecommunications, and various service providers to business and government entities. Early examples of the use of BPEL in commercial applications include:

  • Resource management workflows:
    - Company: ISV offering a workplace resource management solution to Fortune 500 companies
    - Operation: BPEL server bundled with the product offering, using BPEL processes to manage workflows
  • New policy issuance process:
    - Company: Fortune 50 insurance and annuities issuer
    - Operation: Selling policies to consumers through brokers and agents
  • Service provisioning process:
    - Company: Information services provider to global space agencies
    - Operation: Supplying satellite data to commercial and government customers
  • Enterprise-wide integration standards:
    - Company: Multinational consumer product manufacturing company
    - Operation: Low-cost, simplified integration solution for online markets

The common drivers for automation in the above use cases are reductions in cycle times and processing costs, as well as the minimization or elimination of manual coordination tasks (which commonly contribute to inefficient, error-prone processing). The choice of BPEL in all of these use cases was driven by both business and technical requirements.

Business requirements include:

  • Lowest cost of acquisition and ownership
  • Implementation of processes in a standard way to avoid vendor lock-in
  • Support for both automated and human workflow

Technical requirements include:

  • Support for Web services standards and XML
  • Scalability and reliability
  • Portability
  • Reusability
  • Easy modifiability

The Road Ahead
BPEL is on its way to becoming the cornerstone of SOA implementations in an enterprise environment, responsible for coordinating disparate resources and services into end-to-end business processes. The explicit XML-based representation of the BPEL process allows for extensive management, reporting, and analysis functions to be added on top of the process execution layer. These functions can be provided by the BPEL engine vendor of choice, or by ISVs who provide value-add components that can enhance an overall solution structured around a BPEL engine. While some of these value-added functions are available in commercial BPEL implementations today, we believe that this is just the tip of the iceberg. The next 12 months should be very interesting and offer significant ROI opportunities for the companies that are ready to move to business process management with BPEL.

More Stories By Doron Sherman

Doron Sherman is the CTO of Collaxa, Inc., a Web service orchestration server vendor and a BEA partner located in Redwood Shores, California. He has been involved with Java since its early days and pioneered application server technology as a founder and chief scientist at NetDynamics.
