Microservices Expo: Article

BPEL Unleashed

Putting a modern business process execution standard to work

BPEL (Business Process Execution Language) makes business processes and composite Web services first-class citizens of the Java and .NET platforms, while preventing vendor lock-in. The result is a drastic reduction in the complexity, delivery time, and cost associated with implementing workflow, BPM (business process management), and related business integration projects.

BPEL is a new standard for implementing business processes in an emerging service-oriented architecture world. As such, applying BPEL introduces new considerations, challenges, and pitfalls for delivering process-aware applications based on a service-oriented architecture (SOA).

The Rise of BPEL
There has been a continuous need in the enterprise to integrate systems and applications into end-to-end business processes. Traditional integration solutions are arcane, proprietary, and expensive, and have failed to be widely adopted across the enterprise. This has driven the industry to a new business process execution standard called BPEL. A widespread group of technology companies have committed to supporting BPEL, including all the Java and .NET platform vendors, and even at this early stage, portability across vendor implementations is being demonstrated.

In less than two years since being unveiled, BPEL has become the de facto orchestration language standard, surpassing a number of alternative specifications such as BPML and WSCI. As Web services march toward mainstream adoption, focus is now shifting to the delivery of process-centric applications based on a service-oriented architecture. To a large extent, the creation of BPEL was driven by the need for flexibility, visibility, and ease of management at the process layer. Applying BPEL effectively separates process logic from the rest of the application.

Separating process logic from the rest of the application and making it explicit is similar in pattern to the earlier separation of data representation from application logic. BPEL is a standardized, semantically rich, process language based on the Web services stack. The XML-based nature of BPEL processes enables the editing of process logic using visual design and modeling tools. It also allows for progress tracking, aggregate reporting, and complete end-to-end management of deployed processes. Access to runtime process execution information can in turn be used to analyze and further optimize BPEL processes.
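To make this explicit XML representation concrete, the sketch below shows the skeleton of a minimal BPEL 1.1 process; the service, namespace, and message names are illustrative, not taken from any particular deployment. A receive activity creates a new process instance for each incoming request, and a reply returns the result over the same partner link.

```xml
<process name="loanApproval"
         targetNamespace="http://example.com/loan"
         xmlns="http://schemas.xmlsoap.org/ws/2003/03/business-process/"
         xmlns:lns="http://example.com/loan/wsdl">
  <partnerLinks>
    <!-- The client that initiates the process -->
    <partnerLink name="customer" partnerLinkType="lns:loanPartnerLT"
                 myRole="approver"/>
  </partnerLinks>
  <variables>
    <variable name="request"  messageType="lns:requestMessage"/>
    <variable name="approval" messageType="lns:approvalMessage"/>
  </variables>
  <sequence>
    <!-- createInstance="yes" makes this receive spawn a new process instance -->
    <receive partnerLink="customer" portType="lns:loanServicePT"
             operation="approve" variable="request" createInstance="yes"/>
    <!-- invokes, assigns, and flows implementing the process logic go here -->
    <reply partnerLink="customer" portType="lns:loanServicePT"
           operation="approve" variable="approval"/>
  </sequence>
</process>
```

Because the entire definition is XML, design tools and management consoles can parse, render, and track exactly the same artifact the engine executes.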

With BPEL promoting interoperability at the tools level, over time a more collaborative development process is possible, allowing for developers, business analysts, and other skilled professionals to engage where they fit best at various stages of design, development, deployment, and management. The gap between business analysts and software developers has been a well-known impediment to efficient and successful implementations of process-aware applications. BPEL as a standardized process orchestration language carries the promise of narrowing that gap and allowing for effective collaboration between these otherwise disjointed groups of professionals.

The Scope of BPEL
Although relatively new, BPEL essentially derives its maturity from its predecessor orchestration languages, XLANG and WSFL, which represent extensive implementation experience from both research and commercial deployment. That background largely mitigates the risk generally associated with betting on a new standard, and is augmented by the support of more than 100 members backing the progress of the BPEL specification through the WSBPEL technical committee at OASIS.

With a design goal of maintaining a clean specification, BPEL focuses on supporting composition and coordination of services into business processes and new compound services. To keep complexity in check, BPEL does not directly support BPM- and workflow-specific constructs, such as tasks or roles. The explicit XML-based representation of BPEL process logic allows BPM functions to be provided independently on top of the existing language capabilities without adding to the complexity of the language. Workflow functions can be similarly provided through the notion of infrastructure services, for example, a task service for handling human interaction with a BPEL process (see Figure 1).

In a clean-slate enterprise environment, all IT resources that need to be orchestrated into business processes would already be exposed as Web services. The prevailing reality is that this is seldom the case: legacy applications, as well as newer ones, are commonly exposed through a variety of proprietary interfaces and protocols. Supporting such environments in a pragmatic fashion becomes possible by using a WSDL binding framework (see Figure 2), such as Apache WSIF (Web Services Invocation Framework). Such a binding framework can support an arbitrary set of protocols in addition to SOAP, effectively decoupling resource connectivity from process design. The WSDL interface allows these resources to be viewed as Web services, ready for orchestration into a BPEL process, while native protocols are used to connect to them.
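As a sketch of how this decoupling can look in practice, the WSDL fragment below binds a port type to a plain Java class using the WSIF Java binding extension; the class name, operation name, and namespace prefixes are hypothetical. A BPEL process invoking this port sees only the WSDL interface and is unaware that the call is a local Java invocation rather than a SOAP exchange.

```xml
<!-- Hypothetical WSDL fragment using the WSIF Java binding extension -->
<binding name="InventoryJavaBinding" type="tns:InventoryPortType"
         xmlns:java="http://schemas.xmlsoap.org/wsdl/java/">
  <java:binding/>
  <operation name="checkStock">
    <!-- Map the abstract WSDL operation to a Java method -->
    <java:operation methodName="checkStock"/>
  </operation>
</binding>

<service name="InventoryService">
  <port name="InventoryJavaPort" binding="tns:InventoryJavaBinding">
    <!-- The implementation class WSIF instantiates at invocation time -->
    <java:address className="com.example.inventory.InventoryImpl"/>
  </port>
</service>
```

Swapping this binding for a SOAP, JMS, or EJB binding later requires no change to the process definition itself.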

The BPEL Environment
Business integration scenarios that are appropriate to implement in BPEL commonly include several of the following technical requirements:

  • Access heterogeneous systems
  • Multiparty data transformations
  • Asynchronous interactions (state management, correlation)
  • Parallel processing (sophisticated join patterns)
  • Compensation logic ("Undo" operations)
  • Exception management
  • Reliability and scalability (high performance)
  • Operations management (auditing, monitoring)
  • Change management (side-by-side versioning)
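Two of these requirements, asynchronous correlation and compensation, are worth illustrating because they are tedious to hand-code yet native to BPEL. The sketch below (BPEL 1.1 syntax, with hypothetical partner, property, and variable names) correlates an asynchronous callback to the right process instance by order ID, and attaches a compensation handler that undoes the shipment if a later step fails.

```xml
<correlationSets>
  <!-- cor:orderId is a WSDL property aliased to a field in the messages -->
  <correlationSet name="orderCorrelation" properties="cor:orderId"/>
</correlationSets>

<scope name="shipment">
  <!-- Runs only if the enclosing process later compensates this scope -->
  <compensationHandler>
    <invoke partnerLink="shipping" portType="sp:shippingPT"
            operation="cancelShipment" inputVariable="shipRequest"/>
  </compensationHandler>
  <sequence>
    <invoke partnerLink="shipping" portType="sp:shippingPT"
            operation="requestShipment" inputVariable="shipRequest">
      <correlations>
        <!-- Stamp the outgoing request with this instance's order ID -->
        <correlation set="orderCorrelation" initiate="yes" pattern="out"/>
      </correlations>
    </invoke>
    <!-- The engine routes the asynchronous callback to this instance
         by matching the order ID carried in the incoming message -->
    <receive partnerLink="shipping" portType="sp:shippingCallbackPT"
             operation="shipmentConfirmed" variable="shipConfirmation">
      <correlations>
        <correlation set="orderCorrelation"/>
      </correlations>
    </receive>
  </sequence>
</scope>
```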
BPEL is quickly becoming a mandatory requirement for customers evaluating BPM and workflow solutions because it inherently addresses many of these technical requirements; mature commercial BPEL server implementations address the rest. Considering the typical shortcomings of existing BPM and workflow solutions (high cost, complexity, and unavoidable vendor lock-in), the benefits of BPEL far outweigh the risks associated with adopting an emerging standard. BPEL avoids these issues by drastically reducing the skill level required for implementing process logic and by fitting seamlessly into the interoperable Web services stack. Most importantly, BPEL provides a portable process-logic representation with a growing number of vendors to choose from for process execution.

Unlike existing BPM and workflow solutions, BPEL frees customers from having to choose an all-in-one solution and naturally leads to component choice and the utilization of standards-based architectures. Sun's JSR 208 (aka Java Business Integration, www.jcp.org/en/jsr/detail?id=208) describes one such blueprint that includes a BPEL process engine in a best-of-breed integrated framework for business integration. JSR 208 demonstrates that BPEL can elegantly integrate with existing infrastructure, in particular with the Java/J2EE platform. Finally, BPEL relies on XML as a universal data model for message exchange between a process and its related services. This further enables BPEL to interoperate with a wide variety of value-add application components, most of which now support XML for communication, e.g., rules engines, transformation services, and more.
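Because every message is XML, moving data between a process and its services reduces to declarative manipulation of XML variables. The assign sketch below (variable names, parts, and namespace prefixes are illustrative) copies a field from an incoming order message into the request for a hypothetical CRM lookup service using XPath:

```xml
<assign>
  <copy>
    <!-- Select the customer ID out of the order payload -->
    <from variable="orderRequest" part="payload"
          query="/ord:order/ord:customerId"/>
    <!-- Place it into the CRM lookup request -->
    <to variable="crmRequest" part="payload"
        query="/crm:customerLookup/crm:id"/>
  </copy>
</assign>
```

The same mechanism feeds XML-speaking value-add components such as rules engines and transformation services without custom glue code.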

The introduction of Web services is creating a material change in the integration space, mostly due to the success and ROI of numerous Web services projects. Experience shows that the technologies for implementing a proper service-oriented architecture exist and work. The important aspect of this architecture is that it allows an organization to maintain a focus on solving business problems and not just application problems. From this perspective, solutions to business problems can now be addressed and implemented in a consistent and methodical fashion using this architectural framework.

It's important to note the distinction between Web services and a service-oriented architecture. SOA is truly just an architecture, independent of any particular set of technologies, including Web services. It is an application architecture in which functions are exposed as loosely coupled (read: independent) services with well-defined interfaces that can be called in specified sequences to form business processes. Web services, at the moment, are the set of standards and technologies (such as SOAP, WSDL, and related XML standards) that collectively serve as the foundation for implementing SOAs.

With the above definition, it is apparent that SOAs will serve to address integration problems, leveraging services as building blocks. These building blocks are composed and coordinated to form business processes, which defines the role of BPEL in this architecture. The reusability and flexibility of services allow integration problems to be approached from a top-down perspective rather than the traditional bottom-up one. Consequently, integration based on SOA can be tackled one problem at a time, rather than requiring the upfront deployment of expensive and complex integration infrastructure throughout an enterprise before any integration issue can be addressed.

Services are then created to address specific integration problems and can later be reused to address new problems. Over time, collections of existing reusable services can be used to fully address an organization's integration problems. In that light, the value of SOAs grows incrementally as services are implemented and added as assets. Furthermore, with composition and coordination of services enabled by BPEL, functionality can be encapsulated at multiple levels to create more valuable services that can be leveraged across a wide range of new applications.
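This multi-level composition is expressed directly in the language. In the sketch below (again with illustrative names), a flow activity invokes two existing services in parallel; wrapping the enclosing process in its own WSDL interface turns the pair into a single, reusable compound service that new applications can call like any other service.

```xml
<flow>
  <!-- Both invocations run concurrently; the flow completes
       only when both branches have finished (a parallel join) -->
  <invoke name="checkCredit" partnerLink="creditBureau"
          portType="cb:creditPT" operation="check"
          inputVariable="creditRequest" outputVariable="creditResult"/>
  <invoke name="reserveStock" partnerLink="warehouse"
          portType="wh:stockPT" operation="reserve"
          inputVariable="stockRequest" outputVariable="stockResult"/>
</flow>
```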

BPEL in Use
Commercial support for and implementations of the BPEL standard are rising at a fast pace, as evidenced by recent product announcements by leading platform vendors, EAI vendors, and startups. BPEL engine implementations span both the Java and .NET platforms, while other vendors have added support for BPEL code generation from visual models created in their tool environments. Customer inquiries indicate that a growing number of end users are evaluating the use of BPEL for mission-critical projects. Such evaluations typically involve a technical hands-on product evaluation, a proof of concept, and sometimes an initial production deployment to prove the maturity and ROI of the approach. At this stage of adoption, many prospects are still discovering the proper criteria for evaluating BPEL tools and engines, and treat their early BPEL implementations as a stepping stone to more mission-critical applications.

However, as is commonly the case, certain industries are quicker than others to adopt new standards and technologies; these include financial services, telecommunications, and various providers of services to business and government, to name a few. Early examples of the use of BPEL in commercial applications include:

  • Resource management workflows:
    - Company: ISV offering a workplace resource management solution to Fortune 500 companies
    - Operation: BPEL server bundled with the product offering, using BPEL processes to manage workflows
  • New policy issuance process:
    - Company: Fortune 50 insurance and annuities issuer
    - Operation: Selling policies to consumers through brokers and agents
  • Service provisioning process:
    - Company: Information services provider (to global space agencies)
    - Operation: Supplying satellite data to commercial and government customers
  • Enterprise-wide integration standards:
    - Company: Multinational consumer product manufacturing company
    - Operation: Low-cost, simplified integration solution for online markets
The common drivers for automation in the above use cases are reduction of cycle times and processing costs, as well as minimizing or eliminating manual coordination tasks (which commonly contribute to less efficient, error-prone processing). The choice of BPEL in all of these use cases was driven by both business and technical requirements.

Business requirements include:

  • Lowest cost of acquisition and ownership
  • Process definitions expressed in a standard way to avoid vendor lock-in
  • Support for both automated and human workflow

Technical requirements include:

  • Support for Web services standards and XML
  • Scalability and reliability
  • Portability
  • Reusability
  • Easy modifiability
The Road Ahead
BPEL is on its way to becoming the cornerstone of SOA implementations in an enterprise environment, responsible for coordinating disparate resources and services into end-to-end business processes. The explicit XML-based representation of the BPEL process allows for extensive management, reporting, and analysis functions to be added on top of the process execution layer. These functions can be provided by the BPEL engine vendor of choice, or by ISVs who provide value-add components that can enhance an overall solution structured around a BPEL engine. While some of these value-added functions are available in commercial BPEL implementations today, we believe that this is just the tip of the iceberg. The next 12 months should be very interesting and offer significant ROI opportunities for the companies that are ready to move to business process management with BPEL.

More Stories By Doron Sherman

Doron Sherman is the CTO of Collaxa, Inc., a Web Service Orchestration Server vendor and a BEA Partner located in Redwood Shores, California. He has been involved with Java since its early days and pioneered application server technology while being a founder and chief scientist at NetDynamics.

