BPEL Unleashed

Putting a modern business process execution standard to work

BPEL (Business Process Execution Language) makes business processes and composite Web services first-class citizens of the Java and .NET platforms, while preventing vendor lock-in. The result is a drastic reduction in the complexity, delivery time, and cost associated with implementing workflow, BPM (business process management), and related business integration projects.

BPEL is a new standard for implementing business processes in an emerging service-oriented architecture world. As such, applying BPEL introduces new considerations, challenges, and pitfalls for delivering process-aware applications based on a service-oriented architecture (SOA).

The Rise of BPEL
There has been a continuous need in the enterprise to integrate systems and applications into end-to-end business processes. Traditional integration solutions are arcane, proprietary, and expensive, and have failed to achieve wide adoption across the enterprise. This has driven the industry to a new business process execution standard called BPEL. A broad group of technology companies has committed to supporting BPEL, including all the Java and .NET platform vendors, and even at this early stage, portability across vendor implementations is being demonstrated.

In less than two years since being unveiled, BPEL has become the de facto orchestration language standard, eclipsing a number of alternative specifications such as BPML and WSCI. As Web services march toward mainstream adoption, focus is now shifting to the delivery of process-centric applications based on a service-oriented architecture. To a large extent, the creation of BPEL was driven by the need for flexibility, visibility, and ease of management at the process layer. Applying BPEL effectively separates process logic from the rest of the application.

Separating process logic from the rest of the application and making it explicit is similar in pattern to the earlier separation of data representation from application logic. BPEL is a standardized, semantically rich, process language based on the Web services stack. The XML-based nature of BPEL processes enables the editing of process logic using visual design and modeling tools. It also allows for progress tracking, aggregate reporting, and complete end-to-end management of deployed processes. Access to runtime process execution information can in turn be used to analyze and further optimize BPEL processes.
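As an illustration of that explicit, XML-based representation, a minimal BPEL process might look like the following sketch. The process name, namespaces, partner services, and message types are hypothetical, using element names from the BPEL4WS 1.1 era:

```xml
<!-- Hypothetical loan-approval process; element names follow BPEL4WS 1.1 -->
<process name="loanApproval"
         targetNamespace="http://example.com/loan"
         xmlns="http://schemas.xmlsoap.org/ws/2003/03/business-process/"
         xmlns:lns="http://example.com/loan/wsdl">
  <partnerLinks>
    <!-- The client that starts the process and the back-end approval service -->
    <partnerLink name="customer" partnerLinkType="lns:loanLT" myRole="approver"/>
    <partnerLink name="approver" partnerLinkType="lns:approverLT" partnerRole="approver"/>
  </partnerLinks>
  <variables>
    <variable name="request"  messageType="lns:creditInformationMessage"/>
    <variable name="approval" messageType="lns:approvalMessage"/>
  </variables>
  <sequence>
    <!-- A new process instance is created when a request message arrives -->
    <receive partnerLink="customer" portType="lns:loanApprovalPT"
             operation="approve" variable="request" createInstance="yes"/>
    <!-- Delegate the decision to the approval Web service -->
    <invoke partnerLink="approver" portType="lns:approverPT"
            operation="approve" inputVariable="request" outputVariable="approval"/>
    <!-- Return the result to the caller -->
    <reply partnerLink="customer" portType="lns:loanApprovalPT"
           operation="approve" variable="approval"/>
  </sequence>
</process>
```

Because the process is plain XML, design tools, monitors, and reporting components can all read and manipulate the same artifact.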

With BPEL promoting interoperability at the tools level, over time a more collaborative development process is possible, allowing for developers, business analysts, and other skilled professionals to engage where they fit best at various stages of design, development, deployment, and management. The gap between business analysts and software developers has been a well-known impediment to efficient and successful implementations of process-aware applications. BPEL as a standardized process orchestration language carries the promise of narrowing that gap and allowing for effective collaboration between these otherwise disjointed groups of professionals.

The Scope of BPEL
Although relatively new, BPEL derives much of its maturity from its predecessor orchestration languages, XLANG and WSFL, which represent extensive implementation experience gained through research and commercial deployment. That background largely mitigates the risk generally associated with betting on a new standard and is augmented by the support of more than 100 members backing the progress of the BPEL specification through the WSBPEL technical committee at OASIS.

With a design goal of maintaining a clean specification, BPEL focuses on supporting composition and coordination of services into business processes and new compound services. To keep complexity in check, BPEL does not directly support BPM- and workflow-specific constructs, such as tasks or roles. The explicit XML-based representation of BPEL process logic allows BPM functions to be provided independently on top of the existing language capabilities without adding to the complexity of the language. Workflow functions can be similarly provided through the notion of infrastructure services, for example, a task service for handling human interaction with a BPEL process (see Figure 1).
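As a sketch of the infrastructure-service idea, a process could hand a work item to a task service and suspend until a callback arrives when the human completes it. The TaskService port types and operations below are illustrative; they are not part of the BPEL language itself:

```xml
<!-- Human interaction modeled as an ordinary asynchronous service exchange -->
<sequence>
  <!-- Hand the work item to an external task service (hypothetical interface) -->
  <invoke partnerLink="taskService" portType="ts:TaskServicePT"
          operation="createTask" inputVariable="taskRequest"/>
  <!-- Suspend until the task service calls back with the human's decision -->
  <receive partnerLink="taskService" portType="ts:TaskCallbackPT"
           operation="onTaskCompleted" variable="taskResult"/>
</sequence>
```

The language stays small: the engine sees only an invoke and a receive, while all task semantics (assignment, escalation, deadlines) live in the task service.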

In a clean slate enterprise environment, all IT resources that need to be orchestrated into business processes would already be exposed as Web services. The prevailing reality is that this is seldom the case and existing legacy applications as well as newer applications are commonly exposed through a variety of proprietary interfaces and protocols. Supporting such environments in a pragmatic fashion becomes possible by using a WSDL binding framework (see Figure 2), such as Apache WSIF (Web Services Invocation Framework). This binding framework can support an arbitrary set of protocols in addition to SOAP, effectively decoupling resource connectivity from process design. The WSDL interface allows such resources to be viewed as Web services, ready for orchestration into a BPEL process, while native protocols are used to connect to them.
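As a rough sketch of that decoupling, WSIF allows a WSDL port type to be bound to a plain Java class instead of a SOAP endpoint. The port type, class, and operation names below are hypothetical; the binding element names follow the Apache WSIF Java binding extension:

```xml
<!-- WSDL fragment: the same port type, bound to a local Java class via WSIF -->
<binding name="QuoteJavaBinding" type="tns:QuotePT"
         xmlns:java="http://schemas.xmlsoap.org/wsdl/java/"
         xmlns:format="http://schemas.xmlsoap.org/wsdl/formatbinding/">
  <java:binding/>
  <format:typeMapping encoding="Java" style="Java">
    <format:typeMap typeName="xsd:string" formatType="java.lang.String"/>
  </format:typeMapping>
  <operation name="getQuote">
    <java:operation methodName="getQuote"/>
  </operation>
</binding>

<service name="QuoteService">
  <port name="QuoteJavaPort" binding="tns:QuoteJavaBinding">
    <!-- Native connectivity: WSIF invokes this class directly, no SOAP involved -->
    <java:address className="com.example.QuoteImpl"/>
  </port>
</service>
```

The BPEL process references only the abstract port type, so swapping this Java binding for a SOAP or JMS binding requires no change to the process definition.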

The BPEL Environment
Business integration scenarios that are appropriate to implement in BPEL commonly include several of the following technical requirements:

  • Access heterogeneous systems
  • Multiparty data transformations
  • Asynchronous interactions (state management, correlation)
  • Parallel processing (sophisticated join patterns)
  • Compensation logic ("Undo" operations)
  • Exception management
  • Reliability and scalability (high performance)
  • Operations management (auditing, monitoring)
  • Change management (side-by-side versioning)
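To make the asynchronous-interaction requirement concrete, BPEL routes messages back to long-running process instances using correlation sets. In the sketch below, the property, partner, and message names are illustrative; the correlation syntax follows BPEL4WS 1.1:

```xml
<!-- orderId is declared elsewhere as a WSDL message property with aliases -->
<correlationSets>
  <correlationSet name="OrderCorrelation" properties="tns:orderId"/>
</correlationSets>

<sequence>
  <!-- The outgoing request stamps this instance with its order ID -->
  <invoke partnerLink="supplier" portType="sp:SupplierPT"
          operation="placeOrder" inputVariable="order">
    <correlations>
      <correlation set="OrderCorrelation" initiate="yes" pattern="out"/>
    </correlations>
  </invoke>
  <!-- Hours or days later, the callback is routed to the same instance -->
  <receive partnerLink="supplier" portType="sp:CallbackPT"
           operation="orderShipped" variable="shipNotice">
    <correlations>
      <correlation set="OrderCorrelation" initiate="no"/>
    </correlations>
  </receive>
</sequence>
```

The engine persists the correlation value with the instance state, which is what allows conversations to span hours or days without the process holding any connection open.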
BPEL is quickly becoming a mandatory requirement for customers evaluating BPM and workflow solutions as it inherently addresses many of these technical requirements and others that are supported by any mature commercial BPEL server implementation. The benefits of BPEL far outweigh the risks associated with adopting an emerging standard, considering the typical shortcomings of existing BPM and workflow solutions: high cost, complexity, and unavoidable vendor lock-in. BPEL avoids these issues by drastically reducing the skill level required for implementing process logic and seamlessly fitting into the interoperable Web services stack. Most importantly, BPEL provides a portable process-logic representation with a growing number of vendors to choose from for BPEL process execution.

Unlike existing BPM and workflow solutions, BPEL frees customers from having to choose an all-in-one solution and naturally leads to component choice and the utilization of standards-based architectures. Sun's JSR 208 (aka Java Business Integration, www.jcp.org/en/jsr/detail?id=208) describes one such blueprint that includes a BPEL process engine in a best-of-breed integrated framework for business integration. JSR 208 demonstrates that BPEL can elegantly integrate with existing infrastructure, in particular with the Java/J2EE platform. Finally, BPEL relies on XML as a universal data model for message exchange between a process and its related services. This further enables BPEL to interoperate with a wide variety of value-add application components, most of which now support XML for communication, e.g., rules engines, transformation services, and more.

BPEL and SOA
The introduction of Web services is creating a material change in the integration space, mostly due to the success and ROI of numerous Web services projects. Experience shows that the technologies for implementing a proper service-oriented architecture exist and work. The important aspect of this architecture is that it allows an organization to maintain a focus on solving business problems and not just application problems. From this perspective, solutions to business problems can now be addressed and implemented in a consistent and methodical fashion using this architectural framework.

It's important to note that there is a distinction between Web services and a service-oriented architecture. SOA is truly just an architecture and is independent of any particular set of technologies, including Web services. It is defined as an application architecture where functions are defined as loosely coupled (read: independent) services, with well-defined interfaces that can be called in specified sequences to form business processes. Web services, at the moment, act as the set of standards and technologies (such as SOAP, WSDL, and related XML standards) that collectively serve as the foundation for implementing SOAs.

With the above definition, it is apparent that SOAs will serve to address integration problems, leveraging services as building blocks. These building blocks will be composed and coordinated to form business processes, defining the role of BPEL in this architecture. The reusability and flexibility of services leads to addressing integration problems from a top-down perspective rather than the traditional bottom-up one. Consequently, integration based on SOA can be tackled one problem at a time, rather than having to deploy expensive and complex integration infrastructure throughout an enterprise before addressing any integration issue.

Services are then created to address specific integration problems and can later be reused to address new problems. Over time, collections of existing reusable services can be used to fully address an organization's integration problems. In that light, the value of SOAs grows incrementally as services are implemented and added as assets. Furthermore, with composition and coordination of services enabled by BPEL, functionality can be encapsulated at multiple levels to create more valuable services that can be leveraged across a wide range of new applications.

BPEL in Use
Commercial support for and implementations of the BPEL standard are rising at a fast pace, as evidenced by recent product announcements by leading platform vendors, EAI vendors, and startups. BPEL engine implementations span both the Java and .NET platforms, while other vendors have added support for BPEL code generation from visual models created in their tool environments. Customer inquiries indicate that a growing number of end users are evaluating the use of BPEL for mission-critical projects. Such evaluations typically involve a technical hands-on product evaluation, a proof of concept, and sometimes an initial production deployment to prove the maturity and ROI of the approach. At this stage of adoption, many prospects are still discovering the proper criteria for evaluating BPEL tools and engines, and are treating their early BPEL implementations as stepping stones to more mission-critical applications.

However, as is commonly the case, certain industries are quicker than others to adopt new standards and technologies. These include financial services, telecommunications, and various service providers to both business and government entities, to name a few. Early examples of the use of BPEL in commercial applications include:

  • Resource management workflows:
    - Company: ISV offering a workplace resource management solution to Fortune 500 companies
    - Operation: BPEL server bundled with the product offering, using BPEL processes to manage workflows
  • New policy issuance process:
    - Company: Fortune 50 insurance and annuities issuer
    - Operation: Selling policies to consumers through brokers and agents
  • Service provisioning process:
    - Company: Information services provider to global space agencies
    - Operation: Supplying satellite data to commercial and government customers
  • Enterprise-wide integration standards:
    - Company: Multinational consumer product manufacturer
    - Operation: Low-cost, simplified integration solution for online markets
The common drivers for automation in the above use cases are reduction of cycle times and processing costs as well as minimizing or eliminating manual coordination tasks (which commonly contribute to less efficient, error-prone processing). The choice of BPEL in all of these use cases involved both business requirements as well as technical requirements.

Business requirements include:

  • Lowest cost of acquisition and ownership
  • Process definitions expressed in a standard way to avoid vendor lock-in
  • Support for both automated and human workflow

Technical requirements include:

  • Support for Web services standards and XML
  • Scalability and reliability
  • Portability
  • Reusability
  • Easy modifiability

The Road Ahead
BPEL is on its way to becoming the cornerstone of SOA implementations in an enterprise environment, responsible for coordinating disparate resources and services into end-to-end business processes. The explicit XML-based representation of the BPEL process allows for extensive management, reporting, and analysis functions to be added on top of the process execution layer. These functions can be provided by the BPEL engine vendor of choice, or by ISVs who provide value-add components that can enhance an overall solution structured around a BPEL engine. While some of these value-added functions are available in commercial BPEL implementations today, we believe that this is just the tip of the iceberg. The next 12 months should be very interesting and offer significant ROI opportunities for the companies that are ready to move to business process management with BPEL.

More Stories By Doron Sherman

Doron Sherman is the CTO of Collaxa, Inc., a Web Service Orchestration Server vendor and a BEA Partner located in Redwood Shores, California. He has been involved with Java since its early days and pioneered application server technology while being a founder and chief scientist at NetDynamics.

