SOA World Magazine "BPEL's Growing Up"

What's Next?

IT architectures have evolved to include process orchestration as a fundamental layer, due in no small part to the emergence and widespread adoption of the WS-BPEL standard. WS-BPEL, also known as Business Process Execution Language or simply BPEL, is an OASIS-owned standard that provides rich and comprehensive orchestration semantics. This article provides a brief overview of how BPEL came to be what it is today, then focuses on the latest developments in the BPEL standard and where we believe this standards area will go over the next few years. In particular, some of the key areas for growth in this space are the standardization of human workflow support and tighter integration with process modeling and analysis tools and standards.

A Brief History of BPEL
Much of BPEL's broad adoption and acceptance comes from its ancestry. Historically, BPEL emerged when IBM and Microsoft joined forces and merged their proprietary workflow languages, WSFL and XLANG respectively, into a next-generation business process language specification, called at the time BPEL4WS (Business Process Execution Language for Web Services). BPEL4WS 1.0, known as the "license plate standard," was first publicly released in August 2002 by IBM, BEA, Microsoft, SAP, and Siebel Systems (BEA, SAP, and Siebel joined the initiative just before the specification was published). Although BPEL was a new specification at the time, it was already relatively mature thanks to its underpinnings: it was built on workflow languages that had been production tested over several years, which makes it far more mature than its age would otherwise indicate. In fact, Eindhoven University of Technology in the Netherlands has published a site with an academic analysis of the comparative richness of workflow languages, which shows that while BPEL isn't perfect, it is better than any of its predecessors at implementing the different workflow patterns (see http://is.tm.tue.nl/research/patterns/standards.htm).

What BPEL Is Today
After BPEL4WS 1.0, a minor 1.1 upgrade was released in May 2003 and officially submitted to OASIS. A WS-BPEL technical committee was assembled that is now one of OASIS' largest technical committees. From there, the standardization process slowed evolution significantly (as often happens), and OASIS has since been working to release WS-BPEL 2.0. As a result, most public BPEL tool implementations today are based on the BPEL 1.1 specification. However, because BPEL 2.0 is very close to being finalized, several vendors, including Oracle, have implemented varying degrees of BPEL 2.0 functionality in currently shipping products.

In any case, BPEL has become deeply entrenched in the enterprise IT toolkit. We now see developers get excited about working on BPEL projects because it keeps their skills up to date. To get a sense of the current adoption of the standard, a search on the job site SimplyHired.com for postings that mention BPEL yields hundreds of current BPEL-related positions (www.simplyhired.com/index.php?ds=sr&q=BPEL).

Where We Are Going
People looking to point out weaknesses in the BPEL standard often comment that it does not include support for human workflow tasks. This is true, of course, even in the BPEL 2.0 standard. However, BPEL does have rich support for asynchronous services, so one approach is to build a human workflow service engine that BPEL processes can call out to. With this architecture, human tasks and manual steps can be incorporated into 100% standard BPEL process flows, just like any other asynchronous service. This architecture was detailed in the Web Services Journal article "BPEL Processes and Human Workflow" of April 12, 2006 by Matjaz Juric and Doug H. Todd (http://webservices.sys-con.com/read/204417.htm). The approach has been adopted by several vendors, including Oracle; we believe it provides a very clean architecture while the standardization process catches up in this area.
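
To make the pattern concrete, here is a minimal sketch of what such a flow might look like, written in BPEL 1.1 syntax. The TaskService partner link, its port types and operations, and the variable names are hypothetical placeholders for whatever human workflow service an implementation exposes; they are not taken from any particular vendor's product.

  <!-- Minimal sketch only; namespace prefixes, partner links, and
       variable names below are hypothetical. -->
  <sequence name="ApproveExpenseReport">
    <!-- Hand the work item to a human workflow service, asynchronously -->
    <invoke name="initiateApprovalTask"
            partnerLink="TaskService"
            portType="task:TaskServicePortType"
            operation="initiateTask"
            inputVariable="taskRequest"/>
    <!-- The process waits, possibly for days, until the assignee
         completes the task and the task service calls back -->
    <receive name="receiveTaskOutcome"
             partnerLink="TaskService"
             portType="task:TaskServiceCallbackPortType"
             operation="onTaskCompleted"
             variable="taskOutcome"/>
    <!-- Branch on the human decision like any other service response -->
    <switch name="evaluateOutcome">
      <case condition="bpws:getVariableData('taskOutcome', 'payload', '/task:outcome') = 'APPROVED'">
        <invoke name="postToLedger" partnerLink="FinanceService"
                portType="fin:LedgerPortType" operation="post"
                inputVariable="expenseReport"/>
      </case>
      <otherwise>
        <invoke name="notifyRejection" partnerLink="NotificationService"
                portType="ntf:NotificationPortType" operation="notify"
                inputVariable="rejectionNotice"/>
      </otherwise>
    </switch>
  </sequence>

The key point is that the process simply invokes the task service and then blocks on a receive for the callback; from the BPEL engine's perspective, the human step is indistinguishable from any other long-running asynchronous service.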

Going forward, we're already seeing the next generation of standards around BPEL being discussed. For example, the "BPEL4People" effort was first announced in late 2005 and is intended to standardize an approach, similar to the one described above, for incorporating human workflow tasks in BPEL processes. Besides being one of our favorite standards acronyms, BPEL4People is an important area of work, since most business processes span both systems and humans. It also answers once and for all whether BPEL is properly pronounced "bepple," "bee-pull," or "bee-pell." (Answer: it must rhyme nicely with "people.")

Another area we see evolving is tighter integration between a process implementation language like BPEL and standards like BPMN, which define a business process modeling notation: a business analyst-friendly visual representation of a process. Since BPEL says nothing about the visual representation of a process and BPMN says nothing about how a process is serialized or executed, they would seem like a perfect match. In practice there are still gaps to be filled, but in general we believe that tighter coupling between the standards (and tools) for business analysts and process developers will be a fantastic development for the IT world at large.

In the next sections we look in more detail at these growth areas that will expand the reach of business process standards and help BPEL achieve its full potential.

Process Orchestration
Business processes span services, applications, and human activities; these processes need to be orchestrated in an agile fashion with end-to-end control, visibility, and rich exception management. Process orchestration is the heart of Business Process Management, enabling the creation of executable business processes from services and human activities. Process orchestration requirements include:

  • Sequencing, including serial, parallel, and other control flow patterns
  • Exception handling including error conditions, transactions, and compensation
  • Data flow and transformation
  • Event handling including timers and other out-of-band events (Figure 1)

BPEL today addresses these requirements in a mature fashion. Some of the salient features of BPEL include (a brief sketch follows the list):

  • Rich sequencing semantics including parallel and asynchronous processing
  • A compensation-based long-running transaction model
  • Rich scoped fault-handling capabilities
  • Asynchronous event handling allowing time-based alerts as well as out-of-band events like a status request or cancellation event
  • A Web service-based model for process decomposition and assembly: each BPEL process exposes a Web service interface and can easily be composed into other, higher-level compound flows
  • Standard use of XML and XPath for data access and manipulation
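
The fragment below is an illustrative sketch, again in BPEL 1.1 style, that touches several of these features in one place: a scoped fault handler, a compensation handler on a nested scope, a parallel flow, an onAlarm event handler, and XPath-based data manipulation. All partner links, port types, operations, and variable names are hypothetical, not drawn from any particular product.

  <!-- Illustrative fragment only; all names and prefixes are hypothetical. -->
  <scope name="FulfillOrder">
    <!-- Scoped fault handling: undo completed inner work, then signal failure -->
    <faultHandlers>
      <catchAll>
        <sequence>
          <compensate/>
          <throw faultName="ord:fulfillmentFailed"/>
        </sequence>
      </catchAll>
    </faultHandlers>
    <!-- Out-of-band, time-based event handled while the scope is active -->
    <eventHandlers>
      <onAlarm for="'PT24H'">
        <invoke name="escalate" partnerLink="NotificationService"
                portType="ntf:NotificationPortType" operation="notify"
                inputVariable="escalationNotice"/>
      </onAlarm>
    </eventHandlers>
    <sequence>
      <!-- XPath-based data access and manipulation -->
      <assign name="prepareShipment">
        <copy>
          <from expression="bpws:getVariableData('order', 'payload', '/ord:order/ord:shipTo')"/>
          <to variable="shipmentRequest" part="payload" query="/shp:shipment/shp:destination"/>
        </copy>
      </assign>
      <!-- Parallel branches: reserve inventory and authorize payment -->
      <flow name="reserveAndCharge">
        <scope name="ReserveInventory">
          <!-- Compensation undoes this scope's work if a later step fails -->
          <compensationHandler>
            <invoke name="releaseInventory" partnerLink="InventoryService"
                    portType="inv:InventoryPortType" operation="release"
                    inputVariable="reservation"/>
          </compensationHandler>
          <invoke name="reserveInventory" partnerLink="InventoryService"
                  portType="inv:InventoryPortType" operation="reserve"
                  inputVariable="reservation" outputVariable="reservationAck"/>
        </scope>
        <invoke name="authorizePayment" partnerLink="PaymentService"
                portType="pay:PaymentPortType" operation="authorize"
                inputVariable="paymentRequest" outputVariable="paymentAck"/>
      </flow>
    </sequence>
  </scope>

Because the compensation handler is attached to the inner scope, a failure anywhere in the outer scope can undo a completed inventory reservation before the fault is propagated, which is the essence of BPEL's long-running transaction model.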


More Stories By Manoj Das

Manoj Das is a senior manager in the product management group for Oracle Fusion Middleware. His focus is on BPEL and Business Rules. Manoj joined Oracle through the Siebel acquisition, where he was responsible for driving the next-generation process-centric application platform.

More Stories By Dave Shaffer

Dave Shaffer has been helping customers use the Oracle BPEL Process Manager since 2001, managing implementation projects, providing technical training, and ensuring successful implementations. Prior to joining Oracle, Shaffer was a principal consultant at Collaxa, a managing director at Eleven Acceleration, and manager of a professional services group at Apple Computer.

Most Recent Comments
Chris 03/07/07 10:12:00 PM EST

You can also see the trend for "BPEL" appearing in job postings over the past year using Indeed's job trends tool:

http://www.indeed.com/jobtrends?q=BPEL%2C+BPMN
