Twenty-First Century Business Architecture

While the vision of process management is not new, existing theories and systems have not been able to cope with the reality of business processes - until now. By placing business processes on center stage, as first-class citizens in computing, corporations can gain the capabilities they need to innovate, reenergize performance, and deliver the value today's markets demand.

Business process management (BPM) systems discover what you do, and then manage the life cycle of improvement and optimization in a way that translates directly to operation. They see the world in terms of processes using notations and representations business people intuitively understand, and reflect the nature of the way business has to be - connected, collaborative, asynchronous, coordinated, conversational, and constantly changing.

Reengineering Reengineering
During the business process reengineering wave of the 1990s, management prophets' books of stories about other companies were all you had to guide the transformation of your business. Although the underlying theories were based on age-old common sense and general systems theory proposed 50 years earlier, reengineering advocates offered no path to execution. New processes could be envisaged but what happened next? There was no engineering in reengineering. Instead, processes were handed off - or, more precisely, thrown over the wall - to IT.

By contrast, the process-managed enterprise takes control of internal processes and communicates with a universal process language that enables partners and internal business units to execute on a shared vision - to understand each other's operations in detail, jointly design processes, and manage the entire life cycle of their business improvement initiatives. Companies embracing this approach to enterprise computing are using a new class of mission-critical infrastructure, a new category of software, the Business Process Management System (BPMS) - a business platform for business processes that exploits a company's existing technology infrastructures and assets.

Today, the vast majority of employees in large enterprises rely on nothing more than e-mail, spreadsheets, and word processing tools to coordinate work. Beyond this, automation is provided by expensive software applications maintained solely in the data center and by the staff of the IT department. Yet the majority of automation tasks needed each and every day in business are modest in relation to the complexity of today's IT systems. For example, nearly everyone needs more visibility of and control over the activities around them, as they interact with colleagues, partners and customers. Such interactions and communications are indeed the essence of business processes. Business users need control of information flows so that everyone remains focused on the task and is coordinated with everyone else - business processes follow no simple pattern and cannot be packaged easily.

Perhaps 80% of process-related tasks and their coordination could be designed and implemented by business people themselves - if only they had properly designed tools to enable them to directly manipulate their business processes. Moreover, business people should be able to implement changes to live business processes, meaning that the life cycle of the process design and modification needs to be where the process is used, not in the data center. Such environments are possible as companies acquire a BPMS capability. By contrast, business intelligence and action lag behind the current business activity if business processes are ingrained in rigid and brittle software systems.

Direct Representation and Manipulation
When business people develop numerical models using spreadsheets, they do not confuse the model with the tool - the model and its representation have nothing to do with information technology. For example, a spreadsheet model might represent a budget, or an analysis of an engineering part. In contrast, the distinction between model and tool is frequently lost when trying to define BPM, which is often equated with systems integration or composite software development. While a BPMS may be integrated with other computing systems, and while the ingrained processes in those systems may be reused to create new process models, the process model itself has little to do with "systems integration" or "composite applications," and a lot to do with the budgeting or the engineering process.

Many in the IT industry perceive BPM only as a better, faster, cheaper way to integrate applications, and this view is exacerbated by the focus on languages used to support Web services orchestration, such as BPEL. For all that is written about such languages you would think that BPM is only about systems interoperability, application integration, and a smart new way to develop more software. This thinking totally misses the point. BPM is about better, faster, cheaper business processes, not better, faster, cheaper technology.

BPM technologies provide direct representation of business processes, and then open those processes to complete life-cycle management: from discovery to design, deployment, execution, operations, analysis, and optimization. Tell business people that BPM is about technical integration and watch their eyes glaze over. Tell them their "problem" is applications integration or composite applications and watch them excuse themselves from the conversation.

In short, integration technology, however wrapped in process clothing, solves only an integration need. This is not to say that integration products cannot evolve to become BPM products, or that BPM products cannot provide integration, but the distinction needs to be made. What distinguishes BPMS is its central focus on the direct representation and manipulation of business processes, just as RDBMS provides the representation and manipulation of business data and the spreadsheet provides the representation and manipulation of numerical data. On the other hand, a comprehensive BPMS incorporates robust application integration facilities, as corporations need to integrate automated processes in legacy systems or best-of-breed packages to the BPM level - integrate once to the BPM level of abstraction, then develop and manage many business processes without returning to the technology plumbing. Processes embedded in legacy systems can be made "reusable," and are mandatory participants in many of the business processes companies wish to manage more actively and directly.
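
To make the "integrate once, then manage many processes" idea concrete, here is a minimal sketch in Python. The activity names and registry are illustrative assumptions, not any vendor's API: a legacy system call is wrapped once as a named business activity, and any number of process definitions can then refer to it by name without returning to the technology plumbing.

```python
# Hypothetical sketch: wrap legacy calls once as named activities,
# then let many process definitions reuse them by name.
from typing import Callable

ACTIVITY_REGISTRY: dict[str, Callable[[dict], dict]] = {}

def activity(name: str):
    """Register a wrapped legacy call under a business-level activity name."""
    def register(fn: Callable[[dict], dict]) -> Callable[[dict], dict]:
        ACTIVITY_REGISTRY[name] = fn
        return fn
    return register

@activity("check credit")          # the one-time integration step
def legacy_credit_check(case: dict) -> dict:
    # In practice this would call the mainframe or packaged application.
    case["credit_ok"] = True
    return case

# Many process definitions can now reuse the activity by name alone.
ORDER_PROCESS = ["check credit", "ship goods"]
RENEWAL_PROCESS = ["check credit", "issue contract"]

def run(process: list[str], case: dict) -> dict:
    for step in process:
        handler = ACTIVITY_REGISTRY.get(step)
        case = handler(case) if handler else case   # manual steps pass through
        print(f"{step}: {case}")
    return case

run(ORDER_PROCESS, {"customer": "Acme Corp"})
```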

BPM systems are helping organizations to obliterate, not just bridge, the business-IT divide by placing control of business processes directly in the hands of business people, including front-line workers. Personal, workgroup, and departmental BPM tools, akin to tools commonly found in office productivity suites, are emerging. The role of IT is changing, away from custom development of more and more application software and toward the provision of BPM systems. Imagine a "Process Office" suite providing an integrated, process-centric approach to collaboration, computation, work management, process modeling, and simulation.

Aberdeen Group elaborates, "The BPM category may arguably provide the greatest return on investment compared to any other category available on the market today. BPM gives organizations the ability to cut operational costs at a time when the economic downturn makes it increasingly difficult to boost revenues... Business Process Management enables government agencies to dismantle obsolete bureaucratic divisions by cutting the labor- and paper-intensive inefficiency from manual, back-end processes. Faster and auditable processes allow employees to do more in less time, reducing paper use as well as administrative overhead and resources." In short, BPM is becoming the bedrock for a whole new world of process work.

Imagine a sales campaign "application." It could be developed upon a relational database management system (RDBMS), but would the data model and software provide the flexibility required? Would such an application naturally fit and adapt to the business process? Companies in different industries have diverse needs for sales campaign automation, and individual companies in the same sector compete with each other by differentiating sales processes. Packaging a sales-campaign application in software on a static data model seems inappropriate. Not only is each sales campaign in each company different, they are different within the same company for different types of products and services. In addition, as each campaign progresses, processes associated with each prospective customer may have to vary widely from the initial "sales plan." Therefore, instead of packaging the sales campaign as a software application, why not deliver it as a process? Give business people tools to build their own sales process. Allow them to customize the process for each customer. Give them the tools to include participants in the campaign as required, including employees, partners, systems, and information sources. Let the BPMS manage the end-to-end state of all processes. Provide business people with the tools they need to query the state of the campaign along key dimensions such as customer, product, and part; and based on this business intelligence, make adjustments to the process in order to respond to individual customer needs.
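
As a rough illustration of a sales campaign delivered as a process rather than as packaged software, consider the following Python sketch. The class names and campaign steps are hypothetical; the point is that each customer gets a process instance that can diverge from the initial "sales plan," and that the state of the campaign can be queried along business dimensions such as customer or product.

```python
# Hypothetical sketch: a sales campaign as an editable process, not an application.
from dataclasses import dataclass, field

@dataclass
class ProcessDefinition:
    name: str
    steps: list[str]              # the initial "sales plan"

@dataclass
class ProcessInstance:
    customer: str
    product: str
    steps: list[str]              # a per-customer copy, free to diverge
    completed: list[str] = field(default_factory=list)

    def adjust(self, new_steps: list[str]) -> None:
        """Business users change the live process, not the source code."""
        self.steps = new_steps

campaign = ProcessDefinition(
    name="broadband-launch",
    steps=["qualify lead", "send offer", "negotiate", "close"],
)

instances = [
    ProcessInstance("Acme Corp", "broadband", list(campaign.steps)),
    ProcessInstance("Globex", "broadband", list(campaign.steps)),
]

# One prospect needs a custom path: add a credit check before closing.
instances[1].adjust(["qualify lead", "send offer", "credit check", "close"])

# Query the state of the campaign along a key dimension (here, by customer).
for instance in instances:
    print(instance.customer, "->", instance.steps)
```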

A Formal Foundation
Many trends have converged to create the brave new world of business process management - workflow management, business process modeling, quality management, business reengineering, change management, and distributed computing, to name but a few. Yet there was a vital and missing ingredient, the direct manipulation of business processes. The IT facade behind today's business processes (consisting of disjointed data models, application logic, workflows, and integration systems, repeated a hundred times in a hundred silos) can now be rationalized, not by replacing previous investments but by exploiting what they offer in combination, recast in the form of new process models and systems. It has taken the IT industry 20 years to find a way to represent the computational elements needed for a unified process representation on which to build tools that can be used to conceive, compose, and put new processes into operation.

The unifying theories needed for business process computing lie in an obscure branch of mathematics called the pi-calculus, whose conceptual father is Robin Milner, professor of computer science at Cambridge University and a Turing Award winner. Pi-calculus plays a role in BPMS similar to the roles finite state machines play in the business spreadsheet and the relational algebra in database management systems. Pi-calculus and related formalisms are complex, but business people couldn't care less about formalisms. On the other hand, the automation tools they use, each and every day, depend upon such science for robustness and reliability. By representing business processes in a mathematically formalized way, processes developed in one part of the business, or by a business partner, can be connected, combined and analyzed in real time, providing a foundation for the true real-time enterprise behind the real-time enterprise slogan. While the notion of a real-time enterprise is all about agility, and while the basis for technical agility may be a service-oriented architecture (SOA), the basis for business agility is BPM, making the SOA necessary but insufficient for meeting today's business needs. Just as the operating system emerged as the platform for the RDBMS capability, Web services and SOA are emerging as the platform for BPMS capability.

The central insight of pi-calculus is that all processes are acts of communication. This paradigm has enabled the Business Process Management Initiative (BPMI.org) to define document structures that capture the day-to-day communication that occurs in business at all levels - formal, informal, asynchronous, synchronous, human originated, or machine initiated. Such process document structures can be used to define any process, from the highest level of business strategy to the most basic numerical computation. Process-based documents can evolve with the business - just like spreadsheets and word processing documents do today. Milner calls such processes mobile, reflecting the dynamic, agile, real-time and adaptive nature of real business processes, not the rigid automation functions of typical hard-coded computer applications.
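
The flavor of this "processes as acts of communication" view can be sketched in a few lines of Python. The example below is illustrative only - it is not the pi-calculus itself, just the core idea that independent processes interact solely by passing messages, and that a channel can itself be sent as a message, which is what makes processes "mobile."

```python
# Minimal sketch of communicating processes with channel mobility.
import threading
import queue

# A channel is anything processes can send on and receive from.
Channel = queue.Queue

def customer(desk: Channel) -> None:
    reply = Channel()                          # create a fresh, private channel
    desk.put(("order: 10 widgets", reply))     # send the channel itself as data
    print("customer received:", reply.get())   # wait for the answer on that channel

def order_desk(desk: Channel) -> None:
    request, reply = desk.get()                # receive a channel as part of the message
    reply.put(f"confirmed ({request})")        # respond over the received channel

if __name__ == "__main__":
    desk = Channel()
    workers = [
        threading.Thread(target=order_desk, args=(desk,)),
        threading.Thread(target=customer, args=(desk,)),
    ]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
```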

Unlike application packages, the BPMS adapts to a company's processes, not the other way around. The BPMS platform is targeted at a new hybrid business role that combines the skills of the enterprise data architect and enterprise business architect, allowing them to create process tools for all employees to power their work, each and every day - the process architect is the true architect of 21st century business, and BPMS is the foundation of 21st century enterprise architecture. To wit, a global telecommunications operator moving into broadband used BPM to facilitate the aggressive acquisition of large numbers of new customers, while lifting customer satisfaction well above that of the competition. Flexible new processes allowed the operator to collect, store, and queue orders to ensure that customers did not experience outages from failures in dependent systems operated by third parties. BPM allowed the customer service staff to be flexible in responding to numerous and diverse customer requests - it seems every customer has a custom need that demands custom business processes to fulfill. BPM insulated the customer service representative from changes occurring in third-party service providers and the changes arising from the unbundling of service elements as a result of deregulation.

The Business Process Management System
Process management borrows and combines features from a number of familiar tools and technologies, but differs in its central focus on communicating processes. BPM feels similar to computer-aided software engineering (CASE) because of its emphasis on graphical notation, collaborative discovery, and design. It shares with workflow management a focus on scripted events and task management. From the viewpoint of the systems architect, comparisons can be drawn with transaction processing monitors and application servers. For ERP practitioners, BPM's focus on process definition and optimization will have strong associations. Developers who have struggled with legacy system integration and who have employed enterprise application integration solutions will recognize similar ideas in BPM, especially where applications are to play key roles in end-to-end business processes. Process analysis tools used in conjunction with the BPMS will be familiar to users of online analytical processing (OLAP).

Because of these prior experiences, a company's existing IT skills can be readily transferred to the world of BPM. However, do not let these similarities lead you to conclude that BPM is simply a repackaging of existing technology - this year's latest IT branding. Although many vendors will try to hijack the BPM title, the reality of BPM is that it is unique in its ability to provide a top-down approach to business design that isolates business users from the vagaries of the numerous enterprise application systems already in place. Process models act like "live applications," but they are only process-schemas deployed on and directly executed by the BPMS. There is no waterfall model of process development as there is in software development. Top to bottom, at all levels of the process model the process representation is the same and is directly executed by a BPMS in the way that an RDBMS directly executes Structured Query Language (SQL) commands. But "direct execution" does not quite capture what is going on, for processes are often confused with more traditional software procedures or scripted workflows. Rather, the process definition is a declarative description of the now and the future - as instances of the process are created, like a new row being added to a spreadsheet or a database, they proceed in line with their design.
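
The contrast between a declarative process definition and a procedural program can be sketched as follows; the toy schema and engine below are assumptions for illustration, not how any particular BPMS works internally. Creating an instance is like adding a row, and each instance simply advances in line with its declared design.

```python
# Hypothetical sketch: a declarative process schema "directly executed" by a toy engine.
PROCESS_SCHEMA = {
    "start": "review",
    "transitions": {
        "review": "approve",
        "approve": "fulfil",
        "fulfil": "done",
    },
}

class Engine:
    def __init__(self, schema: dict) -> None:
        self.schema = schema
        self.instances: list[dict] = []   # each instance is just a row of state

    def create_instance(self, case_id: str) -> dict:
        instance = {"case": case_id, "state": self.schema["start"]}
        self.instances.append(instance)
        return instance

    def advance(self, instance: dict) -> None:
        nxt = self.schema["transitions"].get(instance["state"])
        if nxt is not None:
            instance["state"] = nxt

engine = Engine(PROCESS_SCHEMA)
order = engine.create_instance("order-001")
engine.advance(order)
print(order)   # {'case': 'order-001', 'state': 'approve'}
```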

In look and feel, the BPMS is to the business process designer what a design workstation is to the automobile designer. The computer-aided design and computer-aided manufacturing (CAD/CAM) system of the automobile designer becomes the computer-aided modeling/computer-aided deployment (CAM/CAD) system of the business process designer. Underlying the BPMS, as in the case of CAD/CAM systems, is a digital representation and simulation of the real "thing" with which the designer is working. While the automobile designer works with digital artifacts such as tires, engines, body frames, and aerodynamics, the process designer works with digital artifacts such as orders, suppliers' fulfillment services, third-party billing services, bills for materials, shipping schedules of trading partners, and so on. When the automobile engineer pushes the "make it so" button, the computer-aided manufacturing part of the system actually implements the building of the new car. When the business process engineer pushes the "make it so" button, the computer-aided deployment part of the system actually implements the mission-critical, end-to-end business process across the disparate legacy systems inside the enterprise and across the value chain.

What about all the C++, Java, scripting, EAI, and other computer technologies that are involved? Where did they go in all this? They are still there, only now it is the BPMS that deals with them, not the designers and other business people who use the business process workstations and the underlying BPMS. With the BPMS, business information systems are developed and evolved by manipulating the business process directly using the language and concepts of business, not the language and concepts of machines.

BPM is all about raising the level of abstraction from machine concepts to business concepts. BPM isn't a panacea for all computing needs - it requires a deliberate step of abstraction and application integration to a common process model - but it fulfills an increasing number of business-critical needs.

Crossing the Chasm
Business process management products are available from many vendors, in versions ranging from departmental workgroup solutions to enterprise-scale infrastructures - a spectrum of solutions to meet diverse needs. Not all BPM systems, by any means, use the pi-calculus formalism yet, or process languages built from it such as the Business Process Modeling Language (BPML) published by BPMI.org. But as other make-do approaches hit technological walls, this will change. The underlying mathematics of pi-calculus and the semantics of BPML are hard to ignore, for these foundations are paramount to robust and reliable business process management. Process languages, such as the vendor-sponsored Business Process Execution Language for Web Services (BPEL4WS), will converge and evolve towards the needs of a BPMS with a solid mathematical underpinning.

Today, BPEL is primarily advocated for loosely coupled application integration and development, but as the needs for BPM go well beyond Web services and simple workflow requirements, BPEL will require the same theoretical foundation. CIOs will rightly disregard any other simplistic BPM "layer" as "yet another point solution" unless BPM systems can be shown to embody a strong formal model of enterprise computing and mobile processes. Only then can BPMS migrate from a niche category to a mainstream platform, similar to what companies already know and understand in other areas of IT support such as relational data management and network management. BPM is far more than another way to develop applications. BPMS is a platform that will support a raft of new processes, tools, and applications. A sales campaign isn't a software application - it's an application of process management.

How will BPM be assimilated by end-user organizations? There is no doubt that businesses will continue to look to their current software suppliers for BPM innovations, yet they need a true BPMS today even if their preferred supplier cannot deliver, opening the market for new entrants. In addition, companies that survived the turbulent era of reengineering may be tempted to feel that they have already reengineered, reinvented, mapped, analyzed, and improved every aspect of their business processes. The reality is that they know, deep down, that they have barely started, and business processes are in a continual state of flux. The reengineering prophecy - "we've not done reengineering" - is indeed true.

Now we are in uncertain times again - an economic downturn, corporate scandals, and the IT Ice Age. Today, companies are experiencing not one broad-based economic reality, but a multitude of process-related problems. They absolutely must be able to do more with less. Early adopters of BPM systems will therefore be those companies that face the most severe process management problems, just as early adopters of relational database systems were those that faced severe data management problems. It is not easy to cast business process-related problems into neat little categories or magic quadrants, and no pattern of an "ideal BPMS" has yet emerged. On the other hand, business processes and their management lie at the heart of all business activity. As a result, processes are taking center stage, driving demand for powerful BPM systems with "pi-calculus inside" that take the process complexity outside of stovepipe applications and allow existing applications to be expressed in a form that business people, not just programmers, can understand, evolve, and manage.

Sitting right at the divide between humans and machines - between business and IT - the BPMS represents a paradigm shift in the world of business automation - business process computing - that has a profound impact on the way companies structure and perform work, letting people speak in their native tongue and enabling machines to understand them, not vice versa. Designed top down and deployed directly in accordance with a company's strategy, business processes can now be unhindered by the constraints of existing IT systems.

The implications are equally profound for the IT industry, for it must enter its next phase of evolution and maturity. As CAD/CAM systems enabled computer-integrated and "just-in-time" manufacturing, BPM can facilitate collaborative "just-in-time" business processes and a new era of process manufacturing. Those players in the IT industry that master BPM will share the new wealth with their customers: productivity gains, innovation, and lowered costs like those the industrial design and manufacturing industries have already realized as a result of implementing a direct path from design to execution.

Welcome to the company of the future, the fully digitized corporation - the process-managed enterprise. Welcome to the next 50 years of business and IT.

More Stories By Peter Fingar

Peter Fingar is an Executive Partner with the digital strategy firm, the Greystone Group. He delivers keynotes worldwide and is author of the best-selling books, The Death of "e" and the Birth of the Real New Economy and Enterprise E-Commerce. Over his 30-year career he has taught graduate and undergraduate computing studies and held management, technical and consulting positions with GTE Data Services, Saudi Aramco, the Technical Resource Connection division of Perot Systems and IBM Global Services, as well as serving as CIO for the University of Tampa.
