Twenty-First Century Business Architecture

By Peter Fingar

While the vision of process management is not new, existing theories and systems have not been able to cope with the reality of business processes - until now. By placing business processes on center stage, as first-class citizens in computing, corporations can gain the capabilities they need to innovate, reenergize performance, and deliver the value today's markets demand.

Business process management (BPM) systems discover what you do, and then manage the life cycle of improvement and optimization in a way that translates directly to operation. They see the world in terms of processes using notations and representations business people intuitively understand, and reflect the nature of the way business has to be - connected, collaborative, asynchronous, coordinated, conversational, and constantly changing.

Reengineering Reengineering
During the business process reengineering wave of the 1990s, management prophets' books of stories about other companies were all you had to guide the transformation of your business. Although the underlying theories were based on age-old common sense and general systems theory proposed 50 years earlier, reengineering advocates offered no path to execution. New processes could be envisaged but what happened next? There was no engineering in reengineering. Instead, processes were handed off - or, more precisely, thrown over the wall - to IT.

By contrast, the process-managed enterprise takes control of internal processes and communicates with a universal process language that enables partners and internal business units to execute on a shared vision - to understand each other's operations in detail, jointly design processes, and manage the entire life cycle of their business improvement initiatives. Companies embracing this approach to enterprise computing are using a new class of mission-critical infrastructure, a new category of software, the Business Process Management System (BPMS) - a business platform for business processes that exploits a company's existing technology infrastructures and assets.

Today, the vast majority of employees in large enterprises rely on nothing more than e-mail, spreadsheets, and word processing tools to coordinate work. Beyond this, automation is provided by expensive software applications maintained solely in the data center and by the staff of the IT department. Yet the majority of automation tasks needed each and every day in business are modest in relation to the complexity of today's IT systems. For example, nearly everyone needs more visibility of and control over the activities around them, as they interact with colleagues, partners and customers. Such interactions and communications are indeed the essence of business processes. Business users need control of information flows so that everyone remains focused on the task and is coordinated with everyone else - business processes follow no simple pattern and cannot be packaged easily.

Perhaps 80% of process-related tasks and their coordination could be designed and implemented by business people themselves - if only they had properly designed tools to enable them to directly manipulate their business processes. Moreover, business people should be able to implement changes to live business processes, meaning that the life cycle of the process design and modification needs to be where the process is used, not in the data center. Such environments are possible as companies acquire a BPMS capability. By contrast, business intelligence and action lag behind the current business activity if business processes are ingrained in rigid and brittle software systems.

Direct Representation and Manipulation
When business people develop numerical models using spreadsheets, they do not confuse the model and tool - the model and its representation have nothing to do with information technology. For example, a spreadsheet model might represent a budget, or an analysis of an engineering part. In contrast, the distinction between model and tool is frequently lost when trying to define BPM, which is often equated with systems integration or composite software development. While a BPMS may be integrated with other computing systems, and while the ingrained processes in those systems may be reused to create new process models, the process model itself has little to do with "systems integration" or "composite applications," and a lot to do with the budgeting or the engineering process.

Many in the IT industry perceive BPM only as a better, faster, cheaper way to integrate applications, and this view is exacerbated by the focus on languages used to support Web services orchestration, such as BPEL. For all that is written about such languages you would think that BPM is only about systems interoperability, application integration, and a smart new way to develop more software. This thinking totally misses the point. BPM is about better, faster, cheaper business processes, not better, faster, cheaper technology.

BPM technologies provide direct representation of business processes, and then open those processes to complete life-cycle management: from discovery to design, deployment, execution, operations, analysis, and optimization. Tell business people that BPM is about technical integration and watch their eyes glaze over. Tell them their "problem" is applications integration or composite applications and watch them excuse themselves from the conversation.

In short, integration technology, however wrapped in process clothing, solves only an integration need. This is not to say that integration products cannot evolve to become BPM products, or that BPM products cannot provide integration, but the distinction needs to be made. What distinguishes BPMS is its central focus on the direct representation and manipulation of business processes, just as RDBMS provides the representation and manipulation of business data and the spreadsheet provides the representation and manipulation of numerical data. On the other hand, a comprehensive BPMS incorporates robust application integration facilities, as corporations need to integrate automated processes in legacy systems or best-of-breed packages to the BPM level - integrate once to the BPM level of abstraction, then develop and manage many business processes without returning to the technology plumbing. Processes embedded in legacy systems can be made "reusable," and are mandatory participants in many of the business processes companies wish to manage more actively and directly.
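To picture "integrate once to the BPM level of abstraction," consider a minimal sketch in Python; the participant registry and all function names here are invented for illustration and do not come from any particular BPMS product:

```python
# Hypothetical sketch of "integrate once to the BPM level of abstraction."
# A legacy function is wrapped once as a named process participant; every
# process model thereafter refers to the participant, never the plumbing.

def legacy_credit_check(customer_id: str) -> bool:
    # Stand-in for a call into a legacy system (screen-scrape, RPC, etc.).
    return len(customer_id) > 3

PARTICIPANTS = {
    # Registered once by IT; reused by many business processes.
    "credit-check": legacy_credit_check,
}

def run_process(steps: list[str], customer_id: str) -> None:
    # A process model names participants; the engine resolves them.
    for step in steps:
        result = PARTICIPANTS[step](customer_id)
        print(f"{step}: {result}")

# Two different business processes reuse the same integration point
# without touching the underlying plumbing again.
run_process(["credit-check"], "C-1001")   # order-to-cash process
run_process(["credit-check"], "C-77")     # new-account-opening process
```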

BPM systems are helping organizations to obliterate, not just bridge, the business-IT divide by placing control of business processes directly in the hands of business people, including front-line workers. Personal, workgroup, and departmental BPM tools, akin to tools commonly found in office productivity suites, are emerging. The role of IT is changing, away from custom development of more and more application software and toward the provision of BPM systems. Imagine a "Process Office" suite providing an integrated, process-centric approach to collaboration, computation, work management, process modeling, and simulation.

Aberdeen Group elaborates, "The BPM category may arguably provide the greatest return on investment compared to any other category available on the market today. BPM gives organizations the ability to cut operational costs at a time when the economic downturn makes it increasingly difficult to boost revenues... Business Process Management enables government agencies to dismantle obsolete bureaucratic divisions by cutting the labor- and paper-intensive inefficiency from manual, back-end processes. Faster and auditable processes allow employees to do more in less time, reducing paper use as well as administrative overhead and resources." In short, BPM is becoming the bedrock for a whole new world of process work.

Imagine a sales campaign "application." It could be developed upon a relational database management system (RDBMS), but would the data model and software provide the flexibility required? Would such an application naturally fit and adapt to the business process? Companies in different industries have diverse needs for sales campaign automation, and individual companies in the same sector compete with each other by differentiating sales processes. Packaging a sales-campaign application in software on a static data model seems inappropriate. Not only do sales campaigns differ from company to company, they differ within the same company for different types of products and services. In addition, as each campaign progresses, processes associated with each prospective customer may have to vary widely from the initial "sales plan."

Therefore, instead of packaging the sales campaign as a software application, why not deliver it as a process? Give business people tools to build their own sales process. Allow them to customize the process for each customer. Give them the tools to include participants in the campaign as required, including employees, partners, systems, and information sources. Let the BPMS manage the end-to-end state of all processes. Provide business people with the tools they need to query the state of the campaign along key dimensions such as customer, product, and part; and based on this business intelligence, make adjustments to the process in order to respond to individual customer needs.
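As a thought experiment, here is a minimal Python sketch of a campaign delivered as a process rather than as an application; the CampaignProcess and Step names are hypothetical, invented for this illustration and not taken from any real product:

```python
# Hypothetical illustration: a sales campaign expressed as a process
# definition rather than a packaged application. All names are invented.

from dataclasses import dataclass, field

@dataclass
class Step:
    name: str
    participant: str          # employee, partner, system, or info source

@dataclass
class CampaignProcess:
    customer: str
    steps: list[Step] = field(default_factory=list)
    state: dict = field(default_factory=dict)   # end-to-end process state

    def add_step(self, name: str, participant: str) -> None:
        # Business people tailor the process per customer by adding,
        # removing, or reordering steps - no software release required.
        self.steps.append(Step(name, participant))

    def query(self, dimension: str):
        # Query campaign state along a key dimension (customer, product...).
        return self.state.get(dimension)

# One process instance per prospective customer, each free to diverge
# from the initial "sales plan" as the campaign progresses.
acme = CampaignProcess(customer="Acme Corp")
acme.add_step("qualify lead", participant="sales rep")
acme.add_step("credit check", participant="partner credit bureau")
acme.state["product"] = "broadband"
print(acme.query("product"))
```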

A Formal Foundation
Many trends have converged to create the brave new world of business process management - workflow management, business process modeling, quality management, business reengineering, change management, and distributed computing, to name but a few. Yet one vital ingredient was missing: the direct manipulation of business processes. The IT facade behind today's business processes (consisting of disjointed data models, application logic, workflows, and integration systems, repeated a hundred times in a hundred silos) can now be rationalized, not by replacing previous investments but by exploiting what they offer in combination, recast in the form of new process models and systems. It has taken the IT industry 20 years to find a way to represent the computational elements needed for a unified process representation on which to build tools that can be used to conceive, compose, and put new processes into operation.

The unifying theories needed for business process computing lie in an obscure branch of mathematics called the pi-calculus, whose conceptual father is Robin Milner, professor of computer science at Cambridge University and a Turing Award winner. Pi-calculus plays a role in the BPMS similar to the role finite state machines play in the spreadsheet and relational algebra plays in database management systems. Pi-calculus and related formalisms are complex, but business people couldn't care less about formalisms. On the other hand, the automation tools they use, each and every day, depend upon such science for robustness and reliability. By representing business processes in a mathematically formalized way, processes developed in one part of the business, or by a business partner, can be connected, combined and analyzed in real time, providing a foundation for the true real-time enterprise behind the real-time enterprise slogan. While the notion of a real-time enterprise is all about agility, and while the basis for technical agility may be a service-oriented architecture (SOA), the basis for business agility is BPM, making the SOA necessary but insufficient for meeting today's business needs. Just as the operating system emerged as the platform for the RDBMS capability, Web services and SOA are emerging as the platform for BPMS capability.
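For readers curious about the formalism itself, the heart of the pi-calculus fits in one line of standard notation; this is textbook pi-calculus after Milner, not any vendor's process language:

```latex
% The reaction rule at the heart of the pi-calculus: a process sending
% the name z on channel x meets a process waiting to receive on x.
\bar{x}\langle z \rangle.P \;\mid\; x(y).Q \;\longrightarrow\; P \;\mid\; Q\{z/y\}
% Mobility: the transmitted name z may itself be a channel, so the
% receiver gains a new communication link at run time - the "who talks
% to whom" topology evolves, which is what Milner means by mobile processes.
```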

The central insight of pi-calculus is that all processes are acts of communication. This paradigm has enabled the Business Process Management Initiative (BPMI.org) to define document structures that capture the day-to-day communication that occurs in business at all levels - formal, informal, asynchronous, synchronous, human originated, or machine initiated. Such process document structures can be used to define any process, from the highest level of business strategy to the most basic numerical computation. Process-based documents can evolve with the business - just like spreadsheets and word processing documents do today. Milner calls such processes mobile, reflecting the dynamic, agile, real-time and adaptive nature of real business processes, not the rigid automation functions of typical hard-coded computer applications.

Unlike application packages, the BPMS adapts to a company's processes, not the other way around. The BPMS platform is targeted at a new hybrid business role that combines the skills of the enterprise data architect and enterprise business architect, allowing them to create process tools for all employees to power their work, each and every day - the process architect is the true architect of 21st century business, and BPMS is the foundation of 21st century enterprise architecture. To wit, a global telecommunications operator moving into broadband used BPM to facilitate the aggressive acquisition of large numbers of new customers, raising customer satisfaction well above the competition's. Flexible new processes allowed the operator to collect, store, and queue orders to ensure that customers did not experience outages from failures in dependent systems operated by third parties. BPM allowed the customer service staff to be flexible in responding to numerous and diverse customer requests - it seems every customer has a custom need that demands custom business processes to fulfill. BPM insulated the customer service representative from changes occurring in third-party service providers and the changes arising from the unbundling of service elements as a result of deregulation.

The Business Process Management System
Process management borrows and combines features from a number of familiar tools and technologies, but differs in its central focus on communicating processes. BPM feels similar to computer-aided software engineering (CASE) because of its emphasis on graphical notation, collaborative discovery, and design. It shares with workflow management a focus on scripted events and task management. From the viewpoint of the systems architect, comparisons can be drawn with transaction processing monitors and application servers. For ERP practitioners, BPM's focus on process definition and optimization will have strong associations. Developers who have struggled with legacy system integration and who have employed enterprise application integration solutions will recognize similar ideas in BPM, especially where applications are to play key roles in end-to-end business processes. Process analysis tools used in conjunction with the BPMS will be familiar to users of online analytical processing (OLAP).

Because of these prior experiences, a company's existing IT skills can be readily transferred to the world of BPM. However, do not let these similarities lead you to conclude that BPM is simply a repackaging of existing technology - this year's latest IT branding. Although many vendors will try to hijack the BPM title, the reality of BPM is that it is unique in its ability to provide a top-down approach to business design that isolates business users from the vagaries of the numerous enterprise application systems already in place. Process models act like "live applications," but they are only process schemas deployed on and directly executed by the BPMS. There is no waterfall model of process development as there is in software development. Top to bottom, at all levels of the process model the process representation is the same and is directly executed by a BPMS in the way that an RDBMS directly executes Structured Query Language (SQL) commands. But "direct execution" does not quite capture what is going on, for processes are often confused with more traditional software procedures or scripted workflows. Rather, the process definition is a declarative description of the now and the future - as instances of the process are created, like a new row being added to a spreadsheet or a database, they proceed in line with their design.
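A toy sketch can make the row analogy concrete; the Engine class and process-definition names below are invented for illustration, assuming only the declarative reading of process definitions described above:

```python
# Hypothetical sketch: a process definition is declarative data, and a
# generic engine executes it directly - analogous to an RDBMS executing
# SQL against a schema. No names here come from a real BPMS.

PROCESS_DEF = {                      # the declarative "process schema"
    "name": "order-to-cash",
    "steps": ["capture order", "check credit", "ship", "invoice"],
}

class Engine:
    """A toy engine: creating an instance is like inserting a row."""
    def __init__(self, definition: dict):
        self.definition = definition
        self.instances: list[dict] = []

    def create_instance(self, subject: str) -> dict:
        instance = {"subject": subject, "at_step": 0}
        self.instances.append(instance)      # the "new row"
        return instance

    def advance(self, instance: dict) -> str:
        # Each instance proceeds in line with its (shared) design.
        steps = self.definition["steps"]
        instance["at_step"] = min(instance["at_step"] + 1, len(steps) - 1)
        return steps[instance["at_step"]]

engine = Engine(PROCESS_DEF)
order = engine.create_instance("PO-1042")
print(engine.advance(order))   # -> "check credit"
```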

In look and feel, the BPMS is to the business process designer what a design workstation is to the automobile designer. The computer-aided design and computer-aided manufacturing (CAD/CAM) system of the automobile designer becomes the computer-aided modeling/computer-aided deployment (CAM/CAD) system of the business process designer. Underlying the BPMS, as in the case of CAD/CAM systems, is a digital representation and simulation of the real "thing" with which the designer is working. While the automobile designer works with digital artifacts such as tires, engines, body frames, and aerodynamics, the process designer works with digital artifacts such as orders, suppliers' fulfillment services, third-party billing services, bills for materials, shipping schedules of trading partners, and so on. When the automobile engineer pushes the "make it so" button, the computer-aided manufacturing part of the system actually implements the building of the new car. When the business process engineer pushes the "make it so" button, the computer-aided deployment part of the system actually implements the mission-critical, end-to-end business process across the disparate legacy systems inside the enterprise and across the value chain.

What about all the C++, Java, scripting, EAI, and other computer technologies that are involved? Where did they go in all this? They are still there, only now it is the BPMS that deals with them, not the designers and other business people who use the business process workstations and the underlying BPMS. With the BPMS, business information systems are developed and evolved by manipulating the business process directly using the language and concepts of business, not the language and concepts of machines.

BPM is all about raising the level of abstraction from machine concepts to business concepts. BPM isn't a panacea for all computing needs, and it requires a deliberate step of abstraction and of application integration to a common process model - but it fulfills an increasing number of business-critical needs.

Crossing the Chasm
Business process management products are available from many vendors, in versions ranging from departmental workgroup solutions to enterprise-scale infrastructures - a spectrum of solutions to meet diverse needs. Not all BPM systems, by any means, use the pi-calculus formalism yet, or process languages built from it such as the Business Process Modeling Language (BPML) published by BPMI.org. But as other make-do approaches hit technological walls, this will change. The underlying mathematics of pi-calculus and the semantics of BPML are hard to ignore, for these foundations are paramount to robust and reliable business process management. Process languages, such as the vendor-sponsored Business Process Execution Language for Web Services (BPEL4WS), will converge and evolve towards the needs of a BPMS with a solid mathematical underpinning.

Today, BPEL is primarily advocated for loosely coupled application integration and development, but as the needs of BPM go well beyond Web services and simple workflow requirements, BPEL will require the same theoretical foundation. CIOs will rightly disregard any simplistic BPM "layer" as "yet another point solution" unless BPM systems can be shown to embody a strong formal model of enterprise computing and mobile processes. Only then can the BPMS migrate from a niche category to a mainstream platform, similar to what companies already know and understand in other areas of IT support such as relational data management and network management. BPM is far more than another way to develop applications. The BPMS is a platform that will support a raft of new processes, tools, and applications. A sales campaign isn't a software application - it's an application of process management.

How will BPM be assimilated by end-user organizations? There is no doubt that businesses will continue to look to their current software suppliers for BPM innovations, yet they need a true BPMS today even if their preferred supplier cannot deliver, opening the market for new entrants. In addition, companies that survived the turbulent era of reengineering may be tempted to feel that they have already reengineered, reinvented, mapped, analyzed, and improved every aspect of their business processes. The reality is that they know, deep down, that they have barely started and that business processes are in a continual state of flux. The reengineering prophecy - "we've not done reengineering" - is indeed true.

Now we are in uncertain times again - a downturned economy, corporate scandals, and the IT Ice Age. Today, companies are experiencing not one broad-based economic reality, but a multitude of process-related problems. They absolutely must be able to do more with less. Early adopters of BPM systems will therefore be those companies that face the most severe process management problems, just as early adopters of relational database systems were those that faced severe data management problems. It is not easy to cast business process-related problems into neat little categories or magic quadrants, and no pattern of an "ideal BPMS" has yet emerged. On the other hand, business processes and their management lie at the heart of all business activity. As a result, processes are taking center stage, driving demand for powerful BPM systems with "pi-calculus inside" that take the process complexity outside of stovepipe applications and allow existing applications to be expressed in a form that business people, not just programmers, can understand, evolve, and manage.

Sitting right at the divide between humans and machines - between business and IT - the BPMS represents a paradigm shift in the world of business automation - business process computing - that has a profound impact on the way companies structure and perform work, letting people speak in their native tongue and enabling machines to understand them, not vice versa. Designed top down and deployed directly in accordance with a company's strategy, business processes can now be unhindered by the constraints of existing IT systems.

The implications are equally profound for the IT industry, for it must enter its next phase of evolution and maturity. As CAD/CAM systems enabled computer-integrated and "just-in-time" manufacturing, BPM can facilitate collaborative "just-in-time" business processes and a new era of process manufacturing. Those players in the IT industry that master BPM will share the new wealth with their customers: productivity gains, innovation, and lowered costs like those the industrial design and manufacturing industries have already realized as a result of implementing a direct path from design to execution.

Welcome to the company of the future, the fully digitized corporation - the process-managed enterprise. Welcome to the next 50 years of business and IT.

About the Author

Peter Fingar is an Executive Partner with the digital strategy firm, the Greystone Group. He delivers keynotes worldwide and is author of the best-selling books, The Death of "e" and the Birth of the Real New Economy and Enterprise E-Commerce. Over his 30-year career he has taught graduate and undergraduate computing studies and held management, technical and consulting positions with GTE Data Services, Saudi Aramco, the Technical Resource Connection division of Perot Systems and IBM Global Services, as well as serving as CIO for the University of Tampa.


