"The Backdoor" – BPM Solutions Pay

"I tend to like to go with technology because it makes sense"

People who know me would generally agree I'm a straightforward guy - I pretty much just like to move in the direction I've said I was going, rather than try to move from side to side and finesse something. So when it comes to technology, I tend to like to go with technology because it makes sense, and I usually assume that most IT organizations work that way as well.

But when you look at a technology like Business Process Management (BPM), you can see that the straightforward approach may not be the best, fastest, or even most successful route towards deployment.

BPM is a tougher sell on a straight technology basis: it relies on either an SOA or an EAI environment that enables a service approach, and the capabilities it provides have, to date, been implemented (albeit poorly) directly in code.

As a technology, BPM extracts an organization's business rules using advanced modeling techniques and software, defining the "what happens when" outside of lower-level code. Besides allowing for rapid change in response to changing business conditions, BPM also lets the business community take a much greater role in defining the behavior of their software environments.
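To make that concrete, here is a minimal sketch, not drawn from any particular BPM product, of a "what happens when" rule pulled out of application code and into a declarative definition that the business side can read and change. The rule names, fields, and thresholds are purely hypothetical:

```python
# Illustrative sketch only: a hand-rolled rule table standing in for a BPM/rules engine.
# The rule names, order fields, and thresholds below are hypothetical examples.

RULES = [
    # condition (a predicate on the order) and the action it triggers
    {"name": "large_order_approval",
     "when": lambda order: order["total"] > 10_000,
     "then": "route_to_manager_approval"},
    {"name": "preferred_customer_discount",
     "when": lambda order: order["customer_tier"] == "gold",
     "then": "apply_discount"},
]

def evaluate(order):
    """Return the actions triggered for an order, without touching application code."""
    return [rule["then"] for rule in RULES if rule["when"](order)]

if __name__ == "__main__":
    print(evaluate({"total": 25_000, "customer_tier": "gold"}))
    # -> ['route_to_manager_approval', 'apply_discount']
```

In a real BPM suite the rules would live in a modeling tool rather than a Python list, but the point is the same: changing a threshold means editing a definition, not recompiling and redeploying code.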

Clearly, this type of capability can be an asset to an organization that is confronted with frequent changes, dynamic market conditions, or even the consequences of a merger between organizations with disparate computing systems. Yet, because of the nature of the way IT projects are usually funded, this capability is frequently a difficult sell.

Most IT organizations have to fund their projects as discrete systems: you can fund a CRM system, an Order Management system, a General Accounting system, or even a User Portal. Each of these systems provides an "end user" benefit, one that can be easily quantified and budgeted for. BPM, by contrast, potentially cuts across all of these systems while providing little visible or tangible benefit of its own.

That's at least partially because project funding usually covers the development effort while neglecting the operational and maintenance cost of a system. Those costs can be multiples of the original implementation cost over a system's lifetime. As an example, think of the COBOL programs that many large organizations have been nursing along for decades: compared to the cost of creation, the maintenance costs are many times higher.

This is where BPM solutions pay off: they help reduce operational and maintenance cost. Anything that is programmed has to be tested to death, deployed, and managed. The model-driven architecture (MDA) approach seldom actually works all the way down to the code level and back again, so even when there is some modeling or design, it typically serves only as documentation by the time the coding gets done, allowing errors and omissions to creep into the process and creating troubleshooting nightmares.

In contrast, BPM presents the rules in a modeling environment that is fully round-trip and can be tested and debugged more effectively, especially in the difficult cases where a business transaction has to cross system boundaries. We've all experienced the "he said, she said" finger pointing that goes on when a process that spans two or more systems runs into trouble. BPM reduces cost by taking the management, the modeling, and the implementation out of multiple silo-based systems and centralizing the capabilities needed to implement business processes rapidly and effectively.
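As an illustration of what that centralization looks like, here is a minimal sketch, again hypothetical rather than any vendor's API, of a process definition that spans three silo systems and is itself the executable artifact, so the model and the implementation cannot drift apart:

```python
# Illustrative sketch of a centrally defined, executable process that crosses
# system boundaries. The step names and service calls are hypothetical; a real
# BPM suite would execute a modeled (e.g., BPMN-style) definition instead.

ORDER_PROCESS = [
    ("check_credit",  "crm_system"),
    ("reserve_stock", "order_management"),
    ("post_invoice",  "general_accounting"),
]

def call_service(system, step, context):
    # Placeholder for an SOA/EAI service invocation; records where each step ran
    # so a failure can be traced through one process definition, not three silos.
    print(f"[{system}] executing {step}")
    context[step] = "ok"
    return context

def run(process, context):
    for step, system in process:
        context = call_service(system, step, context)
    return context

if __name__ == "__main__":
    run(ORDER_PROCESS, {"order_id": 42})
```

Because every step runs from the same definition, a failed transaction can be traced in one place instead of by reconciling logs from three separate systems.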

It should not be surprising that the calculations necessary to quantify this benefit are convoluted and involved. They require analysis of maintenance and operations, as well as a good understanding of the actual software development life cycle in use in a particular organization - something that is seldom present. Thus while the technology clearly provides benefits, quantifying its value and justifying its cost remain elusive. In the end, the straightforward approach to the problem, which is simply stating the need for the capability, must give way to a more devious approach that builds the capability into the price of one or more system upgrades or packaged implementations.
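For what it's worth, the shape of that calculation can be sketched in a few lines; every figure below is invented purely to show the structure of the comparison, not to claim actual savings:

```python
# Hypothetical back-of-the-envelope comparison; all numbers are made up
# solely to illustrate how lifetime cost, not build cost, drives the case.

def lifetime_cost(build, yearly_change_cost, years):
    """Initial build plus the cost of ongoing changes over the system's life."""
    return build + yearly_change_cost * years

# Rules hard-coded in three silo systems: every change is coded, tested,
# and deployed three times.
silo = sum(lifetime_cost(build=500_000, yearly_change_cost=200_000, years=10)
           for _ in range(3))

# A centralized BPM layer: higher up-front cost, but process changes are
# made once in the model instead of three times in code.
bpm = (lifetime_cost(build=900_000, yearly_change_cost=150_000, years=10)
       + 3 * lifetime_cost(build=500_000, yearly_change_cost=50_000, years=10))

print(f"silo approach: {silo:,}")   # 7,500,000
print(f"with BPM:      {bpm:,}")    # 5,400,000
```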

More Stories By Sean Rhody

Sean Rhody is the founding editor (1999) and editor-in-chief of SOA World Magazine. He is a respected industry expert on SOA and Web Services and a consultant with a leading consulting services company. Most recently, Sean served as the tech chair of SOA World Conference & Expo 2007 East.



Most Recent Comments
Derek Miers 02/15/06 08:27:44 AM EST

Naive, ill-informed, confused ... just three of the adjectives that spring to mind while reading this piece.

There probably isn't a Fortune 2000 company that is not gaining significant benefits from the deployment of BPM technology. Mostly these projects are still in the mode of strategic experiment, but the evidence is plain for all to see.

Some of Sean's assertions are just plain wrong (like the bit about BPM code being poorly implemented), while others show a lack of understanding of the real issues and benefits.

Suggest you do a little more research next time before wading into an area that you obviously have little contact with.

IMNSHO, SOA and BPM initiatives are fairly much joined at the hip. SOA is a "way of thinking" based around the notions of service orientation ... it allows an organisation to support the high level business capabilities that keep the firm in business, marrying that up to the low level technology and procedural elements.

OTOH, BPM is a business discipline that puts continuous performance improvement center stage in the way the firm is run. It involves a highly iterative approach to supporting the way systems are rolled out. From the perspective of this discussion however, BPM Suites provide the ability to orchestrate services in line with corporate objectives.

The ROI and business benefits are clear, making business justification relatively straightforward ... it just requires understanding of how those benefits transform into enhanced productivity, customer service, traceability and transparency.

Check out the papers on my site if you want an alternative view (or much of the material available on sites such as BP Trends, BPM.com)

