"The Backdoor" – BPM Solutions Pay

"I tend to like to go with technology because it makes sense"

People who know me would generally agree I'm a straightforward guy - I pretty much just like to move in the direction I've said I was going, rather than try to move from side to side and finesse something. So when it comes to technology, I tend to like to go with technology because it makes sense, and I usually assume that most IT organizations work that way as well.

But when you look at a technology like Business Process Management (BPM), you can see that the straightforward approach may not be the best, fastest, or even most successful route towards deployment.

BPM is a tougher sell on a straight technology basis: it relies on either an SOA or an EAI environment that enables a service approach, and the capabilities it provides have, to date, typically been implemented, albeit poorly, directly in application code.

BPM as a technology extracts an organization's business rules, using advanced modeling techniques and software to define the "what happens when" outside of lower-level code. Besides allowing for rapid change in response to changing business conditions, BPM also lets the business community take a much greater role in defining the behavior of their software environments.
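To make that concrete, here is a minimal sketch in plain Java (hypothetical rule and field names, not any particular BPM product) of what defining the "what happens when" outside of lower-level code can look like: the rules are data evaluated by a generic loop, rather than conditionals buried in the order-handling code, so changing a rule does not mean changing and redeploying that code.

```java
import java.util.List;
import java.util.function.Predicate;

public class RuleSketch {
    // A business rule pairs a condition ("when") with a routing decision
    // ("then"); both are defined outside the order-handling logic.
    record Order(double total, boolean newCustomer) {}
    record Rule(String name, Predicate<Order> when, String thenRoute) {}

    public static void main(String[] args) {
        // In a real BPM suite these definitions would live in a model or
        // rules repository that business analysts can edit; they are
        // inlined here only to keep the sketch self-contained.
        List<Rule> rules = List.of(
            new Rule("large-order-review", o -> o.total() > 10_000, "manual-review"),
            new Rule("new-customer-check", Order::newCustomer, "credit-check"),
            new Rule("default", o -> true, "auto-approve")
        );

        Order order = new Order(12_500, false);
        for (Rule rule : rules) {
            if (rule.when().test(order)) {          // first matching rule wins
                System.out.println("Route order via: " + rule.thenRoute());
                break;
            }
        }
    }
}
```

The point is not the loop itself but where the rule definitions live: in a table or model the business can see and change, rather than in compiled application code.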

Clearly, this type of capability can be an asset to an organization that is confronted with frequent change, dynamic market conditions, or even the consequences of a merger between organizations with disparate computing systems. Yet because of the way IT projects are usually funded, this capability is frequently a difficult sell.

Most IT organizations have to fund their projects as discrete systems: you can fund a CRM system, an Order Management system, a General Accounting system, even a User Portal. Each of these systems provides an "end user" benefit, one that can be easily quantified and budgeted for. BPM, by contrast, potentially cuts across all of these systems while providing little visible or tangible benefit of its own.

That's at least partially because project funding usually covers the development effort while neglecting the operational and maintenance cost of a system. Over the lifetime of a system, those costs can be multiples of the original implementation cost. As an example, think of some of the COBOL programs that many large organizations have been nursing along for decades; compared to the cost of creation, the maintenance costs are many times higher.

This is where BPM solutions pay: they help reduce operational and maintenance cost. Anything that is programmed has to be tested to death, deployed, and managed. The model-driven architecture (MDA) approach seldom actually works all the way down to the code level and back again, so even when there is some modeling or design up front, it is typically just documentation by the time the coding gets done, allowing errors and omissions to creep into the process and creating troubleshooting nightmares.

In contrast, BPM presents the rules in a modeling environment that is fully round-trip and can be tested and debugged more effectively, especially in the difficult cases where a business transaction has to cross system boundaries. We've all experienced the "he said, she said" finger-pointing that goes on when a process spanning two or more systems runs into trouble. BPM reduces cost by taking the management, the modeling, and the implementation out of multiple silo-based systems and centralizing the capabilities needed to implement business processes rapidly and effectively.
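As a rough illustration of what centralizing a cross-system process buys you, here is a minimal sketch in plain Java (hypothetical step and system names) of a single orchestrator that owns both the order of the steps and the execution trace, instead of each silo system holding its own fragment of the process and its own log.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.UnaryOperator;

public class OrderProcessSketch {
    public static void main(String[] args) {
        // Each step would normally call out to a separate system
        // (inventory, billing, CRM); simple stubs stand in for them here.
        Map<String, UnaryOperator<String>> steps = new LinkedHashMap<>();
        steps.put("reserve-inventory", state -> state + ":reserved");
        steps.put("bill-customer",     state -> state + ":billed");
        steps.put("notify-crm",        state -> state + ":crm-updated");

        // Because the process definition and its trace live in one place,
        // a failure points at a single step in a single log rather than
        // at logs scattered across the participating systems.
        String state = "ORD-42";
        for (Map.Entry<String, UnaryOperator<String>> step : steps.entrySet()) {
            state = step.getValue().apply(state);
            System.out.println("step=" + step.getKey() + " state=" + state);
        }
    }
}
```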

It should not be surprising that the calculations necessary to quantify this benefit are convoluted. They require analysis of maintenance and operations, as well as a good understanding of the actual software development life cycle in use in a particular organization - something that is seldom present. Thus, while the technology clearly provides benefits, quantifying its value and justifying its cost remain elusive. In the end, the straightforward approach to the problem, which is simply stating the need for the capability, must give way to a more devious approach that builds the capability into the price of one or more system upgrades or packaged implementations.

More Stories By Sean Rhody

Sean Rhody is the founding editor (1999) and editor-in-chief of SOA World Magazine. He is a respected industry expert on SOA and Web Services and a consultant with a leading consulting services company. Most recently, Sean served as the tech chair of SOA World Conference & Expo 2007 East.


Most Recent Comments
Derek Miers 02/15/06 08:27:44 AM EST

Naive, ill-informed, confused ... just three of the adjectives that spring to mind while reading this piece.

There probably isn't a Fortune 2000 company that is not gaining significant benefits from the deployment of BPM technology. Mostly these projects are still in the mode of strategic experiment, but the evidence is plain for all to see.

Some of Sean's assertions are just plain wrong (like the bit about BPM code being poorly implemented), while others show a lack of understanding of the real issues and benefits.

Suggest you do a little more research next time before wading into an area that you obviously have little contact with.

IMNSHO, SOA and BPM initiatives are fairly much joined at the hip. SOA is a "way of thinking" based around the notions of service orientation ... it allows an organisation to support the high level business capabilities that keep the firm in business, marrying that up to the low level technology and procedural elements.

OTOH, BPM is a business discipline that puts continuous performance improvement center stage in the way the firm is run. It involves a highly iterative approach to supporting the way systems are rolled out. From the perspective of this discussion however, BPM Suites provide the ability to orchestrate services in line with corporate objectives.

The ROI and business benefits are clear, making business justification relatively straightforward ... it just requires understanding of how those benefits transform into enhanced productivity, customer service, traceability and transparency.

Check out the papers on my site if you want an alternative view (or much of the material available on sites such as BP Trends, BPM.com)

