BPMS, Monitoring and Business Intelligence

In every organization, activities are conducted through workflows, known as Processes. Processes can be automated to ensure that they run correctly and with maximum efficiency. This automation is achieved with a BPMS (Business Process Management Suite), which, as its name suggests, is the software that supports Business Process Management.

BPMS offers many advantages, such as work optimization, cost reduction and regulatory compliance, but one of the most significant is the ability to observe, control and analyze company activity in real time, which allows for fast, well-informed decision making.

The vast amount of information handled by BPMS

The information handled by a BPM Suite is far greater in volume than anything a company has been able to handle before, even in companies that have already reached a certain level of Process Management maturity.

Because a next generation BPM Suite manages most of the business activity, it can also handle:

  • Structured Information. All data created, modified or deleted, not only in the BPM Suite itself but also in the other company applications that have been integrated into the suite: ERP, legacy systems, etc.
  • Unstructured Information. Documents of any type that have been created, supplied by external users (customers, suppliers, etc.), filed, queried, signed or deleted. The same applies to digital content.
  • Rules Information. Business rules, strategies, procedures and regulations, both internal and mandatory, whether expressed as text or as executable rules.
  • Information about Activities. About the people, groups and roles involved in the execution of processes and Free Workflow tasks (minor or unstructured activities), and about times, usage, and planned and incurred costs, etc.

The correct implementation of a next generation BPM Suite makes it possible to convert this vast quantity of information into available knowledge.

The treatment of Information provided by BPMS

When day-to-day processes are executed with a BPMS, all the generated information is automatically recorded and organized according to a predetermined structure. This allows the Intelligence tools in the BPM Suite (BAM (Business Activity Monitoring), Dashboards, BI (Business Intelligence), KPIs (Key Performance Indicators), etc.) to automate the observation, control and analysis of the facts and data in the organization.

1. The User Activity

Any type of analysis can be obtained about employee and external user access to the suite, including statistics about their activity, any errors that occur, and so on. An example is the session log, which records the pages accessed by each user between the start and end of a session, displaying the user name, browser version, IP address and domain. It also shows statistics, by hour, day, week or month, on the pages and options visited, the images and documents downloaded, bandwidth usage, and the search engines, phrases and keywords used.
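As a minimal sketch of how such session records can feed hourly statistics (the record fields and values here are hypothetical, not taken from any particular BPMS):

```python
from collections import Counter
from datetime import datetime

# Hypothetical page-view records such as a BPM suite's access log might hold.
page_views = [
    {"user": "alice", "page": "/inbox",   "ip": "10.0.0.5", "ts": "2015-09-01T09:12:00"},
    {"user": "alice", "page": "/reports", "ip": "10.0.0.5", "ts": "2015-09-01T09:15:00"},
    {"user": "bob",   "page": "/inbox",   "ip": "10.0.0.9", "ts": "2015-09-01T10:02:00"},
]

def views_per_user_per_hour(views):
    """Count page views grouped by (user, hour) for hourly usage statistics."""
    counts = Counter()
    for v in views:
        hour = datetime.fromisoformat(v["ts"]).strftime("%Y-%m-%d %H:00")
        counts[(v["user"], hour)] += 1
    return counts

stats = views_per_user_per_hour(page_views)
print(stats[("alice", "2015-09-01 09:00")])  # 2
```

The same grouping key, swapped for day, week or month, yields the other periodic breakdowns mentioned above.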

2. The Document Activity

BPMS can promptly provide each user with their own personalized documents and reports.

To optimize queries, different views can be created depending on the desired criteria. Documents from several libraries can all be displayed in the same view. The columns and filters are configurable, and a great deal of information about each document can be obtained from its details window.

From the details window we can view the complete log of the document lifecycle, which shows the date and time of every action performed by the users (creation, modification, reading, deletion and digital signature); whether each action was manual (performed by a user) or automatic (performed by the system); and where the action was carried out (process task, library, family element, personal role, web service, external application, etc.).
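A document lifecycle log of this kind can be sketched as a simple audit trail; the event fields and identifiers below are illustrative assumptions, not an actual BPMS schema:

```python
from dataclasses import dataclass

@dataclass
class DocEvent:
    doc_id: str
    action: str      # creation, modification, reading, deletion, signature
    mode: str        # "manual" (a user) or "automatic" (the system)
    origin: str      # process task, library, web service, ...
    timestamp: str   # ISO 8601, so lexical sort equals chronological sort

log = [
    DocEvent("D-17", "creation",     "manual",    "process task", "2015-09-01T08:00"),
    DocEvent("D-17", "modification", "automatic", "web service",  "2015-09-01T08:30"),
    DocEvent("D-42", "creation",     "manual",    "library",      "2015-09-01T09:00"),
]

def lifecycle(doc_id, events):
    """Return the full, time-ordered audit trail for one document."""
    return sorted((e for e in events if e.doc_id == doc_id),
                  key=lambda e: e.timestamp)

trail = lifecycle("D-17", log)
print([e.action for e in trail])  # ['creation', 'modification']
```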

3. The Management Activity

In the day-to-day running of the company, each executed process records all the information generated throughout its execution, both that entered manually by users and that generated automatically by the system or through interaction with external applications.

Meanwhile, KPIs (Key Performance Indicators) register specific values in order to analyze the performance of the processes and support decision making. Any type of KPI can be designed with a next generation BPMS.
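A KPI that registers values and compares them against a target can be sketched as follows; the indicator name and target are hypothetical examples:

```python
class KPI:
    """Hypothetical key performance indicator that registers process values."""
    def __init__(self, name, target):
        self.name, self.target, self.values = name, target, []

    def register(self, value):
        """Record one measured value, e.g. at a process control point."""
        self.values.append(value)

    def average(self):
        return sum(self.values) / len(self.values)

    def on_target(self):
        """True when the running average stays within the target."""
        return self.average() <= self.target

# e.g. hours to complete an order process, target 48 h
cycle_time = KPI("order-cycle-time-hours", target=48)
for v in (40, 52, 44):
    cycle_time.register(v)
print(cycle_time.average(), cycle_time.on_target())
```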

Depending on the type of information, it will be registered and analyzed in a specialized manner by one of the following mechanisms:

System Controls

When the system detects an anomaly, it automatically generates alerts, alarms or warning notifications, and may even block the affected processes until the problem is resolved.
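The escalation from notification to blocking can be sketched as a threshold check; the metric and thresholds below are invented for illustration:

```python
def control_action(value, warn_at, alarm_at, block_at):
    """Hypothetical system control: escalate from warning to alarm to blocking."""
    if value >= block_at:
        return "block"    # process halted until the problem is resolved
    if value >= alarm_at:
        return "alarm"
    if value >= warn_at:
        return "warning"
    return "ok"

# e.g. pending-task backlog monitored by the engine
print(control_action(3, warn_at=5, alarm_at=10, block_at=20))   # ok
print(control_action(12, warn_at=5, alarm_at=10, block_at=20))  # alarm
```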


Dashboard

The Dashboard includes sets of queries and reports created by the user to observe and control the execution of processes while they are running. BPMS offers at least three types of query:

  1. Process Queries. To query the executed processes under the desired criteria, in Views that the user can easily design.
  2. Time Queries. To display process completion times under several criteria: processes, tasks, control points (KPIs), spans, etc., with control of deadlines, critical levels, thresholds, and margin alerts and alarms.
  3. Execution Control. This allows exhaustive control of every running or terminated process. The diagram of the Process Class can be consulted, along with the Tracking, which reproduces the flow of the process. The Tracking displays all the objects that the process flow has passed through, including all field values, comment logs, documents and timing data.
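The deadline control behind a Time Query can be sketched as a simple classification; the margin value and status names are assumptions for illustration:

```python
def deadline_status(elapsed_h, deadline_h, critical_margin=0.2):
    """Classify a running process against its deadline (a Time Query sketch)."""
    if elapsed_h > deadline_h:
        return "overdue"          # deadline passed: alarm
    if elapsed_h > deadline_h * (1 - critical_margin):
        return "critical"         # inside the margin threshold: alert
    return "on-track"

# e.g. a 48-hour deadline with a 20% alert margin
print(deadline_status(30, 48))  # on-track
print(deadline_status(40, 48))  # critical
print(deadline_status(50, 48))  # overdue
```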


Business Intelligence

Business Intelligence includes sets of queries and reports created by the user to perform individual and statistical analyses of the terminated processes. 

With Business Intelligence we can obtain statistics, reports, process status and BPMS engine activity, analyzing key indicators with drill-down techniques and BI (Business Intelligence) tools based on OLAP cubes. The information obtained can be displayed in tables or charts (pie, bar, etc.), or customized to suit the user.
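The core of OLAP-style drill-down is aggregation along chosen dimensions; a minimal sketch, with invented fact records and dimension names:

```python
from collections import defaultdict

# Hypothetical terminated-process facts (dimensions: dept, month; measure: duration).
facts = [
    {"dept": "sales",   "month": "2015-08", "duration_h": 30},
    {"dept": "sales",   "month": "2015-08", "duration_h": 50},
    {"dept": "finance", "month": "2015-08", "duration_h": 20},
]

def rollup(rows, *dims):
    """Average the duration measure along the chosen dimensions.

    Adding a dimension to the call is the drill-down step.
    """
    groups = defaultdict(list)
    for r in rows:
        groups[tuple(r[d] for d in dims)].append(r["duration_h"])
    return {k: sum(v) / len(v) for k, v in groups.items()}

print(rollup(facts, "dept"))            # coarse view: per department
print(rollup(facts, "dept", "month"))   # drill down: per department and month
```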

But above all, BPMS promptly provides each user with all the necessary, personalized information, automating the observation, control and analysis of the facts and data generated in the organization.


More Stories By Nina Moon

