
BPMS, Monitoring and Business Intelligence

In every organization, activities are conducted through workflows. These workflows are known as Processes. Processes can be automated to ensure that they run correctly and with maximum efficiency. This automation is achieved with a BPMS (Business Process Management Suite), which, as its name suggests, is the software that supports Business Process Management.

A BPMS offers many advantages, such as work optimization, cost reduction and regulatory compliance, but one of the most important is the real-time observation, control and analysis of company activity, which allows for fast, well-informed decision making.

The vast amount of information handled by BPMS

The information handled by a BPM Suite far exceeds anything a company has been able to handle before, even if the company has already reached a certain level of Process Management maturity.

Because a next generation BPM Suite manages most of the business activity, it can also handle:

  • Structured Information. All data created, modified or deleted, not only from the BPM Suite but also from the other applications used in the company that have been integrated with it: ERP, legacy systems, etc.

  • Non-Structured Information. Documents of any type that have been created, supplied by external users (customers, suppliers, etc.), filed, queried, signed or deleted. The same applies to digital content.

  • Rules Information. Business rules, strategies, procedures and regulations, both internal and compulsory, whether textual or mechanical in nature.

  • Information about Activities. About the people, groups and roles that intervene in the execution of processes and Free Workflow tasks (minor or unstructured activities); and about times, usage, and foreseen and incurred costs.

The correct implementation of a next generation BPM Suite makes it possible to convert this vast quantity of information into available knowledge.

The treatment of Information provided by BPMS

When day-to-day processes are executed with a BPMS, all the generated information is automatically recorded and organized according to a predetermined structure. This allows the Intelligence tools in the BPM Suite, such as BAM (Business Activity Monitoring), the Dashboard, BI (Business Intelligence) and KPIs (Key Performance Indicators), to automate the observation, control and analysis of the facts and data in the organization.

1. The User Activity

Any type of analysis can be obtained about employee and external user access to the suite, including statistics about their activity, any errors that occur, etc. An example is the session start/end log, which records the pages accessed by each user, displaying the name, browser version, IP address and domain. It also shows statistics, by hour, day, week or month, of the pages and options visited, the images and documents downloaded, bandwidth usage, search engines, phrases and keywords used, etc.
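The hourly/daily session statistics described above can be sketched with a few lines of Python. This is a minimal illustration, not any specific BPM Suite's API; the log field names (`user`, `page`, `ts`) are assumptions.

```python
from collections import Counter
from datetime import datetime

# Hypothetical session-log records; field names are illustrative only.
session_log = [
    {"user": "alice", "page": "/inbox",  "ts": "2015-06-09T09:15:00"},
    {"user": "alice", "page": "/tasks",  "ts": "2015-06-09T09:40:00"},
    {"user": "bob",   "page": "/inbox",  "ts": "2015-06-09T14:05:00"},
    {"user": "bob",   "page": "/report", "ts": "2015-06-10T10:30:00"},
]

def page_views_by_day(log):
    """Count page views per calendar day, as a BAM-style statistic."""
    return Counter(datetime.fromisoformat(r["ts"]).date().isoformat() for r in log)

def top_pages(log, n=3):
    """Most visited pages across all sessions."""
    return Counter(r["page"] for r in log).most_common(n)
```

The same grouping pattern extends to per-user, per-week or per-browser breakdowns by changing the key function.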

2. The Document Activity

The BPMS can promptly provide each user with their own personalized documents and reports.

To optimize queries, different views can be created depending on the desired criteria. Documents from several libraries can all be displayed in the same view. The columns and filters are configurable, and a great deal of information can be obtained about each document.

From the document details window we can view the complete log of the document lifecycle, which records the date and time of every action performed by users (creation, modification, reading, deletion and digital signature); the type of action, either manual (performed by a user) or automatic (performed by the system); and where the action was carried out from (process task, library, family element, personal role, web service, external application, etc.).
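A document lifecycle log of this kind can be modeled as an append-only list of entries. The sketch below is a hypothetical data structure following the categories described above; the action and origin vocabularies are assumptions, not a product's actual schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

# Illustrative vocabularies, taken from the categories in the text.
ACTIONS = {"creation", "modification", "reading", "deletion", "signature"}
ORIGINS = {"process task", "library", "web service", "external application"}

@dataclass
class LogEntry:
    user: str
    action: str        # one of ACTIONS
    automatic: bool    # True if performed by the system, False if by a user
    origin: str        # one of ORIGINS
    ts: str

@dataclass
class Document:
    name: str
    log: List[LogEntry] = field(default_factory=list)

    def record(self, user, action, origin, automatic=False):
        """Append one lifecycle event; entries are never modified."""
        assert action in ACTIONS and origin in ORIGINS
        self.log.append(LogEntry(user, action, automatic, origin,
                                 datetime.now(timezone.utc).isoformat()))

    def history(self):
        """Full lifecycle log, oldest first."""
        return [(e.ts, e.user, e.action, e.origin) for e in self.log]
```

Keeping the log append-only is what makes it usable as an audit trail: every read, signature and deletion stays on record.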

3. The Management Activity

In the day-to-day running of the company, each executed process records all the information generated throughout execution, both manually introduced by the users and automatically generated by the system or through interaction with external applications.

Meanwhile, KPIs (Key Performance Indicators) record specific values so that process performance can be analyzed for decision making. Any type of KPI can be designed with a next generation BPMS.

Depending on the type of information, it will be registered and analyzed in a specialized manner by one of the following mechanisms:

System Controls

When the system detects an anomaly, it automatically generates alerts, alarms or warning notifications, and it may even block the affected processes until the problem is resolved.
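The escalation logic behind such a control can be reduced to a threshold check. This is a minimal sketch under assumed threshold names (`warn_at`, `block_at`); real suites configure these per KPI.

```python
# One KPI reading is compared against two assumed thresholds:
# above warn_at a notification is issued, above block_at the
# process instance is blocked until the problem is resolved.

def evaluate_control(kpi_value, warn_at, block_at):
    """Return (status, blocked) for one KPI reading."""
    if kpi_value >= block_at:
        return "alarm", True      # anomaly: block the process
    if kpi_value >= warn_at:
        return "warning", False   # notify, but keep running
    return "ok", False
```

A production system would attach notification channels and an unblock action to each status, but the decision itself is this simple.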

Dashboard

The Dashboard includes sets of queries and reports created by the user to observe and control the execution of processes while they are running. A BPMS offers at least three types of query:

  1. Process Queries. To query executed processes under the desired criteria, in Views that the user can easily design.
  2. Time Queries. To display process completion times under several criteria (processes, tasks, control points (KPIs), spans, etc.), with control of deadlines, critical levels, thresholds, and margin alerts and alarms.
  3. Execution Control. This allows exhaustive control of every running or terminated process. The diagram of the Class of Process can be consulted, along with the Tracking that reproduces the flow of the process. This Tracking displays all the objects the process instance has passed through, including all field values, comment logs, documents and timing data.
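A Time Query of the kind listed above boils down to computing elapsed time per process instance and comparing it with a deadline. The sketch below is illustrative; the record fields (`id`, `start`, `end`) and the single-deadline model are assumptions.

```python
from datetime import datetime

def time_query(instances, deadline_hours):
    """Completion time per process instance, flagged against a deadline."""
    rows = []
    for p in instances:
        start = datetime.fromisoformat(p["start"])
        end = datetime.fromisoformat(p["end"])
        hours = (end - start).total_seconds() / 3600
        rows.append({"id": p["id"],
                     "hours": round(hours, 1),
                     "late": hours > deadline_hours})
    return rows

# Hypothetical instances of one Class of Process.
instances = [
    {"id": "P-1", "start": "2015-06-09T09:00:00", "end": "2015-06-09T17:30:00"},
    {"id": "P-2", "start": "2015-06-09T09:00:00", "end": "2015-06-10T12:00:00"},
]
```

The `late` flag is where the deadline alerts and alarms mentioned above would hook in; critical levels and margins are just additional thresholds on the same `hours` value.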

 

Business Intelligence

Business Intelligence includes sets of queries and reports created by the user to perform individual and statistical analyses of the terminated processes. 

With Business Intelligence we can obtain statistics, reports, process status and BPMS engine activity, analyzing key indicators with drill-down techniques and BI (Business Intelligence) tools based on OLAP cubes. The information obtained can be displayed in tables or charts (pie, bar, etc.), or customized to suit the user.
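The drill-down idea can be sketched as a roll-up over a fact table: sum a measure along a hierarchy of dimensions, adding one dimension per drill-down level. The fact records and field names below are illustrative, not a real cube schema.

```python
from collections import defaultdict

def rollup(facts, *dims):
    """Sum the 'cost' measure grouped by the given dimension columns."""
    totals = defaultdict(float)
    for f in facts:
        totals[tuple(f[d] for d in dims)] += f["cost"]
    return dict(totals)

# Hypothetical facts from terminated processes.
facts = [
    {"dept": "Sales",   "process": "Order",   "cost": 120.0},
    {"dept": "Sales",   "process": "Order",   "cost": 80.0},
    {"dept": "Sales",   "process": "Invoice", "cost": 50.0},
    {"dept": "Finance", "process": "Invoice", "cost": 70.0},
]

# rollup(facts, "dept") gives department totals; adding "process"
# drills down one level into each department.
```

An OLAP engine precomputes and indexes these aggregates across many dimensions at once, but each drill-down step is conceptually this group-and-sum.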

But above all, a BPMS promptly provides each user with all the necessary, personalized information, automating the observation, control and analysis of the facts and data generated in the organization.


More Stories By Nina Moon

