Processes present a structural approach to providing services and products, but they also reflect how internal business is managed. If a process is not well defined, or not acknowledged by the employees who use it, it can affect the organization at its core.
Sometimes, even if users understand a company’s processes, they are not fully aware of their complexity. Because these workflows run daily, people can lose the sense of what could go wrong within a process. They think in ideal terms and assume a process is simple, while the reality can be completely different.
What we have realized while working on business process management projects is that there are always three different perceptions of a process: the ideal, the expected, and the real. These perceptions are key to process improvement.
Understanding different perceptions of a process
An ideal process model is what a customer desires or imagines the process should look like. This perception is usually very simple, like the one in the picture below: there are basically no alternative scenarios or conditions that complicate the workflow. An ideal process model is usually perceived as a straightforward sequence of stages, with no branching.
An expected process model is the way users perceive a process from their own viewpoint. More specifically, users form subjective perceptions based on their level of engagement in the process, so we end up with various versions of the process depending on each user’s involvement in it.
The potential challenge with this notion is that users believe a specific process works perfectly and cannot anticipate the issues that can occur. That is why we need a data-driven analysis that gives us the right process insights in order to improve.
To achieve that and get a clear picture of processes, we need to analyze them using valid evidence: event logs. This data ensures we get a realistic view of a specific process – the real process model.
We can identify the real process model by taking event logs from a process and running a process mining algorithm, usually as part of a process mining tool (we use ABBYY Timeline and Process Advisor), which reconstructs the process from data.
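To illustrate the core idea (not the specific algorithm any particular tool uses), here is a minimal sketch of mining a process model from an event log. The log, the case IDs, and the activity names are hypothetical; the sketch counts which activity directly follows which, per case, which is the simplest form of a discovered process model.

```python
from collections import defaultdict

# Hypothetical event log as (case_id, activity) pairs,
# already ordered by timestamp within each case.
event_log = [
    ("c1", "Receive order"), ("c1", "Check stock"), ("c1", "Ship"),
    ("c2", "Receive order"), ("c2", "Check stock"),
    ("c2", "Request restock"), ("c2", "Check stock"), ("c2", "Ship"),
]

def mine_directly_follows(log):
    """Group events into per-case traces, then count how often
    one activity is directly followed by another."""
    traces = defaultdict(list)
    for case_id, activity in log:
        traces[case_id].append(activity)
    edges = defaultdict(int)
    for trace in traces.values():
        for a, b in zip(trace, trace[1:]):
            edges[(a, b)] += 1
    return dict(edges)

real_model = mine_directly_follows(event_log)
```

Even on this tiny log, the mined model exposes a branch ("Check stock" can lead to "Request restock" and back) that an ideal, linear model would hide.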
Such a perspective shows how the process truly works; in most cases it contains complex scenarios and conditions, as the picture below shows.
Process improvement – from the real to the ideal process
In order to get from the real to the ideal process, the first thing to do is to collect all three perspectives:
- the real process model based on event logs,
- an expected process model created from users’ input, and
- an ideal process that is the desired outcome of the project.
Then, we can compare their similarities and determine a roadmap to get from the actual to the ideal process.
Conformance checking – comparing the process models
Such a comparison is called conformance checking. First, we interview the users to capture their perception of the process, and then compare their input with the process mined from the event logs. Analyzing the expected version against the real one is necessary to understand how much the two versions of the same process differ, and what should be improved to align one with the other.
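A very simplified sketch of this comparison, assuming both models are reduced to sets of hypothetical (activity, next activity) transitions: the real model comes from the mined event logs, the expected one from user interviews, and the differences point at what needs attention.

```python
def conformance_check(expected_edges, real_edges):
    """Compare an expected model with a mined one, both given as
    sets of (activity, next_activity) transitions."""
    expected, real = set(expected_edges), set(real_edges)
    return {
        "unexpected": real - expected,   # observed, but users did not expect it
        "unobserved": expected - real,   # expected, but never seen in the logs
        "confirmed": expected & real,    # both views agree
    }

# Hypothetical models: users expect a linear flow, the logs show a rework loop.
expected = {("Receive order", "Check stock"), ("Check stock", "Ship")}
real = {("Receive order", "Check stock"), ("Check stock", "Request restock"),
        ("Request restock", "Check stock"), ("Check stock", "Ship")}
report = conformance_check(expected, real)
```

Real conformance checking in process mining tools is considerably richer (it replays whole traces, not just single transitions), but the output serves the same purpose: making the gap between the expected and the real process explicit.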
Aspects of the organization that affect the improvement of the process
The path from the actual to the ideal process depends on many aspects, such as:
- the willingness of an organization to change and improve its functionalities,
- adjusting technology solutions towards today’s expectations or needs,
- redefining objectives and opportunities within a process, and
- adjusting process complexity.
In most cases, the situation leads to a compromise between the ideal and the actual, resulting in an improved expected process. To achieve a process that no longer causes issues, it’s important to include these aspects of process improvement:
- Variant prioritization
Each process consists of different scenarios that are defined by specific conditions between events. Most of these conditions are business rules defined by the organization’s needs. However, it’s important to narrow down to only the business scenarios that bring significant value to the outcome. Therefore, by analyzing and comparing individual process variants, we can prioritize the ones that occur more often and bring significant value, while removing or restricting scenarios that do not contribute to the final outcome.
- Process filtering
The main goal is to remove any redundancy within events, conditions, or resources in a process. To do so, it’s important to understand the main values and priorities among process objects. With a transparent definition of priorities and of which objects most affect positive process outcomes, we can remove redundant steps or resources. In addition, some scenarios within a process could be driven by repetition rather than by decision-making. Such loops, which happen due to a misunderstanding of process capabilities, should be inspected and removed.
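One way to surface such repetition loops, as a minimal sketch over hypothetical per-case traces: flag every activity that occurs more than once within the same case, since repeated activities often indicate rework rather than a deliberate decision.

```python
from collections import Counter

# Hypothetical traces: one tuple of activities per completed case.
traces = [
    ("Receive", "Check", "Ship"),
    ("Receive", "Check", "Restock", "Check", "Ship"),
    ("Receive", "Check", "Restock", "Check", "Restock", "Check", "Ship"),
]

def find_rework(traces):
    """Count, per activity, the number of cases in which it occurs
    more than once -- a signal of repetition loops worth inspecting."""
    rework = Counter()
    for trace in traces:
        for activity, n in Counter(trace).items():
            if n > 1:
                rework[activity] += 1
    return rework

rework = find_rework(traces)
```

Activities that show up here are candidates for inspection; whether a repetition is wasteful rework or a legitimate business rule is a judgment the analyst and the client make together.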
Taking these aspects into consideration, identifying the values and priorities of the process will result in an improved process, a better understanding of it, and, consequently, a more productive environment.
In most cases, the ideal process is too simplified to be adopted by the organization; a process model should always be designed according to the organization’s real capabilities. On the other hand, the actual process model based on event logs may seem too complex.
So what is the solution? The actual process model can be refined by focusing only on the most important variants of the process. The prioritized scenarios should be the ones that occur daily and bring value to the organization.
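This refinement step can be sketched as a simple frequency cut over hypothetical traces: keep the most common variants until they cover a chosen share of all cases, and set the rare ones aside for separate inspection. The coverage threshold is an assumption for illustration.

```python
from collections import Counter

# Hypothetical traces, one tuple of activities per case.
traces = [
    ("Receive", "Check", "Ship"),
    ("Receive", "Check", "Ship"),
    ("Receive", "Check", "Ship"),
    ("Receive", "Check", "Restock", "Check", "Ship"),
    ("Receive", "Cancel"),
]

def top_variants(traces, coverage=0.8):
    """Keep the most frequent variants until the selection covers
    at least `coverage` of all observed cases."""
    counts = Counter(traces)
    total = len(traces)
    kept, covered = [], 0
    for variant, n in counts.most_common():
        kept.append((variant, n))
        covered += n
        if covered / total >= coverage:
            break
    return kept

selected = top_variants(traces)
```

The refined model built from `selected` stays faithful to the event logs while being far closer in simplicity to the ideal model the organization can actually work with.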
It’s crucial to be aware of both the client’s and the analyst’s perspectives while analyzing the process, because they come from different domains and perceive the process differently. In the end, there is no proper process understanding without a data-driven analysis that extracts event logs and models the process from them.
For process analysis, we use ABBYY Timeline, a process mining tool that reconstructs the original process instances step by step from event logs, even if they come from multiple sources. We get fast and accurate results to interpret and act on.