Towards Reliable Business Process Simulation:
A Framework to Integrate ERP Systems
Gyunam Park and Wil M.P. van der Aalst
Process and Data Science Group (PADS), Department of Computer Science,
RWTH Aachen University, Aachen, Germany
{gnpark,wvdaalst}@pads.rwth-aachen.de
Summary. A digital twin of an organization (DTO) is a digital replica-
tion of an organization used to analyze weaknesses in business processes
and support operational decision-making by simulating different scenar-
ios. As a key enabling technology of DTO, business process simulation
provides techniques to design and implement simulation models that
replicate real-life business processes. Existing approaches have been fo-
cusing on providing highly flexible design tools and data-driven evidence
to improve the accuracy of simulation models. Provided with such tools
and evidence, business analysts are required to reflect comprehensive as-
pects of reality with subjective judgments, including the design of ERP
systems and the organizational interaction with the system. However,
given the complexity of ERP systems, it is infeasible and error-prone to
manually incorporate the business logic and data restrictions of the sys-
tem into simulation models, impeding the faithfulness and reliability of
the following analysis. In this work, we propose a framework to integrate
ERP systems in business process simulation to overcome this limitation
and ensure the reliability of the simulation results. The framework is
implemented in ProM using the SAP ERP system and CPN Tools.
Key words: Digital Twin, Business Process Simulation, ERP Systems,
Business Process Improvement, Process Mining
1 Introduction
A digital twin of an organization (DTO) is a digital representation of business
processes and assets across an organization. By simulating this mirrored representation,
business analysts can identify operational frictions in the organization
with data-based analytics like process mining [1] and evaluate the efficiency of
decisions that are too expensive or dangerous to experiment with in reality, e.g.,
assigning more resources to a task or increasing the capacity of machines.
Business process simulation is a key enabling technology of DTOs. A simulation
model represents reality in a simplified manner and generates hypothetical
instances of the business process, enabling the simulation of various scenarios
and “what-if” analyses. Many tools have been developed to support the design
and implementation of simulation models, including Arena and CPN Tools [2].
The successful implementation of DTOs using business process simulation relies
on how accurately the simulation model represents reality. In particular, given
the increasing relevance of Enterprise Resource Planning (ERP) systems in
organizations, it is essential to accurately model the business logic and data
restrictions of ERP systems (e.g., SAP, Salesforce, and Microsoft Dynamics).
Traditional simulation approaches have left these aspects to the subjective
judgment of domain experts, focusing instead on providing easy-to-use design
tools with high flexibility. Despite such flexible design tools, it is infeasible and
error-prone to design and implement simulation models that reflect the complex
design of ERP systems, making the simulation results unconvincing and
unreliable [3]. For instance, the SAP ERP system defines more than 2,821 different
errors for violations of its business logic and data restrictions.
The violations mainly occur in two forms: omission and commission [3].
The former occurs when simulation models fail to reflect behaviors required
by the system. For instance, in the SAP ERP system, each material has a different
warehouse management policy and thus a different control flow (i.e.,
sequence of tasks). Although the control flow should be modeled depending
on the policy, simulation models often omit this distinction, resulting in a large
difference in the simulated behaviors. Conversely, the commission problem occurs
when simulation models simulate behaviors not allowed in the system.
For instance, ERP systems are built upon data restrictions that limit arbitrary
creations, deletions, and updates of data: price changes (i.e., updates of price
information) are not allowed for specific types of items, yet simulation models
often ignore this restriction.
In this work, we propose a framework that integrates ERP systems into business
process simulation to 1) effectively incorporate the complex design of the system
into simulations without modeling it in simulation models and 2) provide
simulated event data free of omission and commission issues. Figure 1 conceptually
describes how the proposed framework integrates ERP systems into
business process simulation and how it realizes DTOs.
Fig. 1: A conceptual overview of integrating ERP systems in business process simu-
lation
As shown in Figure 1, ERP systems consist of three layers: the presentation,
application, and database layers. In reality, human resources interact with the
system through the presentation layer. The commands issued by human resources
are processed according to the business logic defined in the application layer,
and the resulting records are stored in the database layer. Event data, e.g., Object-Centric Event
Logs (OCEL)1, are extracted from the database layer and used for analysis, e.g.,
process discovery and root-cause analysis.
In the proposed framework, simulation models, designed by analyzing interactions
in the presentation layer, produce commands replicating the behaviors
of human resources. We execute the commands directly in the application layer
of the ERP system, ensuring the absence of the omission and commission
problems. The simulated event data are extracted from the database layer in the
same manner as real-life event data. The simulated data may provide feedback
to improve the design of the simulation model and adapt the model to changing
circumstances.
Since the simulated and real-life data are extracted from the same database
in the same manner, they can be analyzed using the same process mining tool.
Besides, they support action-oriented process mining [4], which transforms insights
from diagnostics into actions, by monitoring undesired behaviors using
real-life data and evaluating the effects of actions using simulated data (e.g.,
A/B testing between different actions).
To show the effectiveness of the proposed framework, we implement it using
the SAP ERP system and CPN Tools. We have tested the implementation
by simulating common scenarios in the order-to-cash process of the SAP ERP
system and extracting process-centric insights from the simulated event data.
The remainder of this paper is organized as follows. We discuss related work in Sect. 2.
Next, we present the framework for integrating ERP systems into business process
simulation and its implementation in Sect. 3 and Sect. 4, respectively. Sect. 5
presents the validation of the framework, and Sect. 6 concludes the paper.
2 Related Work
Simulation has been adopted for analyzing business processes since the seventies [5].
Nowadays, various simulation tools are available to design and implement
simulation models. In [6], simulation tools are classified into two types:
simulation languages and simulation packages. The former are programming languages
supporting the implementation of simulation models, such as Simula
and GPSS [7]. The latter are tools providing graphical building blocks to enable
the rapid creation of simulation models, such as ARENA and CPN Tools [8].
Data-driven approaches have been proposed to provide conceptual guidance
for designing simulation models. Martin et al. [9] identify modeling tasks (e.g., modeling
gateways, modeling activities) and present the relevant process mining
techniques to support each of them. In [10], the authors utilize the
process model discovered using process mining techniques as a reference for
the current state of the process and design simulation models with re-engineered
1 http://ocel-standard.org/
business processes by manually identifying the possible improvement points from
the reference model.
Furthermore, several techniques have been suggested to automatically discover
simulation models from event data. Rozinat et al. [11] discover simulation
models based on Colored Petri Nets (CPNs) in a semi-automated manner.
Camargo et al. [12] propose a method to optimize the accuracy of simulation
models discovered from an event log. It searches the space of all possible
configurations of the simulation model to maximize the similarity between the
discovered simulation model and the event log.
The existing approaches have focused on improving accuracy by supporting
business analysts in better reflecting reality in simulation models using domain
knowledge and data-driven insights. However, the question remains: is the
simulated behavior one that can be supported (accepted) by the underlying
ERP system? Since these approaches do not explicitly involve the system in simulations,
business analysts are left to ensure this with their subjective judgments.
In this work, we tackle this question by proposing a framework that integrates ERP
systems into the course of business process simulation.
3 A Framework for the Integration of ERP Systems
The proposed framework consists of three major components: the ERP system, the
simulation engine, and the transformation engine. The ERP system creates, updates,
and deletes objects in an object model based on executions. The simulation engine
simulates organizational behaviors in business processes and produces commands
describing these behaviors. The transformation engine translates the commands
into executions, according to which the system updates the object model.
The behavior of the simulation may, in turn, depend on the updated object model.
In the following, we explain the framework with formal definitions and examples.
3.1 ERP System
In this work, we define ERP systems in a narrow sense, focusing on their bookkeeping
purpose (i.e., creating, updating, and deleting database tables based on
transactions). To this end, we first abstract the databases in the system as object
models.
Fig. 2: We use object models to mimic the databases in ERP systems
As described in Figure 2, an object model contains objects (e.g., o1, o2, and
i1) with different types (e.g., order, item, and delivery). Also, the objects have
relationships (e.g., o1 is related to i1). Besides, each object carries attribute
values (e.g., an order has docType and customer information). We formally define
object models as follows:
Definition 1 (Object Model). Let U_o be the universe of object identifiers, U_ot
the universe of object types, U_attr the universe of attribute names, and U_val
the universe of attribute values. An object model is a tuple OM = (O, OT, OR,
otyp, oval) where
– O ⊆ U_o is a set of object identifiers,
– OT ⊆ U_ot is a set of object types,
– OR ⊆ O × O is a set of object relationships,
– otyp ∈ O → OT assigns precisely one object type to each object identifier, and
– oval : O → (U_attr ↛ U_val) is the function associating each object to its attribute
value assignments. We denote oval(o)(attr) = ⊥ if attr ∉ dom(oval(o)) for
any o ∈ O.
U_om denotes the set of all possible object models.
For instance, OM1 = (O, OT, OR, otyp, oval) describes the object model depicted
in Figure 2, where O = {o1, o2, i1, . . .}, OT = {order, item, delivery},
OR = {(o1, i1), (o1, i2), (i1, d1), . . .}, otyp(o1) = order, otyp(i1) = item,
oval(o1)(docType) = standard, oval(o1)(customer) = Christine, etc.
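To make Definition 1 concrete, an object model can be sketched as a small Python structure. This is an illustrative sketch only, not part of the paper's implementation; all names and the handling of undefined attributes as None are assumptions made for the example.

```python
from dataclasses import dataclass, field

@dataclass
class ObjectModel:
    """An object model OM = (O, OT, OR, otyp, oval) per Definition 1."""
    objects: set = field(default_factory=set)       # O
    object_types: set = field(default_factory=set)  # OT
    relations: set = field(default_factory=set)     # OR, a subset of O x O
    otyp: dict = field(default_factory=dict)        # O -> OT
    oval: dict = field(default_factory=dict)        # O -> partial map attr -> value

# OM1 from the running example in Figure 2
om1 = ObjectModel(
    objects={"o1", "i1", "i2", "d1"},
    object_types={"order", "item", "delivery"},
    relations={("o1", "i1"), ("o1", "i2"), ("i1", "d1")},
    otyp={"o1": "order", "i1": "item", "i2": "item", "d1": "delivery"},
    oval={"o1": {"docType": "standard", "customer": "Christine"}},
)

def get_attr(om, obj, attr):
    """oval(o)(attr), returning None for undefined attributes (the paper's bottom)."""
    return om.oval.get(obj, {}).get(attr)
```

For example, `get_attr(om1, "o1", "customer")` yields "Christine", while an attribute outside dom(oval(o1)) yields None.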
Definition 2 (Transaction). Let U_pval = U_attr ↛ U_val be the universe of
parameter value assignments. A transaction tr ∈ U_om × U_pval → U_om is a
function that modifies object models based on parameter value assignments. U_tr
denotes the set of all possible transactions.
For instance, po ∈ U_tr is a transaction that places an order. Assume that
pval1 ∈ U_pval is a parameter value assignment such that pval1(customer) =
Marco, pval1(docType) = quoted, pval1(item) = iPad, pval1(quantity) = 1,
etc. Then po(OM1, pval1) = (O′, OT′, OR′, otyp′, oval′) such that o3, i5 ∈ O′, (o3, i5) ∈
OR′, otyp′(o3) = order, otyp′(i5) = item, oval′(o3)(customer) = Marco, etc.
An execution specifies a transaction to be executed in the system along with
the execution time, responsible resource, and parameter value assignment.
Definition 3 (Execution). Let U_res be the universe of resources and U_time the
universe of timestamps. An execution exec ∈ U_tr × U_res × U_time × U_pval is a
tuple of a transaction, a resource, a timestamp, and a parameter value assignment.
U_exec denotes the set of all possible executions.
For instance, exec1 = (po, Adams, 10:00 23.02.2021, pval1) describes the order
placement by Adams at 10:00 23.02.2021 with the parameter value assignment pval1.
Definition 4 (ERP System). An ERP system sys ∈ U_om × U_exec → U_om
updates object models according to executions.
For instance, sys(OM1, exec1) = (O′, OT′, OR′, otyp′, oval′) such that o3, i5 ∈
O′, oval′(o3)(timestamp) = 10:00 23.02.2021, oval′(o3)(resource) = Adams, etc.
3.2 Simulation Engine
A simulation engine aims at producing commands describing which activity is
done by whom, at what time, and with what information. We formally define commands
as follows:
Definition 5 (Commands). Let U_act be the universe of activities. A command
cmd ∈ U_act × U_res × U_time × U_pval is a tuple of an activity, a resource, a timestamp,
and an information value assignment. U_cmd denotes the set of all possible commands.
For instance, cmd1 = (place_order, Adams, 10:00 23.02.2021, ival1) ∈ U_cmd,
where ival1(customer) = Marco and ival1(price) = €100, replicates the behavior
of Adams, who places an order of €100 for the customer Marco at 10:00 23.02.2021.
Since simulating the behaviors of human resources (i.e., generating commands)
shares commonalities with simulating events of business processes (e.g., place_order
occurred at 10:00 23.02.2021 by Adams), we can deploy techniques for business
process simulation to generate commands.
As presented in Sect. 2, various simulation tools are available for this purpose,
such as CPN Tools and ARENA. Remaining tool-independent, we focus on
explaining the essential components of simulation models used to produce commands. In
Sect. 4, we explain how these components are designed and implemented using
CPN Tools.
Fig. 3: Core components of business process simulation [6]
Figure 3 explains the core ingredients of business process simulation: activities,
control-flows, object requirements, value assignments, resource assignments,
arrival processes, and activity durations. Below is an explanation of
each component:
– Activities represent the behaviors that human resources perform to serve business
processes (e.g., place order, send invoice, etc.).
– Control-flows determine the sequence of activities. For instance, the process
model in Figure 3 describes one sequence of activities, i.e., send_quotation,
place_order, create_delivery, pack_items, create_invoice, and clear_invoice
in that order.
– Object requirements explain the objects required for the execution of activities.
Each activity may involve multiple objects in its execution. For instance,
the execution of create_delivery involves an order and a delivery, since the order
information is required to create the delivery.
– Value assignments specify how the execution of activities updates the information
of the involved objects. For instance, the execution of place_order
updates the document type and customer information of the order.
– Resource assignments define who is eligible to perform activities in the
business process. For instance, place_order can be performed by resources
from the sales department.
– Arrival processes and activity durations define the inter-arrival time between
case arrivals and the time required for the execution of activities, respectively.
3.3 Transformation Engine
The goal of the transformation engine is to convert commands into the executable
format supported by the underlying ERP system (i.e., executions). To
this end, we need two components: a transaction mapping and a parameter mapping.
First, the transaction mapping translates the activity in commands to a
transaction defined in the ERP system.
Definition 6 (Transaction Mapping). A transaction mapping µ_tr ∈ U_act ↛
U_tr relates transactions to activities.
Assume place_order in cmd1 corresponds to po ∈ U_tr in an ERP system. A
transaction mapping µ_tr connects them, i.e., µ_tr(place_order) = po.
Next, the parameter mapping connects the parameters in commands to the
system-defined parameters in ERP systems.
Definition 7 (Parameter Mapping). A parameter mapping µ_pr ∈ U_attr ↛
U_attr relates system-defined parameters of an ERP system to parameters in commands.
Assume dom(ival1) = {docType, customer} where (place_order, Adams, t1,
ival1) ∈ U_cmd, and DOC_TYPE and PARTN_NUMB are the corresponding parameters
defined in the system. A parameter mapping µ_pr connects the parameters, i.e.,
µ_pr(docType) = DOC_TYPE and µ_pr(customer) = PARTN_NUMB.
Given a transaction mapping and a parameter mapping, the transformation engine
transforms commands into executions by translating the activities and parameters.
Note that we assume that the resource and timestamp in commands
are compatible with those in executions.
Definition 8 (Transformation Engine). Let µ_tr be a transaction mapping
and µ_pr a parameter mapping. A transformation engine maps commands onto
executions, i.e., for any cmd = (act, res, time, ival) ∈ U_cmd, trans(µ_tr, µ_pr)(cmd)
= (µ_tr(act), res, time, pval) s.t. ∀attr ∈ dom(ival): pval(µ_pr(attr)) = ival(attr).
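Operationally, Definition 8 amounts to one dictionary lookup per activity and one per attribute. A minimal Python sketch with illustrative mapping entries (the concrete key/value pairs are examples, not a complete mapping):

```python
def transform(mu_tr, mu_pr, cmd):
    """trans(mu_tr, mu_pr)(cmd) = (mu_tr(act), res, time, pval) with
    pval(mu_pr(attr)) = ival(attr) for every attr in dom(ival)."""
    act, res, time, ival = cmd
    pval = {mu_pr[attr]: val for attr, val in ival.items()}
    return (mu_tr[act], res, time, pval)

mu_tr = {"place_order": "BAPI_SALESORDER_CREATEFROMDAT2"}
mu_pr = {"docType": "DOC_TYPE", "customer": "PARTN_NUMB"}
cmd1 = ("place_order", "Adams", "10:00 23.02.2021",
        {"docType": "TA", "customer": "Marco"})
execution = transform(mu_tr, mu_pr, cmd1)
```

The resource and timestamp pass through unchanged, reflecting the compatibility assumption stated above.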
4 Implementation
In this section, we implement the proposed framework using the SAP ERP system
as the underlying ERP system. We design and implement simulation models
using CPN Tools2. The transformation engine is implemented as a plug-in in
ProM3 and translates commands into executions in the SAP ERP system, given
a transaction mapping and a parameter mapping.
4.1 ERP System: SAP ERP ECC 6.0
The SAP ERP system is the most widely adopted ERP system, supporting more
than 400,000 businesses worldwide. In the implementation, we utilize SAP ERP
ECC 6.04, supporting Global Bike Inc., an imaginary enterprise producing and
distributing bicycle products in which all relevant SAP solutions are represented.
Fig. 4: A screenshot of the user interface in the SAP ERP ECC 6.0 (place an order)
Figure 4 shows the user interface in the presentation layer where sales
staff place orders. Given inputs such as customer, delivery date, and items
(i.e., parameters) by users (i.e., resources), the transactions in the application
layer (e.g., BAPI_SALESORDER_CREATEFROMDAT2) are executed to update the
database supported by the Oracle Database Management System (i.e., object models).
2 https://cpntools.org/
3 http://www.promtools.org
4 https://www.sap.com/
4.2 Simulation Engine: CPN Tools
CPN Tools is a toolset providing support for editing, simulating, and analyzing
Colored Petri Nets (CPNs). For detailed information on CPN Tools, we refer
readers to [8].
In the following, we explain how CPN Tools is used to implement the core
ingredients of business process simulation introduced in Subsect. 3.2, using the
example described in Figure 3. Note that there exist various design choices resulting
in different executable models in CPN Tools.
Fig. 5: A schematic overview of the CPN used to implement our simulation framework
Figure 5 shows the overall composition, consisting of multiple CPN pages,
i.e., an overview page, an environment page, a process page, and activity pages.
The overview page connects the environment page, process page, and resource
pool. The environment page describes the arrival process implemented as a
negative exponential distribution. A simulation instance is generated according
to the arrival process and passed into the process page.
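The negative exponential arrival process on the environment page can be mimicked with the standard library; the mean of 20 time units below is an arbitrary illustrative choice, not a parameter from the paper.

```python
import random

random.seed(7)

def interarrival_times(n, mean):
    """Sample n inter-arrival times from a negative exponential distribution
    with the given mean (rate = 1/mean), as on the environment page."""
    return [random.expovariate(1 / mean) for _ in range(n)]

times = interarrival_times(1000, mean=20.0)
avg = sum(times) / len(times)  # close to the configured mean of 20
```

New simulation instances would then be released into the process page one inter-arrival time apart.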
In the process page, the relevant objects for the simulation instance are generated
by the transition “generate object”. In our example, each simulation instance
is associated with an order, a delivery, and an invoice. The transitions, including “PO”,
“CD”, “PI”, and “SI”, represent activities in the process.
The object requirement for the execution of an activity is indicated by
the incoming arcs from the places representing object types to the corresponding
transition. For instance, “create delivery” involves an order and a delivery (i.e.,
incoming arcs from the place for the order type (i.e., o1) and the place for the delivery
type (i.e., d1) to “create delivery”).
Control-flows are represented using the semantics of CPNs, i.e., a transition
is triggered by consuming tokens from its input places and producing tokens in
its output places. In our example, place_order is triggered first by consuming a
token from o1 and producing a token in o2. Next, “create delivery” is triggered by
consuming tokens from o2 and d1 and producing tokens in o3 and d2.
Each transition has a sub-page (i.e., an activity page) where resource assignments
and value assignments are modeled. First, in each execution of the transition,
a resource is selected from the resource pool based on its role. Next,
the relevant information for the execution of the activity (e.g., the customer
and document type in place_order) is passed by the tokens from the connected
places.
Activity durations are implemented using the timed properties of CPN Tools.
The duration is inscribed on the corresponding transition. For instance, the duration
of the place_order activity is sampled from a normal distribution.
We generate commands using the designed CPNs. Below is an example of the
commands in the XML-based CMD format. In (act, res, time, ival) ∈ U_cmd, Lines
4–6 correspond to act, res, and time, while Lines 7–12 specify ival.
Listing 1: An example of CMD format
1  <?xml version="1.0" encoding="UTF-8"?>
2  <commands>
3    <command>
4      <activity>place_order</activity>
5      <resource>Adams</resource>
6      <timestamp>2021-02-23 10:00:00</timestamp>
7      <orderId>500004312</orderId>
8      <customer>1032</customer>
9      <docType>TA</docType>
10     <salesOrg>1000</salesOrg>
11     <materialList>P-101,P-103</materialList>
12     <quantityList>6,5</quantityList>
13   </command>
14   ...
15 </commands>
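A CMD document of this shape can be read back into the (act, res, time, ival) tuples of Definition 5 with a few lines of standard-library code. This is a sketch, not the ProM implementation; the convention that every child element other than activity, resource, and timestamp belongs to ival mirrors the layout above.

```python
import xml.etree.ElementTree as ET

# A trimmed CMD document in the style of Listing 1 (illustrative values).
CMD_XML = """<commands>
  <command>
    <activity>place_order</activity>
    <resource>Adams</resource>
    <timestamp>2021-02-23 10:00:00</timestamp>
    <customer>1032</customer>
    <docType>TA</docType>
  </command>
</commands>"""

def parse_commands(xml_text):
    """Read a CMD document into (act, res, time, ival) tuples."""
    out = []
    for cmd in ET.fromstring(xml_text).iter("command"):
        fields = {child.tag: child.text for child in cmd}
        act = fields.pop("activity")
        res = fields.pop("resource")
        time = fields.pop("timestamp")
        out.append((act, res, time, fields))  # remaining fields form ival
    return out

cmds = parse_commands(CMD_XML)
```

The resulting tuples are exactly what the transformation engine of Subsect. 3.3 consumes.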
4.3 Transformation Engine: ProM Plug-in
The transformation engine is implemented as a plug-in of ProM, an open-source
framework for the implementation of process mining tools. Our new plug-in is
available in a new package named ERPSimulator in the nightly build of ProM.
The main inputs of the plug-in are a transaction mapping, a parameter mapping,
and commands, whereas the outputs are SAP executions.
The transaction mapping is stored in an XML-based AMAP format, storing
relations between activities and transactions. Below is an example of the transaction
mapping for the commands generated by the simulation engine described in
Figure 5. Four activities in the simulation engine are assigned to the corresponding
transactions defined in the SAP ERP system.
Listing 2: An example of AMAP format
1 <?xml version="1.0" encoding="UTF-8"?>
2 <transactionMapping>
3   <string key="place_order" value="BAPI_SALESORDER_CREATEFROMDAT2"/>
4   <string key="create_delivery" value="BAPI_OUTB_DELIVERY_CREATE_SLS"/>
5   <string key="pack_items" value="L_TO_CREATE_DN"/>
6   <string key="create_invoice" value="BAPI_BILLINGDOC_CREATEMULTIPLE"/>
7 </transactionMapping>
The parameter mapping is stored in an XML-based PMAP format, storing
relations between the parameters in commands and the system-defined parameters.
Below is an example of the parameter mapping for the commands produced
by the simulation engine described in Figure 5. In Lines 3–4, “docType” and
“customer” are mapped to DOC_TYPE and PARTN_NUMB in the SAP ERP
system.
Listing 3: An example of PMAP format
1 <?xml version="1.0" encoding="UTF-8"?>
2 <parameterMapping>
3   <string key="docType" value="DOC_TYPE"/>
4   <string key="customer" value="PARTN_NUMB"/>
5   <string key="orderId" value="SALESDOCUMENTIN"/>
6   <string key="salesOrg" value="SALES_ORG"/>
7   ...
8 </parameterMapping>
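Since both AMAP and PMAP documents are flat lists of key-value pairs, loading either of them reduces to one dictionary comprehension. A sketch, not the ProM plug-in code; the sample document is trimmed for illustration.

```python
import xml.etree.ElementTree as ET

# A trimmed PMAP document in the style of Listing 3.
PMAP_XML = """<parameterMapping>
  <string key="docType" value="DOC_TYPE"/>
  <string key="customer" value="PARTN_NUMB"/>
</parameterMapping>"""

def load_mapping(xml_text):
    """AMAP and PMAP documents both reduce to a key -> value dictionary,
    i.e., the mappings mu_tr and mu_pr of Definitions 6 and 7."""
    return {e.get("key"): e.get("value")
            for e in ET.fromstring(xml_text).iter("string")}

pmap = load_mapping(PMAP_XML)
```

The same loader works for the AMAP document of Listing 2, yielding the transaction mapping.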
Given the transaction mapping and parameter mapping, the transformation engine
translates commands in CMD format into SAP Remote Function Calls (RFCs)
that can be directly executed in the system. SAP RFC is an SAP interface
protocol managing the communication between the SAP ERP system and any
external system. For instance, the command in Listing 1 is translated into an RFC
invoking BAPI_SALESORDER_CREATEFROMDAT2, as defined in Listing 2, with
parameters such as DOC_TYPE and PARTN_NUMB, as described in Listing 3.
5 Proof of Concept
In this section, we validate the feasibility of the proposed framework in generating
simulated event data that contain no omission and commission problems
(i.e., are reliable) and have the same view as real-life event data (i.e., are
realistic), without having to manually model the complex design of ERP systems
in simulation models. To this end, we simulate common business challenges
in the order-to-cash (O2C) process using the implementation presented
in Sect. 4. The CPN files and commands, as well as the user manual, are publicly
available via https://github.com/gyunamister/ERPSimulator.
5.1 Experimental Design
The O2C process deals with customer orders. First, customers send inquiries and,
in return, the company sends corresponding quotations. Sales staff convert the
quotations into orders if the customers confirm them. Afterward, deliveries are
prepared by picking and packing the items of the orders. Next, invoices are
sent to customers and the corresponding payments are collected.
In the following, we simulate three common business challenges in the O2C process
using the implementation presented in Sect. 4: 1) a low conversion rate, 2)
frequent price changes, and 3) order cancellations. In each scenario, we evaluate
whether the simulated data have omission and commission problems by measuring the
number of executions with errors using the error-handling module of the SAP
system. Besides, we apply process mining techniques, such as process discovery
and root-cause analysis, to the simulated event data to show that they provide the
same view as real-life event data, containing insightful knowledge.
5.2 Scenario 1: Low Conversion Rate
A low conversion rate from quotations to orders is undesirable not only because
of the lost revenue but also because of the wasted resources. Using CPN Tools, we
design and implement a simulation model where quotations are mostly not converted
to orders due to late responses to the corresponding inquiries. 288 commands
are generated by the simulation model and transformed into 288 RFCs using the
ProM plug-in. All 288 RFCs are successfully executed in the system.
As a result, 286 objects of 8 different object types, including inquiry, quotation,
and order, are created, updating more than 11 tables in the database.
Fig. 6: (a) a discovered process model of the O2C process in BPMN notation, (b) a low
conversion from quotations to orders (53.4%), (c) a box plot showing the correlation
between the response time and (un)successful conversions
We analyze the behaviors in the business process using the Inductive Visual
Miner in ProM [13]. Figure 6-(a) describes the process model in BPMN notation.
As shown in Figure 6-(b), only 34 out of 73 quotations from the company
are converted to orders, i.e., a conversion rate of 46.6%. We define the
response time as the time difference between the completion of “create inquiry”
and “create quotation”. Figure 6-(c) shows the difference in the response time
between successful and unsuccessful conversions. In particular, the quotations
that are answered later than 10 days after the inquiry are all rejected, showing the
correlation between long response times and unsuccessful conversions.
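The underlying analysis boils down to grouping events by case and comparing timestamps. The following toy sketch illustrates this on two fabricated cases; the log entries are purely illustrative and are not the simulated data of the experiment.

```python
from datetime import datetime, timedelta

# Toy event log: (case, activity, timestamp) tuples, illustrative data only.
base = datetime(2021, 2, 1)
log = [
    ("q1", "create inquiry", base),
    ("q1", "create quotation", base + timedelta(days=2)),
    ("q1", "create order", base + timedelta(days=5)),     # converted
    ("q2", "create inquiry", base),
    ("q2", "create quotation", base + timedelta(days=12)),  # never converted
]

def response_times_and_conversion(log):
    """Response time per quotation (days) and the overall conversion rate."""
    cases = {}
    for case, act, ts in log:
        cases.setdefault(case, {})[act] = ts
    resp = {c: (e["create quotation"] - e["create inquiry"]).days
            for c, e in cases.items() if "create quotation" in e}
    converted = {c for c, e in cases.items() if "create order" in e}
    return resp, len(converted) / len(resp)

resp, rate = response_times_and_conversion(log)
```

Here the slow case q2 is the unconverted one, mirroring the correlation observed in Figure 6-(c).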
5.3 Scenario 2: Manual Price Changes
For various reasons (e.g., an outdated pricing policy in the system), manual
price changes are carried out. We design this scenario in CPN Tools and
produce 4,249 commands that are transformed into 4,249 RFCs by the ProM
plug-in. All 4,249 RFCs are successfully executed in the system without
errors, creating 4,093 objects and updating more than 15 tables in the database.
Fig. 7: (a) a discovered process model, (b) manual price changes required for 113
orders, (c) a pie chart describing the ratio of price changes to total orders per products
Figure 7-(a) depicts the process model discovered using the process discovery
technique. As shown in Figure 7-(b), manual price changes occurred for 113 out
of 402 orders. Figure 7-(c) describes the manual price changes per product
(e.g., P-100 and P-101). The outer pie indicates the total number of orders per
product, while the red part of the inner pie represents the proportion of price
changes for each product. For instance, P-109 is included in 138 orders, 79%
of which require manual changes.
5.4 Scenario 3: Frequent Cancellation of Orders
For the frequent cancellation of orders, we first generate 4,540 commands using
CPN Tools and transform them into 4,540 SAP RFCs using the ProM plug-in.
We successfully execute the 4,540 RFCs without errors and, accordingly, 4,384
objects of 8 different object types are created.
Figure 8 shows the process model discovered with the process discovery technique.
As shown in Figure 8-(b), 97 out of 562 orders are canceled in the process.
We conduct further analysis on these canceled orders by analyzing the reasons
for the cancellation. Figure 8-(c) shows a pie chart explaining the proportion
of the different reasons. The most frequent reason is that the delivery date is set too late.
Fig. 8: (a) a discovered process model, (b) order cancellations (97 out of 462 orders),
(c) a pie chart depicting the frequency of reasons for the order cancellation
The second most frequent reason is that the order exceeds the quantity limit of
one of its items, followed by the high price of the order.
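Ranking cancellation reasons by frequency, as in Figure 8-(c), amounts to a simple aggregation over the canceled orders. A sketch with hypothetical reason labels (the actual labels come from the simulated event data):

```python
from collections import Counter

def rank_cancellation_reasons(canceled_orders):
    """Return cancellation reasons ordered by decreasing frequency."""
    return Counter(o["reason"] for o in canceled_orders).most_common()

# Hypothetical toy data for illustration only.
canceled = [
    {"order": "o1", "reason": "delivery date too late"},
    {"order": "o2", "reason": "delivery date too late"},
    {"order": "o3", "reason": "quantity limit exceeded"},
]
print(rank_cancellation_reasons(canceled))
# [('delivery date too late', 2), ('quantity limit exceeded', 1)]
```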
6 Conclusion
In this paper, we proposed a framework for integrating ERP systems into
business process simulation to realize DTOs. The framework is composed of three
components: the ERP system, the simulation engine, and the transformation engine.
Commands are generated by the simulation engine, translated into system-
executable formats by the transformation engine, and executed in the system,
updating its object model. The framework is
implemented using the SAP ERP system as the underlying ERP system, CPN
Tools as the simulation engine, and a ProM plug-in as the transformation engine.
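The interplay of the three components can be summarized in a minimal control-flow sketch; all class, method, and command names below are illustrative assumptions, not the actual SAP, CPN Tools, or ProM interfaces:

```python
# Minimal sketch of the framework's control flow. All names are
# illustrative assumptions, not the implemented interfaces.

class SimulationEngine:
    """Stands in for CPN Tools: emits simulated commands."""
    def generate_commands(self):
        return [{"type": "create_order", "order_id": "o1"}]

class TransformationEngine:
    """Stands in for the ProM plug-in: maps commands to RFC call specs."""
    def to_rfc(self, command):
        return {"rfc": "CREATE_ORDER_RFC", "params": command}

class ERPSystemStub:
    """Stands in for the ERP system: executes RFCs, updating the object model."""
    def __init__(self):
        self.object_model = []
    def execute(self, rfc):
        self.object_model.append(rfc["params"])

sim, trans, erp = SimulationEngine(), TransformationEngine(), ERPSystemStub()
for cmd in sim.generate_commands():
    erp.execute(trans.to_rfc(cmd))
print(len(erp.object_model))  # 1
```

In the actual implementation, the stubbed execution step is replaced by RFC calls against a real SAP ERP instance, so the system's business logic and data restrictions are enforced rather than re-modeled.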
By integrating ERP systems, we can effectively reflect the complex design of
the system into simulation models. Moreover, the resulting simulated data are
free of omission and commission issues, ensuring the reliability of simulation results.
Also, since the simulated event data share the data structure of real-life event
data, they can be analyzed with existing analysis techniques. Furthermore, the
framework supports action-oriented process mining by providing a digital twin
in which different actions can be implemented and tested.
As future work, we plan to improve the implementation to support a feedback
loop between the simulated data and the simulation engine. In addition, we plan
to apply the implementation to action-oriented process mining techniques to
evaluate the efficiency of actions. In the proposed framework, we assume a
one-to-one relationship between activities (i.e., human behaviors) and
transactions. However, in real-life business processes, a single human behavior
may involve multiple transactions and vice versa. In this work, we resolve the
issue in the implementation by manually aligning the level of simulated human
behaviors with the level of transactions. Future work should provide a method
to resolve such many-to-many relationships.
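The manual alignment used in the implementation can be thought of as a mapping from simulated activities to the transactions they trigger. The activity names and transaction codes below are illustrative assumptions:

```python
# Hypothetical alignment table: one simulated human behavior may map to
# several system transactions (all names are illustrative assumptions).
ALIGNMENT = {
    "place order": ["VA01"],             # create sales order
    "change order": ["VA02"],            # change sales order
    "ship and bill": ["VL01N", "VF01"],  # one behavior, two transactions
}

def transactions_for(activity):
    """Resolve a simulated activity to the list of transactions to execute."""
    return ALIGNMENT.get(activity, [])

print(transactions_for("ship and bill"))  # ['VL01N', 'VF01']
```

Resolving the converse case, where one transaction spans several simulated behaviors, is the open part of the many-to-many problem.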
Acknowledgements We thank the Alexander von Humboldt (AvH) Stiftung
for supporting our research.