Chapter

A Method for Bottleneck Detection, Prediction, and Recommendation Using Process Mining Techniques

Abstract

Bottlenecks arise in many processes, often negatively impacting performance. Process mining can facilitate bottleneck analysis, but research has primarily focused on bottleneck detection and resolution, with limited attention given to the prediction of bottlenecks and to recommendations for improving process performance. As a result, operational support for bottleneck resolution is often only partially realized, or not at all. The aim of this paper is to propose a method for Bottleneck Detection, Prediction, and Recommendation (BDPR) using process mining techniques to achieve operational support. A design science research methodology is adopted to design, develop, and demonstrate the BDPR method. A systematic literature review and a developed classification model provide theoretical support for the BDPR method and offer scholars in the field of process mining a starting point for research. The BDPR method extends the utility of the classification model and aims to guide scholars and practitioners in assessing, selecting, evaluating, and implementing process mining techniques to realize operational support. A case study at a logistics service provider demonstrates the use of the proposed BDPR method.
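
The abstract does not disclose the BDPR method's internals, but the detection step it builds on can be illustrated. Below is a minimal, hypothetical sketch (not the authors' method) that computes mean waiting time per activity from an event log with pandas; the activity with the longest mean wait is the candidate bottleneck. All column names and data are invented.

```python
import pandas as pd

# Hypothetical event log: one row per activity instance, with start/complete times.
log = pd.DataFrame({
    "case_id":  [1, 1, 1, 2, 2, 2],
    "activity": ["receive", "pick", "ship"] * 2,
    "start":    pd.to_datetime(["2024-01-01 08:00", "2024-01-01 08:05", "2024-01-01 11:00",
                                "2024-01-01 09:00", "2024-01-01 09:30", "2024-01-01 15:00"]),
    "complete": pd.to_datetime(["2024-01-01 08:05", "2024-01-01 08:20", "2024-01-01 11:10",
                                "2024-01-01 09:05", "2024-01-01 09:50", "2024-01-01 15:20"]),
}).sort_values(["case_id", "start"])

# Waiting time = gap between the previous activity's completion and this start.
log["waiting"] = (log["start"] - log.groupby("case_id")["complete"].shift()).dt.total_seconds()

# The activity with the longest mean wait is the candidate bottleneck.
print(log.groupby("activity")["waiting"].mean().dropna().sort_values(ascending=False))
```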

Article
Full-text available
This is the public final report of the project Industry 4.0 driven Supply Chain Coordination for Small and Medium-sized Enterprises (ICCOS), a research project carried out within the Topsector Logistiek under the TKI Toeslag call. Over a period of three years, ICCOS conducted experimental and industrial research on logistics data spaces, the use of artificial intelligence for coordinating supply chains, the adoption of new Industry 4.0-related technologies, and the changing role of the logistics professional. ICCOS is a follow-up to the Dinalog project Autonomous Logistics Miners for Small- and Medium sized Businesses. ICCOS aims to improve the competitive position of the Dutch logistics sector by increasing the acceptance and use of Industry 4.0-related technologies in combination with advanced real-time data analysis. The University of Twente is the lead applicant and project leader of ICCOS and carried out industrial and experimental research in collaboration with ABC Flows, Bullit Digital, Deltago, Districon, Emons, King Nederland, LOGAPS and Veenman. Within ICCOS, various designs and prototypes for IDS-based logistics data spaces were realized. Bullit Digital built an industry platform based on the Open Trip Model (OTM) with reusable algorithms; the platform was validated at Emons for several use cases. Based on the Cross Chain Collaboration Center (4C) model, a control tower was designed and implemented at Emons. In addition, an orchestration concept for vaccination logistics was designed and implemented at Isala. At Veenman, a digital transformation trajectory was started to move from Business Intelligence towards the use of Artificial Intelligence. Through Districon, BigMile was applied at 7 companies by 250 students. Together with Evofenedex, sector surveys were conducted to gain insight into the most important factors influencing advanced data use and the adoption of new technology. The results of ICCOS have been published in 7 scientific articles and 2 Dinalog publications. In addition, 12 Bachelor's theses, 4 Master's theses and 1 PDEng thesis were written as part of the ICCOS project in close collaboration with consortium partners and Evofenedex. ICCOS shows that data-driven logistics and the use of new Industry 4.0 technologies offer various opportunities for innovation in the sector, but also that new developments such as IDS require persistence. Concrete practical examples, sharing new knowledge, and guidance during development and implementation are essential here, specifically for SMEs. In collaboration with the NL AI Coalition, ICCOS actively contributes to the development of a free online course on the application of artificial intelligence in logistics and the maritime sector. In the ReAL project, the industry platform is being further developed and supplemented with supporting educational resources to form a learning community. In addition, DASLOGIS is working on the demonstration and implementation of logistics data spaces.
Article
Full-text available
Background: Acute care for critical illness requires very strict treatment timeliness. However, healthcare providers often cannot accurately identify the causes of low efficiency in acute care processes due to a lack of effective tools. It is also difficult to compare processes from different patient groups or check their conformance. Methods: To solve these problems, we proposed a novel process mining framework with a time perspective, which integrates four steps: standard activity construction, data extraction and filtering, iterative model discovery, and performance analysis. Results: The framework can visualize the execution of actual clinical activities hierarchically, evaluate timeliness and identify bottlenecks in the treatment process. We took acute ischemic stroke as a case study and retrospectively reviewed data from 420 patients at a large hospital. We then discovered process models with timelines and identified the main reasons for in-hospital delay. Conclusions: Experimental results demonstrate that the proposed framework offers a new way of drawing insights about hospitals' clinical processes, helping clinical institutions increase work efficiency and improve medical services.
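
The framework itself is not published with code; as a minimal illustration of the timeliness evaluation it describes, the sketch below computes a door-to-needle time per stroke patient with pandas and flags cases exceeding the common 60-minute guideline. Column names and data are invented.

```python
import pandas as pd

# Invented event data: arrival and thrombolysis timestamps per patient.
events = pd.DataFrame({
    "patient":   [1, 1, 2, 2],
    "activity":  ["arrival", "thrombolysis", "arrival", "thrombolysis"],
    "timestamp": pd.to_datetime(["2024-03-01 10:00", "2024-03-01 10:50",
                                 "2024-03-01 11:00", "2024-03-01 12:30"]),
})

wide = events.pivot(index="patient", columns="activity", values="timestamp")
wide["door_to_needle_min"] = (wide["thrombolysis"] - wide["arrival"]).dt.total_seconds() / 60

# Flag in-hospital delays against the widely used 60-minute target.
print(wide[wide["door_to_needle_min"] > 60])
```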
Chapter
Full-text available
The increasing amounts of data have affected conceptual modeling as a research field. In this context, process mining involves a set of techniques aimed at extracting a process schema from an event log generated during process execution. While automatic algorithms for process mining and analysis are needed to filter out irrelevant data and to produce preliminary results, visual inspection, domain knowledge, human judgment and creativity are needed for proper interpretation of the results. Moreover, process discovery on an event log usually results in complicated process models not easily comprehensible by the business user. To this end, visual analytics has the potential to enhance process mining in the direction of explainability, interpretability and trustworthiness in order to better support human decisions. In this paper we propose an approach for identifying bottlenecks in business processes by analyzing event logs and visualizing the results. In this way, we exploit visual analytics in the process mining context to provide explainable and interpretable analytics results for business processes without exposing the user to complex process models that are not easily comprehensible. The proposed approach was applied to a manufacturing business process, and the results show that visual analytics in the context of process mining is capable of identifying bottlenecks and other performance-related issues and exposing them to the business user in an intuitive and non-intrusive way.
Article
Full-text available
The most critical challenge in analyzing the data of Massive Open Online Courses (MOOCs) using process mining techniques is storing event logs in appropriate formats. In this study, an innovative approach for the extraction of MOOC data is described. Thereafter, several process-discovery techniques, i.e., Dotted Chart Analysis, Fuzzy Miner, and Social Network Miner, are applied to the extracted MOOC data. In addition, behavioral studies of high- and low-performance students taking online courses are conducted. These studies considered i) overall behavioral statistics, ii) identification of bottlenecks and loopback behavior through frequency- and time-performance-based approaches, and iii) working-together relationships. The results indicated that there are significant behavioral differences between the two groups. We expect that the results of this study will help educators understand students' behavioral patterns and better organize online course content.
Article
Full-text available
A range of advanced methods have been formulated and utilized in efforts to improve business processes in many enterprises. One impactful enhancement technique is to employ process mining algorithms as modeling and analysis tools that reveal actual business performance by mining event log data for useful information. This paper focuses on the applications of process mining in the e-commerce industry. Event log data with timestamps were retrieved and analyzed from the web databases of an e-commerce company, and process mining algorithms such as the inductive miner and fuzzy miner were executed to automatically generate the actual e-commerce business processes, check their conformance with the standardized processes, and detect bottlenecks and issues in the e-commerce processes at an early stage. Several e-commerce process issues were considered, such as item procurement, product ordering and delivery item tracking. The process mining models and their statistical results indicate that process mining can provide an efficient and effective tool for modeling and analyzing e-commerce business processes, allowing for real-time process auditing and reengineering.
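
The paper does not name its tooling; a comparable analysis can be sketched with the open-source pm4py library (an assumption), discovering a model with the inductive miner and replaying the log to check conformance. The file name is hypothetical.

```python
import pm4py

# Hypothetical XES export of the e-commerce event log.
log = pm4py.read_xes("orders.xes")

# Discover a Petri net with the inductive miner.
net, im, fm = pm4py.discover_petri_net_inductive(log)

# Token-based replay fitness: low fitness hints at deviations from the model.
print(pm4py.fitness_token_based_replay(log, net, im, fm))
```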
Conference Paper
Full-text available
Traditional modeling approaches, based on predefined business logic, offer little support for today's complex environments. In this paper, we propose a conceptual agent-based simulation framework to help not only discover complex business processes but also analyze and learn from emergent behavior arising in cyber-physical systems. Techniques originating from agent-based modeling as well as from the process mining discipline are used to reinforce agent-based decision-making. Whereas agent technology is used to orchestrate the integration and relationship between the environment and business logic activities, process mining capabilities are mainly used to discover and analyze emergent behavior. Using a functional decomposition approach, we specified three agent types: a cyber-physical controller agent, a business rule management agent, and an emergent behavior detection agent. We use agent-based simulation of a logistics cold chain case study to demonstrate the feasibility of our approach.
Chapter
Full-text available
In response to the growth of demand for web services, there is a rapid increase in distributed systems. Accordingly, software architects design components in a modular fashion to allow for higher flexibility and scalability. In such an infrastructure, a variety of microservices are continuously evolving to respond to the needs of every application. These microservices asynchronously provide reusable modules for other services. To gain valuable insights into the actual software or dynamic user behaviors within distributed systems, the data mining and process mining disciplines provide many powerful data-driven analysis techniques. However, gaining reliable insights into the overall architecture of a heterogeneous distributed system has proved to be a challenging and tedious task. In this paper, on the one hand, we present a novel approach that enables domain experts to reverse engineer the architecture of a distributed system and monitor its status. On the other hand, it allows the analysis and extraction of new insights about dynamic usage patterns within a distributed environment. With the help of two case studies under real-life conditions, we have assessed our methodology and demonstrated the validity of our approach to discover new insights and bottlenecks in the system.
Article
Full-text available
Process mining, as a modeling and analysis tool, can be used to improve business performance by looking at the actual business processes. This paper presents applications of process mining in the automotive industry. Using event log data with timestamps, process mining algorithms such as the inductive miner and fuzzy miner were able to automatically generate car manufacturing process models, check the conformance between the actual processes and the predefined standard ones, and identify and resolve bottlenecks and issues in the car manufacturing processes. A number of car manufacturing issues were considered in this research, such as process delays, stagnant workflow, mismanagement, faulty production and labor insufficiency. The modeling and statistical results show promising leverage of process mining in the automotive industry that can lead to autonomous car manufacturing with abilities for real-time process auditing and reengineering.
Article
Full-text available
Healthcare organizations are under increasing pressure to improve productivity, gain competitive advantage and reduce costs. In many cases, although management has already gained some qualitative intuition about inefficiencies and possible bottlenecks related to the enactment of patients' careflows, it does not have the right tools to extract knowledge from available data and make decisions based on a quantitative analysis. To tackle this issue, starting from a real case study conducted in the San Carlo di Nancy hospital in Rome (Italy), this article presents the results of a process mining project in the healthcare domain. Process mining techniques are here used to infer meaningful knowledge about patient careflows from raw event logs consisting of clinical data stored by the hospital information systems. These event logs are analyzed using the ProM framework from three different perspectives: the control flow perspective, the organizational perspective and the performance perspective. The results on the proposed case study show that process mining provided useful insights for the governance of the hospital. In particular, we were able to provide answers to the management of the hospital concerning the value of the latest investments, and the temporal distribution of abandonments from the emergency room and of exams without reservation.
Article
Full-text available
We develop a framework for assessing the technology readiness level (TRL) using available data on business processes. By constructing a network of actors and linking process steps together, it is possible to estimate the complexity of the organizational structure, examine the bottlenecks and analyse whether the advantages of available technology are fully utilized. Using publicly available data on business event logs, we also test an automated process mining procedure and suggest a measurement to link our results to the TRL.
Article
Full-text available
The paper motivates, presents, demonstrates in use, and evaluates a methodology for conducting design science (DS) research in information systems (IS). DS is of importance in a discipline oriented to the creation of successful artifacts. Several researchers have pioneered DS research in IS, yet over the past 15 years, little DS research has been done within the discipline. The lack of a methodology to serve as a commonly accepted framework for DS research and of a template for its presentation may have contributed to its slow adoption. The design science research methodology (DSRM) presented here incorporates principles, practices, and procedures required to carry out such research and meets three objectives: it is consistent with prior literature, it provides a nominal process model for doing DS research, and it provides a mental model for presenting and evaluating DS research in IS. The DS process includes six steps: problem identification and motivation, definition of the objectives for a solution, design and development, demonstration, evaluation, and communication. We demonstrate and evaluate the methodology by presenting four case studies in terms of the DSRM, including cases that present the design of a database to support health assessment methods, a software reuse measure, an Internet video telephony application, and an IS planning method. The designed methodology effectively satisfies the three objectives and has the potential to aid the acceptance of DS research in the IS discipline.
Conference Paper
Full-text available
Process mining aims to transform event data recorded in information systems into knowledge of an organisation’s business processes. The results of process mining analysis can be used to improve process performance or compliance to rules and regulations. However, applying process mining in practice is not trivial. In this paper we introduce PM², a methodology to guide the execution of process mining projects. We successfully applied PM² during a case study within IBM, a multinational technology corporation, where we identified potential process improvements for one of their purchasing processes.
Chapter
Full-text available
Recently, process mining emerged as a new scientific discipline on the interface between process models and event data. On the one hand, conventional Business Process Management (BPM) and Workflow Management (WfM) approaches and tools are mostly model-driven with little consideration for event data. On the other hand, Data Mining (DM), Business Intelligence (BI), and Machine Learning (ML) focus on data without considering end-to-end process models. Process mining aims to bridge the gap between BPM and WfM on the one hand and DM, BI, and ML on the other hand. Here, the challenge is to turn torrents of event data (“Big Data”) into valuable insights related to process performance and compliance. Fortunately, process mining results can be used to identify and understand bottlenecks, inefficiencies, deviations, and risks. This tutorial paper introduces basic process mining techniques that can be used for process discovery and conformance checking. Moreover, some very general decomposition results are discussed. These allow for the decomposition and distribution of process discovery and conformance checking problems, thus enabling process mining in the large.
Article
Full-text available
Bottleneck detection in manufacturing is the key to improving production efficiency and stability in order to increase capacity. Yet common bottleneck detection methods in industry and academia lack either accuracy or practicability, or both, for dynamic systems. The new methodology presented here is based on direct observation of processes and inventories. Blocked processes and full inventories indicate a downstream bottleneck. Starved processes and empty inventories indicate an upstream bottleneck. Through subsequent observations of multiple process states and inventory levels within a system, it is possible to determine the direction of the bottleneck at a given time and hence to find the momentary bottleneck in the system. The shifting of bottlenecks can be observed directly. Work-sampling techniques can be used to obtain a long-term picture of the dynamically shifting bottleneck. The new methodology does not require any calculations, statistics, or time measurements; hence it is suited for practical use by shop floor supervisors and clerks. The direct observation of the bottleneck also gives additional information about the underlying causes of the bottlenecks, simplifying the improvement of the system capacity. Extensive field testing of the method received positive feedback not only from management but also from shop floor operators. The method is already in use at Robert Bosch GmbH, where it is known as the bottleneck walk.
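
The decision rule of the bottleneck walk is simple enough to state in code. Below is a sketch under the stated rule (blocked points downstream, starved points upstream): the stations between the blocked run and the starved run hold the momentary bottleneck. Function and state names are our own, not the paper's.

```python
def momentary_bottleneck(states):
    """states: list of 'blocked' | 'starved' | 'working', ordered
    upstream to downstream. Returns candidate bottleneck indices."""
    blocked = [i for i, s in enumerate(states) if s == "blocked"]
    starved = [i for i, s in enumerate(states) if s == "starved"]
    lo = max(blocked) + 1 if blocked else 0                 # after the blocked run
    hi = min(starved) - 1 if starved else len(states) - 1   # before the starved run
    return list(range(lo, hi + 1))

# Stations 0-1 blocked, station 3 starved => station 2 is the momentary bottleneck.
print(momentary_bottleneck(["blocked", "blocked", "working", "starved"]))  # [2]
```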
Article
Full-text available
Background: In 2004 the concept of evidence-based software engineering (EBSE) was introduced at the ICSE04 conference. Aims: This study assesses the impact of systematic literature reviews (SLRs), which are the recommended EBSE method for aggregating evidence. Method: We used the standard systematic literature review method, employing a manual search of 10 journals and 4 conference proceedings. Results: Of 20 relevant studies, eight addressed research trends rather than technique evaluation. Seven SLRs addressed cost estimation. The quality of the SLRs was fair, with only three scoring less than 2 out of 4. Conclusions: Currently, the topic areas covered by SLRs are limited. European researchers, particularly those at the Simula Laboratory, appear to be the leading exponents of systematic literature reviews. The series of cost estimation SLRs demonstrates the potential value of EBSE for synthesising evidence and making it available to practitioners.
Article
This case study proposes a novel methodology for hospital management to plan and implement short- to medium-term improvement initiatives by integrating data-driven decision-making with Multi-Criteria Decision-Making/Analysis (MCDM/A). Historical data on 165 patients operated on in an eye surgery department were first analysed (using Tableau software) to provide overall insights, supported by process mining (using Celonis software) to identify the process bottlenecks that require immediate attention. The bottlenecks led to the identification of issues and their potential solutions. These potential solutions were taken as alternatives and run through Visual PROMETHEE software, which incorporates PROMETHEE II, an MCDM/A method. By adopting a visual approach, the hospital management could arrive at a quick consensus regarding the actual situation and bottlenecks, potential solutions to the issues identified and their comparative ranking in an interactive environment. While insights from data analysis bring consensus on the issues requiring resolution, the solutions to these issues can be compared and ranked by utilising PROMETHEE II. Hence, this paper proposes a unique methodology that facilitates both short-term and medium-term decision-making by utilising visual means for understanding the current reality and developing/exploring potential solutions to identified issues.
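
The ranking step relies on PROMETHEE II net flows, which are easy to reproduce. Below is a hedged sketch using the "usual" preference function with invented scores and weights (the study's criteria are not listed in the abstract):

```python
import numpy as np

scores  = np.array([[7, 3], [5, 6], [9, 2]])  # alternatives x criteria (to maximize)
weights = np.array([0.6, 0.4])                # invented criterion weights

n = len(scores)
pi = np.zeros((n, n))                         # aggregated preference degrees
for a in range(n):
    for b in range(n):
        if a != b:
            pref = (scores[a] > scores[b]).astype(float)  # 'usual' preference fn
            pi[a, b] = weights @ pref

phi_plus  = pi.sum(axis=1) / (n - 1)          # positive outranking flow
phi_minus = pi.sum(axis=0) / (n - 1)          # negative outranking flow
print(np.argsort(-(phi_plus - phi_minus)))    # alternatives ranked best to worst
```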
Conference Paper
One way to do business process modelling is to use process mining. Process mining bridges the gap between traditional model-based process analysis, such as business process management simulation, and data-centric analysis techniques such as machine learning and data mining. Bottleneck conditions are often found in the process models generated by process mining applications such as ProM and Disco from event log data. An alternative way to find bottleneck conditions is a statistical approach that views the event log as an asset that can be explored without a normative process model. This paper proposes a statistical test for heteroscedasticity in event log data. The heteroscedasticity test results from the event log are then compared with the results of normative process modelling with the Inductive Miner algorithm in a process mining application. The comparison shows that event log data with detected heteroscedasticity problems correspond to a bottleneck condition in the process model. The approach can serve as an alternative for evaluating a process model based on its event log.
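
The abstract does not say which heteroscedasticity test is used; a common choice is the Breusch-Pagan test. The sketch below regresses case duration on arrival order with statsmodels and tests whether the residual variance is stable, which is one way to operationalize the idea; the data are synthetic.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan

rng = np.random.default_rng(0)
order = np.arange(200, dtype=float)                           # case arrival order
duration = 10 + 0.05 * order + rng.normal(0, 1 + order / 50)  # variance grows over time

X = sm.add_constant(order)
resid = sm.OLS(duration, X).fit().resid
lm_stat, lm_pvalue, _, _ = het_breuschpagan(resid, X)
print(f"Breusch-Pagan p-value: {lm_pvalue:.4f}")  # small p-value => heteroscedastic
```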
Article
To fully understand how a construction project actually proceeds, a novel framework for automated process discovery from building information modeling (BIM) event logs is developed. The significance of the work is to manage and optimize the complex construction process towards the ultimate goal of narrowing the gap between BIM and process mining. More specifically, meaningful information is retrieved from prepared event logs to build a participant-specific process model, and the established model, with executable semantics and fitness guarantees, then provides evidence for process improvement by identifying deviations, inefficiencies, and collaboration features. The proposed method has been validated in a case study, where the input is an as-planned event log from a real BIM construction project. The process model is created automatically by the inductive mining and fuzzy mining algorithms and is then analyzed in depth under the joint use of conformance checking, frequency and bottleneck analysis, and social network analysis (SNA). The discovered knowledge contributes to revealing potential problems and objectively evaluating the performance of workflows and participants. In the discussion part, as-built data from the internet of things (IoT) deployment in construction site monitoring is automatically compared with the as-planned event log in the BIM platform to detect actual delays. It turns out that the participant playing a central role in the network tends to be overburdened with heavier workloads, leading to more undesirable discrepancies and delays. As a result, extensive investigation based on process mining supports data-driven decision-making to strategically smooth the construction process and increase collaboration opportunities, which also helps reduce the risk of project failure ahead of time.
Article
The twofold purpose of this pilot study is, first, to integrate process mining with Lean Six Sigma (LSS) in a shared-resource environment for performing eye surgery operations requiring local anaesthesia and, second, to support collaboration and measure performance by utilising the concept of "Single Minute Exchange of Dies" (SMED) for quick changeover. A review of articles employing process mining and LSS identified the gaps addressed in planning an innovative 'Collaboration Table' for doctors. The "Plan, Do, Study and Act" (PDSA) cycle is applied in this pilot case. While different researchers have been working on the application of process mining to finding bottlenecks for improvement and on the application of LSS, there is a lack of methods and tools that integrate and apply these two approaches under the PDSA cycle. The paper provides a new method, the 'Collaboration Table', developed in MS Excel 2019 and integrated with the application of process mining and LSS under the overall PDSA improvement methodology.
Article
With a focus on smart construction project management, this paper presents a closed-loop digital twin framework integrating Building Information Modeling (BIM), the Internet of Things (IoT), and data mining (DM) techniques. To be specific, IoT connects the physical and cyber worlds to capture real-time data for modeling and analysis, and data mining methods incorporated in the virtual model aim to discover hidden knowledge in the collected data. The proposed digital twin has been verified in a practical BIM-based project. Based on large inspection data from IoT devices, a 4D visualization and a task-centered or worker-centered process model are built as the virtual model to simulate both task execution and worker cooperation. Then, the high-fidelity virtual model is investigated by process mining and time series analysis. Results show that possible bottlenecks in the current process can be foreseen using the fuzzy miner, while the number of finished tasks in the next phase can be predicted by an autoregressive integrated moving average model with exogenous inputs (ARIMAX). Consequently, tactical decision-making can not only prevent possible failures in advance, but also arrange work and staffing sensibly so that the process adapts to changeable conditions. In short, the significance of this paper is to build a data-driven digital twin framework integrating BIM, IoT, and data mining for advanced project management, which can facilitate data communication and exploration to better understand, predict, and optimize physical construction operations. In future work, more complex cases with multiple data streams will be used to test the developed framework, and more detailed interpretations with actual observations of construction activities will be given.
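
The prediction step uses an ARIMAX model. Below is a hedged sketch with statsmodels' SARIMAX and an exogenous staffing variable; the paper's model orders and variables are not given, and the data are synthetic.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(1)
workers  = pd.Series(20 + rng.integers(-3, 4, 30).cumsum().clip(5, 40))  # staffing level
finished = 2.0 * workers + rng.normal(0, 3, 30)    # tasks finished per phase

# AR(1) with staffing as exogenous input; the orders are illustrative only.
model = SARIMAX(finished, exog=workers, order=(1, 0, 0)).fit(disp=False)

# Forecast the next phase, assuming staffing stays at the last observed level.
print(model.forecast(steps=1, exog=[[workers.iloc[-1]]]))
```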
Chapter
In this paper, we propose a method that allows us to formulate and evaluate process mining indicators through questions related to process traceability, and to bring about a clear understanding of the process variables through data visualization techniques. The proposal identifies bottlenecks and policy violations that are otherwise hard to uncover because of the difficulty of carrying out measurements and analysis for the improvement of process quality assurance and process transformation. The proposal was validated in a health clinic in Lima (Peru) with data obtained from an information system that supports the surgery block process. Finally, the results contribute to the optimization of decision-making by the medical staff involved in the surgery block process.
Chapter
In healthcare organizations, delivering high-quality service to patients at an affordable cost is usually a challenge in its nature. This is where process mining comes into play, to learn system bottlenecks and their deficiencies, how to optimize processes and how to avoid cost overestimation. In this paper, the α-algorithm, a process mining technique, is used to understand, depict and enhance emergency rooms through workflow modeling and simulation. A closed-form theorem that defines a valid learning system is introduced with a closed-form proof to validate the process mining approach. The algorithm is applied to a real-life scenario where event log files are extracted to exploit the behavior of system processes and illustrate the system's workflow structure. Results show an enhancement of system behavior, especially for metrics related to deadlocks.
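
For readers who want to try α-algorithm discovery on their own emergency-room logs, the open-source pm4py library offers an implementation (an assumption; the paper's own tooling is not specified). A tiny synthetic log stands in for the real event data; rendering the net requires Graphviz.

```python
import pandas as pd
import pm4py

# Synthetic emergency-room log: two patients, three activities each.
df = pd.DataFrame({
    "case_id":  ["p1", "p1", "p1", "p2", "p2", "p2"],
    "activity": ["triage", "treatment", "discharge"] * 2,
    "timestamp": pd.to_datetime(["2024-05-01 08:00", "2024-05-01 08:30", "2024-05-01 09:00",
                                 "2024-05-01 08:10", "2024-05-01 09:15", "2024-05-01 10:00"]),
})
log = pm4py.format_dataframe(df, case_id="case_id",
                             activity_key="activity", timestamp_key="timestamp")

# Discover a Petri net with the alpha miner and render it (requires Graphviz).
net, im, fm = pm4py.discover_petri_net_alpha(log)
pm4py.view_petri_net(net, im, fm)
```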
Article
Building Information Modelling (BIM) is defined as the process of creating and managing a digital replica of building products in a collaborative design set-up. On this basis, BIM, as a digital collaboration platform in the AECO (Architecture, Engineering, Construction, and Operation) industry, can be upgraded to assist the monitoring, control and improvement of the business processes related to the planning, design, construction and operation of building facilities. The main problem in this regard is the wastage of data related to activities completed by different actors during the project, and subsequently the lack of analytics to discover latent patterns in the collaboration and execution of such processes. The present study aims to enable BIM to capture digital footprints of project actors and create event logs for the design authoring phase of building projects. This is done using files in IFC (Industry Foundation Classes) format archived during the design process. We have developed algorithms to create event logs from such archives, and analyzed the event logs using process mining (i.e. process discovery, conformance checking and bottleneck analysis) to identify measures derived from as-happened processes. BIM managers can implement such measures in monitoring, controlling and re-engineering work processes related to design authoring. Two case studies were completed to validate and verify the products and findings of the research. Our results show that process models discovered and fine-tuned at various resolutions and from different perspectives (including 'actor-centric' and 'phase-centric' views) can provide a realistic view of BIM project execution. This includes understanding the structure of collaboration and hand-over of work; evaluation of compliance with the BIM execution plan; and detection of bottlenecks and re-work. While the scope of the study has been limited to design authoring processes, this mindset can be extended to other BIM uses and other phases (such as construction and operation) of building projects. Given the growing efforts on upgrading BIM to capture and formalize lifecycle data on products, processes and actors, this study can strongly support BIM managers with the documentation and evaluation of the business processes and workflows in their project teams.
Chapter
Process management has been adopted in many organizations, and finding improvement opportunities is an important part of it. Process mining techniques can be used for analyzing processes and extracting improvement opportunities. Healthcare systems include some of the most complicated processes among industries. In this paper, process mining techniques are used to analyze pre-hospital processes in an emergency room. The process discovery phase is implemented based on 4 different states, introduced in this study to increase the accuracy of the process analysis. After discovering the process model, conformance checking and enhancement were carried out as subsequent steps. The data, extracted from the automation system of a pre-hospital emergency room, is used as the input event log for process mining. Statistical records, including one year of control sheets, were provided by the organization. The control sheets, and consequently the P-control chart, are used to supplement the conformance checking phase. The enhancement phase is based on two states and uses performance analysis considering the factors of output, cycle time/duration and costs, which helps the pre-hospital emergency room improve its processes.
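
The P-control chart used to supplement conformance checking is standard; below is a small sketch with invented daily counts, computing 3-sigma limits for the fraction of deviating cases.

```python
import numpy as np

nonconforming = np.array([4, 6, 3, 9, 5, 2, 7])     # deviating cases per day (invented)
inspected     = np.array([50, 55, 48, 60, 52, 49, 58])

p = nonconforming / inspected
p_bar = nonconforming.sum() / inspected.sum()       # overall fraction non-conforming
sigma = np.sqrt(p_bar * (1 - p_bar) / inspected)    # per-day standard error
ucl = p_bar + 3 * sigma
lcl = np.maximum(p_bar - 3 * sigma, 0)
print(np.where((p > ucl) | (p < lcl))[0])           # out-of-control days, if any
```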
Article
Given a process model and an event log, conformance checking aims to relate the two together, e.g. to detect discrepancies between them. For the synchronous product net of the process and a log trace, we can assign different costs to a synchronous move and to a move in the log or model. By computing a path through this (synchronous) product net whilst minimizing the total cost, we create a so-called optimal alignment, which is considered to be the primary target result for conformance checking. Traditional alignment-based approaches (1) have performance problems for larger logs and models, and (2) do not provide reliable diagnostics for non-conforming behaviour (e.g. bottleneck analysis is based on events that did not happen). This is the reason to explore an alternative approach that maximizes the use of observed events. We also introduce the notion of milestone activities, i.e. unskippable activities, and show how the different approaches relate to each other. We propose a data structure, computed from the process model, which can be used for (1) computing alignments of many log traces that maximize synchronous moves, and (2) analysing non-conforming behaviour. In our experiments we show the differences between various alignment cost functions. We also show how the performance of constructing alignments with our data structure compares to that of state-of-the-art techniques.
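
The paper's data structure is not public, but baseline alignments of the kind it improves on can be computed with pm4py (an assumption; the names and data below are invented). Discovering the model from the conforming case only forces the deviating trace to produce log/model moves:

```python
import pandas as pd
import pm4py

df = pd.DataFrame({
    "case_id":  ["c1"] * 3 + ["c2"] * 2,
    "activity": ["a", "b", "c", "a", "c"],        # case c2 skips activity "b"
    "timestamp": pd.date_range("2024-01-01", periods=5, freq="h"),
})
log = pm4py.format_dataframe(df, case_id="case_id",
                             activity_key="activity", timestamp_key="timestamp")
ref = pm4py.format_dataframe(df[df["case_id"] == "c1"].copy(), case_id="case_id",
                             activity_key="activity", timestamp_key="timestamp")

# Model from the conforming case only, so c2 must deviate when aligned.
net, im, fm = pm4py.discover_petri_net_inductive(ref)
for diag in pm4py.conformance_diagnostics_alignments(log, net, im, fm):
    print(diag["fitness"], diag["alignment"])     # per-trace moves and fitness
```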
Chapter
In recent years, process mining has been used throughout the world to contribute to the management and development of business processes. Various works have addressed the steps of process mining and how to analyze business processes. Process mining is of great importance for reducing costs in different sectors, making process improvements and, especially, shortening process times. In this study, the real estate processes offered by a bank are discussed. Real estate transactions involve processes that differ across many variables; these processes are long and complicated because they have many checkpoints and operations. Furthermore, end-to-end analysis becomes more difficult because they contain more than one subprocess and integration. Because the processes are not standardized, efficiency studies cannot be carried out in a disciplined way, and records of bank real estate processes obtained over time cannot be used efficiently in process development studies. The research shows how the process mining methodology can be applied to real estate transactions that interact with customers and systems. In addition, it aims to show the bottlenecks, long-running processes, workflow and resource statistics in the banking process and to put forward solution suggestions. Process visualizations were realized with the support of fuzzy model algorithms due to the high number of cases and actions. Process analysis and clustering studies were found to be easier with the help of fuzzy models, and the discovery of processes from the data, the biggest step of process mining, was handled by fuzzy models. The results showed that process mining is an important methodology for improving enterprise processes and increasing their efficiency.
Article
Process mining is a growing and promising study area focused on understanding processes as they are actually executed, helping capture more significant findings than methods that only observe an idealized process model. The objective of this article is to map the active research topics of process mining and their main publishers by country, periodical, and conference. We also extract the reported application studies and classify them by exploration domain or industry segment taking advantage of this technique. The applied research method was systematic mapping, which began with 3713 articles; after applying the exclusion criteria, 1278 articles were selected for review. In this article, an overview of process mining is presented, the main research topics are identified, the most applied process mining algorithms are then identified, and finally application domains among different business segments are reported. It is possible to observe that the most active research topics are associated with process discovery algorithms, followed by conformance checking and by architecture and tool improvements. Among application domains, the segments with the most case studies are healthcare, followed by information and communication technology, manufacturing, education, finance, and logistics.
Article
Purpose: This paper aims to investigate process performance in Emergency Departments (EDs) with a novel data-driven approach, permitting discovery of the entire patient flow, deployment of performance measures, in terms of time and resources, on activities and flows, and identification of process deviations and critical bottlenecks. Moreover, the use of this methodology in real time might dynamically provide a picture of the current situation inside the ED in terms of waiting times, crowding, resources, etc., supporting the management of patient demand and resources in real time. Design/methodology/approach: The proposed methodology exploits process-mining techniques. Starting from the event data inside the hospital information systems, it automatically extracts patient flows, evaluates process performance, detects process exceptions and identifies deviations between the expected and the actual results. Findings: The application of the proposed method to a real ED proved valuable for discovering the actual patient flow, measuring the performance of each activity with respect to predefined targets and comparing different operating situations. Practical implications: Starting from the results provided by this system, hospital managers may explore the root causes of deviations, identify areas for improvement and hypothesize improvement actions. Finally, process-mining outputs may provide useful information for creating simulation models to test and compare alternative ED operational scenarios. Originality/value: This study responds to the need for novel approaches to monitoring and evaluating process performance in EDs. The novelty of this data-driven approach is the opportunity to connect performance, patient flows and activities in a timely manner.
Article
Resource management is crucial in operational process management because it directly affects the performance of the operational process. However, resource models are generally very complicated and variable in real-world process management. Therefore, process mining techniques based on real log data can be utilized to scrutinize resource utilization and detect bottlenecks. In this paper, a method for detecting bottlenecks is proposed from the viewpoint of the resource pools involved in process execution. In addition, the method is combined with critical path analysis in order to detect the bottleneck resource pools in the decomposed process models of the whole process.
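
As a minimal illustration of the resource-pool view (invented column names and data, not the paper's method), utilization per pool can be read off an event log; the busiest pool is the candidate bottleneck to cross-check against the critical path:

```python
import pandas as pd

log = pd.DataFrame({
    "resource_pool": ["clerks", "clerks", "approvers", "approvers"],
    "start":    pd.to_datetime(["2024-01-01 09:00", "2024-01-01 10:00",
                                "2024-01-01 09:00", "2024-01-01 13:00"]),
    "complete": pd.to_datetime(["2024-01-01 09:45", "2024-01-01 11:30",
                                "2024-01-01 12:30", "2024-01-01 16:45"]),
})

busy = (log["complete"] - log["start"]).dt.total_seconds()
span = (log["complete"].max() - log["start"].min()).total_seconds()
utilization = busy.groupby(log["resource_pool"]).sum() / span

print(utilization.sort_values(ascending=False))  # highest utilization = likely bottleneck
```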
Book
This is the second edition of Wil van der Aalst’s seminal book on process mining, which now discusses the field also in the broader context of data science and big data approaches. It includes several additions and updates, e.g. on inductive mining techniques, the notion of alignments, a considerably expanded section on software tools and a completely new chapter of process mining in the large. It is self-contained, while at the same time covering the entire process-mining spectrum from process discovery to predictive analytics. After a general introduction to data science and process mining in Part I, Part II provides the basics of business process modeling and data mining necessary to understand the remainder of the book. Next, Part III focuses on process discovery as the most important process mining task, while Part IV moves beyond discovering the control flow of processes, highlighting conformance checking, and organizational and time perspectives. Part V offers a guide to successfully applying process mining in practice, including an introduction to the widely used open-source tool ProM and several commercial products. Lastly, Part VI takes a step back, reflecting on the material presented and the key open challenges. Overall, this book provides a comprehensive overview of the state of the art in process mining. It is intended for business process analysts, business consultants, process managers, graduate students, and BPM researchers.
Jacobi, C., Meier, M., Herborn, L., Furmans, K.: Maturity model for applying process mining in supply chains: Literature overview and practical implications. Logistics Journal: Proceedings 2020(12) (2020). https://doi.org/10.2195/lj_Proc_jacobi_en_202012_01