Artem Polyvyanyy · University of Melbourne | MSD · Department of Computing and Information Systems
Artem Polyvyanyy
Dr. rer. nat. (PhD)
About
158 Publications · 51,580 Reads
3,710 Citations
Introduction
Artem Polyvyanyy is an Associate Professor at the School of Computing and Information Systems, the University of Melbourne (Australia), where he leads the Process Science and Technology research group. He is a Vice-Chair of the Steering Committee of the IEEE Task Force on Process Mining and the editor of, and a contributor to, the book “Process Querying Methods.” He has actively contributed to open-source initiatives with significant impact on research and practice, including Oryx and jBPT.
Additional affiliations
January 2023 - present
March 2018 - December 2022
January 2015 - March 2018
Publications (158)
A business process is often modeled using some kind of a directed flow graph, which we call a workflow graph. The Refined Process Structure Tree (RPST) is a technique for workflow graph parsing, i.e., for discovering the structure of a workflow graph, which has various applications. In this paper, we provide two improvements to the RPST. First, we...
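The building blocks of the RPST are single-entry-single-exit (SESE) fragments of the workflow graph (the actual computation relies on a triconnected component decomposition). As a rough illustration only, not the publication's algorithm, the sketch below checks whether a candidate edge set is SESE under a simplified reading of the definitions that ignores the special treatment of the graph's source and sink; the graph and fragments are hypothetical.

```python
def sese_check(edges, fragment):
    """Toy check: is `fragment` (a subset of `edges`) single-entry-single-exit?

    A boundary node touches edges both inside and outside the fragment. It
    counts as an entry if none of its incoming edges lie inside the fragment
    or all of its outgoing edges do, and as an exit symmetrically. The special
    role of the workflow graph's source and sink is ignored in this sketch.
    """
    fragment = set(fragment)

    def incoming(node, edge_set):
        return {(a, b) for (a, b) in edge_set if b == node}

    def outgoing(node, edge_set):
        return {(a, b) for (a, b) in edge_set if a == node}

    entries, exits = set(), set()
    for node in {n for edge in fragment for n in edge}:
        outside = (incoming(node, edges) | outgoing(node, edges)) - fragment
        if not outside:
            continue  # node touches only fragment edges: treated as interior
        if not incoming(node, fragment) or outgoing(node, edges) <= fragment:
            entries.add(node)
        if not outgoing(node, fragment) or incoming(node, edges) <= fragment:
            exits.add(node)
    return len(entries) == 1 and len(exits) == 1


# Hypothetical workflow graph: s -> a -> b (directly or via c) -> t.
edges = {("s", "a"), ("a", "b"), ("a", "c"), ("c", "b"), ("b", "t")}
print(sese_check(edges, {("a", "b"), ("a", "c"), ("c", "b")}))  # True: entry a, exit b
print(sese_check(edges, {("a", "b"), ("b", "t")}))              # False: b is also entered from c
```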
The volume of process-related data is growing rapidly: more and more business operations are being supported and monitored by information systems. Industry 4.0 and the corresponding industrial Internet of Things are about to generate new waves of process-related data, next to the abundance of event data already present in enterprise systems. Howeve...
The behavioural comparison of systems is an important concern of software engineering research. For example, the areas of specification discovery and specification mining are concerned with measuring the consistency between a collection of execution traces and a program specification. This problem is also tackled in process mining with the help of...
Given an event log as a collection of recorded real-world process traces, process mining aims to automatically construct a process model that is both simple and provides a useful explanation of the traces. Conformance checking techniques are then employed to characterize and quantify commonalities and discrepancies between the log's traces and the...
Process mining studies ways to derive value from process executions recorded in event logs of IT-systems, with process discovery the task of inferring a process model for an event log emitted by some unknown system. One quality criterion for discovered process models is generalization. Generalization seeks to quantify how well the discovered model...
Process mining aids organisations in improving their operational processes by providing visualisations and algorithms that turn event data into insights. How often behaviour occurs in a process—the stochastic perspective—is important for simulation, recommendation, enhancement and other types of analysis. Although the stochastic perspective is impo...
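To illustrate the stochastic perspective mentioned above (a toy sketch only, not the publication's techniques; the list-of-tuples log format is an assumption), the empirical stochastic language of an event log maps each distinct trace to its relative frequency:

```python
from collections import Counter

def stochastic_language(log):
    """Map each distinct trace to its relative frequency in the log.

    `log` is assumed to be a list of traces, each trace a tuple of activity
    names -- a simplification of real event-log formats.
    """
    counts = Counter(log)
    total = sum(counts.values())
    return {trace: n / total for trace, n in counts.items()}


# Toy log: four cases of a small order-handling process (hypothetical data).
log = [
    ("register", "check", "ship"),
    ("register", "check", "ship"),
    ("register", "check", "reject"),
    ("register", "ship"),
]
for trace, p in stochastic_language(log).items():
    print(" -> ".join(trace), f"{p:.2f}")
```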
The continued success of Large Language Models (LLMs) and other generative artificial intelligence approaches highlights the advantages that large information corpora can have over rigidly defined symbolic models, but also serves as a proof-point of the challenges that purely statistics-based approaches have in terms of safety and trustworthiness....
This paper presents the concept of a service colony and its characteristics. A service colony is a novel architectural style for developing a software system as a group of autonomous software services co-operating to fulfill the objectives of the system. Each inhabitant service in the colony implements a specific system functionality, collaborates...
A model of an information system describes its processes and how resources are involved in these processes to manipulate data objects. This paper presents an extension to the Petri nets formalism suitable for describing information systems in which states refer to object instances of predefined types and resources are identified as instances of spe...
A transhumeral prosthesis restores missing anatomical segments below the shoulder, including the hand. Active prostheses utilize real-valued, continuous sensor data to recognize patient target poses, or goals, and proactively move the artificial limb. Previous studies have examined how well the data collected in stationary poses, without considerin...
The continued success of Large Language Models (LLMs) and other generative artificial intelligence approaches highlights the advantages that large information corpora can have over rigidly defined symbolic models, but also serves as a proof-point of the challenges that purely statistics-based approaches have in terms of safety and trustworthiness....
Process discovery studies ways to use event data generated by business processes and recorded by IT systems to construct models that describe the processes. Existing discovery algorithms are predominantly concerned with constructing process models that represent the control flow of the processes. Agent system mining argues that business processes o...
A process discovery algorithm aims to construct a model from data generated by historical system executions such that the model describes the system well. Consequently, one desired property of a process discovery algorithm is rediscoverability, which ensures that the algorithm can construct a model that is behaviorally equivalent to the original sy...
A process discovery algorithm aims to construct a model from data generated by historical system executions such that the model describes the system well. Consequently, one desired property of a process discovery algorithm is rediscoverability, which ensures that the algorithm can construct a model that is behaviorally equivalent to the original sy...
Increasing the success rate of a process, i.e. the percentage of cases that end in a positive outcome, is a recurrent process improvement goal. At runtime, there are often certain actions (a.k.a. treatments) that workers may execute to lift the probability that a case ends in a positive outcome. For example, in a loan origination process, a possibl...
ProLift is a Web-based tool that uses causal machine learning, specifically uplift trees, to discover rules for optimizing business processes based on execution data (event logs). ProLift allows users to upload an event log, to specify case treatments and case outcomes, and to visualize treatment rules that increase the probability of positive case...
Goal Recognition (GR) is a research problem that studies ways to infer the goal of an intelligent agent based on its observed behavior and knowledge of the environment. A common assumption of GR is that the underlying environment is stationary. However, in many real-world scenarios, it is necessary to recognize agents' goals over extended periods....
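As a rough illustration of the goal recognition setting, not the method proposed in the publication, the sketch below maintains a Bayesian posterior over candidate goals and updates it as observations arrive; the goals, observations, and likelihoods are hypothetical and would in practice come from a domain model or be learned from data.

```python
def update_goal_posterior(prior, likelihoods, observation):
    """One Bayesian update: P(goal | obs) is proportional to P(obs | goal) * P(goal).

    `prior` maps goals to probabilities; `likelihoods[goal]` maps an
    observation to P(obs | goal).
    """
    unnormalised = {
        g: prior[g] * likelihoods[g].get(observation, 1e-9) for g in prior
    }
    z = sum(unnormalised.values())
    return {g: p / z for g, p in unnormalised.items()}


# Hypothetical example: two goals, observations are visited locations.
prior = {"goal_A": 0.5, "goal_B": 0.5}
likelihoods = {
    "goal_A": {"kitchen": 0.7, "garage": 0.1},
    "goal_B": {"kitchen": 0.2, "garage": 0.6},
}
for obs in ["kitchen", "kitchen", "garage"]:
    prior = update_goal_posterior(prior, likelihoods, obs)
print(prior)  # posterior over goals after three observations
```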
A model of an information system describes its processes and how resources are involved in these processes to manipulate data objects. This paper presents an extension to the Petri nets formalism suitable for describing information systems in which states refer to object instances of predefined types and resources are identified as instances of spe...
The problem of process discovery in process mining studies ways to construct process models that encode business processes that induced event data recorded by IT systems. Most existing discovery algorithms are concerned with constructing models that represent the control flow of the processes. Agent system mining argues that business processes ofte...
A process discovery algorithm aims to construct a process model that represents the real-world process stored in event data well; it is precise, generalizes the data correctly, and is simple. At the same time, it is reasonable to expect that better quality input event data should lead to constructed process models of better quality. However, existi...
User interaction logs allow us to analyze the execution of tasks in a business process at a finer level of granularity than event logs extracted from enterprise systems. The fine-grained nature of user interaction logs opens up a number of use cases. For example, by analyzing such logs, we can identify best practices for executing a given task in a...
A model of an information system describes its processes and how these processes manipulate data objects. Object-aware extensions of Petri nets focus on modeling the life-cycle of objects and their interactions. In this paper, we focus on Petri nets with identifiers, where identifiers are used to refer to objects. These objects should “behave” well...
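A minimal sketch of the core idea of identifier-carrying tokens follows; it is heavily simplified relative to the formalism in the publication (no typed places, variable inscriptions, or resource instances), and the place names are hypothetical.

```python
from itertools import count

class IdentifierNet:
    """Toy 'Petri net with identifiers': tokens are object identifiers."""

    def __init__(self):
        self.marking = {}      # place -> list of identifiers currently there
        self.fresh = count(1)  # generator of fresh object identifiers

    def add_tokens(self, place, ids):
        self.marking.setdefault(place, []).extend(ids)

    def fire(self, consume=None, produce=None, create=()):
        """Fire a transition: move identifiers between places.

        `consume` / `produce` map places to lists of identifiers;
        `create` lists output places that receive a fresh identifier.
        """
        for place, ids in (consume or {}).items():
            for i in ids:
                self.marking[place].remove(i)  # raises if not enabled
        for place, ids in (produce or {}).items():
            self.add_tokens(place, ids)
        for place in create:
            self.add_tokens(place, [f"o{next(self.fresh)}"])


net = IdentifierNet()
net.fire(create=["orders"])                                       # create order o1
net.fire(consume={"orders": ["o1"]}, produce={"shipped": ["o1"]}) # ship it
print(net.marking)  # {'orders': [], 'shipped': ['o1']}
```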
Process mining extracts value from the traces recorded in the event logs of IT-systems, with process discovery the task of inferring a process model for a log emitted by some unknown system. Generalization is one of the quality criteria applied to process models to quantify how well the model describes future executions of the system. Generalizatio...
Process mining extracts value from the traces recorded in the event logs of IT-systems, with process discovery the task of inferring a process model for a log emitted by some unknown system. Generalization is one of the quality criteria applied to process models to quantify how well the model describes future executions of the system. Generalizatio...
State-of-the-art process discovery methods construct free-choice process models from event logs. Consequently, the constructed models do not take into account indirect dependencies between events. Whenever the input behaviour is not free-choice, these methods fail to provide a precise model. In this paper, we propose a novel approach for enhancing...
This chapter gives a brief introduction to the research area of process querying. Concretely, it articulates the motivation and aim of process querying, gives a definition of process querying, presents the core artifacts studied in process querying, and discusses a framework for guiding the design, implementation, and evaluation of methods for proc...
A process is a collection of actions that were already, are currently being, or must be taken in order to achieve a goal, where an action is an atomic unit of work, for instance, a business activity or an instruction of a computer program. A process repository is an organized collection of models that describe processes, for example, a business pro...
Process querying studies concepts and methods from fields like Big data, process modeling and analysis, business process intelligence, and process analytics and applies them to retrieve and manipulate real-world and designed processes. This chapter reviews state-of-the-art methods for process querying, summarizes techniques used to implement proces...
Through the application of process mining, organisations can improve their business processes by leveraging data recorded as a result of the performance of these processes. Over the past two decades, the field of process mining evolved considerably, offering a rich collection of analysis techniques with different objectives and characteristics. Des...
There are many fields of computing in which having access to large volumes of data allows very precise models to be developed. For example, machine learning employs a range of algorithms that deliver important insights based on analysis of data resources. Similarly, process mining develops algorithms that use event data induced by real-world proces...
Reducing cycle time is a recurrent concern in the field of business process management. Depending on the process, various interventions may be triggered to reduce the cycle time of a case, for example, using a faster shipping service in an order-to-delivery process or calling a customer to obtain missing information rather than waiting passively. H...
Process analytics is an umbrella of data-driven techniques which includes making predictions for individual process instances or overall process models. At the instance level, various novel techniques have been recently devised, tackling next activity, remaining time, and outcome prediction. At the model level, there is a notable void. It is the am...
Process models automatically discovered from event logs represent business process behavior in a compact graphical way. To compare process variants, e.g., to explore how the system’s behavior changes over time or between customer segments, analysts tend to visually compare conceptual process models discovered from different “slices” of the event lo...
Robotic Process Automation (RPA) is a technology to automate routine work such as copying data across applications or filling in document templates using data from multiple applications. RPA tools allow organizations to automate a wide range of routines. However, identifying and scoping routines that can be automated using RPA tools is time consumi...
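As a much-simplified illustration of the routine identification problem, not the approach of the publication, recurring n-grams of user actions in an interaction log can serve as first candidates for automatable routines; the log format and action labels below are hypothetical.

```python
from collections import Counter

def routine_candidates(ui_log, n=3, min_support=2):
    """Count length-n action sequences that recur across the UI log.

    `ui_log` is assumed to be a list of sessions, each a list of
    interaction labels (e.g., "copy cell", "paste field").
    """
    counts = Counter()
    for session in ui_log:
        for i in range(len(session) - n + 1):
            counts[tuple(session[i:i + n])] += 1
    return [(seq, c) for seq, c in counts.most_common() if c >= min_support]


# Hypothetical interaction log with a recurring copy-paste routine.
ui_log = [
    ["open sheet", "copy cell", "switch app", "paste field", "submit"],
    ["open sheet", "copy cell", "switch app", "paste field", "close"],
]
print(routine_candidates(ui_log))
```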
The behavioural comparison of systems is an important concern of software engineering research. For example, the areas of specification discovery and specification mining are concerned with measuring the consistency between a collection of execution traces and a program specification. This problem is also tackled in process mining with the help of...
This paper presents the idea of a compendium of process technologies, i.e., a concise but comprehensive collection of techniques for process model analysis that support research on the design, execution, and evaluation of processes. The idea originated from observations on the evolution of process-related research disciplines. Based on these observ...
In the Agent-Based Modeling (ABM) paradigm, an organization is a Multi-Agent System (MAS) composed of autonomous agents inducing business processes. Process Mining automates the creation, update, and analysis of explicit business process models based on event data. Process Mining techniques make simplifying assumptions about the processes discovere...
State-of-the-art process discovery methods construct free-choice process models from event logs. Consequently, the constructed models do not take into account indirect dependencies between events. Whenever the input behaviour is not free-choice, these methods fail to provide a precise model. In this paper, we propose a novel approach for enhancing...
Robotic Process Automation (RPA) is a technology to automate routine work such as copying data across applications or filling in document templates using data from multiple applications. RPA tools allow organizations to automate a wide range of routines. However, identifying and scoping routines that can be automated using RPA tools is time consumi...
The aim of a process discovery algorithm is to construct from event data a process model that describes the underlying, real-world process well. Intuitively, the better the quality of the input event data, the better the quality of the resulting discovered model should be. However, existing process discovery algorithms do not guarantee this relatio...
This paper addresses the challenge of decoupling “back-office” enterprise system functions in order to integrate them with the Industrial Internet-of-Things (IIoT). IIoT is a widely anticipated strategy, combining IoT technologies managing physical object movements, interactions and contexts, with business contexts. However, enterprise systems, sup...
Robotic process automation (RPA) is an emerging technology that allows organizations to automate repetitive clerical tasks by executing scripts that encode sequences of fine-grained interactions with Web and desktop applications. Examples of clerical tasks include opening a file, selecting a field in a Web form or a cell in a spreadsheet, and copy-p...
Process analytics is the field focusing on predictions for individual process instances or overall process models. At the instance level, various novel techniques have been recently devised, tackling next activity, remaining time, and outcome prediction. At the model level, there is a notable void. It is the ambition of this paper to fill this gap....
Initially, process mining focused on discovering process models from event data, but in recent years the use and importance of conformance checking has increased. Conformance checking aims to uncover differences between a process model and an event log. Many conformance checking techniques and measures have been proposed. Typically, these take into...
Conformance checking is an area of process mining that studies methods for measuring and characterizing commonalities and discrepancies between processes recorded in event logs of IT-systems and designed processes, either captured in explicit process models or implicitly induced by information systems. Applications of conformance checking range fro...
Event sequence data is increasingly available in various application domains, such as business process management, software engineering, or medical pathways. Processes in these domains are typically represented as process diagrams or flow charts. So far, various techniques have been developed for automatically generating such diagrams from event se...
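A common, elementary ingredient of such diagram generation is the directly-follows graph; the sketch below is a toy, not the specific technique of the publication, and counts how often one event label immediately follows another across traces.

```python
from collections import Counter

def directly_follows(log):
    """Count directly-follows pairs (a, b): b immediately succeeds a in a trace.

    `log` is assumed to be a list of traces, each a sequence of event labels.
    """
    dfg = Counter()
    for trace in log:
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dfg


log = [["a", "b", "c"], ["a", "c"], ["a", "b", "b", "c"]]
for (a, b), n in sorted(directly_follows(log).items()):
    print(f"{a} -> {b}: {n}")
```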
This book constitutes the thoroughly refereed proceedings of the international workshops associated with the 33rd International Conference on Advanced Information Systems Engineering, CAiSE 2021, which was held during June 28-July 2, 2021. The conference was planned to take place in Melbourne, Australia, but changed to an online format due to the C...
This volume constitutes the proceedings of the 19th International Conference on Business Process Management, BPM 2021, held in Rome, Italy, in September 2021.
The 23 full papers, one keynote paper, and 4 tutorial papers presented in this volume were carefully reviewed and selected from 92 submissions. The papers are organized in topical sections na...
This book constitutes the proceedings of the BPM Forum of the 19th International Conference on Business Process Management, BPM 2021, which will take place in Rome, Italy, in September 2021.
The BPM Forum offers innovative research papers characterized by their high potential of stimulating interesting discussion and scientific debate, although wi...
The aim of a process discovery algorithm is to construct from event data a process model that describes the underlying, real-world process well. Intuitively, the better the quality of the event data, the better the quality of the model that is discovered. However, existing process discovery algorithms do not guarantee this relationship. We demonstr...
Modern software systems are often built using service-oriented principles. Atomic components, be that web-or microservices, allow constructing flexible and loosely coupled systems. In such systems, services are building blocks orchestrated by business processes the system supports. Due to the complexity and heterogeneity of industrial software syst...
Event sequence data is increasingly available in various application domains, such as business process management, software engineering, or medical pathways. Processes in these domains are typically represented as process diagrams or flow charts. So far, various techniques have been developed for automatically generating such diagrams from event se...
Robotic Process Automation (RPA) is a technology to develop software bots that automate repetitive sequences of interactions between users and software applications (a.k.a. routines). To take full advantage of this technology, organizations need to identify and to scope their routines. This is a challenging endeavor in large organizations, as routi...
This paper proposes an approach to analyze an event log of a business process in order to generate case-level recommendations of treatments that maximize the probability of a given outcome. Users classify the attributes in the event log into controllable and non-controllable, where the former correspond to attributes that can be altered during an e...
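The core intuition, treatment uplift on the outcome probability, can be sketched as follows; this is a naive estimate that ignores confounding, unlike the publication's approach, and the case attributes are hypothetical.

```python
def estimated_uplift(cases, treatment):
    """Difference in positive-outcome rate between treated and untreated cases.

    `cases` is assumed to be a list of dicts with keys 'treatments' (the set
    of treatments applied to the case) and 'positive' (a boolean outcome).
    A real approach would additionally control for confounding, e.g., via
    uplift modelling.
    """
    def positive_rate(group):
        return sum(c["positive"] for c in group) / len(group) if group else 0.0

    treated = [c for c in cases if treatment in c["treatments"]]
    untreated = [c for c in cases if treatment not in c["treatments"]]
    return positive_rate(treated) - positive_rate(untreated)


# Hypothetical loan-origination cases: does a second offer lift acceptance?
cases = [
    {"treatments": {"second_offer"}, "positive": True},
    {"treatments": {"second_offer"}, "positive": True},
    {"treatments": set(), "positive": False},
    {"treatments": set(), "positive": True},
]
print(estimated_uplift(cases, "second_offer"))  # 1.0 - 0.5 = 0.5
```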
In the past couple of centuries, humankind has achieved a significant improvement in the quality of life of the world's population, in large part due to important advancements in the automation of wealth-generating activities. Business Process Management (BPM) studies concepts, methods, techniques, and tools that support and improve the way business pr...
This paper proposes an approach to analyze an event log of a business process in order to generate case-level recommendations of treatments that maximize the probability of a given outcome. Users classify the attributes in the event log into controllable and non-controllable, where the former correspond to attributes that can be altered during an e...
A plethora of algorithms for automatically discovering process models from event logs has emerged. The discovered models are used for analysis and come with a graphical flowchart-like representation that supports their comprehension by analysts. According to Occam's Razor, a model should encode the process behavior with as few constru...
Data and processes go hand-in-hand in information systems but are often modeled, validated, and verified separately in the systems' design phases. Designers of information systems often proceed by ensuring that database tables satisfy normal forms, and process models capturing the dynamics of the intended information manipulations are deadlock and...
This paper presents a command-line tool, called Entropia, that implements a family of conformance checking measures for process mining founded on the notion of entropy from information theory. The measures allow quantifying classical non-deterministic and stochastic precision and recall quality criteria for process models automatically discovered f...
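The entropy notion behind these measures can be given a rough flavour as follows: for a strongly connected automaton, the topological entropy of the language it generates relates to the logarithm of the spectral radius of its adjacency matrix, and precision and recall can then be expressed as ratios of such entropies. The sketch below is not Entropia's actual interface or construction (in particular, its short-circuit treatment of finite languages differs), and the adjacency matrices are hypothetical stand-ins.

```python
import numpy as np

def topological_entropy(adjacency):
    """log2 of the spectral radius of an automaton's adjacency matrix."""
    eigenvalues = np.linalg.eigvals(np.asarray(adjacency, dtype=float))
    return float(np.log2(np.max(np.abs(eigenvalues))))


# Hypothetical automata standing in for the log, the model, and their
# intersection (here the model's behaviour is assumed to be contained
# in the log's behaviour, so the intersection equals the model).
ent_log = topological_entropy([[1, 1], [1, 1]])    # ~1.0, richer behaviour
ent_model = topological_entropy([[1, 1], [1, 0]])  # ~0.694
ent_both = ent_model
print("precision ~", ent_both / ent_model)  # 1.0: the model allows nothing extra
print("recall    ~", ent_both / ent_log)    # ~0.694: the model misses some log behaviour
```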
Robotic Process Automation (RPA) is a technology to develop software bots that automate repetitive sequences of interactions between users and software applications (a.k.a. routines). To take full advantage of this technology, organizations need to identify and to scope their routines. This is a challenging endeavor in large organizations, as routi...
Given an event log as a collection of recorded real-world process traces, process mining aims to automatically construct a process model that is both simple and provides a useful explanation of the traces. Conformance checking techniques are then employed to characterize and quantify commonalities and discrepancies between the log's traces and the...
According to our recent proposal, an information system is a combination of a process model captured as a Petri Net with Identifiers, an information model specified in the first-order logic over finite sets with equality, and a specification of how the transitions in the net manipulate information facts. The Information Systems Modeling (ISM) Suite...
State-of-the-art process discovery methods construct free-choice process models from event logs. Hence, the constructed models do not take into account indirect dependencies between events. Whenever the input behavior is not free-choice, these methods fail to provide a precise model. In this paper, we propose a novel approach for the enhancement of...