Markus Stumptner
University of South Australia | UniSA · Advanced Computing Research Centre
PhD
About
289 Publications
39,815 Reads
4,041 Citations
Additional affiliations
May 1986 - February 2001
February 2001 - present
Publications (289)
Substandard performance between information systems and applications remains a problem for the Architecture, Engineering, Construction, and Operations (AECO) sector leading to significant economic, social and environmental costs. The sector suffers from poor interoperability because it lacks a holistic ecosystem for exchanging data and information....
Semi-autonomous cyber security (“cyber") operations require effective communication between a human operator and the underlying cyber systems that carry out the mission. We show a goal-driven approach to specifying mission objectives of such systems, where the system has controlled autonomy to refine the goals into executable plans. An ontology ali...
Maintenance of assets is a multi-million dollar cost each year for asset-intensive organisations in the defence, manufacturing, resource and infrastructure sectors. These costs are tracked through maintenance work order (MWO) records. MWO records contain structured data for dates, costs, and asset identification and unstructured text describing the...
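As a rough illustration of what an MWO record combining structured fields and unstructured text might look like, the following Python sketch tags failure-related keywords in the free-text description. The field names and keyword vocabulary are invented for illustration; the paper's actual text-processing approach is not shown here.

# Illustrative only: a toy maintenance work order (MWO) record combining
# structured fields with unstructured text, plus a naive keyword tagger.
# Field names and the failure vocabulary are hypothetical.
from dataclasses import dataclass

@dataclass
class WorkOrder:
    order_id: str
    asset_id: str
    date: str          # structured fields
    cost: float
    description: str   # unstructured free text

FAILURE_TERMS = {"leak", "crack", "overheat", "vibration", "seized"}

def tag_failures(wo: WorkOrder) -> set:
    """Return the failure-related keywords found in the free-text description."""
    return FAILURE_TERMS.intersection(wo.description.lower().split())

wo = WorkOrder("MWO-001", "PUMP-17", "2020-03-04", 1250.0,
               "replaced bearing after vibration and oil leak")
print(tag_failures(wo))   # {'vibration', 'leak'} (order may vary)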
In the era of Industry 4.0, digital twins bridge the gap between the physical and digital worlds, enabling early-detection of issues, increased production and efficiency, among other benefits. The development of digital twins begins early in the life cycle of a system/plant by using the data from various information systems as it passes through its...
Achieving real-time agility and adaptation with respect to changing requirements in existing IT infrastructure can pose a complex challenge. We explore a goal-oriented approach to managing this complexity. We argue that a goal-oriented perspective can form an effective basis for devising and deploying responses to changed requirements in real-time....
This research proposes recommendations that could improve interoperability in the architecture, engineering, construction and operations (AECO) sector, by connecting domains, building lifecycles, and software systems with each other and the web. The objective has been to identify methods that promote evolution from file-based formats by advancing o...
The integrated management of industrial systems in future environments like Industry 4.0 requires the effective management of information throughout the engineering life cycle. As systems pass through phases of design, construction, operation, maintenance, renewal or replacement, they will be administered via different information ecosystems, requi...
This research reveals insights that can improve interoperability in the architecture, engineering, construction and operations (AECO) sector. Research design centres on a comparative review of standards and systems in the AECO and Oil & Gas sectors. For both sectors, different data exchange standards and specifications, and the available solutions th...
The management of health and safety plays an important role in safety performance, and is therefore an important foundational element in an organisation's overall sustainable development. Many organisations are now able to collect vast amounts of data in an attempt to shed light on the underlying causes behind accidents and safety-related inc...
Knowledge Graphs (KGs), as one of the key trends which are driving the next wave of technologies, have now become a new form of knowledge representation, and a cornerstone for several applications from generic to specific industrial use cases. However, in some specific domains such as law enforcement, a real and large domain-oriented KG is often un...
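A knowledge graph is, at its core, a set of subject-predicate-object triples that can be queried by pattern. The minimal Python sketch below illustrates only that basic representation; the entities are invented examples and not the law-enforcement KG the paper is about.

# Minimal sketch of a knowledge graph as subject-predicate-object triples
# with a naive pattern query. Entities and relations are invented examples.
triples = {
    ("alice", "knows", "bob"),
    ("bob", "owns", "vehicle_42"),
    ("vehicle_42", "registered_in", "adelaide"),
}

def query(s=None, p=None, o=None):
    """Return all triples matching the given pattern; None acts as a wildcard."""
    return [(ts, tp, to) for ts, tp, to in triples
            if (s is None or ts == s)
            and (p is None or tp == p)
            and (o is None or to == o)]

print(query(p="owns"))          # who owns what
print(query(s="vehicle_42"))    # everything known about vehicle_42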
Entity resolution (ER) is the problem of accurately identifying multiple, differing, and possibly contradicting representations of unique real-world entities in data. It is a challenging and fundamental task in data cleansing and data integration. In this work, we propose graph differential dependencies (GDDs) as an extension of the recently develo...
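The intuition behind dependency-style matching rules for entity resolution is "if two records agree closely enough on certain attributes, treat them as the same real-world entity". The sketch below shows only that intuition with a made-up rule and similarity threshold; the formal graph differential dependencies (GDDs) defined in the paper are considerably richer.

# Rough illustration of a dependency-style matching rule for entity
# resolution. The rule, threshold, and attributes are hypothetical.
from difflib import SequenceMatcher

def sim(a, b):
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def same_entity(r1, r2):
    # Toy rule: name similarity >= 0.8 AND exact match on date of birth.
    return sim(r1["name"], r2["name"]) >= 0.8 and r1["dob"] == r2["dob"]

r1 = {"name": "Jon Smith",  "dob": "1980-01-02"}
r2 = {"name": "John Smith", "dob": "1980-01-02"}
print(same_entity(r1, r2))   # True: likely the same person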
With the digital transformation of industries as proposed by Industry 4.0, there will be an increased amount of data collected and exchanged between enterprise systems. Software developers and domain experts are exposed to complex data specifications when dealing with enterprise interoperability. It is a major challenge to understand standards spec...
The application of data analytics has delivered significant value in a broad range of industries, but has frequently failed to bring about operational efficiencies or process safety improvements in asset-intensive sectors despite a profound increase in the volume of digital information that is stored by companies operating in this domain. Process s...
The advent of Industrial Internet of Things (IIoT) technology has significantly optimized industrial operations management by connecting industrial assets with information systems and, hence, with business processes. The IIoT forms the backbone for materializing the Industry 4.0 initiative. Actionable insights obtained from industrial analytics...
Model transformations are an important aspect of Model-Driven Engineering as models throughout the software development process are transformed and refined until, finally, application code is generated. However, model transformations are complex to build, maintain, and verify for correctness. We propose the combination of visual contracts, an imple...
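For readers unfamiliar with model transformations, the sketch below shows the general shape of a model-to-model rule, mapping a toy "class" element to a toy "table" element. The metamodels are invented, and the visual-contract verification technique proposed in the paper is not represented here.

# Illustrative only: a minimal model-to-model transformation rule.
def class_to_table(cls):
    """Map a toy class-diagram element onto a toy relational-schema element."""
    return {
        "table": cls["name"].lower(),
        "columns": ["id"] + [a["name"] for a in cls["attributes"]],
    }

source = {"name": "Customer",
          "attributes": [{"name": "name"}, {"name": "address"}]}
print(class_to_table(source))
# {'table': 'customer', 'columns': ['id', 'name', 'address']}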
Traditionally the integration of data from multiple sources is done on an ad-hoc basis for each analysis scenario and application. This is a solution that is inflexible, incurs high costs, leads to “silos” that prevent sharing data across different agencies or tasks, and is unable to cope with the modern environment, where workflows, tasks, and pri...
Due to the volume, variety, and veracity of network data available, information fusion and reasoning techniques are needed to support network analysts’ cyber-situational awareness. These techniques rely on formal knowledge representation to define the network semantics with data provenance at various levels of granularity. To this end, this paper p...
Cyber-situational awareness is crucial to applications such as network monitoring and management, vulnerability assessment, and defense. To gain improved cyber-situational awareness, analysts can benefit from automated reasoning-based frameworks. However, such frameworks would require the processing of enormous amounts of network data, which are ch...
The development of systems of systems or the replacement of processes or systems can create unknowns, risks, delays and costs which are difficult to understand and characterise, and which frequently lead to unforeseen issues resulting in overspend or avoidance. Yet maintaining state of the art processes and systems and utilising best of breed com...
Of the core challenges originally associated with Big Data, namely Volume, Velocity, and Variety, the Variety aspect is the one that is least addressed by the standard analytics architectures. In this chapter, we analyze types and sources of variety and describe data- and metadata management principles for organizing data lakes. We discuss how sema...
Multi-level modeling is currently regaining attention in the database and software engineering community with different emerging proposals and implementations. One driver behind this trend is the need to reduce model complexity, a crucial aspect in a time of analytics in Big Data that deal with complex heterogeneous data structures. So far no stand...
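The core multi-level idea is that a model element can act simultaneously as an instance of the level above and as a type for the level below (often called a clabject). The Python sketch below illustrates that idea generically; it is not the specific standardisation proposal discussed in the paper.

# Generic sketch of the multi-level ("clabject") idea: every element can be
# instantiated again, so it behaves as both an instance and a type.
class Clabject:
    def __init__(self, name, meta=None, **attrs):
        self.name, self.meta, self.attrs = name, meta, attrs

    def instantiate(self, name, **attrs):
        return Clabject(name, meta=self, **attrs)

ProductType = Clabject("ProductType")                    # top level
Phone = ProductType.instantiate("Phone", tax_rate=0.1)   # instance of ProductType, type for phones
my_phone = Phone.instantiate("my_phone", serial="X123")  # concrete instance
print(my_phone.meta.name, my_phone.meta.meta.name)       # Phone ProductType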
Constructive combat simulation is widely used across Defence Science and Technology Group, typically using behavioural models written by software developers in a scripting or programming language for a specific simulation. This approach is time-consuming, can lead to inconsistencies between the same behaviour in different simulations, and is diffic...
Allocation of resources to improve security is crucial when we consider people's safety on transport systems. We show how a system engineering methodology can be used to link business intelligence and railway specifics toward better value for money. A model is proposed to determine the probability of success in service management. The forecasting m...
Interoperability between heterogeneous software ecosystems at increasing scale remains a major challenge. The automated translation of data between the data models and languages built around official or de facto standards is best addressed using model-driven engineering techniques, but requires handling both data and multiple levels of meta-data wi...
In the past two decades, business process research has focused on process flexibility to facilitate the operation of business processes in an open and dynamic environment. This is important to ensure that processes accurately reflect and handle changes occurring in the real-world. While substantial existing work has investigated changes in business...
Traditionally the integration of data from multiple sources is done on an ad-hoc basis for each analysis scenario and application. This is a solution that is inflexible, incurs high costs, leads to "silos" that prevent sharing data across different agencies or tasks, and is unable to cope with the modern environment, where workflows, tasks, and...
One of the most significant challenges in information system design is the constant and increasing need to establish interoperability between heterogeneous software systems at increasing scale. The automated translation of data between the data models and languages used by information ecosystems built around official or de facto standards is best a...
Business constraints in general, and temporal constraints, in particular, play a crucial role in business process management. They are specified to ensure that business processes or their component steps are performed according to the time restrictions of the real world context of the process. A number of temporal solutions with the aim of monitori...
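A simple example of such a temporal constraint is a maximum duration between two events of a process instance. The sketch below checks one such constraint over a toy execution trace; the event names and the two-hour limit are invented for illustration, not taken from the paper.

# Minimal sketch: check "activity B must finish no more than 2 hours after
# activity A starts" against a toy execution trace.
from datetime import datetime, timedelta

trace = {
    "A_start":  datetime(2021, 5, 1, 9, 0),
    "B_finish": datetime(2021, 5, 1, 11, 30),
}

def violates_max_duration(trace, start_event, end_event, limit):
    return trace[end_event] - trace[start_event] > limit

print(violates_max_duration(trace, "A_start", "B_finish", timedelta(hours=2)))
# True: the constraint is violated by 30 minutes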
Allocation of resources to improve security is crucial when we consider people’s safety on transport systems. We show how a system engineering methodology can be used to link business intelligence and railway specifics toward better value for money. A model is proposed to determine the probability of success in service management. The forecasting m...
Business processes are composed mainly of activities and events. The latter have gained much focus recently, which has resulted in a drift towards Event-Driven Business Process Management (EDBPM). Events are used in both monitoring and controlling the execution of business processes. They are considered to be instantaneous and their content cannot...
The ongoing research and development in the field of Natural Language Processing has led to a great number of technologies in this context. There have been major benefits when it comes to bringing together the worlds of natural language and semantic technologies, so more and more potential areas of application emerge. One of these is the subject of...
In today's enterprise environment, business processes no longer operate in an isolated fashion, driven purely by human input. Instead, they exchange information across organisations as well as interacting directly with sensors and actuators in the Internet of Things. This means that traditional assumptions about processes having a "perfect" view of...
One of the most significant challenges in information system design is the constant and increasing need to establish interoperability between heterogeneous software systems at increasing scale. Beyond individual applications, today’s enterprise applications require automated information exchange across the system lifecycle of information ecosystems...
Modern organisations are increasingly moving from traditional monolithic business systems to environments where more and more tasks are outsourced to third party providers. Therefore, processes must operate in an open and dynamic environment in which the management of time plays a crucial role. Handling time, however, remains a challenging issue ye...
One of the most significant challenges in information system design is the constant and increasing need to establish interoperability between heterogeneous software systems at increasing scale. The automated translation of data between the data models and languages used by information ecosystems built around official or de facto standards is best a...
During the past decade, much effort has been invested in developing standards to overcome data and software interoperability barriers in the oil and gas industry. Whereas syntactic integration is no longer a problem, semantic integration still remains an open challenge. To overcome this problem, standards provide more and more complex structures to...
This paper addresses the problem of transforming business specifications written in natural language into formal models suitable for use in information systems development. It proposes a method for transforming controlled natural language specifications based on the Semantics of Business Vocabulary and Business Rules standard. This approach is uniq...
In large organizations, multiple stakeholders may modify the same business process. This paper addresses the problem when stakeholders perform changes on process views which become inconsistent with the business process and other views. Related work addressing this problem is based on execution trace analysis which is performed in a post-analysis p...
Modularity has been a key issue in the design and development of modern embedded Real-Time Software Systems (RTS) where modularity enables flexibility with respect to changes in platform, environment, and requirements, as well as reuse. In distributed RTS, similar ideas have led to the adoption of Commercial Off-The-Shelf (COTS) components integrat...
Many attempts have been made to apply Natural Language Processing to requirements specifications. However, typical approaches rely on shallow parsing to identify object-oriented elements of the specifications (e.g. classes, attributes, and methods). As a result, the models produced are often incomplete, imprecise, and require manual revision and va...
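To make the limitation concrete, the toy sketch below does the kind of shallow pattern matching such approaches rely on, pulling a candidate class and its attributes out of one requirements sentence. The pattern and sentence are invented, and the sketch is far simpler than the deeper analysis argued for in the paper.

# Illustrative only: naive shallow parsing of a requirements sentence into
# object-oriented elements. Real specifications need far deeper analysis.
import re

SENTENCE = "A customer has a name and an address."

match = re.match(r"An? (\w+) has an? (\w+)(?: and an? (\w+))?", SENTENCE)
if match:
    cls = match.group(1).capitalize()
    attrs = [g for g in match.groups()[1:] if g]
    print(f"class {cls}: attributes {attrs}")   # class Customer: attributes ['name', 'address']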
In large organisations different stakeholders are usually responsible for the management of business processes. This paper addresses the situation where multiple stakeholders with different interests are handling parts of the same business processes. One technique to support this is process views, where each stakeholder holds a personal view which...
Effective exchange of information about processes and industrial plants, their design, construction, operation, and maintenance requires sophisticated information modelling and exchange mechanisms that enable the transfer of semantically meaningful information between a vast pool of heterogeneous information systems. In order to represent entities...
Languages that combine aspects of probabilistic representations with aspects of first-order logic are referred to as first-order probabilistic languages (FOPLs). FOPLs can be divided into three categories: rule-based, procedural-based and entity-relation-based languages. This article presents a survey of directed entity-relation-based FOPLs and th...
The design process for large systems, e.g., industrial plants, involves large multi-disciplinary teams. Since each discipline has its own specialised concerns, the common thread is describing the functional requirements of an artefact. In the oil and gas industry, Engineering, Procurement & Construction (EPC) companies are responsible for designing...
Even with modern software development methodologies, the actual debugging of source code, i.e., location and identification of errors in the program when errant behavior is encountered during testing, remains a crucial part of software development. To apply model-based diagnosis techniques, which have long been state of the art in hardware diagnosi...
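The classical core of consistency-based (model-based) diagnosis is that, given conflict sets (sets of components that cannot all be behaving correctly), the candidate diagnoses are the minimal hitting sets of those conflicts. The sketch below shows that textbook idea on two toy conflicts among program statements; it is not the specific program model developed in this work.

# Textbook sketch: candidate diagnoses as minimal hitting sets of conflicts.
from itertools import combinations

def minimal_hitting_sets(conflicts):
    components = set().union(*conflicts)
    hits = []
    for size in range(1, len(components) + 1):
        for cand in combinations(sorted(components), size):
            # keep cand if it intersects every conflict and contains no smaller hit
            if all(set(cand) & c for c in conflicts) and \
               not any(set(h) <= set(cand) for h in hits):
                hits.append(cand)
    return hits

# Two conflicts among statements s1..s3 of a faulty program fragment:
print(minimal_hitting_sets([{"s1", "s2"}, {"s2", "s3"}]))
# [('s2',), ('s1', 's3')]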
Configuration models specify the set of possible configurations (solutions). A configuration model together with a defined set of (customer) requirements are the major elements of a configuration task (problem). In this chapter, we discuss different knowledge representations that can be used for the definition of a configuration model. We provide e...
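As a small concrete example of a configuration model, the sketch below defines component variables with finite domains plus one compatibility constraint and enumerates the valid configurations by brute force. The PC-style components are invented; real configurators use the richer knowledge representations discussed in the chapter.

# Minimal constraint-based configuration model, solved by brute force.
from itertools import product

domains = {
    "cpu": ["basic", "fast"],
    "psu": ["300W", "500W"],
    "gpu": ["none", "discrete"],
}

def valid(cfg):
    # Constraint: a fast CPU or a discrete GPU requires the 500W power supply.
    return not ((cfg["cpu"] == "fast" or cfg["gpu"] == "discrete")
                and cfg["psu"] != "500W")

solutions = [dict(zip(domains, values))
             for values in product(*domains.values())
             if valid(dict(zip(domains, values)))]
print(len(solutions), "valid configurations")   # 5 valid configurations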
Exploring and understanding large business process models are important tasks in the context of business process management. In recent years, several techniques have been proposed for the abstraction of business processes. Automated abstraction techniques have been devised for verifying correctness and consistency of process models and for providin...
Many modern product offerings are bundles of physical and service elements. The mass-customization by configuration approach is applicable to services, but may require extensions to traditional configuration techniques. The service production process is more intimately involved with the customer and his/her environment, which may need to be modeled...
Developing a product configuration system is a nontrivial and challenging task for various reasons. First, the domain knowledge that has to be encoded into the system is often spread over several departments or functions within a company. Besides that, in many cases data from existing information systems have to be integrated into the configurator....
Model-driven approaches to establishing interoperability between information systems have recently embraced meta-modelling frameworks spanning multiple levels. However, no consensus has yet been established as to which techniques adequately support situations where heterogeneous domain-specific models must be linked within a common modelling approa...
Multi-level modelling is currently regaining attention in the database and software engineering community with different emerging proposals and implementations. One driver behind this trend is to reduce model complexity, a crucial aspect in a time of big data research in which more and more data from different sources are required to be integrated....
The ISO 15926 standard was developed to facilitate the integration of life-cycle data of process plants. The core of the standard is a highly generic and extensible data model trying to capture a holistic view of the world. We investigated the standard from a software modelling point of view and identified some challenges in terminology, circular d...
Modularity has been a key issue in the design and development of modern embedded Real-Time Software Systems (RTS), where modularity enables flexibility with respect to changes in platform, environment, and requirements, as well as reuse. In distributed RTS, similar ideas have led to the adoption of COTS components integrated via Service-Oriented Ar...
In large organisations multiple stakeholders may modify the same business process. This paper addresses the problem when stakeholders perform changes on process views which become inconsistent with the business process and other views. Related work addressing this problem is based on execution trace analysis which is performed in a post-analysis ph...