Article

Object-Oriented Event-Graph modeling formalism to simulate manufacturing systems in the Industry 4.0 era

Abstract

In the event-based modeling formalism, a system is modeled by defining the changes that occur at event times, and the system dynamics can be described using an ‘Event Graph’. An event graph model is a network of event nodes describing the events that take place in the system and the relationships among these events. In the paper we extend the Event Graph framework by proposing an event-graph modeling formalism suitable for representing discrete event simulation models developed through the object-oriented approach, named Object-Oriented Event Graph (OOEG). The importance of object-oriented simulation has recently been growing under the Industry 4.0 paradigm, in which optimization via simulation, real-time simulation, automatic decision systems based on simulation, and online scenario analysis play a relevant role. The OOEG representation provides the same compactness as the EG representation. Its advantage is that it supports a modelling methodology where systems are described by linking components analogously to how the real system's components are linked, and the dynamics of the system can be visualized in terms of interactions between objects, which have their physical correspondence in the real world.
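The event-graph idea in the abstract can be illustrated with a minimal, self-contained sketch (not taken from the paper; the names `Simulator` and `Machine` and all parameter values are illustrative). Event methods on a machine object play the role of event nodes, and the `schedule()` calls inside them play the role of the graph's scheduling edges — which is the essence of the object-oriented event-graph view:

```python
import heapq

class Simulator:
    """Minimal event scheduler: a future-event list ordered by time."""
    def __init__(self):
        self.clock = 0.0
        self._fel = []
        self._seq = 0  # tie-breaker so simultaneous events never compare handlers

    def schedule(self, delay, handler, *args):
        heapq.heappush(self._fel, (self.clock + delay, self._seq, handler, args))
        self._seq += 1

    def run(self, until):
        while self._fel and self._fel[0][0] <= until:
            self.clock, _, handler, args = heapq.heappop(self._fel)
            handler(*args)

class Machine:
    """Each object owns its event logic; the 'graph' is the set of
    schedule() calls inside the event methods."""
    def __init__(self, sim, service_time):
        self.sim, self.service_time = sim, service_time
        self.queue, self.busy, self.completed = 0, False, 0

    def arrive(self):                                # event node: Arrival
        self.queue += 1
        if not self.busy:
            self.sim.schedule(0.0, self.start)       # edge: Arrival -> Start

    def start(self):                                 # event node: Start service
        self.busy = True
        self.queue -= 1
        self.sim.schedule(self.service_time, self.finish)  # edge: Start -> Finish

    def finish(self):                                # event node: Finish service
        self.busy = False
        self.completed += 1
        if self.queue > 0:
            self.sim.schedule(0.0, self.start)       # edge: Finish -> Start (queue nonempty)

sim = Simulator()
m = Machine(sim, service_time=2.0)
for t in (0.0, 1.0, 2.5):            # three deterministic arrivals
    sim.schedule(t, m.arrive)
sim.run(until=10.0)
print(m.completed)  # → 3
```

Because each `Machine` owns its own Arrival → Start → Finish logic, several such objects could be linked to each other the same way the real components are linked, as the abstract describes.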
... The object-oriented approach is suitable for modeling structures such as graphs and trees [40,41], and its implementation enables the development and flexible modification of readymade software, as well as a reduction in expenses for the realization of new subject-oriented systems owing to the usage of ready-to-use templates. ...
... To solve the given problems, the usage of iterative algorithms was reviewed; their essence involves executing multiple iterations, where the arguments are changed to some ...
Article
Full-text available
One of the key tools in an organization’s performance management is the goal tree, which is used for solving both direct and inverse problems. This research deals with goal setting based on a model of the future, by presenting the goal and subgoals in the form of concrete quantitative and qualitative characteristics and the stepwise formation of factors. A stepwise solution to a factor-generation problem is considered on the basis of mathematical symmetry. This paper presents an algorithm for solving hierarchical inverse problems with constraints, based on recursively traversing the vertices that constitute the separate characteristics. Iterative methods, modified for the case of nonlinear models and the calculation of constraints, were used to generate solutions to the subproblems. To realize the algorithm, an object-oriented architecture, which simplifies the creation and modification of software, was elaborated. Computational experiments with five types of models were conducted, and the solution to a problem related to fast-food restaurant profit generation was reviewed. Distance-from-target metrics and t-statistics were calculated to test the received results, and solutions to the subproblems were also obtained with a mathematical package using optimization models and a method of inverse calculations. The results of the computational experiments confirm compliance with the set constraints and agree with the solutions of the separate subproblems obtained with the mathematical package. The cases in which the highest solution accuracy was reached are identified.
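The inverse problem on a goal tree described above can be sketched in miniature. This is not the paper's algorithm (which handles nonlinear models and constraints); the purely additive tree, the `GoalNode` class, and the proportional priority weights are simplifying assumptions for illustration:

```python
class GoalNode:
    """Node in a goal tree: a leaf holds a controllable factor; an inner
    node's value is the sum of its children's values."""
    def __init__(self, name, value=0.0, children=(), weights=()):
        self.name, self.children = name, list(children)
        self.weights = list(weights)   # share of a parent's change per child (sums to 1)
        self._value = value

    def value(self):
        if not self.children:
            return self._value
        return sum(c.value() for c in self.children)

    def solve_inverse(self, target):
        """Recursively distribute the change (target - current value)
        over the subtree according to the priority weights."""
        delta = target - self.value()
        if not self.children:
            self._value += delta
            return
        for child, w in zip(self.children, self.weights):
            child.solve_inverse(child.value() + w * delta)

# Toy additive profit tree: profit = margin_a + margin_b
a = GoalNode("margin_a", 40.0)
b = GoalNode("margin_b", 60.0)
profit = GoalNode("profit", children=[a, b], weights=[0.25, 0.75])
profit.solve_inverse(120.0)            # raise profit from 100 to 120
print(a.value(), b.value())            # → 45.0 75.0
```

The recursion mirrors the paper's traversal idea: the root target is decomposed into subproblem targets for each child, down to the controllable leaf factors.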
... ALS is a discrete event simulator of assembly lines developed in Java under the object-oriented simulation paradigm. The importance of object-oriented simulation has recently been growing under the Industry 4.0 paradigm, in which optimization via simulation, real-time simulation, automatic decision systems based on simulation, and online scenario analysis play a relevant role (Tiacci, 2020). ALS is capable of immediately calculating the average cycle time of a complex line by simply receiving as inputs the task time durations, the line configuration, and the sequence of models entering the line. ...
Article
Full-text available
In the paper, two popular techniques for improving the efficiency of mixed-model asynchronous assembly lines are compared: the allocation of buffers within work centers and the optimization of the sequence of models entering the line. The comparison has been performed on a set of benchmark instances related to the MALBP (Mixed-model Assembly Line Balancing Problem). In fact, the buffer allocation problem (BAP) and the sequencing problem (SP) are strictly connected to the MALBP, because balancing decisions, buffer allocation and sequencing optimization have a direct impact on the line throughput. The presented approach allows the BAP, the SP and the MALBP to be solved simultaneously for asynchronous unpaced lines. In this way, through an appropriate design of experiments, it is possible to compare the different solutions found for the benchmark instances and to quantify the impact of buffer allocation and sequencing optimization on the quality of the solutions.
... Meanwhile, BN is also expected to serve as a bridge for hybrid mode in PdM since it is a graph-based data-driven approach. Furthermore, recent works have applied KG to represent and cognize the equipment mechanism in an event graph [112], which is expected to cooperate with data-driven models for cognitive PdM in the future. ...
Article
Full-text available
Predictive Maintenance (PdM) has continually attracted interest from the manufacturing community due to its significant potential for reducing unexpected machine downtime and related costs. Much of the existing PdM research has focused on perceiving the fault, while the identification and estimation processes are affected by many factors. Many existing approaches have not been able to manage existing knowledge effectively for reasoning about the causal relationships of faults. Meanwhile, a complete correlation analysis of identified faults and the corresponding root causes is often missing. To address this problem, graph-based approaches (GbA) with cognitive intelligence are proposed, because GbA are superior in semantic causal inference, heterogeneous association, and visualized explanation. In addition, GbA can achieve promising performance on PdM's perception tasks by revealing the dependency relationships among parts/components of the equipment. However, despite their advantages, few papers discuss cognitive inference in PdM, let alone GbA. Aiming to fill this gap, this paper concentrates on GbA and carries out a comprehensive survey organized by the sequential stages in PdM, i.e., anomaly detection, diagnosis, prognosis, and maintenance decision-making. Firstly, GbA and their corresponding graph construction methods are introduced. Secondly, the implementation strategies and instances of GbA in PdM are presented. Finally, challenges and future works toward cognitive PdM are proposed. It is hoped that this work can provide a fundamental basis for researchers and industrial practitioners in adopting GbA-based PdM, and initiate several future research directions to achieve cognitive PdM.
... Has not been designed for reconfigurations due to out-ofordinary events (Tiacci, 2020) Event graph modeling for simulation ...
Article
Full-text available
Adoption of digital twins in smart factories, that model real statuses of manufacturing systems through simulation with real time actualization, are manifested in the form of increased productivity, as well as reduction in costs and energy consumption. The sharp increase in changing customer demands has resulted in factories transitioning rapidly and yielding shorter product life cycles. Traditional modeling and simulation approaches are not suited to handle such scenarios. As a possible solution, we propose a generic data-driven framework for automated generation of simulation models as basis for digital twins for smart factories. The novelty of our proposed framework is in the data-driven approach that exploits advancements in machine learning and process mining techniques, as well as continuous model improvement and validation. The goal of the framework is to minimize and fully define, or even eliminate, the need for expert knowledge in the extraction of the corresponding simulation models. We illustrate our framework through a case study.
... In this context, sentiment analysis techniques are increasingly being used to obtain accurate, explainable, and traceable results, as well as to better capture dialogue structure and tone. In this study, we use the sentiment technique to perform prediction and to address various problems in extending linear models to the knowledge graph (Jiao et al., 2020) (Tiacci, 2020; R. . ...
Article
Within machine learning, knowledge graph techniques are advancing swiftly; however, basic models are unable to grasp all the richness of text coming from personal web graphics, social media, ads, diaries, etc., ignoring the semantics of the underlying text. The knowledge graph provides a practical way to extract structured knowledge from texts and images for neural networks, to expedite their semantic examination. In this study, we propose a new hybrid analytic approach for sentiment evaluation based on knowledge graphs, to identify the polarity of sentiment, positive or negative, in short documents, particularly tweets. We build graphs from the tweets, then apply graph-similarity metrics and classification algorithms to obtain sentiment predictions. This technique facilitates explainability and clarifies the results in the knowledge graph. We also compare our approach against n-gram embedding-based sentiment analysis, and the results indicate that it can outperform classical n-gram models, with an F1-score of 89% and recall of up to 90%.
... The most straightforward manner is to transform the working process into a knowledge graph [33] or to disassemble the components into nodes in the knowledge graph [34]. Similarly, an event graph is generated to simulate the manufacturing process and represent the event logic by setting events as entities in graph form [35]. ...
Article
Full-text available
Empowered by advanced cognitive computing, industrial IoT, and data analytics techniques, today’s smart manufacturing systems are ever-increasingly equipped with cognitive capabilities, moving towards an emerging and promising Self-X cognitive manufacturing network. Nevertheless, to the best of our knowledge, the readiness of ‘Self-X’ levels (e.g., self-configuration, self-optimization, and self-adjusting/adaptive/healing) is still in its infancy. To pave the way, this work introduces an industrial knowledge graph (IKG)-based multi-agent reinforcement learning (MARL) method for approaching the Self-X cognitive manufacturing network. Firstly, an IKG is formulated, as the cognitive manufacturing network, based on the empirical knowledge and patterns extracted from the manufacturing process by leveraging massive human-generated and machine-sensed multimodal data. Then, a proposed graph neural network-based embedding algorithm is performed on a comprehensive understanding of the IKG, to achieve semantic-based self-configurable solution searching and task decomposition. Furthermore, a MARL-enabled decentralized system is introduced to self-optimize the manufacturing process and to further complement the IKG towards the Self-X cognitive manufacturing network. Finally, an illustrative example of a multi-robot reaching task is conducted to validate the feasibility of the proposed approach. As an explorative study, limitations and future perspectives are also highlighted to attract more open discussion and in-depth research towards ever smarter manufacturing.
... The most straightforward manner is to transform the working process into a graph (Alsafi and Vyatkin 2010) or to disassemble the components into nodes in the graph (Hedberg et al. 2020). Furthermore, an event graph is generated to simulate and understand the manufacturing process and to represent the event logic by setting events as entities in graph form (Tiacci 2020). ...
Article
Full-text available
The material removal rate (MRR) plays a critical role in the chemical mechanical planarization (CMP) process in the semiconductor industry. Many physics-based and data-driven approaches have been proposed to date to predict the MRR. Nevertheless, most of them neglect the underlying equipment structure, which contains essential interaction mechanisms among different components. To fill the gap, this paper proposes a novel hypergraph convolution network (HGCN) based approach for predicting MRR in the CMP process. The main contributions include: 1) a generic hypergraph model to represent the interrelationships of complex equipment; and 2) a temporal prediction approach to learn the complex data correlations and high-order representations based on the hypergraph. To validate the effectiveness of the proposed approach, a case study is conducted comparing it with other cutting-edge models, which it outperforms on several metrics. It is envisioned that this research can also bring insightful knowledge to similar scenarios in the manufacturing process.
Article
Product-mix problems, where a range of products that generate different incomes compete for a limited set of production resources, are key to the success of many organisations. In their deterministic forms, these are simple optimisation problems; however, the consideration of stochasticity may turn them into analytically and/or computationally intractable problems. Thus, simulation becomes a powerful approach for providing efficient solutions to real-world product-mix problems. In this paper, we develop a simulator for exploring the cost of uncertainty in these production systems using Petri nets and agent-based techniques. Specifically, we implement a stochastic version of Goldratt's PQ problem that incorporates uncertainty in the volume and mix of customer demand. Through statistics, we derive regression models that link the net profit to the level of variability in the volume and mix. While the net profit decreases as uncertainty grows, we find that the system is able to effectively accommodate a certain level of variability when using a Drum-Buffer-Rope mechanism. In this regard, we reveal that the system is more robust to mix than to volume uncertainty. Later, we analyse the cost-benefit trade-off of uncertainty reduction, which has important implications for professionals. This analysis may help them optimise the profitability of investments. In this regard, we observe that mitigating volume uncertainty should be given higher consideration when the costs of reducing variability are low, while the efforts are best concentrated on alleviating mix uncertainty under high costs.
Article
Simulation facilitates the understanding and improvement of complex systems. Conceptual modelling is a key step in simulation studies. It has gained recognition because it may both increase engagement with stakeholders and decrease the time to implement a simulation. This scoping review’s objective is to highlight approaches and platforms for electronically representing models from 1999 to 2020. The motivation is that electronic representations facilitate the sharing of conceptual models. The contribution of the review to research on conceptual modelling and simulation is to show that conceptual models are electronically represented by, broadly speaking, either General-Purpose or Domain-Specific Modelling Languages, with a slight trend towards the latter in order to better handle application specificities and reduce ambiguity in model representations. Thus, we identify modelling approaches, platforms, and features for electronically representing conceptual models with the potential to fill the gap between conceptual models and their corresponding simulation implementations.
Article
Full-text available
The importance of technological investment is growing within the scope of Industry 4.0. This article aims to analyze the level of technological investment in Alagoas from the perspective of Industry 4.0. Specifically, it seeks to identify the profile of the industrial structure and to analyze the composition of technological investments, the use of Industry 4.0 technologies, and the technologies in which industry intends to invest in the coming years. It addresses the following research question: what are the limits and impacts of technological investment from the Industry 4.0 perspective? The study is quali-quantitative and descriptive with respect to its objectives. To this end, a survey was applied to a sample of 150 industrial companies, yielding a margin of error of 7.5% with a 95% confidence interval. The results indicate that industry has a low level of investment in Industry 4.0 enabling technologies, but higher levels are expected in the coming years.
Article
Full-text available
Assembly system design defines proper configurations and efficient management strategies to maximize assembly system performance. Beyond assembly line balancing and scheduling, several other dimensions of this problem have to be considered. Furthermore, assembly system design has to consider the industrial environment in which the system operates. The latest industrial revolution, namely Industry 4.0, leverages Internet-connected and sensorized machines to manufacture customer-designed products. This paper proposes an original framework which investigates the impact of Industry 4.0 principles on assembly system design. The traditional dimensions of this problem are described along with the evolution of the industrial environment over the last three centuries. Concerning the latest industrial revolution, the technological innovations which enabled the digitalization of the manufacturing process are presented. The application of these enabling technologies to the assembly domain results in a new generation of assembly systems, defined here as assembly system 4.0. Finally, the distinctive characteristics of these novel systems are proposed and described in detail.
Article
Full-text available
In the paper we propose an approach to design asynchronous assembly lines in compliance with ergonomic legislation, tested on a real industrial case related to a company in the agricultural equipment segment. We considered the OCRA index as the method for ergonomic risk assessment, as it is the preferred method indicated in international norms for detailed risk assessment related to the handling of low loads at high frequency. A genetic algorithm approach able to integrate ergonomic risk evaluation and balancing/sequencing is proposed. The approach allows designing line configurations that take into account many characteristics of the complex scenario of real industrial cases: mixed-model assembly lines, stochastic task times, precedence constraints among tasks, and equipment and line feeding duplication costs associated with parallel workstations. Thanks to the integration of a discrete event simulator, it is also possible to consider the effect of blocking and starvation phenomena on the effective cycle time and on the workers’ ergonomic load. Compliance with ergonomic norms is often viewed by companies as an onerous obligation, as it is often associated with an increase in required manpower. Results show that, using the proposed approach, the extra costs due to compliance with ergonomic legislation can be very limited. This should encourage companies to adopt design methodologies able to comply with ergonomic norms while defending their profitability.
Article
Full-text available
In the paper a genetic algorithm approach is proposed to balance asynchronous mixed-model U-shaped lines with stochastic task times. U-shaped lines have become popular in recent years for their ability to outperform straight assembly lines in terms of line efficiency. The great majority of studies in the literature deal with paced synchronous U-shaped lines. Asynchronous lines can be more efficient than synchronous lines, but are more difficult to study, due to blocking and starvation phenomena caused by the variability of completion times: this makes it difficult to calculate the effective throughput. This variability, which in straight lines comes from the stochastic nature of task times and from the changing of models entering the line, is even higher in U-shaped lines, where an operator can work on two different models in the same cycle at the two sides of the line. For this reason, the proposed genetic algorithm is coupled with a parametric simulator for the evaluation of the objective function, which contains the simulated throughput. Two alternative chromosomal representations are tested on a large set of instances from the literature. The best solutions are also compared with the best solutions known in the literature, on the same instances, for straight lines with buffers and parallel workstations. The comparison shows that U-shaped lines are generally more efficient than straight lines with buffers. This is because crossover work centers naturally act similarly to unitary buffers, providing two places in which two loads can be placed simultaneously. The superiority of U-shaped lines holds true as long as it is possible to take full advantage of crossover work centers. For particular types of instances, depending on the distribution of task times, this possibility decreases, so that straight lines with parallel workstations and buffers are preferable.
Conference Paper
Full-text available
The goal of this panel was to discuss the state of the art in simulation optimization research and practice. The participants included representation from both academia and industry, where the latter was represented by participation from a leading software provider of optimization tools for simulation. This paper begins with a short introduction to simulation optimization, and then presents a list of specific questions that served as a basis for discussion during the panel discussion. Each of the panelists was given an opportunity to provide their preliminary thoughts on simulation optimization for this paper, and the remainder of the paper summarizes those points, ranging from assessments of the field from their perspective to initial reactions to some of the posed questions. Finally, one of the panelists who has worked on an online testbed of simulation optimization problems for the research community was asked to provide an update of the status of the Web site.
Article
Full-text available
The buffer allocation problem (BAP) and the assembly line balancing problem (ALBP) are amongst the most studied problems in the literature on production systems. However, they have so far been approached separately, although they are closely interrelated. This paper for the first time considers the two problems simultaneously. An innovative approach, consisting of coupling the most recent advances in simulation techniques with a genetic algorithm, is presented to solve a very complex problem: the Mixed-Model Assembly Line Balancing Problem (MALBP) with stochastic task times, parallel workstations, and buffers between workstations. A suitable chromosomal representation allows the solution space to be explored very efficiently, varying task assignments and buffer capacities among workstations simultaneously. A parametric simulator has been used to calculate the objective function of each individual, evaluating at the same time the effect of task assignment and buffer allocation decisions on the line throughput. The results of extensive experimentation demonstrate that using buffers can improve line efficiency. Even when considering a cost per unit of buffer space, it is often possible to find solutions that provide higher throughput than the case without buffers while having a lower design cost.
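The coupling of a genetic algorithm with a simulation-evaluated objective function can be sketched as follows. This is not the paper's method: the chromosome here encodes only buffer capacities, and an analytic `throughput_proxy` with diminishing returns stands in for the discrete-event simulator; all names and parameter values are illustrative assumptions.

```python
import random

def throughput_proxy(buffers, base=1.0, gain=0.08, cost_per_slot=0.01):
    """Stand-in for the simulator: throughput rises with total buffer
    capacity with diminishing returns, minus a cost per buffer slot."""
    total = sum(buffers)
    return base + gain * total / (1 + 0.1 * total) - cost_per_slot * total

def evolve(n_buffers=4, max_cap=5, pop_size=20, generations=40, seed=0):
    rng = random.Random(seed)
    # chromosome = buffer capacity between each pair of workstations
    pop = [[rng.randint(0, max_cap) for _ in range(n_buffers)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=throughput_proxy, reverse=True)   # fitness = simulated throughput
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_buffers)           # one-point crossover
            child = p1[:cut] + p2[cut:]
            if rng.random() < 0.2:                      # mutation: re-roll one gene
                i = rng.randrange(n_buffers)
                child[i] = rng.randint(0, max_cap)
            children.append(child)
        pop = survivors + children
    return max(pop, key=throughput_proxy)

best = evolve()
print(best, round(throughput_proxy(best), 3))
```

In the paper's setting the chromosome would also carry task assignments, and each fitness evaluation would invoke the parametric simulator instead of the proxy; the GA loop itself is unchanged.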
Conference Paper
Full-text available
In a flat panel display (FPD) production line, unlike a table-type machine that processes one glass at a time, an inline cell works simultaneously on several glasses unloaded from different cassettes in a serial manner, and is divided into two types (uni-inline cell and bi-inline cell) according to the job loading and unloading behavior. In order to build a production simulator for this type of FPD production line, an object-oriented event graph modeling approach is proposed in which the FPD production line is simplified into a job shop consisting of the two types of inline cells, and the job shop is represented as an object-oriented event graph model. This type of job shop is referred to as a heterogeneous job shop. The resulting model is realized in a production simulator using an object-oriented event graph simulator and is illustrated with experimental results from the production simulator.
Conference Paper
In this paper, a methodology for the fast development of Discrete Event Simulation (DES) models is presented. The methodology works in two stages. In the first stage the modeler builds a Conceptual Model (CM) of the system to be modeled. A CM is represented as an Event Graph (EG); EGs are used to document the events in a system and their causes. In the second stage the CM is translated into an event-based DES model. To fulfill this task we developed a DES library, SharpSim, using the C# (CSharp) programming language. This paper gives an introduction to our methodology, provides an insight into SharpSim and EGs, and illustrates a modeling example.
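The two-stage idea — an event graph as conceptual model that is then executed as an event-based DES — can be sketched by representing the EG as plain data (nodes, plus edges carrying a delay and a condition) and interpreting it generically. This is a hedged illustration in Python rather than SharpSim's C# API; `run_event_graph` and the single-server-queue graph below are assumptions for the example:

```python
import heapq

def run_event_graph(nodes, edges, init, until):
    """Generic interpreter for an event graph given as data.
    nodes: {name: state-update function taking the state dict}
    edges: {name: [(delay, condition, target), ...]}
    init:  list of (time, event-name) pairs seeding the future-event list."""
    state = {"clock": 0.0}
    fel, seq = [], 0
    for t, ev in init:
        heapq.heappush(fel, (t, seq, ev))
        seq += 1
    while fel and fel[0][0] <= until:
        t, _, ev = heapq.heappop(fel)
        state["clock"] = t
        nodes[ev](state)                          # execute the event node's state change
        for delay, cond, target in edges.get(ev, []):
            if cond(state):                       # conditional scheduling edge
                heapq.heappush(fel, (t + delay, seq, target))
                seq += 1
    return state

# Single-server queue as an event graph: Arrive -> Start -> Finish
nodes = {
    "arrive": lambda s: s.update(q=s.get("q", 0) + 1),
    "start":  lambda s: s.update(q=s["q"] - 1, busy=True),
    "finish": lambda s: s.update(busy=False, done=s.get("done", 0) + 1),
}
edges = {
    "arrive": [(0.0, lambda s: not s.get("busy", False), "start")],
    "start":  [(2.0, lambda s: True, "finish")],
    "finish": [(0.0, lambda s: s["q"] > 0, "start")],
}
state = run_event_graph(nodes, edges, init=[(0.0, "arrive"), (1.0, "arrive")], until=10.0)
print(state["done"])  # → 2
```

Keeping the EG as data makes the conceptual model directly executable: the same interpreter runs any graph, so translating a CM into a DES model amounts to writing down its nodes and edges.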