Conference Paper

A Bayesian Network Based Approach for Root-Cause-Analysis in Manufacturing Process


Abstract

We describe an Early Warning System (EWS) that enables root cause analysis for initiating quality improvements on the manufacturing shop floor and in process engineering departments, at product OEMs as well as their tiered suppliers. The EWS combines a custom-designed domain ontology of manufacturing processes and failure-related knowledge, an innovative application of domain knowledge in the form of probability constraints, and a novel two-step constrained optimization approach to causal network construction. Probabilistic reasoning is the main vehicle for inference from the causal network. This inference engine provides the capability to perform root cause analysis in manufacturing scenarios and is thus a powerful tool for an automotive EWS. The technique is widely applicable and can be used in various contexts across the broader manufacturing industry as well.
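The abstract gives no implementation details, and the paper's own reference list points to Kevin Murphy's BNT toolbox rather than any Python library. Purely as an illustrative sketch (a substitution, not the authors' tooling), the fragment below uses pgmpy with an invented three-node process/defect fragment to show the kind of root-cause query such a causal network supports.

```python
# Minimal sketch (not the authors' implementation): a toy causal network
# linking a hypothetical process parameter and tool condition to a defect,
# queried for the likely root cause given an observed defect.
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

model = BayesianNetwork([("DieTemperature", "CrackDefect"),
                         ("ToolWear", "CrackDefect")])

cpd_temp = TabularCPD("DieTemperature", 2, [[0.9], [0.1]])   # 0 = normal, 1 = high
cpd_wear = TabularCPD("ToolWear", 2, [[0.8], [0.2]])         # 0 = low, 1 = high
cpd_defect = TabularCPD(
    "CrackDefect", 2,
    # P(defect | temperature, wear); illustrative numbers only
    [[0.98, 0.70, 0.80, 0.30],   # no defect
     [0.02, 0.30, 0.20, 0.70]],  # defect
    evidence=["DieTemperature", "ToolWear"], evidence_card=[2, 2])
model.add_cpds(cpd_temp, cpd_wear, cpd_defect)

infer = VariableElimination(model)
# Root-cause query: which upstream factors are most probable given a defect?
print(infer.query(["DieTemperature", "ToolWear"], evidence={"CrackDefect": 1}))
```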


... For generating tree structures, different authors have proposed several tools that depict the cause-and-effect relationships between nodes. De et al. [4] and Pradhan et al. [5] proposed the use of failure mode and effects analysis (FMEA); Pradhan et al. [5] and Nguyen et al. [6] utilized fishbone (cause-and-effect) diagrams; Pradhan et al. [5] also utilized fault-tree analysis and a variation sensitivity matrix; and finite element analysis (FEA) was used by Liu & Jin [7]. The precise construction of the tree structure of a BN from data is an NP-hard optimization problem [8]. ...
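Because exact structure recovery is NP-hard, practical implementations typically fall back on heuristic score-based search. A minimal sketch, assuming pgmpy's HillClimbSearch with a BIC score and entirely synthetic process data (the variable names are invented), is:

```python
# Illustrative only: hill-climbing structure search with a BIC score, the
# usual workaround for the NP-hardness of exact BN structure learning.
import numpy as np
import pandas as pd
from pgmpy.estimators import HillClimbSearch, BicScore

rng = np.random.default_rng(0)
n = 2000
temp = rng.integers(0, 2, n)                      # hypothetical process setting
wear = rng.integers(0, 2, n)                      # hypothetical tool state
defect = ((temp & wear) | (rng.random(n) < 0.05)).astype(int)
data = pd.DataFrame({"Temp": temp, "Wear": wear, "Defect": defect})

search = HillClimbSearch(data)
dag = search.estimate(scoring_method=BicScore(data))
print(sorted(dag.edges()))   # expect edges pointing into "Defect"
```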
Conference Paper
Full-text available
Artificial intelligence applications are increasing due to advances in data collection systems, algorithms, and the affordability of computing power. Within the manufacturing industry, machine learning algorithms are often used for improving manufacturing system fault diagnosis. This study focuses on a review of recent fault diagnosis applications in manufacturing that are based on several prominent machine learning algorithms. Papers published from 2007 to 2017 were reviewed, and keywords were used to identify 20 articles spanning the most prominent machine learning algorithms. Most of the articles reviewed used training data obtained from sensors attached to the equipment. Training of the machine learning algorithms relied on designed experiments that simulated different faulty and normal processing conditions. The areas of application ranged from cutting-tool wear in computer numerical control (CNC) machines and surface roughness faults to the wafer etching process in semiconductor manufacturing. In all cases, high fault classification rates were obtained. As interest in smart manufacturing increases, this review serves to address one of the cornerstones of emerging production systems.
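A minimal sketch of the supervised setup this review describes, with synthetic "sensor" features standing in for designed-experiment data and scikit-learn's RandomForestClassifier chosen only as a representative algorithm:

```python
# Sketch of a typical supervised fault-diagnosis pipeline: sensor features
# from simulated normal/faulty runs, a standard classifier, and a held-out
# classification rate. All data here is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)
n = 1000
vibration = rng.normal(0.0, 1.0, n)       # invented sensor feature
spindle_load = rng.normal(0.0, 1.0, n)    # invented sensor feature
# Fault label loosely driven by the synthetic sensor features
fault = (0.8 * vibration + 0.6 * spindle_load + rng.normal(0, 0.5, n) > 1.0).astype(int)

X = np.column_stack([vibration, spindle_load])
X_tr, X_te, y_tr, y_te = train_test_split(X, fault, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```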
... Chuanzhe and Fengping (2006) give a review of risk analysis tools in which Bayesian Networks are given special importance because of their handling of uncertainty. Future scenarios are built using Bayesian Networks in customer analytics (Dienst et al., 2010) or in process analytics (Pradhan, Singh, & Kachru, 2007). ...
... Simulation and Influence Diagrams (BNs coupled with decision making based on expected utility) were used to model the uncertainties on the shop floor when considering job rescheduling [8]. Ontology and expert reports were used to define the appropriate knowledge domains for constructing a BN to identify the root cause of failure of dies used to produce auto parts [9]. The Bayesian Network was also used to infer relations between different features that can co-exist in a certain product [10]. ...
Article
Full-text available
Complexity in manufacturing arises due to the intertwined relationships between products and their manufacturing systems. If the system/product relations can be retrieved automatically and efficiently, complex systems could be better designed and utilized to manufacture more products effectively. In this paper, a comprehensive method is used to explore the inter-relationships in the products domain and the machines domain, and to map the relations between products and systems. The method uses structure learning with Bayesian Networks to capture and analyze these relations, hence facilitating the synthesis of new systems and products. A case study of parts and the respective machines for producing them is used to demonstrate the method. Results show that the dependency relations among product and system features can be extracted by analyzing existing instances of related entities such as machines/products and specifications without explicitly identifying the relations between them, which is akin to reverse engineering. Using Bayesian Networks and the most probable explanation algorithm, a new composite part is obtained for a 3-axis machine with a tool magazine. This new information is based on the inferred Bayesian Network that links products and machines. Thus, future manufacturing systems and parts/machines co-design can be done robustly without much human intervention.
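The abstract does not specify the inference call used to obtain the composite part; as a hedged illustration only, a MAP-style query over a toy (invented) product-feature/machine-capability network in pgmpy would look roughly like this:

```python
# Hedged sketch (not the paper's model): given a product/machine network,
# a MAP query returns the most probable machine configuration for a
# requested part feature. Variable names and numbers are invented.
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

model = BayesianNetwork([("ToolMagazine", "ComplexPocket"), ("Axes3", "ComplexPocket")])
model.add_cpds(
    TabularCPD("ToolMagazine", 2, [[0.5], [0.5]]),
    TabularCPD("Axes3", 2, [[0.4], [0.6]]),
    TabularCPD("ComplexPocket", 2,
               [[0.95, 0.6, 0.7, 0.1],    # feature absent
                [0.05, 0.4, 0.3, 0.9]],   # feature present
               evidence=["ToolMagazine", "Axes3"], evidence_card=[2, 2]))

infer = VariableElimination(model)
# Most probable machine configuration for a part that needs the feature
print(infer.map_query(["ToolMagazine", "Axes3"], evidence={"ComplexPocket": 1}))
```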
Article
Enterprise users of IT services seek real-time contextual insights during system-failure scenarios in both cloud-provisioned and legacy management systems. Current IT management systems mostly provide front-office automation support, such as ticket categorization and scheduling, using a generalized set of troubleshooting instructions. Therefore, in this paper, we propose an IT management system that provides real-time insights on user-perceived failures (e.g., “Why is the application not responding?”) expressed in natural language text. We achieve this through an underpinning knowledge graph that helps discover possible topology patterns comprising multiple interdependent systems for a specific purpose. Based on the detected list of topology patterns, the proposed system composes multiple debugging workflows to generate accurate operational insights. The user interactions are “system agnostic” in nature and do not depend on knowledge of the underlying system topology. This significantly augments the self-assist scenarios of end-users and front-office agents before they engage with IT support teams. We demonstrate our proposed approach, as a cloud application with a natural language interface, using an experimental setup involving a standard ticket management system.
Article
Mechanical product quality depends on many quality characteristics (QCs). There are many coupling relations between these QCs, and these relations need to be analyzed. A principle–empirical (P–E) model for quality improvement is proposed in this study. The architecture of the model is first introduced. A method for learning the P–E model structure is provided, in which the QC relations are determined from empirical data. The discovered QC relations are validated against principle knowledge, and the P–E model structure is built from the validated QC relations. Maximum likelihood estimation (MLE) is used for parameter learning. Finally, a case study is given to demonstrate the P–E model. The results show that the structure learned with the P–E model is superior to that of the K2 algorithm when the data size is small. The difference between the P–E model and the K2 algorithm is not significant when the data size is large.
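As a sketch of the parameter-learning step only (not the P–E structure-learning method itself), maximum likelihood estimation of CPTs for a fixed, invented QC relation can be done with pgmpy as follows:

```python
# Illustrative only: once a QC relation structure is fixed (here a single
# invented edge), conditional probability tables are estimated from
# empirical data by maximum likelihood.
import pandas as pd
from pgmpy.models import BayesianNetwork
from pgmpy.estimators import MaximumLikelihoodEstimator

data = pd.DataFrame({
    "Roundness": [0, 0, 1, 1, 0, 1, 0, 1, 1, 0],   # toy QC observations
    "Runout":    [0, 0, 1, 0, 0, 1, 1, 1, 1, 0],
})
model = BayesianNetwork([("Roundness", "Runout")])
model.fit(data, estimator=MaximumLikelihoodEstimator)
print(model.get_cpds("Runout"))
```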
Conference Paper
Full-text available
Risk Assessment is a mandatory step in the product development phase to ensure Quality and Reliability. For structural assessment, Finite Element Analysis (FEA) is commonly used to determine which areas should be assessed and to judge whether a certain part can withstand the mechanical stress intensity. FEA results predict mechanically weak areas and/or reliable lifetime; however, they do not tell where Quality and Reliability risks are hidden. This study aims to identify the potential reliability risk of a copper pillar structure under temperature cyclic loads by using Bayesian Networks (BN). In conclusion, the BN model indicated the parts with the highest probability of failure, and this was well correlated with the results of a temperature cyclic load test.
Article
Full-text available
Today, the root causes of failures and quality deviations in manufacturing are usually identified using existing on-site expert knowledge about causal relationships between process steps and the nature of failures and deviations. Automating the identification and back-tracking of root causes for such failures and deviations would benefit companies in two ways: knowledge can be transferred between factories, and knowledge is preserved for future use. We propose a machine learning framework using Bayesian networks to model the causal relationships between manufacturing stages based on expert knowledge, and demonstrate the usefulness of the framework on two simulated manufacturing processes.
Article
Full-text available
Due to the megatrend of globalization, special machinery is gaining significance in the capital goods sector. Characterized by the fulfillment of individual customer requirements, companies in special machinery have to deal with very specific and technologically complex tasks. Hence, managing information and knowledge becomes vital for a company's competitive ability, notably when it comes to expert knowledge. The characteristics of special machines lead to iterative problem-solving processes and thereby increase lead times significantly. The more technologically complex a machine is, the more scattered the expert knowledge, meaning that many different experts need to be consulted before a problem can be solved. Up to now, there has been little discussion in the scientific literature about the challenges of special machinery and practical solutions for implementing technical intelligence in a special machinery environment. Therefore, the goal of this paper is to give an example of how an expert system can be applied in special machinery surroundings and thus increase productivity. A Bayesian network forms the basis of the system, as it allows efficient inference algorithms and reasoning under uncertainty while still being able to describe complex dependencies. The expert system's capability has been proven in industrial laser manufacturing.
Article
Studies reported in the supply chain literature have focused on demonstrating the benefits of information sharing, knowledge transfer, and collaborative efforts. The present study carries these fundamentals forward into the quality domain. A methodology is developed for sharing quality defect information among supply chain partners, and analysis of the shared data is structured using correspondence analysis of two-way contingency tables. The model presents a unique opportunity for identifying the underlying phenomena causing defects by exploiting the associations and differences among supply chain partners based on defect occurrences and then analyzing the process similarities/differences among them. This saves time and effort spent on finding root causes and designing solutions, as one has the opportunity to adopt a proven process from a better supplier. This distinguishes the proposed methodology from other conventional quality tools and root cause analysis techniques. The proposed methodology is empirically validated through a case study on a multinational medical equipment manufacturer's supply chain. The application helped identify technical and nontechnical root causes driving defects. The methodology developed in this study is generic and can be applied to analyze quality defects in any supply chain.
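The study's analysis tool is classical correspondence analysis of a two-way contingency table. A self-contained numpy sketch on an invented supplier-by-defect-type table (not the case-study data) is shown below.

```python
# Hedged sketch: classical correspondence analysis via SVD of the
# standardized residuals of a small, invented supplier-by-defect table.
# Proximity of row/column coordinates suggests supplier/defect associations.
import numpy as np

N = np.array([[30,  5, 10],    # supplier A: counts per defect type
              [ 8, 25,  6],    # supplier B
              [12,  9, 28]])   # supplier C
P = N / N.sum()
r, c = P.sum(axis=1), P.sum(axis=0)
S = np.diag(1 / np.sqrt(r)) @ (P - np.outer(r, c)) @ np.diag(1 / np.sqrt(c))
U, s, Vt = np.linalg.svd(S, full_matrices=False)

row_coords = np.diag(1 / np.sqrt(r)) @ U * s      # supplier coordinates
col_coords = np.diag(1 / np.sqrt(c)) @ Vt.T * s   # defect-type coordinates
print(np.round(row_coords[:, :2], 3))
print(np.round(col_coords[:, :2], 3))
```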
Article
Alarm floods are one of the main problems in the alarm systems of industrial processes. Alarm root-cause analysis and alarm prioritization help to reduce alarm floods. This paper proposes a systematic rationalization method for multivariate correlated alarms that realizes root cause analysis and alarm prioritization. An information-fusion-based interpretive structural model is constructed according to data-driven partial correlation coefficient calculation and process knowledge modification. This hierarchical multi-layer model is helpful for abnormality propagation path identification and root-cause analysis. A revised Likert scale method is adopted to determine alarm priority and reduce the blindness of alarm handling. As a case study, the Tennessee Eastman process is used to show the effectiveness and validity of the proposed approach. A comparison of alarm system performance shows that the rationalization methodology can reduce alarm floods to some extent and improve performance.
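The data-driven ingredient of this method is the partial correlation coefficient. A minimal numpy sketch, using a synthetic three-variable chain rather than real alarm data, computes it from the inverse correlation matrix:

```python
# Hedged sketch: partial correlations between process variables obtained
# from the inverse of the correlation matrix. Data are synthetic.
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=(500, 1))
y = 0.9 * x + 0.1 * rng.normal(size=(500, 1))
z = 0.9 * y + 0.1 * rng.normal(size=(500, 1))   # chain x -> y -> z
data = np.hstack([x, y, z])

corr = np.corrcoef(data, rowvar=False)
prec = np.linalg.inv(corr)
d = np.sqrt(np.diag(prec))
partial = -prec / np.outer(d, d)
np.fill_diagonal(partial, 1.0)
# x and z are strongly correlated, but nearly conditionally independent
# given y, so their partial correlation is close to zero.
print(np.round(corr, 2))
print(np.round(partial, 2))
```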
Conference Paper
Industrial alarm systems inform the operator of abnormal plant behavior and are required to guarantee the safety, quality, and productivity of the plant. However, modern alarm systems often produce large numbers of false or nuisance alarms, which leads to alarm floods. To reduce these alarm floods, we developed an alarm system that performs Root Cause Analysis (RCA) on an alarm model constructed with causal or Bayesian networks. In this paper, we present methods to construct such networks for RCA with a knowledge-based and a machine learning approach. Finally, a small case study is given to illustrate its applicability in practice, and we propose an architecture that combines both approaches.
Article
Purpose – The purpose of this paper is to raise awareness among manufacturing researchers and practitioners of the potential of Bayesian networks (BNs) to enhance decision making in those parts of the manufacturing domain where uncertainty is a key characteristic. In doing so, the paper describes the development of an intelligent decision support system (DSS) to help operators in Motorola to diagnose and correct faults during the process of product system testing. Design/methodology/approach – The intelligent DSS combines BNs and an intelligent user interface to produce multi-media advice for operators. Findings – Surveys show that the system is effective in considerably reducing fault correction times for most operators and most fault types, and in helping inexperienced operators to approach the performance levels of experienced operators. Originality/value – Such efficiency improvements are of obvious value in manufacturing. In this particular case, additional benefit was derived when the product testing facility was moved from the UK to China, as the system was able to help the new operators get close to the historical performance level of experienced operators.
Article
Full-text available
A data-agnostic approach to automated problem (fault or change) root cause determination in complex IT systems, based on the correlation of monitoring and anomaly event data, is developed. It also includes a framework for identifying bottlenecks and black-swan-type issues in IT infrastructures. The relevant anomaly event correlation engine (AECE) is designed, prototyped, and successfully applied to data from real environments. The approach is based on a historical analysis of abnormality events (obtained by applying IT management analytics) together with their probabilistic correlations. The general theory, which applies information measures between the random variables embodying those events, is described, and a particular scenario is discussed. The method is capable of detecting the origins of problems and generating real-time recommendations for their locations in a hierarchical system. Moreover, it allows localizing the bottleneck IT resources that are persistent causes of historical problems. Black-swan-type events are shown to be retrospectively identifiable and usable for on-line risk estimation. The technology and algorithms are based on statistical processing of virtual directed graphs produced from historical anomaly events.
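One common "information measure between the random variables embodying those events" is mutual information; the sketch below estimates it for two binary anomaly indicators from invented co-occurrence counts (it is not the AECE implementation):

```python
# Hedged sketch: mutual information of two binary anomaly indicators
# estimated from invented co-occurrence counts.
import numpy as np

# joint counts: rows = event A absent/present, cols = event B absent/present
counts = np.array([[700, 40],
                   [ 60, 200]], dtype=float)
p = counts / counts.sum()
pa, pb = p.sum(axis=1, keepdims=True), p.sum(axis=0, keepdims=True)

mask = p > 0
mi = np.sum(p[mask] * np.log2(p[mask] / (pa @ pb)[mask]))
print(f"mutual information: {mi:.3f} bits")
```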
Article
This thesis describes a Bayesian Network (BN) model for recognizing the “Action Units (AUs)” of a facial expression using video sequence images as input. Features were extracted using an optimal-estimation optical flow method coupled with a physical (muscle) model describing the facial structure. The muscle action patterns are used for analysis, recognition, and synthesis of facial expressions. The main approaches to facial expression recognition of dynamic images are designed around three parts: 1) region of interest selection, 2) feature extraction, and 3) image classification. Bayesian Networks are a powerful and flexible methodology for representing and computing with probabilistic models of a stochastic process. In the past decade, there has been increasing interest in applying them to practical problems, and this thesis shows that they can be used effectively in the field of automatic AU recognition. In the past decade, optical flow has been used either to model muscle activities or to estimate the displacements of feature points, but in this thesis we defined nine regions of interest (ROI) containing the most complex motion by using a maximum entropy algorithm. Furthermore, the results were statistically analyzed with compass diagrams to find the major ranges of directions and velocities of the flow vectors in each ROI. We found that the ROI differ for the six basic emotions, so we did not consider all nine regions for every emotion because of the complexity of our model. Furthermore, we present a methodology for obtaining the BN structure, learning the parameters, and performing inference, including issues such as the discretization of continuous variables. Finally, we apply the BN model to recognize single Action Units (AUs) and some important AU combinations. The average classification rate for the single AUs is between 80% and 90%, and for the AU combinations it is above 90%.
Chapter
Full-text available
It is always essential but difficult to capture incomplete, partial or uncertain knowledge when using ontologies to conceptualize an application domain or to achieve semantic interoperability among heterogeneous systems. This chapter presents on-going research on developing a framework that augments and supplements the semantic web ontology language OWL for representing and reasoning with uncertainty based on Bayesian networks (BN) [26], and its application to ontology mapping. This framework, named BayesOWL, has gone through several iterations since its conception in 2003 [8, 9]. BayesOWL provides a set of rules and procedures for direct translation of an OWL ontology into a BN directed acyclic graph (DAG); it also provides a method based on the iterative proportional fitting procedure (IPFP) [19, 7, 6, 34, 2, 4] that incorporates available probability constraints when constructing the conditional probability tables (CPTs) of the BN. The translated BN, which preserves the semantics of the original ontology and is consistent with all the given probability constraints, can support ontology reasoning, both within and across ontologies, as Bayesian inference. At present, BayesOWL is restricted to translating only OWL-DL concept taxonomies into BNs; we are actively working on extending the framework to OWL ontologies with property restrictions.
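The core operation BayesOWL builds on is iterative proportional fitting. A bare-bones sketch of plain IPFP on a two-variable joint table, with made-up marginal constraints and none of the BayesOWL-specific CPT machinery, is:

```python
# Hedged sketch of plain IPFP: the joint table is rescaled in turn so each
# marginal matches a given probability constraint. This is not the full
# BayesOWL CPT-construction procedure.
import numpy as np

joint = np.full((2, 2), 0.25)            # uninformed starting joint P(A, B)
target_a = np.array([0.7, 0.3])          # constraint on P(A)
target_b = np.array([0.6, 0.4])          # constraint on P(B)

for _ in range(50):
    joint *= (target_a / joint.sum(axis=1))[:, None]   # fit P(A)
    joint *= (target_b / joint.sum(axis=0))[None, :]   # fit P(B)

print(np.round(joint, 4))
print(joint.sum(axis=1), joint.sum(axis=0))            # both match the constraints
```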
Conference Paper
Full-text available
To support building and maintaining knowledge-based systems for real-life application domains, sophisticated knowledge-engineering methodologies are available. As more and more Bayesian networks are being developed for complex applications, their construction and maintenance calls for the use of tailor-made knowledge-engineering methodologies. We have designed such a methodology and have studied its use within the domain of oesophageal cancer. Based upon expert knowledge and a previously constructed Bayesian network, we have built an ontology for this domain, from which we have constructed, in a sequence of steps, a new network. The use of our methodology has allowed us to address, in a structured fashion, the various intricate modelling issues involved.
Article
Full-text available
Most of the calculator programs found in existing pen-based mobile computing devices, such as personal digital assistants (PDAs) and other handheld devices, do not take full advantage of the pen technology offered by these devices. Instead, input of expressions is still done through a virtual keypad shown on the screen, and the stylus (i.e., electronic pen) is simply used as a pointing device. In this paper, we propose an intelligent handwriting-based calculator program with which the user can enter expressions simply by writing them on the screen using a stylus. In addition, variables can be defined to store intermediate results for subsequent calculations, as in ordinary algebraic calculations. The proposed software is the result of a novel application of on-line mathematical expression recognition technology, which has mostly been used by others only for mathematical expression editor programs.
Article
Recognizing mathematical expressions from document images is a key problem in the conversion of scientific documents into electronic form. It is also a difficult part of the development of recognition technology. This paper presents an efficient and robust method for parsing typeset mathematical notation using baselines and operator ranges. A set of predefined rules based on the syntax of the expression is used to perform error correction so that a logical arrangement of the mathematical expression can be obtained. Experiments have been carried out for many types of expressions found in printed documents, and our method has shown favorable results.
Article
A structure specification scheme is described which can be used to specify the structures of certain two-dimensional patterns. Algorithms are developed to test whether a pattern has a (strongly) well-formed structure with respect to a given structure specification scheme. This method is applicable to the analysis of two-dimensional mathematical expressions and the format of printed material. The usefulness of this method is limited to the analysis of patterns whose structures are based upon a number of operators. When it is applicable, a strongly well-formed structure can be constructed in time n², where n is the number of primitive components of the pattern.
Article
This paper presents our ongoing effort to develop a principled methodology for automatic ontology mapping based on BayesOWL, a probabilistic framework we developed for modelling uncertainty in the semantic web. The proposed method includes four components: 1) learning probabilities (priors about concepts, conditionals between subconcepts and superconcepts, and raw semantic similarities between concepts in two different ontologies) using a Naïve Bayes text classification technique, by explicitly associating a concept with a group of sample documents retrieved and selected automatically from the World Wide Web (WWW); 2) representing in OWL the learned probability information concerning the entities and relations in the given ontologies; 3) using the BayesOWL framework to automatically translate the given ontologies into Bayesian network (BN) structures and to construct the conditional probability tables (CPTs) of a BN from those learned priors or conditionals, with reasoning services within a single ontology supported by Bayesian inference; and 4) taking a set of learned initial raw similarities as input and finding new mappings between concepts from two different ontologies as an application of our formalized BN mapping theory, which is based on evidential reasoning across two BNs.
Article
We consider the problem of learning the parameters of a Bayesian network from data, while taking into account prior knowledge about the signs of influences between variables. Such prior knowledge can be readily obtained from domain experts. We show that this problem of parameter learning is a special case of isotonic regression and provide a simple algorithm for computing isotonic estimates. Our experimental results for a small Bayesian network in the medical domain show that taking prior knowledge about the signs of influences into account leads to an improved fit of the true distribution, especially when only a small sample of data is available. More importantly, however, the isotonic estimator provides parameter estimates that are consistent with the specified prior knowledge, thereby resulting in a network that is more likely to be accepted by experts in its domain of application.
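A hedged sketch of the same idea, using scikit-learn's IsotonicRegression in place of the paper's own estimator: raw conditional-probability estimates are replaced by a monotone fit when an expert asserts a positive sign of influence (all numbers invented):

```python
# Hedged sketch: if an expert states that P(defect | wear level) is
# non-decreasing in wear level, noisy frequency estimates can be replaced
# by their isotonic regression. Toy numbers, not the paper's algorithm.
import numpy as np
from sklearn.isotonic import IsotonicRegression

wear_level = np.array([0, 1, 2, 3, 4])
raw_p = np.array([0.05, 0.12, 0.10, 0.30, 0.28])     # noisy frequency estimates
counts = np.array([200, 150, 90, 40, 20])            # observations per level

iso = IsotonicRegression(increasing=True)
smoothed_p = iso.fit_transform(wear_level, raw_p, sample_weight=counts)
print(np.round(smoothed_p, 3))   # monotone non-decreasing in wear level
```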
Article
Machine recognition of mathematical expressions is not trivial even when all the individual characters and symbols in an expression can be recognized correctly. In this paper, we propose to use definite clause grammar (DCG) as a formalism to define a set of replacement rules for parsing mathematical expressions. With DCG, we are not only able to define the replacement rules concisely, but their definitions are also in a readily executable form. However, a DCG parser is potentially inefficient due to its frequent use of backtracking. Thus, we propose some methods here to increase the efficiency of the parsing process. Experiments done on some commonly seen mathematical expressions show that our proposed methods can achieve quite satisfactory speedup, making mathematical expression recognition more feasible for real-world applications.
Conference Paper
We present a structural analysis method for the recognition of on-line handwritten mathematical expressions based on a minimum spanning tree construction and symbol dominance. The method handles some layout irregularities frequently found in on-line handwritten formula recognition systems, like symbol overlapping and association of arguments of sum-like operators. It also handles arguments of operators with non-standard layouts, as well as tabular arrangements, like matrices.
Conference Paper
Generalized two-dimensional context-free grammars, an extension of context-free grammars to two dimensions, are described. This extension is a generalization of Tomita's two-dimensional context-free grammars (M. Tomita, 1989) and better fits into the families of graph grammars described by Crimi (1990) (Relation Grammars) and by Flasinski (1988) (edNLC Grammars). Figure grammars are particularly useful for applications such as handwritten mathematical expressions. A two-dimensional extension of the Cocke-Kasami-Younger parser for context-free languages is used to parse figures using these grammars.
Article
We describe a robust and efficient system for recognizing typeset and handwritten mathematical notation. From a list of symbols with bounding boxes the system analyzes an expression in three successive passes. The Layout Pass constructs a Baseline Structure Tree (BST) describing the two-dimensional arrangement of input symbols. Reading order and operator dominance are used to allow efficient recognition of symbol layout even when symbols deviate greatly from their ideal positions. Next, the Lexical Pass produces a Lexed BST from the initial BST by grouping tokens comprised of multiple input symbols; these include decimal numbers, function names, and symbols comprised of nonoverlapping primitives such as "=". The Lexical Pass also labels vertical structures such as fractions and accents. The Lexed BST is translated into LaTeX. Additional processing, necessary for producing output for symbolic algebra systems, is carried out in the Expression Analysis Pass. The Lexed BST is translated into an Operator Tree, which describes the order and scope of operations in the input expression. The tree manipulations used in each pass are represented compactly using tree transformations. The compiler-like architecture of the system allows robust handling of unexpected input, increases the scalability of the system, and provides the groundwork for handling dialects of mathematical notation.
Text Analytics and Ontology: The Next Frontiers of Manufacturing Innovation
  • Sudripto De
  • Ashish Sureka
  • Srinivas Narasimhamurthy
Sudripto De, Ashish Sureka and Srinivas Narasimhamurthy, "Text Analytics and Ontology: The Next Frontiers of Manufacturing Innovation", Proceedings of the International Conference on Manufacturing Research, pp. 265–268, Sept. 2007.
Bayesian Network Toolbox
  • Kevin Murphy
Kevin Murphy, "Bayesian Network Toolbox", www.cs.ubc.ca/~murphyk/Software/BNT/bnt.html.
Determine Superscript/Subscript Relations in Typeset Mathematical Expressions Based on Statistic Features
  • Jianming Jin
Two-dimensional Mathematical Notations In Syntactic Pattern Recognition Applications
  • R H Anderson