Article

POD Evaluation: The Key Performance Indicator for NDE 4.0

Authors:
  • AV-NDT, Berlin, Germany

Abstract

Reliability evaluations of modern test systems built on Industry 4.0 technologies play a vital role in the successful transformation to NDE 4.0, because NDE 4.0 rests on the interconnection of cyber-physical systems. When the individual reliability of the key Industry 4.0 technologies, such as the digital twin, digital thread, Industrial Internet of Things (IIoT), artificial intelligence (AI), data fusion, and digitization, is high, a reliability beyond the intrinsic capability of the test system becomes attainable. In this paper, the significance of reliability evaluation is reviewed under the vision of NDE 4.0, including examples of data fusion concepts and the role of algorithms (such as explainable artificial intelligence), and the practical use is discussed and elaborated.


... Analysts and managers do not have end-to-end access to all functions or subprocesses to check and advise. AI will automate many of these jobs or procedures by clearly communicating what needs to be rectified and how soon [3][4][5]. Managers or analysts who see AI as a collaborator will recognize that there is no need to "race against a computer." While human judgment is unlikely to be automated, intelligent machines may contribute significantly to this job by aiding with decision support and data-driven simulations, among other things that will impact the KPIs. ...
... Today, however, machine learning algorithms can be used to interpret trends in the data more accurately and then turn those trends or patterns into business uses. Supervised learning, the most widely used form of machine learning, helps predict and classify data [3,14,16]. The insurance and financial industries use supervised algorithms such as classification, where the data is divided into individual attributes and features that correspond to one of the target or output variables [17]. ...
Article
Full-text available
AI and machine learning are playing a vital role in the financial domain in predicting future growth and risk and identifying key performance areas. We look at how machine learning and artificial intelligence (AI) directly or indirectly alter financial management in the banking and insurance industries. First, a non-technical review of prior machine learning and AI methodologies beneficial to KPI management is provided. This paper then analyzes and improves key financial performance indicators in insurance using machine learning (ML) algorithms. Before applying an ML algorithm, we must determine the attributes directly impacting the business and the target attributes. The details must be manually mapped from string values to fit the model and the datatypes it requires. We propose hashing to convert string values to numeric values for data analysis within our model. After the string values are hashed, we can introduce our model; in our case, we have chosen a decision tree. Decision trees are beneficial for this use case because the algorithm generates rulesets that govern the target value output. These rulesets can then be applied to the financial dataset to infer the "best fit" value where an entry might be wrong or missing. Finally, using the model, we can apply this most accurate version of the data to detect patterns in general ledger transactional data.
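As a minimal sketch of the pipeline this abstract describes, the snippet below hashes string attributes to stable numeric codes, fits a decision tree, and uses its rules to infer a missing target value; all column names and records are invented for illustration, not taken from the paper.

```python
# Hash string attributes -> fit a decision tree -> infer a missing value.
# All columns and rows are hypothetical examples.
import hashlib

import pandas as pd
from sklearn.tree import DecisionTreeClassifier

def stable_hash(value: str, n_buckets: int = 2**16) -> int:
    """Deterministically map a string to a numeric bucket (unlike built-in
    hash(), md5 is stable across Python processes)."""
    return int(hashlib.md5(value.encode()).hexdigest(), 16) % n_buckets

df = pd.DataFrame({
    "policy_type": ["auto", "home", "auto", "life", "home", "auto"],
    "region":      ["north", "south", "south", "north", "east", "east"],
    "claim_band":  ["low", "high", "low", "low", "high", "high"],  # target
})

X = df[["policy_type", "region"]].apply(lambda col: col.map(stable_hash))
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, df["claim_band"])

# Infer the "best fit" target for a record whose value is missing.
new_record = pd.DataFrame({"policy_type": ["home"], "region": ["south"]})
print(tree.predict(new_record.apply(lambda col: col.map(stable_hash))))
```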
... [1954,1956] Even though sporadic POD activities were initiated at DMRL and IIT Madras, the total amount of POD activity carried out in India was and still is in its infancy. This can be clearly observed from the total number of global publications on POD, as shown in Figure 2 [18]. The total number of publications amounts to approximately 800, of which publications from India account for approximately 20 (from the POD activities carried out at both DMRL and IIT Madras). ...
... Fig. 2: POD publications in the field of NDE [18] ...
Article
Full-text available
Usage of non-destructive testing or evaluation (NDT/E) techniques is widely accepted across various industries all over the world in order to maintain certain safety and quality standards. However, not many countries actually perform reliability evaluations of their NDT techniques. In this context, this article discusses the importance of reliability evaluations and the state of reliability programs performed in India versus the reliability activities performed in western countries. In addition, brief results from one of the POD programs carried out at DMRL are presented to illustrate the challenges involved in pursuing reliability programs in India. Moreover, remarks are made on the possible direction of POD, especially in the context of transforming the industry towards NDE 4.0.
Article
Full-text available
Cognitive sensor systems (CSS) determine the future of inspection and monitoring systems for the nondestructive evaluation (NDE) of material states and their properties, and they are a key enabler of NDE 4.0 activities. CSS generate a complete NDE 4.0 data and information ecosystem, i.e., they are part of the materials data space and are integrated into the concepts of Industry 4.0 (I4.0). Thus, they are elements of the Industrial Internet of Things (IIoT) and of the required interfaces. Applied artificial intelligence (AI) is a key element for the development of cognitive NDE 4.0 sensor systems. On the one hand, AI can be embedded in the sensor's microelectronics (e.g., neuromorphic hardware architectures); on the other hand, applied AI is essential for software modules that produce end-user information by fusing multi-mode sensor data and measurements. Besides applied AI, trusted AI also plays an important role in CSS, as it is able to provide reliable and trustworthy data evaluation decisions for the end user. To meet this rapidly growing demand for performant and reliable CSS, specific requirements have to be fulfilled for the validation and qualification of their correct function. The concept for quality assurance of NDE 4.0 sensor and inspection systems has to cover all of the functional sub-systems, i.e., data acquisition, data processing, data evaluation, data transfer, etc. Approaches to these objectives are presented in this paper after an overview of the most important elements of CSS for NDE 4.0 applications. Reliable and safe microelectronics is a further issue in the qualification process for CSS.
Article
Full-text available
Across many industries, non-destructive evaluation has proven its worth time and again through quality and safety assurance of valuable assets. Yet, over time, it became underappreciated in business decisions. In most cases, the data gathered by NDT is used for quality assurance assessments resulting in binary decisions, and we miss out on the value of the information content of NDE, which goes far deeper and can help other stakeholders such as engineering, management, inspectors, service providers, and even regulators. Some of those groups might not even be aware of the benefits of NDE data and its digitalization. Unfortunately, the NDE industry typically makes data access unnecessarily difficult through proprietary interfaces and data formats. Both of these challenges need to be addressed now by the NDE industry. The confluence of NDE and Industry 4.0, dubbed NDE 4.0, provides a unique opportunity for the NDE/NDT industry not only to readjust the value perception but to gain new customer groups through a broad set of value creation activities across the ecosystem. The integration of NDE into the cyber-physical loop (including IIoT and the digital twin) is the chance for the NDE industry to shift its perception from a cost center to a value center. This paper provides an overview of the NDE ecosystem, key value streams, cyber-physical loops that create value, and a number of use cases for various stakeholders in the ecosystem.
Article
Full-text available
Establishing the probability of detection (POD) or reliability of various nondestructive testing (NDT) techniques is essential for implementing damage tolerant (DT) methodology for aero-engines. This POD is usually established with the help of a large number of service-expired aero-engine components containing several fatigue cracks. In the absence of such components, artificial defects such as electrical discharge machining (EDM) notches or starter cracks have been explored. However, such artificial defects do not reproduce key features such as the tightness of fatigue cracks and the possible oxidation in the crack opening, which limits their usage. Therefore, in the current study, an innovative approach of generating fatigue cracks at 650 °C (a typical aero-engine service temperature) with the key high-temperature service degradation aspects of oxidation and fatigue cracking is demonstrated for the first time using a Gleeble® test system. Further, POD is estimated by inspecting these laboratory-generated fatigue cracks using the fluorescent liquid penetrant technique (FLPT) and eddy current technique (ECT) under both HIT (defect detected) vs. MISS (defect not detected) and â (signal response) vs. a (crack size) methodologies. The current study also discusses a statistical approach of randomly generating crack sizes for use in NDT reliability analysis. In addition, an attempt has been made to understand the effect of a90/95 values on remnant life calculations. It is concluded that the eddy current response of oxidized fatigue cracks results in better (more sensitive) a90/95 values compared to the eddy current response obtained from non-oxidized fatigue cracks.
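The random generation of crack sizes mentioned above can be sketched as a draw from a lognormal distribution, which fatigue crack populations are commonly assumed to follow; the median and scatter below are arbitrary illustrative values, not the study's data.

```python
# Draw a synthetic population of crack sizes for reliability studies.
# The lognormal parameters are assumed for illustration only.
import numpy as np

rng = np.random.default_rng(seed=1)
median_mm, sigma_log = 0.8, 0.5
crack_sizes = rng.lognormal(mean=np.log(median_mm), sigma=sigma_log, size=60)
print(f"min {crack_sizes.min():.2f} mm, "
      f"median {np.median(crack_sizes):.2f} mm, "
      f"max {crack_sizes.max():.2f} mm")
```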
Article
Full-text available
Cyber technologies are offering new horizons for quality control in manufacturing and safety assurance in-service of physical assets. The line between non-destructive evaluation (NDE) and Industry 4.0 is getting blurred since both are sensory data-driven domains. This multidisciplinary approach has led to the emergence of a new capability: NDE 4.0. The NDT community is coming together once again to define the purpose, chart the process, and address the adoption of emerging technologies. In this paper, the authors have taken a design thinking approach to spotlight proper objectives for research on this subject. It begins with qualitative research on twenty different perceptions of stakeholders and misconceptions around the current state of NDE. The interpretation is used to define ten value propositions or use cases under ‘NDE for Industry 4.0’ and ‘Industry 4.0 for NDE’ leading up to the clarity of purpose for NDE 4.0—enhanced safety and economic value for stakeholders. To pursue this worthy cause, the paper delves into some of the top adoption challenges, and proposes a journey of managed innovation, conscious skills development, and a new form of leadership required to succeed in the cyber-physical world.
Article
Full-text available
Damage Tolerance (DT) lifing methodology for aero-engines requires reliable Non-Destructive Testing (NDT) techniques. Probability of Detection (POD), the measure of NDT reliability, yields the a90/95 value (flaw detection with 90% probability at 95% confidence). This a90/95, the largest crack size missed by an NDT technique, is in general incorporated into DT calculations for estimating the remaining fatigue cycles a component can withstand before failure. Hence, it is essential to estimate the a90/95 value as accurately as possible. However, NDT inspection data at a site containing multiple cracks results in ambiguity over which HIT/MISS approach to adopt for the estimation of POD or a90/95 values. Several approaches have been attempted by researchers to minimize this ambiguity, but with limited success due to restrictions in implementing them. Moreover, to the best of the authors' knowledge, the physical significance of the a90/95 value obtained from different HIT/MISS approaches on the remnant life calculations of aero-engine components was not available in the literature. Therefore, in the current study, the physical manifestation of a90/95 in remnant life calculations, obtained from the maximum flaw size and sum of flaw sizes approaches for the inspection of natural fatigue cracks in a nickel-based superalloy using fluorescent penetrant (FPI) and eddy current inspection (ECI) techniques, was examined. It was observed that the ECI technique provides a higher number of remnant cycles than the FPI technique due to its higher sensitivity. In addition, it was observed that, regardless of the NDT technique used, the maximum flaw size approach results in a higher number of fatigue cycles. However, the actual number of remnant cycles of the component can only be known exactly if the capability of current NDT techniques to resolve a group of flaws in a particular location is enhanced.
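To make the link between a90/95 and remnant life concrete, the sketch below integrates a generic Paris-law crack growth model from an initial flaw size set to the a90/95 of the inspection; the material constants, stress range, and geometry factor are placeholder assumptions, not values from the paper.

```python
# Generic Paris-law illustration: the largest crack an inspection can miss
# (a90/95) serves as the initial flaw size for remnant-life integration.
# All constants below are assumed placeholders.
import numpy as np
from scipy.integrate import quad

C, m = 1e-11, 3.0        # assumed Paris constants (da/dN in m/cycle, dK in MPa*sqrt(m))
dsigma, Y = 300.0, 1.12  # assumed stress range (MPa) and geometry factor

def growth_rate(a_m: float) -> float:
    """da/dN = C * (Y * dsigma * sqrt(pi * a))^m, with a in metres."""
    dK = Y * dsigma * np.sqrt(np.pi * a_m)
    return C * dK**m

def remnant_cycles(a0_mm: float, ac_mm: float) -> float:
    """Cycles to grow a crack from a0 (= a90/95) to the critical size ac."""
    N, _ = quad(lambda a: 1.0 / growth_rate(a), a0_mm * 1e-3, ac_mm * 1e-3)
    return N

# A more sensitive technique (smaller a90/95) leaves more usable cycles:
print(f"{remnant_cycles(0.5, 10.0):,.0f} cycles")  # e.g. eddy current
print(f"{remnant_cycles(1.5, 10.0):,.0f} cycles")  # e.g. penetrant
```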
Conference Paper
Full-text available
Fatigue cracks originating from in-service aero-engine turbine discs are usually known to follow a log-normal distribution. Non-destructive testing techniques used for detecting fatigue cracks produce a response either as HIT (detected)/MISS (undetected) or as "a" (crack size) vs. "â" (crack response), depending on the type of NDT technique used. Under the fracture-mechanics-based damage tolerance methodology widely used in the aero-engine industry, a thorough understanding of the reliability of NDT techniques is as important as identifying cracks. In general, the reliability of an NDT technique is estimated by plotting Probability of Detection (POD) curves. POD is a function of crack parameters such as size, shape, and orientation, along with the type of material. The POD of any NDT technique can be estimated using the standard test procedures and methods of MIL-HDBK-1823A. However, as the experimental estimation of POD is a laborious and time-consuming process, model assisted POD (MAPOD) approaches are currently in practice. In this study, MAPOD approaches were demonstrated for volumetric cracks using ultrasonic testing. A commercial-grade titanium alloy Ti-6Al-4V cylindrical block (50 mm × 15 mm) with a cylindrical defect (0.5 mm × 5 mm) at the centre was initially inspected with ultrasonic testing in A-scan mode, and the corresponding amplitude vs. time data of the block was analyzed. Ultrasonic wave interaction with a cylindrical defect was simulated as a 2-D axisymmetric model using COMSOL Multiphysics software, which was validated with the help of the experimental data. As fatigue cracks usually follow a log-normal distribution, the distribution of crack sizes for POD curve generation in the MAPOD approach was also assumed to be log-normal. Hence, in this study, both 'a' and 'â' follow a log-normal distribution, while 'log(a)' and 'log(â)' follow a normal distribution. A log-log linear regression with normal distribution was therefore performed, and the mean and standard deviation of the distribution were obtained. These were log-transformed to obtain the scale and location parameters of the log-normal distribution, from which the CDF was plotted, resulting in a POD curve. Finally, the 95% confidence bounds of the POD curve were plotted, and the flaw size detectable with 90% probability at the 95% confidence limit (a90/95) was obtained.
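A minimal numerical sketch of the log-log signal-response procedure just described, with synthetic 'a' vs. 'â' data and an assumed decision threshold standing in for the simulated ultrasonic responses:

```python
# Log-log linear regression of synthetic ahat-vs-a data and the resulting
# POD curve; data, coefficients and threshold are invented for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a = rng.lognormal(mean=np.log(1.0), sigma=0.4, size=80)        # crack sizes (mm)
log_ahat = 0.8 + 1.1 * np.log(a) + rng.normal(0, 0.3, a.size)  # simulated responses

beta1, beta0, *_ = stats.linregress(np.log(a), log_ahat)       # log-log regression
tau = (log_ahat - (beta0 + beta1 * np.log(a))).std(ddof=2)     # residual scatter

log_dec = np.log(2.0)  # assumed decision threshold on ahat

def pod(a_mm: float) -> float:
    """POD(a) = P(log ahat exceeds the log decision threshold)."""
    return stats.norm.cdf((beta0 + beta1 * np.log(a_mm) - log_dec) / tau)

# a90 point estimate; the 95% confidence bound (for a90/95) would come from
# e.g. the delta method or a bootstrap, omitted here for brevity.
a90 = np.exp((log_dec + stats.norm.ppf(0.90) * tau - beta0) / beta1)
print(f"a90 = {a90:.2f} mm, POD(a90) = {pod(a90):.2f}")
```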
Conference Paper
Full-text available
Visual understanding of complex urban street scenes is an enabling factor for a wide range of applications. Object detection has benefited enormously from large-scale datasets, especially in the context of deep learning. For semantic urban scene understanding, however, no current dataset adequately captures the complexity of real-world urban scenes. To address this, we introduce Cityscapes, a benchmark suite and large-scale dataset to train and test approaches for pixel-level and instance-level semantic labeling. Cityscapes is comprised of a large, diverse set of stereo video sequences recorded in streets from 50 different cities. 5000 of these images have high quality pixel-level annotations; 20000 additional images have coarse annotations to enable methods that leverage large volumes of weakly-labeled data. Crucially, our effort exceeds previous attempts in terms of dataset size, annotation richness, scene variability, and complexity. Our accompanying empirical study provides an in-depth analysis of the dataset characteristics, as well as a performance evaluation of several state-of-the-art approaches based on our benchmark.
Article
Full-text available
We use three clustering algorithms to aggregate a three-modal NDT data set into defect and non-defect groups. Our data set consists of impact-echo (IE), ultrasound (US), and ground penetrating radar (GPR) data collected on a large concrete slab with embedded simulated honeycombing defects. US performs best in defect discrimination and sizing; however, the false positive rate is still high. We fuse the data set using K-Means, Fuzzy C-Means, and DBSCAN clustering at the feature level. We find that DBSCAN improves detectability by up to 10%. A discussion of its advantages over the commonly used K-Means and Fuzzy C-Means clustering is provided.
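A sketch of the feature-level fusion idea, assuming invented per-location features for the three modalities and illustrative DBSCAN parameters:

```python
# Feature-level fusion: stack one feature per modality (IE, US, GPR) per
# measurement point, scale, and cluster with DBSCAN. Values are synthetic.
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
sound  = rng.normal([1.0, 0.2, 0.5], 0.05, size=(200, 3))  # [IE, US, GPR]
defect = rng.normal([0.6, 0.9, 0.8], 0.05, size=(20, 3))
X = StandardScaler().fit_transform(np.vstack([sound, defect]))

# Points labelled -1 (noise) or falling in minority clusters are flagged
# as potential defects.
labels = DBSCAN(eps=0.8, min_samples=5).fit_predict(X)
print(np.unique(labels, return_counts=True))
```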
Conference Paper
Full-text available
Traditional empirical studies to estimate probability of detection (or POD) are expensive and time consuming. Over the past thirty years, much progress has been made in the use of physics-based models to predict POD. A deterministic model for flaw response can be combined with a probability distribution for inspection variabilities to provide a model-based POD. Actual inspections, however, involve complicated variabilities from a variety of sources and modeling all of the important ones, and especially human factors variabilities, would be difficult or impossible. Bruce Thompson's knowledge of physics, probability, statistics and industry needs gave him the insights to pioneer and subsequently serve as the leader in the important area that is now called "Model Assisted POD" or MAPOD. The basic idea of MAPOD is to find an appropriate combination of a physics-based model, combined with limited (usually by time and cost constraints) experimental data and statistical modeling to establish POD. This talk will outline Bruce Thompson's important contributions to this area.
Chapter
This chapter focuses on the major aspects of the reliability of nondestructive testing (NDT) techniques. From a safety point of view, the evaluation of NDT techniques is vital for many risk-involved industries, such as the aero-industry, railways, nuclear, and oil and gas. In addition, successful implementation of the damage tolerance concept relies heavily on the reliability of NDT techniques. In other words, given the aims of NDE 4.0, the quantitative evaluation of NDT is becoming vital. The first part of this chapter deals with the importance of NDT reliability with regard to economic, jurisdictional, and safety-critical requirements. After highlighting this importance, the second part of the chapter provides an overview of the current understanding of NDT reliability. The third and last part focuses on reliability evaluation under NDE 4.0, along with a discussion of its necessity and possibilities.
Article
Round robin exercises have traditionally been difficult to arrange in the field of non-destructive testing (NDT). To create a representative round robin exercise, representative mock-ups with representative flaws are needed. The mock-ups are costly, and transporting them around the world to facilitate testing by numerous laboratories is difficult. The few round robins that have been completed have often contributed significantly to our understanding of the capability of the NDT methods and procedures used. Recently, the increased use of automated inspections, together with the development of virtual flaws (independently by Trueflaw and EPRI), has enabled a new type of round robin: instead of moving samples around the world, the round robin focuses on data analysis, and only pre-acquired data files are distributed. This makes conducting a round robin much more cost-effective, both in terms of arrangement and in terms of inspection effort from the participating companies. In addition, the virtual flaw technology allows an unprecedented number and variety of flaws to be included. With a high number of flaws included, the results constitute a statistically meaningful sample and can be further analyzed to estimate the probability of detection (POD) with standard statistical tools. In connection with the international project "PIONIC", such a virtual round robin was arranged for the first time. The exercise showed that virtual flaws and virtual round robins can be used to extract important information about NDT reliability and performance. Some points of development were also identified for further study: the sizing and detection files should be better optimized for their respective uses, and the data could be further obfuscated to prevent inspectors from learning to recognize repeating signal patterns. Twelve inspectors submitted results to the virtual round robin. The results showed a90/95 values ranging from 1.2 to 7.0 mm, a significant variation in performance. The difference was mainly attributed to different inspection strategies. In addition, an unexplained tendency to miss big cracks was noted in some result sets. One of the data files did not contain any flaws; none of the inspectors correctly identified the file as flawless.
Article
One of the many applications of X-ray computed tomography (CT) in industry is the detection of pores, cavities, and other flaws in cast metal parts. Because it improves part safety and saves expenses, CT inspection is moving from random sample inspection towards full in-line inspection. With the increasing amount of produced data, however, comes the need for automated processing. Due to tight time constraints, the resulting CT scans are heavily afflicted by artifacts, which impedes automated inspection. In recent years, deep learning methods, convolutional neural networks in particular, have been used with great success to tackle even complex segmentation tasks in cluttered scenes. As we show, these methods are also applicable to the domain of industrial CT data: they are able to cope with noise, beam hardening, scatter, and the other artifacts encountered here. However, these methods need a vast amount of precisely labeled training data to work properly. Gathering the necessary data is not only cumbersome, due to the need to annotate three-dimensional data, but also expensive, as it requires the knowledge of domain experts. Therefore, we present a new approach: we train our models on realistically simulated CT data only, for which a precise per-voxel ground truth can simply be computed. To show that the simulated data is sufficient to train a segmentation network, we examine its prediction performance on real CT data. We compare the prediction performance of traditional algorithms as well as the trained segmentation network on simulated and real validation data and demonstrate that they behave similarly. The ground truth for the real validation data is hand-labeled using high-quality CT scans, while the actual validation set consists of lower-quality CT scans of the exact same parts. For a comprehensive evaluation, we evaluate the probability of detection as well as the intersection over union. The former tells us how likely a flaw of a given size is to be found with a given confidence, which is of special interest to domain experts; the latter gives per-voxel information on how precise the overall segmentation is. Moreover, our synthetic data enables us to examine the influence of different artifact types on the detection rate. Besides these quantitative analyses, we show some qualitative results from real-world applications. To the best of our knowledge, this is the first approach for defect detection in three-dimensional CT data that is trained solely on simulated data.
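The intersection-over-union metric used for the per-voxel evaluation can be sketched in a few lines; the masks below are random stand-ins for predicted and ground-truth defect voxels.

```python
# Per-voxel intersection over union (IoU) between a predicted and a
# ground-truth defect mask; masks here are random placeholders.
import numpy as np

def iou(pred: np.ndarray, truth: np.ndarray) -> float:
    inter = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    return float(inter) / float(union) if union else 1.0  # both empty: agree

rng = np.random.default_rng(3)
pred  = rng.random((64, 64, 64)) > 0.98
truth = rng.random((64, 64, 64)) > 0.98
print(f"IoU = {iou(pred, truth):.3f}")
```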
Article
X-ray computed tomography (XCT) is a promising non-destructive evaluation technique for additively manufactured (AM) parts with complex shapes. Industrial XCT scanning is a relatively new development, and XCT has several acquisition parameters, whose effects are not fully understood, that a user can change for a scan. An artifact incorporating simulated defects of different sizes was produced using laser powder bed fusion (LPBF) AM. The influence of six XCT acquisition parameters was investigated experimentally based on a fractional factorial designed experiment; twenty experimental runs were performed. The noise level of the XCT images was affected by the acquisition parameters, and the importance of the acquisition parameters was ranked. The measurement results were further analyzed to understand the probability of detection (POD) of the simulated defects. The POD determination process is detailed, including estimation of the POD confidence limit curve using a bootstrap method. The results are interpreted in the context of the AM process and the XCT acquisition parameters.
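A bootstrap lower confidence bound on a POD curve, in the spirit of the procedure mentioned above, can be sketched as follows; the hit/miss data and the logistic model are synthetic assumptions, not the paper's measurements.

```python
# Case-resampling bootstrap of a logistic hit/miss POD fit; the 5th
# percentile of the resampled curves gives a one-sided 95% lower bound.
import numpy as np
from scipy.special import expit
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
flaw = rng.uniform(0.1, 2.0, 120)                          # defect sizes (mm)
hits = (rng.random(120) < expit(4 * (flaw - 0.7))).astype(int)

grid = np.linspace(0.1, 2.0, 50).reshape(-1, 1)
curves = []
for _ in range(500):
    idx = rng.integers(0, flaw.size, flaw.size)            # resample cases
    fit = LogisticRegression().fit(flaw[idx].reshape(-1, 1), hits[idx])
    curves.append(fit.predict_proba(grid)[:, 1])

pod_lo95 = np.percentile(curves, 5, axis=0)
detectable = grid[pod_lo95 >= 0.90]
print(f"a90/95 ~ {detectable.min():.2f} mm" if detectable.size else "not reached")
```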
Article
Assessing the reliability of non-destructive testing (NDT) techniques in detecting in-service fatigue cracks is vital for ensuring the structural integrity of aero engines. However, the requirement of a large number of in-service failed components with numerous initiating fatigue cracks makes it a cost-intensive methodology. Hence, laboratory samples with electrical discharge machined (EDM) notches representing fatigue cracks have also been used for NDT reliability studies. However, probability of detection (POD), a measure of NDT reliability, is usually a function of all crack dimensions rather than only length. This limits the applicability of EDM notches (minimum width of notch ~0.3 mm) as artificial fatigue cracks for POD studies. The current study demonstrates the methodology of generating cracks under laboratory conditions in nickel-based superalloy samples (representative aero engine material), mimicking in-service conditions such as crack tightness as low as 1 μm, crack tortuosity, transgranular nature, crack branching and multiple initiation sites of cracks. POD curves generated using these samples are demonstrated and the feasibility of this approach is discussed. © 2019 British Institute of Non-Destructive Testing. All Rights Reserved.
Article
In order to successfully implement Damage Tolerance (DT) methodology for aero-engines, Non-Destructive Testing (NDT) techniques are vital for assessing the remaining life of a component. Probability of Detection (POD), the standard measure of NDT reliability, is usually estimated as per the MIL-HDBK-1823A standard. The POD of any NDT technique can be estimated by both experimental and model-assisted methods. POD depends on many factors, such as material, geometry, defect characteristics, inspection technique, etc. These requirements place enormous limitations on generating experimental POD curves; hence, Model Assisted Probability of Detection (MAPOD) curves are currently in vogue. In this study, MAPOD approaches were demonstrated by addressing various issues related to the selection of the crack size distribution, challenges involved in censoring and regression, estimation of distribution parameters, etc. Ultrasonic testing of volumetric defects was chosen as the platform to discuss the challenges involved. A COMSOL Multiphysics based FEM numerical model, developed to simulate the ultrasonic response from a Ti-6Al-4V cylindrical block, was validated experimentally. The individual ultrasonic responses from various Flat Bottom Hole (FBH) defects following a lognormal distribution were then generated using the numerical model. The a90/95 (detecting a flaw with 90% probability and 95% confidence) value obtained from the POD curve was found to increase with an increase in the decision threshold.
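For reference, the signal-response model underlying such MAPOD studies (in the MIL-HDBK-1823A form) can be written as below; raising the decision threshold shifts the POD curve toward larger flaw sizes, consistent with the larger a90/95 reported above.

```latex
% Log-log signal-response model and the resulting POD curve
\ln \hat{a} = \beta_0 + \beta_1 \ln a + \varepsilon,
  \qquad \varepsilon \sim \mathcal{N}(0, \tau^2)

\mathrm{POD}(a) = \Pr\left[\hat{a} > \hat{a}_{\mathrm{dec}}\right]
  = \Phi\!\left(\frac{\beta_0 + \beta_1 \ln a - \ln \hat{a}_{\mathrm{dec}}}{\tau}\right)
```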
Chapter
The Federal Aviation Administration requires that transport aircraft be “damage tolerant.” That is, they must be: “evaluated to ensure that should serious fatigue, corrosion, or accidental damage occur within the operational life of the airplane, the remaining structure can withstand reasonable loads without failure or excessive structural deformation until the damage is detected. ” [1]
Article
The aim of this publication is to provide an overview of new methodologies for evaluating the reliability of NDE systems in accordance with the specific requirements of industrial applications. After a review of the substantive issues of the previous decades, guidance for the way forward is presented. For high safety demands, a quantitative POD (Probability of Detection) curve created from hit/miss experiments or signal response analysis, along with ROC (Receiver Operating Characteristic) curves, is typically produced. The modular reliability model distinguishes between the pure physical-technical influence, industrial application factors, and human factors. It helps to determine which factors can be established by modeling and which by open or blind trials. A new paradigm is offered in which the POD or reliability of the system is considered a function of the configuration of input variables and is used for optimization rather than for a final judgement. New approaches are considered that deal with real defects in a realistic environment and remain affordable yet precise, such as the Bayesian approach or model-assisted methods. Among the influencing parameters, human factors are of high importance. A systematic psychological approach helps to find out where the bottlenecks are and shows possibilities for improvement.
Article
It is not always only the size of the flaw that determines the severity of the flaw for the structure. In such cases, it is important to express the capability of the non-destructive testing (NDT) system to detect a flaw with respect to exactly those parameters that determine flaw severity. The multi-parameter reliability model presented in this article shows a way of calculating and expressing the probability of detection (POD) as a function of different influencing parameters, using numerically simulated NDT system responses and experimentally measured responses. A successful application of the model is demonstrated on the data from a transmit-receive longitudinal (TRL) ultrasonic inspection of a cast iron component. The POD of the surface-breaking semi-elliptical crack-like flaw is expressed as a function of its depth and length. In a direct comparison with the conventional signal response analysis, where the POD is expressed as a function of only the flaw size, the method provides a more comprehensive estimation of the reliability of the NDT system.
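A sketch of the multi-parameter idea with a logistic model over both flaw depth and length, so that POD becomes a surface rather than a curve; the hit/miss data are synthetic placeholders, not the TRL inspection data of the article.

```python
# Two-parameter POD: logistic regression on flaw depth and length.
# All data are synthetic assumptions.
import numpy as np
from scipy.special import expit
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
depth  = rng.uniform(0.5, 5.0, 300)    # mm
length = rng.uniform(2.0, 30.0, 300)   # mm
hits = (rng.random(300) < expit(-4 + 1.2 * depth + 0.15 * length)).astype(int)

model = LogisticRegression().fit(np.column_stack([depth, length]), hits)

def pod(d_mm: float, l_mm: float) -> float:
    return model.predict_proba([[d_mm, l_mm]])[0, 1]

# Same depth, different lengths -> different POD, which a size-only
# (one-parameter) model would average away.
print(f"POD(depth=2, length=5)  = {pod(2, 5):.2f}")
print(f"POD(depth=2, length=25) = {pod(2, 25):.2f}")
```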
Article
This paper presents a methodology based on Bayesian data fusion techniques applied to non-destructive and destructive tests for the structural assessment of historical constructions. The aim of the methodology is to reduce the uncertainty of the parameter estimation. The Young's modulus of granite stones was chosen as the example for the present paper. The methodology considers several levels of uncertainty, since the parameters of interest are treated as random variables with random moments. A new concept, the Trust Factor, is introduced to weight the uncertainty associated with each test result, expressed by its standard deviation, according to the higher or lower reliability of each test in predicting a given parameter.
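One way to read the Trust Factor concept is as a weight on each test's stated precision in a conjugate (normal) Bayesian fusion; the sketch below is an interpretation under that assumption, with invented numbers.

```python
# Precision-weighted fusion of test results, with a trust factor in (0, 1]
# discounting the stated precision of less reliable tests. Values invented.
import numpy as np

# (Young's modulus estimate in GPa, standard deviation, trust factor)
tests = [(20.0, 3.0, 1.0),   # e.g. destructive compression test
         (26.0, 4.0, 0.6),   # e.g. sonic estimate, trusted less
         (23.0, 5.0, 0.8)]

w = np.array([trust / sd**2 for _, sd, trust in tests])  # effective precisions
mu = np.array([m for m, _, _ in tests])
post_mean = (w * mu).sum() / w.sum()
post_std = 1.0 / np.sqrt(w.sum())
print(f"fused E = {post_mean:.1f} +/- {post_std:.1f} GPa")
```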
Article
Assessment of the probability of detection (POD) is used to evaluate the reliability of a non-destructive testing (NDT) system. POD is required in industries where a missed flaw might have grave consequences. If only artificial defects are evaluated, the POD can lead to wrong conclusions or even be invalid; a POD based on real flaws is needed. However, a small number of real flaws can lead to statistically insignificant or even incorrect results. This work presents an approach to obtain a significant POD result for the current dataset despite the small number of real defects. Two steps are necessary to assess an NDT system based on real flaws: first, evaluating the correlation between the NDT signal and the real size of the flaw; second, using a Bayesian statistical approach to assess the POD in spite of the small amount of data. The approach allows information from the POD evaluation of artificial defects to be included in the assessment of the POD of real flaws.
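For hit/miss data, the idea of carrying information from artificial defects into the real-flaw assessment can be sketched with a conjugate Beta-Binomial update; the counts below are invented for illustration.

```python
# Beta prior from artificial-defect trials, updated with the few real-flaw
# inspections. All counts are hypothetical.
from scipy import stats

# Artificial defects: 45 hits in 50 trials -> Beta(45 + 1, 5 + 1) prior
prior_hits, prior_misses = 45, 5
prior = stats.beta(prior_hits + 1, prior_misses + 1)

# Real flaws: 7 hits in 8 inspections; conjugate posterior update
real_hits, real_trials = 7, 8
post = stats.beta(prior_hits + 1 + real_hits,
                  prior_misses + 1 + (real_trials - real_hits))

print(f"prior mean POD     = {prior.mean():.2f}")
print(f"posterior mean POD = {post.mean():.2f}")
print(f"95% lower credible bound = {post.ppf(0.05):.2f}")
```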
Article
The evaluation of non-destructive testing (NDT) methods in terms of reliability is an increasing demand in various industries and applications. The probability of detection (POD) is the most frequently used method for this task. In the testing of holes with low-frequency eddy current testing, the frequently used one-parametric POD approach cannot be applied because its requirements, mainly linearity, cannot be met. Therefore, the use of a multi-parametric non-linear regression approach to calculate the POD is proposed. Instead of the numerical simulations commonly used in multi-parametric approaches, an analytical model is used. The goal of this work is to evaluate the reliability of the eddy current system for the testing of electron-beam welded copper canisters by calculating a POD with the help of different artificial hole-like defects. In this example, the multi-parametric non-linear regression approach is shown to be successful, enabling the combination of depth and diameter in the POD calculation.
Article
The Pascal Visual Object Classes (VOC) challenge consists of two components: (i) a publicly available dataset of images together with ground truth annotation and standardised evaluation software; and (ii) an annual competition and workshop. There are five challenges: classification, detection, segmentation, action classification, and person layout. In this paper we provide a review of the challenge from 2008–2012. The paper is intended for two audiences: algorithm designers, researchers who want to see what the state of the art is, as measured by performance on the VOC datasets, along with the limitations and weak points of the current generation of algorithms; and, challenge designers, who want to see what we as organisers have learnt from the process and our recommendations for the organisation of future challenges. To analyse the performance of submitted algorithms on the VOC datasets we introduce a number of novel evaluation methods: a bootstrapping method for determining whether differences in the performance of two algorithms are significant or not; a normalised average precision so that performance can be compared across classes with different proportions of positive instances; a clustering method for visualising the performance across multiple algorithms so that the hard and easy images can be identified; and the use of a joint classifier over the submitted algorithms in order to measure their complementarity and combined performance. We also analyse the community’s progress through time using the methods of Hoiem et al. (Proceedings of European Conference on Computer Vision, 2012) to identify the types of occurring errors. We conclude the paper with an appraisal of the aspects of the challenge that worked well, and those that could be improved in future challenges.
Article
Estimates of the probability of detection curve based on hit/miss data without reference to a parametric function or model are presented. The estimate is based on maximum likelihood estimation, making the sole assumption that the probability of detection curve is a continuous, monotonically increasing function. The calculations required for the estimated probabilities of detection are simple and easy to implement. The nonparametric estimates are similar in nature to binomial methods proposed in the literature, but have the advantage that a priori assumptions concerning how to aggregate data are not forced upon the analyst. It is proposed that the nonparametric fit is most useful as a comparison to fits from parametric models, thereby providing a visual goodness-of-fit check between the data and the parametric model.
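The monotone maximum-likelihood estimate described here coincides with isotonic regression (pool-adjacent-violators) applied to hit/miss data ordered by flaw size, which the sketch below uses; the data are synthetic.

```python
# Nonparametric monotone POD: isotonic regression is the maximum-likelihood
# fit under the sole assumption that POD increases with flaw size.
import numpy as np
from scipy.special import expit
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(6)
size = np.sort(rng.uniform(0.2, 3.0, 100))                    # flaw sizes (mm)
hits = (rng.random(100) < expit(3 * (size - 1.2))).astype(float)

pod_hat = IsotonicRegression(y_min=0.0, y_max=1.0).fit_transform(size, hits)

# Use alongside a parametric fit as a visual goodness-of-fit comparison.
for s, p in zip(size[::25], pod_hat[::25]):
    print(f"a = {s:.2f} mm -> POD ~ {p:.2f}")
```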
Article
The assessment of nondestructive flaw detection reliability is complex in character due to the varied engineering and scientific disciplines involved. The evolution of nondestructive flaw detection reliability demonstration and assessment has involved varied efforts by workers in various industries, applications, and environments. A significant database has been established and has contributed to a general understanding of the elements of inspection reliability. A considerable number of analyses have been performed to effect a better understanding of the problem and to identify critical factors in both inspection process performance and reliability assessment. This paper reviews the principal factors in nondestructive flaw detection process performance and suggests an alternative approach to the analysis of performance data. The approach includes consideration of the conditional probability character of flaw detection and of predictive modeling based on signal and noise analyses of flaw detection by instrumental techniques and by human operators.
Article
The economic drive towards using aircraft beyond their initial design life has created great interest in damage-tolerance (DT) based maintenance. The DT approach relies on routine nondestructive inspections (NDI) and requires that NDI performance be quantified in terms of probability of detection (POD) to determine safe inspection intervals. The most common approach for determining NDI POD is to perform inspections on representative components or specimens simulating the actual parts. This approach is practical but can be very expensive. A more economical approach may be to use actual field inspection data to obtain POD, which is particularly attractive for airframe inspection techniques, since most airframe structures cannot easily be simulated. There are a number of difficulties with this approach. Firstly, there is usually a very limited amount of field data, which may require special statistical treatment. Secondly, crack growth data must exist to allow the estimation of flaw sizes at the inspection sites at inspection times before the flaws were found. These factors and others affect the confidence in the calculated POD and must be quantified before POD data of this type can be used. In this work, data from full-scale fatigue tests were analyzed, and methods of overcoming the problems of small sample sizes and crack growth data requirements were investigated.
Webinar: Data: Perception vs. Reality with A
  • T Anbarasu
  • R Wenzel
  • J Singh
  • Vrana
Nondestructive Monitoring along the Product Life Cycle
  • Fraunhofer IZFP
Guidelines for NDE reliability determination and description
  • O Førli
  • K Ronold
Theory and Application of the Modular Approach to NDE Reliability Bridging the Gap Between Safety Requirements and Economy
  • C Müller
  • T Fritz
  • G.-R Tillack
  • C Bellon
  • M Scharmach
Probability of defect detection of Posiva's electron beam weld
  • D Kanzler
  • C Müller
  • J Pitkänen
NDE reliability data analysis: nondestructive evaluation and quality control
  • A P Berens
An approach to the question ‘How to account for human error in MAPOD?’
  • M Spies
  • H Rieder
A unified model of inspection and monitoring quality
  • E Bismut
  • D Straub
Statistical Assessment Within a Global Conception of Reliability of NDE. Non-Destructive Examination Practice and Results – State of the Art and PISC III Results
  • Ch Nockemann
  • G R Tillack
  • D Schnitger
Generating Meaningful Synthetic Ground Truth for Pore Detection in
  • P Fuchs
  • T Kröger
  • T Dierig
  • Ch Garbe
General principles of reliability assessment of nondestructive diagnostic systems and its applicability to the demining problem
  • Ch Müller
  • M Scharmach