Phimeca Engineering SA
  • Cournon-d’Auvergne, France
Recent publications
Metrology tools are used to check whether dimensions lie within their tolerance intervals. However, measurement errors cannot be avoided. This paper develops a method to partially correct the effect of such errors when their probability density function is known. We assume that the probability density function associated with the measured dimension is available, for instance identified after multiple parts have been manufactured and their dimensions measured. In this context, the probability density function of the measured value is the convolution product of the probability density functions of the measurement error and of the true value. Characterizing the true value is therefore a challenging task, as it involves a deconvolution. We propose a procedure based on: (i) the transformation of the integral involved in the convolution product into a sum using numerical integration; (ii) the definition of a set of non-linear equalities solved using the Newton-Raphson procedure. An application example is used to discuss the relevance of the proposed method and the influence of various parameters on the results.
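As a rough illustration of steps (i) and (ii), the sketch below discretizes the convolution product on a quadrature grid and solves for the parameters of the true-value distribution with a Newton-type solver. It assumes Gaussian shapes for both densities and invented numerical values; it is not the authors' implementation.

```python
# Minimal sketch (not the authors' code): recover the parameters of the
# true-value distribution when the measured-value PDF is the convolution
# of the true-value PDF with a known measurement-error PDF.
import numpy as np
from scipy.optimize import fsolve
from scipy.stats import norm

SIGMA_ERR = 0.05                     # assumed known std of the measurement error
x = np.linspace(9.0, 11.0, 401)      # quadrature grid for the dimension (mm)
dx = x[1] - x[0]

def convolved_pdf(y, mu, sigma):
    """Discretized convolution: sum_j f_X(x_j) * f_E(y - x_j) * dx."""
    fx = norm.pdf(x, mu, abs(sigma))
    fe = norm.pdf(y - x, 0.0, SIGMA_ERR)
    return np.sum(fx * fe) * dx

# "Observed" measured-value PDF, simulated here with known truth (10.0, 0.1)
y_pts = [9.95, 10.10]
f_obs = [convolved_pdf(y, 10.0, 0.1) for y in y_pts]

def residuals(theta):
    mu, sigma = theta
    return [convolved_pdf(y, mu, sigma) - f for y, f in zip(y_pts, f_obs)]

mu_hat, sigma_hat = fsolve(residuals, x0=[9.8, 0.2])  # Newton-type solver
print(mu_hat, abs(sigma_hat))        # should approach (10.0, 0.1)
```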
Rib-to-deck connections are among the most critical fatigue-prone locations in orthotropic steel decks. Cracks initiated at the root of a fillet weld in this location can propagate towards the deck plate due to transversal tension within the deck plate near the cracking area. Such cracks can put the structure in a very critical situation, since they can reach their critical length without being detected by inspection. The main goal of this paper is therefore to investigate the influence of the transversal tension on the crack propagation direction in this region. To this end, the loading and boundary conditions are translated from a 3D model to a 2D model to avoid the computational issues of 3D modelling. A 2D crack propagation analysis is then performed using the XFEM method, enabling the comparison of several repair solutions. To illustrate the proposed concepts, the paper investigates the fatigue problem on a real bridge case study and characterizes the effectiveness of two possible repair solutions to cope with such fatigue issues.
Metrology is extensively used in the manufacturing industry to determine whether the dimensions of parts are within their tolerance intervals. However, measurement errors cannot be avoided. Although metrology experts are aware of these errors, and are able to identify the different sources that contribute to them, little research has been devoted to developing metrology methods that account for such errors. The probability density function of the error is here assumed to be given as an input. This work deals with a batch of measurements and its statistical properties. The first proposed method corrects the effects of the errors on the distribution that characterizes the entire batch. A second method then estimates, for each individual measurement, the most likely dimension behind that measurement once the error is deducted; it builds on the output of the first method and combines it with Bayesian statistics. Only Gaussian distributions are considered in the paper. The relevance of both methods is shown through an example applied to simulated data.
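The second step lends itself to a compact illustration when everything is Gaussian: by standard conjugacy, the most likely true dimension behind a single measured value is a weighted average of the measurement and the batch mean. The function below is a minimal sketch under that assumption, with illustrative numbers.

```python
# Hedged sketch of the second step: given the batch-level true-value
# distribution N(mu_x, sig_x^2) recovered by the first method, and a
# Gaussian error N(0, sig_e^2), the posterior of the true dimension
# behind one measured value y follows from standard Gaussian conjugacy.
def corrected_measure(y, mu_x, sig_x, sig_e):
    """Posterior mean/std of the true dimension given one measurement y."""
    w = sig_x**2 / (sig_x**2 + sig_e**2)          # shrinkage weight
    post_mean = w * y + (1.0 - w) * mu_x          # pulled towards the batch mean
    post_std = (sig_x**2 * sig_e**2 / (sig_x**2 + sig_e**2)) ** 0.5
    return post_mean, post_std

# e.g. a measure of 10.18 mm in a batch N(10.0, 0.1^2) with error std 0.05
print(corrected_measure(10.18, mu_x=10.0, sig_x=0.1, sig_e=0.05))
```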
The unit influence line of a structure reflects its behaviour and changes in response to damage that may occur. An iterative algorithm is presented in this paper to obtain the shape of the instantaneous influence line of a bridge together with the relative axle loads of the trucks passing overhead. One great advantage of this approach is that the need for sensor calibration with pre-weighed trucks can be avoided. The only initial information needed is the measurement data and a preliminary estimate of the influence line based on engineering judgement. The concept of a so-called population unit influence line is also presented: an influence line found from a population of trucks instead of a single vehicle. An illustrative example is presented, in which strain data were collected on a reinforced concrete culvert. The robustness of the proposed algorithms is demonstrated, as is the influence of temperature on the results. The sensitivity of the population influence line to temperature suggests that it is likely to be equally sensitive to loss of stiffness in the structure.
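A toy version of such an iterative scheme, under strong simplifying assumptions (single-axle vehicles, synthetic data, least-squares updates), might alternate between re-estimating loads and re-estimating the influence line; the sketch below is illustrative only and not the paper's algorithm.

```python
# Illustrative alternating least-squares sketch: starting from a rough
# engineering guess, (a) estimate each vehicle's load given the current
# influence-line estimate, (b) re-estimate the influence line given the
# loads. All data are synthetic; loads are recovered only up to a scale.
import numpy as np

rng = np.random.default_rng(0)
n = 200                                   # samples per vehicle passage
x = np.linspace(0.0, 1.0, n)
il_true = np.sin(np.pi * x) ** 2          # synthetic "true" unit influence line
loads = rng.uniform(10.0, 40.0, size=30)  # unknown single-axle loads
signals = loads[:, None] * il_true + rng.normal(0.0, 0.2, (loads.size, n))

il = np.ones(n)                           # crude initial engineering guess
for _ in range(20):
    a = signals @ il / (il @ il)          # (a) least-squares load per passage
    il = a @ signals / (a @ a)            # (b) least-squares influence line
a = signals @ il / (il @ il)              # final load estimates

scale = loads[0] / a[0]                   # fix the scale ambiguity for checking
print(np.max(np.abs(scale * a - loads)))  # relative axle loads are recovered
```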
In this paper, damage indicators are presented for bridges based on measured data collected under uncontrolled traffic conditions. The test bridge considered is a small, stiff concrete culvert with low dynamic response, making it sub-optimal for monitoring with accelerometers. Largely static strain responses are measured, and the variability in vehicle weights and configurations is addressed by handling the data in statistical terms. Continuously recorded temperature data are used in an alternative and innovative way to validate the damage indicators. Temperature change causes stiffness change in concrete structures. It is posited that temperature can be used as a proxy for bridge damage: while the effects are clearly different, both cause changes in flexural rigidity and therefore in the bridge's response to load. It is demonstrated here, using field data, that the damage indicators are sensitive to temperature change. Strain and temperature data for a 2-year period are obtained for a culvert bridge in Slovenia. The defined damage indicators are calculated from the strain data and validated using the simultaneously recorded temperature data. Two ways of using the proposed damage indicators together with temperature information for bridge Structural Health Monitoring (SHM) are also presented: one includes the mean and standard deviation of the damage indicators with a compensation for the temperature effect; the other directly computes the modified Z-score, which is used for identifying outliers, and draws a 2D risk map with the damage indicator and temperature on the two axes. Keywords: Bridge; Safety; Field data; Damage Indicator; SHM
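For reference, the modified Z-score mentioned above is conventionally computed as 0.6745 times the deviation from the median divided by the median absolute deviation (Iglewicz and Hoaglin), with 3.5 as the usual outlier threshold; the sketch below applies it to a toy damage-indicator series (the threshold and data are illustrative, not from the paper).

```python
# Sketch of the outlier flagging step, using the standard modified
# Z-score 0.6745 * (x - median) / MAD; the 3.5 cutoff is the usual
# convention, not a value from the paper.
import numpy as np

def modified_z_scores(values):
    values = np.asarray(values, dtype=float)
    med = np.median(values)
    mad = np.median(np.abs(values - med))   # median absolute deviation
    return 0.6745 * (values - med) / mad

indicator = [1.02, 0.98, 1.01, 0.99, 1.00, 1.31]   # toy indicator series
z = modified_z_scores(indicator)
print(z, np.abs(z) > 3.5)                           # last point flagged
```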
In this paper, a methodology is presented to update the estimated level of corrosion damage using routine short-term health monitoring measurement data. The case study bridge is a reinforced concrete (RC) slab, subject to traffic loading from a weigh-in-motion database, with time-dependent deterioration modelled through corrosion. Different damage indicators based on different virtual sensor measurements (strain, deflection and rotation) are investigated. A method of comparing the performance of the damage indicators is presented, and their data are used to achieve a Bayesian update of prior estimates of bridge reinforcement corrosion. It is shown how, with few and uncertain data, the estimation of the deterioration level can be significantly improved, particularly with damage indicators based on rotation.
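A minimal grid-based sketch of such a Bayesian update is shown below, assuming a Gaussian prior on the corrosion level and a Gaussian observation error on a rotation-based indicator; all numbers are invented for illustration.

```python
# Hedged illustration of a Bayesian update of a prior corrosion estimate
# from a handful of noisy damage-indicator readings; grid, prior and
# noise level are invented, not taken from the paper.
import numpy as np
from scipy.stats import norm

theta = np.linspace(0.0, 0.5, 501)             # candidate corrosion levels
prior = norm.pdf(theta, loc=0.10, scale=0.08)  # prior estimate of corrosion

obs = [0.21, 0.19, 0.24]                       # few, uncertain indicator readings
sigma_obs = 0.05                               # assumed observation error std
like = np.ones_like(theta)
for d in obs:
    like *= norm.pdf(d, loc=theta, scale=sigma_obs)

post = prior * like
post /= post.sum() * (theta[1] - theta[0])     # normalize on the grid
print(theta[np.argmax(post)])                  # posterior mode sharpens the prior
```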
The Fukushima Daiichi Nuclear Power Plant (FDNPP) accident of March 11, 2011 led to a significant release of radionuclides into the environment. More than 99% of the activity released into the atmosphere was due to highly volatile radionuclides such as I, Te, Cs, Xe and Kr. Fairly quickly after the accident, the main release events were identified and their consequences roughly assessed. Most releases were dispersed over the Pacific Ocean, whereas about 20% was dispersed over the Japanese mainland, causing areas of significant deposit. Since then, the understanding of the different episodes has been greatly enhanced, and it is appropriate to review what happened in terms of releases into the atmosphere and the transport and deposition of radionuclides on Japanese territory. We describe here the current state of understanding of the release phase of the accident and the means used to achieve it. Numerous radiological measurements taken in the Japanese environment enabled scientists to substantially reconstruct the four main sequences of contamination, to identify the probable trajectories of the radioactive plumes, and to link them with precipitation data to explain the areas of deposition. The measurements were supplemented by modelling techniques. The most significant progress comes from the quantification of the atmospheric releases, the improvement of meteorological data to better take into account the influence of the complex orography on plume trajectories, and the modelling of deposition processes. Notwithstanding more realistic simulations, progress is still to be made to accurately estimate people's exposure during the release phase of the FDNPP accident. An important result is that the bulk of the deposition was mostly generated at the beginning of the precipitation, by light rain, in less than one hour. The scavenging of plumes transported at altitude generates zones of high deposition; these do not necessarily match the zones within which inhalation exposure to the radioactive plumes was largest.
The aim of this paper is to provide an overview of a statistical tolerance analysis technique developed for over-constrained mechanisms. Statistical tolerance analysis is a more practical and economical way of looking at tolerances and works on setting the tolerances so as to assure a desired functionality; however, most statistical tolerance analysis techniques are dedicated to iso-constrained mechanisms and simple over-constrained mechanisms, since they require an explicit assembly response function. The developed technique is based on: (i) the formalization of the geometrical behaviour of the mechanism by an implicit assembly response function; (ii) the formalization of the tolerancing requirement by the quantifier notion; (iii) the coupling of optimization techniques and reliability techniques.
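The quantifier notion can be illustrated on a toy over-constrained fit: two pins must simultaneously enter two clearance holes, so the parts assemble if and only if there exists a common plate displacement compatible with every clearance. The sketch below, with invented deviations and clearances, estimates the non-assembly probability by Monte Carlo; in this one-dimensional case the existential quantifier reduces to an interval intersection test, whereas the paper's technique addresses the general case with optimization and reliability methods.

```python
# Toy sketch of the quantifier idea on an over-constrained two-pin fit:
# the assembly response is implicit -- the parts assemble iff THERE EXISTS
# a plate displacement u with |e_i - u| <= c_i for every pin i. Numbers
# and the Gaussian deviation model are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
c = np.array([0.05, 0.05])                  # hole clearances (mm)
e = rng.normal(0.0, 0.02, size=(n, 2))      # pin position deviations (mm)

# the existential quantifier reduces to an interval intersection test here
feasible = (e - c).max(axis=1) <= (e + c).min(axis=1)
print("non-assembly probability ~", 1.0 - feasible.mean())
```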
Modeling studies on the atmospheric diffusion and deposition of the radiocesium associated with the Fukushima Dai-ichi Nuclear Power Plant accident are reviewed here, with a focus on a research collaboration between l'Institut de Radioprotection et de Sûreté Nucléaire (IRSN) - the French institute in charge of evaluating the consequences of nuclear accidents and advising authorities in case of a crisis - and the Meteorological Research Institute (MRI) of the Japan Meteorological Agency - an operational weather forecasting center in Japan. Modelers have come to recognize that wet deposition is one of the key processes, but the size of its influence remains unknown; they also know that simulation results vary, without knowing exactly why. Under the research collaboration, we aimed to understand the atmospheric processes, especially wet deposition, and to quantify the uncertainties of each component of our simulation using various numerical techniques, such as ensemble simulations, data assimilation, elemental process modeling, and inverse modeling. The outcomes of these collaborative research topics are presented in this paper. We also discuss future directions of atmospheric modeling studies: data assimilation using surface concentration measurement data of high temporal and spatial resolution, and the consideration of aerosol properties, such as size and hygroscopicity, in wet and dry deposition schemes.
This paper introduces the various aspects of bridge safety models. It combines the different models of load and resistance, involving both deterministic and stochastic variables. The actual safety, i.e. the probability of failure, is calculated using Monte Carlo simulation, accounting for localized damage of the bridge. A possible damage indicator is also presented, and the usefulness of updating the developed bridge safety model with regard to the damage indicator is examined.
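A minimal sketch of the Monte Carlo step is given below: failure occurs when the load effect exceeds the (possibly damaged) resistance. The distributions and the damage factor are illustrative assumptions, not the paper's calibrated models.

```python
# Minimal Monte Carlo failure-probability sketch in the spirit of the
# paper: failure when the load effect exceeds the resistance; the
# lognormal resistance, normal load and damage factor are illustrative.
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000
resistance = rng.lognormal(mean=np.log(1500.0), sigma=0.10, size=n)  # kNm
load = rng.normal(900.0, 150.0, size=n)                              # kNm
damage = 0.85                  # localized loss of capacity (illustrative)

pf = np.mean(load > damage * resistance)
print("estimated probability of failure:", pf)
```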
We present an efficient method based on the inclusion-exclusion principle to compute the reliability of systems in the presence of epistemic uncertainty. A known drawback of belief functions and other imprecise probabilistic theories is that their manipulation is computationally demanding. We therefore investigate conditions under which the measures of belief function theory are additive; when this property holds, the application of belief functions is more computationally efficient. It is shown that these conditions hold for minimal cuts and paths in reliability theory. A direct implication of this result is that the credal state (state of beliefs) about the failing (working) behavior of components does not affect the credal state about the working (failing) behavior of the system. This result is proven using a reliability analysis approach based on belief function theory. It implies that the bounding interval of the system's reliability can be obtained with two simple calculations using methods similar to those of classical probabilistic approaches. A discussion of the applicability of these theorems to non-coherent systems is also provided.
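The practical upshot, for a coherent system, can be sketched as follows: with interval-valued component failure probabilities, the system bounds follow from evaluating the same inclusion-exclusion formula once at each interval endpoint. The three-component system and intervals below are invented for illustration.

```python
# Sketch of the "two simple calculations" result for a coherent system:
# with interval-valued component failure probabilities, the system bounds
# follow from evaluating the inclusion-exclusion formula at the endpoints.
# The 3-component example and intervals are illustrative assumptions.
def system_unreliability(q1, q2, q3):
    """Minimal cuts {1,2} and {3}; independent components, inclusion-exclusion."""
    return q1 * q2 + q3 - q1 * q2 * q3

# component failure-probability intervals (epistemic uncertainty)
q_lo = (0.01, 0.02, 0.001)
q_hi = (0.05, 0.08, 0.010)

print("system unreliability in [",
      system_unreliability(*q_lo), ",", system_unreliability(*q_hi), "]")
```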
The spatial variability of stress fields resulting from polycrystalline aggregate calculations involving random grain geometry and crystal orientations is investigated. A periodogram-based method is proposed to identify the properties of homogeneous Gaussian random fields (power spectral density and related covariance structure). Based on a set of finite element polycrystalline aggregate calculations, the properties of the maximal principal stress field are identified. Two cases are considered, using either a fixed or a random grain geometry. The stability of the method with respect to the number of samples and the load level (up to 3.5% macroscopic deformation) is investigated.
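A one-dimensional analogue of the periodogram identification step is sketched below: periodograms of independent sample paths are averaged to estimate the power spectral density. The synthetic correlated sequences stand in for the stress field samples; all settings are illustrative.

```python
# Hedged 1-D analogue of the periodogram identification step: estimate a
# power spectral density by averaging periodograms of independent sample
# paths; the AR(1)-like synthetic sequences stand in for the stress field.
import numpy as np

rng = np.random.default_rng(7)
n, n_samples, dx = 512, 200, 1.0
rho = 0.9
fields = np.empty((n_samples, n))
fields[:, 0] = rng.normal(size=n_samples)
for i in range(1, n):                      # correlated Gaussian sequences
    fields[:, i] = rho * fields[:, i - 1] + np.sqrt(1 - rho**2) * rng.normal(size=n_samples)

freq = np.fft.rfftfreq(n, d=dx)
periodograms = (np.abs(np.fft.rfft(fields, axis=1)) ** 2) * dx / n
psd = periodograms.mean(axis=0)            # averaging reduces estimator variance
print(freq[:3], psd[:3])
```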
Seismic hazard curves provide the rate (or probability) of exceedance of different levels of a ground motion parameter (e.g., the peak ground acceleration, PGA) at a given geographical point and for a given time frame. Hence, to evaluate seismic hazard curves, one needs an occurrence model of earthquakes and an attenuation law of the ground motion with distance. Generally, the input data needed to define the occurrence model consist of magnitude values, either experimentally observed or, in the case of ancient earthquakes, indirectly inferred from historically recorded damage. In this paper, we sketch a full Bayesian methodology for estimating the parameters characterizing the seismic activity in pre-determined seismotectonic zones, given such a catalog of recorded magnitudes. The statistical model, following the peak-over-threshold formalism, consists of the distribution of the annual number of earthquakes exceeding a given magnitude, coupled with the probability density of the magnitudes, given that they exceed the threshold. As an example of the possible applications of the proposed methodology, the PGA is then evaluated at several sites of interest, while taking into account the uncertainty affecting the parameters of the magnitude distribution in several seismotectonic zones and the attenuation law. Finally, some perspectives are sketched.
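The peak-over-threshold ingredients can be sketched compactly: a Poisson rate of events above the threshold magnitude combined with an exponential (Gutenberg-Richter-like) tail yields the exceedance probability over a time frame. The values below are placeholders, not catalog fits, and the attenuation step is omitted.

```python
# Sketch of the peak-over-threshold pieces: a Poisson rate of events above
# the threshold magnitude and an exponential tail for the sizes; the
# numerical values are placeholders, not fitted to any catalog.
import numpy as np

lam0 = 0.2              # annual rate of events with magnitude above m0
m0, beta = 4.0, 2.0     # threshold magnitude and exponential tail parameter

def annual_exceedance_rate(m):
    """lambda(m) = lam0 * P(M > m | M > m0) for m >= m0."""
    return lam0 * np.exp(-beta * (m - m0))

def prob_exceedance(m, years):
    """Poisson model: P(at least one event above m in the time frame)."""
    return 1.0 - np.exp(-annual_exceedance_rate(m) * years)

print(prob_exceedance(5.5, years=50))   # e.g. magnitude 5.5, 50-year frame
```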
Accidental toxic release in a built environment: risk map (at 1.5 m above ground level) emphasizing the safe (risk below 2.5%), dangerous (risk above 97.5%) and uncertain zones (risk between 2.5% and 97.5%), estimated using Monte Carlo sampling with 10,000 runs of the vector GP surrogate model. The orange line corresponds to the median critical toxic load contour level, while the black line corresponds to that obtained from the deterministic risk assessment study.
Methodologies for quantitative risk assessment of CO2 storage operations are currently scarce, mostly because of the lack of experience in this field and the relatively significant degree of uncertainty regarding the intrinsic properties of the subsurface and the processes occurring after injection starts. This paper presents a practical approach designed to perform a quantitative risk assessment in an uncertain context. Our approach is illustrated by a realistic case study (Paris Basin, France), conceived to be representative of the level of information available in the early stages of a project. It follows the risk assessment principles of the international standard ISO 31000:2009, adapted to account for the specificities and challenges of subsurface operations. After establishing the context of the specific case study, the main risks are identified and two different risk scenarios are analyzed: the risk of brine leakage from an abandoned well and the risk of subsurface use conflict. These scenarios were selected to give a comprehensive overview of different types of analysis in terms of available data, modeling tools and uncertainty management methodologies. The main benefit of this paper is to propose an approach, based on existing risk assessment standards, best practices and analysis tools, which allows an objective quantitative risk analysis taking the uncertainties into account, thereby enabling fully informed decision-making when evaluating risk acceptability.
Industrial components subjected to random vibration fatigue (loading specified by a power spectral density, PSD, or a sinusoidal spectrum) are frequently designed using numerical simulations to estimate the lifetime and verify the behavior of the mechanical system. Finite element simulations require a significant number of input parameters (material properties such as Young's modulus and density, the stiffness of joints between components, the choice of model elements...). Some of these parameters are identified from tests with unavoidable uncertainty, and others are subject to inherent variability. It is important to know the influence of this variability/uncertainty on the model responses. Innovative methods have been developed to determine the sensitivity of model responses to uncertain parameters in order to study the robustness of the design. The identification of the most influential parameters, together with the study of their variations, then makes it possible to determine result dispersions for the modal analysis (eigenfrequencies, mode shapes), the random vibration calculation (RMS stresses) and the damage calculation. This methodology provides an estimate of the probability of system failure instead of the binary result obtained by a deterministic design method.
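As a toy illustration of propagating parameter variability to a modal result, the sketch below computes the scatter of the eigenfrequency of a one-degree-of-freedom idealization, f = sqrt(k/m)/(2*pi), under uncertain stiffness and mass; the values are invented, not from the referenced simulations.

```python
# Toy illustration of propagating input variability to a modal response:
# eigenfrequency of a 1-DOF idealization with uncertain stiffness and
# mass (values are illustrative, not from the referenced simulations).
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
k = rng.normal(2.0e6, 1.0e5, size=n)    # stiffness (N/m), ~5% scatter
m = rng.normal(12.0, 0.6, size=n)       # mass (kg), ~5% scatter

f = np.sqrt(k / m) / (2.0 * np.pi)
print("eigenfrequency: mean %.1f Hz, std %.1f Hz" % (f.mean(), f.std()))
```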
Wells drilled through low-permeability caprock are potential connections between a CO2 storage reservoir and overlying sensitive targets such as aquifers and targets located at the surface. Wellbore integrity can be compromised by in situ operations (drilling, completion, operation and abandonment) or by geochemical degradation of the caprock-cement-casing system. We present here an experimental set-up in the underground rock laboratory of Mont-Terri (St Ursanne, canton of Jura, Switzerland): drilling and well completion in the laboratory will be carried out with the aim of reconstructing interfaces between the caprock, the cement and the casing steel close to those observed in situ. These well features will then be immersed in a CO2 stream for a given period before a final over-coring. Such an experiment should provide new insights into the quality of bonding at the casing/cement/clay interfaces and its evolution due to geochemical reactions. In parallel, a modeling effort is being performed, focused on both the geochemical and transport aspects of the interactions between the fluids and the well compartments.
Many parameters used for predicting times to failure of structures due to fatigue are uncertain, and their variations have a big influence on the real lifetime. This paper focuses on a global methodology to take the main sources of variability into account in fatigue prediction for stay cables. The first step of this methodology is to model the variability of each parameter. Loading is one of the most important sources of variability, as is the strength of the stay cable. Finally, reliability is assessed using Monte Carlo simulations.
In the present paper, a probabilistic model for fatigue crack growth in welded steel details in road bridges is presented. The probabilistic model takes the influence of bending stresses in the joints into account. The bending stresses can be introduced by, e.g., misalignment or redistribution of stresses in the structure. The fatigue stress ranges are estimated from traffic measurements and a generic bridge model. Based on the probabilistic models for resistance and load, the reliability is estimated for a typical welded steel detail. The results show that large misalignments in the joints can have a significant influence on the reliability level.
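A common crack-growth ingredient of such probabilistic models is Paris' law, da/dN = C (ΔK)^m with ΔK = Y Δσ sqrt(πa); the sketch below integrates it block by block of cycles for a single deterministic parameter set (all values illustrative), whereas the paper treats the inputs probabilistically.

```python
# Hedged sketch of the crack-growth ingredient of such a model: Paris'
# law da/dN = C * (DeltaK)^m with DeltaK = Y * Dsigma * sqrt(pi * a),
# integrated in blocks of cycles; C, m and the stress range are
# illustrative deterministic values, not the paper's probabilistic inputs.
import numpy as np

C, m_exp, Y = 2.0e-13, 3.0, 1.12   # Paris constants (m/cycle, MPa*sqrt(m)) and geometry factor
a, a_crit = 0.5e-3, 20.0e-3        # initial and critical crack size (m)
dsigma = 60.0                      # fatigue stress range (MPa)

n_cycles, block = 0, 10_000        # integrate in blocks of cycles
while a < a_crit:
    dk = Y * dsigma * np.sqrt(np.pi * a)   # stress intensity range (MPa*sqrt(m))
    a += C * dk**m_exp * block
    n_cycles += block
print("cycles to critical size: %.2e" % n_cycles)
```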
18 members
Sylvain Girard
  • Reliability and robustness
Oussama Braydi
  • Reliability and Uncertainties
Raphaël Périllat
  • Uncertainties
Information
Address
Cournon-d’Auvergne, France