Amandine Marrel

  • Research Scientist
  • Engineer at Atomic Energy and Alternative Energies Commission

About

65 Publications · 8,090 Reads · 1,851 Citations
Introduction
Amandine Marrel currently works as a research scientist at CEA Cadarache, in the Nuclear Energy Division. Her research interests involve the development of probabilistic and statistical approaches for uncertainty quantification, design, metamodeling, and sensitivity analysis of computer experiments, in support of safety studies for nuclear reactors and environmental impact studies.
Current institution
Atomic Energy and Alternative Energies Commission
Current position
  • Engineer
Additional affiliations
November 2008 - March 2011
IFP Energies nouvelles
Position
  • Researcher
Description
  • Research engineer working on uncertainty management in numerical simulation, mainly motivated by oil-exploration problems
May 2011 - present
Atomic Energy and Alternative Energies Commission
Position
  • Engineer
Description
  • Research engineer developing probabilistic and statistical approaches for uncertainty quantification, design, metamodeling, and sensitivity analysis of computer experiments, in support of safety studies for nuclear reactors.

Publications (65)
Article
The Hilbert-Schmidt independence criterion (HSIC) is a dependence measure based on reproducing kernel Hilbert spaces. This measure can be used for the global sensitivity analysis of numerical simulators whose objective is to identify the most influential inputs on the output(s) of the code. For this purpose, HSIC-based sensitivity measures and inde...
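As background for these HSIC-based sensitivity measures, the standard biased estimator HSIC_n = Tr(KHLH)/n² can be sketched in a few lines of NumPy. This is a minimal illustration only, assuming Gaussian kernels with a standard-deviation bandwidth heuristic; the function names are hypothetical and this is not the paper's implementation:

```python
import numpy as np

def gaussian_gram(x, bandwidth):
    """Gram matrix of the Gaussian (RBF) kernel on a 1-D sample."""
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def hsic(x, y):
    """Biased V-statistic estimator of HSIC between two 1-D samples.

    Bandwidths follow the standard-deviation heuristic (an assumption).
    Values near 0 suggest independence; larger values indicate dependence.
    """
    n = len(x)
    K = gaussian_gram(x, x.std())
    L = gaussian_gram(y, y.std())
    H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    return np.trace(K @ H @ L @ H) / n ** 2

rng = np.random.default_rng(0)
x = rng.normal(size=500)
print(hsic(x, x ** 2))                 # dependent pair: clearly positive
print(hsic(x, rng.normal(size=500)))   # independent pair: close to 0
```

In a global sensitivity analysis, `x` would be a sampled model input and `y` the corresponding code output; inputs are then ranked (or tested for independence) by their HSIC values.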
Article
In decommissioning projects of nuclear facilities, radiological characterisation aims to estimate the quantity and spatial distribution of different radionuclides. To carry out the estimation, measurements are performed on site to obtain preliminary information. The usual industrial practice consists of applying spatial interpolation tools (as the...
Preprint
Full-text available
Nowadays, numerical models are widely used in most engineering fields to simulate the behaviour of complex systems, such as power plants or wind turbines in the energy sector. Those models are nevertheless affected by uncertainties of different natures (numerical, epistemic) which can affect the reliability of their predictions. We devel...
Preprint
In decommissioning projects of nuclear facilities, the radiological characterisation step aims to estimate the quantity and spatial distribution of different radionuclides. To carry out the estimation, measurements are performed on site to obtain preliminary information. The usual industrial practice consists of applying spatial interpolation tools...
Article
Within the framework of best-estimate-plus-uncertainty approaches, the assessment of model parameter uncertainties, associated with numerical simulators, is a key element in safety analysis. The results (or outputs) of the simulation must be compared and validated against experimental values, when such data are available. This validation step, as p...
Preprint
Numerical simulators are widely used to model physical phenomena and global sensitivity analysis (GSA) aims at studying the global impact of the input uncertainties on the simulator output. To perform GSA, statistical tools based on inputs/output dependence measures are commonly used. We focus here on the Hilbert-Schmidt independence criterion (HSI...
Article
Numerical simulators are widely used to model physical phenomena and global sensitivity analysis (GSA) aims at studying the global impact of the input uncertainties on the simulator output. To perform GSA, statistical tools based on inputs/output dependence measures are commonly used. We focus here on the Hilbert–Schmidt independence criterion (HSI...
Article
Full-text available
The Hilbert-Schmidt Independence Criterion (HSIC) is a dependence measure based on reproducing kernel Hilbert spaces that is widely used to test independence between two random vectors. The choice of kernel, however, remains delicate. In this work, we develop a new HSIC-based aggregated procedure which avoids such a kernel choice, and provide theoretical...
Article
Full-text available
Finding outliers in infinite-dimensional functional vector spaces is a problem widely encountered in industry, for data that may originate from physical measurements or numerical simulations. An automatic and unsupervised process of outlier identification can help ensure the quality of a dataset (trimming), validate the results of industrial simulation codes,...
Article
In the framework of risk assessment in nuclear accident analysis, best-estimate computer codes associated with probabilistic modeling of uncertain input variables are used to estimate safety margins. Often, a first step in such uncertainty quantification studies is to identify the critical configurations (or penalizing, in the sense of a prescribed...
Conference Paper
The ever increasing computational power available for engineers in the context of nuclear safety has widely extended the use of numerical simulators that complement the safety reports and that provide an insight and facilitate the physical analysis of accidental transients. In this context, the French nuclear industry mainly relies upon the code CA...
Article
The manufacturing quality of industrial pieces is linked to the intrinsic concentration field of certain chemical species. Thus, from pointwise measurements sparsely distributed over the piece, a robust estimation of the maximum concentration is very valuable. To this aim, Gaussian process regression models may be used, providing useful confidence i...
Article
Physical phenomena are commonly modeled by numerical simulators. Such codes can take as input a high number of uncertain parameters and it is important to identify their influences on the outputs via a global sensitivity analysis (GSA). However, these codes can be time consuming, which prevents a GSA based on the classical Sobol' indices, requiring...
Article
In the framework of uncertainty treatment in numerical simulation, Global sensitivity analysis (GSA) aims at determining (qualitatively or quantitatively) how the variability of the uncertain inputs affects the model output. However, from reliability and risk management perspectives, GSA might be insufficient to capture the influence of the inputs...
Article
Full-text available
Within the framework of the French 4th-generation Sodium-cooled Fast Reactor safety assessment, methodology on VVUQ (Verification, Validation, Uncertainty Quantification) is conducted to demonstrate that the CEA's thermal-hydraulic Scientific Computation Tools (SCTs) are effective and operational for design and safety studies purposes on this type...
Chapter
Functional data consist, in the most typical case, of one-dimensional curves that represent the evolution of some physical parameter of interest with time. However, the analysis of this kind of object is far from simple, and the possibility of treating contaminated data is a classical problem that can arise in this framework as frequently as i...
Preprint
In the framework of risk assessment in nuclear accident analysis, best-estimate computer codes are used to estimate safety margins. Several inputs of the code can be uncertain, due to a lack of knowledge but also to the particular choice of accidental scenario being considered. The objective of this work is to identify the most penalizing (or criti...
Article
Full-text available
Usually, simulation tools are validated based on experimental data considering a best estimate simulation case; however, there is no quantification of this validation, which remains based on rough expert judgment. This paper presents advanced validation treatment of the simulation tool OCARINa devoted to unprotected transient overpower (UTOP) accid...
Article
The analysis of expensive numerical simulators usually requires metamodelling techniques, among which Gaussian process regression is one of the most popular approaches. Frequently, the code outputs correspond to physical quantities with a behavior which is known a priori: Chemical concentrations lie between 0 and 1, the output is increasing with re...
Article
Within the framework of the Generation IV Sodium-cooled Fast Reactor (SFR) R&D program of CEA (French Commissariat à l’Energie Atomique et aux Energies Alternatives), the reactor behavior in case of severe accidents is assessed through experiments and simulations. Such accidents are usually simulated with mechanistic calculation tools (such as SAS-...
Article
This paper presents a statistical methodology for a quantified validation of the OCARINa simulation tool which models the Unprotected Transient OverPower (UTOP) accidents. This validation on CABRI experiments is based on a Best Estimate Plus Uncertainties (BEPU) approach. To achieve this, a general methodology based upon recent statistical techniqu...
Article
Within the framework of long term prospective studies, an inherently-safe Sodium Fast Reactor (SFR) core, named CADOR (Core with Adding DOppleR effect), is studied at CEA (French commissariat à l’énergie atomique et aux énergies alternatives). This core concept mainly relies on its enhanced Doppler effect. The behavior of this innovative core desig...
Article
In the framework of the estimation of safety margins in nuclear accident analysis, a quantitative assessment of the uncertainties tainting the results of computer simulations is essential. Accurate uncertainty propagation (estimation of high probabilities or quantiles) and quantitative sensitivity analysis may call for several thousand code simulat...
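For context, the brute-force Monte Carlo approach to such quantile and exceedance-probability estimation, whose cost in code simulations these works seek to reduce, can be sketched with NumPy. The lognormal toy model, threshold and sample size below are illustrative assumptions, not from the publication:

```python
import numpy as np

# Stand-in for a costly simulator: output = exp(input), with input ~ N(0, 1),
# so the output is lognormal and its true 95% quantile is exp(1.645).
model = lambda x: np.exp(x)

rng = np.random.default_rng(1)
outputs = model(rng.normal(size=100_000))   # propagate the input uncertainty

q95 = np.quantile(outputs, 0.95)            # high quantile of the output
p_exceed = (outputs > 5.0).mean()           # probability of exceeding a threshold
print(q95, p_exceed)
```

With a real simulator, each of the 100,000 evaluations may take hours, which is precisely why metamodels or accelerated sampling schemes are substituted for direct propagation.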
Preprint
Global sensitivity analysis (GSA) of numerical simulators aims at studying the global impact of the input uncertainties on the output. To perform the GSA, statistical tools based on inputs/output dependence measures are commonly used. We focus here on dependence measures based on reproducing kernel Hilbert spaces: the Hilbert-Schmidt Independence C...
Preprint
Dependence measures based on reproducing kernel Hilbert spaces, also known as Hilbert-Schmidt Independence Criterion and denoted HSIC, are widely used to statistically decide whether or not two random vectors are dependent. Recently, non-parametric HSIC-based statistical tests of independence have been performed. However, these tests lead to the qu...
Preprint
In the framework of the estimation of safety margins in nuclear accident analysis, a quantitative assessment of the uncertainties tainting the results of computer simulations is essential. Accurate uncertainty propagation (estimation of high probabilities or quantiles) and quantitative sensitivity analysis may call for several thousand of code simu...
Article
In the context of sensitivity analysis of complex phenomena in the presence of uncertainty, we motivate and make precise the idea of orienting the analysis towards a critical domain of the studied phenomenon. We briefly review related approaches in the literature and propose a more general and systematic approach. Nonparametric measures of depende...
Thesis
Full-text available
Complex computer codes are commonly used to model, investigate and predict many physical phenomena. These simulators often take a large number of uncertain input variables and can provide high-dimensional results as output. In this framework, stochastic methods have become an essential tool for uncertainty treatment and analysis in computer exp...
Article
Full-text available
In a case of radioactive release in the environment, modeling the radionuclide atmospheric dispersion is particularly useful for emergency response procedures and risk assessment. For this, the CEA has developed a numerical simulator, called Ceres-Mithra, to predict spatial maps of radionuclide concentrations at different instants. This computer co...
Chapter
This section presents several sensitivity analysis methods to deal with spatial and/or temporal models. Focusing on the variance-based approach, solutions are proposed to perform global sensitivity analysis with functional inputs and outputs. Some of these solutions are illustrated on two industrial case studies: an environmental model for flood ri...
Article
Full-text available
This paper proposes a new methodology to quantify the uncertainties associated to multiple dependent functional random variables, linked to a quantity of interest, called the covariate. The proposed methodology is composed of two main steps. First, the functional random variables are decomposed on a functional basis. The decomposition basis is comp...
Article
Complex computer codes are often too time-expensive to be used directly to perform uncertainty, sensitivity, optimization and robustness analyses. A widely accepted method to circumvent this problem consists of replacing CPU-time-expensive computer models with CPU-inexpensive mathematical functions, called metamodels. For example, the Gaussian proces...
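A zero-mean Gaussian process metamodel of the kind referred to here can be sketched directly with NumPy. This is a minimal sketch assuming a squared-exponential kernel with a fixed length-scale; real studies estimate the hyperparameters, and all names are illustrative:

```python
import numpy as np

def rbf(a, b, length=0.3):
    """Squared-exponential covariance between two 1-D point sets."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)

def gp_predict(x_train, y_train, x_new, noise=1e-8):
    """Posterior mean and variance of a zero-mean GP metamodel."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))  # jitter for stability
    Ks = rbf(x_new, x_train)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks @ alpha
    cov = rbf(x_new, x_new) - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.diag(cov)

# Treat an analytic function as a stand-in for an expensive simulator.
simulator = lambda x: np.sin(2 * np.pi * x)
x_train = np.linspace(0.0, 1.0, 8)   # small design of simulator runs
y_train = simulator(x_train)
x_new = np.array([0.05, 0.45, 0.85])
mean, var = gp_predict(x_train, y_train, x_new)
print(mean, var)
```

Once fitted on a small design of code runs, the cheap `gp_predict` replaces the simulator in uncertainty, sensitivity or optimization loops, and the posterior variance quantifies the metamodel's own approximation error.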
Article
Within the framework of the Generation IV Sodium-cooled Fast Reactors (SFR) R&D program of CEA, the core behavior in case of severe accidents is being assessed. Such transients are usually simulated with mechanistic codes (such as SIMMER-III). As a complement to this code, which gives the reference accidental transient, a physico-statistical approach i...
Article
In this paper, we define a new methodology to perform sensitivity analysis of a computer simulation code in a particular case, whose study is motivated by a nuclear reliability application. This particular framework is characterized by three features. The first feature is that this kind of code is computationally expensive, which limits the number...
Article
Full-text available
Within the framework of the generation IV Sodium Fast Reactors (SFR) R&D program of CEA (French commissariat à l’énergie atomique et aux énergies alternatives), the safety in case of accidents is assessed. These accidental scenarios involve very complex transient phenomena. To get round the difficulty of modelling them, only ‘Bounding’ (most damagi...
Article
Full-text available
Physical phenomena are often studied using numerical simulators. Such computer codes are function of uncertain input parameters and a global sensitivity analysis (GSA) can be performed to identify their impacts on the simulator outputs. Sobol' indices, based on output variance decomposition, are commonly used to perform quantitative GSA. For many y...
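The variance-based Sobol' indices mentioned above are classically estimated with a Monte Carlo "pick-freeze" scheme, whose heavy evaluation cost motivates the alternatives studied in these works. A minimal sketch follows; the additive toy model, for which the first-order indices are analytically S1 = 1/5 and S2 = 4/5, is an illustrative assumption:

```python
import numpy as np

def first_order_sobol(f, d, n=100_000, seed=0):
    """Monte Carlo pick-freeze estimator of first-order Sobol' indices.

    f : vectorized model taking an (n, d) array of inputs uniform on [0, 1].
    Returns one index per input, each ideally in [0, 1].
    """
    rng = np.random.default_rng(seed)
    A = rng.random((n, d))
    B = rng.random((n, d))
    yA = f(A)
    mu, var = yA.mean(), yA.var()
    indices = []
    for i in range(d):
        ABi = B.copy()
        ABi[:, i] = A[:, i]          # freeze input i, resample the others
        yABi = f(ABi)
        indices.append((np.mean(yA * yABi) - mu ** 2) / var)
    return np.array(indices)

# Additive toy model: Var(x1) = 1/12, Var(2*x2) = 4/12, so S1 = 0.2, S2 = 0.8.
model = lambda x: x[:, 0] + 2.0 * x[:, 1]
print(first_order_sobol(model, d=2))
```

The (d + 1) * n model evaluations this requires are exactly what becomes prohibitive for expensive simulators, hence the screening and metamodel-based strategies described in these publications.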
Conference Paper
Full-text available
Within the framework of the generation IV Sodium Fast Reactors, the safety in case of severe accidents is assessed. On this basis, CEA has developed a new physical tool to model the accident initiated by the Total Instantaneous Blockage (TIB) of a sub-assembly. This TIB simulator depends on many uncertain input parameters related to the core...
Article
Within the framework of the generation IV Sodium Fast Reactors, the safety in case of severe accidents is assessed. On this basis, CEA has developed a new physical tool to model the accident initiated by the Total Instantaneous Blockage (TIB) of a sub-assembly. This TIB simulator depends on many uncertain input parameters. This paper aims at...
Article
In this paper, we define a new methodology to perform sensitivity analysis of a computer simulation code in a particular case, whose study is motivated by a nuclear reliability application. This particular framework is characterized by three features. The first feature is that this kind of code is computationally expensive, which limits the number...
Chapter
This section presents several sensitivity analysis methods to deal with spatial and/or temporal models. Focusing on the variance-based approach, solutions are proposed to perform global sensitivity analysis with functional inputs and outputs. Some of these solutions are illustrated on two industrial case studies: an environmental model for flood ri...
Article
Full-text available
Physical phenomena are commonly modeled by numerical simulators. Such codes can take as input a high number of uncertain parameters and it is important to identify their influences via a global sensitivity analysis (GSA). However, these codes can be time consuming which prevents a GSA based on the classical Sobol' indices, requiring too many simula...
Article
To evaluate the consequences on human health of radionuclide releases in the environment, numerical simulators are used to model the radionuclide atmospheric dispersion. These codes can be time consuming and depend on many uncertain variables related to the radionuclide, the release or the weather conditions. These variables are of different kinds: scalar, funct...
Article
Full-text available
Gaussian process modeling is one of the most popular approaches for building a metamodel in the case of expensive numerical simulators. Frequently, the code outputs correspond to physical quantities with a behavior which is known a priori: Chemical concentrations lie between 0 and 1, the output is increasing with respect to some parameter, etc. Sev...
Article
Full-text available
Global sensitivity analysis methods, used to quantify the influence of uncertain input variables on the variability of numerical model responses, are applicable to deterministic computer codes. Deterministic means here that the same set of input variables always gives the same output value. This paper proposes a global sensitivity analysis meth...
Article
Full-text available
Reservoir engineering studies involve a large number of parameters with great uncertainties. To ensure correct future production, a comparison of possible scenarios in managing related uncertainties is needed. Comparisons can be performed with more information than only a single mean case for each scenario. The Bayesian formalism is well tailored t...
Article
The global sensitivity analysis of a complex numerical model often calls for the estimation of variance-based importance measures, named Sobol' indices. Metamodel-based techniques have been developed in order to replace the cpu time-expensive computer code with an inexpensive mathematical function, which predicts the computer code output. The commo...
Article
Full-text available
To perform the global sensitivity analysis of a complex and cpu time expensive code, a mathematical function built from a small number of simulations referred to as a metamodel can be used to approximate the code. In some applications like oil reservoir simulations, the code output can depend on complex stochastic inputs such as random permeability...
Article
Full-text available
Complex computer codes, for instance those simulating physical phenomena, are often too time-expensive to be used directly to perform uncertainty, sensitivity, optimization and robustness analyses. A widely accepted method to circumvent this problem consists of replacing CPU-time-expensive computer models with CPU-inexpensive mathematical functions, called...
Article
Global sensitivity analysis of complex numerical models can be performed by calculating variance-based importance measures of the input variables, such as the Sobol indices. However, these techniques, requiring a large number of model evaluations, are often unacceptable for time expensive computer codes. A well-known and widely used decision consis...
Conference Paper
Full-text available
In some studies requiring predictive numerical models, it can be advantageous to replace cpu time expensive computer models by cpu-inexpensive mathematical functions, called metamodels. In this paper, we focus on the Gaussian process metamodel whose construction requires an initial design of computer model simulations. A numerical study compares di...
Article
Complex computer codes are often too time-expensive to be used directly to perform uncertainty propagation studies, global sensitivity analysis, or to solve optimization problems. A well-known and widely used method to circumvent this inconvenience consists of replacing the complex computer code by a reduced model, called a metamodel, or a response...
Article
Full-text available
Within the framework of environmental impact and risk-management studies, numerical models are used to simulate, explain and predict pollutant transfers. These computer codes take as input a large number of parameters tainted with uncertainties (geophysical and chemical parameters, etc.) and can prove costly in computing ti...
