# Amandine Marrel

Atomic Energy and Alternative Energies Commission | CEA · Département d’Études des Réacteurs (DER)

Research Scientist

## About

- 48 Publications
- 5,978 Reads

- 1,292 Citations (since 2016)

Introduction

Amandine Marrel currently works as a research scientist at CEA Cadarache in the Nuclear Energy Division.
Her research interests involve the development of probabilistic and statistical approaches for uncertainty quantification, design, metamodeling and sensitivity analysis of computer experiments, in support of safety studies for nuclear reactors and environmental impact studies.

## Publications

Publications (48)

In the framework of risk assessment in nuclear accident analysis, best-estimate computer codes associated with probabilistic modeling of uncertain input variables are used to estimate safety margins. Often, a first step in such uncertainty quantification studies is to identify the critical configurations (or penalizing, in the sense of a prescribed...

The ever-increasing computational power available to engineers in the context of nuclear safety has widely extended the use of numerical simulators that complement the safety reports and that provide insight into, and facilitate, the physical analysis of accidental transients. In this context, the French nuclear industry mainly relies upon the code CA...

The manufacturing quality of industrial parts is linked to the intrinsic concentration field of certain chemical species. Thus, from point measurements sparsely distributed over the part, a robust estimate of the maximum concentration is very valuable. To this aim, Gaussian process regression models may be used, providing useful confidence i...

Physical phenomena are commonly modeled by numerical simulators. Such codes can take as input a high number of uncertain parameters and it is important to identify their influences on the outputs via a global sensitivity analysis (GSA). However, these codes can be time consuming, which prevents a GSA based on the classical Sobol' indices, requiring...

In the framework of uncertainty treatment in numerical simulation, global sensitivity analysis (GSA) aims at determining (qualitatively or quantitatively) how the variability of the uncertain inputs affects the model output. However, from reliability and risk management perspectives, GSA might be insufficient to capture the influence of the inputs...

Within the framework of the French 4th-generation Sodium-cooled Fast Reactor safety assessment, methodology on VVUQ (Verification, Validation, Uncertainty Quantification) is conducted to demonstrate that the CEA's thermal-hydraulic Scientific Computation Tools (SCTs) are effective and operational for design and safety studies purposes on this type...

Functional data consist, in the most typical case, of one-dimensional curves that represent the evolution of some physical parameter of interest over time. However, the analysis of such objects is far from simple, and the possibility of treating contaminated data is a classical problem that can arise in this framework as frequently as i...

In the framework of risk assessment in nuclear accident analysis, best-estimate computer codes are used to estimate safety margins. Several inputs of the code can be uncertain, due to a lack of knowledge but also to the particular choice of accidental scenario being considered. The objective of this work is to identify the most penalizing (or criti...

Usually, simulation tools are validated based on experimental data considering a best estimate simulation case; however, there is no quantification of this validation, which remains based on rough expert judgment. This paper presents advanced validation treatment of the simulation tool OCARINa devoted to unprotected transient overpower (UTOP) accid...

The analysis of expensive numerical simulators usually requires metamodelling techniques, among which Gaussian process regression is one of the most popular approaches. Frequently, the code outputs correspond to physical quantities with a behavior which is known a priori: Chemical concentrations lie between 0 and 1, the output is increasing with re...

Within the framework of the Generation IV Sodium-cooled Fast Reactor (SFR) R&D program of CEA (French Commissariat à l’Energie Atomique et aux Energies Alternatives), the reactor behavior in case of severe accidents is assessed through experiments and simulations. Such accidents are usually simulated with mechanistic calculation tools (such as SAS-...

This paper presents a statistical methodology for a quantified validation of the OCARINa simulation tool which models the Unprotected Transient OverPower (UTOP) accidents. This validation on CABRI experiments is based on a Best Estimate Plus Uncertainties (BEPU) approach. To achieve this, a general methodology based upon recent statistical techniqu...

Within the framework of long term prospective studies, an inherently-safe Sodium Fast Reactor (SFR) core, named CADOR (Core with Adding DOppleR effect), is studied at CEA (French commissariat à l’énergie atomique et aux énergies alternatives). This core concept mainly relies on its enhanced Doppler effect. The behavior of this innovative core desig...

In the framework of the estimation of safety margins in nuclear accident analysis, a quantitative assessment of the uncertainties tainting the results of computer simulations is essential. Accurate uncertainty propagation (estimation of high probabilities or quantiles) and quantitative sensitivity analysis may call for several thousand code simulat...

Global sensitivity analysis (GSA) of numerical simulators aims at studying the global impact of the input uncertainties on the output. To perform the GSA, statistical tools based on inputs/output dependence measures are commonly used. We focus here on dependence measures based on reproducing kernel Hilbert spaces: the Hilbert-Schmidt Independence C...

Dependence measures based on reproducing kernel Hilbert spaces, also known as Hilbert-Schmidt Independence Criterion and denoted HSIC, are widely used to statistically decide whether or not two random vectors are dependent. Recently, non-parametric HSIC-based statistical tests of independence have been performed. However, these tests lead to the qu...
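To make the HSIC statistic concrete, here is a hedged NumPy sketch of the standard biased estimator HSIC(X, Y) ≈ tr(KHLH)/(n−1)², with Gaussian kernels and the median bandwidth heuristic. Function names and defaults are illustrative, not taken from these papers:

```python
import numpy as np

def hsic(x, y, sigma_x=None, sigma_y=None):
    """Biased empirical HSIC with Gaussian kernels.

    x, y: 1-D samples of equal length n. Bandwidths default to the
    median heuristic. Illustrative sketch, not the authors' code.
    """
    n = len(x)

    def gram(v, sigma):
        d2 = (v[:, None] - v[None, :]) ** 2      # squared pairwise distances
        if sigma is None:
            sigma = np.sqrt(np.median(d2[d2 > 0]) / 2)  # median heuristic
        return np.exp(-d2 / (2 * sigma ** 2))

    K = gram(np.asarray(x, float), sigma_x)
    L = gram(np.asarray(y, float), sigma_y)
    H = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

rng = np.random.default_rng(0)
x = rng.normal(size=200)
y_dep = x ** 2                        # strongly dependent on x (but uncorrelated)
y_ind = rng.normal(size=200)          # independent of x
h_dep, h_ind = hsic(x, y_dep), hsic(x, y_ind)
```

Note that HSIC flags the nonlinear x → x² dependence that a correlation coefficient would miss, which is precisely why these kernel-based measures are attractive for screening simulator inputs.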

In the framework of the estimation of safety margins in nuclear accident analysis, a quantitative assessment of the uncertainties tainting the results of computer simulations is essential. Accurate uncertainty propagation (estimation of high probabilities or quantiles) and quantitative sensitivity analysis may call for several thousand code simu...

In the context of sensitivity analysis of complex phenomena in the presence of uncertainty, we motivate and make precise the idea of orienting the analysis towards a critical domain of the studied phenomenon. We give a brief history of related approaches in the literature, and propose a more general and systematic approach. Nonparametric measures of depende...

Complex computer codes are commonly used to model, investigate and predict many physical phenomena. These simulators often need a large number of uncertain input variables and can provide as output high-dimensional results. In this framework, stochastic methods have become an essential tool for the uncertainty treatment and analysis in computer exp...

In the case of a radioactive release into the environment, modeling the radionuclide atmospheric dispersion is particularly useful for emergency response procedures and risk assessment. For this, the CEA has developed a numerical simulator, called Ceres-Mithra, to predict spatial maps of radionuclide concentrations at different instants. This computer co...

This section presents several sensitivity analysis methods to deal with spatial and/or temporal models. Focusing on the variance-based approach, solutions are proposed to perform global sensitivity analysis with functional inputs and outputs. Some of these solutions are illustrated on two industrial case studies: an environmental model for flood ri...

This paper proposes a new methodology to quantify the uncertainties associated with multiple dependent functional random variables, linked to a quantity of interest, called the covariate. The proposed methodology is composed of two main steps. First, the functional random variables are decomposed on a functional basis. The decomposition basis is comp...

Complex computer codes are often too time expensive to be directly used to perform uncertainty, sensitivity, optimization and robustness analyses. A widely accepted method to circumvent this problem consists in replacing cpu-time expensive computer models by cpu inexpensive mathematical functions, called metamodels. For example, the Gaussian proces...
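For readers unfamiliar with the Gaussian process metamodel recurring in these abstracts, a minimal zero-mean, RBF-kernel sketch in NumPy follows. Hyperparameter values and names are hypothetical; this is a didactic stand-in, not the implementation used in these studies:

```python
import numpy as np

def gp_fit_predict(X, y, Xnew, length=0.3, noise=1e-8):
    """Zero-mean GP regression with an RBF kernel.

    Returns the posterior mean and standard deviation at Xnew.
    `length` and `noise` are illustrative fixed hyperparameters.
    """
    def k(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * length ** 2))

    K = k(X, X) + noise * np.eye(len(X))         # nugget for numerical stability
    Ks = k(Xnew, X)
    alpha = np.linalg.solve(K, y)
    mean = Ks @ alpha                             # posterior mean
    cov = k(Xnew, Xnew) - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.sqrt(np.clip(np.diag(cov), 0.0, None))

# Emulate an 'expensive' code with a handful of runs, then predict cheaply.
X = np.linspace(0, 1, 8)[:, None]
y = np.sin(2 * np.pi * X[:, 0])
mean, std = gp_fit_predict(X, y, np.array([[0.5]]))
```

Once fitted on a small design of simulator runs, the metamodel's cheap predictions (with uncertainty) can stand in for the code in sensitivity, optimization, or robustness loops, which is the point made in the abstract above.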

Within the framework of the Generation IV Sodium-cooled Fast Reactors (SFR) R&D program of CEA, the core behavior in case of severe accidents is being assessed. Such transients are usually simulated with mechanistic codes (such as SIMMER-III). As a complement to this code, which gives reference accidental transient, a physico-statistical approach i...

In this paper, we define a new methodology to perform sensitivity analysis of a computer simulation code in a particular case, whose study is motivated by a nuclear reliability application. This particular framework is characterized by three features. The first feature is that this kind of code is computationally expensive, which limits the number...

Within the framework of the Generation IV Sodium Fast Reactors (SFR) R&D program of CEA (French commissariat à l’énergie atomique et aux énergies alternatives), safety in case of accidents is assessed. These accidental scenarios involve very complex transient phenomena. To get around the difficulty of modelling them, only ‘Bounding’ (most damagi...

Physical phenomena are often studied using numerical simulators. Such computer codes are function of uncertain input parameters and a global sensitivity analysis (GSA) can be performed to identify their impacts on the simulator outputs. Sobol' indices, based on output variance decomposition, are commonly used to perform quantitative GSA. For many y...
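The variance-based Sobol' indices mentioned throughout these abstracts can be estimated by plain Monte Carlo "pick-freeze" sampling when the model is cheap enough. A minimal NumPy sketch follows; function names, the toy model, and the sample size are illustrative, not taken from the papers:

```python
import numpy as np

def first_order_sobol(model, d, n, rng):
    """Pick-freeze Monte Carlo estimate of first-order Sobol' indices
    for a model with d independent uniform inputs on [0, 1]."""
    A = rng.random((n, d))
    B = rng.random((n, d))
    yA, yB = model(A), model(B)
    var_y = np.concatenate([yA, yB]).var()
    S = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]          # 'freeze' input i from B, keep the rest from A
        # Saltelli (2010)-style estimator of the partial variance V_i
        S[i] = np.mean(yB * (model(ABi) - yA)) / var_y
    return S

# Toy linear 'code': X1 dominates, X2 is minor, X3 is inert,
# so analytically S = (16/17, 1/17, 0).
model = lambda X: 4.0 * X[:, 0] + X[:, 1]
S = first_order_sobol(model, d=3, n=20000, rng=np.random.default_rng(1))
```

The cost (n × (d + 2) model runs here) is exactly why the surrounding abstracts turn to metamodels or to screening measures such as HSIC when each simulation takes hours.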

Within the framework of the Generation IV Sodium Fast Reactors, safety in case of severe accidents is assessed. In this context, CEA has developed a new physical tool to model the accident initiated by the Total Instantaneous Blockage (TIB) of a sub-assembly. This TIB simulator depends on many uncertain input parameters related to the core...

Within the framework of the Generation IV Sodium Fast Reactors, safety in case of severe accidents is assessed. In this context, CEA has developed a new physical tool to model the accident initiated by the Total Instantaneous Blockage (TIB) of a sub-assembly. This TIB simulator depends on many uncertain input parameters. This paper aims at...

Physical phenomena are commonly modeled by numerical simulators. Such codes can take as input a high number of uncertain parameters and it is important to identify their influences via a global sensitivity analysis (GSA). However, these codes can be time consuming, which prevents a GSA based on the classical Sobol' indices, requiring too many simula...

To evaluate the consequences on human health of radionuclide releases in the environment, numerical simulators are used to model the radionuclide atmospheric dispersion. These codes can be time consuming and depend on many uncertain variables related to the radionuclide, release or weather conditions. These variables are of different kinds: scalar, funct...

Gaussian process modeling is one of the most popular approaches for building a metamodel in the case of expensive numerical simulators. Frequently, the code outputs correspond to physical quantities with a behavior which is known a priori: Chemical concentrations lie between 0 and 1, the output is increasing with respect to some parameter, etc. Sev...

The global sensitivity analysis method used to quantify the influence of uncertain input variables on the variability in numerical model responses is applicable to deterministic computer codes. Deterministic means here that the same set of input variables always gives the same output value. This paper proposes a global sensitivity analysis meth...

Reservoir engineering studies involve a large number of parameters with great uncertainties. To ensure correct future production, a comparison of possible scenarios in managing related uncertainties is needed. Comparisons can be performed with more information than only a single mean case for each scenario. The Bayesian formalism is well tailored t...

The global sensitivity analysis of a complex numerical model often calls for the estimation of variance-based importance measures, named Sobol' indices. Metamodel-based techniques have been developed in order to replace the cpu time-expensive computer code with an inexpensive mathematical function, which predicts the computer code output. The commo...

To perform the global sensitivity analysis of a complex and cpu time expensive code, a mathematical function built from a small number of simulations referred to as a metamodel can be used to approximate the code. In some applications like oil reservoir simulations, the code output can depend on complex stochastic inputs such as random permeability...

Complex computer codes, for instance simulating physical phenomena, are often too time expensive to be directly used to perform uncertainty, sensitivity, optimization and robustness analyses. A widely accepted method to circumvent this problem consists in replacing cpu time-expensive computer models by cpu-inexpensive mathematical functions, called...

Global sensitivity analysis of complex numerical models can be performed by calculating variance-based importance measures of the input variables, such as the Sobol indices. However, these techniques, requiring a large number of model evaluations, are often unacceptable for time expensive computer codes. A well-known and widely used decision consis...

In some studies requiring predictive numerical models, it can be advantageous to replace cpu time expensive computer models by cpu-inexpensive mathematical functions, called metamodels. In this paper, we focus on the Gaussian process metamodel whose construction requires an initial design of computer model simulations. A numerical study compares di...

Complex computer codes are often too time expensive to be directly used to perform uncertainty propagation studies, global sensitivity analysis or to solve optimization problems. A well known and widely used method to circumvent this inconvenience consists in replacing the complex computer code by a reduced model, called a metamodel, or a response...

In the framework of environmental impact and risk-control studies, numerical models are used to simulate, explain and predict pollutant transfers. These computer codes take as input a large number of parameters tainted with uncertainty (geophysical, chemical parameters, etc.) and can prove costly in computing ti...

## Projects

Project (1)

- Sodium thermohydraulics at various scales
- Development of evaluation tools (fast-running physical tools coupled with advanced statistical techniques) for the three main accident scenarios.