About
36 Publications · 6,657 Reads · 895 Citations
Introduction
Additional affiliations
May 2016 - May 2019
March 2013 - April 2016
Education
October 2009 - May 2014
August 2004 - June 2009
Publications (36)
Surface faulting earthquakes are known to cluster in time from historical and palaeoseismic studies, but the mechanism(s) responsible for clustering, such as fault interaction, strain-storage, and evolving dynamic topography, are poorly quantified, and hence not well understood. We present a quantified replication of observed earthquake clustering...
In this study, we present an adaptive multilevel Monte Carlo (AMLMC) algorithm for approximating deterministic, real-valued, bounded linear functionals that depend on the solution of a linear elliptic PDE with a lognormal diffusivity coefficient and geometric singularities in bounded domains of Rd. Our AMLMC algorithm is built on the results of the...
We present an adaptive multilevel Monte Carlo (AMLMC) algorithm for approximating deterministic, real-valued, bounded linear functionals that depend on the solution of a linear elliptic PDE with a lognormal diffusivity coefficient and geometric singularities in bounded domains of $\mathbb{R}^d$. Our AMLMC algorithm is built on the results of the we...
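The multilevel Monte Carlo idea underlying the AMLMC algorithm can be illustrated with a toy sketch (not the paper's implementation: the lognormal random input, the midpoint-rule "PDE solver", and all sample counts below are assumptions for illustration). The estimator telescopes the expectation over a hierarchy of discretization levels, spending many samples on cheap coarse levels and few on expensive fine ones:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_omega(n, rng):
    # lognormal random input, mimicking a lognormal diffusivity (assumption)
    return np.exp(0.5 * rng.standard_normal(n))

def q_level(omega, level):
    # level-l approximation of Q(omega) = integral_0^1 sin(omega*x) dx,
    # standing in for a PDE-based quantity of interest: midpoint rule
    # on 2**level subintervals (finer level = more accurate, more costly)
    m = 2 ** level
    x = (np.arange(m) + 0.5) / m
    return np.sin(np.outer(omega, x)).mean(axis=1)

def mlmc(levels, samples_per_level, rng):
    # telescoping sum E[Q_L] = E[Q_0] + sum_l E[Q_l - Q_{l-1}],
    # each correction estimated with independent samples
    est = 0.0
    for l, n in zip(levels, samples_per_level):
        omega = sample_omega(n, rng)
        if l == levels[0]:
            est += q_level(omega, l).mean()
        else:
            est += (q_level(omega, l) - q_level(omega, l - 1)).mean()
    return est

estimate = mlmc(levels=[2, 3, 4, 5], samples_per_level=[4000, 1000, 250, 60], rng=rng)

# reference: Q(omega) = (1 - cos(omega))/omega is exact here, so compare
# against a plain Monte Carlo average of the exact quantity
omega_ref = sample_omega(200_000, rng)
reference = ((1.0 - np.cos(omega_ref)) / omega_ref).mean()
```

The adaptive variant in the paper additionally refines the spatial mesh non-uniformly near geometric singularities; this sketch only shows the level-telescoping structure that both share.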
The potential of a full-margin rupture along the Cascadia subduction zone poses a significant threat over a populous region of North America. Previous probabilistic tsunami hazard assessment studies produced hazard curves based on simulated predictions of tsunami waves, either at low resolution or at high resolution for a local area or under limite...
To assess whether continental extension and seismic hazard are spatially-localized on single faults or spread over wide regions containing multiple active faults, we investigated temporal and spatial slip-rate variability over many millennia using in-situ ³⁶Cl cosmogenic exposure dating for active normal faults near Athens, Greece. We study a ~ NN...
The potential of a full-margin rupture along the Cascadia subduction zone poses a significant threat over a populous region of North America. Traditional probabilistic tsunami hazard assessments produce hazard maps based on simulated prediction of tsunami waves either under limited ranges of scenarios or at low resolution, due to cost. We use the G...
An optimal experimental set‐up maximizes the value of data for statistical inferences. The efficiency of strategies for finding optimal experimental set‐ups is particularly important for experiments that are time‐consuming or expensive to perform. In the situation when the experiments are modeled by partial differential equations (PDEs), multilevel...
An optimal experimental set-up maximizes the value of data for statistical inferences and predictions. The efficiency of strategies for finding optimal experimental set-ups is particularly important for experiments that are time-consuming or expensive to perform. For instance, in the situation when the experiments are modeled by Partial Differentia...
Earthquakes are known to cluster in time, from historical and palaeoseismic studies, but the mechanism(s) responsible for clustering, such as evolving dynamic topography, fault interaction, and strain-storage in the crust are poorly quantified, and hence not well understood. We note that differential stress values are (1) output by calculations of...
We interpret uncertainty in a model for seismic wave propagation by treating the model parameters as random variables, and apply the Multilevel Monte Carlo method to reduce the cost of approximating expected values of selected, physically relevant, quantities of interest (QoI) with respect to the random variables. Targeting source inversion problem...
This paper proposes an extension of the Multi-Index Stochastic Collocation (MISC) method for forward uncertainty quantification (UQ) problems in computational domains of shape other than a square or cube, by exploiting isogeometric analysis (IGA) techniques. Introducing IGA solvers to the MISC algorithm is very natural since they are tensor-based P...
An optimal experimental set-up maximizes the value of data for statistical inference and prediction, which is particularly important for experiments that are time consuming or expensive to perform. In the context of partial differential equations (PDEs), multilevel methods have been proven in many cases to dramatically reduce the computational comp...
In this paper, we present the VOLNA-OP2 tsunami model and implementation; a finite-volume non-linear shallow-water equation (NSWE) solver built on the OP2 domain-specific language (DSL) for unstructured mesh computations. VOLNA-OP2 is unique among tsunami solvers in its support for several high-performance computing platforms: central processing un...
Over the past 20 years, analyzing the abundance of the isotope chlorine-36 (³⁶Cl) has emerged as a popular tool for geologic dating. In particular, it has been observed that ³⁶Cl measurements along a fault plane can be used to study the timings of past ground displacements during earthquakes, which in turn can be used to improve existing seismic ha...
We interpret uncertainty in the parameters of a model for seismic wave propagation by treating the parameters as random variables, and we apply the Multilevel Monte Carlo (MLMC) method to reduce the cost of approximating expected values of selected, physically relevant, quantities of interest (QoI) with respect to the random variables. Aiming to so...
This paper proposes an extension of the Multi-Index Stochastic Collocation method (MISC) for forward Uncertainty Quantification (UQ) problems in computational domains of more generic shapes than a square/cube, by exploiting Isogeometric analysis (IGA) techniques. Introducing IGA solvers within the MISC algorithm is very natural since they are tenso...
In calculating expected information gain in optimal Bayesian experimental design, the computation of the inner loop in the classical double-loop Monte Carlo requires a large number of samples and suffers from underflow if the number of samples is small. These drawbacks can be avoided by using an importance sampling approach. We present a computatio...
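The underflow issue in the classical double-loop Monte Carlo estimator of expected information gain, and the standard log-sum-exp remedy, can be sketched on a toy linear-Gaussian model (the model, the design variable, and all sample sizes are assumptions for illustration; the paper's importance-sampling construction is more general):

```python
import numpy as np

rng = np.random.default_rng(1)

def logsumexp(a, axis):
    # numerically stable log(sum(exp(a))): subtracting the max avoids
    # the underflow that direct exponentiation of log-likelihoods causes
    m = np.max(a, axis=axis, keepdims=True)
    return np.squeeze(m, axis) + np.log(np.sum(np.exp(a - m), axis=axis))

def eig_dlmc(design, n_outer=2000, n_inner=2000, sig_p=1.0, sig_n=0.5):
    # double-loop Monte Carlo estimate of expected information gain for
    # the toy model y = design*theta + noise, theta ~ N(0, sig_p^2)
    theta = sig_p * rng.standard_normal(n_outer)
    y = design * theta + sig_n * rng.standard_normal(n_outer)
    # outer loop: log-likelihood of each synthetic datum under its own theta
    log_like = (-0.5 * ((y - design * theta) / sig_n) ** 2
                - np.log(sig_n * np.sqrt(2 * np.pi)))
    # inner loop: log-evidence log p(y) averaged over fresh prior samples,
    # evaluated in log space via logsumexp
    theta_in = sig_p * rng.standard_normal(n_inner)
    log_lik_matrix = (-0.5 * ((y[:, None] - design * theta_in[None, :]) / sig_n) ** 2
                      - np.log(sig_n * np.sqrt(2 * np.pi)))
    log_evidence = logsumexp(log_lik_matrix, axis=1) - np.log(n_inner)
    return np.mean(log_like - log_evidence)

d = 2.0
estimate = eig_dlmc(d)
# linear-Gaussian models admit a closed-form EIG to check against
exact = 0.5 * np.log(1.0 + (d * 1.0 / 0.5) ** 2)
```

Importance sampling, as in the paper, replaces the prior samples in the inner loop with samples concentrated where the likelihood is large, so far fewer inner samples are needed for the same accuracy.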
Over the past twenty years, analyzing the abundance of the isotope chlorine-36 (³⁶Cl) has emerged as a popular tool for geologic dating. In particular, it has been observed that ³⁶Cl measurements along a fault plane can be used to study the timings of past ground displacements during earthquakes, which in turn can be used to improve existing seismi...
In this paper, we present the VOLNA-OP2 tsunami model and implementation; a finite volume non-linear shallow water equations (NSWE) solver built on the OP2 domain specific language for unstructured mesh computations. VOLNA-OP2 is unique among tsunami solvers in its support for several high performance computing platforms: CPUs, the Intel Xeon Phi,...
Isogeometric Analysis (IGA) typically adopts tensor-product splines and NURBS as a basis for the approximation of the solution of PDEs. In this work, we investigate to which extent IGA solvers can benefit from the so-called sparse-grids construction in its combination technique form, which was first introduced in the early 90s in the context of the...
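The sparse-grids combination technique referenced above can be sketched in two dimensions with simple quadrature standing in for the IGA solver (an illustration only: the midpoint rule, the test integrand, and the level cap are assumptions). Full tensor-grid results on anisotropic level pairs are combined with alternating weights so that the mixed fine-fine grids are never needed:

```python
import numpy as np

def tensor_quad(f, levels):
    # midpoint-rule quadrature on a tensor grid with 2**l_i points
    # per direction, standing in for a tensor-product solver
    grids = [(np.arange(2 ** l) + 0.5) / 2 ** l for l in levels]
    X, Y = np.meshgrid(*grids, indexing="ij")
    return f(X, Y).mean()

def combination_technique(f, L):
    # classical 2D combination formula:
    #   I_L = sum_{l1+l2=L} Q_(l1,l2) - sum_{l1+l2=L-1} Q_(l1,l2)
    total = 0.0
    for l1 in range(L + 1):
        total += tensor_quad(f, (l1, L - l1))
    for l1 in range(L):
        total -= tensor_quad(f, (l1, L - 1 - l1))
    return total

f = lambda x, y: np.exp(x + y)
sparse = combination_technique(f, 6)
exact = (np.e - 1.0) ** 2   # integral of exp(x+y) over the unit square
```

The cost is a sum of anisotropic tensor grids of at most 2^L points per direction, instead of the 2^L × 2^L full grid, which is the saving the combination technique brings to tensor-based solvers such as IGA.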
Computer simulators can be computationally intensive to run over a large number of input values, as required for optimization and various uncertainty quantification tasks. The standard paradigm for the design and analysis of computer experiments is to employ Gaussian random fields to model computer simulators. Gaussian process models are trained on...
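The standard Gaussian-process emulation paradigm described above can be sketched in a few lines of numpy (a minimal illustration, not the paper's model: the squared-exponential kernel, its hyperparameters, and the stand-in "simulator" are assumptions):

```python
import numpy as np

def rbf_kernel(a, b, length=0.3, var=1.0):
    # squared-exponential covariance between 1-D input arrays
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gp_fit_predict(x_train, y_train, x_test, noise=1e-6):
    # textbook GP regression equations via a Cholesky factorization
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    Ks = rbf_kernel(x_test, x_train)
    mean = Ks @ alpha                       # predictive mean
    v = np.linalg.solve(L, Ks.T)
    var = rbf_kernel(x_test, x_test).diagonal() - np.sum(v ** 2, axis=0)
    return mean, var                        # predictive mean and variance

# stand-in for an expensive simulator, evaluated at a few design points
simulator = lambda x: np.sin(2 * np.pi * x)
x_train = np.linspace(0.0, 1.0, 9)
x_test = np.array([0.13, 0.5, 0.87])
mean, var = gp_fit_predict(x_train, simulator(x_train), x_test)
max_err = np.max(np.abs(mean - simulator(x_test)))
```

Once trained on a handful of expensive runs, the emulator's cheap mean and variance predictions can drive optimization and uncertainty-quantification loops in place of the simulator itself.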
Vacuum/pressure swing adsorption is an attractive and often energy efficient separation process for some applications. However, there is often a trade-off between the different objectives: purity, recovery and power consumption. Identifying those trade-offs is possible through use of multi-objective optimisation methods but this is computationally...
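The trade-off identification at the heart of such multi-objective optimisation can be illustrated by a non-dominated (Pareto) filter (a generic sketch, not the paper's optimiser; the candidate designs and objective values below are invented for illustration, with purity negated so that every objective is minimised):

```python
import numpy as np

def pareto_front(points):
    # indices of points not dominated by any other point,
    # where all objectives are to be minimised
    pts = np.asarray(points, dtype=float)
    keep = []
    for i, p in enumerate(pts):
        # p is dominated if some other point is <= in every objective
        # and strictly < in at least one
        dominated = np.any(np.all(pts <= p, axis=1) & np.any(pts < p, axis=1))
        if not dominated:
            keep.append(i)
    return keep

# toy candidate designs as (power consumption, -purity) pairs (illustrative)
candidates = [(3.0, -0.90), (2.0, -0.95), (4.0, -0.99),
              (5.0, -0.90), (2.5, -0.80)]
front = pareto_front(candidates)
```

The surviving points form the Pareto front: designs for which neither objective can be improved without worsening the other, which is exactly the trade-off set an engineer inspects.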
Optimization based process design tools are most useful when combined with the human engineer’s insight. Further insight can be gained through the use of these tools by encouraging the exploration of the design space. Visualization is one technique which makes it easier for an engineer to understand the designs identified by an optimization tool. T...
In this work we consider quasi-optimal versions of the Stochastic Galerkin method for solving linear elliptic PDEs with stochastic coefficients. In particular, we consider the case of a finite number $N$ of random inputs and an analytic dependence of the solution of the PDE with respect to the parameters in a polydisc of the complex plane $\mathbb{C}^N$. We s...
In this work we explore the extension of the quasi-optimal sparse grids method proposed in our previous work "On the optimal polynomial approximation of stochastic PDEs by Galerkin and Collocation methods" to a Darcy problem where the permeability is modeled as a lognormal random field. We propose an explicit a-priori/a-posteriori procedure for the...
Pressure swing adsorption (PSA) is a cyclic adsorption process for gas separation and purification with the potential for high productivity compared to alternative separation processes. The design of a cost-competitive and highly productive PSA process requires the optimisation of the process conditions with respect to the usually conflicting objec...
This paper describes the testing, comparison and application of global sensitivity techniques for the study of the impact of the stream impurities on CO2 pipeline failure. Global sensitivity analysis through non-intrusive generalised polynomial chaos expansion with sparse grids is compared to more common techniques and is found to achieve superior...
Pressure swing adsorption (PSA) is a cyclic adsorption process for gas separation and purification. PSA offers a broad range of design possibilities influencing the device behaviour. In the last decade much attention has been devoted towards simulation and optimisation of various PSA cycles. The PSA beds are modelled with hyperbolic/parabolic parti...
In this work we focus on the numerical approximation of the solution u of a linear elliptic PDE with stochastic coefficients. The problem is rewritten as a parametric PDE and the functional dependence of the solution on the parameters is approximated by multivariate polynomials. We first consider the Stochastic Galerkin method, and rely on sharp es...
In this work we first focus on the Stochastic Galerkin approximation of the solution u of an elliptic stochastic PDE. We rely on sharp estimates for the decay of the coefficients of the spectral expansion of u on orthogonal polynomials to build a sequence of polynomial subspaces that features better convergence properties compared to standard polyn...
Much attention has recently been devoted to the development of Stochastic Galerkin (SG) and Stochastic Collocation (SC) methods for uncertainty quantification. An open and relevant research topic is the comparison of these two methods. By introducing a suitable generalization of the classical sparse grid SC method, we are able to compare SG and SC...
Much attention has recently been devoted to the development of Stochastic Galerkin (SG) and Stochastic Collocation (SC) methods for uncertainty quantification. An open and relevant research topic is the comparison of these two methods. By introducing a suitable generalization of the classical sparse grid SC method, we are able to compare SG and S...