Article

Probabilistic quantification of tsunami current hazard using statistical emulation

Abstract

In this paper, statistical emulation is shown to be an essential tool for the end-to-end physical and numerical modelling of local tsunami impact, i.e. from the earthquake source to tsunami velocities and heights. To surmount the prohibitive computational cost of running a large number of simulations, the emulator, constructed from 300 training simulations of a validated tsunami code, yields 1 million predictions. This constitutes a record for any realistic tsunami code to date and is a leap in tsunami science, since high-risk but low-probability hazard thresholds can now be quantified. To illustrate the efficacy of emulation, we map probabilistic representations of maximum tsunami velocities and heights at around 200 locations around Karachi port. The 1 million predictions comprehensively sweep through a range of possible future tsunamis originating from the Makran Subduction Zone (MSZ). We rigorously model each step in the tsunami life cycle: the first use of the three-dimensional subduction geometry Slab2 in the MSZ, the most refined fault segmentation in the MSZ, the first sediment enhancement of seabed deformation (up to 60% locally) and a bespoke unstructured meshing algorithm. Owing to this synthesis of emulation and meticulous numerical modelling, we also discover substantial local variations of currents and heights.
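For readers unfamiliar with emulation, the core idea can be sketched in a few lines: a Gaussian process is fitted to a modest number of expensive simulator runs and then queried at arbitrarily many new inputs. The sketch below is a minimal illustration only; the toy stand-in for the tsunami code, the three-dimensional input space and the magnitude range are assumptions for demonstration, not the paper's actual pipeline.

```python
# Minimal sketch of Gaussian-process emulation: train on a few hundred
# expensive simulator runs, then predict cheaply at a million new inputs.
# The stand-in simulator, input dimensions and ranges are illustrative only.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import ConstantKernel, Matern

rng = np.random.default_rng(0)

def expensive_simulator(X):
    # Placeholder for one tsunami-code run: maps (normalized longitude,
    # normalized latitude, Mw) to e.g. maximum wave height at one location.
    return np.sin(3.0 * X[:, 0]) * np.exp(X[:, 2] - 8.0) + 0.1 * X[:, 1]

# 300 training runs over the scenario space (Mw in [7, 9] is assumed)
X_train = rng.uniform(0.0, 1.0, size=(300, 3))
X_train[:, 2] = 7.0 + 2.0 * X_train[:, 2]
y_train = expensive_simulator(X_train)

kernel = ConstantKernel(1.0) * Matern(length_scale=[0.2, 0.2, 0.5], nu=2.5)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(X_train, y_train)

# 1 million cheap predictions, evaluated in chunks to limit memory use
X_new = rng.uniform(0.0, 1.0, size=(1_000_000, 3))
X_new[:, 2] = 7.0 + 2.0 * X_new[:, 2]
mean = np.concatenate([gp.predict(c) for c in np.array_split(X_new, 200)])
```

Predictive standard deviations are also available via `return_std=True`, which is what makes the probabilistic hazard maps possible.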
... High-accuracy, high-resolution computations are especially useful in tsunami modelling studies to assess inundation, damage to infrastructure and asset losses, but also for evacuation modelling. The parameter space dimension is also typically high, and the number of expensive numerical simulations needed to resolve the statistics of the output tends to be large (on the order of thousands for a well-approximated distribution; Salmanidou et al., 2017; Gopinathan et al., 2021) and hard to materialize, as it depends on the available resources, code architecture and other factors. ...
... We propose using a statistical surrogate approach, also called emulation, in which one approximates simulation outputs of interest as a function of the scenario parameter space. Such approaches have been implemented for uncertainty quantification of tsunami hazard in various settings (Sraj et al., 2014; Salmanidou et al., 2017, 2019; Guillas et al., 2018; Denamiel et al., 2019; Snelling et al., 2020; Gopinathan et al., 2021; Giles et al., 2021). ...
... This is done for the first time towards a realistic case using high-performance computing (HPC). One-shot random sampling for the training (as, for example, in Salmanidou et al., 2017; Gopinathan et al., 2021; Giles et al., 2021) lacks the information gain achieved by sequential design. Concretely, sequential design can reduce the computational cost by 50%, as demonstrated in Beck and Guillas (2016) for a set of toy problems, so applying this novel approach to a realistic case showcases real benefits in the case of high resolution with a complex parametrization of the source. ...
Article
The potential of a full-margin rupture along the Cascadia subduction zone poses a significant threat over a populous region of North America. Previous probabilistic tsunami hazard assessment studies produced hazard curves based on simulated predictions of tsunami waves, either at low resolution, or at high resolution for a local area, or under limited ranges of scenarios, or at a high computational cost to generate hundreds of scenarios at high resolution. We use the graphics processing unit (GPU)-accelerated tsunami simulator VOLNA-OP2 with a detailed representation of topographic and bathymetric features. We replace the simulator by a Gaussian process emulator at each output location to overcome the large computational burden. The emulators are statistical approximations of the simulator's behaviour. We train the emulators on a set of input–output pairs and use them to generate approximate output values over a six-dimensional scenario parameter space, e.g. uplift/subsidence ratio and maximum uplift, that represent the seabed deformation. We implement an advanced sequential design algorithm for the optimal selection of only 60 simulations. The low cost of emulation provides for additional flexibility in the shape of the deformation, which we illustrate here considering two families – buried rupture and splay-faulting – of 2000 potential scenarios. This approach allows for the first emulation-accelerated computation of probabilistic tsunami hazard in the region of the city of Victoria, British Columbia.
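The sequential design idea described in this abstract (the study uses the MICE algorithm) can be illustrated with the simpler maximum-predictive-variance criterion: start from a small design and repeatedly add the candidate scenario where the emulator is least certain. The toy simulator, the six-dimensional unit cube and the budget of 60 runs below are stand-ins for illustration, not the study's actual setup or criterion.

```python
# Hedged sketch of sequential design: grow a small design by adding the
# candidate point where the current emulator's predictive variance is
# largest (a simpler criterion than MICE, used here for illustration).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(4)

def simulator(x):
    # Stand-in for one expensive tsunami run over a 6-D scenario space
    return float(np.sin(x @ np.arange(1.0, 7.0)))

X = rng.uniform(size=(10, 6))                  # small initial design
y = np.array([simulator(x) for x in X])
candidates = rng.uniform(size=(5000, 6))       # pool of possible scenarios

for _ in range(50):                            # grow the design to 60 runs
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X, y)
    _, std = gp.predict(candidates, return_std=True)
    x_next = candidates[np.argmax(std)]        # most uncertain candidate
    X = np.vstack([X, x_next])
    y = np.append(y, simulator(x_next))
```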
... Most of the novel techniques in the field of PTHA are based on the notion of reducing the number of required computational runs with the aid of Gaussian process emulators, which are capable of maintaining good output accuracy and uncertainty quantification. The investigations of Gopinathan et al. [20] and Salmanidou et al. [21] are good examples of this approach, where the former delivered millions of output predictions based on 300 numerically simulated earthquake-tsunami scenarios, and the latter produced 2000 output predictions at each prescribed location, examining 60 full-fledged simulations. ...
... Even so, the expensive computational resources required for an accurate PTHA study are a well-known downside. Recent work aimed at circumventing this problem makes use of stochastic approximations, called emulators, built upon a pre-computed training set [20,21,39]. An emulator can be seen as an interpolating operator of the map that assigns to each input parameter array its corresponding desired output through a fully fledged simulation. ...
... The effectiveness of an emulator approach is closely related to the construction of the training set, which lies at its core. In [20], the epicenter location and moment magnitude were sampled using the LHS method to simulate 300 scenarios and retrieve the maximum water height and maximum current velocity at several locations, which in turn constitute the basis for building the training set. In [21], the authors sampled a seven-dimensional input space using the sequential design MICE algorithm to generate a training set of 60 simulated scenarios. ...
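The LHS (Latin hypercube sampling) design mentioned in this snippet spreads training points evenly across each input dimension. A minimal sketch using `scipy.stats.qmc` follows; the three-dimensional space and the bounds on longitude, latitude and magnitude are assumed values for illustration, not those of the cited studies.

```python
# Sketch of a Latin hypercube design for the training set: 300
# space-filling samples of epicentre location and moment magnitude.
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=3, seed=42)
unit_samples = sampler.random(n=300)               # points in [0, 1)^3

lower = [57.0, 24.0, 7.0]                          # lon, lat, Mw (assumed)
upper = [66.0, 26.0, 9.0]
scenarios = qmc.scale(unit_samples, lower, upper)  # 300 x 3 design matrix
```

Each of the 300 rows then defines one simulator run, and the resulting input–output pairs form the emulator's training set.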
Article
Simulation software has proven to be a crucial tool for tsunami hazard assessment studies. Understanding the potentially devastating effects of tsunamis leads to the development of safety and resilience measures, such as the design of evacuation plans or the planning of the economic investment necessary to quickly mitigate their consequences. This article introduces a pseudo-probabilistic seismic-triggered tsunami simulation approach to investigate the potential impact of tsunamis on the southwestern coast of Spain, in the provinces of Huelva and Cádiz. Selected faults, probabilistic distributions and sampling methods are presented, as well as some results for the nearly 900 Atlantic-origin tsunamis computed along the 250 km-long coast.
... Statistical emulation, let alone multi-fidelity emulation, in tsunami risk assessment is a relatively unexplored field, although Giles et al. [2021] and Gopinathan et al. [2021] have established an insightful framework in GP tsunami emulation, and alternative approaches, e.g., polynomial chaos, can be found, such as in [Giraldi et al., 2017]. More examples of GP emulation in tsunami simulation can be found in [Guillas et al., 2018, Gopinathan et al., 2020a, Salmanidou et al., 2019a, 2021a, Snelling et al., 2020]. The drawbacks of this prior research are that (i) the computational cost of generating training data can still be too expensive and (ii) most of the approaches lack flexible experimental design, although Salmanidou et al. ...
Conference Paper
Investigating uncertainties in computer simulations can be prohibitive in terms of computational costs, since the simulator needs to be run over a large number of input values. Building a statistical surrogate model of the simulator, using a small design of experiments, greatly alleviates the computational burden of such investigations. Nevertheless, this can still exceed the computational budget of many studies. We present a novel method that combines both approaches: multilevel adaptive sequential design of computer experiments (MLASCE) in the framework of Gaussian process (GP) emulators. MLASCE is based on two major approaches: efficient design of experiments, such as sequential designs, and the combination of training data of different degrees of sophistication in a so-called multi-fidelity method, or multilevel method in case these fidelities are ordered, typically by increasing resolution. This dual strategy allows us to allocate limited computational resources efficiently over simulations of different levels of fidelity and build the GP emulator. The allocation of computational resources is shown to be the solution of a simple optimization problem in a special case where we theoretically prove the validity of our approach. MLASCE is compared with other existing models of multi-fidelity Gaussian process emulation. Gains of orders of magnitude in accuracy for medium-size computing budgets are demonstrated in numerical examples. MLASCE should be useful in computer experiments of natural disaster risk, as more than a mere tool for calculating the scale of natural disasters. To show that MLASCE meets this expectation, we propose the first end-to-end example of a risk model for household asset loss due to a possible future tsunami. As a follow-up to this proposed framework, MLASCE provides a reliable statistical surrogate for realistic tsunami risk assessment under restricted computational resources and provides accurate and instant predictions of future tsunami risks.
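The multi-fidelity idea underlying this abstract can be illustrated with a simplified two-level scheme in the Kennedy–O'Hagan spirit: a cheap low-fidelity emulator is corrected by a GP fitted to a handful of expensive runs. This is only a sketch; MLASCE additionally optimizes the allocation of runs across fidelities, which is omitted here, and the toy models and design sizes below are assumptions.

```python
# Simplified two-fidelity sketch: emulate the cheap model, then model the
# high-fidelity output as rho * gp_low(x) + delta(x), with delta a GP.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(5)

def f_low(x):   # cheap, biased approximation (e.g. a coarse mesh)
    return np.sin(8.0 * x).ravel()

def f_high(x):  # expensive reference (e.g. a fine mesh)
    return (np.sin(8.0 * x) + 0.3 * x**2).ravel()

X_low, X_high = rng.uniform(size=(40, 1)), rng.uniform(size=(8, 1))

gp_low = GaussianProcessRegressor(kernel=RBF(0.1), normalize_y=True)
gp_low.fit(X_low, f_low(X_low))

# Estimate the scaling rho by regressing the few high-fidelity outputs
# on the low-fidelity emulator's predictions, then emulate the residual.
low_at_high = gp_low.predict(X_high)
rho = np.polyfit(low_at_high, f_high(X_high), 1)[0]
gp_delta = GaussianProcessRegressor(kernel=RBF(0.2), normalize_y=True)
gp_delta.fit(X_high, f_high(X_high) - rho * low_at_high)

def predict_high(X):
    # Multi-fidelity prediction: scaled low-fidelity emulator + correction
    return rho * gp_low.predict(X) + gp_delta.predict(X)
```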
Article
Seismic Probabilistic Tsunami Hazard Analysis (SPTHA) is aimed at estimating the annual rate of exceedance of an earthquake-induced tsunami wave at a certain location with reference to a predefined height threshold. The analysis relies on computationally demanding numerical simulations of seismically induced tsunami wave generation and propagation. A large number of scenarios needs to be simulated to account for uncertainties. However, the exceedance of the tsunami wave threshold height is a rare event, so most of the simulated scenarios bring little statistical contribution to the estimation of the annual rate yet increase the computational burden. To address this issue efficiently, we propose a wrapper-based heuristic approach to select the set of most relevant features of the seismic model, for deciding a priori the seismic scenarios to be simulated. The proposed approach is based on a Multi-Objective Differential Evolution Algorithm (MODEA) and is developed with reference to a case study whose objective is to calculate the annual rate of threshold exceedance of the height of tsunami waves caused by subduction earthquakes that might be generated on a section of the Hellenic Arc, and propagated to a set of target sites: Siracusa, on the eastern coast of Sicily, Crotone, on the southern coast of Calabria, and Santa Maria di Leuca, on the southern coast of Puglia. The results show that, in all cases, the proposed approach allows a 95% reduction in the number of scenarios with half of the features to be considered, and with no appreciable loss of accuracy.
Article
In comparison to the east coast, the tsunami hazard for the west coast of India remains under-recognized, despite the impact in 1945 following an Mw 8.1 earthquake in the Makran subduction zone in the northern Arabian Sea. The previous occurrences of tsunamis in the Arabian Sea that would have a bearing on the west coast of India are being debated, including the question of whether the Makran region has the potential to generate greater-magnitude earthquakes. With this as the backdrop, we present here the historical and geological evidence of a tsunami impact zone from a site on the Konkan Coast of western India. Located in the village of Kelshi, the impact zone is preserved within a coastal dune complex that also reveals occupation layers. This laterally extending 30-40-cm-thick zone, coinciding with a habitation level, displays varied sedimentary structures including scour-fill features, and is inter-layered with shells, at a height of ~3 m above the high-tide level. We attribute these sedimentary features to a tsunami flooding event that was contemporaneous with the transportation of shells, dated at 1508-1681 CE. The geological inference matches the description by the Portuguese fleets of a sea disturbance in 1524 CE, reported from Dabhol, not far from Kelshi, and also from the Gulf of Cambay, located about 500 km to the north. Precluding submarine landslide scenarios, the modeling results suggest that the high impact in Kelshi could have been generated by an Mw ≥ 9 earthquake sourced in the Makran subduction zone. It is, however, intriguing that an Mw ≥ 9 earthquake in the Makran region finds no mention in the historical documentation. We underscore the need for fresh efforts along the Makran coast to reconstruct the tsunami recurrence history, which would generate the validating constraints required on the 1524 event, if it was indeed generated by a massive earthquake among other mechanisms.
Article
The software package Volna-OP2 is a robust and efficient code capable of simulating the complete life cycle of a tsunami whilst harnessing the latest High Performance Computing (HPC) architectures. In this paper, a comprehensive error analysis and scalability study of the GPU version of the code is presented. A novel decomposition of the numerical errors into the dispersion and dissipation components is explored. Most tsunami codes exhibit amplitude smearing and/or phase lagging/leading, so the decomposition shown here is a new approach and novel tool for explaining these occurrences. To date, Volna-OP2 has been widely used by the tsunami modelling community. In particular its computational efficiency has allowed various sensitivity analyses and uncertainty quantification studies. Due to the number of simulations required, there is always a trade-off between accuracy and runtime when carrying out these statistical studies. The analysis presented in this paper will guide the user towards an acceptable level of accuracy within a given runtime.
Article
This paper documents the development of a multiple-Graphics Processing Unit (GPU) version of FUNWAVE-Total Variation Diminishing (TVD), an open-source model for solving the fully nonlinear Boussinesq wave equations using a high-order TVD solver. The numerical schemes of FUNWAVE-TVD, including Cartesian and spherical coordinates, are rewritten using CUDA Fortran, with inter-GPU communication facilitated by the Message Passing Interface. Since FUNWAVE-TVD involves the discretization of high-order dispersive derivatives, the on-chip shared memory is utilized to reduce global memory access. To further optimize performance, the batched tridiagonal solver is scheduled simultaneously in multiple GPU streams, which can reduce the GPU execution time by 20-30%. The GPU version is validated through a benchmark test for wave runup on a complex shoreline geometry, as well as a basin-scale tsunami simulation of the 2011 Tohoku-oki event. Efficiency evaluation shows that, in comparison with the CPU version running on a 36-core HPC node, speedup ratios of 4-7 and above 10 can be observed for single- and double-GPU runs, respectively. The performance metrics of the multiple-GPU implementation need to be further evaluated when appropriate.
Article
The lack of offshore seismic data has caused uncertainties in understanding the behavior of future tsunamigenic earthquakes in the Makran subduction zone (MSZ). Future tsunamigenic events in the MSZ may trigger significant near-field tsunamis. Tsunami wave heights in the near field are controlled by the heterogeneity of slip over the rupture area. Considering a non-planar geometry for the Makran subduction zone, a range of random k⁻² slip models were generated to hypothesize rupturing on the fault zone. We model tsunamis numerically and assess the probabilistic tsunami hazard in the near field for all synthetic scenarios. The main areas affected by tsunami waves are the stretch between Jask and Ormara along the shorelines of Iran and Pakistan, and the stretch between Muscat and Sur along the Oman coastline. The maximum peak wave height is about 16 m along the shores of Iran and Pakistan and about 12 m along the Oman shoreline. The slip distributions control the wave height along the Makran coastlines. The dependency of tsunami height on the heterogeneity of slip is higher in the most impacted areas, which are therefore more vulnerable to tsunami hazard than other areas.
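A random k⁻² slip model like those described here can be generated spectrally: draw random phases, impose an amplitude spectrum that decays as k⁻², and invert the FFT. The sketch below is a generic illustration of that construction on a flat rectangular grid; the grid size and slip rescaling are assumptions, and the cited study's non-planar geometry is not reproduced.

```python
# Hedged sketch of a random k^-2 slip distribution on a fault grid:
# random phases + k^-2 amplitude spectrum, then an inverse FFT.
import numpy as np

rng = np.random.default_rng(1)
nx, nz = 128, 64                       # along-strike, down-dip grid points
kx = np.fft.fftfreq(nx)[None, :]       # normalized wavenumbers
kz = np.fft.fftfreq(nz)[:, None]
k = np.sqrt(kx**2 + kz**2)
k[0, 0] = np.inf                       # zero out the (0,0) mode

amplitude = k**-2.0                    # target k^-2 spectral decay
phase = np.exp(2j * np.pi * rng.random((nz, nx)))

# Taking the real part is a common shortcut in place of enforcing
# Hermitian symmetry of the spectrum.
slip = np.fft.ifft2(amplitude * phase).real

slip -= slip.min()                     # shift to non-negative slip
slip *= 10.0 / slip.mean()             # rescale to a 10 m mean slip (assumed)
```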
Article
The September 2018 Mw 7.5 Sulawesi earthquake occurring on the Palu-Koro strike-slip fault system was followed by an unexpected localized tsunami. We show that direct earthquake-induced uplift and subsidence could have sourced the observed tsunami within Palu Bay. To this end, we use a physics-based, coupled earthquake–tsunami modeling framework tightly constrained by observations. The model combines rupture dynamics, seismic wave propagation, tsunami propagation and inundation. The earthquake scenario, featuring sustained supershear rupture propagation, matches key observed earthquake characteristics, including the moment magnitude, rupture duration, fault plane solution, teleseismic waveforms and inferred horizontal ground displacements. The remote stress regime reflecting regional transtension applied in the model produces a combination of up to 6 m left-lateral slip and up to 2 m normal slip on the straight fault segment dipping 65° East beneath Palu Bay. The time-dependent, 3D seafloor displacements are translated into bathymetry perturbations with a mean vertical offset of 1.5 m across the submarine fault segment. This sources a tsunami with wave amplitudes and periods that match those measured at the Pantoloan wave gauge and inundation that reproduces observations from field surveys. We conclude that a source related to earthquake displacements is probable and that landsliding may not have been the primary source of the tsunami. These results have important implications for submarine strike-slip fault systems worldwide. Physics-based modeling offers rapid response specifically in tectonic settings that are currently underrepresented in operational tsunami hazard assessment.
Article
The complexity of coseismic slip distributions influences the tsunami hazard posed by local and, to a certain extent, distant tsunami sources. Large slip concentrated in shallow patches was observed in recent tsunamigenic earthquakes, possibly due to dynamic amplification near the free surface, variable frictional conditions or other factors. We propose a method for incorporating enhanced shallow slip for subduction earthquakes while preventing systematic slip excess at shallow depths over one or more seismic cycles. The method uses the classic k⁻² stochastic slip distributions, augmented by shallow slip amplification. It is necessary for deep events with lower slip to occur more often than shallow ones with amplified slip to balance the long-term cumulative slip. We evaluate the impact of this approach on tsunami hazard in the central and eastern Mediterranean Sea, adopting a realistic 3D geometry for three subduction zones, by using it to model ~150,000 earthquakes with Mw from 6.0 to 9.0. We combine earthquake rates, depth-dependent slip distributions, tsunami modeling, and epistemic uncertainty through an ensemble modeling technique. We found that the mean hazard curves obtained with our method show enhanced probabilities for larger inundation heights as compared to the curves derived from depth-independent slip distributions. Our approach is completely general and can be applied to any subduction zone in the world.
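The hazard curves mentioned in this abstract combine per-scenario annual rates with modelled inundation heights. A minimal sketch of that exceedance computation follows, with synthetic rates and heights standing in for real ones, and assuming the standard homogeneous-Poisson conversion from annual exceedance rate to exceedance probability.

```python
# Minimal hazard-curve sketch: the annual rate of exceeding a height h at
# a site is the sum of the annual rates of all scenarios whose modelled
# maximum height exceeds h. Rates and heights below are placeholders.
import numpy as np

rng = np.random.default_rng(2)
annual_rates = rng.exponential(1e-4, size=150_000)              # per scenario
max_heights = rng.lognormal(mean=0.0, sigma=1.0, size=150_000)  # metres

thresholds = np.linspace(0.1, 10.0, 100)
exceedance_rate = np.array(
    [annual_rates[max_heights > h].sum() for h in thresholds]
)

# Poissonian conversion to a probability of exceedance in T years
T = 50.0
poe = 1.0 - np.exp(-exceedance_rate * T)
```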
Article
The Indus Canyon in the northwestern Indian Ocean has been reported to be the site of numerous submarine mass failures in the past. This study is the first to investigate potential tsunami hazards associated with such mass failures in this region. We employed statistical emulation, i.e. surrogate modelling, to efficiently quantify the uncertainties associated with slump-generated tsunamis at the slopes of the canyon. We simulated 60 slump scenarios with thicknesses of 100–300 m, widths of 6–10.5 km, travel distances of 500–2000 m and submergence depths of 250–450 m. These scenarios were then used to train the emulator and predict 500,000 trial scenarios in order to study the tsunami hazard probabilistically over the near field. Due to the narrow, deep canyon walls and the shallow continental shelf in the adjacent regions (<100 m water depth), the tsunami propagation has a unique pattern, as an ellipse stretched in the NE–SW direction. The results show that the most likely tsunami amplitudes and velocities are approximately 0.2–1.0 m and 2.5–13 m/s, respectively, which can potentially impact vessels and maritime facilities. We demonstrate that the emulator-based approach is an important tool for probabilistic hazard analysis, since it can generate thousands of tsunami scenarios in a few seconds, compared to days of computation on High Performance Computing facilities for a single run of the dispersive tsunami solver that we use here.
Article
A complete suite of closed analytical expressions is presented for the surface displacements, strains, and tilts due to inclined shear and tensile faults in a half-space for both point and finite rectangular sources. These expressions are particularly compact and free from field singular points which are inherent in the previously stated expressions of certain cases. The expressions derived here represent powerful tools not only for the analysis of static field changes associated with earthquake occurrence but also for the modeling of deformation fields arising from fluid-driven crack sources.
Article
Tsunamis are among the most destructive natural disasters, and probabilistic tsunami hazard assessment (PTHA) has become increasingly popular in both academic and engineering fields. However, most methods to carry out PTHA cannot avoid the large number of scenario simulations generated by a comprehensive assessment of the uncertainties of all seismic parameters. In order to balance accuracy and feasibility in PTHA, we propose a more efficient approach. Based on the linear assumption for tsunami waves in deep water, the computation of tsunami wave propagation is given by the linear superposition of waves caused by unit sources of water level disturbance, which transforms a large number of seismic tsunami scenario simulations into a limited number of simulations of wave propagation from each unit source. By placing 5,438 unit sources with Gaussian-shaped waveforms at intervals of 0.1° around the Manila Trench to approximate the initial water level disturbance under any combination of seismic parameters, we have evaluated 1,380,000 potential seismic scenarios in the Manila Trench area. The water level fluctuation at each target point along the 100 m isobath is calculated for every scenario to estimate whether the maximum height of the leading wave exceeds a specific critical value. Combining the probability density of each seismic parameter, the exceedance probabilities of specific wave heights and hazard curves at all target points are given.
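The linear superposition described here amounts to a Green's-function sum: the waveform produced by each unit source at each target point is computed once, and any scenario's waveform is then a weighted sum of those precomputed responses. The sketch below illustrates only the assembly step; the study uses 5,438 unit sources, while the much smaller array sizes and random "waveforms" here are placeholders so the example runs quickly.

```python
# Sketch of unit-source superposition: precompute each unit source's
# waveform at every target point, then assemble scenarios as weighted sums.
import numpy as np

n_sources, n_targets, n_times = 500, 20, 512
rng = np.random.default_rng(3)

# Precomputed once; in practice, one propagation run per unit source.
green = rng.standard_normal((n_sources, n_targets, n_times)).astype(np.float32)

def scenario_waveforms(weights):
    # weights[i] scales unit source i for one seismic scenario
    return np.tensordot(weights, green, axes=1)   # -> (n_targets, n_times)

weights = np.zeros(n_sources, dtype=np.float32)
weights[100:110] = 1.5                            # a localized disturbance
eta = scenario_waveforms(weights)
peak_heights = eta.max(axis=1)                    # leading-wave maxima per target
```

Because only the weights change between scenarios, evaluating over a million scenarios reduces to a million cheap matrix contractions rather than a million propagation simulations.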