Conference Paper

High Resolution Forward and Inverse Earthquake Modeling on Terascale Computers

Carnegie Mellon University, Pittsburgh, Pennsylvania
Conference: Proceedings of the ACM/IEEE SC2003 Conference on High Performance Networking and Computing, 15-21 November 2003, Phoenix, AZ, USA, CD-ROM
Source: IEEE Xplore

ABSTRACT: For earthquake simulations to play an important role in the reduction of seismic risk, they must be capable of high resolution and high fidelity. We have developed algorithms and tools for earthquake simulation based on multiresolution hexahedral meshes. We have used this capability to carry out 1 Hz simulations of the 1994 Northridge earthquake in the LA Basin using 100 million grid points. Our wave propagation solver sustains 1.21 teraflop/s for 4 hours on 3000 AlphaServer processors at 80% parallel efficiency. Because of uncertainties in characterizing earthquake source and basin material properties, a critical remaining challenge is to invert for source and material parameter fields for complex 3D basins from records of past earthquakes. Towards this end, we present results for material and source inversion of high-resolution models of basins undergoing antiplane motion using parallel scalable inversion algorithms that overcome many of the difficulties particular to inverse heterogeneous wave propagation problems.
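
As a point of reference for the forward problem described above, the short Python sketch below time-steps a 1D antiplane (SH) wave equation through a heterogeneous shear-wave-speed model with an explicit central-difference scheme. It is purely illustrative: the grid, material model, Ricker source, and boundary treatment are hypothetical stand-ins, not the paper's 3D octree-based hexahedral solver or its Northridge basin model.

import numpy as np

# Minimal 1D antiplane (SH) wave-propagation sketch -- illustrative only.
nx, dx = 2000, 10.0                          # grid points and spacing [m]
rho = np.full(nx, 2000.0)                    # density [kg/m^3]
x = np.arange(nx) * dx
vs = np.where(x < 4000.0, 800.0, 2500.0)     # slow "basin" over stiffer rock [m/s]
mu = rho * vs**2                             # shear modulus [Pa]

dt = 0.5 * dx / vs.max()                     # respect the explicit (CFL) stability limit
nt = 3000
u_old = np.zeros(nx)                         # displacement at t - dt
u = np.zeros(nx)                             # displacement at t
src = nx // 4                                # hypothetical source location
f0, t0 = 1.0, 1.2                            # Ricker wavelet: 1 Hz, delayed onset

def ricker(t):
    a = (np.pi * f0 * (t - t0))**2
    return (1.0 - 2.0 * a) * np.exp(-a)

for it in range(nt):
    # Spatial operator d/dx( mu du/dx ), with mu averaged at cell faces.
    mu_face = 0.5 * (mu[1:] + mu[:-1])
    flux = mu_face * np.diff(u) / dx
    lap = np.zeros(nx)
    lap[1:-1] = np.diff(flux) / dx

    # Body-force source and explicit central-difference update in time.
    f = np.zeros(nx)
    f[src] = ricker(it * dt)
    u_new = 2.0 * u - u_old + dt**2 / rho * (lap + f)
    u_new[0] = u_new[-1] = 0.0               # crude fixed ends; no absorbing boundary
    u_old, u = u, u_new

print("peak displacement:", float(np.abs(u).max()))

The same semi-discrete structure, mass and stiffness operators advanced explicitly under a CFL limit, is what a 3D hexahedral solver scales up, and it is the forward map that a gradient-based source and material inversion would repeatedly evaluate at basin scale.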



    • "Once detailed 3D models of the region of interest have been constructed, as in Figs. 1 and 2 for the Los Angeles and Taipei metropolitan areas, numerical simulations of seismic wave propagation may be performed based upon various numerical methods. Studies of this kind have been conducted based upon finite-difference techniques (e.g., Wald and Graves, 1998; Olsen, 2000; Lee et al., 2007a), finite-element methods (e.g., Bao et al., 1998; Akcelik et al., 2003), and spectral-element methods (e.g., Komatitsch and Tromp, 1999; Komatitsch et al., 2004; Lee et al., 2007b). These simulations generally involve hundreds of millions of integrations points, tens of gigabytes of distributed memory, and are therefore typically performed on parallel computers based upon message-passing techniques (e.g., Gropp et al., 1996). "
    ABSTRACT: We provide an overview of some of the issues that need to be considered in the context of quantitative seismic hazard assessment. To begin with, one needs to inventory and characterize the major faults that could produce earthquakes that would impact the region of interest. Next, one needs a seismographic network that continually records ground motion throughout the region. Data from this network may be used to assess and locate seismicity, to calibrate ground motion simulations, and to conduct seismic early-warning experiments. To assess the response of engineered structures to strong ground motion, seismographs should also be installed at various locations within such engineered structures, e.g., on bridges, overpasses, dams and in tall buildings. The ultimate goal would be to perform 'end-to-end' simulations, starting with the rupture on an earthquake fault, followed by the propagation of the resulting seismic waves from the fault to an engineered structure of interest, and concluding with an assessment of the response of this structure to the imposed ground motion. To facilitate accurate ground motion and end-to-end simulations, one needs to construct a detailed three-dimensional (3D) seismic model of the region of interest. In particular, one needs to assess the slowest shear-wave speeds within the sediments underlying the metropolitan area. Geological information, and, in particular, seismic reflection and refraction surveys are critical in this regard. In the context of end-to-end simulations, detailed numerical models of engineered structures of interest need to be constructed as well. Data recorded by the seismographic network and in engineered structures after small to moderate earthquakes may be used to assess and calibrate the seismic and engineering models based upon numerical simulations. Once the seismic and engineering models produce synthetic ground motion that matches the observed ground motion reasonably well, one can perform simulations of hypothetical large earthquakes to assess anticipated strong ground motion and potential damage. Throughout this article we will use the Los Angeles and Taipei metropolitan areas as examples of how to approach quantitative seismic hazard assessment.
    Journal of Earthquake and Tsunami 01/2012; 01(02). DOI:10.1142/S1793431107000079
    • "In this sense they strike a balance between structure and flexibility, enabling arbitrary-order accuracy (through high-order hexahedral finite/spectral elements along with enforcement of hanging node constraints ) and implicitly providing good element quality and a recursive space-filling curve ordering that enables lightweight load balancing. Parallel octree-based algorithms have recently demonstrated high scalability and low computational overhead [6], [28]–[33]. However, a single octree can encode only a topologically cubeshaped domain, and thus is unsuitable for representing general geometries. "
    Conference Paper: Extreme-scale AMR
    ABSTRACT: Many problems are characterized by dynamics occurring on a wide range of length and time scales. One approach to overcoming the tyranny of scales is adaptive mesh refinement/coarsening (AMR), which dynamically adapts the mesh to resolve features of interest. However, the benefits of AMR are difficult to achieve in practice, particularly on the petascale computers that are essential for difficult problems. Due to the complex dynamic data structures and frequent load balancing, scaling dynamic AMR to hundreds of thousands of cores has long been considered a challenge. Another difficulty is extending parallel AMR techniques to high-order-accurate, complex-geometry-respecting methods that are favored for many classes of problems. Here we present new parallel algorithms for parallel dynamic AMR on forest-of-octrees geometries with arbitrary-order continuous and discontinuous finite/spectral element discretizations. The implementations of these algorithms exhibit excellent weak and strong scaling to over 224,000 Cray XT5 cores for multiscale geophysics problems.
    Conference on High Performance Computing Networking, Storage and Analysis, SC 2010, New Orleans, LA, USA, November 13-19, 2010; 11/2010
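
The snippet quoted above credits much of the octree approach's low overhead to a recursive space-filling-curve ordering of elements. A toy Python sketch of that idea is given below, using Morton (Z-order) keys to order leaves and then cutting the curve into contiguous equal-sized chunks per rank; it is illustrative only, not the production forest-of-octrees code.

def morton_key(x, y, z, bits=10):
    # Interleave the bits of integer octant coordinates into a single key.
    key = 0
    for b in range(bits):
        key |= ((x >> b) & 1) << (3 * b)
        key |= ((y >> b) & 1) << (3 * b + 1)
        key |= ((z >> b) & 1) << (3 * b + 2)
    return key

def partition(leaves, nranks):
    # Sort leaves along the curve, then cut into contiguous equal-sized chunks.
    ordered = sorted(leaves, key=lambda c: morton_key(*c))
    n = len(ordered)
    return [ordered[r * n // nranks:(r + 1) * n // nranks] for r in range(nranks)]

# Hypothetical usage: an 8 x 8 x 8 block of uniform leaves split across 4 ranks.
leaves = [(i, j, k) for i in range(8) for j in range(8) for k in range(8)]
print([len(p) for p in partition(leaves, 4)])   # -> [128, 128, 128, 128]

Real forest-of-octrees libraries additionally encode the refinement level in each key and can balance by element weight rather than count, but the contiguous-chunks-along-a-curve idea is the same.
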
    • "From this point of view, FEMs are appealing, since they can use unstructured grids or meshes. Due to ever-increasing computational power, these kinds of methods have been the focus of a lot of interest and have been used intensively in seismology (Aagaard et al. 2001; Akcelik et al. 2003; Ichimura et al. 2007). "
    ABSTRACT: We present a discontinuous Galerkin finite-element method (DG-FEM) formulation with Convolutional Perfectly Matched Layer (CPML) absorbing boundary condition for 3-D elastic seismic wave modelling. This method makes use of unstructured tetrahedral meshes locally refined according to the medium properties (h-adaptivity), and of approximation orders that can change from one element to another according to an adequate criterion (p-adaptivity). These two features allow us to significantly reduce the computational cost of the simulations. Moreover, we have designed an efficient CPML absorbing boundary condition, both in terms of absorption and computational cost, by combining approximation orders in the numerical domain. A quadratic interpolation is typically used in the medium to obtain the required accuracy, while lower approximation orders are used in the CPMLs to reduce the total computational cost and to obtain a well-balanced workload over the processors. While the efficiency of DG-FEMs has been largely demonstrated for high approximation orders, we favour the use of low approximation orders as they are more appropriate to the applications we are interested in. In particular, we address the issues of seismic modelling and seismic imaging in cases of complex geological structures that require a fine discretization of the medium. We illustrate the efficiency of our approach within the framework of the EUROSEISTEST verification and validation project, which is designed to compare high-frequency (up to 4 Hz) numerical predictions of ground motion in the Volvi basin (Greece). Through the tetrahedral meshing, we have achieved fine discretization of the basin, which appears to be a sine qua non condition for accurate computation of surface waves diffracted at the basin edges. We compare our results with predictions computed with the spectral element method (SEM), and demonstrate that our method yields the same level of accuracy with computation times of the same order of magnitude.
    Geophysical Journal International 11/2010; 183(2). DOI:10.1111/J.1365-246X.2010.04764.X
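
The abstract above relies on a CPML absorbing boundary to truncate the computational domain cheaply. As a hedged illustration of the common ingredient behind PML/CPML layers, the Python sketch below evaluates the classical polynomial damping profile, with its amplitude set from a target reflection coefficient; the parameter values are hypothetical and not taken from the cited implementation.

import numpy as np

def pml_damping(s, L, vp, N=2, Rc=1e-3):
    # s: distance into the layer from its inner edge, L: layer thickness,
    # vp: wave speed, N: polynomial order, Rc: target reflection coefficient.
    d0 = -(N + 1) * vp * np.log(Rc) / (2.0 * L)
    return d0 * (np.clip(s, 0.0, L) / L) ** N

# Hypothetical usage: damping ramp across a 500 m thick layer, vp = 3000 m/s.
s = np.linspace(0.0, 500.0, 6)
print(pml_damping(s, L=500.0, vp=3000.0))

In a CPML the damping enters through memory variables updated by a recursive convolution rather than through field splitting, but choosing a smooth profile of this kind is the same basic step.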