Article

A simulation-based approach to forecasting the next great San Francisco earthquake.

Center for Computational Science and Engineering, and Department of Geology, University of California-Davis, Davis, CA 95616, USA.
Proceedings of the National Academy of Sciences 11/2005; 102(43):15363-7. DOI: 10.1073/pnas.0507528102
Source: PubMed

ABSTRACT: In 1906 the great San Francisco earthquake and fire destroyed much of the city. As we approach the 100th anniversary of that event, a critical concern is the hazard posed by another such earthquake. In this article, we examine the assumptions presently used to compute the probability of occurrence of these earthquakes. We also present the results of a numerical simulation of interacting faults on the San Andreas system. Called Virtual California, this simulation can be used to compute the times, locations, and magnitudes of simulated earthquakes on the San Andreas fault in the vicinity of San Francisco. Of particular importance are results for the statistical distribution of recurrence times between great earthquakes, results that are difficult or impossible to obtain from a purely field-based approach.
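The key quantity named here, the statistical distribution of recurrence times between great earthquakes, lends itself to a short worked example. The sketch below is not the paper's Virtual California code; it assumes a synthetic set of simulated recurrence intervals and a Weibull fit (a common choice for recurrence statistics) to show how such a distribution yields a conditional 30-year probability given the time elapsed since 1906.

```python
# A minimal sketch, assuming synthetic recurrence intervals in place of
# actual Virtual California output. Requires numpy and scipy.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Placeholder for simulated recurrence times between great earthquakes (years).
recurrence_times = rng.weibull(2.0, size=1000) * 200.0

# Fit a two-parameter Weibull distribution (location fixed at zero).
shape, _, scale = stats.weibull_min.fit(recurrence_times, floc=0.0)

def conditional_prob(t_elapsed, dt, shape, scale):
    """P(event within dt years | no event in the last t_elapsed years)."""
    sf = stats.weibull_min.sf  # survival function S(t) = 1 - CDF(t)
    return 1.0 - sf(t_elapsed + dt, shape, scale=scale) / sf(t_elapsed, shape, scale=scale)

# Example: roughly a century since 1906, with a 30-year forecast window.
print(conditional_prob(100.0, 30.0, shape, scale))
```

The point of the simulation-based approach is precisely that it supplies enough recurrence intervals to constrain such a fit, which a field-based record of only a handful of great events cannot.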

Related publications:

  • ABSTRACT: We discuss the long-standing question of whether the probability of large earthquake occurrence (magnitudes m > 6.0) is highest during periods of smaller-event activation or during periods of smaller-event quiescence. The physics of the activation model is based on an idea from the theory of nucleation: that a small-magnitude earthquake has a finite probability of growing into a large earthquake. The physics of the quiescence model is based on the idea that the rate of smaller earthquakes (here, magnitudes m > 3.5) may be governed by a mechanism such as critical slowing down, in which fluctuations in systems with long-range interactions tend to be suppressed prior to large nucleation events. To illuminate this question, we construct two end-member forecast models illustrating, respectively, activation and quiescence. The activation model assumes only that activation can occur, whether via aftershock nucleation or triggering, and expresses no preference between the two mechanisms. Both models are, in effect, means of filtering the seismicity time series to compute probabilities; a minimal sketch of such a filter follows this entry. Using 25 yr of data from the California-Nevada earthquake catalogue, we show that of the two models, quiescence appears to be the better one as judged by backtesting, although by a slight and not statistically significant margin. We then examine simulation data from Virtual California, a topologically realistic earthquake model for California seismicity. This model includes not only earthquakes produced by increases in stress on the fault system but also background and off-fault seismicity produced by a BASS-ETAS driving mechanism. Applying the activation and quiescence forecast models to the simulated data, we come to the opposite conclusion: here the activation forecast model is preferred, presumably because the BASS component is essentially a model of activated seismicity. These results lead to the (weak) conclusion that California seismicity may be characterized more by quiescence than by activation, and that BASS-ETAS models may not be robustly applicable to the real data.
    Geophysical Journal International 01/2011; 187:225-236.
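As flagged in the entry above, both forecast models reduce to filtering the small-event time series into a probability. A minimal sketch, assuming a trailing-window rate estimate and an illustrative normalization (neither taken from the paper):

```python
# A minimal sketch of rate-filter forecasts: estimate the recent rate of
# small (m > 3.5) events, then map it to a large-event probability either
# directly (activation) or inverted (quiescence). The window length and
# rate_scale normalization are illustrative assumptions.
import numpy as np

def trailing_rate(event_times, t_now, window=5.0):
    """Small-event rate (events per year) over the last `window` years."""
    times = np.asarray(event_times)
    n = np.count_nonzero((times > t_now - window) & (times <= t_now))
    return n / window

def activation_probability(rate, rate_scale):
    """Activation: large-event probability is highest when small events are active."""
    return min(rate / rate_scale, 1.0)

def quiescence_probability(rate, rate_scale):
    """Quiescence: large-event probability is highest when small events shut off."""
    return 1.0 - min(rate / rate_scale, 1.0)

# Example with synthetic event times (years since the start of a catalogue).
events = [1.2, 3.7, 4.1, 8.8, 9.0, 9.3]
r = trailing_rate(events, t_now=10.0)
print(activation_probability(r, rate_scale=2.0), quiescence_probability(r, rate_scale=2.0))
```

Backtesting then amounts to scoring these probabilities against which windows actually contained m > 6.0 events.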
  • ABSTRACT: We are using the QuakeSim environment to model interacting fault systems. One goal of QuakeSim is to prepare for the large volumes of data that spaceborne missions such as DESDynI will produce. QuakeSim can ingest distributed, heterogeneous data in the form of InSAR, GPS, seismicity, and fault data into various earthquake modeling applications, automating the analysis where possible. Virtual California simulates interacting faults in California; output from long time-history Virtual California runs can be compared with the current state of strain and the strain history in California. In addition to spaceborne data, we will begin assimilating data from UAVSAR airborne flights over the San Francisco Bay Area, the Transverse Ranges, and the Salton Trough. Results of the models are important for understanding future earthquake risk and for providing decision support following earthquakes. Improved models require this sensor web of different data sources and a modeling environment for understanding the combined data.
    2009 IEEE Aerospace Conference; 04/2009
  • ABSTRACT: We identify two distinct scaling regimes in the frequency-magnitude distribution of global earthquakes. Specifically, we measure the scaling exponent b = 1.0 for "small" earthquakes with 5.5 ≤ m ≤ 7.6 and b = 1.5 for "large" earthquakes with 7.6 ≤ m ≤ 9.0. This transition at m_t = 7.6 can be explained by geometric constraints on the rupture. In conjunction with supporting literature, this corroborates theories in favor of fully self-similar, magnitude-independent earthquake physics. We also show that the scaling behavior and the abrupt transition between the scaling regimes imply that earthquake ruptures have compact shapes and smooth rupture fronts. (A sketch of the b-value measurement follows the citation below.)
    Tectonophysics 01/2012; 532-535:167-174.
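The two b-values in the entry above can be measured with the standard Aki (1965) maximum-likelihood estimator, b = log10(e) / (mean(m) - m_min). A minimal sketch, assuming a synthetic catalogue in place of the study's global data:

```python
# A minimal sketch of the two-regime b-value measurement. The catalogue is
# synthetic (true b = 1 everywhere), standing in for the global data;
# m_t = 7.6 is the transition magnitude reported in the abstract.
import numpy as np

def b_value(mags, m_min):
    """Aki (1965) maximum-likelihood estimator: b = log10(e) / (mean(m) - m_min)."""
    m = np.asarray(mags)
    m = m[m >= m_min]
    return np.log10(np.e) / (m.mean() - m_min)

rng = np.random.default_rng(1)
# Gutenberg-Richter magnitudes above m = 5.5 with true b = 1.
mags = 5.5 + rng.exponential(scale=1.0 / np.log(10), size=50_000)

m_t = 7.6
b_small = b_value(mags[mags < m_t], m_min=5.5)  # slightly biased by the cut at m_t
b_large = b_value(mags, m_min=m_t)
print(f"b for 5.5 <= m < {m_t}: {b_small:.2f}; b for m >= {m_t}: {b_large:.2f}")
```

On the real catalogue, the estimate above m_t would come out near 1.5 rather than 1.0, which is the transition the abstract reports.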
