A simulation-based approach to forecasting the next great San Francisco earthquake.

Center for Computational Science and Engineering, and Department of Geology, University of California-Davis, Davis, CA 95616, USA.
Proceedings of the National Academy of Sciences 11/2005; 102(43):15363-7. DOI: 10.1073/pnas.0507528102
Source: PubMed

ABSTRACT: In 1906 the great San Francisco earthquake and fire destroyed much of the city. As we approach the 100-year anniversary of that event, a critical concern is the hazard posed by another such earthquake. In this article, we examine the assumptions presently used to compute the probability of occurrence of these earthquakes. We also present the results of a numerical simulation of interacting faults on the San Andreas system. Called Virtual California, this simulation can be used to compute the times, locations, and magnitudes of simulated earthquakes on the San Andreas fault in the vicinity of San Francisco. Of particular importance are results for the statistical distribution of recurrence times between great earthquakes, results that are difficult or impossible to obtain from a purely field-based approach.
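A minimal illustration of the kind of statistic such a simulation yields: given a catalogue of simulated great-earthquake times on a fault segment, the recurrence intervals and their variability can be summarised directly. The event years below are invented for illustration, not Virtual California output.

```python
import statistics

# Hypothetical simulated occurrence years of great earthquakes on one
# fault segment (invented values, not Virtual California output).
event_years = [1685, 1838, 1906, 2012, 2151, 2260, 2402]
intervals = [b - a for a, b in zip(event_years, event_years[1:])]

mean = statistics.mean(intervals)
# Coefficient of variation: cv << 1 suggests quasi-periodic recurrence,
# cv near 1 is consistent with Poisson (memoryless) occurrence.
cv = statistics.stdev(intervals) / mean
print(f"mean recurrence: {mean:.1f} yr, cv: {cv:.2f}")
```

A low coefficient of variation would favour time-dependent (renewal) probability models over the Poisson assumption often used in hazard calculations.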

    ABSTRACT: Landslide area probability density function (PDF) statistics elucidate the landslide magnitude–frequency distribution in a small coastal watershed in central California. Detailed mapping into a GIS and compilation of two slide inventories in Walker Creek, Marin County, reflect the erosional effects of storms occurring in 1941 and 1998, respectively. We focus on the spatial distribution and area of slides to illustrate that during both periods slides originate mainly from hillslopes underlain by Franciscan Complex units with slopes ranging from ~10% to 40%. The majority of slides (~70%) have areas between 100 and 1000 m². The magnitude–frequency distribution of slides represented by the PDF produces a curve whose linear portion represents the relatively large slides. The power functions of the linear portions of these curves have exponents of about −2.0 for both inventories. Results suggest that the storm event-based PDF of landslide area developed using historical aerial images is a viable means of quantifying the landslide magnitude–frequency relationship at the scale of a small watershed.
    Catena 10/2013; 109:129–138.
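The power-law tail described above can be sketched numerically: draw a synthetic sample of landslide areas with exponent −2, bin it on a logarithmic scale, and recover the exponent from the linear portion of the log-log PDF. The sample itself is invented; only the exponent of about −2.0 comes from the abstract.

```python
import numpy as np

# Synthetic landslide areas from a Pareto distribution with p(A) ~ A**-2
# above a_min, mimicking the tail reported for the Walker Creek inventories.
rng = np.random.default_rng(0)
a_min = 100.0                               # smallest area (m^2)
areas = a_min / (1.0 - rng.random(10_000))  # inverse-CDF sampling

# Probability density on logarithmic bins (counts normalised by bin width).
edges = np.logspace(2, 5, 25)
counts, _ = np.histogram(areas, bins=edges)
widths = np.diff(edges)
centers = np.sqrt(edges[:-1] * edges[1:])   # geometric bin centres
pdf = counts / (counts.sum() * widths)

# Fit the linear (log-log) portion representing the larger slides.
mask = pdf > 0
slope, _ = np.polyfit(np.log10(centers[mask]), np.log10(pdf[mask]), 1)
print(f"fitted power-law exponent: {slope:.2f}")
```

With a few thousand mapped or synthetic slides, the fitted slope recovers the underlying exponent to within the scatter of the sparsely populated large-area bins.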
    ABSTRACT: Numerical simulations are routinely used for weather forecasting, and it is clearly desirable to develop analogous simulation models for regional seismicity. One model developed for this purpose is the Virtual California (VC) simulation. To better understand the behaviour of seismicity simulations, we apply VC to three relatively simple problems involving a straight strike-slip fault. In problem I, we divide the fault into two segments with different mean earthquake interval times. In problem II, we add a strong central (asperity) segment, and in problem III we change this to a weak central segment. In all cases we observe limit-cycle behaviour with a wide range of periods. We also show that the historical sequence of 13 great earthquakes along the Nankai Trough, Japan, exhibits limit-cycle behaviour very similar to our asperity model.
    Geophysical Journal International 02/2010; 180(2):734-742.
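The segmented-fault experiments can be caricatured with a toy stress-threshold model: two segments of different strength load at a common rate, and each failure transfers part of its stress drop to the neighbour. This sketch is loosely inspired by the asperity experiments described above; it is not the VC model itself, and all numbers (loading rate, thresholds, transfer fraction) are invented.

```python
# Toy two-segment strike-slip fault with stress interaction.
rate = 1.0                  # tectonic loading rate (stress units / yr)
thresholds = [10.0, 15.0]   # weak segment and strong (asperity-like) segment
transfer = 0.3              # fraction of stress drop passed to the neighbour
stress = [0.0, 0.0]
events = []                 # (time, segment) failure history

t = 0.0
for _ in range(200):
    # advance time to the first segment that reaches its threshold
    waits = [max(0.0, (thresholds[k] - stress[k]) / rate) for k in range(2)]
    seg = waits.index(min(waits))
    t += waits[seg]
    stress = [s + rate * waits[seg] for s in stress]
    drop = stress[seg]
    stress[seg] = 0.0
    stress[1 - seg] += transfer * drop   # interaction between segments
    events.append((t, seg))

# Recurrence intervals per segment: after a transient, both segments settle
# into a repeating (limit-cycle) pattern sharing a common super-period.
intervals = {0: [], 1: []}
last = {}
for time, seg in events:
    if seg in last:
        intervals[seg].append(time - last[seg])
    last[seg] = time
```

Even this caricature reproduces the qualitative VC result: neither segment recurs at a single fixed interval, but the coupled system locks into a limit cycle (here the strong segment alternates between shorter and longer intervals).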
    ABSTRACT: We discuss the long-standing question of whether the probability for large earthquake occurrence (magnitudes m > 6.0) is highest during time periods of smaller event activation, or highest during time periods of smaller event quiescence. The physics of the activation model are based on an idea from the theory of nucleation: that a small magnitude earthquake has a finite probability of growing into a large earthquake. The physics of the quiescence model are based on the idea that the occurrence of smaller earthquakes (here considered as magnitudes m > 3.5) may be due to a mechanism such as critical slowing down, in which fluctuations in systems with long-range interactions tend to be suppressed prior to large nucleation events. To illuminate this question, we construct two end-member forecast models illustrating, respectively, activation and quiescence. The activation model assumes only that activation can occur, whether via aftershock nucleation or triggering, and expresses no preference between the two mechanisms. Both models are, in effect, filters applied to the seismicity time-series to compute probabilities. Using 25 yr of data from the California-Nevada catalogue of earthquakes, we show that of the two models, activation and quiescence, the latter appears to be the better model, as judged by backtesting (by a slight but not significant margin). We then examine simulation data from a topologically realistic earthquake model for California seismicity, Virtual California. This model includes not only earthquakes produced from increases in stress on the fault system, but also background and off-fault seismicity produced by a BASS-ETAS driving mechanism. Applying the activation and quiescence forecast models to the simulated data, we come to the opposite conclusion: here the activation forecast model is preferred to the quiescence model, presumably because the BASS component of the model is essentially a model for activated seismicity. These results lead to the (weak) conclusion that California seismicity may be characterized more by quiescence than by activation, and that BASS-ETAS models may not be robustly applicable to the real data.
    Geophysical Journal International 10/2011; 187:225-236.
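The two end-member filters have a simple structure: compute the recent small-event rate in a sliding window, then score large-event hazard as high when that rate is elevated (activation) or suppressed (quiescence). The monthly counts and window length below are invented for illustration; this is a sketch of the filtering idea, not the authors' forecast models.

```python
# Toy activation vs quiescence filters over a small-event count series.
counts = [4, 6, 5, 9, 12, 3, 2, 1, 2, 8, 11, 14, 5, 4]  # m > 3.5 events/month
window = 3                                  # sliding-window length (months)
baseline = sum(counts) / len(counts)        # long-term mean rate

activation, quiescence = [], []
for i in range(window, len(counts) + 1):
    r = sum(counts[i - window:i]) / window  # recent small-event rate
    # Activation: elevated rate -> elevated relative hazard score.
    activation.append(r / baseline)
    # Quiescence: suppressed rate -> elevated relative hazard score.
    quiescence.append(baseline / r)
```

By construction the two scores peak at opposite ends of the activity record, which is why backtesting against held-out large events can discriminate between them.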

