
# A simulation-based approach to forecasting the next great San Francisco earthquake

Center for Computational Science and Engineering, and Department of Geology, University of California-Davis, Davis, CA 95616, USA.

Proceedings of the National Academy of Sciences 11/2005; 102(43):15363-7. DOI: 10.1073/pnas.0507528102. Source: PubMed


**ABSTRACT:** Landslide area probability density function (PDF) statistics elucidate the landslide magnitude–frequency distribution in a small coastal watershed in central California. Detailed mapping into a GIS and compilation of two slide inventories in Walker Creek, Marin County, reflect the erosional effects of storms occurring in 1941 and 1998, respectively. We focus on the spatial distribution and area of slides to illustrate that during both periods slides originate mainly from hillslopes underlain by Franciscan Complex units with slopes ranging from ~10 to 40%. The majority of slides (~70%) have areas between 100 and 1000 m². The magnitude–frequency distribution of slides represented by the PDF produces a curve with a linear portion representing the relatively large slides. The power functions of this linear portion have exponents of about −2.0 for both inventories. Results suggest that the storm event-based PDF of landslide area developed using historical aerial images is a viable means of quantifying the landslide magnitude–frequency relationship at the scale of a small watershed.

Catena 10/2013; 109:129–138.
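The power-law tail described above (PDF falling off roughly as area to the −2 for the larger slides) can be estimated from an area inventory by fitting the linear portion of the log–log PDF. The sketch below is a minimal illustration in Python/NumPy; the synthetic areas, bin counts, and thresholds are illustrative assumptions, not data or methods from the paper:

```python
import numpy as np

def landslide_pdf_exponent(areas, a_min=100.0, n_bins=25, min_count=20):
    """Fit the slope of the linear (log-log) portion of the landslide
    area PDF above a_min; the abstract reports exponents near -2.0."""
    areas = np.asarray(areas, dtype=float)
    areas = areas[areas >= a_min]
    bins = np.logspace(np.log10(a_min), np.log10(areas.max()), n_bins)
    counts, edges = np.histogram(areas, bins=bins)
    density, _ = np.histogram(areas, bins=bins, density=True)
    centers = np.sqrt(edges[:-1] * edges[1:])      # geometric bin centers
    keep = counts >= min_count                     # drop sparse, noisy bins
    slope, _ = np.polyfit(np.log10(centers[keep]),
                          np.log10(density[keep]), 1)
    return slope

# Synthetic inventory: areas drawn from a Pareto tail with PDF ~ A^-2
rng = np.random.default_rng(42)
areas = 100.0 * (1.0 + rng.pareto(1.0, size=50_000))  # m^2, illustrative
print(round(landslide_pdf_exponent(areas), 2))        # close to -2
```

Filtering out under-populated bins before the least-squares fit keeps Poisson noise in the far tail from biasing the slope estimate.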
##### Article: A feasibility study of data assimilation in numerical simulations of earthquake fault systems


**ABSTRACT:** Topologically realistic simulations of earthquake fault systems have been constructed to understand the physics of interacting earthquake fault systems. We focus on one of these models, a simulation called Virtual California that represents the strike-slip fault system in California. In weather forecasting, current and past observations are routinely extrapolated forward to forecast future weather. The question addressed in this paper is whether a similar application of numerical simulations can be used in earthquake forecasting. Present simulation models are discussed and their ability to successfully generate earthquake recurrence statistics is demonstrated. An important question relates to how paleoseismic data can be used to constrain simulations, and whether these constrained simulations provide improved forecasts of future earthquakes. Here, we show first results from a consideration of these issues using a method of "data scoring". The data are divided into "training intervals" and "testing intervals". In the training intervals, the time history of paleoseismic data is used to evaluate space–time windows of simulations. Earthquakes following high-scoring space–time windows in the simulations are then used as a basis for developing waiting-time statistics, which in turn are used to forecast data in the testing intervals. In our present method, we focus on the problem of determining the timing of future earthquakes having magnitude m > 7. Our preliminary conclusion is that the amount of paleoseismic data currently available does not as yet improve the waiting-time statistics to a level significantly beyond a random (temporal) predictor. However, this conclusion is based on a set of studies that are not extensive, so further investigations may well reveal important new avenues. In particular, it may be that the true value of this approach lies in defining the probable spatial locations of future earthquakes, rather than their timing.

Physics of the Earth and Planetary Interiors 08/2007; 163(s 1–4):149–162.
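The scoring-plus-waiting-time machinery described above can be sketched in two pieces: a window score counting how many paleoseismic event times are matched by simulated events within some tolerance, and an empirical conditional waiting-time probability for the next large event. All function names, tolerances, and event times below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def window_score(sim_times, paleo_times, tol=25.0):
    """Fraction of paleoseismic event times matched by a simulated
    event within +/- tol years; used to rank training-interval windows."""
    sim = np.sort(np.asarray(sim_times, dtype=float))
    hits = sum(np.any(np.abs(sim - t) <= tol) for t in paleo_times)
    return hits / len(paleo_times)

def conditional_waiting_prob(intervals, elapsed, horizon):
    """P(next event within `horizon` years, given `elapsed` years since
    the last), from the empirical distribution of inter-event intervals."""
    iv = np.asarray(intervals, dtype=float)
    survivors = iv[iv > elapsed]        # intervals not yet "used up"
    if len(survivors) == 0:
        return 1.0                      # past every observed interval
    return float(np.mean(survivors <= elapsed + horizon))

# Illustrative quasi-periodic simulated m > 7 sequence (years)
sim_times = np.array([0.0, 110.0, 195.0, 310.0, 405.0, 500.0])
paleo_times = [100.0, 200.0, 300.0]
print(window_score(sim_times, paleo_times))   # 1.0: all matched at tol=25
intervals = np.diff(sim_times)                # simulated waiting times
print(conditional_waiting_prob(intervals, 90.0, 10.0))
```

A random (temporal) predictor corresponds to scoring windows without using the paleoseismic record; the paper's test is whether the scored forecast beats that baseline.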

**ABSTRACT:** We discuss the long-standing question of whether the probability for large earthquake occurrence (magnitudes m > 6.0) is highest during time periods of smaller event activation, or highest during time periods of smaller event quiescence. The physics of the activation model is based on an idea from the theory of nucleation, that a small magnitude earthquake has a finite probability of growing into a large earthquake. The physics of the quiescence model is based on the idea that the occurrence of smaller earthquakes (here considered as magnitudes m > 3.5) may be due to a mechanism such as critical slowing down, in which fluctuations in systems with long-range interactions tend to be suppressed prior to large nucleation events. To illuminate this question, we construct two end-member forecast models illustrating, respectively, activation and quiescence. The activation model assumes only that activation can occur, either via aftershock nucleation or triggering, but expresses no choice as to which mechanism is preferred. Both of these models are in fact a means of filtering the seismicity time series to compute probabilities. Using 25 yr of data from the California-Nevada catalogue of earthquakes, we show that of the two models, activation and quiescence, the latter appears to be the better model, as judged by backtesting (by a slight but not significant margin). We then examine simulation data from a topologically realistic earthquake model for California seismicity, Virtual California. This model includes not only earthquakes produced from increases in stress on the fault system, but also background and off-fault seismicity produced by a BASS-ETAS driving mechanism. Applying the activation and quiescence forecast models to the simulated data, we come to the opposite conclusion: here, the activation forecast model is preferred to the quiescence model, presumably because the BASS component of the model is essentially a model for activated seismicity. These results lead to the (weak) conclusion that California seismicity may be characterized more by quiescence than by activation, and that BASS-ETAS models may not be robustly applicable to the real data.

Geophysical Journal International 10/2011; 187:225–236.
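The two end-member models are, as the abstract says, filters on the seismicity time series: activation weights forecast probability toward windows of high m > 3.5 activity, quiescence toward windows of suppressed activity. A minimal sketch, in which the window length, catalogue, and normalization are illustrative assumptions rather than the paper's actual filters:

```python
import numpy as np

def small_event_rate(event_times, window_edges):
    """Count of small (m > 3.5) events in each time window."""
    counts, _ = np.histogram(event_times, bins=window_edges)
    return counts.astype(float)

def activation_weights(rates, eps=1e-6):
    """Activation end-member: large-event probability is highest in
    windows where small-event activity is highest."""
    w = rates + eps
    return w / w.sum()

def quiescence_weights(rates, eps=1e-6):
    """Quiescence end-member: large-event probability is highest in
    windows where small-event activity is suppressed."""
    w = 1.0 / (rates + eps)
    return w / w.sum()

# Illustrative catalogue of small-event times over five 10-yr windows
times = np.array([1, 2, 3, 12, 25, 26, 27, 28, 41, 42, 43])
rates = small_event_rate(times, window_edges=np.arange(0, 51, 10))
print(rates)                          # events per window
print(activation_weights(rates))      # peaks in the most active window
print(quiescence_weights(rates))      # peaks in the quietest window
```

Backtesting then amounts to asking which set of weights assigns more probability to the windows in which the large (m > 6.0) events actually occurred.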
