Earthquake Spectra

Published by Earthquake Engineering Research Institute
Online ISSN: 8755-2930
Article
This paper quantitatively evaluates the suitability of multi-sensor remote sensing to assess the seismic vulnerability of buildings for the example city of Padang, Indonesia. Features are derived from remote sensing data to characterize the urban environment and are subsequently combined with in situ observations. Machine learning approaches are deployed in a sequential way to identify meaningful sets of features that are suitable to predict seismic vulnerability levels of buildings. When assessing the vulnerability level according to a scoring method, the overall mean absolute percentage error is 10.6%, if using a supervised support vector regression approach. When predicting EMS-98 classes, the results show an overall accuracy of 65.4% and a kappa statistic of 0.36, if using a naive Bayes learning scheme. This study shows potential for a rapid screening assessment of large areas that should be explored further in the future.
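As a hedged illustration of the regression step described above, the sketch below fits a support vector regression model to synthetic remote-sensing-style features and scores it with the mean absolute percentage error; the features, data, and hyperparameters are invented for illustration and are not the Padang dataset or the paper's model.

```python
# Hedged sketch: predicting building vulnerability scores from remote-sensing
# features with support vector regression, scored by mean absolute percentage
# error (MAPE). Feature meanings and data are hypothetical placeholders.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))            # e.g., height, roof type, block density ... (assumed)
y = 50 + 10 * X[:, 0] - 5 * X[:, 1] + rng.normal(scale=3, size=500)  # synthetic scores

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = SVR(kernel="rbf", C=10.0, epsilon=0.5).fit(X_tr, y_tr)
y_hat = model.predict(X_te)

mape = np.mean(np.abs((y_te - y_hat) / y_te)) * 100.0
print(f"MAPE: {mape:.1f}%")
```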
 
Article
Considerable research has been conducted worldwide to assess the unexpected damage to welded steel moment-frame buildings during the 1989 Loma Prieta, 1994 Northridge, and 1995 Hyogo-ken Nanbu earthquakes, as well as to find effective and economical remedies that can be incorporated into analysis, design, and construction practices. A major six-year program sponsored by the US Federal Emergency Management Agency (FEMA) synthesized and interpreted the results of this research and conducted additional investigations to develop reliable, practical, and cost-effective guidelines for the design and construction of new steel moment-frame structures, as well as for the inspection, evaluation, and repair or upgrading of existing ones. Topics investigated as part of this program included: (1) performance of steel buildings in past earthquakes; (2) material properties and fracture issues; (3) joining and inspection; (4) connection performance; (5) system performance; (6) performance prediction and evaluation; and (7) social, economic and political impacts. The project utilized a performance-based engineering framework and addressed issues pertaining to various types of steel moment-resisting frames, including those utilizing welded, bolted, and partially restrained connections. Published late in 2000 by FEMA, the guidelines are applicable to regions of low, medium, and high seismicity throughout the US. This paper reviews the overall organization and management of this program of research, guideline development, training, and peer evaluation; the scope of the investigations undertaken; and the general organization and contents of the guidelines developed.
 
Chapter
A simplified empirically based seismic site response evaluation procedure that includes measures of the dynamic stiffness of the surficial materials and the depth to bedrock as primary parameters is introduced. This geotechnical site classification scheme provides an alternative to geologic-based and shear wave velocity-based site classification schemes. The proposed scheme is used to analyze the ground motion data from the 1989 Loma Prieta and 1994 Northridge earthquakes. Period-dependent and intensity-dependent spectral acceleration amplification factors for different site conditions are presented. The proposed scheme results in a significant reduction in standard error when compared with a simpler “rock vs. soil” classification system. Moreover, results show that sites previously grouped as “rock” should be subdivided as competent rock sites and weathered soft rock/shallow stiff soil sites to reduce uncertainty in defining site-dependent ground motions. Results also show that soil depth is an important parameter in estimating seismic site response. The standard errors resulting from the proposed site classification system are comparable with those obtained using the more elaborate code-based average shear-wave velocity classification system.
 
Article
We study how the selection of site response model affects the ground motion predictions of seismological models, and in turn how the synthetic motion site response variability propagates to the structural performance estimation. For this purpose, we compute ground motion synthetics for six earthquake scenarios of a strike-slip fault rupture, and estimate the ground surface response for 24 typical soil profiles in Southern California. We use viscoelastic, equivalent linear and nonlinear analyses for the site response simulations, and evaluate the ground surface motion variability that results from the soil model selection. Next, we subject a series of bilinear single degree of freedom oscillators to the ground motions computed using the alternative soil models, and evaluate the consequent variability in the structural response. Results show high bias and uncertainty of the inelastic structural displacement ratio predicted using the linear site response model for periods close to the fundamental period of the soil profile. The amount of bias and the period range where the structural performance uncertainty manifests are shown to be a function of both input motion and site parameters. We finally derive empirical correlations between the site parameters and the variability introduced in structural analyses based on our synthetic ground motion simulations. Keywords: nonlinear, ground motion, site response, bilinear, drift, variability.
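The bilinear single-degree-of-freedom analysis mentioned above can be sketched as follows. This is a minimal illustration, assuming a kinematic-hardening bilinear restoring force, an explicit central-difference integrator, and an idealized pulse-like input; it does not reproduce the scenarios, soil models, or oscillator parameters of the study.

```python
# Hedged sketch: response of a bilinear (kinematic-hardening) SDOF oscillator
# to a ground acceleration history, integrated with the explicit
# central-difference scheme. All parameter values are assumptions.
import numpy as np

def bilinear_sdof(ag, dt, period=1.0, zeta=0.05, fy=1.5, alpha=0.05):
    """Return the displacement history of a bilinear SDOF (forces per unit mass)."""
    m = 1.0
    k = m * (2.0 * np.pi / period) ** 2
    c = 2.0 * zeta * np.sqrt(k * m)
    n = len(ag)
    u = np.zeros(n)
    f = 0.0                      # current restoring force
    u_prev = 0.0                 # fictitious u[-1]; system starts at rest
    for i in range(n - 1):
        p = -m * ag[i]
        lhs = m / dt**2 + c / (2.0 * dt)
        rhs = p - f + (2.0 * m / dt**2) * u[i] - (m / dt**2 - c / (2.0 * dt)) * u_prev
        u_next = rhs / lhs
        # bilinear hysteresis: elastic trial force, clipped to hardening bounds
        f_trial = f + k * (u_next - u[i])
        upper = alpha * k * u_next + (1.0 - alpha) * fy
        lower = alpha * k * u_next - (1.0 - alpha) * fy
        f = min(max(f_trial, lower), upper)
        u_prev, u[i + 1] = u[i], u_next
    return u

# idealized acceleration pulse (illustrative, not a recorded or simulated motion)
dt = 0.005
t = np.arange(0.0, 20.0, dt)
ag = 4.0 * np.sin(2.0 * np.pi * t / 2.0) * (t < 2.0)   # m/s^2
u = bilinear_sdof(ag, dt)
print(f"peak displacement: {np.abs(u).max():.3f} m")
```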
 
Article
The FBA-11 is a feedback-controlled accelerometer widely used to measure and record accelerations arising from earthquakes. It has found application both for structural response and for ground motion studies. The design intent of the FBA-11 was to provide electronic control of the natural frequency, damping, and output voltage. Included in this paper are (1) a circuit analysis yielding the complete closed-loop transfer function, and (2) the corroborative test results from shake table evaluations. The transfer function can be used to correct recorded accelerations for instrument response.
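A hedged sketch of an instrument-response correction of the kind the transfer function enables is shown below, assuming the instrument behaves as a simple second-order low-pass system; the natural frequency, damping, and test signal are placeholder values, not the FBA-11 transfer function derived in the paper.

```python
# Hedged sketch: correcting a recorded acceleration for a force-balance
# accelerometer response modeled as a second-order system
# H(f) = fn^2 / (fn^2 - f^2 + 2j*zeta*fn*f). fn, zeta, and the signal are assumed.
import numpy as np

def correct_instrument_response(acc, dt, fn=50.0, zeta=0.7):
    """Divide the record's spectrum by the assumed instrument response."""
    n = len(acc)
    freq = np.fft.rfftfreq(n, d=dt)
    H = fn**2 / (fn**2 - freq**2 + 2j * zeta * fn * freq)
    spec = np.fft.rfft(acc)
    return np.fft.irfft(spec / H, n=n)

# synthetic check: a 10 Hz burst is nearly unchanged by the correction
# because the assumed natural frequency is well above the signal band
dt = 0.005
t = np.arange(0.0, 4.0, dt)
acc = np.sin(2 * np.pi * 10.0 * t) * np.exp(-t)
corrected = correct_instrument_response(acc, dt)
print(f"max correction: {np.abs(corrected - acc).max():.4f}")
```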
 
Article
Using state-of-the-art computational tools in seismology and structural engineering, validated using data from the Mw=6.7 January 1994 Northridge earthquake, we determine the damage to two 18-story steel moment-frame buildings, one existing and one new, located in southern California due to ground motions from two hypothetical magnitude 7.9 earthquakes on the San Andreas Fault. The new building has the same configuration as the existing building but has been redesigned to current building code standards. Two cases are considered: rupture initiating at Parkfield and propagating from north to south, and rupture propagating from south to north and terminating at Parkfield. Severe damage occurs in these buildings at many locations in the region in the north-to-south rupture scenario. Peak velocities of 1 m/s and 2 m/s occur in the Los Angeles Basin and San Fernando Valley, respectively, while the corresponding peak displacements are about 1 m and 2 m, respectively. Peak interstory drifts in the two buildings exceed 0.10 and 0.06 in many areas of the San Fernando Valley and the Los Angeles Basin, respectively. The redesigned building performs significantly better than the existing building; however, its improved design based on the 1997 Uniform Building Code is still not adequate to prevent serious damage. The results from the south-to-north scenario are not as alarming, although damage is serious enough to cause significant business interruption and compromise life safety.
 
Article
Prior to 18 April 1906, the San Francisco Fire Department and knowledgeable persons in the insurance industry regarded a conflagration in San Francisco as inevitable. The 1906 San Francisco earthquake and ensuing fire is the greatest single fire loss in U.S. history, with 492 city blocks destroyed and life loss now estimated at more than 3,000. This paper describes fire protection practices in the United States prior to 1906; the conditions in San Francisco on the eve of the disaster; ignitions, spread, and convergence of fires that generated the 1906 conflagration; and damage to the water supply system in 1906 that gave impetus to construction of the largest high-pressure water distribution network ever built, San Francisco's Auxiliary Water Supply System (AWSS). In the 1980s, hydraulic network and fire simulation modeling identified weaknesses in the fire protection of San Francisco, problems mitigated by an innovative Portable Water Supply System (PWSS), which transports water long distances and helped extinguish the Marina fire during the 1989 Loma Prieta earthquake. The AWSS and PWSS concepts have been extended to other communities and provide many lessons, paramount of which is that communities need to develop an integrated disaster preparedness and response capability and be constantly vigilant in maintaining that capability. This lesson is especially relevant to highly seismic regions with large wood building inventories, such as the western United States and Japan, which are at great risk of conflagration following an earthquake.
 
Article
Twenty-five years have passed since the San Fernando earthquake of February 9, 1971. The paper reviews the lessons learned and not learned from this notable event. Most of the major lessons were reported within a few weeks of the earthquake by a panel appointed by the National Academies of Sciences and of Engineering. In this paper, the status of each of the eighteen general lessons cited by the panel is reviewed, plus two additional lessons selected from other studies of the earthquake. The lessons learned ranged broadly and concerned measures needed to reduce future earthquake hazards, as well as recommended scientific and engineering efforts. Although all of the lessons learned were not heeded, the San Fernando earthquake represented a turning point in public awareness and in actions taken to reduce earthquake hazard. Recent earthquakes have shown, however, that much remains to be done.
 
Article
A Board of Inquiry was appointed by the Governor of California to investigate the 1989 Loma Prieta earthquake. The formation of the Board was prompted by earthquake damage to bridges and freeway structures and the desire to know not only what happened, but how to prevent such destruction in future earthquakes. The Board made fifty-two specific findings and eight recommendations. It identified three essential challenges that must be addressed by the citizens of California if they expect a future adequately safe from earthquakes: 1) ensure that earthquake risks posed by new construction are acceptable; 2) identify and correct unacceptable seismic safety conditions in existing structures; and 3) develop and implement actions that foster the rapid, effective, and economic response to and recovery from damaging earthquakes. The Loma Prieta earthquake should be considered a clear and powerful warning to the people of California. Although progress has been made during the past two decades in reducing earthquake risks, much more could have been done, and awaits doing. More aggressive efforts to mitigate the consequences of future, certain earthquakes are needed if their disastrous potential is to be minimized and one of the most fundamental responsibilities of government is to be fulfilled: to provide for the public safety. The Governor signed an Executive Order implementing the principal recommendations of the Board that may prove to be the most significant step to improve seismic safety within the state taken in the last several decades. It establishes the policy that all state-owned and state-operated structures are to be seismically safe and that important structures are to maintain their function after earthquakes.
 
Article
Computer simulations are employed to assess the effects of near-source ground motions on base-isolated buildings that meet the provisions of the 1997 Uniform Building Code. A six-story base-isolated building designed for Nv = 1.6 exhibits essentially elastic structural behavior when subjected to six actual ground motions containing strong near-source effects. However, two simulated records, one intended to represent the most severe motions from the 1994 Northridge earthquake and the other a strong motion from a hypothetical Mw7.0 thrust earthquake produce larger responses well into the nonlinear range. In addition, a 113 cm ground displacement pulse of three-second duration, which is close to the period of the isolated buildings, causes story drifts of nearly 5% for the Nv = 1.6 design and over 2% for a stronger Nv = 2 design. Such drifts are effectively reduced when supplemental dampers are added alongside the isolators. The original Nv = 1.6 design with supplemental damping in the amount of 20% of critical experiences only 1.3% drift for the same three-second ground displacement pulse.
 
Article
Slip histories for the 2002 M7.9 Denali fault, Alaska, earthquake are derived rapidly from global teleseismic waveform data. Three models, developed in successive phases, progressively improve the fit to the waveform data and the recovery of rupture details. In the first model (Phase I), analogous to an automated solution, a simple fault plane is fixed based on the preliminary Harvard Centroid Moment Tensor mechanism and the epicenter provided by the Preliminary Determination of Epicenters. This model is then updated (Phase II) by implementing a more realistic fault geometry inferred from Digital Elevation Model topography, and further refined (Phase III) by using calibrated P-wave and SH-wave arrival times derived from modeling of the nearby 2002 M6.7 Nenana Mountain earthquake. These models are used to predict the peak ground velocity and the shaking intensity field in the fault vicinity. The procedure to estimate local strong motion could be automated and used for global real-time earthquake shaking and damage assessment.
 
Article
On November 3, 2002, a moment-magnitude (Mw) 7.9 earthquake produced 340 km of surface rupture on the Denali fault and two related faults in central Alaska. The rupture, which proceeded from west to east, began with a 40-km-long break on a previously unknown thrust fault. Estimates of surface slip on this thrust were 3-6 m. Next came the principal surface break, along 220 km of the Denali fault. There, right-lateral offset averaged almost 5 m and increased eastward to a maximum of nearly 9 m. Finally, slip turned southeastward onto the Totschunda fault, where dextral offsets up to 3 m continued for another 70 km. This three-part rupture ranks among the longest documented strike-slip events of the past two centuries. The surface-slip distribution supports and clarifies models of seismological and geodetic data that indicated initial thrusting followed by right-lateral strike slip, with the largest moment release near the east end of the Denali fault. The Denali fault ruptured beneath the Trans-Alaska oil pipeline. The pipeline withstood almost 6 m of lateral offset, because engineers designed it to survive such offsets based on pre-construction geological studies. The Denali fault earthquake was typical of large-magnitude earthquakes on major intracontinental strike-slip faults in the length of the rupture, the multiple fault strands that ruptured, and the variable slip along strike.
 
Article
Seismic intensity in the epicentral area of the 2003 Bam, Iran earthquake is estimated using a questionnaire survey conducted two months after the earthquake. The estimated average seismic intensity on the Japan Meteorological Agency (JMA) scale is 6.1 (VIII to IX on the MMI scale). The peak frequency of the horizontal-to-vertical spectral ratio derived from microtremor measurements conducted during reconnaissance is also compared with the seismic intensity. Collapse rates for various structure types, such as adobe, unreinforced/reinforced masonry, steel-frame, and reinforced concrete, are obtained by counting the number of demolished buildings within an area of about 50-m radius around an observation point. Results show large differences in collapse rates between unreinforced and reinforced masonry, and suggest the upper limit of seismic intensity that unreinforced masonry can sustain. This fact can be utilized for an initial damage assessment within affected areas after large earthquakes.
 
Article
The 26 December 2004 earthquake and tsunami significantly affected lifelines in Banda Aceh: (a) the water system sustained almost no damage due to shaking, but it had breaks at almost all aboveground stream crossings affected by the tsunami; (b) a waste disposal plant 1 km inland was destroyed by the tsunami; (c) electric utilities were not affected by shaking but were generally destroyed when impacted by the tsunami, and a floating generation barge was carried several kilometers inland; (d) a 70-m steel telecommunications tower at Lho Nga was destroyed by the tsunami; (e) the fishing port at Banda Aceh was totally destroyed, while the deep-water port at Kreung Raya (oil and dry cargo) lost half its piping and 3 of 9 oil tanks; (f) the airport at Banda Aceh is inland and had minor shaking damage to the control tower; and (g) one fire was reported on 26 December due to the earthquake. Significant lessons are (a) since the potential for this event was well known, the global earthquake community should better inform populations and decision makers; (b) urban master planning and building codes should include tsunami design as a standard and significant criterion; (c) the needs of lifeline operators should be considered in the design and development of tsunami warning systems; (d) building and other design codes should include tsunami design as a standard loading; and (e) extrapolation of the experience in this event to other situations is required.
 
Article
The 2004 Great Sumatra-Andaman earthquake had an average source duration of about 500 sec and a rupture length of 1,200–1,300 km. The seismic moment, M0, determined with a finite source model, was 6.5×10^22 N-m, which corresponds to Mw=9.18. Allowing for the uncertainties in the current M0 determinations, Mw is in the range of 9.1 to 9.3. The tsunami magnitude Mt is 9.1, suggesting that the overall size of the tsunami is consistent with what is expected of an earthquake with Mw=9.1 to 9.3. The short-period body-wave magnitude m̂_b is 7.25, which is considerably smaller than that of large earthquakes with a comparable Mw. The m̂_b versus Mw relationship indicates that, overall, the Great Sumatra-Andaman earthquake is not a tsunami earthquake. The tectonic environment of the rupture zone of the Great Sumatra-Andaman earthquake is very different from that of other great earthquakes, such as the 1960 Chile and the 1964 Alaska earthquakes. This difference may be responsible for the unique source characteristics of this earthquake. The extremely large size of the Great Sumatra-Andaman earthquake is reflected in the large amplitude of the long-period phase, the W phase, even in the early part of the seismograms before the arrival of the S wave. This information could be used for various early warning purposes.
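As a rough check of the quoted numbers, the sketch below applies the standard moment-magnitude relation to the stated seismic moment; the paper's value of 9.18 comes from its own finite-source analysis, so this textbook relation only confirms that the moment is consistent with the quoted 9.1–9.3 range.

```python
# Hedged sketch: the standard moment-magnitude relation Mw = (2/3)*(log10(M0) - 9.1),
# with M0 in N-m, applied to the moment quoted in the abstract.
import math

M0 = 6.5e22                          # seismic moment, N-m (from the abstract)
Mw = (2.0 / 3.0) * (math.log10(M0) - 9.1)
print(f"Mw = {Mw:.2f}")              # about 9.1, near the low end of the quoted range
```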
 
Article
The normal-faulting earthquake of 6 April 2009 in the Abruzzo Region of central Italy caused heavy losses of life and substantial damage to centuries-old buildings of significant cultural importance and to modern reinforced-concrete-framed buildings with hollow masonry infill walls. Although structural deficiencies were significant and widespread, the study of the characteristics of strong motion data from the heavily affected area indicated that the short duration of strong shaking may have spared many more damaged buildings from collapsing. It is recognized that, with this caveat of short-duration shaking, the infill walls may have played a very important role in preventing further deterioration or collapse of many buildings. It is concluded that better new or retrofit construction practices that include reinforced-concrete shear walls may prove helpful in reducing risks in such seismic areas of Italy, other Mediterranean countries, and even in the United States, where there are large inventories of deficient structures.
 
Article
This paper compares base shear computed from floor accelerations (inertial base shear) and column shears (structural base shear) for several single-degree-of-freedom (SDF) systems and two mid-rise, multi-story buildings due to a suite of 30 earthquake ground motions. The presented results show that the inertial base shear is close to the structural base shear in short-period (<1 sec) SDF systems but may significantly exceed the structural base shear for individual ground motions in longer-period (>1 sec) SDF systems. Furthermore, the inertial base shear exceeds the structural base shear at the median by 10% to 20% and may exceed the structural base shear by as much as 70% for individual ground motions in multi-story buildings. Therefore, it is concluded that the inertial base shear should be used with caution to estimate the structural base shear in buildings with long fundamental vibration periods whose motions are recorded during individual earthquake ground shaking.
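A minimal sketch of the two base-shear definitions being compared, using invented floor masses, floor accelerations, and column shears rather than any of the buildings or records in the study:

```python
# Hedged sketch: "inertial" base shear from floor masses times recorded floor
# accelerations versus "structural" base shear from summing first-story column
# shears, evaluated at two arbitrary time instants. All numbers are made up.
import numpy as np

floor_mass = np.array([400e3, 400e3, 350e3])            # kg, floors 1..3 (assumed)
floor_acc = np.array([                                   # m/s^2, two instants x floors
    [1.2, 1.5, 1.9],
    [0.8, 1.1, 1.6],
])
inertial_base_shear = (floor_mass * floor_acc).sum(axis=1)       # N, per instant

first_story_column_shears = np.array([                   # N, per column, same instants
    [5.0e5, 4.8e5, 5.2e5, 4.9e5],
    [3.9e5, 4.1e5, 4.0e5, 3.8e5],
])
structural_base_shear = first_story_column_shears.sum(axis=1)

print("inertial / structural base shear:",
      np.round(inertial_base_shear / structural_base_shear, 2))
```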
 
Article
A thorough four-step performance-based seismic evaluation for a six-story unreinforced masonry building is conducted. Incremental dynamic analysis is carried out using the applied element method to take advantage of its ability to simulate progressive collapse of the masonry structure, including out-of-plane failure of the walls. The distribution of the structural responses and interstory drifts from the incremental dynamic analysis curves are used to develop both spectral-based (Sa) and displacement-based (interstory drift) fragility curves at three structural performance levels. The curves resulting from three-dimensional (3-D) analyses using unidirectional ground motions are combined using the weakest link theory to propose combined fragility curves. Finally, the mean annual frequencies of exceeding the three performance levels are calculated using the spectral acceleration values at four probability levels 2%, 5%, 10%, and 40% in 50 years. The method is shown to be useful for seismic vulnerability evaluations in regions where little observed damage data exists.
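The fragility-fitting step can be illustrated with the hedged sketch below, which fits a lognormal CDF to assumed Sa capacities standing in for IDA output; it is not the applied-element model or the combined (weakest-link) curves of the paper.

```python
# Hedged sketch: a spectral-acceleration-based fragility curve fitted to the
# Sa at which each record first exceeds a performance limit in IDA.
# The Sa capacities below are placeholders standing in for IDA results.
import numpy as np
from scipy import stats

sa_capacity = np.array([0.31, 0.42, 0.28, 0.55, 0.47, 0.38, 0.60, 0.35, 0.44, 0.50])

ln_median = np.mean(np.log(sa_capacity))      # lognormal fit from moments of log data
beta = np.std(np.log(sa_capacity), ddof=1)

def fragility(sa):
    """P(exceeding the performance level | Sa)."""
    return stats.norm.cdf((np.log(sa) - ln_median) / beta)

for sa in (0.2, 0.4, 0.6):
    print(f"Sa = {sa:.1f} g -> P(exceedance) = {fragility(sa):.2f}")
```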
 
Proposed Pre-1976 Building strengthening scheme 
Turnbull House 
Ground floor showing strengthening work 
Article
Current seismic strengthening approaches to historic buildings place emphasis upon concealing engineering technologies. This study investigates, through a process of architectural and structural engineering design, the architectural possibilities inherent in a completely different approach. Recognising both conservation concerns and the architectural qualities of two existing earthquake-prone buildings, the study explores seismic strengthening strategies that are exposed to view in order to contribute, in both a physical and aesthetic sense, a layer of architectural richness. A 1960s eight-storey reinforced concrete office building and a three-storey unreinforced masonry building are the subject of theoretical seismic strengthening schemes. The paper describes the buildings, the strengthening approaches from both architectural and structural engineering perspectives, and comments on the outcome with respect to conservation guidelines. Although the proposed schemes challenge some sections of the guidelines, the authors believe the exposed structure enhances the existing architecture, and in so doing suggests an alternative approach for seismic retrofitting.
 
Steps of the ABV methodology.
Seismic vulnerability function for the example building. Each dot represents one simulation of ground shaking, structural response, damage, and repair. Sa(T1) measures spectral acceleration at the building's fundamental period. Total cost refers to repair plus loss of use. Twenty simulations are shown per increment of Sa. The solid line represents a best fit on the mean.
Illustrative translations of qualitative performance terminology. The qualitative term used in a PBD code is shown in the left column. A reasonable numerical translation and an example are shown in the second and third columns.
Article
Assembly-based vulnerability (ABV) is a framework for evaluating the seismic vulnerability and performance of buildings on a building-specific basis. It utilizes the damage to individual building components and accounts for the building's seismic setting, structural and nonstructural design and use. A simulation approach to implementing ABV first applies a ground motion time history to a structural model to determine structural response. The response is applied to assembly fragility functions to simulate damage to each structural and nonstructural element in the building, and to its contents. Probabilistic construction cost estimation and scheduling are used to estimate repair cost and loss-of-use duration as random variables. It also provides a framework for accumulating post-earthquake damage observations in a statistically systematic and consistent manner. The framework and simulation approach are novel in that they are fully probabilistic, address damage at a highly detailed and building-specific level, and do not rely extensively on expert opinion. ABV is illustrated using an example pre-Northridge welded-steel-moment-frame office building.
 
Illustration of earthquake rupture forecast (ERF) epistemic uncertainty in the peak ground acceleration hazard curve for a site (Vs30 = 760 m/s) in the San Francisco Bay Area using the Campbell and Bozorgnia (2003) prediction equation.
Dispersion of epistemic uncertainty at 1.2% in 30 years probability of exceedance
Mean hazard curves of the four different sites considered.
Summary of uncertainty propagation methods.
Correlation of epistemic uncertainty between different intensity measure values: (a) for three different intensity measure values; and (b) for all intensity measure values after normalization.
Article
This paper investigates epistemic uncertainty in the results of seismic hazard analyses for the San Francisco Bay Area and their role in the broader picture of seismic performance assessment. Using the 2002 Working Group on California Earthquake Probabilities earthquake rupture forecast, epistemic uncertainty in the seismic hazard for several different intensity measures and sites in the San Francisco Bay Area is investigated. Normalization of the epistemic uncertainty for various sites and intensity measures illustrates that the uncertainty magnitude can be approximately estimated as a function of the mean exceedance probability. The distribution of the epistemic uncertainty is found to be dependent on the set of alternative ground-motion prediction equations used but is frequently well approximated by the lognormal distribution. The correlation in the hazard uncertainty is observed to be a function of the separation between the two different intensity levels, and a simple predictive equation is proposed based on the data analyzed. Three methods for the propagation of seismic hazard epistemic uncertainty are compared and contrasted using an example of the 30-year collapse probability of a structure. It is observed that, for this example, epistemic uncertainty in the collapse capacity is more influential than that in the seismic hazard.
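The collapse-probability example at the end of the abstract rests on a standard hazard-fragility convolution, sketched below with an invented mean hazard curve and collapse fragility; the paper's epistemic-uncertainty treatment is not reproduced here.

```python
# Hedged sketch: combine a mean hazard curve with a collapse fragility to get
# a 30-year collapse probability. Hazard-curve and fragility parameters are
# invented for illustration only.
import numpy as np
from scipy import stats

im = np.linspace(0.01, 3.0, 600)                 # intensity measure (Sa), g
hazard = 1e-3 * im**-2.5                         # assumed mean annual rate of exceedance
collapse_median, beta = 1.2, 0.5                 # assumed lognormal collapse fragility
p_collapse_given_im = stats.norm.cdf(np.log(im / collapse_median) / beta)

# annual collapse rate: integrate the fragility against the hazard-curve slope
dlambda = -np.gradient(hazard, im)
annual_rate = np.trapz(p_collapse_given_im * dlambda, im)
p_30yr = 1.0 - np.exp(-30.0 * annual_rate)
print(f"30-year collapse probability: {p_30yr:.4f}")
```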
 
Article
At the time of the Northridge earthquake, a number of new technologies, including real-time availability of earthquake source data, improved loss estimation techniques, Geographic Information Systems, and various satellite-based monitoring systems, were either available or under consideration as emergency management resources. The potential benefits from these technologies for earthquake hazard mitigation, response, and recovery, however, were largely conceptual. One of the major lessons learned from the January 17, 1994 earthquake was that these technologies could confer significant advantages in understanding and managing a major disaster, and that their integration would contribute a significant additional increment of utility. In the two and a half years since the Northridge earthquake, important strides have been taken toward the integration of relatively discrete technologies into a system that provides real-time estimates of regional damage, losses, and population impacts. This paper describes the development, operation, and application of the first real-time loss estimation system to be utilized by an emergency services organization.
 
Article
This may be an historic event for many of you. It will be the first time that you have ever heard a geologist give a talk related to earthquakes that was not replete with Kodachrome slides of cracks in the ground and maps of active faults, or at least of "allegedly" active faults, or "potentially" active faults, or even "possibly" active faults! But I would like to go beyond the detailed discussion of individual earthquakes this afternoon, and instead discuss the broader problem of whether our studies of numerous recent earthquakes -- here and abroad -- are leading to modifications in our hazard assessment techniques, speaking from the point of view of a geologist or seismologist. I emphasize that I make no pretense of speaking for either the geotechnical or structural engineers. You already know, of course, the answer to the question posed by the title. It's both "yes" and "no". And I would like to focus on the question of: In what scientific areas, in particular, are our approaches changing, and in what areas do the traditional methods remain credible?
 
Article
This paper presents the results obtained from tests of a new friction damping system, which has been proposed in order to improve the response of steel Moment Resisting Frames (MRF) and Braced Moment Resisting Frames (BMRF) during severe earthquakes. The system consists of a mechanism containing brake lining pads introduced at the intersection of frame crossbraces. Seismic tests of a three storey Friction Damped Braced Frame (FDBF) model were performed on an earthquake simulator table. The experimental results are compared with the findings of an inelastic time-history dynamic analysis.
 
Article
The design parameter of 1.05 g peak seismic ground acceleration for a 300 kV SF6 circuit breaker necessitated the provision of supplemental friction-based dampers. This paper describes the dampers and the dynamic properties of the circuit breaker as obtained from a series of pull-release tests with increasing force amplitudes. These tests permitted a determination of a wide range of damping ratios and natural frequencies as a function of displacements. A comparison is also presented between the measured and the calculated damping ratios and frequencies, using common engineering approximation of the energy dissipated per cycle for damping, and a discrete parameter and linearized stiffness approach for the calculation of natural frequency. Reasonable comparisons were achieved between the measured and calculated values.
 
Article
Bridges affected by the Northridge earthquake were examined to evaluate the performance of Caltrans bridges, retrofit and peer review programs, and technical procedures. All structures in the region of strong shaking that had been retrofitted since 1989 performed adequately; if the seven bridges that collapsed had been retrofitted, they would be expected to have survived with little damage. The bridges subjected to strong shaking that had been constructed or retrofitted to current Caltrans criteria had, at most, minor damage; all remained in service. The Seismic Advisory Board concluded that the Caltrans seismic design procedures for new bridges and its retrofit procedures for existing hazardous bridges are technically sound. The Board found that the retrofit program is proceeding fairly well, but that the screening methods used to identify hazardous bridges could be improved as could the pace of retrofitting, particularly toll bridges. There were no construction projects underway in the Spring of 1994 for toll bridges, nor had preparation of retrofit designs begun. Twenty-one findings and seventeen recommendations are presented. Although much has been accomplished, much remains to be done. With some improvements, the Caltrans program should be continued with dispatch and determination. The major foreseeable impediments to a successful program are inadequate or fluctuating funding. The Seismic Advisory Board has confidence that the California highway system is progressing in an orderly fashion to one that is significantly more seismically safe.
 
Article
After an overview of the development of U.S. seismic design specifications for highway bridges, an evaluation of current Caltrans and AASHTO seismic criteria is presented. Linear and nonlinear response spectra of ground motions recorded on different soil conditions in the Loma Prieta earthquake and other recent earthquakes are compared with code recommendations. Special emphasis is placed on how present design procedures reduce elastic forces to take into account the energy absorption capacity of the structure, and on the estimation of maximum inelastic deformations. Results indicate that current design recommendations may underestimate strength and deformation demands, particularly for short-period bridges and for bridges on soft soils. Finally, recommendations are made on how seismic design specifications may be improved.
 
Article
This paper examines the question of which sources of uncertainty most strongly affect the repair cost of a building in a future earthquake. Uncertainties examined here include spectral acceleration, ground-motion details, mass, damping, structural force-deformation behavior, building-component fragility, contractor costs, and the contractor's overhead and profit. We measure the variation (or swing) of the repair cost when each basic input variable except one is taken at its median value, and the remaining variable is taken at its 10th and at its 90th percentile. We perform this study using a 1960s high-rise nonductile reinforced-concrete moment-frame building. Repair costs are estimated using the assembly-based vulnerability (ABV) method. We find that the top three contributors to uncertainty are assembly capacity (the structural response at which a component exceeds some damage state), shaking intensity (measured here in terms of damped elastic spectral acceleration, Sa), and details of the ground motion with a given Sa.
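The swing measure can be illustrated with the hedged sketch below, which uses a toy repair-cost function and assumed percentile values in place of the ABV model and the building studied.

```python
# Hedged sketch of the "swing" measure: hold every input at its median, then
# move one variable at a time to its 10th and 90th percentiles and record the
# change in repair cost. The cost model and percentile values are placeholders.
def repair_cost(sa, capacity, unit_cost):
    """Toy stand-in for a detailed ABV loss model."""
    damage_fraction = min(1.0, (sa / capacity) ** 2)
    return damage_fraction * unit_cost

percentiles = {                       # (10th, 50th, 90th) for each input, assumed
    "sa": (0.2, 0.5, 0.9),
    "capacity": (0.6, 0.8, 1.0),
    "unit_cost": (3e6, 5e6, 8e6),
}

medians = {k: v[1] for k, v in percentiles.items()}
swings = {}
for name, (p10, _, p90) in percentiles.items():
    lo = repair_cost(**{**medians, name: p10})
    hi = repair_cost(**{**medians, name: p90})
    swings[name] = abs(hi - lo)

# largest swing first, as in a tornado diagram
for name, swing in sorted(swings.items(), key=lambda kv: -kv[1]):
    print(f"{name:10s} swing = ${swing:,.0f}")
```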
 
Article
A performance-based earthquake engineering methodology has recently been developed that quantifies building performance in terms of repair costs, life-safety risk, and loss of use (“dollars, deaths, and downtime”). The methodology is used to quantify the economic benefit (avoided future repair costs) of various detailed seismic retrofits, above-code design alternatives, and construction quality levels for several particular, completely designed woodframe buildings. Benefits are quantified assuming each house is located in any of California’s 1,653 ZIP Codes. It is found that one example retrofit (costing approximately $1,400) exhibits benefit-cost ratios as high as 7.8, saving the homeowner up to $11,000 in avoided losses if the house were located in the highest-hazard area of the state. Four retrofit or redesign measures are cost effective in at least some locations. Higher quality is estimated to save thousands of dollars per house. We conclude that such quantitative benefit data could inform homeowners’ decisions about mitigating seismic risk.
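A hedged sketch of the kind of benefit-cost arithmetic behind the reported ratios, with an assumed avoided expected annual loss, discount rate, and horizon rather than the paper's loss model:

```python
# Hedged sketch: present-value benefit-cost ratio for a retrofit. The avoided
# expected annual loss, discount rate, and horizon are illustrative assumptions;
# only the ~$1,400 retrofit cost is taken from the abstract.
avoided_expected_annual_loss = 450.0     # $/yr saved by the retrofit (assumed)
retrofit_cost = 1400.0                   # $, order of the example retrofit
discount_rate = 0.035
horizon_years = 30

# present value of a uniform annual benefit stream
pv_factor = (1.0 - (1.0 + discount_rate) ** -horizon_years) / discount_rate
benefit = avoided_expected_annual_loss * pv_factor
print(f"benefit-cost ratio: {benefit / retrofit_cost:.1f}")
```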
 
Article
Occurrence of large earthquakes close to cities in California is inevitable. The resulting ground shaking will subject buildings in the near-source region to large, rapid displacement pulses which are not represented in design codes. The simulated Mw7.0 earthquake on a blind-thrust fault used in this study produces peak ground displacement and velocity of 200 cm and 180 cm/sec, respectively. Over an area of several hundred square kilometers in the near-source region, flexible frame and base-isolated buildings would experience severe nonlinear behavior including the possibility of collapse at some locations. The susceptibility of welded connections to fracture significantly increases the collapse potential of steel-frame buildings under strong ground motions of the type resulting from the Mw7.0 simulation. Because collapse of a building depends on many factors which are poorly understood, the results presented here regarding collapse should be interpreted carefully.
 
Article
This paper comprehensively evaluates the Modal Pushover Analysis (MPA) procedure against the "exact" nonlinear response history analysis (RHA) and investigates the accuracy of seismic demands determined by pushover analysis using FEMA-356 force distributions; the MPA procedure in this paper contains several improvements over the original version presented in Chopra and Goel (2002). Seismic demands are computed for six buildings, each analyzed for 20 ground motions. It is demonstrated that with increasing number of "modes" included, the height-wise distribution of story drifts and plastic rotations estimated by MPA becomes generally similar to trends noted from nonlinear RHA. The additional bias and dispersion introduced by neglecting "modal" coupling and P-Δ effects due to gravity loads in the MPA procedure is small unless the building is deformed far into the inelastic range with significant degradation in lateral capacity. A comparison of the seismic demands computed by FEMA-356 NSP and nonlinear RHA showed that FEMA-356 lateral force distributions lead to gross underestimation of story drifts and completely fail to identify plastic rotations in upper stories compared to the values from the nonlinear RHA. The "Uniform" force distribution in FEMA-356 NSP seems unnecessary because it grossly overestimates drifts and plastic rotations in lower stories and grossly underestimates them in upper stories. The MPA procedure resulted in estimates of demand that were much better than from FEMA force distributions over a wide range of responses—from essentially elastic response of Boston buildings to strongly inelastic response of Los Angeles buildings. However, pushover analysis procedures cannot be expected to provide satisfactory estimates of seismic demands for buildings deforming far into the inelastic range with significant degradation of the lateral capacity; for such cases, nonlinear RHA becomes necessary.
 
Article
We examine the characteristics of long-period near-source ground motions by conducting a sensitivity study with variations in six earthquake source parameters for both a strike-slip fault (M 7.0-7.1) and a thrust fault (M 6.6-7.0). The directivity of the ruptures creates large displacement and velocity pulses in the forward direction. The dynamic displacements close to the fault are comparable to the average slip. The ground motions exhibit the greatest sensitivity to the fault depth with moderate sensitivity to the rupture speed, peak slip rate, and average slip. For strike-slip faults and thrust faults with surface rupture, the maximum ground displacements and velocities occur in the region where the near-source factor from the 1997 Uniform Building Code is the largest. However, for a buried thrust fault the peak ground motions can occur up-dip from this region.
 
Article
Three exercises of The Plan to Coordinate NEHRP Post-Earthquake Investigations were developed and implemented in late 2003 and early 2004 in order to test the Plan itself via realistic scenarios, and for the NEHRP agencies to learn how to coordinate post-earthquake investigations. The exercises were selected to cover a range of seismic activity and consequences, and were based on scenario events: (1) a Hayward Fault Mw 7 event without foreshocks; (2) a New Madrid seismic zone Mw 7 event with foreshocks; and (3) a Puerto Rico Mw 8 subduction event on the Puerto Rican Trench accompanied by a tsunami affecting the eastern seaboard of the United States. Each exercise consisted of a four-hour telephone conference call with a Web-based electronic link and post-exercise evaluations fed back to participants. Evaluation of the exercises found the Plan to be adequate, with implementation of the Plan by the NEHRP agencies improving with each exercise. Based on the exercises, recommendations were provided that a Plan coordinator should be designated within USGS, an annual exercise of the Plan should be conducted in different regions of the United States, a permanent NEHRP electronic link should be created, and coordination of post-earthquake data collection, preservation, archiving, and dissemination should be greatly improved. DOI: 10.1193/1.2087707
 
Article
The Applied Technology Council is adapting PEER's performance-based earthquake engineering methodology to professional practice. The methodology's damage-analysis stage uses fragility functions to calculate the probability of damage to facility components given the force, deformation, or other engineering demand parameter (EDP) to which each is subjected. This paper introduces a set of procedures for creating fragility functions from various kinds of data: (A) actual EDP at which each specimen failed; (B) bounding EDP, in which some specimens failed and one knows the EDP to which each specimen was subjected; (C) capable EDP, where specimen EDPs are known but no specimens failed; (D) derived, where fragility functions are produced analytically; (E) expert opinion; and (U) updating, in which one improves an existing fragility function using new observations. Methods C, E, and U are all introduced here for the first time. A companion document offers additional procedures and more examples.
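For the simplest of these cases, Method A (actual EDP at which each specimen failed), a hedged sketch of a lognormal fit is shown below; the drift values are placeholders, and the paper and its companion document should be consulted for the full procedures and the other methods.

```python
# Hedged sketch of a Method A-style fit: a lognormal fragility function fitted
# to observed failure EDPs. The drift values are hypothetical placeholders.
import numpy as np
from scipy import stats

failure_edp = np.array([0.021, 0.034, 0.018, 0.027, 0.040, 0.025, 0.031])  # e.g., drift ratios

theta = np.exp(np.mean(np.log(failure_edp)))   # median capacity
beta = np.std(np.log(failure_edp), ddof=1)     # logarithmic standard deviation

def fragility(edp):
    """Probability of failure given the engineering demand parameter."""
    return stats.norm.cdf(np.log(edp / theta) / beta)

print(f"theta = {theta:.3f}, beta = {beta:.2f}")
print(f"P(failure | drift = 0.03) = {fragility(0.03):.2f}")
```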
 
Article
We establish and test a shake map methodology for intermediate-depth Vrancea earthquakes, based on seismological information gathered in Romania during recent years. We use region- (azimuth-) dependent attenuation relations derived from stochastic simulations of ground motions using spectral models of Vrancea earthquakes. Both region boundaries and Fourier amplification spectra for the characterization of seismic site effects are based on several hundred weak, moderate and strong-motion records and macroseismic intensity maps. We determine region-specific, magnitude- and distance-dependent amplification factors of peak values and instrumental intensity relative to rock. We interpolate recorded ground motions and ground motion estimates from the obtained amplification factors and attenuation relations for rock conditions. The resulting shake maps show a good agreement with macroseismic descriptions of moderate-sized and large Vrancea earthquakes, demonstrating the feasibility of a seismological approach to shake map generation. Unlike previous methodologies, this approach requires neither expensive assessments of geology-dependent site amplification factors, nor large numbers of strong-motion records. Our results are in good agreement with empirical topographic slope-site amplification relations, but give a better reflection of the abnormal attenuation of seismic waves in the Transylvanian region and the strong amplification in the Focsani basin.
 
Article
The Northridge earthquake will long be remembered for the unprecedented losses incurred as a result of a moderate-size event in a suburban area of Los Angeles. Current documented costs indicate that this event is the costliest disaster in U.S. history. Although it is difficult to estimate the full cost of this event, it is quite possible that total losses, excluding indirect effects, could reach as much as $40 billion. This would make the Northridge earthquake less severe than the Kobe event, which occurred exactly one year after the Northridge earthquake, but it adds a measure of realism to the prospect that a Kobe-type disaster is possible in the U.S. This paper attempts to put into perspective the direct capital losses associated with the Northridge earthquake. In doing so, we introduce the concept of hidden and/or undocumented costs that could double current estimates. In addition, we present the notion that a final estimate of loss may be impossible to achieve, although costs do begin to level off two years after the earthquake. Finally, we attempt to reconcile apparent differences between loss totals for two databases tracking similar information.
 
Northridge earthquake injuries (after Seligson and Shoaf 2003). "HH" indicates number of households in which at least one person experienced this level of injury.
Federal values of statistical deaths and injuries avoided, in 1994 US$
Estimated value of injuries in the 1994 Northridge earthquake, in 1994 US$
Article
The economic equivalent value of deaths and injuries in the 1994 Northridge earthquake has not previously been calculated, although the number of injuries by category of treatment has. Using dollar-equivalent values for injuries accepted and used by the U.S. government for evaluating the cost-effectiveness of risk-mitigation efforts, the value of injuries in the 1994 Northridge earthquake is estimated to be $1.3 to 2.2 billion in 1994 (90% confidence bounds, equivalent to $1.8 to 2.9 billion in 2005). This is equivalent to 3–4% of the estimated $50 billion (in 1994) in direct capital losses and direct business interruption losses. If injuries in the 1994 Northridge earthquake are representative of injuries in future U.S. events, then the economic value of future earthquake injuries—the amount that the U.S. government would deem appropriate to expend to prevent all such injuries—is on the order of $200 million per year (in 2005 constant dollars). Of this figure, 96% is associated with nonfatal injuries, an issue overlooked by current experimental research. Given the apparently high cost of this type of loss, this appears to represent an important gap in the present earthquake research agenda.
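The valuation arithmetic is straightforward and can be sketched as below; the injury counts and per-injury dollar values are placeholders chosen only to land near the reported order of magnitude, not the Seligson and Shoaf (2003) counts or the federal values used in the paper.

```python
# Hedged sketch: multiply the number of injuries in each severity category by
# a dollar-equivalent value per injury and sum. All counts and unit values are
# assumed placeholders, not the figures used in the study.
injury_counts = {"fatal": 33, "hospitalized": 800, "treated_or_self_treated": 120_000}
value_per_injury = {                          # 1994 US$, assumed
    "fatal": 2.7e6,
    "hospitalized": 0.5e6,
    "treated_or_self_treated": 0.01e6,
}

total_value = sum(injury_counts[k] * value_per_injury[k] for k in injury_counts)
print(f"economic value of injuries: ${total_value/1e9:.2f} billion (1994 US$)")
```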
 
Article
New performance-based earthquake engineering methods developed by the Pacific Earthquake Engineering Research Center, the Applied Technology Council, and others include damage analysis at a highly detailed level, requiring the compilation of fragility functions for a large number of damageable generic structural and nonstructural components. This brief paper presents the development of a fragility function for hydraulic elevators. It uses post-earthquake survey data from 91 elevators in nine California locations after two earthquakes. Surveys were used to collect data on facilities and elevators. Ground-motion records from the California Integrated Seismic Network were used to estimate engineering demands at each site. Binary regression analysis was used to fit a fragility function, which takes the form of a lognormal cumulative distribution function with median value of PGA=0.42 g and logarithmic standard deviation of 0.3. The fragility function appears to be reasonable based on four criteria.
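The reported fragility function can be evaluated directly, as in the sketch below, which implements a lognormal CDF with the stated median PGA of 0.42 g and logarithmic standard deviation of 0.3.

```python
# Sketch: the hydraulic-elevator fragility function reported in the abstract,
# a lognormal CDF with median PGA = 0.42 g and logarithmic std. deviation 0.3.
import numpy as np
from scipy import stats

def elevator_fragility(pga_g, median=0.42, beta=0.3):
    """Probability of elevator damage given peak ground acceleration (g)."""
    return stats.norm.cdf(np.log(pga_g / median) / beta)

for pga in (0.2, 0.42, 0.6):
    print(f"PGA = {pga:.2f} g -> P(damage) = {elevator_fragility(pga):.2f}")
```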
 
Article
At present there is consensus that structures should be designed to undergo ductile deformations if subjected to very strong ground shaking, so structures designed without consideration of large deformations can be expected to experience undesirable damage. However, in most strong earthquakes some unexpected ground motion or unexpected structural failure occurs which deserves closer examination. In both the Northridge and the Kobe earthquakes there were unexpected ground motions as well as unexpected structural damage. The reason such failures are unexpected is that their possibility was not identified during the design process.
 
Article
We examine seismic risk from the commercial real estate investor's viewpoint. We present a methodology to estimate the uncertain net asset value (NAV) of an investment opportunity considering market risk and seismic risk. For seismic risk, we employ a performance-based earthquake engineering methodology called assembly-based vulnerability (ABV). For market risk, we use evidence of volatility of return on investment in the United States. We find that uncertainty in NAV can be significant compared with investors' risk tolerance, making it appropriate to adopt a decision-analysis approach to the investment decision, in which one optimizes certainty equivalent, CE, as opposed to NAV. Uncertainty in market value appears greatly to exceed uncertainty in earthquake repair costs. Consequently, CE is sensitive to the mean value of earthquake repair costs but not to its variance. Thus, to a real estate investor, seismic risk matters only in the mean, at least for the demonstration buildings examined here.
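The certainty-equivalent idea can be sketched as below, assuming an exponential utility function with risk tolerance rho and invented NAV samples; the paper's market-risk and ABV loss models are not reproduced.

```python
# Hedged sketch: certainty equivalent (CE) of an uncertain net asset value
# under exponential utility, CE = -rho * ln E[exp(-NAV/rho)]. The NAV
# distribution and risk tolerance rho are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
rho = 2.0e6                                             # risk tolerance, $ (assumed)
market_value = rng.normal(10.0e6, 2.0e6, 10_000)        # uncertain resale value, $
eq_repair_cost = rng.lognormal(np.log(2.0e5), 1.0, 10_000)  # uncertain earthquake loss, $
nav = market_value - eq_repair_cost

ce = -rho * np.log(np.mean(np.exp(-nav / rho)))
print(f"mean NAV = ${nav.mean()/1e6:.2f}M, CE = ${ce/1e6:.2f}M")
```

Because the market-value variance dominates, the certainty equivalent here is driven mainly by the mean of the earthquake loss, consistent with the conclusion stated in the abstract.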
 
Article
The concept of active structural control as a means of structural protection against seismic loads, developed over the last 20 years, has received considerable attention in recent years. It has now reached the stage where active systems have been installed in full-scale structures. It is the purpose of this paper to provide an overview of this development with special emphasis placed on laboratory experiments using model structures and on full-scale implementation of some active control systems. Included in this paper is a report on the formation of a U.S. Panel on Structural Control Research and some discussion on possible future research directions in this exciting research area.
 
Article
The damage caused to structures by the 23 June 2001 earthquake in Peru is discussed. The coastline south of the epicenter was extensively damaged by the tsunami that followed the main shock. It was found that the heaviest damage occurred in the cities of Arequipa and Moquegua and at the Enersur plant.
 
Article
We used an expanded PEER NGA-West2 database to develop a new ground motion prediction equation (GMPE) for the average (RotD50) horizontal components of PGA, PGV, and 5%-damped linear pseudo-absolute acceleration response spectra at 21 periods ranging from 0.01 to 10 s. In addition to those terms included in our now superseded 2008 GMPE, we include a more-detailed hanging-wall model, scaling with hypocentral depth and fault dip, regionally independent geometric attenuation, regionally dependent anelastic attenuation and site conditions, and magnitude-dependent aleatory variability. The NGA-West2 database provides better constraints on magnitude scaling and attenuation of small-magnitude earthquakes, where our 2008 GMPE was known to be biased. We consider our new GMPE to be valid for estimating RotD50 from shallow crustal continental earthquakes in an active tectonic domain for rupture distances ranging from 0 to 300 km and magnitudes ranging from 3.3 to 7.5–8.5, depending on source mechanism.
 
Article
Power transformers and bushings are key pieces of substation equipment and are vulnerable to the effects of earthquake shaking. The seismic performance of a 1100 kV bushing, used in an ultra-high-voltage power transformer, is studied using a combination of physical and numerical experiments. The physical experiments utilized an earthquake simulator, and included system identification and seismic tests. Modal frequencies and shapes are derived from white-noise tests. Acceleration, strain, and displacement responses are obtained from the uniaxial horizontal seismic tests. A finite element model of the 1100 kV bushing is developed and analyzed, and predicted and measured results are compared. There is reasonably good agreement between predicted and measured responses, enabling the finite element model to be used with confidence for seismic vulnerability studies of transformer-bushing systems. A coupling of the experimental and numerical simulations enabled the 1100 kV bushing to be seismically qualified for three-component ground shaking with a horizontal zero period acceleration of 0.53 g.
 
Article
The 11 March 2011 Tohoku-oki, Japan, earthquake, tsunami, and nuclear reactor disasters shattered existing plans for decision making, preparedness, and response operations under conditions of uncertainty and risk. The interaction among these events created dynamics that could not be addressed by any single organization or jurisdiction alone and that had not been considered in the planning processes undertaken by separate jurisdictions and organizations. The scale and scope of devastation overwhelmed those responsible for protecting communities at every level of jurisdictional decision making and organizational management. We examine the policy problem of decision making involving interaction between human and natural systems, and review existing policies, plans, and practices that characterized efforts in disaster risk reduction in Japan prior to 11 March 2011. We contrast these plans with observed practices, focusing on interactions and communication flows among organizations engaged in responding to the disaster. These events demonstrate the compelling need to rethink catastrophe.
 
Top-cited authors
Brian Chiou
  • State of California
Kenneth W. Campbell
  • CoreLogic, Inc.
Yousef Bozorgnia
  • University of California, Berkeley
Jonathan Paul Stewart
  • University of California, Los Angeles
Andrei M Reinhorn
  • University at Buffalo, The State University of New York