Article

A GIS-based decision making model using fuzzy sets and theory of evidence for seismic vulnerability assessment under uncertainty (case study: Tabriz)

Authors:
  • International Institute of Earthquake Engineering and Seismology (IIEES)

Abstract

Seismic vulnerability assessment is a critical topic in disaster management. It is a complex, uncertain spatial decision-making problem owing to the lack of complete data, the vagueness of experts' comments, and uncertainties in the numerical data/relations. This paper presents a new Geospatial Information System (GIS)-based multi-criteria decision-making (MCDM) method developed for predicting building damage prior to the occurrence of a potential earthquake scenario; the method considers different sources of uncertainty in order to make realistic assessments. The developed method suggests an approximate reasoning approach using Fuzzy Sets Theory (FST) and an enhanced Dempster-Shafer Theory (DST). FST handles the vagueness of the heuristic knowledge on 'importance weights of the selected criteria' and 'the relationship of the criteria with physical seismic vulnerability (PSV)'. The enhanced DST is used to fuse the information while taking into account the reliability of the adopted criteria. The proposed method's applicability is tested on existing buildings of a municipality district of Tabriz, a historical and earthquake-prone city in Iran. The implementation results confirm that the proposed method is a pragmatic, rational and simple model which reduces the uncertainties of PSV assessment (PSVA) and provides the realistic predictions essential for assisting planners and administrators in reducing future earthquake losses in urban areas.
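For a concrete picture of the FST step, the sketch below shows one plausible way to turn linguistic importance judgements into crisp criterion weights using triangular fuzzy numbers with centroid defuzzification. The linguistic scale, criteria and judgements are invented for illustration; the paper's actual membership functions are not reproduced here.

```python
# A minimal sketch, assuming a three-term linguistic scale encoded as
# triangular fuzzy numbers (a, b, c). Criteria and judgements are hypothetical.
SCALE = {
    "low":    (0.0, 0.1, 0.3),
    "medium": (0.3, 0.5, 0.7),
    "high":   (0.7, 0.9, 1.0),
}

def centroid(tfn):
    """Centroid defuzzification of a triangular fuzzy number."""
    a, b, c = tfn
    return (a + b + c) / 3.0

# Hypothetical expert judgements on criterion importance.
judgements = {"soil_type": "high", "building_age": "medium", "slope": "low"}

crisp = {k: centroid(SCALE[v]) for k, v in judgements.items()}
total = sum(crisp.values())
weights = {k: v / total for k, v in crisp.items()}  # normalise to sum to 1
print(weights)
```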


... If the catastrophic effects of earthquakes are estimated in advance, human and property losses can be reduced through suitable and timely planning in the mitigation and preparedness stages. However, it is crucial to consider the incorporated uncertainties in seismic vulnerability assessment to obtain realistic information and thus efficiently reduce potential future losses [2]. This research first studies the knowledge-based uncertainties associated with a geographic information system (GIS)-based model for assessing building damage (also called physical seismic vulnerability assessment (PSVA)), with emphasis on the inconsistency involved. ...
... It is very important to propose simple, efficient and easily applicable methods for physical and social seismic vulnerability assessment as a decision support tool in such earthquake-prone areas, considering and subsuming the incorporated uncertainties [3]. To handle the incorporated knowledge-based uncertainties in seismic vulnerability assessment efficiently and realistically, an integration of fuzzy sets theory (FST) and Dempster-Shafer theory (DST) has been introduced [2]. However, because the Dempster combination rule is not efficient when pieces of evidence conflict, the Shafer discounting rule was used during information fusion. ...
... The Dempster combination rule with discounting coefficients [38] has previously been used to combine BPAs, namely m({NS}) (beliefs), m({IS}) (disbeliefs) and m({IS, NS}) (ignorance uncertainties) [2]. In order to study how conflicts between pieces of evidence influence the outputs of PSVA during combination, in this paper the Shafer discounting coefficients are eliminated and the PSVA results are compared with the outputs of the enhanced Dempster combination rule that uses the Shafer discounting coefficients. ...
Article
Full-text available
Earthquake is one of the natural disasters which threaten many lives every year. It is impossible to prevent earthquakes from occurring; however, it is possible to predict building damage and human and property losses in advance to mitigate the adverse effects of the catastrophe. Seismic vulnerability assessment is a complex, uncertain spatial decision making problem owing to intrinsic uncertainties such as the lack of complete data, vagueness in experts' comments and uncertainties in the numerical data/relations. It is important to identify and model the incorporated uncertainties of seismic vulnerability assessment in order to obtain realistic predictions. Fuzzy sets theory can model the vagueness in the weights of the selected criteria and the relationships of the criteria with building damage. Dempster's combination rule is useful for fusing information on the vulnerability of buildings, which decreases the uncertainty of the results. However, when there is conflict among information sources, the classical Dempster rule of combination is not efficient. This paper analyses the uncertainty sources in a geospatial information system (GIS)-based seismic vulnerability assessment of buildings and then focuses on assessing the efficiency of the Dempster rule of combination in fusing the information sources for seismic vulnerability assessment. Tabriz, a historical and earthquake-prone city in the north west of Iran, was selected as the study area. The results verified that some inconsistencies exist among the information sources, and these are important to consider when proposing a method for fusing the information in order to obtain vulnerability assessments with less uncertainty. Based on the assessed building damage, the number of probable victims was estimated. The produced physical and social seismic vulnerability maps provide the information required by urban planners and administrators to reduce property and human losses efficiently through pre-earthquake mitigation and preparedness plans.
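The conflict sensitivity discussed above is easy to reproduce. The following self-contained sketch implements Dempster's rule over the two-hypothesis frame {IS, NS} (vulnerable / not vulnerable) together with Shafer's discounting; the mass values and the reliability factor are invented, and only the combination mechanics follow the theory.

```python
from itertools import product

FRAME = frozenset({"IS", "NS"})

def dempster(m1, m2):
    """Combine two basic probability assignments with Dempster's rule."""
    raw, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            raw[inter] = raw.get(inter, 0.0) + x * y
        else:
            conflict += x * y  # mass falling on the empty set
    return {s: v / (1.0 - conflict) for s, v in raw.items()}, conflict

def discount(m, alpha):
    """Shafer discounting: scale masses by reliability alpha and move the
    remainder to total ignorance (the whole frame)."""
    out = {s: alpha * v for s, v in m.items()}
    out[FRAME] = out.get(FRAME, 0.0) + (1.0 - alpha)
    return out

IS, NS = frozenset({"IS"}), frozenset({"NS"})
m1 = {IS: 0.9, NS: 0.05, FRAME: 0.05}   # source 1: strongly "vulnerable"
m2 = {NS: 0.9, IS: 0.05, FRAME: 0.05}   # source 2: strongly "not vulnerable"

combined, k = dempster(m1, m2)
print("conflict:", round(k, 3), "combined:", combined)
combined_d, _ = dempster(discount(m1, 0.7), discount(m2, 0.7))
print("with discounting:", combined_d)
```

With two strongly conflicting sources, most of the product mass lands on the empty set and the renormalisation step distorts the result; discounting both sources first moves part of their mass to ignorance and softens the combination.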
... Seismic vulnerability assessment studies commonly analyze case studies using a combination of multi-criteria decision-making (MCDM) and geographic information system (GIS) approaches [5][6][7]. Among these, the analytical hierarchy process (AHP) is one of the most widely known MCDM methodologies; it stratifies and quantifies the importance of each applied influential factor to determine its relative importance, and assesses vulnerability by applying weights to all factors [8][9][10][11][12]. ...
Article
Full-text available
The main purpose of this study was to compare the prediction accuracies of various seismic vulnerability assessment and mapping methods. We applied the frequency ratio (FR), decision tree (DT), and random forest (RF) methods to seismic data for Gyeongju, South Korea. A magnitude 5.8 earthquake occurred in Gyeongju on 12 September 2016. Buildings damaged during the earthquake were used as dependent variables, and 18 sub-indicators related to seismic vulnerability were used as independent variables. Seismic data were used to construct a model for each method, and the models’ results and prediction accuracies were validated using receiver operating characteristic (ROC) curves. The success rates of the FR, DT, and RF models were 0.661, 0.899, and 1.000, and their prediction rates were 0.655, 0.851, and 0.949, respectively. The importance of each indicator was determined, and the peak ground acceleration (PGA) and distance to epicenter were found to have the greatest impact on seismic vulnerability in the DT and RF models. The constructed models were applied to all buildings in Gyeongju to derive prediction values, which were then normalized to between 0 and 1, and then divided into five classes at equal intervals to create seismic vulnerability maps. An analysis of the class distribution of building damage in each of the 23 administrative districts showed that district 15 (Wolseong) was the most vulnerable area and districts 2 (Gangdong), 18 (Yangbuk), and 23 (Yangnam) were the safest areas.
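As a rough illustration of the workflow above (model fitting plus ROC validation), here is a minimal scikit-learn sketch on synthetic stand-in data; the real study used 18 building and site indicators with observed damage labels for Gyeongju, none of which are reproduced here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 18))           # 18 stand-in vulnerability indicators
logits = X[:, 0] * 1.5 + X[:, 1]          # assume two indicators dominate
y = (logits + rng.normal(size=1000) > 0).astype(int)  # 1 = damaged building

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# "Prediction rate" analogue: AUC of the ROC curve on held-out buildings.
auc = roc_auc_score(y_te, rf.predict_proba(X_te)[:, 1])
print("prediction-rate AUC:", round(auc, 3))
print("top feature indices:", np.argsort(rf.feature_importances_)[::-1][:3])
```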
... Concerning the necessity of assessing building damage in earthquake-prone areas, some studies have treated 'estimating building damage' as a decision making problem (Armaş, 2012; Moradi et al., 2014; Rezaie and Panahi, 2015). However, the decision making process for estimating building damage involves some uncertainties (Sadrykia et al., 2017b). In this paper two important aspects of seismic vulnerability assessment are considered: first, realistically estimating the severity of damage caused by a hypothetical earthquake scenario to existing buildings; and second, estimating the injured/killed people based on the estimated building damage, called social vulnerability. ...
Conference Paper
Earthquakes are natural disasters that threaten many lives every year. It is impossible to prevent earthquakes from occurring, but it is possible to predict the human and property losses in advance in order to mitigate the adverse effects of the catastrophe. However, seismic vulnerability assessment is a complex, uncertain spatial decision making problem owing to intrinsic uncertainties such as the lack of complete data, the vagueness of experts' comments and uncertainties in the numerical data/relations. The developed method uses Fuzzy Sets Theory to model the vagueness in the 'weights of the selected criteria' and the 'relationships of the criteria with building damage', while Dempster-Shafer Theory is used to deal with the incompleteness of the data. Since fusion is a way to obtain more reliable results when uncertain and incomplete data exist, the Dempster combination rule is suggested for combining independent pieces of information on the vulnerability of buildings, decreasing the uncertainty of the results. The proposed method's applicability is tested on existing buildings of a municipality district of Tabriz, a historical, earthquake-prone city in Iran. Physical seismic vulnerability maps are produced representing the distribution and severity of the estimated damage to buildings. The number of injured and killed people is estimated based on the assessed building damage. It can be concluded that this paper contributes to a pragmatic and rational modelling of physical and social vulnerability in earthquake-prone areas with incomplete data, providing the information necessary for urban planners and administrators to reduce earthquake losses through disaster mitigation and preparedness plans.
... Xie et al. [21] proposed a stochastic decision-making intuitionistic fuzzy method applying prospect theory and gray relational analysis, whose effectiveness and feasibility were verified with an example. Sadrykia et al. [22] proposed a new multi-criteria decision-making (MCDM) method based on a geospatial information system, which was applied to predict the extent of building damage before a potential earthquake occurs. ...
Article
Full-text available
The running state of bearings affects the normal operation of mechanical equipment, so carrying out bearing fault diagnosis is of great theoretical and practical value. In bearing fault diagnosis research, the extraction and selection of fault features can help improve diagnostic accuracy. However, existing studies suffer from the following weaknesses: (1) the high dimensionality of the selected features and (2) the uncertainty of a single sensor for data sampling. Therefore, in this paper a feature selection feedback network (FSFN) is proposed to overcome the first weakness. At the same time, we propose an improved Dempster-Shafer (IDS) evidence theory fusion method based on the kappa coefficient to deal with the second weakness. Extensive evaluations of the proposed method on the CUT-2 experimental platform dataset showed that the FSFN can not only reduce the dimensionality of the final selected features without decreasing diagnostic accuracy but also shorten the feature selection time. Moreover, compared with the existing DS evidence theory fusion method, the IDS can achieve higher average fusion precision and improve the accuracy and reliability of bearing fault diagnosis.
... Remotely sensed data can cover a large area to retrieve spectral information in real time during the crop growing period [13]. Landsat missions have been collecting and archiving imagery worldwide with multispectral sensors, providing insights into plant response to solar radiation and opening the era of remote vegetation indices (VIs). ...
Article
Full-text available
Assessing crop yield trends over years is a key step in site-specific management, with a view to improving the economic and environmental profile of agriculture. This study was conducted in an 11.07 ha area under a Mediterranean climate in Northern Italy to evaluate the spatial variability of, and the relationships between, six remotely sensed vegetation indices (VIs) and grain yield (GY) in five consecutive years. A total of 25 satellite (Landsat 5, 7, and 8) images were downloaded during crop growth to obtain the following VIs: Normalized Difference Vegetation Index (NDVI), Enhanced Vegetation Index (EVI), Soil Adjusted Vegetation Index (SAVI), Green Normalized Difference Vegetation Index (GNDVI), Green Chlorophyll Index (GCI), and Simple Ratio (SR). The surveyed crops were durum wheat in 2010, sunflower in 2011, bread wheat in 2012 and 2014, and coriander in 2013. Geo-referenced GY and VI data were used to generate spatial trend maps across the experimental field through geostatistical analysis. Crop stages featuring the best correlations between VIs and GY at the same spatial resolution (30 m) were acknowledged as the best periods for GY prediction. On this basis, 2-4 VIs were selected each year, totalling 15 VIs across the five years, with r values against GY between 0.729** and 0.935**. SR and NDVI were chosen most frequently (six and four times, respectively) across stages from mid vegetative to mid reproductive growth. Conversely, SAVI never had correlations high enough to be selected. Correspondence analysis between remote VIs and GY based on quantile ranking in the 126 (30 m size) pixels exhibited a final agreement between 64% and 86%. Therefore, Landsat imagery, with its spatial and temporal resolution, showed good potential for estimating final GY over different crops in a rotation at a relatively small field scale.
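The indices named in the abstract are simple band combinations; the sketch below computes them from stand-in reflectance arrays using their standard definitions (SAVI with soil factor L = 0.5; EVI with the usual G = 2.5, C1 = 6, C2 = 7.5, L = 1 coefficients). Real Landsat surface-reflectance bands would replace the synthetic arrays.

```python
import numpy as np

rng = np.random.default_rng(0)
red   = rng.uniform(0.02, 0.20, (100, 100))   # synthetic reflectance rasters
nir   = rng.uniform(0.20, 0.50, (100, 100))
blue  = rng.uniform(0.01, 0.10, (100, 100))
green = rng.uniform(0.03, 0.25, (100, 100))

ndvi  = (nir - red) / (nir + red)
savi  = 1.5 * (nir - red) / (nir + red + 0.5)            # L = 0.5
evi   = 2.5 * (nir - red) / (nir + 6 * red - 7.5 * blue + 1)
gndvi = (nir - green) / (nir + green)
gci   = nir / green - 1
sr    = nir / red

print("mean NDVI:", ndvi.mean().round(3), "mean SR:", sr.mean().round(2))
```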
... Fuzzy logic methods are able to deal with approximate and inaccurate values and are capable of capturing non-linear relationships between them [21], and so they can be applied to moving point object data sets [30]. The plethora of research contributions on exploiting fuzzy logic approaches demonstrates their applicability in movement applications [26] such as classifying human activity patterns [30], detecting travel transport modes [8], clustering mobile object trajectories [15] and eye movement [7] to name but a few. ...
Article
Movements of objects take place in different contexts and their trajectories are highly influenced by the contexts. Several studies have been conducted in the last decade on similarity measuring of raw trajectories, but very few have used context information in this process. Because the context information is collected from multifarious sources, it is qualitatively and quantitatively heterogeneous and uncertain. Therefore, the current distance functions are unable to measure the similarities between trajectories by considering the heterogeneous context information. This article presents a new context-aware hybrid fuzzy model, named CaFIRST, to measure the similarity of trajectories by considering not only the spatial footprints of moving objects but also various types of internal and external context information. CaFIRST is able to handle multi-size trajectories that are contextually enriched by both quantitative (numeric) and qualitative (descriptive) values. The performance of CaFIRST was examined using two real data sets, obtained from pedestrians and cyclists in New York City, USA. The results showed the robustness of CaFIRST for quantifying the commonalities in multivariate trajectories and its sensitivity to small alterations in context information. Furthermore, the effects of internal and external context information on similarity values are shown to be remarkable.
... Satellite data in combination with Geographic Information Systems (GIS) and the Global Positioning System (GPS) have been used by researchers, in the past as well as today, to gather the data required for monitoring crops, managing resources and making decisions on farming practices [4]. There is a pressing need to generate and update the data required for analysis and to automate the procedure of crop inventory using recent technologies such as geoinformatics [5]. ...
Article
Full-text available
Crop growth monitoring and yield estimation are important for agricultural economic return prediction and food security. Improving the accuracy and timeliness of pre-harvest crop prediction by blending ancillary and remotely sensed data in the temporal domain leads to effective and optimized decision making. Although previous studies have found strong correlations between observed and predicted yield based on the Normalized Difference Vegetation Index (NDVI), there is still a pressing need for more accurate and reliable yield estimates in small areas. The objective of the proposed model is to extract crop yield information for sugarcane planted in the small fields of the Himalayan foothills region. The relation of crop yield information to the different stages of crop growth is considered in the proposed model. A process based on correlation analysis of spatiotemporal data is presented to identify the best periods for reliable estimation of sugarcane yield. The best period for crop predictability is 210 to 270 days after planting. Based on ten years of historical data, the predictability of nonlinear modelling has been found to be significant, of the order of 0.6. The best-fit regression models for the study area are identified using the Tukey test and RMSE.
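As a toy version of the correlation-and-regression step described above, the sketch below fits a quadratic model between a vegetation index and yield and scores it with RMSE and r; the data are synthetic, and the study's actual regressors and Tukey-test procedure are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)
ndvi = rng.uniform(0.3, 0.8, 60)                    # predictor at the best period
yield_t = 40 + 60 * ndvi - 20 * ndvi**2 + rng.normal(0, 3, 60)  # t/ha, synthetic

coeffs = np.polyfit(ndvi, yield_t, deg=2)           # nonlinear (quadratic) model
pred = np.polyval(coeffs, ndvi)
rmse = np.sqrt(np.mean((yield_t - pred) ** 2))
r = np.corrcoef(yield_t, pred)[0, 1]
print(f"RMSE = {rmse:.2f} t/ha, r = {r:.2f}")
```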
Preprint
Full-text available
Population growth causes urban spatial expansion and harms ecological units on the periphery of cities. Determining the growth trends of an urban area is vital for developing predictive planning techniques, defining manageable urban processes, directing investments to be made in the city, and increasing quality of life through balance between the natural and built environment. A holistic planning tool is needed for determining the dynamics affecting the expansion directions of the urban area, identifying possible urban growth areas, detecting problems that may occur in the future and finding solutions to them. The scope of this study is to build a model that predicts the possible urban expansion areas. The model is developed in line with main criteria defined as proximity, natural environments, built-up environments, and plan decisions. The weights of each criterion and the related sub-criteria were determined with the analytical hierarchy process (AHP). As a result of the study, the most probable urban development areas that will serve the 2040 projection population of the city of Saray were determined. This study aims to predict the growth direction of the urban area and determine the areas under construction pressure depending on the city's current dynamics. Thus, a practical urban growth estimation model has been put forward for future planning studies. The model results show that the city of Saray is inclined to continue its urban form as mono-centric and compact in the year 2040.
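Since AHP supplies the weights here, a compact sketch of the principal-eigenvector weighting step with Saaty's consistency ratio may help; the 3x3 pairwise comparison matrix below is an invented example, not the study's judgements.

```python
import numpy as np

A = np.array([[1,   3,   5],
              [1/3, 1,   2],
              [1/5, 1/2, 1]])            # pairwise importance judgements

eigvals, eigvecs = np.linalg.eig(A)
i = int(np.argmax(eigvals.real))
w = np.abs(eigvecs[:, i].real)
w /= w.sum()                              # criterion weights

n = A.shape[0]
ci = (eigvals.real[i] - n) / (n - 1)      # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]       # Saaty's random index
print("weights:", w.round(3), "CR:", round(ci / ri, 3))  # CR < 0.1 is acceptable
```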
Article
To reduce human losses and minimize social and economic disruption caused by large-scale earthquakes, effective planning and operational decisions need to be made by responsible agencies and institutions across all pre- and post-disaster stages. Operations Research (OR), which encompasses a broad array of quantitative and analytical methods for systematic decision making, has garnered considerable attention in the disaster operations management literature over the past few decades. The purpose of this review is to highlight and discuss the main lines of research involving the use of OR techniques applied specifically to earthquake disasters. As part of our review, we identify existing research gaps and propose a roadmap to guide future work and enhance the real-world applicability of OR to earthquake operations management. We emphasize the need for (i) developing models that are specifically tailored to earthquake operations management, including the need to contend with cascading effects and secondary disasters caused by aftershocks; (ii) greater stakeholder involvement in problem identification and methodological approach to enhance realism and the adoption of OR models by practitioners; (iii) more holistic planning frameworks that combine decision making across multiple disaster stages; (iv) integration of OR methods with real- and near real-time information systems, while confronting the problem of dealing with missing and incomplete data; (v) greater use of multi-methodology and interdisciplinary approaches, including behavioral OR and Soft OR techniques as well as seismology and earthquake engineering expertise; and (vi) improved data generation defined at appropriate scales and better probability estimation of earthquake scenarios.
Chapter
Time and again, unforeseen disasters shake the world and cause enormous humanitarian and economic damage. After a disaster, the actors in relief operations often lack information about the exact situation on the ground, which complicates needs-based care for those affected, including the procurement of all required relief supplies and equipment. By means of a systematic literature review, this chapter records in what form and to what extent Geographic Information Systems (GIS) can be used to support humanitarian logistics. Various possible applications of GIS are identified, e.g. the creation of situation assessments after a disaster or the siting of relief-goods distribution centres in disaster preparedness. The results of the review are presented within an application-oriented framework. In addition, actors in humanitarian procurement logistics are offered a comprehensive overview of current and potential future uses of GIS.
Article
Full-text available
Both aleatory and epistemic uncertainties associated with different sources and components of risk (hazard, exposure, vulnerability) are present at each step of a seismic risk assessment. All individual sources of uncertainty contribute to the total uncertainty, which might be very high and, within the decision-making context, may therefore lead either to very conservative and expensive decisions or to the perception of considerable risk. When anatomizing the structure of the total uncertainty, it is therefore important to propagate the different individual uncertainties through the computational chain and to quantify their contribution to the total value of risk. The present study analyses different uncertainties associated with the hazard, vulnerability and loss components through the use of logic trees. The emphasis is on the analysis of epistemic uncertainties, which represent the reducible part of the total uncertainty, including a sensitivity analysis of the resulting seismic risk assessments with regard to the different uncertainty sources. This investigation, part of the EU FP7 project MATRIX (New Multi-Hazard and Multi-Risk Assessment Methods for Europe), is carried out for the example of, and with reference to, the conditions of the city of Cologne, Germany, one of the MATRIX test cases. At the same time, this particular study does not aim to revise or refine the hazard and risk level for Cologne; it rather shows how large the existing uncertainties are and how they can influence seismic risk estimates, especially in less well-studied areas, if hazard and risk models adapted from other regions are used.
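The logic-tree propagation idea can be shown in a few lines: each branch pairs a model alternative with a weight, and the weighted spread of the resulting estimates pictures the epistemic part of the total uncertainty. All branch values, weights and the stand-in loss model below are invented.

```python
from itertools import product

hazard_branches = [(0.2, 0.5), (0.3, 0.5)]              # (PGA in g, weight)
vuln_branches   = [(1.5, 0.3), (2.0, 0.4), (2.5, 0.3)]  # (fragility slope, weight)

losses = []
for (pga, wh), (slope, wv) in product(hazard_branches, vuln_branches):
    loss = slope * pga * 100.0      # stand-in loss model, % of exposed value
    losses.append((loss, wh * wv))  # branch weight = product of branch weights

mean = sum(l * w for l, w in losses)
spread = max(l for l, _ in losses) - min(l for l, _ in losses)
print(f"weighted mean loss: {mean:.1f}%  epistemic spread: {spread:.1f}%")
```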
Article
Full-text available
The main issue in determining seismic vulnerability is having a comprehensive view of all probable damages related to an earthquake occurrence. Taking into account factors such as peak ground acceleration (PGA) at the time of the earthquake, the type of structures, the population distribution among different age groups, the level of education, the physical distance to hospitals (or medical care centres), etc., categorized under four indicators (geotechnical, structural, social, and physical distance to needed facilities and distance from dangerous ones), provides a better and more exact outcome. To this end, in this paper the importance of the criteria and alternatives is determined using the analytic hierarchy process (AHP), and the vulnerability of the Tehran metropolis to an earthquake is studied using a geographical information system (GIS). This study focuses on the fact that Tehran is surrounded by three major active faults: Mosha, North Tehran and Rey. In order to determine the vulnerability comprehensively, three scenarios are developed. In each scenario, the seismic vulnerability of different areas of Tehran is analysed and classified into four levels: high, medium, low and safe. The results show that, with regard to seismic vulnerability, the Mosha, North Tehran and Rey faults make 6, 16 and 10% of Tehran's area highly vulnerable, respectively, while 34, 14 and 27% remain safe.
Article
Full-text available
One of the most important problems in Iranian cities is their vulnerability to natural hazards such as earthquakes, storms, etc. This is particularly critical in the old parts of cities owing to the poor condition of their physical fabric in terms of material quality, low permeability, lack of renovation, etc. Purpose: We aim to identify the most deteriorated parts of the old texture of the city of Zanjan using GIS and related techniques. We also try to recognize the main reasons for these deterioration problems. Methods: To do this, we first conceptualize eleven physical-spatial factors. These factors are analyzed within GIS using fuzzy logic and the Inverse Hierarchy Process Weight (IHPW). Results and conclusions: The results of the model, applied to the old fabric of the city of Zanjan, illustrate that a fuzzy approach is a basic tool for identifying vulnerability. Its application to the problem helps to unify relevant theory and practice. It also generated maps of vulnerable spots that are more adaptable to change and can greatly assist planners and policy makers in decision-making.
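A fuzzy GIS overlay of the kind such studies rely on can be sketched as follows: criterion rasters are fuzzified to [0, 1] memberships and combined with the fuzzy gamma operator. The rasters, membership functions and gamma value are assumptions for illustration, not the Zanjan model.

```python
import numpy as np

rng = np.random.default_rng(2)
building_age = rng.uniform(0, 100, (50, 50))     # years, synthetic raster
permeability = rng.uniform(0, 1, (50, 50))       # access index, synthetic raster

mu_age  = np.clip(building_age / 80.0, 0, 1)     # linear membership: older = worse
mu_perm = 1.0 - permeability                     # less permeable = worse

gamma = 0.7
algebraic_sum  = 1 - (1 - mu_age) * (1 - mu_perm)
algebraic_prod = mu_age * mu_perm
# Fuzzy gamma operator: compromise between fuzzy sum and fuzzy product.
vulnerability = algebraic_sum**gamma * algebraic_prod**(1 - gamma)
print("share of highly vulnerable cells:", (vulnerability > 0.8).mean().round(3))
```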
Article
Full-text available
Tehran, the capital of Iran, is located on a number of known and unknown faults, which exposes this mega city to huge earthquakes. Determining the location and intensity of seismic vulnerability across a city is a complicated disaster management problem. As this problem generally depends on various criteria, one of the most important challenges is the uncertainty arising from inconsistency in combining those effective criteria. Uncertainty in the seismic vulnerability map leads to biases in risk management, with multilateral effects on dealing with the consequences of the earthquake. To overcome this problem, this paper proposes a new approach to Tehran's seismic vulnerability classification based on granular computing. One of the most significant properties of this method is the inference of accurate rules with zero entropy from a predefined classification prepared by the expert on the basis of training datasets. Furthermore, non-redundant covering rules are extracted for consistent classification, where one object may be classified by two or more non-redundant rules. In this paper, Tehran's statistical zones (3,173 according to the 1996 census) are considered as the study area. Since this city has not experienced a disastrous earthquake since 1830, the results are only relatively accurate with respect to the results of previous studies.
Article
Full-text available
The European Commission funded the RISK-UE project in 1999 with the aim of providing an advanced approach to earthquake risk scenarios for European towns and regions. In the framework of the Risk-UE project, two methods, originally derived and calibrated by the authors, were proposed for the vulnerability assessment of current buildings and for the evaluation of earthquake risk scenarios: a macroseismic model, to be used with macroseismic intensity hazard maps, and a mechanically based model, to be applied when the hazard is provided in terms of peak ground accelerations and spectral values. The vulnerability of the buildings is defined by vulnerability curves within the macroseismic method and in terms of capacity curves within the mechanical method. In this paper, the development of both the vulnerability and capacity curves is presented with reference to an assumed typological classification system; moreover, their cross-validation is presented. The parameters of the two methods and the steps for their operative implementation are provided in the paper.
Article
Full-text available
We present the results of a new generation of probabilistic seismic hazard assessment for Switzerland. This study replaces the previous intensity-based generation of national hazard maps of 1978. Based on a revised moment-magnitude earthquake catalog for Switzerland and the surrounding regions, covering the period 1300-2003, sets of recurrence parameters (a and b values, Mmax) are estimated. Information on active faulting in Switzerland is too sparse to be used as a source model. We instead develop two models of areal sources: the first oriented towards capturing historical and instrumental seismicity, the second guided largely by tectonic principles and expressing the alternative view that seismicity is less stationary and thus future activity may occur in previously quiet regions. To estimate three alternative a and b value sets and their relative weighting, we introduce a novel approach based on the modified Akaike information criterion, which allows us to decide when the data in a zone deserve to be fitted with a zone-specific b value. From these input parameters, we simulate synthetic earthquake catalogs of one-million-year duration down to magnitude 4.0, which also reflect the difference in depth distribution between the Alpine Foreland and the Alps. Using a predictive spectral ground motion model specific to Switzerland, we estimate expected ground motions in units of the 5% damped acceleration response spectrum at frequencies of 0.5-10 Hz for all of Switzerland, referenced to rock sites with an estimated shear wave velocity of 1,500 m/s in the upper 30 m. The highest hazard is found in the Wallis, in the Basel region, in Graubünden and along the Alpine front, with maximum spectral accelerations at 5 Hz frequency reaching 150 cm/s² for a return period of 475 years and 720 cm/s² for 10,000 years.
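The catalog-simulation step lends itself to a short sketch: draw the event count from a Poisson process at the Gutenberg-Richter rate and magnitudes from the doubly truncated exponential distribution. The a/b values, Mmax and duration below are invented, not the Swiss model's.

```python
import numpy as np

rng = np.random.default_rng(3)
a, b, m_min, m_max, years = 4.0, 1.0, 4.0, 7.0, 1_000_000

rate = 10 ** (a - b * m_min)                 # annual rate of M >= m_min events
n = rng.poisson(rate * years)

# Magnitudes from the doubly truncated Gutenberg-Richter distribution
# (inverse-CDF sampling of a truncated exponential).
beta = b * np.log(10)
u = rng.uniform(size=n)
mags = m_min - np.log(1 - u * (1 - np.exp(-beta * (m_max - m_min)))) / beta

print(f"{n} events; largest simulated magnitude: {mags.max():.2f}")
```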
Article
Full-text available
Modern earthquake ground motion hazard mapping in California began following the 1971 San Fernando earthquake in the Los Angeles metropolitan area of southern California. Earthquake hazard assessment followed a traditional approach, later called Deterministic Seismic Hazard Analysis (DSHA) to distinguish it from the newer Probabilistic Seismic Hazard Analysis (PSHA). In DSHA, the seismic hazard in the event of the Maximum Credible Earthquake (MCE) magnitude from each of the known seismogenic faults within and near the state is assessed. The likely occurrence of the MCE has been assumed qualitatively, using late Quaternary and younger faults presumed to be seismogenic, without specifying when or within what time interval the MCE may occur. The MCE is the largest or upper-bound potential earthquake in moment magnitude; it supersedes and automatically considers all other possible earthquakes on that fault. That moment magnitude is used for estimating ground motions by applying it to empirical attenuation relationships, and for calculating ground motions as in neo-DSHA (Zuccolo et al., 2008). The first deterministic California earthquake hazard map was published in 1974 by the California Division of Mines and Geology (CDMG), called the California Geological Survey (CGS) since 2002, using the best fault information and ground motion attenuation relationships available at that time. The California Department of Transportation (Caltrans) later assumed responsibility for printing the refined and updated peak acceleration contour maps, which were heavily utilized by geologists, seismologists, and engineers for many years. Some engineers involved in siting large, important projects, for example dams and nuclear power plants, continued to challenge the maps. The second edition map was completed in 1985, incorporating more faults, improving the MCE estimation method, and using new ground motion attenuation relationships from the latest results published at that time. CDMG eventually published the second edition map in 1992, following the Governor's Board of Inquiry on the 1989 Loma Prieta earthquake and at the demand of Caltrans. The third edition map was published by Caltrans in 1996, utilizing GIS technology to manage data that include a simplified three-dimensional geometry of faults and to facilitate efficient corrections and revisions of the data and the map. The spatial relationship of fault hazards with highways, bridges or any other attribute can now be efficiently managed and analyzed in GIS at Caltrans. There has been great confidence in using DSHA in bridge engineering and other applications in California, and it can be confidently applied in any other earthquake-prone region. Earthquake hazards defined by DSHA are: (1) transparent and stable, with robust MCE moment magnitudes; (2) flexible in their application to design considerations; (3) able to easily incorporate advances in ground motion simulations; and (4) economical. DSHA and neo-DSHA have the same approach and applicability. The accuracy of DSHA has proven quite reasonable for practical applications within engineering design, always exercised with professional judgment. In the final analysis, DSHA is a reality check for public safety and for PSHA results. Although PSHA has been acclaimed as a better approach for seismic hazard assessment, it is DSHA, not PSHA, that has actually been used in seismic hazard assessment for building and bridge engineering, particularly in California.
Article
Full-text available
The vulnerability index method, in its version developed in the framework of the European Risk-UE project, has been adapted and applied in this article to evaluate the seismic risk for the city of Barcelona (Spain) through a GIS-based tool. According to this method, which defines five damage states, the action is expressed in terms of macroseismic intensity and the seismic quality of the buildings by means of a vulnerability index. The probabilities of the damage states are obtained considering a binomial or beta-equivalent probability distribution. The most relevant seismic risk evaluation results obtained for current buildings and monuments of Barcelona are given in the article as scenarios of expected losses.
Keywords: Seismic hazard, Seismic vulnerability, Risk scenarios, Loss estimation, Urban areas, GIS
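For a concrete sense of the vulnerability-index mechanics, the sketch below pairs the closed-form mean damage grade used in Risk-UE-type macroseismic studies (the Giovinazzi-Lagomarsino tanh expression) with the binomial distribution over damage states mentioned above; the intensity and vulnerability-index inputs are illustrative.

```python
import math

def mean_damage_grade(I, V):
    """Mean damage grade mu_D in [0, 5] for intensity I and vulnerability V."""
    return 2.5 * (1 + math.tanh((I + 6.25 * V - 13.1) / 2.3))

def binomial_damage_probs(mu_d, n=5):
    """P(damage grade k), k = 0..5: binomial distribution with mean mu_d."""
    p = mu_d / n
    return [math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

mu = mean_damage_grade(I=8.0, V=0.8)     # e.g. intensity VIII, vulnerable masonry
print(f"mu_D = {mu:.2f}")
for k, pk in enumerate(binomial_damage_probs(mu)):
    print(f"  D{k}: {pk:.3f}")
```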
Article
Full-text available
This article contributes to the development and application of two latest-generation methods of seismic risk analysis in urban areas. The first method, the vulnerability index method (VIM), considers five non-null damage states, defines the action in terms of macroseismic intensity, and describes the seismic quality of the building by means of a vulnerability index; the estimated damage degree is measured by semi-empirical functions. The second method, the capacity spectrum based method (CSBM), considers four non-null damage states, defines the seismic action in terms of response spectra, and describes the building vulnerability by means of its capacity spectrum. In order to apply both methods to Barcelona (Spain) and compare the results, a deterministic and a probabilistic hazard scenario with soil effects are used. The deterministic one corresponds to a historic earthquake, while the probabilistic seismic ground motion has a probability of exceedance of 10% in 50 years. Detailed information on the building design has been obtained over the years by collecting, arranging, improving, and completing the database of the dwellings of the city. A Geographic Information System (GIS) has been customized, allowing this large amount of spatial and tabular dwelling data to be stored, analysed, and displayed. The obtained results are highly consistent with the historical and modern evolution of the populated area and show the validity and strength of both methods. Although Barcelona has a low to moderate seismic hazard, its expected seismic risk is significant because of the high vulnerability of its buildings. Cities such as Barcelona, located in a low to moderate seismic hazard region, are usually not aware of the seismic risk. The detailed risk maps obtained offer a great opportunity to guide decision making in the field of seismic risk prevention and mitigation in Barcelona, and for emergency planning in the city.
Article
Full-text available
Seismic risk evaluation of built-up areas involves analysis of the level of earthquake hazard of the region, building vulnerability and exposure. Within this approach, which defines seismic risk, building vulnerability assessment assumes great importance, not only because of the obvious physical consequences in the eventual occurrence of a seismic event, but also because it is one of the few aspects in which engineering research can intervene. In fact, rigorous vulnerability assessment of existing buildings and the implementation of appropriate retrofitting solutions can help to reduce the levels of physical damage, loss of life and the economic impact of future seismic events. Vulnerability studies of urban centres should be developed with the aim of identifying building fragilities and reducing seismic risk. As part of the rehabilitation of the historic city centre of Coimbra, a complete identification and inspection survey of old masonry buildings has been carried out. The main purpose of this research is to discuss vulnerability assessment methodologies, particularly those of the first level, through the proposal and development of a method previously used to determine the level of vulnerability, in the assessment of physical damage and its relationship with seismic intensity. Also presented and discussed are the strategy and proposed methodology adopted for the vulnerability assessment, damage and loss scenarios for the city centre of Coimbra, Portugal, using a GIS mapping application.
Keywords: Historic city centres, Traditional masonry buildings, Vulnerability, Seismic risk, Vulnerability index method, Damage scenarios, Loss estimation, GIS mapping
Article
Full-text available
This paper presents a methodology to represent and propagate epistemic uncertainties within a scenario-based earthquake risk model. Unlike randomness, epistemic uncertainty stems from incomplete, vague or imprecise information. This source of uncertainty still requires the development of adequate tools in seismic risk analysis. We propose to use possibility theory to represent three types of epistemic uncertainty, namely imprecision, model uncertainty and vagueness due to qualitative information. For illustration, an earthquake risk assessment for the city of Lourdes (Southern France) using this approach is presented. Once adequately represented, the uncertainties are propagated, resulting in a family of probabilistic damage curves. The latter is synthesized, using the concept of fuzzy random variables, by means of indicators bounding the true probability of exceeding a given damage grade. The gap between the pair of probabilistic indicators reflects the imprecise character of the model uncertainty, thus picturing the extent of what is ignored, and can be used in risk management.
Keywords: Earthquake risk analysis, Epistemic uncertainty, Possibility theory, Fuzzy logic, Fuzzy random variable
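The bounding idea at the heart of possibility theory can be illustrated directly: a possibility distribution over damage grades induces an upper bound (possibility) and a lower bound (necessity) on the probability of any event. The distribution values below are invented.

```python
# Pos(A) = max of pi over A;  Nec(A) = 1 - Pos(complement of A).
pi = {0: 0.2, 1: 0.6, 2: 1.0, 3: 0.7, 4: 0.3, 5: 0.1}  # possibility of grade k

def possibility(event):            # upper probability bound
    return max(pi[x] for x in event)

def necessity(event):              # lower probability bound
    complement = set(pi) - set(event)
    return 1.0 - possibility(complement) if complement else 1.0

severe = {3, 4, 5}                 # event: "damage grade at least 3"
print("P(severe) is bounded by",
      f"[{necessity(severe):.2f}, {possibility(severe):.2f}]")
```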
Article
The book that launched the Dempster–Shafer theory of belief functions appeared 40 years ago. This intellectual autobiography looks back on how I came to write the book and how its ideas played out in my later work.
Article
One of the most important steps in earthquake disaster management is the prediction of probable damages, called earthquake vulnerability assessment. Earthquake vulnerability assessment is a multi-criteria problem, and a number of multi-criteria decision making models have been proposed for it. Two main sources of uncertainty exist in the earthquake vulnerability assessment problem: uncertainty associated with the experts' points of view and uncertainty associated with the attribute values. If the uncertainty in these two sources is not handled properly, the resulting seismic vulnerability map will be unreliable. The main objective of this research is to propose a reliable model for earthquake vulnerability assessment which is able to manage the uncertainty associated with the experts' opinions. Granular Computing (GrC) is able to extract a set of if-then rules with minimum incompatibility from an information table. An integration of Dempster-Shafer Theory (DST) and GrC is applied in the current research to minimize the entropy in the experts' opinions. The accuracy of the model based on the integration of DST and GrC is 83%, while the accuracy of the single-expert model is 62%, which indicates the importance of uncertainty management in the seismic vulnerability assessment problem. Owing to limited access to current data, only six criteria are used in this model. However, the model is able to take into account both qualitative and quantitative criteria.
Chapter
A multivalued mapping from a space X to a space S carries a probability measure defined over subsets of X into a system of upper and lower probabilities over subsets of S. Some basic properties of such systems are explored in Sects. 1 and 2. Other approaches to upper and lower probabilities are possible and some of these are related to the present approach in Sect. 3. A distinctive feature of the present approach is a rule for conditioning, or more generally, a rule for combining sources of information, as discussed in Sects. 4 and 5. Finally, the context in statistical inference from which the present theory arose is sketched briefly in Sect. 6.
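The chapter's central construction translates almost literally into code: a mass function over subsets of S induces a lower probability (belief) and an upper probability (plausibility) on events. The masses below are illustrative.

```python
def belief(m, A):
    """Bel(A) = sum of masses of focal elements contained in A (lower bound)."""
    return sum(v for B, v in m.items() if B <= A)

def plausibility(m, A):
    """Pl(A) = sum of masses of focal elements intersecting A (upper bound)."""
    return sum(v for B, v in m.items() if B & A)

# Illustrative body of evidence over S = {a, b, c} (masses sum to 1).
m = {frozenset({"a"}): 0.4,
     frozenset({"a", "b"}): 0.3,
     frozenset({"a", "b", "c"}): 0.3}

A = frozenset({"a", "b"})
print("Bel:", belief(m, A), "Pl:", plausibility(m, A))
```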
Chapter
The problem of combining pieces of evidence issued from several sources of information turns out to be a very important issue in artificial intelligence. It is encountered in expert systems when several production rules conclude on the value of the same variable, but also in robotics when information coming from different sensors is to be aggregated. Solutions proposed in the literature so far have often been unsatisfactory because they rely on a single theory of uncertainty or a unique mode of combination, or lack an analysis of the reasons for the uncertainty. Besides, dependencies and redundancies between sources must be dealt with, especially in knowledge bases, where sources correspond to production rules.
Article
Dempster-Shafer theory of evidence is a tool for uncertainty modeling when information is provided by experts. Dempster gave a rule to combine evidence coming from different independent sources. However, Dempster's rule of combination has been criticized because it sometimes gives illogical results. Many alternatives have been proposed by different researchers. In this paper we also propose a new rule of combination. The efficiency and validity of our approach are demonstrated with numerical examples and by comparison with other existing methods.
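The article's own combination rule is not reproduced here, so the sketch below instead contrasts Dempster's rule with one well-known published alternative, Yager's rule, which assigns the conflict mass to total ignorance rather than renormalising; the masses form a Zadeh-style high-conflict example.

```python
from itertools import product

FRAME = frozenset({"a", "b", "c"})

def combine(m1, m2, yager=False):
    out, conflict = {}, 0.0
    for (A, x), (B, y) in product(m1.items(), m2.items()):
        inter = A & B
        if inter:
            out[inter] = out.get(inter, 0.0) + x * y
        else:
            conflict += x * y
    if yager:                       # Yager: conflict mass -> total ignorance
        out[FRAME] = out.get(FRAME, 0.0) + conflict
        return out
    return {S: v / (1 - conflict) for S, v in out.items()}  # Dempster

m1 = {frozenset("a"): 0.99, frozenset("b"): 0.01}
m2 = {frozenset("c"): 0.99, frozenset("b"): 0.01}
print("Dempster:", combine(m1, m2))         # all mass on the weak hypothesis b
print("Yager:   ", combine(m1, m2, True))   # conflict kept as ignorance
```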
Article
Probabilistic seismic hazard analysis (PSHA) provides the conceptual framework for estimating the likelihood that something of concern related to earthquake shaking will occur over a specified time period. Based on more than thirty years of research and development (e.g., Cornell, 1968; Algermissen et al., 1982; SSHAC, 1997), PSHA has become a standard tool for combining information on earthquake occurrence, seismic radiation, and shaking response to produce hazard estimates, including the U.S. Geological Survey's national seismic hazard maps (Frankel et al., 1996, 1997). PSHA methods, while now mature, continue to evolve as scientists improve the characterization of earthquake hazards and engineers develop new measures of seismic shaking for performance-based design. The Southern California Earthquake Center (SCEC), in collaboration with USGS, the California Geological Survey (CGS), and other partners, has undertaken a series of studies aimed at improving the regional application of PSHA methods. Phase I examined the implications of the 1992 Landers earthquake sequence for regional seismic hazards (WGCEP, 1992), Phase II developed a probabilistic earthquake forecast model (WGCEP, 1995), and Phase III assessed the wave-propagation and site effects that give rise to local variations in seismic shaking (see Field et al. [2000] for an overview). The Phase III study found that accounting for some site attributes (i.e., the 30-meter shear-wave velocity and basin depth) can lead to significant improvements in PSHA. It was also found that making such corrections does not significantly reduce the prediction uncertainty associated with empirical ground-motion relations. In fact, 3D waveform modeling presented in the same report (Olsen, 2000) implied that this residual uncertainty represents an intrinsic variability caused by complex propagation effects that are unique to each earthquake-rupture/site combination. The Phase III study therefore concluded that significant improvements in PSHA will require replacing the standard empirical-regression …
Article
Modelling has permeated virtually all areas of industrial, environmental, economic, bio-medical or civil engineering; yet the use of models for decision-making raises a number of issues to which this book is dedicated: How uncertain is my model? Is it truly valuable for supporting decision-making? What kind of decision can be truly supported, and how can I handle residual uncertainty? How refined should the mathematical description be, given the true data limitations? Could the uncertainty be reduced through more data, increased modelling investment or computational budget, and should it be reduced now or later? How robust are the analysis and the computational methods involved, and should or could those methods be more robust? Does it make sense to handle uncertainty, risk, lack of knowledge, variability or errors altogether? How reasonable is the choice of probabilistic modelling for rare events? How rare are the events to be considered? How far does it make sense to handle extreme events and elaborate confidence figures? Can I take advantage of expert phenomenological knowledge to tighten the probabilistic figures? Are there connex domains that could provide models or inspiration for my problem? Written by a leader at the crossroads of industry, academia and engineering, and based on decades of multi-disciplinary field experience, Modelling Under Risk and Uncertainty gives a self-consistent introduction to the methods involved in any type of modelling development that acknowledges the inevitable uncertainty and associated risks. It goes beyond the "black-box" view that some analysts, modellers, risk experts or statisticians develop of the underlying phenomenology of environmental or industrial processes, without sufficiently valuing their physical properties and inner modelling potential or challenging the practical plausibility of the mathematical hypotheses; conversely, it also aims to attract environmental and engineering modellers to better handle model confidence issues through finer statistical and risk analysis material, taking advantage of advanced scientific computing, in order to face new regulations departing from deterministic design and to support robust decision-making. Modelling Under Risk and Uncertainty:
• Addresses a concern of growing interest for large industries, environmentalists and analysts: robust modelling for decision-making in complex systems.
• Gives new insights into the peculiar mathematical and computational challenges generated by recent industrial safety and environmental control analysis for rare events.
• Implements decision theory choices differentiating or aggregating the dimensions of risk/aleatory and epistemic uncertainty through a consistent multi-disciplinary set of statistical estimation, physical modelling, robust computation and risk analysis.
• Provides an original review of the advanced inverse probabilistic approaches for model identification, calibration and data assimilation, key to digesting fast-growing multi-physical data acquisition.
• Is illustrated with one favourite pedagogical example crossing natural risk, engineering and economics, developed throughout the book to facilitate reading and understanding.
• Supports Master/PhD-level courses as well as advanced tutorials for professional training.
Analysts and researchers in numerical modelling, applied statistics, scientific computing, reliability, advanced engineering, natural risk or environmental science will benefit from this book.
Article
Tehran, one of the important cities of Iran, is at great risk from earthquakes because several active faults are located within or around it. The economic and political significance of this metropolis, its high population and the risks of a possible earthquake have drawn the attention of urban managers to this problem. On this basis, in order to confront the probable risks and reduce the negative effects of this phenomenon, investigating the seismic vulnerability of the city is indispensable and is among the important objectives of Tehran's urban management. With respect to this important issue, region 1 of the Tehran municipality was selected as the study area because of its proximity to the active faults north of Tehran. The study and the analysis of the gathered data were performed using methods based on an information database, the RADIUS, TOPSIS and AHP models, and software based on a Geographical Information System. Variables such as the location of buildings relative to faults, the type of materials, the age of the buildings, the number of floors, population density, soil type, the slope of the region, and the pathway network were used in the research, and the vulnerability of the region was investigated using three probable earthquake scenarios. The results indicated that region 1 of the Tehran municipality is vulnerable to earthquakes.
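Of the models named above, TOPSIS is the most compact to sketch: alternatives are ranked by relative closeness to an ideal solution. The decision matrix, weights and criterion directions below are invented placeholders, not the Tehran data.

```python
import numpy as np

X = np.array([[0.6, 30.0, 2.0],      # rows: zones; columns: criteria, e.g.
              [0.4, 55.0, 3.5],      # soil factor, building age, fault distance
              [0.8, 10.0, 1.0]])
w = np.array([0.5, 0.3, 0.2])        # criterion weights
benefit = np.array([False, False, True])  # True = larger is better (safer)

R = X / np.linalg.norm(X, axis=0)    # vector normalisation
V = R * w
ideal = np.where(benefit, V.max(0), V.min(0))
anti  = np.where(benefit, V.min(0), V.max(0))
d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)  # higher = closer to the ideal (safer zone)
print("ranking (best first):", np.argsort(-closeness))
```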
Article
The expansive infrastructure, along with high population density, makes cities highly vulnerable to the severe impacts of natural hazards. In the context of an explosive increase in the value of the damage caused by natural disasters, evaluating and visualizing the vulnerability of urban areas becomes a necessity for helping practitioners and stakeholders in their decision-making processes. This paper is a piece of exploratory research. The overall aim is to develop a spatial vulnerability approach to address earthquake risk, using a semi-quantitative model. The model uses the analytical framework of a spatial GIS-based multi-criteria analysis. For this approach, we have chosen Bucharest, the capital city of Romania, based on its high vulnerability to earthquakes due to rapid urban growth and the advanced state of decay of its buildings (most of the building stock was built between 1940 and 1977). The spatial result reveals a circular pattern, pinpointing as hot spots the Bucharest historic centre (located on a meadow and river terrace, with aged building stock) and the peripheral areas (isolated from the emergency centres and defined by precarious social and economic conditions). From a sustainable development perspective, the example of Bucharest shows how spatial patterns shape the "vulnerability profile" of the city, on the basis of which decision makers could develop proper prediction and mitigation strategies and enhance the resilience of cities against the risks resulting from the earthquake hazard.
Article
The vulnerability assessment is important for earthquake prevention and mitigation. Since many criteria need to be considered during the evaluation process, it can be modeled as a multiple criteria decision making (MCDM) problem. This paper proposes an approach which integrates the results of different MCDM methods to provide regional earthquake vulnerability assessment. The key idea of this approach is to determine the most trustable MCDM method by calculating the weights of several MCDM methods using the Spearman’s ranking correlation coefficients. The most trustable MCDM method is the one with the highest weight, which indicates that it has the strongest agreements with other MCDM methods, and is used to provide a final assessment using the combination of other MCDM methods. The proposed approach is applied to evaluate the earthquake vulnerability of 31 Chinese regions using six MCDM methods and eleven vulnerability evaluation indices. The results indicate that the proposed approach can integrate the inconsistent evaluation results of different MCDM methods and produce a comprehensive assessment of regional earthquake vulnerability.
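The key idea, weighting each MCDM method by its rank agreement with the others, reduces to a few lines with Spearman's coefficient. The three method rankings below are invented.

```python
import numpy as np
from scipy.stats import spearmanr

rankings = np.array([[1, 2, 3, 4, 5],     # method A's ranking of 5 regions
                     [2, 1, 3, 5, 4],     # method B
                     [1, 3, 2, 4, 5]])    # method C

n = len(rankings)
agreement = np.zeros(n)
for i in range(n):
    # Average Spearman correlation of method i with every other method.
    agreement[i] = np.mean([spearmanr(rankings[i], rankings[j])[0]
                            for j in range(n) if j != i])
weights = agreement / agreement.sum()
print("method weights:", weights.round(3))
print("most trustable method:", "ABC"[int(np.argmax(weights))])
```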
Article
In the current practice of probabilistic seismic hazard analysis (PSHA) using logic trees, it is common to use the mean hazard curve to determine ground motions for engineering design. This paper presents the case against the use of the mean hazard curve and explains why this practice should be discontinued and, where necessary, removed from regulations. It is emphasized that the hazard curve taken as the basis for design should be chosen on the basis of the fractile that reflects the desired degree of confidence that the safety level implied by the selected annual frequency of exceedance is being achieved in light of the uncertainty in the estimation of the hazard.
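The mean-versus-fractile distinction the paper argues about can be made concrete with a toy logic tree: the weighted mean exceedance curve and a weighted fractile curve generally differ, sometimes markedly. The branch curves and weights below are invented.

```python
import numpy as np

pga = np.array([0.1, 0.2, 0.3, 0.4, 0.5])          # ground-motion levels (g)
# Annual exceedance frequencies for three logic-tree branches.
curves = np.array([[1e-2, 3e-3, 1e-3, 4e-4, 2e-4],
                   [2e-2, 8e-3, 3e-3, 1e-3, 5e-4],
                   [5e-3, 1e-3, 3e-4, 1e-4, 4e-5]])
w = np.array([0.3, 0.4, 0.3])

mean_curve = w @ curves                             # the (disputed) mean curve

def fractile(q):
    """Weighted q-fractile across branches, level by level."""
    out = []
    for j in range(curves.shape[1]):
        order = np.argsort(curves[:, j])
        cum = np.cumsum(w[order])
        out.append(curves[order, j][np.searchsorted(cum, q)])
    return np.array(out)

print("mean curve:   ", mean_curve)
print("84th fractile:", fractile(0.84))   # can differ markedly from the mean
```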
Article
Seismic design parameters are inherently imprecise and fuzzy. Formal models of inexact inferences are therefore highly relevant to the interpretation of such inexactly described seismic parameters. Application of such techniques becomes even more relevant if it is envisaged that, in the future, provisions for seismic design may be more effectively represented by encoding them as fuzzy expert systems. The linguistic representation of imprecise variables is presented. More importantly, how an evidence-based scheme, the Dempster-Shafer theory of evidence, can be used to infer conclusions about imprecisely stated problems is demonstrated. A set of seismic parameters, namely the seismic performance category (SPC), the seismic hazard exposure (SHE), and the seismic intensity (SI), are selected from a seismic design code. How the relationship between the three parameters can be “softened” is demonstrated using a linguistic representation; such a representation involves setting up conditional IF... THEN types of rules describing how different states of rule-antecedents, SI and SHE, will lead to different states of a rule-consequent, SPC. A frame of discernment (θ) is then defined which comprises several hypotheses about SPC. Thereafter evidence about SI and SHE is used to infer conclusions about SPC. The conclusions are represented by a combined degree of belief and plausibility in the hypotheses in θ.
Article
A body of evidence in the sense of Shafer can be viewed as an extension of a probability measure, but as a generalized set as well. In this paper we adopt the second point of view and study the algebraic structure of bodies of evidence on a set, based on extended set union, intersection and complementation. Several notions of inclusion are exhibited and compared to each other. Inclusion is used to compare a body of evidence to the product of its projections. Lastly, approximations of a body of evidence in the form of fuzzy sets are derived, in order to squeeze plausibility values between two grades of possibility. Throughout the paper, it is pointed out that a body of evidence can account for conjunctive as well as disjunctive information, i.e. the focal elements can be viewed either as sets of actual values or as restrictions on the (unique) value of a variable.
Book
This is a collection of classic research papers on the Dempster-Shafer theory of belief functions. The book is the authoritative reference in the field of evidential reasoning and an important archival reference in a wide range of areas including uncertainty reasoning in artificial intelligence and decision making in economics, engineering, and management. The book includes a foreword reflecting the development of the theory in the last forty years.
Article
Estimation of urban vulnerability to natural hazards such as earthquakes can be considered an ill-structured problem (i.e. a problem for which there is no unique, identifiable, objectively optimal solution). This paper outlines the development of a spatial decision support system (SDSS) to assist earthquake vulnerability assessment. There are different criteria for characterizing the real world for disaster management. This paper addresses a suitable method for weighting the related factors that affect the decision-making process and for managing the uncertainties of those factors, which are inherited from reality. These two problems, especially the latter, introduce biases into decisions. The proposed solution is uncertainty absorption, evaluation and documentation at each step, especially during real-world characterization and hypothesis derivation. This research applies such treatment to weighting and normalizing the factors contributing to the vulnerability of cities' populations to earthquakes, using the analytic hierarchy process (AHP) method. The results show that this method provides a high degree of analytical capability and can serve as the basis for further research through the introduction of other effective factors, such as social and economic conditions, in earthquake vulnerability.
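The AHP weighting step described here can be sketched as follows; the pairwise-comparison values are illustrative, not the paper's judgements. Factor weights are taken from the principal eigenvector of the comparison matrix, with Saaty's consistency ratio as a sanity check.

```python
# Illustrative sketch of AHP factor weighting. All values are invented.
import numpy as np

# Pairwise comparisons of three hypothetical factors on Saaty's 1-9 scale.
A = np.array([
    [1.0,       3.0,       5.0],
    [1.0 / 3.0, 1.0,       2.0],
    [1.0 / 5.0, 1.0 / 2.0, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                 # normalized factor weights

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)     # consistency index
cr = ci / 0.58                           # random index RI = 0.58 for n = 3
assert cr < 0.10, "pairwise judgements too inconsistent to use"
```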
Article
We are concerned here with the problem of selecting an optimal alternative in situations in which there exists some uncertainty in our knowledge of the state of the world. We show how the Dempster-Shafer belief structure provides a unifying framework for representing various types of uncertainty. We also show how the OWA aggregation operators provide a unifying framework for decision making under ignorance. In particular, we see how these operators provide a formulation of a type of epistemic probability associated with our degree of optimism.
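A minimal sketch of OWA aggregation under ignorance, assuming an invented payoff vector: different weight vectors recover the classical maximax, maximin, Hurwicz-style, and Laplace decision attitudes.

```python
# Illustrative sketch of ordered weighted averaging (OWA). Payoffs invented.

def owa(values, weights):
    """Ordered weighted average: weights apply to the sorted values."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(w * v for w, v in zip(weights, sorted(values, reverse=True)))

# Payoffs of one alternative under three unknown states of the world.
payoffs = [40.0, 70.0, 10.0]

maximax   = owa(payoffs, [1.0, 0.0, 0.0])    # pure optimism: best outcome
maximin   = owa(payoffs, [0.0, 0.0, 1.0])    # pure pessimism: worst outcome
hurwicz   = owa(payoffs, [0.6, 0.0, 0.4])    # Hurwicz-style mix, alpha = 0.6
laplace   = owa(payoffs, [1/3, 1/3, 1/3])    # complete ignorance: plain mean
```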
Article
Earthquakes have a greater effect on society than most people think. These effects range from structural damage to economic impacts and fatalities. An earthquake lasts only a few seconds and its aftershocks may continue for days, but the damage continues for years. Residential site safety and earthquake damage assessment studies play a crucial role in developing reliable rehabilitation and development programs, improving preparedness and mitigating losses in urbanized areas. The extremely densely populated metropolis of Tehran, whose population totals 7,768,561 across 22 districts (according to the 2006 population census), coupled with the fragility of its houses and infrastructure, highlights the necessity of a reliable earthquake damage assessment based on essential datasets such as building resistance attributes, building population, soil structure, the street network and hazardous facilities. This paper presents a GIS-based model for earthquake loss estimation for a district in Tehran, Iran. Damage to buildings was calculated only for the ground shaking effect of one of the region's most active faults, the Mosha Fault, in a likely earthquake scenario. Earthquake intensity at each building location was estimated from an attenuation relation, and the damage ratio was obtained from customized fragility curves. Human casualties and street blockages caused by collapsed buildings were also taken into account in this study. Finally, accessibility verification identified buildings without clear passage to temporary settlements via open streets. The model was validated using damage data from the 2003 Bam earthquake. The proposed model enables decision-makers to make more reliable decisions based on various spatial datasets before and after an earthquake occurs. The results of the earthquake application showed total losses as follows: structural damage reaching 64% of the building stock, a death rate of 33% of all residents, a severe injury rate reaching 27% of the population, and street closures upwards of 22% due to building collapse.
Highlights:
- Verification of previous works on earthquake disaster management and mitigation.
- Verification of the geologic and seismologic features of the study area.
- Development of a method for evaluating damage to buildings and streets in the study area.
- Estimation of the human casualties of each individual building.
- Identification of buildings without access to temporary settlements via undamaged streets after an earthquake.
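The per-building damage chain described in this abstract can be sketched roughly as below. Both the attenuation relation and the fragility curve here are hypothetical placeholder forms with made-up coefficients, not the relations used in the paper.

```python
# Illustrative sketch: fault distance -> intensity -> damage ratio.
# Functional forms and all coefficients are invented placeholders.
import math

def intensity(magnitude, distance_km, a=1.5, b=1.0, c=2.0):
    """Toy attenuation relation: intensity decays with the log of distance."""
    return a * magnitude - b * math.log10(distance_km + c)

def damage_ratio(i, median=8.0, beta=0.6):
    """Toy lognormal-style fragility curve returning a 0-1 damage ratio."""
    z = (math.log(i) - math.log(median)) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

buildings = [("B1", 4.0), ("B2", 12.0), ("B3", 30.0)]  # (id, km to fault)
scenario_mw = 7.2
damage = {bid: damage_ratio(intensity(scenario_mw, d)) for bid, d in buildings}
```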
Article
With the help of the proposed GIS-based RADIUS (GBR) guideline, this study focused on seismic damage estimation for the City of Kelowna, a city in the interior of British Columbia, Canada. Ground shaking intensity in the area was developed using the seismic source zones defined by the Geological Survey of Canada together with the opinions of local experts. Building inventories were compiled by aggregating data from municipal databases as well as sidewalk surveys and surveys through Google Maps. Geographic Information Systems (GIS) provided a basis for effective decision making and for gauging vulnerable areas and potential adverse scenarios. Estimated damage and damage distributions were mapped on a block-by-block (5 km × 5 km) basis. The assessment revealed that an Mw 8.5 earthquake scenario in the Cascadia zone could damage around 58 buildings within the city, causing 12 injuries. The study also presented damage assessments for lifelines, e.g. the road and water pipeline networks. The results further revealed that the Kelowna downtown area is expected to suffer the highest amount of damage, which in turn may produce the greatest economic loss given its concentration of concrete high-rise buildings and clustered economic activity. Therefore, a more detailed and precise seismic damage estimation (on 2 km × 2 km grids) was conducted for the downtown area to provide guidelines for emergency response. The proposed GIS-based RADIUS (GBR) framework provides a useful tool for quickly assessing the expected damage in response to a major seismic event, and it can be updated easily during a disaster. Permalink: http://dx.doi.org/10.1061/(ASCE)NH.1527-6996.0000082
Article
The representation of the multiplication operation on fuzzy numbers is very useful and important in fuzzy systems such as fuzzy decision making. In this paper, we propose a new arithmetical principle and a new arithmetical method for arithmetical operations on fuzzy numbers. The new arithmetical principle is the L⁻¹-R⁻¹ inverse function arithmetic principle. Based on this principle, it is easy to interpret the multiplication operation through the membership functions of the fuzzy numbers. The new arithmetical method is the graded multiple integrals representation method, with which it is easy to compute the canonical representation of the multiplication of fuzzy numbers. Finally, the canonical representation is applied to a numerical example of fuzzy decision making.
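The inverse-function view can be illustrated with alpha-cuts, which are exactly the intervals [L⁻¹(α), R⁻¹(α)]. The sketch below (triangular fuzzy numbers for simplicity; names invented) multiplies two fuzzy numbers cut by cut.

```python
# Illustrative sketch: fuzzy-number multiplication via alpha-cuts, i.e. the
# inverse membership functions L^-1 and R^-1. Triangular numbers for brevity.

def tri_alpha_cut(a, b, c, alpha):
    """Alpha-cut [L^-1(alpha), R^-1(alpha)] of triangular number (a, b, c)."""
    return (a + alpha * (b - a), c - alpha * (c - b))

def multiply_alpha_cuts(x, y, levels=5):
    """Interval-multiply the alpha-cuts of two triangular fuzzy numbers."""
    cuts = []
    for i in range(levels + 1):
        alpha = i / levels
        (xl, xr) = tri_alpha_cut(*x, alpha)
        (yl, yr) = tri_alpha_cut(*y, alpha)
        products = [xl * yl, xl * yr, xr * yl, xr * yr]
        cuts.append((alpha, min(products), max(products)))
    return cuts

# (2, 3, 4) * (1, 2, 3): the alpha = 1 cut collapses to the product 3 * 2 = 6.
cuts = multiply_alpha_cuts((2.0, 3.0, 4.0), (1.0, 2.0, 3.0))
```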
Article
Within the framework of evidence theory, data fusion consists in obtaining a single belief function by the combination of several belief functions resulting from distinct information sources. The most popular rule of combination, called Dempster's rule of combination (or the orthogonal sum), has several interesting mathematical properties such as commutativity or associativity. However, combining belief functions with this operator implies normalizing the results by scaling them proportionally to the conflicting mass in order to keep some basic properties. Although this normalization seems logical, several authors have criticized it and some have proposed other solutions. In particular, Dempster's combination operator is a poor solution for the management of the conflict between the various information sources at the normalization step. Conflict management is a major problem especially during the fusion of many information sources. Indeed, the conflict increases with the number of information sources. That is why a strategy for re-assigning the conflicting mass is essential. In this paper, we define a formalism to describe a family of combination operators. So, we propose to develop a generic framework in order to unify several classical rules of combination. We also propose other combination rules allowing an arbitrary or adapted assignment of the conflicting mass to subsets.
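Two members of this family of combination operators can be contrasted in a few lines. The sketch below (illustrative masses on an invented three-element frame) compares Dempster's proportional normalization with a Yager-style rule that transfers all conflicting mass to the full frame.

```python
# Illustrative sketch: two strategies for re-assigning conflicting mass.
# Frame and masses are invented.
from itertools import product

FRAME = frozenset({"a", "b", "c"})

def conjunctive(m1, m2):
    """Unnormalized conjunctive combination plus the total conflict k."""
    combined, k = {}, 0.0
    for (s1, v1), (s2, v2) in product(m1.items(), m2.items()):
        inter = s1 & s2
        if inter:
            combined[inter] = combined.get(inter, 0.0) + v1 * v2
        else:
            k += v1 * v2
    return combined, k

def dempster(m1, m2):
    """Dempster's rule: scale the non-conflicting mass by 1 / (1 - k)."""
    combined, k = conjunctive(m1, m2)
    return {s: v / (1.0 - k) for s, v in combined.items()}

def yager(m1, m2):
    """Yager-style rule: give all conflicting mass to total ignorance."""
    combined, k = conjunctive(m1, m2)
    combined[FRAME] = combined.get(FRAME, 0.0) + k
    return combined

m1 = {frozenset({"a"}): 0.9, FRAME: 0.1}
m2 = {frozenset({"b"}): 0.9, FRAME: 0.1}
# Here k = 0.81: Dempster splits nearly all mass between the two contested
# singletons, while the Yager-style rule keeps the conflict visible as
# mass on the whole frame.
```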
Article
A fuzzy set is a class of objects with a continuum of grades of membership. Such a set is characterized by a membership (characteristic) function which assigns to each object a grade of membership ranging between zero and one. The notions of inclusion, union, intersection, complement, relation, convexity, etc., are extended to such sets, and various properties of these notions in the context of fuzzy sets are established. In particular, a separation theorem for convex fuzzy sets is proved without requiring that the fuzzy sets be disjoint.
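The extended operations mentioned in this abstract have the standard pointwise definitions on membership grades, sketched below with invented "tall"/"short" membership functions over heights in centimetres.

```python
# Illustrative sketch of the standard max/min/complement fuzzy operations.

def fuzzy_union(mu_a, mu_b):
    return lambda x: max(mu_a(x), mu_b(x))

def fuzzy_intersection(mu_a, mu_b):
    return lambda x: min(mu_a(x), mu_b(x))

def fuzzy_complement(mu_a):
    return lambda x: 1.0 - mu_a(x)

# Invented membership functions for "tall" and "short" (heights in cm).
tall  = lambda h: max(0.0, min(1.0, (h - 160.0) / 30.0))
short = lambda h: max(0.0, min(1.0, (175.0 - h) / 30.0))

medium = fuzzy_intersection(fuzzy_complement(tall), fuzzy_complement(short))
grade = fuzzy_union(tall, short)(170.0)   # max(1/3, 1/6) = 1/3
```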
Article
During the past two years, the Dempster-Shafer theory of evidence has attracted considerable attention within the AI community as a promising method of dealing with uncertainty in expert systems. As presented in the literature, the theory is hard to master. In a simple approach that is outlined in this paper, the Dempster-Shafer theory is viewed in the context of relational databases as the application of familiar retrieval techniques to second-order relations, that is, relations in which the data entries are relations in first normal form. The relational viewpoint clarifies some of the controversial issues in the Dempster-Shafer theory and facilitates its use in AI-oriented applications.
Article
One of the fundamental tenets of modern science is that a phenomenon cannot be claimed to be well understood until it can be characterized in quantitative terms. Viewed in this perspective, much of what constitutes the core of scientific knowledge may be regarded as a reservoir of concepts and techniques which can be drawn upon to construct mathematical models of various types of systems and thereby yield quantitative information concerning their behavior.
Article
Building hazard assessment prior to earthquake occurrence poses interesting problems, especially in earthquake-prone areas. Such an assessment provides an early warning system for building owners as well as local and central administrators about the possible hazards that may occur in the next scenario earthquake event, so that pre- and post-earthquake preparedness can be arranged according to a systematic program. This requires efficient models for predicting the hazard scale of each building within the study area. Although there are subjective intensity index methods for such evaluations, the objective of this paper is to propose a useful tool, through fuzzy logic (FL), to classify the buildings that would be vulnerable to earthquake hazard. FL is a soft-computing intelligent reasoning methodology which is rapid, simple and easily applicable, with logical and rational associations between the building-hazard categories and the most effective factors. Among the most important factors considered in this paper are the story number (building height), story height ratio, cantilever extension ratio, moment of inertia (stiffness), number of frames, and column and shear wall area percentages. Their relationships with the five hazard categories are presented through a supervised hazard center classification method. These five categories are the “none”, “slight”, “moderate”, “extensive”, and “complete” hazard classes. A new supervised FL classification methodology, similar to the classical fuzzy c-means procedure, is proposed for allocating hazard categories to individual buildings. The application of the methodology is presented for the Zeytinburnu quarter of Istanbul, Turkey. It is observed that of the 747 inventoried buildings, 7.6%, 50.0%, 14.6%, 20.1%, and 7.7% fall into the “none”, “slight”, “moderate”, “extensive”, and “complete” hazard classes, respectively, for the expected earthquake.
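The supervised allocation step can be sketched with the standard fuzzy c-means membership formula applied to fixed class centers; the features, centers, and fuzzifier below are illustrative placeholders, not the paper's calibrated values.

```python
# Illustrative sketch: fuzzy memberships of a building in fixed hazard-class
# centers, using the standard fuzzy c-means membership formula.
import numpy as np

CLASSES = ["none", "slight", "moderate", "extensive", "complete"]
# Hypothetical class centers in a 2-feature space
# (e.g. normalized story number, cantilever extension ratio).
centers = np.array([[0.1, 0.1], [0.3, 0.2], [0.5, 0.4], [0.7, 0.6], [0.9, 0.9]])

def memberships(x, centers, m=2.0):
    """Fuzzy c-means membership of point x in each fixed class center."""
    d = np.linalg.norm(centers - x, axis=1)
    if np.any(d == 0):                      # exactly on a center
        u = (d == 0).astype(float)
        return u / u.sum()
    inv = d ** (-2.0 / (m - 1.0))
    return inv / inv.sum()

building = np.array([0.62, 0.55])
u = memberships(building, centers)
hazard_class = CLASSES[int(np.argmax(u))]   # hard label, if one is needed
```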
Article
Supplier selection is a multi-criterion decision making problem under uncertain environments. Hence, it is reasonable to handle the problem with fuzzy sets theory (FST) and Dempster-Shafer theory of evidence (DST). In this paper, a new MCDM methodology using FST and DST, based on the main idea of the technique for order preference by similarity to an ideal solution (TOPSIS), is developed to deal with the supplier selection problem. The basic probability assignments (BPAs) can be determined from the distance to the ideal solution and the distance to the negative ideal solution. The Dempster combination rule is used to combine all the criterion data to obtain the final scores of the alternatives, and the final decision results are drawn through the pignistic probability transformation. In the traditional fuzzy TOPSIS method, quantitative criterion performance, such as crisp numbers, must be transformed into fuzzy numbers. The proposed method is more flexible because the BPAs can be determined without this transformation step: the performance of a criterion can be represented as a crisp number or a fuzzy number according to the real situation. A numerical example of supplier selection is used to illustrate the efficiency of the proposed method.
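A rough sketch of this pipeline for a single alternative, with invented distances and a hypothetical discounting factor: per-criterion closeness to the ideal solution is converted into a BPA over {good, bad}, the BPAs are fused with Dempster's rule, and a pignistic transform yields the final score.

```python
# Illustrative sketch: TOPSIS-style distances -> BPAs -> Dempster fusion ->
# pignistic score. All numbers and the discount factor are invented.
from itertools import product

GOOD, BAD = frozenset({"good"}), frozenset({"bad"})
THETA = GOOD | BAD

def bpa_from_distances(d_pos, d_neg, discount=0.8):
    """Closeness to the ideal supports 'good'; the rest goes to ignorance."""
    c = d_neg / (d_pos + d_neg)             # TOPSIS closeness coefficient
    return {GOOD: discount * c, BAD: discount * (1 - c), THETA: 1 - discount}

def dempster(m1, m2):
    """Dempster's rule of combination with proportional normalization."""
    combined, k = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        if a & b:
            combined[a & b] = combined.get(a & b, 0.0) + x * y
        else:
            k += x * y
    return {s: v / (1.0 - k) for s, v in combined.items()}

def pignistic(m):
    """Spread each mass uniformly over the singletons of its focal set."""
    p = {}
    for s, v in m.items():
        for e in s:
            p[e] = p.get(e, 0.0) + v / len(s)
    return p

# One supplier, two criteria: (distance to ideal, distance to negative ideal).
criteria = [(0.2, 0.8), (0.5, 0.5)]
m = bpa_from_distances(*criteria[0])
for d in criteria[1:]:
    m = dempster(m, bpa_from_distances(*d))
score = pignistic(m)["good"]
```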
Article
This paper outlines a new software system we have developed that utilises the newly developed DS/AHP method, which combines aspects of the Analytic Hierarchy Process (AHP) with Dempster-Shafer theory for the purpose of multi-criteria decision making (MCDM). The method allows a decision maker a considerably greater level of control (compared with conventional AHP methods) over the judgements made in identifying levels of favouritism towards groups of decision alternatives. More specifically, the DS/AHP analysis allows for additional analysis, including, for example, the levels of uncertainty and conflict in the decisions made. In this paper an expert system is introduced which enables the application of DS/AHP to MCDM. The expert system further illustrates the usability of DS/AHP, including new aspects of analysis and representation offered by the method. The principal application used to illustrate this expert system is that of identifying those residential properties to visit (view) from among those advertised for sale through a real estate brokerage firm.