During emergency situations (e.g. pollution peaks, nuclear/radiological accidents, flash floods) it is often helpful for decision makers to have maps of the situation. Production of these maps has to be based on automatic procedures, as there will be limited time to analyse the problem. These methods have to be both quick and robust. Although the solutions to such problems tend to be complex, it is easy to forget that simpler methods can be useful. This paper examines some simple geostatistical solutions to two complex mapping problems, showing that these methods can be useful as part of an automatic mapping procedure, for identifying the most important issues of the method, or as benchmarks for more complex solutions. The first problem is the emergency data set from the SIC2004 exercise, whereas the second is related to estimating runoff in flood situations. Two methods are examined: a linear variogram, and the use of the same variogram in extreme situations as in routine situations. The results from the SIC2004 data indicate that anisotropy is the most important factor in this case. Both linear variograms and variograms from routine situations give reasonably good results compared to the more sophisticated methods submitted to the exercise. The results from the Austrian runoff data set also indicate that variograms from routine situations can be applied in extreme situations with reasonably good results. The use of a routing model did not improve the results, indicating that flow velocities in flood situations need to be assessed locally.
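The interpolation step behind such automatic mapping can be sketched compactly. Below is a minimal ordinary-kriging interpolator with a linear variogram γ(h) = b·h, written only to illustrate the kind of method the abstract names; the station coordinates, measured values and variogram slope are hypothetical:

```python
import math

def linear_variogram(h, slope=1.0):
    # gamma(h) = b*h; with no nugget the slope cancels out of the weights
    return slope * h

def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col and M[r][col] != 0.0:
                f = M[r][col] / M[col][col]
                for c in range(col, n + 1):
                    M[r][c] -= f * M[col][c]
    return [M[i][n] / M[i][i] for i in range(n)]

def ordinary_kriging(points, values, target, slope=1.0):
    """Estimate the field at `target` from scattered (x, y) observations."""
    n = len(points)
    dist = lambda p, q: math.hypot(p[0] - q[0], p[1] - q[1])
    # ordinary kriging system: variogram matrix bordered by the unbiasedness row
    A = [[linear_variogram(dist(points[i], points[j]), slope) for j in range(n)] + [1.0]
         for i in range(n)]
    A.append([1.0] * n + [0.0])
    b = [linear_variogram(dist(target, p), slope) for p in points] + [1.0]
    weights = solve(A, b)[:n]
    return sum(w * z for w, z in zip(weights, values))

obs = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]   # hypothetical station coordinates
vals = [1.0, 3.0, 2.0]                        # hypothetical measurements
```

Because a linear variogram has no nugget, kriging honours the data exactly, and the slope cancels out of the weights, which is one reason it suits automatic procedures where variogram fitting time is scarce.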
Reliability-based analysis of cantilever retaining walls requires consideration of different failure mechanisms. In this paper, the reliability of the soil–wall system is assessed considering two failure modes, rotational and structural stability, and the system is treated as a series system. The methodology is based on Monte Carlo Simulation (MCS) and accounts for the variability of the design parameters in the limit equilibrium analysis of a wall embedded in granular soil. Results of the MCS indicate that the reliability of the failure components increases exponentially with increasing variability of the design parameters. The system reliability results also show how the system reliability differs from the component reliabilities: the strength of the weakest component governs the reliability of the system. The system reliability index increases gradually with the wall section, whereas it remains constant for the rotational failure mode.
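The series-system logic described above can be sketched in a few lines of Monte Carlo Simulation. The limit state functions, parameter distributions and wall dimensions below are illustrative assumptions, not the paper's model; the sketch only shows how MCS estimates component and series-system failure probabilities on the same samples:

```python
import math
import random

random.seed(0)

def simulate(n=100_000):
    """MCS of a two-mode series system for a cantilever wall (illustrative)."""
    fail_rot = fail_str = fail_sys = 0
    for _ in range(n):
        # hypothetical random design parameters (all values illustrative)
        phi = random.gauss(32.0, 3.0)        # friction angle [deg]
        gamma_s = random.gauss(18.0, 1.0)    # backfill unit weight [kN/m3]
        ka = (1 - math.sin(math.radians(phi))) / (1 + math.sin(math.radians(phi)))
        pa = 0.5 * ka * gamma_s * 5.0 ** 2   # active thrust on a 5 m wall [kN/m]
        g_rot = 250.0 - pa * (5.0 / 3.0)       # rotational: resisting vs overturning moment
        g_str = 180.0 - 1.2 * pa * (5.0 / 3.0) # structural: section capacity vs demand
        r, s = g_rot < 0.0, g_str < 0.0
        fail_rot += r
        fail_str += s
        fail_sys += (r or s)   # series system: failure of any mode fails the wall
    return fail_rot / n, fail_str / n, fail_sys / n

pf_rot, pf_str, pf_sys = simulate()
```

Evaluating all modes on the same realisations keeps the correlation between modes, so the simulated system probability automatically lies within the classical series bounds max(pf_i) ≤ pf_sys ≤ Σ pf_i.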
Risk management for landslides involves, in addition to active and passive countermeasures, the collection of information through exploration. With such information, it is possible to reduce uncertainties, make more reliable decisions and therefore reduce risk. This paper addresses two types of exploration, namely (1) exploration (information collection) after the decision to obtain additional information is made, which is the standard type of exploration, and where one uses the new information to update risk, and (2) exploration (information collection) before the decision to obtain additional information is made, in which one conducts ‘virtual exploration’ to establish if exploration is worthwhile. The paper shows that both types of exploration can be assessed using either decision trees or Bayesian networks. Both approaches were applied to the Walton's Wood Landslide in England using infinite slope analysis and produced consistent results. The interested user can, based on what is documented in the paper, select and apply either one or both of the approaches.
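The 'virtual exploration' idea, deciding whether exploration is worthwhile before collecting any data, can be illustrated with a preposterior (value-of-information) calculation over a small decision tree. All probabilities and costs below are invented for illustration and are not taken from the Walton's Wood case:

```python
# All probabilities and costs below are hypothetical, for illustration only.
p_unstable = 0.30            # prior probability that the slope is unstable
cost_stabilise = 1.0         # normalised cost of the countermeasure
loss_failure = 10.0          # normalised consequence of slope failure

def expected_cost(p_fail):
    # decision node: stabilise, or do nothing and carry the expected loss
    return min(cost_stabilise, p_fail * loss_failure)

# 'virtual exploration': an investigation with assumed detection characteristics
p_pos_given_unstable = 0.9   # sensitivity of the investigation (assumed)
p_pos_given_stable = 0.2     # false-alarm rate (assumed)

# preposterior analysis: average the best decision over possible outcomes
p_pos = (p_pos_given_unstable * p_unstable
         + p_pos_given_stable * (1 - p_unstable))
p_unstable_pos = p_pos_given_unstable * p_unstable / p_pos
p_unstable_neg = (1 - p_pos_given_unstable) * p_unstable / (1 - p_pos)

preposterior_cost = (p_pos * expected_cost(p_unstable_pos)
                     + (1 - p_pos) * expected_cost(p_unstable_neg))
value_of_exploration = expected_cost(p_unstable) - preposterior_cost
```

Exploration is worthwhile when `value_of_exploration` exceeds its cost; by construction it can never exceed the value of perfect information.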
Technologies for site characterization have developed at a faster pace than the decision-making methods required to assimilate the inferences they generate. In the case of geophysical surveying, this lag adds to the dependency on expert judgment in the interpretation of geophysical mappings. A systematic assimilation of this type of geo-surveying evidence is required, in particular for the integration of spatial geomorphological information (e.g., stratigraphy) characterized from different geophysical methods. This paper presents a methodology to address this challenge through a probabilistic approach. A set of synthetic geophysical mappings is used to illustrate the applicability of the proposed methodology and its potential extrapolation to other scientific imaging disciplines.
Flood risk of levee-protected areas stems from the possibility of levee failure. In the basic product formula for the quantification of risk, the reliability of a levee stretch or system of levees, which depends on all failure-relevant conditions and loadings, must be expressed numerically. Some of the properties believed to affect a levee's probability of failure significantly (e.g. biological factors) cannot be measured on a metric scale, but can easily be expressed qualitatively. As logistic regression models can make use of quantitative as well as qualitative parameters, the application of such multivariate statistical models to the reliability analysis of river levees is presented and discussed in this paper. Focusing on the practical issues for this specific purpose, the basic approach is explained only briefly at the beginning. The requirements, advantages and disadvantages of the approach are then presented in more detail and illustrated by example applications. Considering the rapid developments in remote sensing and levee monitoring in recent years, an increasing application potential of this approach is expected in the future.
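As a sketch of the modelling idea, a logistic regression mixing a quantitative predictor (a normalised hydraulic load) with a qualitative one (presence of animal burrows) can be fitted by plain batch gradient descent. The data-generating model, its coefficients and the predictor names are invented for illustration, not taken from the paper:

```python
import math
import random

random.seed(1)

# synthetic levee records: (normalised hydraulic load, burrows 0/1) -> failed 0/1
def make_data(n=400):
    data = []
    for _ in range(n):
        load = random.uniform(0.0, 1.0)
        burrows = random.randint(0, 1)
        logit = -4.0 + 5.0 * load + 2.0 * burrows   # assumed 'true' model
        p = 1.0 / (1.0 + math.exp(-logit))
        data.append((load, burrows, 1 if random.random() < p else 0))
    return data

def fit_logistic(data, lr=0.5, epochs=1500):
    """Batch gradient descent on the logistic log-likelihood."""
    b0 = b_load = b_burrows = 0.0
    n = len(data)
    for _ in range(epochs):
        g0 = g1 = g2 = 0.0
        for load, burrows, y in data:
            p = 1.0 / (1.0 + math.exp(-(b0 + b_load * load + b_burrows * burrows)))
            err = p - y                 # gradient of the negative log-likelihood
            g0 += err
            g1 += err * load
            g2 += err * burrows
        b0 -= lr * g0 / n
        b_load -= lr * g1 / n
        b_burrows -= lr * g2 / n
    return b0, b_load, b_burrows

b0, b_load, b_burrows = fit_logistic(make_data())
```

The fitted coefficients recover the signs of the generating model: both the metric load and the qualitative burrow indicator raise the predicted failure probability, which is exactly the mixed-scale capability the abstract highlights.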
Natural hazards pose an increasing threat to society and, for this reason, it is necessary to develop models and methodologies for better understanding and forecasting extreme weather events. A new structure of the Greek Regional Administration (Kallikratis) was established in 2011, based on geographical criteria, in order to create an operational and capable administration. An Atmospheric Hazards Early Warning System (AHEWS) could be characterised as an ultimate tool for the local authorities (first and second tier level) to organise and implement efficient plans to mitigate the risk. New operation centres (at regional and municipality administration level) are suggested to be staffed and equipped with the proposed Early Warning System (EWS). The AHEWS will link to extensive Geographical Information Systems (GIS) datasets and methodologies for safety plans by government agencies and services in order to mitigate the impacts caused by extreme atmospheric events. AHEWS involves high-resolution Numerical Weather Prediction (NWP) products, a ground observation network, a lightning detection network and satellite information for early convective initiation and nowcasting. Storms, lightning, gale winds, snow, hail, tornadoes, low temperatures, heatwaves and several other extreme events are weather phenomena that AHEWS deals with in order to prevent and mitigate impacts on people and structures. An automated dissemination procedure is described here for individual and administrative users, followed by safety and action plans, respectively.
A levee system may pose enormous risks to the people it protects. Risk analysis is at the heart of levee risk mitigation and engineering decision-making, and an explicit methodology for analysing flood risk due to levee breaching is desirable. In this paper, a case study on the risks of the North Pearl River Levee System (NPRLS) in Guangdong Province, China, is conducted to illustrate an explicit procedure of flood risk analysis. The performance of the levee under a 100-year flood at milestone 7+330 near Shijiao is evaluated. The failure probabilities are evaluated for three failure modes: overtopping, piping and slope sliding. The flood scenario resulting from a levee breach in a 100-year flood event at a water level of 15.5 m above mean sea level is simulated. The fatality rate is estimated using HURAM, a Bayesian-network-based model for estimating the vulnerability of human lives in floods. The conditional loss of life is finally estimated according to the risk analysis procedure and the fatality rates from HURAM. Possible measures to mitigate the risks of the levee are also discussed.
The fireproofing of buildings is a critical issue for the field of disaster prevention planning. In the Tokyo Metropolitan Area, the amelioration of fireproofing is of particular concern, since the risk of widespread fires following a devastating earthquake is extremely high. In this study, we construct a stochastic model to describe the conversion process of existing structures. Using this model, we attempt to trace out a time series of changes to structures in a given urban area and, in doing so, hypothetically eliminate the most hazardous areas from a disaster prevention viewpoint. In the district under consideration, we simulate time-series changes in structure by way of a rigorous application of extant urban planning and building codes. We then assess the piecemeal efficiency of such regulations and their overall effectiveness in ameliorating fire risk in potentially hazardous zones.
The piping mechanism is a dangerous failure mechanism for dikes and other flood protection structures around the world. A dike fails due to piping when a head difference drives a water flow below the dike that, if strong enough, causes the soil particles to erode. The progressing erosion ultimately results in the collapse of the dike. The grain size d70 (determining piping resistance) is highly variable in space. The resistance of a dike stretch with respect to piping is not determined by the smallest value of d70 in the sand layer that is susceptible to piping, but by the largest d70 in the erosion path (d70,mech). The ultimate goal of this paper is to find the relation between the (measured) point variability of d70, which follows from collected samples, and the 'mechanism variability' d70,mech, which follows from an analysis of erosion paths. Based on this relation, a new representative design value of d70 can be derived. The spatial variability in d70 is modelled with random field simulations. A model is developed that captures the formation of an erosion path in a generated d70 random field, using an erosion criterion which combines the representative grain size d70 and the groundwater flow velocity near the tip of the backward-developing erosion path. The model captures the search for the weakest path under the influence of flow. By applying the model to multiple random field realisations and to various assumed scales of fluctuation of d70, the distribution of d70,mech is obtained for each scale of fluctuation. The main outcome of the new model is that the smaller the scale of fluctuation, the larger the characteristic value of d70,mech, whereas smaller characteristic values may be found for larger scales of fluctuation. This has some important implications for the design of dikes with respect to piping. If a detailed analysis shows a small scale of fluctuation of d70, a larger representative design value of d70 may be used, based on the distribution of d70,mech. Hence, this may result in significantly smaller dike (or berm) designs and lower dike construction costs.
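The headline trend, that a smaller scale of fluctuation yields a larger characteristic value of the mechanism variable, can be reproduced qualitatively with a much simpler proxy than the authors' erosion-path model: average ln(d70) over a fixed path length in a 1D Markovian random field and take the 5% quantile across realisations. All numbers (mean d70, coefficient of variation, path length) are assumed:

```python
import math
import random

random.seed(42)

def ar1_field(n, dx, theta, mean, sd):
    """1D Gaussian random field with Markovian (exponential) correlation."""
    rho = math.exp(-2.0 * dx / theta)   # correlation between neighbouring points
    x = [random.gauss(0.0, 1.0)]
    for _ in range(n - 1):
        x.append(rho * x[-1] + math.sqrt(1.0 - rho * rho) * random.gauss(0.0, 1.0))
    return [mean + sd * xi for xi in x]

def characteristic_d70(theta, n_real=2000, n=50, dx=1.0):
    """5% characteristic value of a proxy mechanism variable (path average of ln d70)."""
    mech = []
    for _ in range(n_real):
        ln_d70 = ar1_field(n, dx, theta, mean=math.log(0.20), sd=0.3)  # d70 ~ 0.20 mm, assumed
        mech.append(sum(ln_d70) / n)    # proxy: averaging along a 50 m erosion path
    mech.sort()
    return math.exp(mech[int(0.05 * n_real)])

char_small_sof = characteristic_d70(theta=2.0)    # small scale of fluctuation
char_large_sof = characteristic_d70(theta=40.0)   # large scale of fluctuation
```

With a small scale of fluctuation, variance reduction along the path narrows the distribution of the governing value, pushing the 5% characteristic value up; a large scale of fluctuation leaves the path nearly fully correlated and the characteristic value close to the point statistic. The proxy omits the weakest-path search itself, but shows the same averaging mechanism behind the paper's design implication.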
This article examines the capability of the Minimax Probability Machine (MPM) for determining slope stability. MPM is constructed within a probabilistic framework, and this study uses it both as a classification and as a regression tool. Unit weight (γ), cohesion (c), angle of internal friction (φ), slope angle (β), height (H) and pore water pressure coefficient (ru) have been used as inputs of the MPM model. The outputs of MPM are the stability status of the slope and the factor of safety (F). The results of MPM have been compared with artificial neural network models. The experimental results demonstrate that the developed MPM is a promising tool for determining slope stability.
Assessment and inventory of landslide susceptibility are essential for the formulation of successful disaster mitigation plans. The objective of this study was to assess landslide susceptibility in relation to geo-diversity and its hydrological response in the Lesser Himalaya with a case study using Geographic Information System (GIS) technology. The Dabka watershed, which constitutes a part of the Kosi Basin in the Lesser Himalaya, India, in the district of Nainital, has been selected for the case illustration. The study constitutes three GIS modules: geo-diversity informatics, hydro informatics and landslide informatics. Through the integration and superimposition of spatial data and attribute data of all three GIS modules, a Landslide Susceptibility Index (LSI) has been prepared to identify the level of susceptibility for landslide hazards. The study, carried out over a period of five years (2007–2011), found that areas of most stressed geo-diversity (comprising very steep slopes above 30°, geology of Lower Krol and Lariakanta formation, geomorphology of moist areas and debris sites, land use of barren land with a very high drainage frequency and spring density) have a high landslide susceptibility because of a high rate of average runoff (33 l/s/km2), flood magnitude (307.28 l/s/km2), erosion (398 tons/km2) and landslide density (5–10 landslides/km2). The areas of least stressed geo-diversity (comprising gentle slopes below 10°, geology of Kailakhan and Siwalik formation, geomorphology of depositional terraces, land use of dense forest with low drainage frequency and spring density) have the lowest landslide susceptibility because of the low rate of average runoff (6.27 l/s/km2), flood magnitude (20.49 l/s/km2), erosion (65.80 tons/km2) and landslide density (1–2 landslides/km2).
The main goal of this study was to assess the prediction reliability, the quantitative differences and the spatial variations of the Morgan–Morgan–Finney (MMF) and the Universal Soil Loss Equation (USLE) erosion prediction models along the 442-km-long and 44-m-wide Right-of-Way of the Baku–Tbilisi–Ceyhan oil and South Caucasus gas pipelines. USLE performed better than the MMF erosion model, accurately predicting 61% of erosion occurrences. A paired-samples t-test with a p-value less than 0.05 and a bivariate correlation with a Pearson's correlation coefficient of 0.23 showed that the predictions of the two models were significantly different. The MMF model revealed more clustered patterns of predicted critical erosion classes, with a soil loss of more than 10 ton/ha/year in particular ranges of the pipelines, whereas the USLE model showed a more widespread spatial distribution. The average coefficients of variation of predicted soil loss rates and the number of accurately predicted erosion occurrences within the geomorphometric elements of terrain, vegetation cover and landuse categories were larger for the USLE model. This supported the hypothesis that larger spatial variations of erosion prediction models can contribute to better soil loss prediction performance and reliability.
Two-dimensional ionospheric total electron content (TEC) data for the period from 00:00 UT on 2 July to 12:00 UT on 8 July 2013, i.e. from 5 days before to 1 day after the deep earthquake (Mw = 7.2, depth about 378.8 km) at 18:35:30 UT on 7 July 2013 in Papua New Guinea, were examined by two-dimensional principal component analysis (2DPCA) to detect a TEC precursor related to the earthquake, because TEC precursors usually appear in earlier time periods. A TEC precursor was highly localized around the epicenter from 06:00 to 06:05 on 6 July, with a duration of at least 5 minutes. Release of ionizing radon gas is a possible cause of the anomalous TEC fluctuation, i.e. the electron density variation. The plasma might have experienced large damping at that time, producing a TEC fluctuation of short duration, and the gas might have been released in small amounts over a short period; 2DPCA was nevertheless able to identify this short-duration TEC fluctuation. Other background TEC anomalies, caused by geomagnetic storms, small earthquakes and non-earthquake activity such as the equatorial ionization anomaly, resulted in small principal eigenvalues; the detection of the TEC precursor was therefore unaffected by these background anomalies.
In March 2014, a catastrophic landslide in Washington State destroyed a community and killed 43 people. An analysis of the available information using a new approach to manage risk when dealing with rare, high consequence hazards indicates that if the risk for another landslide is accepted, then the expected time between occurrences of massive landslides at this location is about 2000–3000 years, the mean occurrence rate tends to increase with time since the last occurrence, and the alternative of avoiding the risk is preferred if the present worth cost to avoid it (i.e. prevent development) for 100 years is less than about 1/6 the cost of another massive landslide.
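The break-even comparison in the last clause can be sketched as a discounted expected-cost calculation. Note that this sketch assumes a constant annual occurrence rate and an assumed discount rate, whereas the study's roughly one-sixth threshold reflects an occurrence rate that increases with time since the last event:

```python
# Break-even check for risk avoidance (constant-rate sketch; values assumed)
def annuity_factor(rate, years):
    # present worth of a unit annual cost over `years` years
    return (1.0 - (1.0 + rate) ** -years) / rate

annual_rate = 1.0 / 2500.0   # mean recurrence of ~2000-3000 yr (from the study)
discount = 0.03              # assumed real discount rate

# discounted expected loss over 100 years, as a fraction of one landslide's cost
loss_fraction = annual_rate * annuity_factor(discount, 100)

def prefer_avoidance(prevention_cost, landslide_loss):
    """Avoid the risk if prevention is cheaper than the discounted expected loss."""
    return prevention_cost < loss_fraction * landslide_loss
```

With these assumptions the break-even fraction is much smaller than one-sixth, which illustrates how strongly the time-increasing occurrence rate in the study tilts the decision toward avoidance.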
ISO2394:2015 contains a new informative Annex D on “Reliability of Geotechnical Structures”. The emphasis in Annex D is to identify and characterize critical elements of the geotechnical reliability-based design process, while respecting the diversity of geotechnical engineering practice. This paper highlights the main features of Annex D and gaps for future work.
The ISSMGE TC309/TC304/TC222 Third Machine Learning in Geotechnics Dialogue (3MLIGD) was hosted online by the Norwegian Geotechnical Institute on 3 December 2021. There is a consensus that the potential of digital transformation in geotechnical site characterisation is significant. Nonetheless, there is a clear-eyed recognition that the industry is currently governed by a set of rules that evolved from Industry 3.0 and it is only beginning to explore the potential of digital technologies. This state is to be expected as digital transformation is expected to change the “rules of the game” in the context of Industry 4.0 that is rapidly evolving in tandem with emerging technologies. The number of practitioners and researchers who are interested in data-centric geotechnics remains a small minority. There is a unanimous view that this small community can achieve greater impact and hasten progress by fostering more collaborations and working more closely together through: (1) data sharing, (2) creating a “yellow page” of people and projects to facilitate greater connectivity, (3) establishing novel collaborative modes between industry and academia, (4) demonstrating value through “ML supremacy” projects that include mapping studies covering large, real-time, multi-source datasets over large spatial domains, and (5) educating young talents by creating ML internships.
This paper shows the results of a numerical analysis carried out using the equivalent frame method, which aimed to investigate the three-dimensional behaviour of masonry buildings with shallow foundations subjected to different settlement troughs on a set of soil types. The obtained results are first expressed in terms of deterministic relationships between the adopted intensity measure (the maximum differential settlement) and a newly defined engineering demand parameter (the vertical drift ratio). Then, following a probabilistic approach, fragility curves pertaining to the modelled masonry buildings are generated, and their usefulness in preventing the occurrence of a serviceability limit state is discussed.
Since soils are natural and non-homogeneous materials, the spatial variability of their properties has been recognised as one of the main sources of uncertainty affecting geotechnical analyses. However, soil testing is limited, and the description and quantification of the resulting soil variability still remain largely subjective in practice. In this paper, an approach is proposed to rationally evaluate soil spatial variability using field data provided by a site investigation programme based on a lightweight dynamic cone penetrometer test. The first part deals with the boundary identification of statistically and mechanically homogeneous soil units. The second part focuses on modelling spatial variability through 3D conditional random fields in the homogeneous soil units previously identified. This approach has been applied to a real site investigation carried out in an alluvial Mediterranean deltaic environment.
This paper investigates the influence of three forms of uncertainty on the probabilistic stability of an idealised 3D embankment slope. These are: 1D spatial variability in the external geometry of the slope along its length, 2D spatial variability in the depth of the boundary between the embankment material and the foundation layer, and 3D spatial variability in the shear strength properties of the slope and foundation materials. The relative influence of each uncertainty has been investigated using the random finite element method, based on statistics consistent with a Dutch regional dyke. The results indicate that, for such a structure, the soil spatial variability has a much greater influence than uncertainties relating to embankment geometry and inter-layer boundary. In particular, it is demonstrated that the spatial correlation of material properties along the length of the embankment has a greater influence on the probabilistic characteristics of the embankment slope stability and failure consequence than the spatial correlation of properties perpendicular to it. A worst case scale of fluctuation for the material properties is identified.
Issues associated with tunnel construction and adjacent building damage risk are becoming increasingly important as cities expand and make more use of their underground space. A typical geotechnical engineering problem is how to determine the ground settlement susceptibility of buildings due to tunnelling excavations. A risk assessment, considering the analysis of several factors, is required for sustainable and resilient planning of urban areas and underground space. The Analytical Hierarchy Process (AHP) has been widely adopted for similar objectives, while AutoRegressive eXogenous modelling (ARX) as the dynamic component has the potential to produce the necessary analysis to underpin the assessment. Additionally, the database and visualisation capabilities of the Building Information Modelling (BIM) framework could provide a promising environment for accommodating such a risk assessment. The proposed methodology reported herein has employed a spatiotemporal analysis using AHP and ARX to produce analytical outcomes that involved several settlement-inducing criteria with respect to their severity and time-dependent groundwater table level changes, respectively. The integration of these analyses within the BIM framework has produced a tool that can define in detail the settlement vulnerability and building-damage assessments within a 3D geology-tunnel-buildings model.
• A 3D geology-tunnel-buildings model is developed using BIM
• A spatiotemporal analysis provides the tunnelling-induced settlement vulnerability
• A building damage assessment provides the tunnelling-induced settlement hazard
• The tunnelling-induced settlement risk is assessed and presented using BIM
• The resulting BIM visualisations support urban land-use planning
In the bridge design specifications of the American Association of State Highway and Transportation Officials using the Load and Resistance Factor Design method, the loads and resulting force effects are given two-letter designations, e.g. “SE” for “force effects due to settlement”. The same two-letter designation is used as a subscript to distinguish load factors, γ, to be applied to loads and other actions acting on the bridge. Thus, the load factor γSE is used to develop factored values of the additional induced force effects such as moments and shears in a bridge structure due to foundation movements. This paper presents the process to calibrate the SE load factor as a function of the target reliability index based on structural service limit states such as cracking in a bridge superstructure. In this process, the target reliability index, uncertainty in predicted movements, and user-specified deterministic tolerable movements are considered in a unified manner. The calibration process can be used for any analytical method and any pattern of movement. The calibration process is demonstrated by an example of immediate settlement for five analytical methods using a dataset based on 20 instrumented spread footings from 10 bridges in the northeast USA.
Landslides represent a serious hazard in many areas around the world, potentially leading to human losses and significant damage to structures and buildings. For this reason, a considerable number of studies have been carried out over the years to analyse these natural phenomena and their evolution. This study presents an automatic procedure specifically developed to identify the onset of landslide acceleration by analysing monitoring displacement data with a multi-criteria approach. The proposed procedure identifies this point by applying a four-level validation process to a pre-determined dataset. Once the analysis returns a positive result for a certain number of monitoring data, it is possible to state that the landslide has reached the accelerating phase of its evolution, thus allowing a specific point representing the onset of acceleration to be defined. The method was applied to several historical case studies taken from the scientific literature in order to test its practicability and effectiveness. This procedure could be especially useful in Early Warning Systems where time-of-failure forecasting models are implemented, improving their performance by providing an automated and reliable way to define the beginning of potentially critical landslide events.
To enhance computational efficiency in slope reliability analyses, a binary classification method (BCM) that takes advantage of a judgement-based strength reduction method (SRM) and an active-learning support vector machine (SVM) was developed to conduct system reliability analyses of layered soil slopes. The SVM was naturally employed to establish a binary classifier via the judgement technique to approximate the true limit state function, because the stability state (e.g. stable or unstable) of a slope can be determined according to the SRM without calculating its exact factor of safety. An active-learning technique was developed to iteratively search for training samples in the vicinity of the border between the safe and failure domains, updating the SVM classifier with the aid of a modified initial sampling rule. Then, Latin hypercube sampling was employed, together with the obtained SVM classifier, to compute the slope system probability of failure. Three representative examples taken from the literature were employed to evaluate the performance of the proposed method. The proposed BCM shows great computational efficiency compared with existing methods: it reduced the computational cost to several minutes for simple slopes and to approximately 30 minutes for a complex real case, while maintaining good computational accuracy.
Driven piles are widely used as an effective and convenient structural component to transfer superstructure loads to deep, stiffer soils. Nevertheless, during the design process, the internal stress state related to pile drivability remains unclear due to intrinsic complexity and the many design variables involved, which makes the analysis imprecise. The development of an accurate predictive model is therefore needed. This paper presents a practical approach to assessing pile drivability by predicting the maximum compressive stress and blows per foot using a series of machine learning algorithms. A database of more than 4000 piles is employed to construct random forest regression (RFR) and multivariate adaptive regression splines (MARS) models. The 10-fold cross-validation method and Lasso regularisation are adopted to obtain models with superior generalisation ability and more persuasive results. Lastly, the results of the RFR and MARS models were compared and evaluated with respect to goodness of fit, running time and interpretability. The results show that the RFR model performs better than MARS in terms of fitting and operational efficiency, but falls short on interpretability.
This paper presents a quantitative framework to optimise the performance of embedded footings subjected to extreme historical climate events, accounting for the uncertainties associated with site-specific soil and climatic parameters. The proposed framework is based on partially saturated soil mechanics principles in conjunction with a multi-objective optimisation algorithm, the Non-dominated Sorting Genetic Algorithm (NSGA-II), to develop a robust optimised design procedure. The proposed method was applied to two semi-arid climate sites, Riverside and Victorville, both situated in California, United States. The results show that the proposed method generally improves the embedded footing design compared to conventional methods in terms of cost and performance. Under the extreme climate conditions, the proposed method estimates the average soil degree of saturation within the footing influence zone to be between 52% and 95%, with a mean value of 63.1%, for the Victorville site, and between 57% and 90%, with a mean value of 81.6%, for the site in Riverside. It is also found that the optimal design from the proposed method shows a lower total construction cost, by 44% and 19% for the Victorville and Riverside sites, respectively, compared to designs produced by conventional methods.
Geotechnical engineering analysis and design involves considerable uncertainties. Reliability-based design optimisation (RBDO) intends to minimise an objective function such as material quantity or cost subject to design requirements specified by probabilistic constraints. In this work, the RBDO of geotechnical systems is studied using a decoupled approach based on adaptive metamodels. To improve the traditional double-loop structure of RBDO, the proposed approach decouples the numerical reliability analysis loop from the design optimisation loop. A metamodeling method based on augmented radial basis function (ARBF) is adopted to create approximate functions of the reliability indices, so that the design optimisation is performed based on the metamodels of the probabilistic constraints. To improve the accuracy of predicting the reliability indices, an adaptive technique of the ARBF metamodels is applied. The optimal point in one design iteration is treated as an additional sample point for updating the metamodels of the reliability indices. The accuracy of the metamodels is progressively enhanced through this dynamic process, especially in the neighbourhood of the optimal point. Mathematical and geotechnical engineering examples are solved and numerical results are presented. The proposed design optimisation framework works well and is a useful alternative for solving RBDO of geotechnical engineering problems.
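The core loop, fit an augmented-RBF surrogate, use it, then refit with the newest design point, can be sketched in 1D. The multiquadric kernel with a linear polynomial tail and the stand-in 'reliability index' function below are illustrative choices, not the paper's specific ARBF formulation:

```python
import math

def multiquadric(r, c=1.0):
    return math.sqrt(r * r + c * c)

def fit_rbf(xs, ys):
    """Augmented RBF interpolant: f(x) = sum w_i phi(|x-x_i|) + a + b*x."""
    xs = list(xs)
    n = len(xs)
    m = n + 2
    A = [[0.0] * (m + 1) for _ in range(m)]
    for i in range(n):
        for j in range(n):
            A[i][j] = multiquadric(abs(xs[i] - xs[j]))
        A[i][n], A[i][n + 1] = 1.0, xs[i]   # polynomial tail
        A[n][i], A[n + 1][i] = 1.0, xs[i]   # orthogonality conditions
        A[i][m] = ys[i]
    # Gauss-Jordan elimination with partial pivoting
    for col in range(m):
        piv = max(range(col, m), key=lambda r_: abs(A[r_][col]))
        A[col], A[piv] = A[piv], A[col]
        for r_ in range(m):
            if r_ != col:
                f = A[r_][col] / A[col][col]
                for c_ in range(col, m + 1):
                    A[r_][c_] -= f * A[col][c_]
    coef = [A[i][m] / A[i][i] for i in range(m)]
    return lambda x: (sum(coef[i] * multiquadric(abs(x - xs[i])) for i in range(n))
                      + coef[n] + coef[n + 1] * x)

beta = lambda x: 2.0 + math.sin(x)   # stand-in 'reliability index' surface (assumed)

xs = [0.0, 1.5, 3.0]                 # initial design of experiments
model = fit_rbf(xs, [beta(x) for x in xs])
err_before = abs(model(2.2) - beta(2.2))

xs.append(2.2)                       # adaptive step: add the current optimal point
model = fit_rbf(xs, [beta(x) for x in xs])
err_after = abs(model(2.2) - beta(2.2))
```

Because the interpolant honours its nodes exactly, the surrogate error at the added design point drops to essentially zero after the update, which is the mechanism by which the metamodel sharpens around the optimum as the iterations proceed.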
Geoengineering prognoses are often based on data from a limited number of investigations of the soil and rock mass. There is generally a desire to reduce the uncertainty in the prognoses while minimising investigation costs. Value of Information Analysis (VOIA) supports decisions regarding investigation strategies, and the aim of this paper is to present a VOIA methodology that takes into account four decision alternatives, where the input data can be provided by experts. The methodology is applied in a case study in which the value of information of an investigation borehole is calculated. The results indicate that the value of information of the borehole is low compared with the realisation costs of the investigation. It was found that models for VOIA in underground construction projects are complex, but that the analysis can be simplified by extensive use of expert knowledge and by calculating the value of perfect information as a benchmark for investigation strategies.
Underground mining and oil and gas drilling have increasingly encroached on public water reservoirs and dams because of overwhelming energy demand combined with a growing population. Cases of surface water reservoirs and mine waste impoundments being drained, and of dam infrastructure being damaged, as a result of accidents have been documented. The methods used by regulators and industry for determining mining or drilling offset distances are based primarily on three approaches and studies performed in the early 1970s. The former US Bureau of Mines Information Circular 8741 was the culmination of these studies and remains in use for determining offset distances for underground mining with respect to dams and reservoirs. That work used analytical and empirical methods based on subsidence effects to recommend offset distances specifically for miners' safety. A major limitation of the previous studies is that they did not detail the effect mining operations have on groundwater flow through soil and overburden permeability changes triggered by vertical ground surface subsidence, changes that could increase the risk to a reservoir or dam infrastructure. This paper presents a review of international literature on mining under surface bodies of water (reservoirs) and presents a risk-based event tree analysis quantifying the probability of changes in subsurface permeability due to overburden strain changes. A sensitivity analysis quantifying the probability of increased subsurface permeability as a function of offset distance from a reservoir is presented and discussed. Empirical results indicate that, for a 350 ft (107 m) deep mine at offsets of 200, 400, and 600 ft (61, 122, and 183 m), the probability of permeability changes was 41%, 0.66%, and 0.0067%, respectively.
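The mechanics of an event tree analysis of this kind are simple: the probability of each outcome is the product of the conditional branch probabilities along its path. The branch probabilities below are illustrative placeholders, not the paper's values:

```python
from math import prod

# Illustrative branch probabilities (not the paper's values): subsidence occurs,
# overburden tensile strain exceeds its limit, permeability then increases.
paths = {
    "subsidence, strain > limit, permeability increase": [0.8, 0.5, 0.3],
    "subsidence, strain > limit, no permeability change": [0.8, 0.5, 0.7],
    "subsidence, strain within limit":                    [0.8, 0.5],
    "no subsidence":                                      [0.2],
}

# Outcome probability = product of the conditional branch probabilities
path_probs = {name: prod(branches) for name, branches in paths.items()}
p_adverse = path_probs["subsidence, strain > limit, permeability increase"]
```

Because the branches at each node are complementary, the path probabilities sum to one; in the sensitivity analysis, the strain-exceedance branch would be re-evaluated for each offset distance, driving the strong decay of the adverse-outcome probability with distance reported above.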
The economical and safe design of footings supported on aggregate-pier-reinforced clay could benefit from the implementation of a reliability-based approach that incorporates the different sources of uncertainty. Monte Carlo simulations are conducted to quantify the probability distribution of the ultimate bearing capacity for practical design scenarios. A reliability analysis is then conducted to propose design charts that yield the required factor of safety as a function of the major input parameters. The novelty in the proposed methodology is the incorporation of a lower bound shear strength that is based on the remoulded undrained shear strength in the reliability analysis.
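The role of the lower-bound (remoulded) shear strength in such a Monte Carlo analysis can be sketched as follows. All inputs are hypothetical, and the bearing capacity model is deliberately simplified to q_ult = N_c·s_u rather than the paper's aggregate-pier formulation:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical inputs: lognormal undrained shear strength truncated at a
# lower bound equal to the remoulded undrained strength (values illustrative).
mean_su, cov_su = 50.0, 0.3     # mean (kPa) and coefficient of variation
su_lb = 25.0                    # remoulded undrained strength, kPa

# lognormal parameters from the mean and COV
sigma_ln = np.sqrt(np.log(1.0 + cov_su**2))
mu_ln = np.log(mean_su) - 0.5 * sigma_ln**2
su_raw = rng.lognormal(mu_ln, sigma_ln, n)
su = np.maximum(su_raw, su_lb)  # impose the lower-bound shear strength

q_ult_raw = 5.14 * su_raw       # simplified bearing capacity, N_c = 5.14
q_ult = 5.14 * su
q_applied = 120.0               # kPa, hypothetical demand

pf_no_lb = np.mean(q_ult_raw < q_applied)   # failure probability, unbounded tail
pf_with_lb = np.mean(q_ult < q_applied)     # with the physical lower bound
```

In this configuration the lower-bound strength already guarantees the applied pressure, so the truncated distribution gives zero failure probability while the unbounded lognormal tail does not; this is why incorporating a physically motivated lower bound can substantially change the required factor of safety.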
The selection of landfill sites for municipal solid waste (MSW) disposal involves consideration of geological, hydrological and environmental parameters which exhibit large spatial variability. It is therefore necessary to establish how reliable the chosen sites are, so that the probability of environmental pollution and of health risks to the population is minimal. In the present study, groundwater vulnerability to contamination was assessed using the standard DRASTIC method. The results showed that the study region comprises 9.45% very low, 32.94% low, 25.47% moderate, 22.79% high and 9.35% very high vulnerability zones. The study also revealed that none of the existing landfills is located in a safe zone, so proper remedial measures are required to avoid environmental pollution. A landfill site selection process was carried out using the Analytical Hierarchy Process integrated with Geographical Information System tools. The results showed that only 3.59 km² (0.08%) of the total area is suitable for landfills. The reliability analysis of site suitability revealed that the existing landfills occupy unreliable locations where the probability of environmental pollution is high. The presented approach assists decision-makers in selecting reliable locations for the safe disposal of MSW.
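The DRASTIC index itself is a weighted sum over seven hydrogeological parameters (Depth to water, net Recharge, Aquifer media, Soil media, Topography, Impact of the vadose zone, hydraulic Conductivity) using the method's standard weights. The ratings and class breakpoints below are hypothetical values for a single grid cell, for illustration only:

```python
# Standard DRASTIC weights; the 1-10 parameter ratings are hypothetical
# values for a single grid cell of a study area.
weights = {"D": 5, "R": 4, "A": 3, "S": 2, "T": 1, "I": 5, "C": 3}
ratings = {"D": 7, "R": 6, "A": 6, "S": 5, "T": 9, "I": 8, "C": 4}

# Vulnerability index: weighted sum of the ratings (higher = more vulnerable)
drastic_index = sum(weights[k] * ratings[k] for k in weights)

# Illustrative class breakpoints (study-specific in practice)
if drastic_index < 100:
    v_class = "low"
elif drastic_index < 140:
    v_class = "moderate"
else:
    v_class = "high"
```

Computing this index cell by cell over a raster, then classifying the result, produces the vulnerability-zone percentages reported above.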
The North Qazvin region is part of the Central Alborz Mountains in Iran and has experienced destructive earthquakes. It is a populous industrial zone near Tehran, the capital of Iran. To identify the locations of highest and lowest seismic hazard and, consequently, to produce a seismic zonation of the region, different parameters such as topography, geology, tectonics and seismicity were examined. On this basis, the north Qazvin region can be divided into three subzones: western, eastern and southern. The seismic activity of the western zone is higher than that of the other zones, while the seismic potential of the eastern zone is the highest of the three. Such zoning is necessary in all seismically active areas in order to identify the most dangerous zones.
Rockfall events constitute one of the most dangerous phenomena in mountainous areas and can affect transportation routes. From a risk mitigation perspective, quantifying the risk to pedestrians and vehicles is a crucial task for the authorities. A method tailored to these elements at risk is presented herein. The proposed method is based on a combined formulation of the Quantitative Risk Assessment and Event Tree Analysis approaches. Following these procedures, an accurate evaluation of the annual probability of adverse outcomes can be computed by considering all the scenarios that can lead to a fatality or an injury. Conversely, the method allows the allowable traffic conditions to be evaluated, given an acceptable risk threshold. Furthermore, it serves to quantify the risk reduction achieved by installed passive mitigation measures and thus to prioritise intervention works. An application to a case study in the Italian Alps illustrates the potential of the methodology.
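The two directions of use described in the abstract (computing the annual risk from traffic data, and inverting for the allowable traffic given a risk threshold) can be sketched with a common simplification in which the probability of a vehicle occupying the impact point is estimated from traffic density. All numbers below are assumed for illustration and are not the paper's values:

```python
# Illustrative inputs for the fatality risk to vehicle occupants (all assumed)
f_rockfall = 0.5        # rockfall events per year reaching the road
n_vehicles = 2000.0     # average daily traffic, vehicles/day
veh_length_km = 0.006   # average vehicle length, km (6 m)
speed_kmh = 60.0        # travel speed
vulnerability = 0.3     # probability of a fatality given a direct impact

# probability that a vehicle occupies the impact point at a random instant
p_present = n_vehicles * veh_length_km / (24.0 * speed_kmh)

# annual probability of a fatality for this rockfall source
annual_fatality_prob = f_rockfall * p_present * vulnerability

# inverse use: allowable daily traffic for an acceptable risk threshold
risk_threshold = 1e-4
allowable_traffic = risk_threshold / (f_rockfall * vulnerability
                                      * veh_length_km / (24.0 * speed_kmh))
```

With these assumed inputs the computed risk exceeds the threshold, so the inverse calculation indicates how far traffic would need to be restricted; the effect of a passive barrier would enter by reducing `f_rockfall` or `vulnerability`, which is how the risk reduction of mitigation works can be quantified in the same framework.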