
Technical note: Design flood under hydrological uncertainty


Abstract

Planning and verification of hydraulic infrastructures require a design estimate of hydrologic variables, usually provided by frequency analysis while neglecting hydrologic uncertainty. However, when hydrologic uncertainty is accounted for, the design flood for a specific return period is no longer a single value, but is represented by a distribution of values. The design flood is thus no longer uniquely defined, leaving the design process underdetermined. The Uncertainty Compliant Design Flood Estimation (UNCODE) procedure is a novel approach that, starting from the range of possible design flood estimates obtained under uncertainty, converges to a single design value. This is achieved through a cost–benefit criterion with additional constraints, solved numerically in a simulation framework. This paper promotes practical use of the UNCODE procedure without resorting to numerical computation. A modified procedure is proposed in which a correction coefficient modifies the standard (i.e., uncertainty-free) design value on the basis of sample length and return period only. The procedure is robust and parsimonious, as it requires no additional parameters beyond those of the traditional uncertainty-free analysis. Simple equations to compute the correction term are provided for a number of probability distributions commonly used to represent the flood frequency curve. The UNCODE procedure, when coupled with this simple correction factor, provides a robust way to manage hydrologic uncertainty and to go beyond the use of traditional safety factors. All other parameters being equal, an increase in sample length reduces the correction factor, and thus the construction costs, while keeping the same safety level.
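To make the correction-factor idea concrete, here is a minimal sketch, assuming (purely for illustration) a Gumbel flood frequency curve and a placeholder correction factor K(n, T) greater than one; the actual coefficients must be taken from the paper's equations:

```python
import numpy as np
from scipy import stats

def standard_design_flood(sample, T):
    """Uncertainty-free design flood: the T-year quantile of a Gumbel
    distribution fitted to the annual maxima (illustrative choice)."""
    loc, scale = stats.gumbel_r.fit(sample)
    return stats.gumbel_r.ppf(1.0 - 1.0 / T, loc, scale)

def uncode_design_flood(sample, T, K):
    """UNCODE-style estimate: the standard value inflated by a correction
    factor K(n, T) >= 1 depending only on sample length and return period.
    K is a placeholder here; the paper provides the equations for it."""
    return K * standard_design_flood(sample, T)

# usage sketch with synthetic annual maxima
rng = np.random.default_rng(1)
sample = stats.gumbel_r.rvs(loc=100, scale=30, size=40, random_state=rng)
print(uncode_design_flood(sample, T=200, K=1.15))
```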
... Parameter estimation and uncertainty quantification can be achieved in many well-documented ways (e.g., Commonwealth of Australia, 2019; Interagency Advisory Committee on Water Data, 1982; Ramachandra Rao & Hamed, 2019). Such a model focuses on the marginal distribution of the data and is typically used to design some infrastructure (e.g., for flood protection, Botto et al., 2017). However, the assumption of identical distribution does not allow using external information from climate covariates, for instance. ...
Article
Risk assessment for climate‐sensitive systems often relies on the analysis of several variables measured at many sites. In probabilistic terms, the task is to model the joint distribution of several spatially distributed variables, and how it varies in time. This paper describes a Bayesian hierarchical framework for this purpose. Each variable follows a distribution with parameters varying in both space and time. Temporal variability is modeled by means of hidden climate indices (HCIs) that are extracted from observed variables, in contrast with the usual approach of using predefined standard climate indices (SCIs). In the second level of the model, the HCIs and their effects are assumed to follow temporal and spatial Gaussian processes, respectively. Both intervariable and intersite dependencies are induced by the strong effect of common HCIs. The flexibility of the framework is illustrated with a case study in Southeast Australia aimed at modeling "hot‐and‐dry" summer conditions. It involves three physical variables (streamflow, precipitation, and temperature) measured on three distinct station networks, with varying data availability and representing hundreds of sites in total. The HCI model delivers reliable and sharp time‐varying distributions for individual variables and sites. In addition, it adequately reproduces intervariable and intersite dependencies, whereas a corresponding SCI model (where hidden climate indices are replaced with standard ones) strongly underestimates them. It is finally suggested that HCI models may be used as downscaling tools to estimate the joint distribution of several variables at many stations from climate models or reanalyses.
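A minimal generative sketch of the two-level idea, assuming a single hidden index following a temporal Gaussian process and site effects following a spatial Gaussian process (kernels, scales, and the Gaussian first level are illustrative choices, not the paper's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
n_years, n_sites = 60, 12

# hidden climate index: a temporal Gaussian process (squared-exponential kernel)
t = np.arange(n_years)
K_t = np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2 / 5.0 ** 2)
hci = rng.multivariate_normal(np.zeros(n_years), K_t + 1e-8 * np.eye(n_years))

# site effects: a spatial Gaussian process over random site coordinates
xy = rng.uniform(0, 100, size=(n_sites, 2))
d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
K_s = np.exp(-d / 30.0)
effect = rng.multivariate_normal(np.zeros(n_sites), K_s + 1e-8 * np.eye(n_sites))

# first level: each site's annual value is Gaussian with a time-varying mean
# mu[t, s] = baseline + effect[s] * hci[t]; the common HCI is what induces
# intersite (and, with several variables, intervariable) dependence
baseline, sigma = 10.0, 1.0
mu = baseline + np.outer(hci, effect)
y = rng.normal(mu, sigma)          # n_years x n_sites pseudo-observations
print(np.corrcoef(y.T)[0, 1])      # nonzero intersite correlation from the HCI
```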
... In the design process, some researchers have employed a safety factor to overcome the uncertainty inherent in the design flood based on deterministic theories [6,16-20]. However, the derivation of safety factors often depends on experience; it is therefore desirable to quantify the uncertainty with a more rational, data-based method. ...
Article
Agricultural reservoirs play such a central role in supplying water to rural areas that it is essential to properly estimate their design flood under climate change. The objective of this study was to estimate the interval of the inflow design flood using a non-parametric resampling technique for agricultural reservoirs under climate change. The study offers an alternative to point estimation from insufficient past data by providing the interval of the inflow design flood under representative concentration pathway (RCP) scenarios. To estimate the interval, we employed the bootstrap technique, computing the confidence interval corresponding to the 95% confidence level. The study covered a spatial range of 30 agricultural reservoirs in South Korea and a temporal range of one past and three future representative periods: the base period (2015s: 1986–2015) and the future periods (2040s: 2011–2040, 2070s: 2041–2070, 2100s: 2071–2100). We analyzed the results for a 200-year return period and 24-hour duration as a representative case. For the 97.5th bias-corrected and accelerated percentile value, the inflow design floods were generally larger than the base-period value (2015s) with the safety factor applied. The northern and midwestern regions of South Korea showed relatively greater changes than the southeastern region. Some agricultural reservoirs showed a decrease in the design flood during the 2040s, but values generally increased after the 2070s. The non-parametric resampling technique provides an interval estimate that accounts for the uncertainty of the inflow design flood. By presenting results for three future periods, we give policymakers information to select according to the target period. The findings may provide an essential step toward replacing the safety factor used in determining the design flood of agricultural reservoirs with a confidence interval calculated according to statistical characteristics.
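The interval idea can be sketched with a plain percentile bootstrap (the study uses the bias-corrected and accelerated variant) and an illustrative Gumbel fit:

```python
import numpy as np
from scipy import stats

def design_flood(sample, T=200):
    """T-year quantile from a Gumbel fit (illustrative distribution choice)."""
    loc, scale = stats.gumbel_r.fit(sample)
    return stats.gumbel_r.ppf(1 - 1 / T, loc, scale)

def bootstrap_interval(sample, T=200, n_boot=2000, level=0.95, seed=0):
    """Percentile bootstrap CI for the T-year flood; the study applies the
    bias-corrected and accelerated (BCa) correction on top of this idea."""
    rng = np.random.default_rng(seed)
    est = [design_flood(rng.choice(sample, size=len(sample), replace=True), T)
           for _ in range(n_boot)]
    return tuple(np.percentile(est, [50 * (1 - level), 50 * (1 + level)]))

# usage sketch with synthetic annual inflow maxima
inflows = stats.gumbel_r.rvs(loc=150, scale=40, size=30, random_state=42)
print(bootstrap_interval(inflows, T=200))
```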
... While this use is straightforward and can easily be applied in practice, it overinterprets the mathematical definition of the intervals (e.g., Klemeš, 2002; Serinaldi and Kilsby, 2015). Another proposed use entails a cost-benefit analysis to determine a design value which accounts for epistemic uncertainty arising from parameter estimation (Botto et al., 2014, 2017; Gaume, 2018). ...
Article
Conventional methods for designing infrastructure that is subject to flood risk, such as dams and levees, assume a stationary design flood. However, observed and potential non-stationarity in floods can result in costly over-design or dangerous under-design. Despite substantial attention, evidence from the literature makes clear there is no consensus methodology for estimating design variables under climate change. Practical guidance remains elusive. This paper presents a review of the challenges and advances in design of infrastructure for floods under non-stationarity. First, potential sources of non-stationarity in time series of floods are described to provide context and motivation. Second, methods for estimating design floods that rely on the stationary assumption are presented and their limitations are discussed. Third, methods for estimating design floods that assume non-stationarity resulting from climate change are summarized. Finally, the inadequacies of current design methodologies in view of the pervasive uncertainties are assessed and strategies to manage the consequences of those uncertainties are presented.
Article
Frequency analysis (FA) allows the estimation of magnitudes of annual maximum flood and daily rainfall data associated with low probabilities of exceedance. Such estimates, or predictions, enable the hydrologic design of hydraulic works for water use or protection. FA comprises five stages: (1) verification of the randomness of the data; (2) adoption of a probability distribution function (PDF); (3) fitting of the PDF; (4) evaluation of the fit achieved; and (5) selection of the results. This study presents the theoretical basis of the Kappa distribution with four fitting parameters (u, α, k, h), obtained through the method of L-moments, which is described in detail. When its second shape parameter h takes the values -1, 0, and 1, the Kappa distribution reproduces the Generalized Logistic (GLO), Generalized Extreme Value (GEV), and Generalized Pareto (GPA) distributions, respectively. Three joint records of annual floods of peak discharge (Qp) and runoff volume (Vol), two annual records of Qp, and three records of annual maximum daily precipitation were processed. Five distributions were fitted to each of the 11 records: Kappa; the distribution it reproduces according to the value of h (GLO, GEV, or GPA); Log-Pearson type III; Log-Normal; and Wakeby. The statistical quality of each fit was quantified with the standard error of fit and the mean absolute error. The conclusions highlight the similarity of the results (errors and predictions) across the 11 records processed and suggest the systematic application of the Kappa distribution, to complement the distributions mandated by regulation and those in general use.
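The sample L-moments and the Kappa quantile function can be sketched directly; a minimal version, using the standard unbiased probability-weighted-moment estimators and Hosking's parameterization (fitting the Kappa parameters from the L-moments requires the iterative relations described in the paper and is not shown):

```python
import numpy as np

def sample_lmoments(x):
    """First two sample L-moments (l1, l2) and L-moment ratios (t3, t4),
    via the unbiased probability-weighted moments b0..b3."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    j = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((j - 1) * x) / (n * (n - 1))
    b2 = np.sum((j - 1) * (j - 2) * x) / (n * (n - 1) * (n - 2))
    b3 = np.sum((j - 1) * (j - 2) * (j - 3) * x) / (n * (n - 1) * (n - 2) * (n - 3))
    l1, l2 = b0, 2 * b1 - b0
    l3, l4 = 6 * b2 - 6 * b1 + b0, 20 * b3 - 30 * b2 + 12 * b1 - b0
    return l1, l2, l3 / l2, l4 / l2

def kappa_quantile(F, u, alpha, k, h):
    """Quantile function of the four-parameter Kappa distribution:
    x(F) = u + (alpha/k) * (1 - ((1 - F**h) / h)**k).
    h = -1, 0, 1 recover the GLO, GEV, and GPA special cases."""
    inner = -np.log(F) if h == 0 else (1 - F ** h) / h   # GEV limit at h = 0
    if k == 0:                                           # Gumbel-type limit
        return u - alpha * np.log(inner)
    return u + alpha / k * (1 - inner ** k)

print(sample_lmoments(np.random.default_rng(0).gumbel(100, 30, 50)))
print(kappa_quantile(0.99, u=100, alpha=30, k=0.0, h=0.0))  # Gumbel case
```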
Article
Design hydrographs described by peak discharge, hydrograph volume, and hydrograph shape are essential for engineering tasks involving storage. Such design hydrographs are inherently uncertain as are classical flood estimates focusing on peak discharge only. Various sources of uncertainty contribute to the total uncertainty of synthetic design hydrographs for gauged and ungauged catchments. These comprise model uncertainties, sampling uncertainty, and uncertainty due to the choice of a regionalization method. A quantification of the uncertainties associated with flood estimates is essential for reliable decision making and allows for the identification of important uncertainty sources. We therefore propose an uncertainty assessment framework for the quantification of the uncertainty associated with synthetic design hydrographs. The framework is based on bootstrap simulations and consists of three levels of complexity. On the first level, we assess the uncertainty due to individual uncertainty sources. On the second level, we quantify the total uncertainty of design hydrographs for gauged catchments and the total uncertainty of regionalizing them to ungauged catchments but independently from the construction uncertainty. On the third level, we assess the coupled uncertainty of synthetic design hydrographs in ungauged catchments, jointly considering construction and regionalization uncertainty. We find that the most important sources of uncertainty in design hydrograph construction are the record length and the choice of the flood sampling strategy. The total uncertainty of design hydrographs in ungauged catchments depends on the catchment properties and is not negligible in our case.
Article
Effective flood risk mitigation requires the impacts of flood events to be much better and more reliably known than is currently the case. Available post-flood damage assessments usually supply only a partial view of the consequences of floods, as they typically respond to the specific needs of a particular stakeholder. Consequently, they generally focus (i) on particular items at risk, (ii) on a certain time window after the occurrence of the flood, (iii) on a specific scale of analysis or (iv) on the analysis of damage only, without an investigation of damage mechanisms and root causes. This paper responds to the necessity of a more integrated interpretation of flood events as the basis for addressing the variety of needs arising after a disaster. In particular, a model is supplied to develop multipurpose complete event scenarios. The model organizes available information after the event according to five logical axes. This way post-flood damage assessments can be developed that (i) are multisectoral, (ii) consider physical as well as functional and systemic damage, (iii) address the spatial scales that are relevant for the event at stake depending on the type of damage to be analyzed, i.e., direct, functional and systemic, (iv) consider the temporal evolution of damage and finally (v) allow damage mechanisms and root causes to be understood. All the above features are key for the multi-usability of the resulting flood scenarios. The model allows, on the one hand, the rationalization of efforts currently implemented in ex post damage assessments, also with the objective of better programming the financial resources that will be needed for these types of events in the future. On the other hand, integrated interpretations of flood events are fundamental to adapting and optimizing flood mitigation strategies on the basis of thorough forensic investigation of each event, as corroborated by the implementation of the model in a case study.
Article
Merz and Blöschl (2008a, 2008b) proposed the concept of flood frequency hydrology, which highlights the importance of combining local flood data with additional types of information: temporal information on historic floods, spatial information on floods in neighboring catchments, and causal information on the flood processes. Although most of the previous studies combined flood data with only one extra type of information, all three types are used here in a Bayesian analysis. To illustrate ways to combine the additional information and to assess its value, flood frequency analyses before and after the extraordinary 2002 flood event are compared for the 622 km² Kamp river in northern Austria. Although this outlier significantly affects the flood frequency estimates if only local flood data are used (60% difference for the 100 year flood), the effect is much reduced if all additional information is used (only 3% difference). The Bayesian analysis also shows that the estimated uncertainty is significantly reduced when more information is used (for the 100 year return period, the range of the 90% credible intervals reduces from 140% to 31% of the corresponding flood peak estimate). Further analyses show that the sensitivity of the flood estimates to the assumptions made on one piece of information is small when all pieces of information are considered together. While expanding information beyond the systematic flood record is sometimes considered of little value in engineering hydrology because subjective assumptions are involved, the results of this study suggest that the extra information (temporal, spatial, and causal) may outweigh the uncertainty caused by these assumptions.
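The value of temporal (historical) information can be pictured with a minimal Metropolis sketch, assuming a Gumbel parent, flat priors, and historical knowledge limited to the number of exceedances of a perception threshold over a known window; all of these choices are illustrative simplifications, not the paper's Bayesian analysis:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
systematic = stats.gumbel_r.rvs(500, 150, size=30, random_state=rng)  # gauged years
x0, h_years, k_exc = 1200.0, 150, 2   # threshold, historical window, exceedances

def log_posterior(loc, scale):
    if scale <= 0:
        return -np.inf
    lp = stats.gumbel_r.logpdf(systematic, loc, scale).sum()
    # temporal info: in h_years years, k_exc floods exceeded x0 (binomial term)
    p_exc = stats.gumbel_r.sf(x0, loc, scale)
    lp += stats.binom.logpmf(k_exc, h_years, p_exc)
    return lp   # flat (improper) priors, for illustration only

# random-walk Metropolis over (loc, scale)
theta = np.array([500.0, 150.0])
lp_cur, chain = log_posterior(*theta), []
for _ in range(20000):
    prop = theta + rng.normal(0, [15.0, 10.0])
    lp_prop = log_posterior(*prop)
    if np.log(rng.uniform()) < lp_prop - lp_cur:
        theta, lp_cur = prop, lp_prop
    chain.append(theta.copy())
chain = np.array(chain[5000:])

# posterior of the 100-year flood; it narrows when historical info is included
q100 = stats.gumbel_r.ppf(0.99, chain[:, 0], chain[:, 1])
print(np.percentile(q100, [5, 50, 95]))
```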
Article
The specific objective of the paper is to propose a new flood frequency analysis method considering both the uncertainty of probability distribution selection (model uncertainty) and the uncertainty of parameter estimation (parameter uncertainty). Based on Bayesian theory, the sampling distribution of quantiles (design floods) coupling these two kinds of uncertainty is derived, so that not only point estimators but also confidence intervals of the quantiles can be provided. Markov chain Monte Carlo is adopted to overcome the difficulty of computing the integrals involved in estimating the sampling distribution. As an example, the proposed method is applied to flood frequency analysis at a gauge on the Huai River, China. It is shown that approaches considering only model uncertainty or only parameter uncertainty cannot fully account for the uncertainty in quantile estimation; instead, a method coupling the two should be employed. Furthermore, the proposed Bayesian method provides not only various quantile estimators, but also a quantitative assessment of the uncertainties of flood frequency analysis.
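A minimal sketch of coupling model and parameter uncertainty, assuming posterior parameter samples and posterior model probabilities are already available for two candidate distributions (all numbers are illustrative stand-ins for MCMC output, not the paper's results):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
q = 1 - 1 / 100                     # 100-year non-exceedance probability

# stand-ins for MCMC posterior samples of each model's parameters
post_gumbel = rng.normal([800.0, 250.0], [30.0, 20.0], size=(5000, 2))
post_lognorm = rng.normal([6.6, 0.55], [0.05, 0.04], size=(5000, 2))
p_model = np.array([0.6, 0.4])      # posterior model probabilities

# quantile samples under each model, given its parameter uncertainty
q_gum = stats.gumbel_r.ppf(q, post_gumbel[:, 0], post_gumbel[:, 1])
q_ln = np.exp(post_lognorm[:, 0] + post_lognorm[:, 1] * stats.norm.ppf(q))

# coupled sampling distribution: mix the two with the model probabilities
pick = rng.choice(2, size=5000, p=p_model)
q_mix = np.where(pick == 0, q_gum, q_ln)
print(np.percentile(q_mix, [2.5, 50, 97.5]))   # point estimate and interval
```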
Article
Regional frequency analysis (RFA) is a well-established methodology to provide an estimate of the flood frequency curve at ungauged (or scarcely gauged) sites. Different RFA approaches exist, depending on the way the information is transferred to the site of interest, but it is not clear from the literature whether a specific method systematically outperforms the others. The aim of this study is to provide a framework in which to carry out the intercomparison, by building a virtual environment based on synthetically generated data. The considered regional approaches include: (i) a unique regional curve for the whole region; (ii) a multiple-region model where homogeneous subregions are determined through cluster analysis; (iii) a Region-of-Influence model which defines a homogeneous subregion for each site; (iv) a spatially-smooth estimation procedure where the parameters of the regional model vary continuously in space. Virtual environments are generated considering different patterns of heterogeneity, including step changes and smooth variations. If the region is heterogeneous, with the parent distribution changing continuously within the region, the spatially-smooth regional approach outperforms the others, with overall errors 10%-50% lower than the other methods. In the case of a step change, the spatially-smooth and clustering procedures perform similarly if the heterogeneity is moderate, while clustering procedures work better when the step change is severe. To extend our findings, an extensive sensitivity analysis has been performed to investigate the effect of sample length, number of virtual stations, return period of the predicted quantile, variability of the scale parameter of the parent distribution, number of predictor variables, and different parent distributions. Overall, the spatially-smooth approach appears to be the most robust, as its performance is more stable across different patterns of heterogeneity, especially when short records are considered.
Article
Streamflow at ungauged sites is often predicted by means of regional statistical procedures. The standard regional approaches do not preserve the information related to the hierarchy among gauged stations deriving from their location along the river network. However, this information is important when estimating runoff at a site located immediately upstream or downstream of a gauging station. We propose here a novel approach, referred to as the Along-Stream Estimation (ASE) method, to improve runoff estimation at ungauged sites. The ASE approach starts from the regional estimate at an ungauged (target) site and corrects it based on regional and sample estimates of the same variable at a donor site, where sample data are available. A criterion to define the domain of application around each donor site is proposed, and the uncertainty inherent in the resulting estimates is evaluated. This allows one to compare the variance of the along-stream estimates to that of other models that eventually become available for application (e.g., regional models), and thus to choose the most accurate method (or to combine different estimates). The ASE model was applied in the northwest of Italy in connection with an existing regional model for flood frequency analysis. The analysed variables are the first L-moments of the annual discharge maxima. The application demonstrates that the ASE approach can be used effectively to improve the regional estimates of the L-moment of order one (the index flood), particularly when the area ratio of a donor-target pair of basins is less than or equal to ten. However, in this case study, the method does not provide significant improvements to the estimation of higher-order L-moments.
Citation: Ganora, D., Laio, F., and Claps, P., 2013. An approach to propagate streamflow statistics along the river network. Hydrological Sciences Journal, 58(1), 1–13.
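Since the abstract does not give the ASE equations, the sketch below uses a plausible multiplicative correction purely for illustration, together with the suggested area-ratio limit of ten for the domain of application:

```python
def ase_estimate(regional_target, regional_donor, sample_donor,
                 area_target, area_donor):
    """Illustrative along-stream correction of a regional estimate at an
    ungauged target using a gauged donor on the same river. The
    multiplicative form is an assumption made here for illustration;
    the ASE paper derives its own correction and its variance."""
    ratio = max(area_target, area_donor) / min(area_target, area_donor)
    if ratio > 10:                     # outside the suggested domain
        return regional_target         # fall back to the regional model
    return regional_target * (sample_donor / regional_donor)

# usage: index flood (first L-moment) at a target just downstream of a donor
print(ase_estimate(regional_target=95.0, regional_donor=80.0,
                   sample_donor=100.0, area_target=520.0, area_donor=450.0))
```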
Article
Hydraulic infrastructures are commonly designed with reference to target values of flood peak, estimated using probabilistic techniques such as flood frequency analysis. The application of these techniques entails levels of uncertainty which are sometimes quantified but normally not accounted for explicitly in the decision regarding design discharges. The present approach defines a procedure that yields UNcertainty COmpliant DEsign (UNCODE) values of flood peaks. To pursue this goal, we first demonstrate the equivalence of the standard design based on the return period and the cost-benefit procedure when linear cost and damage functions are used. We then use this result to assign an expected cost to estimation errors, thus setting a framework to obtain a design flood estimator which minimises the total expected cost. This procedure properly accounts for the uncertainty inherent in the frequency curve estimation. Application of the UNCODE procedure to real cases leads to remarkable displacement of the design flood from the standard values. UNCODE estimates are systematically larger than the standard ones, with substantial differences (up to 55%) when large return periods or short data samples are considered.
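The cost-benefit core of the procedure can be sketched numerically; a minimal version, assuming linear construction-cost and damage functions and representing estimation uncertainty by bootstrap resampling of the record (the coefficients and the Gumbel choice are illustrative, not the paper's):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
record = stats.gumbel_r.rvs(400, 120, size=25, random_state=rng)
c_cost, c_dam = 1.0, 200.0          # linear cost and damage coefficients

# sampling distribution of the frequency curve via bootstrap refitting
params = []
for _ in range(200):
    boot = rng.choice(record, size=record.size, replace=True)
    params.append(stats.gumbel_r.fit(boot))

def expected_total_cost(x_design, n_mc=2000):
    """E[construction cost + flood damage], averaged over the bootstrap
    parameter sets (estimation uncertainty) and Monte Carlo floods
    (hydrologic randomness); damage is linear above the design value."""
    cost = 0.0
    for loc, scale in params:
        floods = stats.gumbel_r.rvs(loc, scale, size=n_mc, random_state=rng)
        cost += c_cost * x_design + c_dam * np.maximum(floods - x_design, 0).mean()
    return cost / len(params)

grid = np.linspace(400, 1500, 23)
best = grid[np.argmin([expected_total_cost(x) for x in grid])]
print(best)   # UNCODE-style minimum-expected-total-cost design value
```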
Article
Epistemic uncertainty is a result of knowledge deficiency about the system. Sampling error exists when limited amounts of hydrologic data are used to estimate a T-year event quantile. Both the natural randomness of hydrologic data and the sampling error in design quantile estimation contribute to the uncertainty in flood damage estimation. This paper presents a framework for evaluating a flood-damage-mitigation project in which both the hydrologic randomness and the epistemic uncertainty due to sampling error are considered in flood damage estimation. Different risk-based decision-making criteria are used to evaluate project merits based on the mean, standard deviation, and probability distribution of the project net benefits. The results show that the uncertainty of the project net benefits is quite significant. Ignoring the data sampling error will underestimate the potential risk of each project. It can be clearly shown that adding data to existing sample observations leads to improved quality of information, enhanced reliability of the estimators, and reduced sampling error and uncertainty in the project net benefits. Through the proposed framework, the proper length of the extended record for risk reduction can be determined to achieve the required level of acceptable risk.
Article
Identification of the flood frequency curve in ungauged basins is usually performed by means of regional models based on the grouping of data recorded at various gauging stations. The present work aims at implementing a regional procedure that overcomes some of the limitations of the standard approaches and adds a clearer representation of the uncertainty components of the estimation. The information in the sample records is summarized in a set of sample L-moments, which become the variables to be regionalized. To transfer the information to ungauged basins we adopt a regional model for each of the L-moments, based on a comprehensive multiple regression approach. The independent variables of the regression are selected among a large number of geomorphoclimatic catchment descriptors. Each model is calibrated on the entire dataset of stations using non-standard least-squares techniques accounting for the sample variability of L-moments, without resorting to any grouping procedure to create sub-regions. In this way, L-moments are allowed to vary smoothly from site to site in the descriptor space, following the variation of the descriptors selected in the regression models. This approach overcomes the subjectivity affecting the techniques for the definition and verification of homogeneous regions. In addition, the method provides accurate confidence bands for the frequency curves estimated in ungauged basins. The procedure has been applied to a vast region in North-Western Italy (about 30,000 km²). Cross-validation techniques are used to assess the efficiency of this approach in reconstructing the flood frequency curves, demonstrating the feasibility and the robustness of the approach.
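The regression step can be pictured as a weighted least-squares fit of a sample L-moment on catchment descriptors; the sketch below, with invented descriptors and simple record-length weights, is a simplified stand-in for the paper's non-standard least-squares scheme that accounts for the sampling variability of the L-moments:

```python
import numpy as np

rng = np.random.default_rng(5)
n_sites = 40

# illustrative descriptors: log catchment area and mean annual precipitation
area = rng.uniform(50, 3000, n_sites)
map_ = rng.uniform(700, 1600, n_sites)
X = np.column_stack([np.ones(n_sites), np.log(area), map_ / 1000.0])

# synthetic "observed" log index flood, noisier where records are shorter
record_len = rng.integers(10, 60, n_sites)
true_log_l1 = 0.5 + 0.9 * np.log(area) + 0.4 * map_ / 1000.0
y = true_log_l1 + rng.normal(0, 1.0 / np.sqrt(record_len))

# weighted least squares: weight each site by its record length
W = np.diag(record_len.astype(float))
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
print(beta)   # regression coefficients for the log index flood
```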
Article
In the risk-based design for hydraulic structures, the major task is the evaluation of the annual expected damage costs caused by floods. Due to the use of a limited amount of data in flood frequency analysis, the computed flood magnitude of a specified return period is subject to uncertainty. A methodology to integrate such uncertainty in the evaluation of annual expected flood damage is developed and illustrated through an example in culvert design. The effect of uncertainty in estimating flood magnitude using different hydrologic probability models with different sample sizes on the annual expected damage cost is examined. Results of the study show that the effect of the uncertainty in a flood magnitude estimate on annual expected damage is quite significant and is sensitive to the sample sizes and the probability distribution models used.