Article

Adaptive state updating in real-time river flow forecasting - A combined filtering and error forecasting procedure


Abstract

A new robust, accurate and efficient data assimilation procedure based on a general filtering update combined with error forecasting at measurement points is presented. The filtering update procedure is based on a predefined, time invariant weighting (gain) function that is used to distribute model errors at measurement points to the entire state of the river system. The error forecast models are used to propagate model errors at measurement points in the forecast period. The procedure supports a general linear and non-linear formulation of the error forecast models, and fully automatic parameter estimation techniques have been implemented to estimate the parameters of the models based on the observed model errors prior to the time of forecast. The parameter estimates are automatically updated, which allows the error forecast models to adapt to the prevailing conditions at the time of forecast, hence accounting for any structural differences in the model errors in the transition between different flow regimes. The developed procedure is demonstrated in an operational flood forecasting setup of Metro Manila, the Philippines. The results showed significantly improved forecast skill for lead times up to 24 h as compared to forecasting without updating. Erroneous conditions imposed at the downstream boundary were effectively corrected by utilising the harmonic behaviour of the model error in the error forecast model; a situation where the usually applied autoregressive error forecast models would fail.
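The filtering update described in the abstract can be sketched in a few lines: a fixed, time-invariant gain distributes the innovation at a gauge over the whole river state. The triangular gain shape, its half-width and its amplitude below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def gain_update(state, obs_index, innovation, half_width, amplitude=1.0):
    """Distribute the innovation at one measurement point over the whole
    state vector using a predefined, time-invariant triangular gain.
    The triangular shape is an assumed example of a predefined gain."""
    n = len(state)
    distance = np.abs(np.arange(n) - obs_index)
    gain = amplitude * np.clip(1.0 - distance / half_width, 0.0, None)
    return state + gain * innovation

# Usage: water levels on a 1D river grid, one gauge at node 5. The full
# innovation is applied at the gauge and decays linearly to zero four
# nodes away in either direction.
levels = np.zeros(11)
updated = gain_update(levels, obs_index=5, innovation=0.30, half_width=4)
```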




... In this method, the simulation model is combined with the observed model residuals to build an error-corrected forecast model. This method has been widely used in rainfall-runoff and hydrodynamic models (Madsen & Skotner, 2005). Data assimilation has been used in hydrological modeling with numerous models through state updating methods (Rasmussen et al., 2015). ...
... The main concern of this technique is the correction of the output time series. In the forecast model, the model states and variables are updated using a predefined gain function to minimize errors in the time series of innovations (Madsen & Skotner, 2005). The results showed a clear improvement of the updated runs over those without updating. ...
... In "Adaptive state updating in real-time river flow forecasting - A combined filtering and error forecasting procedure" (Madsen & Skotner, 2005), data assimilation (DA) is used to improve the forecasting of variables in the hydrodynamic model's simulation. ...
Thesis
Over the last two decades, the proper management of wastewater has received significant attention because of quantitative and qualitative problems in the drainage networks of major cities. Around the globe, these issues have been addressed by modellers using different numerical models. The overloading of the wastewater treatment plant (WWTP) during wet weather flow and the deterioration of water quality during dry weather flow are the major problems of the study area. Previous research studies on this catchment emphasized improving the model forecast for better management of the urban drainage network and WWTP. The system measurements were assimilated in a MIKE 1D (MIKE URBAN) model of the Damhusåen catchment, Denmark, using the data assimilation tool under MIKE ZERO to obtain a more reliable model forecast. The general filtering algorithm with a predefined constant weighting function is used to assimilate discharge and water level measurements in the drainage system. The model forecast is corrected using observations at measurement locations to distribute model errors to the whole state of the drainage system. The flow is forecasted using a first-order autoregressive error forecast model (AR1), which propagates model errors at measurement locations in the forecast period. The model errors are estimated prior to the time of forecast and used in the forecast period. Furthermore, radar-based rainfall is applied and evaluated against rain gauge rainfall. The calibration was performed using a hydrodynamic and advection-dispersion model. The wet weather and dry weather flows were calibrated and validated for four events, and the concentration of ammonia was calibrated for one event. The discharge is assimilated at two locations, and the volume error is significantly reduced, by up to 22% and 6% at the verification location and the inlet of the WWTP, respectively. The water level measurements are bias corrected using the mean field bias (MFB) adjustment method.
During water level assimilation, the volume error is only reduced by up to 2.3% and 0.9% at FT8182 and the inlet of the WWTP. The updated forecast skill of the model is improved up to 4 and 7 hours lead time at the assimilation and verification locations, as compared to no updating. The average RMSE as a function of forecast lead time is 0.128 m3/s and 0.240 m3/s without updating, and 0.060 m3/s and 0.198 m3/s with updating, at the assimilation and verification locations, respectively. Furthermore, the impact of discharge and water level assimilation on the accumulated mass of ammonia per day at the inlet of the WWTP shows a reduction in mass of up to 209.5 kg and 9.98 kg; overall, the assimilation does not have a significant impact on the concentration and mass of ammonia. The radar data are validated against rain gauge rainfall, and the correlation coefficient exceeds 0.40 for all rain gauges except one station. The radar-based runoff is compared with rain-gauge-based runoff and found to be overestimated; the mean field bias (MFB) adjustment applied to correct the radar-based rainfall provides only a limited correction to the radar-based runoff. Moreover, the evaluation of data assimilation for real-time control shows the minimum volume error compared to the other scenarios. This research can contribute significantly to the optimization and efficient management of the urban drainage network and WWTP.
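The AR1 error forecast used above can be sketched as follows: estimate the autoregressive coefficient from the residuals observed prior to the time of forecast, then let the last error decay through the forecast horizon. The least-squares estimator and the toy residual series are illustrative, not the MIKE tool's internal implementation.

```python
import numpy as np

def fit_ar1(errors):
    """Least-squares estimate of the AR(1) coefficient from the observed
    model errors prior to the time of forecast."""
    e = np.asarray(errors, dtype=float)
    return float(np.dot(e[:-1], e[1:]) / np.dot(e[:-1], e[:-1]))

def forecast_errors(last_error, phi, n_steps):
    """Propagate the last observed error through the forecast horizon;
    the forecast errors are then added to the model forecast at the gauge."""
    return [last_error * phi ** k for k in range(1, n_steps + 1)]

# Usage: residuals from the filtering period, then a 3-step error forecast.
residuals = [0.8, 0.6, 0.45, 0.34]
phi = fit_ar1(residuals)          # about 0.75 for this near-geometric series
corrections = forecast_errors(residuals[-1], phi, n_steps=3)
```

Because the coefficient is re-estimated at every time of forecast, the error model adapts to the prevailing flow regime, which is the adaptivity the procedure relies on.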
... Although the studies discussed in this paragraph did not directly use EObased flood variables for assimilation, they have been included here as they paved the way for future flood data assimilation studies. The first hydraulic data assimilation studies were by Madsen and Skotner (2005) and , who assimilated ground gauge-based river level data at different points along river reaches. Madsen and Skotner (2005) developed a novel hybrid assimilation technique combining a simplified Kalman filter with an error forecast model, using gain functions with predefined shapes that reflect typical error correlation structures along the reach. ...
... The first hydraulic data assimilation studies were by Madsen and Skotner (2005) and , who assimilated ground gauge-based river level data at different points along river reaches. Madsen and Skotner (2005) developed a novel hybrid assimilation technique combining a simplified Kalman filter with an error forecast model, using gain functions with predefined shapes that reflect typical error correlation structures along the reach. used the EnKF to simultaneously update the states and inputs of a 1D-hydrodynamic model. ...
... An autoregressive error model was used to synthetically generate and subsequently predict temporally correlated inflow errors. A variety of temporal sampling intervals for field hydrometric observations was tested, following Madsen & Skotner (2005). At each assimilation time step in the filtering period prior to the forecast, the innovation at all update locations is acquired, leading to a time series up to the time of the last available measurement (Row 1 plots). ...
Chapter
Accurately simulating floodplain inundation is absolutely vital to minimize damage to life and property. The giant strides made in advanced computing, now allow running increasingly complex models at reasonable resolutions over large areas, with the promise of further improvement in the near future. However, the uncertainty contributed by input, boundary, and forcing data, often leads to highly erroneous predictions. As spatially distributed Earth Observations of flood extent and water level become increasingly available, they pave the way for further constraining, and hence, improving the accuracy of hydraulic flood forecasting models. Effectively using these datasets requires an in-depth understanding of the impacts that the resolution, accuracy, location, timing, and frequency of acquisition of observations may have on model-data integration efforts. This chapter presents a review of the current capabilities in the field of flood data assimilation. The challenges and opportunities of using Earth Observation data for operational flood inundation forecasting are also discussed.
... The removal of these uncertainties is not possible, but their effect on model outputs can be reduced by estimating and rectifying the errors (Hutton et al. 2014; Schellart, Shepherd, and Saul 2012). Data assimilation approaches have the capability to account for and correct these uncertainties (Babovic and Furhrman 2002; Madsen and Skotner 2005). By using data assimilation, the states of a system are updated in order to achieve a more reliable forecast. ...
... The steady Kalman gain was calculated based on the average of the time-varying Kalman gains using an off-line data assimilation simulation. Madsen and Skotner (2005) introduced a combined filtering and error forecasting procedure to update states in a real-time flood forecasting model. The states of the system were updated by distribution of the model errors at the measurement locations, which were based on predefined, time-invariant weighting functions. ...
... However, technologies and tools are available to handle these issues. The present study is an initial step, where we apply and demonstrate a data assimilation approach to update the states of an urban drainage model using the hybrid filtering and error correction procedure proposed by Madsen and Skotner (2005). This updating approach was adopted because of its computational efficiency. ...
Article
Accurate model-based forecasts (discharge and water level) are considered significant for efficient planning and management of urban drainage systems. These model-based predictions can be improved by assimilating system measurements into physically based, distributed, 1D hydrodynamic urban drainage models. In the present research, a combined filtering and error forecast method was applied for data assimilation to update the states of the urban drainage model. The developed data assimilation setup, in combination with the 1D hydrodynamic model, was applied to the Damhusåen Catchment, Copenhagen. Discharge assimilation showed significant potential to update the model forecast, and the maximum volume error was reduced by 22% and 6% at two verification locations. The assimilation of water levels had a minor impact on the update of the system states. The updated forecast skill using error forecast models was enhanced by up to 1-2 hours and 6-7 hours lead time at the upstream assimilation and downstream verification locations, respectively.
... Hydrological initial conditions from observations or models for a flood forecasting system represent soil moisture, snow cover, the river and other waterbodies (Li et al., 2009 and Madsen and Skotner, 2005). Initial conditions can be either observed or estimated using models. ...
... The initial conditions in flood forecasting systems include the soil moisture, snow cover, initial state of the rivers and other waterbodies in the catchment (Li et al., 2009; Madsen and Skotner, 2005). Not all initial conditions can be observed or will have data available. ...
Conference Paper
Full-text available
The increased availability and application of probabilistic weather forecasts in flood forecasting means that the uncertainty arising from the precipitation forecast can be assessed. This has led to a wider interest in how uncertainty affects flood forecast systems. In the literature, general techniques and principles are available on how to deal with uncertainty. However, there are no well-accepted guidelines on the implementation of these principles and techniques. There is neither a coherent terminology nor a systematic approach, which means that it is difficult, and perhaps even impossible, to assess the characteristics and limitations of uncertainty quantification methods. Selecting the most appropriate method to match a specific flood forecasting system is therefore a challenge. The main findings of this review are that there are remaining mathematical and theoretical challenges in uncertainty quantification methods, and that this leads to the use of assumptions which in turn can lead to a misrepresentation of the predictive uncertainty.
... Therefore, error correction (termed updating or data assimilation) of the model simulations prior to the time of forecast is important in order to improve the accuracy of forecasting results. Four types of updating methods have been reported (WMO, 1992; Refsgaard, 1997): (i) input updating (Liu et al., 2015), (ii) parameter updating (Yang and Michel, 2000), (iii) state updating (Madsen and Skotner, 2005) and (iv) output variable updating (Nanda et al., 2019). Output updating is the most widely adopted method in flood forecasting studies (Shamseldin and O'Connor, 2001; Goswami et al., 2005; Nanda et al., 2019). However, within a single operational flood forecasting model framework, state updating (Madsen and Skotner, 2005; Moradkhani et al., 2005) is widely applied in hydrodynamic modeling for updating the state of the river using real-time discharge/water level data at different gauging points along the river channel. ...
Conference Paper
Full-text available
Reliable and accurate inflow forecasting to a reservoir with sufficient lead time plays a crucial role in flood management and the development of early warning systems. The Mahanadi River basin in India has suffered frequent devastating flooding events in the recent decade. Especially in India, flood forecasting systems are constrained by the traditional method of gauge-based discharge estimation, a limited network of real-time observed hydro-meteorological information and high biases in the input rainfall products. The present study aims to develop a flood forecasting system for the upper reaches of the Mahanadi River basin using an integrated MIKE 11 NAM-HD model and subsequently attempts to improve it by adopting an error-correcting framework. The calibrated/validated MIKE 11 NAM-HD model is forced with observed hydro-meteorological data for the hindcast period and with two Numerical Weather Prediction (NWP) model forecast products, namely the European Centre for Medium-Range Weather Forecasts (ECMWF) and the India Meteorological Department's Multi-model Ensemble (MME), separately for the forecast horizon (1-5 days). The errors of these two modelling setups are then updated using the MIKE 11 DA (Data Assimilation) framework. The inflow forecasts from the standalone MIKE 11 NAM-HD model are found to be acceptable up to 3 days lead time, with an NSE of 0.81-0.92 for ECMWF and 0.80-0.93 for MME rainfall input forcings. However, despite the incorporation of error updating, only marginal improvement was observed in both cases up to 3 days ahead. An in-depth investigation revealed that the error time series does not display high persistence (a low serial autocorrelation coefficient after 1-day lag time) at a daily temporal resolution. This explains the failure of the MIKE 11 DA model to provide substantial improvement of the lead-time inflow forecasts.
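The persistence diagnostic invoked above, a low serial autocorrelation of the error series at 1-day lag, can be checked directly; if the coefficient is near zero, an AR-type error update has little to propagate into the forecast period. The sample series below are made up for illustration.

```python
def lag1_autocorrelation(series):
    """Serial autocorrelation at lag 1; low values indicate little error
    persistence, so an AR-type error update adds little forecast skill."""
    n = len(series)
    mean = sum(series) / n
    num = sum((series[t] - mean) * (series[t - 1] - mean) for t in range(1, n))
    den = sum((x - mean) ** 2 for x in series)
    return num / den

# Usage: a persistent error series (AR updating worthwhile) versus a
# white-noise-like one (AR updating largely ineffective).
persistent = [1.0, 0.9, 0.8, 0.7, 0.6, 0.5]
noisy = [0.5, -0.4, 0.6, -0.5, 0.4, -0.6]
```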
... All these DA applications in hydrologic-hydraulic modeling show high applicability of the standard DA methods, such as EnKF. However, EnKF and similar DA methods are computationally expensive and often struggle to perform within a reasonable time frame (Madsen & Skotner 2005). This makes them less applicable for solving practical problems related to water resources management. ...
... This makes them less applicable for solving practical problems related to water resources management. Therefore, simplified tailor-made and time-effective DA methods suitable for solving some specific problems are used (Madsen & Skotner 2005;Hansen et al. 2014;Fava et al. 2020). ...
Article
Full-text available
Reliable water resources management requires decision support tools to successfully forecast hydraulic data (stage and flow hydrographs). Even though data-driven methods are nowadays trendy to apply, they still fail to provide reliable forecasts during extreme periods due to a lack of training data. Therefore, model-driven forecasting is still needed. However, the model-driven forecasting approach is affected by numerous uncertainties in initial and boundary conditions. To improve the real-time model's operation, it can be regularly updated using measured data in a data assimilation (DA) procedure. Widely used DA techniques are computationally expensive, which limits their real-time application. Previous research shows that tailor-made, time-efficient DA methods based on control theory can be used instead. This paper presents further insights into control theory-based DA for 1D hydraulic models. The method uses Proportional-Integral-Derivative (PID) controllers to assimilate computed water levels and observed data. This paper describes the two-stage PID controller tuning procedure. Multi-objective optimization by the Nondominated Sorting Genetic Algorithm II (NSGA-II) was used to determine optimal parameters for the PID controllers. The proposed tuning procedure is tested on a hydraulic model used as a decision support tool for the transboundary Iron Gate 1 hydropower system on the Danube River, showing that the average discrepancy between modeled and observed water levels can be less than 0.05 m for more than 97% of the assimilation window.
Highlights:
- Unreliable boundary and initial conditions affect model-driven forecasting.
- Control theory-based data assimilation (DA) is used for 1D open-channel hydraulic model updating.
- PID controllers, used as DA tools, must be optimally tuned.
- A two-stage procedure for tuning PID controllers, using multi-objective optimization, is introduced.
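A minimal sketch of the control-theory idea: a discrete PID controller turns the water-level innovation into a correction applied to the model. The gains, the toy storage model and its 0.5 response factor are assumptions chosen for illustration, not the NSGA-II-tuned values from the paper.

```python
class PIDAssimilator:
    """Discrete PID controller that nudges a model (e.g. via an unmeasured
    lateral inflow) so computed levels track observed levels. Gains here
    are illustrative, not optimally tuned values."""
    def __init__(self, kp, ki, kd, dt=1.0):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def correction(self, observed, computed):
        error = observed - computed
        self.integral += error * self.dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Usage: a toy storage model whose level is pushed toward the observation;
# the 0.5 factor is an assumed crude model response to the correction.
pid = PIDAssimilator(kp=0.8, ki=0.2, kd=0.05)
level, observed = 0.0, 1.0
for _ in range(50):
    level += pid.correction(observed, level) * 0.5
```

In the paper the two PID gains per gauge are what the two-stage NSGA-II procedure tunes; here they are simply hand-picked to give a stable, damped response.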
... Flood models can be updated in DA approaches by ingesting outputs of NWP models or direct rainfall-runoff observations. Stream gauge observations are the most commonly used for updating hydrologic (McLaughlin, 2002; Moradkhani et al., 2005; Liu and Gupta, 2007) and hydraulic (Madsen and Skotner, 2005; Neal et al., 2007) model variables. However, single (or sparse) gauging stations generally fail to provide accurate flow observations during extreme events due to the distributed, complex nature of flood processes (e.g. ...
... Therefore we proposed a simplified methodology that aimed to assimilate observations at stage gauge locations and propagate water depths correction (the difference between the posterior and the forecast state variables) for the surrounding channel and floodplain cells. The along-channel upstream and downstream water level correction is performed, applying a distance-based gain function by adopting an approach similar to Madsen and Skotner (2005): ...
Article
Full-text available
Hydro-meteo hazard early warning systems (EWSs) operate in many regions of the world to mitigate the nuisance effects of floods. EWS performance is strongly affected by the computational burden and complexity of flood prediction tools, especially for ungauged catchments that lack adequate river flow gauging stations. Earth observation (EO) systems may compensate for the lack of fluvial monitoring systems, supporting the setup of affordable EWSs. However, EO data, constrained by spatial and temporal resolution limitations, are not sufficient alone, especially at medium to small scales. Multiple sources of distributed flood observations need to be used to manage the uncertainties of flood models, but this is not a trivial task for EWSs. In this work, a near-real-time flood modelling approach is developed and tested for the simultaneous assimilation of both water level observations and EO-derived flood extents. An integrated physically based flood wave generation and propagation modelling approach is proposed, implementing an ensemble Kalman filter, a parsimonious geomorphic rainfall-runoff algorithm (width function instantaneous unit hydrograph, WFIUH) and a quasi-2D hydraulic algorithm. An approach for assimilating multiple stage gauge observations is proposed to overcome stability issues related to the updating of the quasi-2D hydraulic model states. Furthermore, a methodology to retrieve distributed observed water depths from satellite images to update 2D hydraulic modelling state variables is implemented. The performance of the proposed approach is tested on a flood event in the Tiber River basin in central Italy. The selected case study shows varying performance depending on whether local and distributed observations are assimilated separately or simultaneously.
Results suggest that the injection of multiple data sources into a flexible data assimilation framework constitutes an effective and viable advancement for flood mitigation to tackle EWS uncertainty and numerical stability issues. Specifically, our findings reveal that the simultaneous assimilation of observations from static sensors and satellite images led to an overall improvement of the Nash–Sutcliffe efficiency (NSE) between 5 % and 40 %, the Pearson correlation up to 12 % and bias reduction up to 80 % with respect to the open-loop simulation. Moreover, this combined assimilation allows us to reduce the flood extent uncertainty with respect to the disjoint assimilation simulations for several hours after the satellite image acquisition.
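The skill metrics quoted above (NSE, Pearson correlation, bias) can be computed as below to compare an open-loop run against an assimilated one; the observation and simulation series are synthetic, chosen only to show an assimilated run scoring better on each metric.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect, 0 matches the obs mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def bias(obs, sim):
    """Mean error of the simulation; assimilation aims to shrink |bias|."""
    return float(np.mean(np.asarray(sim, float) - np.asarray(obs, float)))

def pearson(obs, sim):
    """Linear correlation between observed and simulated series."""
    return float(np.corrcoef(obs, sim)[0, 1])

# Synthetic example: a biased open-loop run vs. an assimilated run.
obs = [1.0, 2.0, 4.0, 3.0, 2.0]
open_loop = [1.5, 2.6, 4.8, 3.7, 2.6]
assimilated = [1.1, 2.1, 4.1, 3.1, 2.1]
```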
... Most coupled models rely on distributed hydrological models (Thompson et al., 2004; Knebl et al., 2005) or semi-distributed ones (Lian et al., 2007). The use of lumped models seems limited to flood forecasting applications (Harpin and Clukie, 1982; Madsen and Skotner, 2005; Montanari et al., 2008). Finally, Renouf et al. (2005) stress the scarcity of gaugings at high flows, which can make extrapolation of the rating curve hazardous. An abusive generalization: hydrological processes are unique in time and space (Beven, 2000). ...
... distributed models remain used mainly in a research context. The volume of data required and the implementation time seem hardly compatible with the constraints of a standard hydraulic study. A major drawback: they are lumped! It is therefore not possible to obtain directly a distribution of the lateral inflows along the reach. Madsen and Skotner (2005) circumvented this problem by dividing the intermediate basin into two sub-basins, the first injected at the upstream end and the second at the downstream end. This solution introduces a bias on the interior discharges, which are therefore underestimated. Ajami et al. (2004) likewise limit themselves to point inflows injected at regular intervals along the reach. ...
Thesis
Hydraulic models are commonly used for river management and the prevention of flood damage. These models compute water levels and discharges along a river reach from its geometry and the boundary conditions of the system: discharge at the upstream end of the reach, lateral inflows from the intermediate basin, and water levels at the downstream end. When the reach is long, the lateral inflows become substantial while remaining rarely measured, as they come from secondary tributaries. Estimating these inflows is therefore an essential step in flood simulation, as otherwise the hydraulic variables may be strongly under- or overestimated. The main objective of this thesis is to identify a method of minimal complexity for reconstructing these inflows. Our work relies on a sample of 50 river reaches located in France and the United States, on which the lateral inflows were estimated using a semi-distributed hydrological model coupled with a simplified hydraulic model. An automated method for dividing the intermediate basin into sub-basins was first developed to facilitate building the hydrological model on the 50 river reaches. Sensitivity tests were conducted on the number of sub-basins and on the uniform or distributed nature of the rainfall inputs and of the hydrological model parameters. A configuration with 4 sub-basins, with uniform rainfall and parameters, proved the most efficient over the whole sample. Finally, an alternative method for computing the lateral inflows was proposed, using a transposition of the discharge measured upstream combined with the hydrological model.
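The transposition of the gauged upstream discharge to the ungauged intermediate basin can be illustrated with a simple drainage-area-ratio scaling; the linear exponent and the basin areas below are illustrative assumptions, and the thesis combines such a transposition with a hydrological model rather than using it alone.

```python
def transpose_discharge(upstream_q, upstream_area, lateral_area, exponent=1.0):
    """Estimate ungauged lateral inflow by rescaling the gauged upstream
    hydrograph with the drainage-area ratio (exponent 1 = linear scaling;
    both the exponent and the areas are illustrative assumptions)."""
    ratio = (lateral_area / upstream_area) ** exponent
    return [q * ratio for q in upstream_q]

# Usage: 2000 km2 gauged upstream basin, 500 km2 ungauged intermediate basin.
upstream_q = [120.0, 180.0, 260.0, 210.0]   # m3/s at the upstream gauge
lateral_q = transpose_discharge(upstream_q, 2000.0, 500.0)
# -> [30.0, 45.0, 65.0, 52.5]
```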
... When assimilating data, model parameter specification and state initialization may play a crucial role, especially for short-range forecasting (Houtekamer & Zhang, 2016). Generally, ensemble initialization of model states and parameters for the forecasting period can be generated approximately, for example, using a random selection from uniform distributions for parameters and setting up the initial state values as an arbitrary number (e.g., zero) at the beginning of the forecasting period (Abbaszadeh et al., 2018; Davison et al., 2017; DeChant & Moradkhani, 2014; Moradkhani, Hsu, et al., 2005; Moradkhani et al., 2012; Vrugt et al., 2005; Xie & Zhang, 2013). Alternatively, the ensemble can be generated more carefully, for example, specifying parameters from relevant distributions (Ajami et al., 2007; Beven & Freer, 2001; Chen et al., 2013; Clark et al., 2008; He et al., 2012; Madsen & Skotner, 2005; Mendoza et al., 2012; Zahmatkesh et al., 2015) and using a warm-up technique for states (Ajami et al., 2007; DeChant & Moradkhani, 2014; He et al., 2012; Mendoza et al., 2012; Wang et al., 2018), as summarized in Table 1. ...
Table 1 excerpt: study | model | parameter specification | state initialization | filter | approach
Zahmatkesh et al. (2015) | HyMOD, HBV, SWMM | Bayesian inference | Warm-up | None | A10
Li et al. (2014) | GR4H | Optimization | NA | Dual | A12
DeChant and Moradkhani (2014) | VIC | NA | Warm-up | Dual | A3
Xie and Zhang (2013) | SWAT | Random | Warm-up | Dual | A3
Chen et al. (2013) | HyMOD | Bayesian inference | NA | Single | A11
Moradkhani et al. (2012) | HyMOD | Random | Warm-up | Dual | A3
He et al. (2012) | SNOW17 + SAC-SMA | Bayesian inference | Warm-up | Single | A11
Mendoza et al. (2012) | TopNet | Manual calibration | Warm-up | Single | A11
Clark et al. (2008) | TopNet | Bayesian inference | Warm-up | Single | A11
Ajami et al. (2007) | HyMOD, SWB | Bayesian inference | Warm-up | None | A10
Weerts and El Serafy (2006) | HBV-96 | NA | NA | Single | A2
Vrugt et al. (2005) | HyMOD | Random | Arbitrary | Dual | A3
Madsen and Skotner (2005) | Mike 11 | Optimization | Warm-up | Single | A11
Beven and Freer (2001) | TOPMODEL | Bayesian inference | Warm-up | Dual | A12
Article
Full-text available
A novel modeling framework that simultaneously improves accuracy, predictability, and computational efficiency is presented. It embraces the benefits of three modeling techniques integrated together for the first time: surrogate modeling, parameter inference, and data assimilation. The use of polynomial chaos expansion (PCE) surrogates significantly decreases computational time. Parameter inference allows for model faster convergence, reduced uncertainty, and superior accuracy of simulated results. Ensemble Kalman filters (EnKFs) assimilate errors that occur during forecasting. To examine the applicability and effectiveness of the integrated framework, we developed 18 approaches according to how surrogate models are constructed, what type of parameter distributions are used as model inputs, and whether model parameters are updated during the data assimilation procedure. We conclude that (1) PCE must be built over various forcing and flow conditions and, in contrast to previous studies, it does not need to be rebuilt at each time step; (2) model parameter specification that relies on constrained, posterior information of parameters (so‐called Selected specification) can significantly improve forecasting performance and reduce uncertainty bounds compared to Random specification using prior information of parameters; and (3) no substantial differences in results exist between single and dual EnKFs, but the latter better simulates flood peaks. The use of PCE effectively compensates for the computational load added by the parameter inference and data assimilation (up to ~80 times faster). Therefore, the presented approach contributes to a shift in modeling paradigm arguing that complex, high‐fidelity hydrologic and hydraulic models should be increasingly adopted for real‐time and ensemble flood forecasting.
... It has gained much popularity in oceanography (Haugen and Evensen, 2002;Zhang et al., 2007), hydrology (Ricci et al., 2011;Man et al., 2016), and petroleum engineering (Naevdal et al., 2002;Lee et al., 2016a,b). In recent years, it has been applied to the hydrodynamic field, including state correction and roughness inversion (Madsen and Skotner, 2005;Matgen et al., 2010;Amour et al., 2013;Huang et al., 2013;Garcia-Pintado et al., 2015). Compared with optimization methods, the assimilation method has higher efficiency in parameter inversion. ...
... Ensemble size is one of the most important variables in the EnKF framework. Generally, the number of members J in the ensemble should be about 100 to obtain a representative distribution and the proper covariance matrix required subsequently (Madsen and Skotner, 2005). Owing to the large computational load of the hydrodynamic model, even multi-process parallel calculation is time-consuming. ...
Article
Pumping stations (PSs) are common regulation facilities in a water distribution system, with complex hydraulic characteristics and hydraulic parameters. In this study, a data assimilation-based approach was proposed for the correction of PS parameters, in which a 1D hydrodynamic model with a PS inner boundary was established, and the Ensemble Kalman filter (EnKF) framework was then used to correct the PS parameters. In the EnKF framework, the combination of the PS parameters was considered the state vector, and water levels and discharge the observation vector. The 1D hydrodynamic model was taken as the observation operator that maps the state vector to the observation vector. The method's reliability was demonstrated on the case of the Niantou PS in the middle route of the South-North Water Transfer Project. Verification results show that the proposed method can accurately invert the PS parameters. Compared with the simulation results using PS parameters based on physical model experiments, the PS parameters corrected by the data assimilation method greatly improve the accuracy of the hydrodynamic simulation. This data assimilation-based method can be used for parameter inversion of other complex hydraulic structures.
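The EnKF parameter correction can be sketched by augmenting the state vector with the parameters, so a single analysis step corrects both. The scalar head-loss relation h = k * q^2, the prior spread and the observation noise below are hypothetical stand-ins for the pumping-station model, not the study's actual operator.

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(ensemble, observation, obs_operator, obs_std):
    """One EnKF analysis step with perturbed observations. Each ensemble
    column holds the augmented state [model output; parameter], so the
    parameter row is corrected along with the state."""
    n, m = ensemble.shape
    hx = np.array([obs_operator(ensemble[:, j]) for j in range(m)])
    x_mean = ensemble.mean(axis=1)
    anomalies = ensemble - x_mean[:, None]
    hx_anom = hx - hx.mean()
    cov_xy = anomalies @ hx_anom / (m - 1)          # cross-covariance
    var_y = hx_anom @ hx_anom / (m - 1) + obs_std ** 2
    gain = cov_xy / var_y                           # Kalman gain (n,)
    perturbed = observation + obs_std * rng.standard_normal(m)
    return ensemble + gain[:, None] * (perturbed - hx)[None, :]

# Usage: infer a hypothetical head-loss coefficient k (true value 0.5)
# from one level observation, with level = k * q^2 for known q = 2.
true_k, q = 0.5, 2.0
obs = true_k * q ** 2                               # observed level
ks = 1.0 + 0.3 * rng.standard_normal(200)           # uncertain prior for k
ens = np.vstack([ks * q ** 2, ks])                  # augmented state [level; k]
post = enkf_update(ens, obs, lambda x: x[0], obs_std=0.05)
k_est = post[1].mean()                              # pulled toward 0.5
```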
... In this study, runoff values simulated by the HBV or GR4J models are either considered directly (such results are called "raw" further in the paper), or after applying an error correction procedure (Refsgaard 1997; Madsen and Skotner 2005; Liu et al. 2016). In the second case, after termination of the calibration procedure, the raw results from the HBV and the GR4J models are updated by means of linear regression with exogenous inputs; in this version, the past forecasts from the raw HBV or the raw GR4J predictions are added as exogenous inputs to the linear regression error model: ...
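The error-correction step quoted above (linear regression with the past raw predictions as exogenous inputs) can be sketched as follows. The data are synthetic and the regressor choice is an illustrative assumption, not the exact formulation of the cited paper.

```python
import numpy as np

# Synthetic "raw" model output and observations: the observed flow is
# related to the current and lagged raw predictions plus noise.
rng = np.random.default_rng(1)
raw = 5.0 + np.cumsum(rng.normal(0, 0.3, 200))          # raw model runoff
obs = 1.1 * raw + 0.2 * np.roll(raw, 1) + rng.normal(0, 0.1, 200)

# Design matrix: intercept, current raw forecast, lag-1 raw forecast
X = np.column_stack([np.ones(199), raw[1:], raw[:-1]])
beta, *_ = np.linalg.lstsq(X, obs[1:], rcond=None)

def corrected(raw_now, raw_prev):
    """Error-corrected forecast from the fitted regression."""
    return beta @ np.array([1.0, raw_now, raw_prev])
```

In practice the regression would be refitted as new observations arrive, so the correction adapts to the prevailing conditions.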
... Tests with different numbers of function calls (1000, 3000, 10,000 and 30,000) are performed independently; hence, longer calibration does not necessarily imply better results. Two models are tested (HBV and GR4J), each applied with or without the error correction procedure (Madsen and Skotner 2005), and with three calibration procedures (GLPSO, MDE_pBX and SPS-L-SHADE-EIG) at five catchments (the mountainous Biala Tarnowska, Poland and Cedar River, WA, USA; the hilly Fanno Creek, OR, USA; the lowland Irondequoit Creek, NY, USA and Suprasl, Poland) located in temperate climatic conditions. The research is based on 14–39 years of daily data divided into calibration and validation parts. ...
Article
Full-text available
Various methods are used in the literature for the calibration of conceptual rainfall-runoff models. However, the question of how the number of model runs (or function calls) relates to the quality of the solutions found is rarely asked. In this study two lumped conceptual rainfall-runoff models (HBV and GR4J with an added snow module) are calibrated for five catchments, located in temperate climate zones of the USA and Poland, by means of three modern variants of Evolutionary Computation and Swarm Intelligence optimization algorithms with four different maximum numbers of function calls set to 1000, 3000, 10,000 and 30,000. At the calibration stage, when more than 10,000 function calls are used, only marginal improvement in model performance has been found, irrespective of the catchment or calibration algorithm. For validation data, the relation between the number of function calls and model performance is even weaker; in some cases longer calibration led to poorer modelling performance. It is also shown that an opinion on model performance based on different popular hydrological criteria, like the Nash-Sutcliffe coefficient or the Persistence Index, may be misleading. This is because very similar, largely positive values of the Nash-Sutcliffe coefficient obtained on different catchments may be accompanied by contradictory values of the Persistence Index.
... Many data assimilation studies have shown how flood inundations in river systems and flood plains are forecasted accurately when the system states, and in some cases also the boundary conditions, are updated (Annis et al., 2022;Hostache et al., 2018;Lai et al., 2014;Neal et al., 2007;Pensoneault et al., 2023;Pujol et al., 2022;Revilla-Romero et al., 2016;Van Wesemael et al., 2019). Data assimilation can also be used for estimating or tuning model parameters (Annan et al., 2005;Hendricks Franssen & Kinzelbach, 2008;Madsen & Skotner, 2005). Flow resistance parameters are, however, often updated together with systems states making it hard to distinguish the respective contributions of the two updates. ...
Article
Full-text available
Emergency response to flood plain inundations requires real‐time forecasts of flow depth, velocity, and arrival time. Detailed and rapid flood inundation forecasts can be obtained from numerical solution of 2D unsteady flow equations based on high‐resolution topographic data and geomorphologically informed unstructured meshes. However, flow resistance parameters representing the effects of land surface topography unresolved by digital terrain model data remain uncertain. In the present study, flow resistance parameters representing the effects of roughness, vegetation, and buildings are determined hydraulically in real‐time using flow depth observations. A detailed numerical reproduction of a real flood has been largely corroborated by observations and subsequently used as a surrogate of the ground truth target. In synthetic numerical experiments, flow depth observations are obtained from a network of in‐situ flow depth sensors assigned to hydraulically relevant locations in the flood plain. Starting from a generic resistance parameter set, the capability of a tandem 2D surface flow model and Bayesian optimization technique to achieve convergence to the target resistance parameter set is tested. Convergence to the target resistance parameter set was obtained with 50 or fewer tandem flow + optimization iterations for each forecasting cycle in which the difference between simulated and observed flow depths is minimized. The flood arrival time errors across a 52 km² flood plain inundation area were reduced by 3.13 hr with respect to results obtained without optimization from a fixed range of flow resistance parameters. Performance metrics like critical success index and probability of detection reach values above 90% across the flood plain.
... Using historical data, the models can be trained to understand the relationships between various variables and how they affect river flow. Once trained, the model can be used to forecast future river flow based on current and expected weather conditions [148]. These projections are employed to assess flood risk and inform choices about the deployment of flood protection and evacuation of vulnerable areas. ...
Article
Full-text available
Floods are a devastating natural calamity that may seriously harm both infrastructure and people. Accurate flood forecasts and control are essential to lessen these effects and safeguard populations. By utilizing its capacity to handle massive amounts of data and provide accurate forecasts, deep learning has emerged as a potent tool for improving flood prediction and control. The current state of deep learning applications in flood forecasting and management is thoroughly reviewed in this work. The review discusses a variety of subjects, such as the data sources utilized, the deep learning models used, and the assessment measures adopted to judge their efficacy. It assesses current approaches critically and points out their advantages and disadvantages. The article also examines challenges with data accessibility, the interpretability of deep learning models, and ethical considerations in flood prediction. The report also describes potential directions for deep-learning research to enhance flood predictions and control. Incorporating uncertainty estimates into forecasts, integrating many data sources, developing hybrid models that mix deep learning with other methodologies, and enhancing the interpretability of deep learning models are a few of these. These research goals can help deep learning models become more precise and effective, which will result in better flood control plans and forecasts. Overall, this review is a useful resource for academics and professionals working on the topic of flood forecasting and management. By reviewing the current state of the art, emphasizing difficulties, and outlining potential areas for future study, it lays a solid basis. Communities may better prepare for and lessen the destructive effects of floods by implementing cutting-edge deep learning algorithms, thereby protecting people and infrastructure.
... Real-time updating techniques have been widely used to improve forecast accuracy by integrating the most recent measurements or observations prior to the time that the forecast is issued (Madsen and Skotner 2005;Refsgaard 1997;Zhang et al. 2018). Incorporating observed streamflow data into updating schemes has proven to give the best results for operational flood forecasting (Bergeron et al. 2016;Komma et al. 2008;Lee et al. 2011;Prakash and Mishra 2022;Seo et al. 2009;Thirel et al. 2010). ...
Article
Flood warnings provide information about the timing and magnitude of impending floods, which can help mitigate the adverse impacts of flooding. Flood forecasts are highly influenced by uncertainty associated with rainfall forecasts as well as initial catchment wetness. Event-based models are simple and parsimonious and are widely favored by practitioners for flood estimation. However, these models require loss parameters to be manually specified for each simulated event, and this represents an additional source of uncertainty that needs to be considered along with errors in observations and rainfall forecasts. Little attention has been given to the coupling of updating techniques with event-based models to reduce the uncertainty associated with catchment wetness. To this end, we devised a sequential recalibration scheme to characterize uncertainty in ensemble forecasts derived using an event-based flood model. This scheme uses information on both observation and model errors to filter and update catchment loss estimates to improve the accuracy of the forecasts. Analysis of flood forecasts for 22 events showed that although initially, there was low skill in forecasts derived solely from external estimates of catchment wetness, the reliability and accuracy of the forecasts improved rapidly once the flood event commenced and the flood model was coupled with an updating scheme. Compared with forecasts made without any updating scheme, the conditioning and recalibration steps progressively improved the accuracy of the forecasts as measured by Nash-Sutcliffe efficiency from −0.14 to 0.88, bias was reduced by 78%, and root-mean square error reduced by 67%. The use of such schemes thus reinforces the advantages of using parsimonious models that have long been favored by practitioners for design and other purposes.
... Given these conditions, studies indicate an increase in the accuracy of the forecasts updated by KF (Mu and Zhang, 2007; Xie and Zhang, 2010). Madsen and Skotner (2005) presented a simultaneous updating model for river flood forecasting, combining a KF model with an error forecasting model. In their model, the error of the system state variables was first distributed at the locations of the measurement stations by the KF; the resulting error values at the measurement stations were then propagated into the forecast period using the error forecasting model. ...
... Early availability of flood warning information is a necessary key to reducing losses. However, the success of such forecasts depends significantly on the accuracy of river flow model prediction, which is related to initial state errors, forcing errors, and model structural errors (Madsen & Skotner, 2005). ...
Article
Full-text available
Operational forecast models require robust, computationally efficient, and reliable algorithms. We desire accurate forecasts within the limits of the uncertainties in channel geometry and roughness because the output from these algorithms leads to flood warnings and a variety of water management decisions. The current operational Water Model uses the Muskingum‐Cunge method, which does not account for key hydraulic conditions such as flow hysteresis and backwater effects, limiting its ability in situations with pronounced backwater effects. This situation most commonly occurs in low‐gradient rivers, near confluences and channel constrictions, coastal regions where the combined actions of tides, storm surges, and wind can cause adverse flow. These situations necessitate a more rigorous flow routing approach such as dynamic or diffusive wave approximation to simulate flow hydraulics accurately. Avoiding the dynamic wave routing due to its extreme computational cost, this work presents two diffusive wave approaches to simulate flow routing in a complex river network. This study reports a comparison of two different diffusive wave models that both use a finite difference solution solved using an implicit Crank–Nicolson (CN) scheme with second‐order accuracy in both time and space. The first model applies the CN scheme over three spatial nodes and is referred to as Crank–Nicolson over Space (CNS). The second model uses the CN scheme over three temporal nodes and is referred to as Crank–Nicolson over Time (CNT). Both models can properly account for complex cross‐section geometry and variable computational points spacing along the channel length. The models were tested in different watersheds representing a mixture of steep and flat topographies. Comparing model outputs against observations of discharges and water levels indicated that the models accurately predict the peak discharge, peak water level, and flooding duration. 
Both models are accurate and computationally stable over a broad range of hydraulic regimes. The CNS model depends on the Courant criterion, making it less computationally efficient where short channel segments are present. The CNT model does not suffer from that constraint and is thus highly computationally efficient, which could make it more useful for operational forecast models.
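As an illustration of the Crank–Nicolson idea discussed in this abstract, the sketch below routes a flow pulse with a CN discretisation of the linearised diffusive wave equation with constant celerity and diffusivity. The operational models described above handle variable cross-section geometry and computational point spacing; the constant coefficients and parameter values here are simplifying assumptions.

```python
import numpy as np

# Crank-Nicolson step for the linearised diffusive wave equation
#   dQ/dt + c dQ/dx = D d2Q/dx2
# with constant celerity c and diffusivity D (illustrative only).
def cn_step(Q, c, D, dx, dt):
    n = len(Q)
    r = D * dt / dx**2           # diffusion number
    s = c * dt / (2 * dx)        # Courant-like advection number
    A = np.zeros((n, n)); B = np.zeros((n, n))
    for i in range(1, n - 1):
        # average of implicit (A) and explicit (B) operators
        A[i, i-1] = -r/2 - s/2;  A[i, i] = 1 + r;  A[i, i+1] = -r/2 + s/2
        B[i, i-1] =  r/2 + s/2;  B[i, i] = 1 - r;  B[i, i+1] =  r/2 - s/2
    A[0, 0] = A[-1, -1] = 1.0    # fixed boundary values
    B[0, 0] = B[-1, -1] = 1.0
    return np.linalg.solve(A, B @ Q)

# Route a small inflow pulse down a 10 km reach (1 h in 60 s steps)
x = np.linspace(0, 10_000, 101)
Q = 100.0 + 50.0 * np.exp(-((x - 2_000) / 500.0)**2)   # initial hydrograph
for _ in range(60):
    Q = cn_step(Q, c=1.5, D=500.0, dx=100.0, dt=60.0)
# the pulse advects downstream (~5.4 km) while its peak attenuates
```

A production scheme would assemble and solve the tridiagonal system directly rather than a dense matrix, but the CN weighting of the two operators is the same.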
... A well-established method for reducing such uncertainties is to periodically adjust these models by assimilating various available observations (Asch et al., 2016;Moradkhani et al., 2019). As a result, flood simulation and forecast capability have greatly improved thanks to the advances in data assimilation (DA) (Leedal et al., 2010;Madsen & Skotner, 2005;Neal & Jeffrey, 2007;Neal et al., 2009). Continuous time-series of gauged water levels and/or discharges recorded at sparse locations have been used for model calibration and validation. ...
Article
Full-text available
Flooding is one of the most devastating natural hazards to which our society worldwide must adapt, especially as its severity and occurrence tend to increase with climate changes. This research work focuses on the assimilation of two‐dimensional (2D) flood observations derived from remote‐sensing images acquired during overflowing events. To do so, the resulting binary wet/dry maps are expressed in terms of wet surface ratios (WSR) over a number of floodplain subdomains. This ratio is assimilated jointly with in‐situ water‐level gauge observations to improve the flow dynamics within the floodplain. An Ensemble Kalman Filter (EnKF) with a dual state‐parameter analysis approach is implemented on top of a TELEMAC‐2D hydrodynamic model. The EnKF control vector is composed of spatially‐distributed friction coefficients and a corrective parameter of the inflow discharge. It is extended with the hydraulic states within the floodplain subdomains. This data assimilation strategy was validated and evaluated over a reach of the Garonne river. The observation operator associated with the WSR observations, as well as the dual state‐parameter sequential correction, was first validated in the context of Observing System Simulation Experiments. It was then applied to two real flood events that occurred in 2019 and 2021. The merits of assimilating synthetic aperture radar‐derived WSR observations, in complement to the in‐situ water‐level observations, are shown in the parameter and observation spaces with assessment metrics computed over the entire flood events. It is also shown that the hydraulic state correction within the dual state‐parameter analysis approach significantly improves the flood dynamics, especially during the flood recess.
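The wet-surface-ratio (WSR) observation operator described above reduces a binary wet/dry map to one ratio per floodplain subdomain. A minimal sketch, assuming a hypothetical depth threshold and subdomain labelling:

```python
import numpy as np

def wet_surface_ratios(depth, labels, n_subdomains, wet_threshold=0.05):
    """Fraction of wet cells in each floodplain subdomain.

    depth  : 2D array of simulated water depths (m)
    labels : 2D int array assigning each cell to a subdomain 0..n-1
    """
    wet = depth > wet_threshold          # binary wet/dry map
    return np.array([wet[labels == k].mean() for k in range(n_subdomains)])

# Tiny example: subdomain 0 is half wet, subdomain 1 fully wet
depth = np.array([[0.0, 0.2],
                  [0.5, 0.3]])
labels = np.array([[0, 0],
                   [1, 1]])
wsr = wet_surface_ratios(depth, labels, 2)
```

In the assimilation described above, ratios of this kind (derived from remote-sensing flood maps) are compared with their model counterparts inside the EnKF update.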
... Continuous time-series of gauged water levels and/or discharge recorded at discrete locations have traditionally been used for model calibration and validation as well as with the DA algorithm for real-time constraint of hydraulic flood forecasting models (e.g., [52]–[54]). In the present work, gauged water levels are assimilated with an EnKF algorithm, for the T2D Garonne model, to sequentially correct the friction and inflow discharge. ...
Article
Flood simulation and forecast capability have been greatly improved, thanks to the advances in data assimilation (DA). Such an approach combines in situ gauge measurements with numerical hydrodynamic models to correct the hydraulic states and reduce the uncertainties in model parameters. However, these methods depend strongly on the availability and quality of observations, thus necessitating other data sources to improve the flood simulation and forecast performances. Using Sentinel-1 images, a flood extent mapping method was carried out by applying a Random Forest algorithm trained on past flood events using manually delineated flood maps. The study area concerns a 50-km reach of the Garonne Marmandaise catchment. Two recent flood events are simulated in analysis and forecast modes, with a +24-h lead time. This study demonstrates the merits of using synthetic aperture radar (SAR)-derived flood extent maps to validate and improve the forecast results based on hydrodynamic numerical models with Telemac2D-ensemble Kalman filter (EnKF). Quantitative 1-D and 2-D metrics were computed to assess water-level time-series and flood extents between the simulations and observations. It was shown that the free run experiment without DA underestimates flooding. On the other hand, the validation of DA results with respect to independent SAR-derived flood extent allows to diagnose a model–observation bias that leads to over-flooding. Once this bias is taken into account, DA provides a sequential correction of area-based friction coefficients and inflow discharge, yielding a better flood extent representation. This study paves the way toward a reliable solution for flood forecasting over poorly gauged catchments, thanks to the available remote sensing datasets.
... Continuous time-series of gauged water levels and/or discharge recorded at discrete locations have traditionally been used for model calibration and validation as well as with the DA algorithm for real-time constraint of hydraulic flood forecasting models (e.g., [47]–[49]). In the present work, gauged water levels are assimilated with an Ensemble Kalman Filter algorithm in the previously presented T2D Garonne model to sequentially correct the friction and inflow discharge. ...
Preprint
Full-text available
Flood simulation and forecast capability have been greatly improved thanks to advances in data assimilation. Such an approach combines in-situ gauge measurements with numerical hydrodynamic models to correct the hydraulic states and reduce the uncertainties in the model parameters. However, these methods depend strongly on the availability and quality of observations, thus necessitating other data sources to improve the flood simulation and forecast performances. Using Sentinel-1 images, a flood extent mapping method was carried out by applying a Random Forest algorithm trained on past flood events using manually delineated flood maps. The study area concerns a 50-km reach of the Garonne Marmandaise catchment. Two recent flood events are simulated in analysis and forecast modes, with a +24h lead time. This study demonstrates the merits of using SAR-derived flood extent maps to validate and improve the forecast results based on hydrodynamic numerical models with Telemac2D-EnKF. Quantitative 1D and 2D metrics were computed to assess water level time-series and flood extents between the simulations and observations. It was shown that the free run experiment without DA under-estimates flooding. On the other hand, the validation of DA results with respect to independent SAR-derived flood extent allows to diagnose a model-observation bias that leads to over-flooding. Once this bias is taken into account, DA provides a sequential correction of area-based friction coefficients and inflow discharge, yielding a better flood extent representation. This study paves the way towards a reliable solution for flood forecasting over poorly gauged catchments, thanks to available remote sensing datasets.
... more than 1 meter), numerical surging may occur. To avoid surging, a gain function is applied, adopting an approach similar to Madsen and Skotner (2005): ...
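The gain-function idea referenced in this snippet (in the spirit of Madsen and Skotner, 2005) can be sketched as a predefined, time-invariant weight that distributes the innovation at a gauge to neighbouring computational points. The triangular shape, influence length and amplitude below are illustrative assumptions, not the authors' actual gain.

```python
import numpy as np

def apply_gain_update(state, x, x_gauge, innovation, influence_len,
                      amplitude=1.0):
    """Distribute the innovation (observed minus modelled level) at a
    gauge to nearby points with a distance-decaying weight."""
    w = np.clip(1.0 - np.abs(x - x_gauge) / influence_len, 0.0, 1.0)
    return state + amplitude * w * innovation

# Correct modelled levels along a 20 km reach using one gauge at 10 km
x = np.linspace(0, 20_000, 201)          # chainage (m)
levels = np.full_like(x, 2.0)            # modelled water levels (m)
updated = apply_gain_update(levels, x, x_gauge=10_000,
                            innovation=0.4, influence_len=3_000)
# full correction at the gauge, fading to zero beyond 3 km
```

Because the weight tapers smoothly to zero, the update cannot introduce the sharp discontinuities in the state that cause numerical surging.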
Preprint
Full-text available
Hydro-meteorological hazard Early Warning Systems (EWSs) operate in many regions of the world to mitigate the effects of floods. EWS performance is strongly affected by the computational burden and complexity of flood prediction tools, especially for ungauged catchments that lack adequate river flow gauging stations. Earth Observation (EO) systems may compensate for the lack of fluvial monitoring and support the setting up of affordable EWSs. However, EO data, constrained by spatial and temporal resolution limitations, are not sufficient alone, especially at medium-to-small scales. Multiple sources of distributed flood observations need to be used to manage the uncertainties of flood models, but this is not a trivial task for EWSs. In this work, a near real-time flood modelling approach is developed and tested for the simultaneous assimilation of both water level observations and EO-derived flood extents. An integrated, physically-based flood wave generation and propagation modelling approach is proposed that implements an Ensemble Kalman Filter, a parsimonious geomorphic rainfall-runoff algorithm (WFIUH) and a quasi-2D hydraulic algorithm. A data assimilation scheme is tested that retrieves distributed observed water depths from satellite images to update the 2D hydraulic model state variables. The performance of the proposed approach is tested on a flood event in the Tiber river basin in central Italy. The selected case study shows varying performance depending on whether local and distributed observations are assimilated separately or simultaneously. Results suggest that injecting multiple data sources into a flexible data assimilation framework constitutes an effective and viable advancement for flood mitigation, tackling the data scarcity, uncertainty and numerical stability issues of EWSs.
... Bao et al., 2011; Li et al., 2015), and comprehensive updating (updating multiple variables) (e.g. Madsen and Skotner, 2005; Chen et al., 2015). To improve real-time flood forecasting, we not only need more observations, but also effective information utilization technologies. ...
Article
The differential response of hydrologic system (HSDR) can represent the relation between errors of model outputs and influencing variables, and has been proposed for error correction in flood forecasting. In this study, the HSDR method with stepwise approximation is established to update the areal mean rainfall over the whole basin (HSDR_AMP) and the spatially distributed rainfall in each sub-basin (HSDR_SDP), respectively. The regularized least squares algorithm is employed to calculate the solutions. The performance of the proposed method in improving the predictions of the Xinanjiang (XAJ) model is demonstrated in a synthetic case with different rainfall errors and in real basins with different flow characteristics. The results show that 1) the HSDR can improve flood predictions by updating rainfall; 2) the improved HSDR with stepwise approximation outperforms that without iteration; 3) the HSDR_SDP converges faster than the HSDR_AMP; 4) the HSDR_SDP with enough observations used for updating can perform better due to the consideration of rainfall spatial distribution; 5) with increasing lead time the HSDR performance generally deteriorates more slowly than that of the autoregressive (AR) technique. To some extent, the HSDR error updating can utilize the physical mechanisms of the hydrological process represented by hydrological models. With a clear concept, simple structure and stable performance, the enhanced method is worthy of further application in real-time flood forecasting and can ultimately extend the understanding of error feedback in hydrologic processes.
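The regularized least-squares step used by the HSDR method can be sketched as a Tikhonov-regularized inversion of a response (sensitivity) matrix. The matrix, the "true" rainfall error and the regularization weight below are synthetic; this illustrates only the algebra, not the paper's actual operator.

```python
import numpy as np

def hsdr_rainfall_update(S, e, alpha):
    """Regularized least-squares rainfall correction.

    Solves min ||S dp - e||^2 + alpha ||dp||^2, i.e.
    dp = (S^T S + alpha I)^(-1) S^T e.

    S     : sensitivity of flow observations to sub-basin rainfall
    e     : observed model output errors
    alpha : Tikhonov regularization weight
    """
    p = S.shape[1]
    return np.linalg.solve(S.T @ S + alpha * np.eye(p), S.T @ e)

# Sensitivities of 4 flow observations to rainfall in 2 sub-basins
S = np.array([[0.8, 0.1],
              [0.6, 0.3],
              [0.2, 0.7],
              [0.1, 0.9]])
true_dp = np.array([5.0, -2.0])        # "true" rainfall error (mm)
e = S @ true_dp                         # resulting output errors
dp = hsdr_rainfall_update(S, e, alpha=1e-3)
# dp recovers true_dp up to the small regularization bias
```

The stepwise approximation described in the abstract would repeat this solve, re-running the hydrological model between iterations to refresh the errors.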
... Jean-Baptiste et al. [2011] use a state vector augmented with the unknown forcings and estimate the ungauged lateral forcings on the Rhône, in south-eastern France, jointly with the system state using a KF and a particle filter. Hartnack et al. [2006] use an EnKF to estimate the water level and discharge of a 1D-2D flood model, thereby improving the hydraulic state and the forcing. Ricci et al. [2011], like Madsen and Skotner [2005] and Shiiba et al. [2000], perform a dual (two-step) estimation of the upstream forcings of the Adour, a coastal river in south-western France. The authors use an EKF that accounts for the non-linearities of the model together with an invariant covariance matrix (IKF), which reduces the computational cost of the algorithm. ...
Thesis
Full-text available
This thesis is part of the development of data assimilation techniques for two-dimensional free-surface hydraulic computation codes and, more specifically, of the assessment of the possibilities for improving high-water forecasting on the Gironde estuary through the development and use of these methods. The main characteristics of this work are the unsteady nature of the hydrodynamics, linked both to tidal propagation and to the hydrometeorological event considered, and the correlation between the observations from the various available measuring stations. The work carried out during the first part of the thesis concerned a methodology for treating uncertainties by global sensitivity analysis with variance decomposition for a hydrodynamic code driven by functional variables. This methodology enabled the classification of the sources of uncertainty from a sample of realisations of the Telemac2D high-water forecasting model of the Gironde estuary, with the aim of ranking the influence of the model's forcings and parameters. Secondly, building on the results of the previous study, a methodology was proposed to develop a simulation chain integrating the studied computation code and an ensemble data assimilation method based on the Ensemble Kalman Filter, with the objective of reducing the model uncertainties linked to its functional forcings and its parametrisation through the use of observations. This methodology was first applied to the Telemac2D model of the Gironde estuary in the form of a twin-experiment protocol whose objective is, on the one hand, to validate it and, on the other hand, to determine in an optimal manner the control vector, the measurement network and the parameters of the proposed methodology.
The benefits of the proposed methodology are then evaluated using the observations available for several real events.
... During a flood event, these in situ sources provide only a one-dimensional, point-based set of surface water data (Juracek and Fitzpatrick, 2009), without addressing the additional challenge of extrapolating downstream volume after the water passes the monitor and predicting what the water will do if it rises and extends beyond its normal channel. The limited availability of in situ streamflow measurement resources hampers flood detection in river areas and restricts the ability to validate real-time flood forecasting models (Madsen and Skotner, 2005). Furthermore, these streamgages are physically vulnerable and can stop transmitting critical flood water data during storm events or at other critical times. This risk was illustrated in the August 2016 flood event in Baton Rouge, Louisiana. ...
Thesis
Full-text available
Floods rank as the deadliest and most frequently occurring natural hazard worldwide, and in 2013 floods in the United States ranked second only to wind storms in accounting for loss of life and damage to property. While flood disasters remain difficult to accurately predict, more precise forecasts and better understanding of the frequency, magnitude and timing of floods can help reduce the loss of life and costs associated with the impact of flood events. There is a common perception that 1) local-to-national-level decision makers do not have accurate, reliable and actionable data and knowledge they need in order to make informed flood-related decisions, and 2) because of science–policy disconnects, critical flood and scientific analyses and insights are failing to influence policymakers in national water resource and flood-related decisions that have significant local impact. This dissertation explores these perceived information gaps and disconnects, and seeks to answer the question of whether flood data can be accurately generated, transformed into useful actionable knowledge for local flood event decision makers, and then effectively communicated to influence policy. Utilizing an interdisciplinary mixed-methods research design approach, this thesis develops a methodological framework and interpretative lens for each of three distinct stages of flood-related information interaction: 1) data generation—using machine learning to estimate streamflow flood data for forecasting and response; 2) knowledge development and sharing—creating a geoanalytic visualization decision support system for flood events; and 3) knowledge actualization—using heuristic toolsets for translating scientific knowledge into policy action. Each stage is elaborated on in three distinct research papers, incorporated as chapters in this dissertation, that focus on developing practical data and methodologies that are useful to scientists, local flood event decision makers, and policymakers.
Data and analytical results of this research indicate that, if certain conditions are met, it is possible to provide local decision makers and policy makers with the useful actionable knowledge they need to make timely and informed decisions.
... State estimators update the initial states of the river model to match observations and improve the prediction accuracy of the river model. This improvement will, however, wash out with increasing lead time (Madsen & Skotner, 2005). Therefore, it is recommended to use these state estimators in combination with prediction error methods (Vermuyten et al., 2018b). ...
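The combination described in the snippet above — a state update at the time of forecast plus an error forecast at the gauge — can be sketched with a simple AR(1) error model fitted to past residuals. This is a minimal illustration, not the authors' full formulation; `fit_ar1`, the synthetic residual series and the noise level are assumptions made for the example:

```python
import numpy as np

def fit_ar1(errors):
    """Least-squares estimate of the AR(1) coefficient phi from a series of
    observed model errors e_t (errors = observation - model at the gauge)."""
    e = np.asarray(errors, dtype=float)
    return np.dot(e[:-1], e[1:]) / np.dot(e[:-1], e[:-1])

def forecast_errors(last_error, phi, lead_steps):
    """Propagate the last observed error into the forecast period:
    e_{t+k} = phi**k * e_t, decaying toward zero for |phi| < 1."""
    return [last_error * phi**k for k in range(1, lead_steps + 1)]

# synthetic residual series generated with a known coefficient phi = 0.8
rng = np.random.default_rng(0)
e = [1.0]
for _ in range(500):
    e.append(0.8 * e[-1] + 0.05 * rng.standard_normal())

phi = fit_ar1(e)
corr = forecast_errors(e[-1], phi, 6)  # added to the raw model forecast
```

The corrections decay toward zero with lead time, which is exactly the "wash-out" behaviour the snippet describes: the benefit of the update fades as the forecast horizon grows.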
Article
Full-text available
The vegetation along a river reach varies throughout the year. Seasonal vegetation affects the hydrodynamic behaviour of the river system; accordingly, flood studies should take this temporal variation into account. This also applies to real-time flood forecasting and control. This paper studies the impact of seasonal vegetation on real-time flood control performance, based on a model predictive control (MPC) scheme. The scheme makes use of a conceptual river model to limit the computational times, as well as a reduced genetic algorithm (RGA) for the optimization of the flood control gates. The impact of seasonal vegetation on the conceptual model accuracy was analysed and a flexible data assimilation approach developed to adjust the model predictions to different vegetation scenarios. This method can successfully improve the efficiency of a control strategy by predicting and reducing the impact of seasonal vegetation changes on river conditions.
... Traditional hydrological forecasting methods cannot fully compensate for the errors that arise in practice. Real-time updating addresses factors that were not, or could not be, represented in the original model and that introduce errors into the flood estimate (Madsen, 2005). Such errors may stem from the structure, parameters, state variables, or input values of the model. ...
Article
Full-text available
In flood forecasting, general flood forecasting models or empirical forecasts reflect the average optimal value or relationship curve derived from historical data. In operational forecasting, however, the forecast value often deviates from the actual situation. This paper takes the Muskingum model as an example and uses the Kalman filter algorithm to correct the forecast results. The algorithm structure and principles are described in detail, and a numerical simulation test was set up to verify the efficiency of the Kalman filter algorithm. The corrected results were compared against the uncorrected forecasts. The results indicated that the efficiency of the updating system using the Kalman filter algorithm was improved. Conclusively, the proposed method could be widely applied in real-time flood forecast updating.
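The correction step at the heart of such a Kalman filter update can be sketched in one dimension: blend the model forecast with the observation in proportion to their uncertainties. This is an illustrative scalar example, not the paper's full Muskingum implementation; all numbers are assumptions:

```python
def kalman_update(x_pred, P_pred, z, R):
    """Scalar Kalman update: blend a model forecast x_pred (variance P_pred)
    with an observation z (variance R)."""
    K = P_pred / (P_pred + R)          # Kalman gain, 0 <= K <= 1
    x_upd = x_pred + K * (z - x_pred)  # corrected state
    P_upd = (1.0 - K) * P_pred         # reduced state uncertainty
    return x_upd, P_upd, K

# forecast flow 100 m3/s (variance 4), gauge reads 110 m3/s (variance 1)
x, P, K = kalman_update(x_pred=100.0, P_pred=4.0, z=110.0, R=1.0)
# K = 4/5 = 0.8, so x = 100 + 0.8*10 = 108.0 and P = 0.8
```

The gain automatically weights the observation more heavily when the model is uncertain (large `P_pred`) and less when the gauge is noisy (large `R`).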
... One procedure is to import updated model inputs or parameters into the models, mostly by using the Kalman filter and related methods (e.g. Bidwell and Griffiths 1994; Hsu et al. 2003; Hartnack et al. 2005; Madsen and Skotner 2005; Moradkhani et al. 2005; Komma et al. 2008; Bloschl et al. 2008; Wang and Bai 2008; Liu et al. 2010; Waller et al. 2018a, b). For example, Hartnack et al. (2005) employed the ensemble Kalman filter (EnKF) method to update the model state and boundary conditions used in the 1D-2D MIKE flood modelling system in accordance with the observed discharge and water-level measurements. ...
Article
Full-text available
This study develops a real-time error correction model named RTEC_2DIS for two-dimensional (2D) flood-inundation simulations. Inundation depth and flooding area provided by 2D numerical hydraulic modeling (the SOBEK model) are corrected in accordance with real-time observations at specific road-based water-level gauges. The proposed RTEC-2DIS method can estimate the forecast error of inundation depths for various lead times at both gauged and ungauged grids within the watershed/basin. By doing so, the real-time correction of the resulting inundation area, composed of the corrected water-depth forecasts at all concerned grids, can be achieved by adding the water-depth forecasts to the corresponding forecast errors. In this study, the model is demonstrated in the Tainan district in Southern Taiwan, considering design rainstorm events of specific return periods with a central-peak storm pattern and a real rainstorm event from Typhoon Megi in 2016. The results from the numerical demonstration indicate that the RTEC-2DIS method can effectively reduce the uncertainty in estimating the inundation depth at gauged grids, enhancing its correction performance with a percentage of error reduction of 42% in the case of the real rainstorm event. Furthermore, regarding the design events, through the proposed RTEC-2DIS method, the accuracy of the simulated inundation area can be improved by 55%. In summary, the proposed RTEC-2DIS method can not only correct forecasted inundation depths at gauged grids, but also improve the accuracy and reliability of the resulting inundation area in accordance with real-time observations recorded at road-based water-level gauges.
... Although procedures were proposed to overcome this difficulty (Chen et al., 2015; Prakash et al., 2014; Shamseldin and O'Connor, 2001), they did not yield any significant improvement in forecast accuracy, especially at longer lead times. Data assimilation methods (Fahimi et al., 2017; Madsen and Skotner, 2005; WMO, 1992) also brought only marginal improvement for longer lead-time forecasts. Hybrid models that combine the strengths of two (or more) modelling approaches were also proposed (Abrahart et al., 2012; Khashei and Bijari, 2012; Maier et al., 2010; Srinivasulu and Jain, 2009), but no improved performance at longer lead times was achieved through this approach either. ...
Article
Artificial neural networks have been acknowledged as a promising tool for accurately forecasting streamflow. However, several constraints limit their application in operational hydrology, the major one being the non-availability (in practice) of observed streamflow to be used as input to the model. In addition, the progressive reduction in forecast accuracy with increasing forecast lead time makes the ANN model less amenable to operational flood forecasting. This study proposes a hybrid model (M4) that combines the strengths of a physically-based distributed hydrological model and a data-driven model (ANN), which addresses the above concerns. The proposed model was tested using two different case studies (ST-I and ST-II), with respective time steps of fifteen minutes and one day. While the first case study demonstrated the application of the model in a real-time streamflow forecasting scenario, the second represented a continuous streamflow simulation. The performance of the hybrid model was evaluated, and the results showed that the proposed hybrid model (M4t+4) performed reasonably well at higher lead times (NSE = 0.91 for ST-I and 0.77 for ST-II at time step 't+4'). The model M4t+4 was further tested for its ability to work with forecasted rainfall (synthetically generated), using the data of ST-I, and the model performed well with an NSE of 0.95. Though the performance of the models was evaluated only up to 't+4', it was illustrated that the proposed hybrid model could be used to generate a forecasted streamflow hydrograph corresponding to a full flood event well in advance. This characteristic of the proposed model suggests its utility in operational flood forecasting.
... This study uses the ensemble Kalman filter (EnKF) to update the initial states of the prediction model (Evensen 1994). Because the effect of such a state estimator reduces with increased lead time, a prediction error method was also applied to reduce the prediction errors due to uncertainties, as recommended by Madsen and Skotner (2005). The recently introduced flexible prediction error method (Flex PEM) by Vermuyten et al. (2018b) analyzes past model residuals and applies an appropriate error correction scheme to prior model predictions based on this analysis. The combined approach of the EnKF and Flex PEM reduces the loss in control performance due to hydrodynamic model uncertainty by about 75% (Vermuyten et al. 2018b). ...
... In this process, the system generates an ensemble of river flow forecasts for the same forecast period, considering a range of probabilistic assessments of future river flow instead of only one projection. Several case studies published in the literature indicate its decent performance in flood forecasting, particularly for issuing flood alerts with more confidence (Cloke and Pappenberger, 2009). Recent advances in computational power enable hydrological forecasts coupled with hydraulic modelling to be improved through data assimilation (Barthélémy et al., 2018; Costabile and Macchione, 2015; Madsen and Skotner, 2005). This method enables better forecasting by updating the model state, parameters, and initial and boundary conditions at every forecast repetition, yielding more reliable model results. ...
Thesis
Full-text available
Quantifying and reducing uncertainty in a numerical model improve model reliability and enhance model performance. For an urban drainage flow, the state-of-the-art modelling method is dual drainage modelling where the surface drainage system is represented as two dimensional, and the buried piped network is represented as one dimensional. Most of these models use the St. Venant equation to calculate the flow, which does not help with the modelling of drainage structures such as manholes and gullies as they have complex three-dimensional flow. These structures are represented as point entities using head loss coefficients and discharge coefficients. This involves uncertainty in results as the values of these coefficients are highly case dependent. Urban drainage models also require hydrological input boundaries such as discharge and water levels, for which they are dependent on measurements or outputs of a hydrological model. A major part of this thesis focuses on finding the uncertainty of manhole head loss coefficients and gully discharge coefficients (Chapter 3 to Chapter 6). The last part of the thesis (Chapter 7) focuses on quantifying uncertainties in rainfall-runoff hydrological model results to ensure efficient flood forecasting. In analysing the uncertainty of the manhole and gully modelling, the primary focus was given to understanding the hydraulics and the flow through these structures correctly. Three different sets of laboratory experiments were conducted in characterising the manhole and the gully flow. In the first measurement set, velocity measurements were done in a small-scale manhole. Stereoscopic Particle Image Velocimetry was utilised to record the three velocity components in three different two-dimensional planes at various hydraulic conditions. In the second experimental measurement set, a prototype scale manhole was utilised to measure its flow and surcharge depths at different surcharges. 
In the third experimental work, velocity measurements were conducted in a prototype gully using ADV. Computational fluid dynamics (CFD) was used to investigate the flow behaviour and inspect the manhole head loss coefficients and gully discharge coefficients at distinct conditions. The scaled manhole model was replicated primarily utilising OpenFOAM® CFD modelling tools. A multiphase Volume of Fluid (VOF) model was constructed by applying four different Reynolds Averaged Navier Stokes (RANS) turbulence models. The PIV measurement data was used to validate the CFD modelling procedure. The comparison showed that among four different RANS models the RNG k-ε model is the best choice considering model result quality of velocity and pressure as well as time requirement in model simulation. Later, RNG k-ε model was applied in replication of the prototype scale manhole used in the laboratory. The model also showed good performance when model data were compared with the experimental measurement of discharge and pressure. Lastly, the validated CFD model was used to find uncertainty of manhole head loss coefficients in prototype scale manhole models. Three different manhole moulds found in the drainage systems were analysed numerically: Type A with sump zone, Type B without a sump and Type C with benching. The effect of small bending angles and manhole of inlet diameter ratios were further checked for Type A manhole. Results showed that at very small bending angles, jet flow core region can still dissipate through the outlet pipe and causes a minimal increase of head loss at the post threshold surcharge. At higher bending angles, the core jet flow disappears in the manhole, and higher head loss coefficients are observed for all surcharge. The Type A manhole showed an alternation of the hydraulic regime at manhole diameter ratio of 3.0. Over this ratio, manholes showed indication of threshold surcharge zone, which was absent in lower ratios. 
The CFD model of the prototype gully showed good agreement when compared to the ADV velocity data. The validated gully model was used to check different gully discharge coefficients for different gully outlet water levels. Three different downstream surcharge levels were identified for which the gully flows show different discharge coefficients. The last part of the dissertation focuses on quantifying uncertainty in a rainfall-runoff model. A tool was developed which can be used for automated flood forecasting, including estimation of input and parameter uncertainties. It can also forecast flood discharge using an ensemble-based Monte Carlo method. The tool was successfully applied to the city of Kulmbach, located within the Upper Main river catchment in Germany. A new methodology named the 'Discharge Interval Method' is proposed for flood forecasting. When compared with the traditional Monte Carlo method, the Discharge Interval Method was found to efficiently hindcast historical flood events in a fraction of the time without downgrading forecast quality.
... One of the main limitations of the output correction approach implemented in this study is the lack of any spatial error covariance structure for the distributed structure of the Musk3p model. The recommendation for future study is to include such a spatial error covariance using a time-invariant gain function to distribute model errors (based on the error at the measurement point) across the entire state of the river system, in a fashion similar to the one presented by Madsen and Skotner (2005). Moreover, in the case of additional river reaches, more analysis would be needed to attribute the different performances and sensitivities of the proposed model updating to the locations and characteristics of these river reaches (e.g., presence of hydraulic structures, human footprint, climatic region, etc.). ...
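The time-invariant gain idea referred to in the snippet above — spreading the error observed at a gauge over the whole river state — can be sketched as follows. The exponential decay shape and the `width` parameter are illustrative assumptions, not the weighting function published by Madsen and Skotner (2005):

```python
import numpy as np

def gain_weights(n_points, gauge_idx, width):
    """Predefined, time-invariant weighting that decays with distance
    (in grid points) from the gauge; weight is 1 at the gauge itself.
    Illustrative shape only; operational gain functions are tailored."""
    d = np.abs(np.arange(n_points) - gauge_idx)
    return np.exp(-d / width)

def distribute_error(state, gauge_idx, observed, width=3.0):
    """Spread the innovation (observation - model) at the gauge over
    the entire state vector of the river reach."""
    innovation = observed - state[gauge_idx]
    return state + gain_weights(len(state), gauge_idx, width) * innovation

state = np.full(11, 2.0)  # modelled water levels along the reach [m]
updated = distribute_error(state, gauge_idx=5, observed=2.5)
```

At the gauge the model is pulled exactly to the observation, while the correction fades smoothly upstream and downstream, avoiding the discontinuities that a single-point correction would introduce into the routing.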
Article
Full-text available
This study aims at proposing novel approaches for integrating qualitative flow observations in a lumped hydrologic routing model and assessing their usefulness for improving flood estimation. Routing is based on a three‐parameter Muskingum model used to propagate streamflow in five different rivers in the USA. Qualitative flow observations, synthetically generated from observed flow, are converted into fuzzy observations using flow characteristic for defining fuzzy classes. A model states updating method and a model output correction technique are implemented. An innovative application of Interacting Multiple Models, which use was previously demonstrated on tracking in ballistic missile applications, is proposed as state updating method, together with the traditional Kalman filter. The output corrector approach is based on the fuzzy error corrector, which was previously used for robots navigation. This study demonstrates the usefulness of integrating qualitative flow observations for improving flood estimation. In particular, state updating methods outperform the output correction approach in terms of average improvement of model performances, while the latter is found to be less sensitive to biased observations and to the definition of fuzzy sets used to represent qualitative observations.
... The correction of the model states in the 2D hydraulic domain is applied to the floodplain cell where the observation is located, but also to the closest channel cell and to the floodplain cells that are hydraulically connected to that channel cell. The correction is thus propagated upstream and downstream by adopting a gain function similar to Madsen and Skotner (2005), since correcting only one channel cell would bring instability to the flood wave routing model. This correction propagation model is also weighted by a spatially varying correction factor that varies as a function of the distance from the corrected location. ...
Article
Full-text available
Crowdsourced data can effectively observe environmental and urban ecosystem processes. The use of data produced by untrained people into flood forecasting models may effectively allow Early Warning Systems (EWS) to better perform while support decision-making to reduce the fatalities and economic losses due to inundation hazard. In this work, we develop a Data Assimilation (DA) method integrating Volunteered Geographic Information (VGI) and a 2D hydraulic model and we test its performances. The proposed framework seeks to extend the capabilities and performances of standard DA works, based on the use of traditional in situ sensors, by assimilating VGI while managing and taking into account the uncertainties related to the quality, and the location and timing of the entire set of observational data. The November 2012 flood in the Italian Tiber River basin was selected as the case study. Results show improvements of the model in terms of uncertainty with a significant persistence of the model updating after the integration of the VGI, even in the case of use of few-selected observations gathered from social media. This will encourage further research in the use of VGI for EWS considering the exponential increase of quality and quantity of smartphone and social media user worldwide.
... The flood forecast was subsequently improved for lead times up to 48 h. Other studies have focussed on further improving the assimilation procedure by either describing the spatial distribution of error statistics (Madsen and Skotner, 2005) or, more recently, defining the impact of sensor locations on the forecast skill (Mazzoleni et al., 2015). While the use of in situ river observations has long been common practice (Barthélémy et al., 2017; Mazzoleni et al., 2017; Ricci et al., 2011), in situ floodplain observations have only been used occasionally. ...
Article
Reliable flood forecasting systems are the prerequisite for proper flood warning systems. Currently, satellite remote sensing (SRS) observations are widely used to improve model forecasts. Although they provide distributed information, they are sometimes unable to satisfy flood modellers’ needs due to low overpass frequencies and high measuring uncertainties. This paper assesses the potential of sparsely distributed, in situ floodplain water level sensors to provide accurate, near-real time flood information as a means to enhance flood predictions. A synthetic twin experiment evaluates the assimilation of different sensor network configurations, designed through time series clustering and Voronoi spacing. With spatio-temporal RMSEs reaching up to 1 cm, the study demonstrates great potential. Adequate sensor placement proved crucial for improved performance. In practice, observation locations should be chosen such that they are located rather close to the river, to increase the likelihood of early flooding and thus acquiring valuable information at an early stage of flooding. Furthermore, high measuring frequencies benefit the simulations, though one should be careful not to overcorrect water levels as these may result in inconsistencies. Lastly, a network size of 5 to 7 observations yields good results, while an increasing number of observations generally diminishes the importance of extra observations. Our findings could greatly contribute to future flood observing systems to either compensate for ungauged areas, or complement current SRS practices.
... The aim of DA techniques is the efficient use of observational information to enhance predictions of model state variables, using both model and observation uncertainties (Madsen and Skotner, 2005). The EnKF method of DA, used in this research, is widely employed in hydrological applications, since it allows a simple implementation for the non-linear models that predominate in hydrology. ...
Article
Adequate and accurate hydrological data are necessary to manage water resources, which is critical in developing countries where such information is limited. In recent years, global reanalysis datasets have been developed to provide this information for climatic fields and, more recently, for hydrologic fields. Nevertheless, these latest efforts have been limited in temporal coverage (∼30 years) and have mostly used simplified hydraulic routing schemes in rivers, which can be inadequate for regional and long-term objectives. In this article, a dataset (called hydrological reanalysis across the 20th century [HRXX] in the Amazon Basin) was developed as a case study spanning back to the year 1900 through the use of: 1) a large-scale hydrologic-hydrodynamic model (MGB) forced by a long-term climatic reanalysis of rainfall (ERA-20CM) with bias removed; and 2) a data assimilation (DA) technique coupled with a localization method (LEnKF) to use several ground observations of daily discharge within a radius of influence. Several tests were assessed to find the best bias removal method, the optimal radius of influence for the localization method and the final HRXX dataset. A total of 114 hydrological ground observations of daily information were used for assimilation and validation purposes, and several statistical indexes were employed to assess their performance. Results indicate that both the bias correction and the DA with localization method greatly improved the simulations. Overestimations of the peaks in the open-loop (OL) (free run) simulation, mainly in the southern and northern regions of the Amazon Basin, were removed, and recession timing in the east-central region, as seen at the Óbidos gauge station, was corrected. An average performance of ∼0.6 and ∼0.7 for the Nash–Sutcliffe and Kling–Gupta indexes was reached, even when only a few of the longest ground observations were used, which can be representative of the oldest periods (since ∼1930).
To assess extreme events, the Pearson correlation coefficient was used for maximum and minimum annual water level anomaly values, reaching 0.6 and 0.7, respectively, at the Manaus gauge station, which is remarkable considering that the analysis covers approximately 110 years. Considering the results of this case study and the global coverage of rainfall datasets, this methodology can be transferred to other regions in order to better estimate and create a hydrological reanalysis that adequately represents the hydrologic and hydraulic spatio-temporal fields.
... Uncertainty due to initial conditions: this relates to the uncertainty of the land surface state, including the soil moisture, snow cover, and the initial state of the river and other waterbodies in the catchment (Madsen and Skotner, 2005; Gotzinger and Bardossy, 2007; Li et al., 2009). Land surface state measurements are often too sparse to capture the heterogeneity of the land surface, and this is a source of uncertainty. ...
Article
Full-text available
The scientific literature has many methods for estimating uncertainty, however, there is a lack of information about the characteristics, merits and limitations of the individual methods, particularly for making decisions in practice. This paper provides an overview of the different uncertainty methods for flood forecasting that are reported in literature, concentrating on two established approaches defined as the ensemble and the statistical approach. Owing to the variety of flood forecasting and warning systems in operation, the question ‘which uncertainty method is most suitable for which application’ is difficult to answer readily. The paper aims to assist practitioners in understanding how to match an uncertainty quantification method to their particular application using two flood forecasting system case studies in Belgium and Canada. These two specific applications of uncertainty estimation from the literature are compared, illustrating statistical and ensemble methods, and indicating the information and output that these two types of methods offer. The advantages, disadvantages and application of the two different types of method are identified. Although there is no one ‘best’ uncertainty method to fit all forecasting systems, this review helps to explain the current commonly used methods from the available literature for the non‐specialist.
... This model updating approach improves the initial conditions of the prediction model and thus also the prediction accuracy. The effect of this update will, however, wash out with increasing lead time as no observations are yet available for this forecast period (Madsen and Skotner, 2005). Prediction error methods analyze the past model residuals and apply an error correction scheme to improve the prediction accuracy. ...
Article
Recently, a combination of model predictive control and a reduced genetic algorithm (RGA-MPC) has been shown to be an efficient control technique for real-time flood control, making use of fast conceptual river models. This technique was so far only tested under ideal circumstances of perfect model predictions. Prediction errors originating from hydrodynamic model mismatches, however, result in a deterioration of the real-time control performance. Therefore, this paper presents two extensions of the RGA-MPC technique. First, a new type of conceptual model is introduced to further increase the computational efficiency. This reduced conceptual model is specially tailored for real-time flood control applications by eliminating all unnecessary intermediate calculations needed to obtain the flood control objectives and by introducing a new transport element by means of flow matrices. Furthermore, the RGA-MPC technique is extended with a flexible data assimilation approach that analyzes the past observed errors and applies an appropriate error prediction scheme. The proposed approach largely compensates for the loss in control performance due to the hydrodynamic model uncertainty.
Article
The accurate prediction of wave height attenuation due to vegetation is crucial for designing effective and efficient natural and nature-based solutions for flood mitigation, shoreline protection, and coastal ecosystem preservation. Central to these predictions is the estimation of the vegetation drag coefficient (Cd). The present study undertakes a comprehensive evaluation of three distinct methodologies for estimating the drag coefficient: traditional manual calibration, calibration using a novel application of state-of-the-art metaheuristic optimization algorithms, and the integration of an empirical bulk drag coefficient formula (Tanino and Nepf, 2008) into the XBeach non-hydrostatic wave model. These methodologies were tested using a series of existing laboratory experiments involving nearshore vegetation on a sloping beach. A key innovation of the study is the first application of metaheuristic optimization algorithms for calibrating the drag coefficient, which enables efficient automated searches to identify optimal values aligning with measurements. We found that the optimization algorithms rapidly converge to precise drag coefficients, enhancing accuracy and overcoming limitations in manual calibration which can be laborious and inconsistent. While the integrated empirical formula also demonstrates reasonable performance, the optimization approach exemplifies the potential of computational techniques to transform traditional practices of model calibration. Comparing these strategies provides a framework to determine effective methodology based on constraints in determining the vegetation drag coefficient.
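Automated calibration of a drag coefficient against measurements, as described in the abstract above, can be sketched with a toy attenuation model and a plain random search. The attenuation formula, `beta`, and the search bounds are assumptions made for illustration; the study itself uses the XBeach model and more sophisticated metaheuristic algorithms:

```python
import random

def wave_height(cd, x, h0=1.0, beta=0.5):
    """Toy attenuation model: wave height decays with distance x through
    vegetation, controlled by drag coefficient cd (illustrative, not XBeach)."""
    return h0 / (1.0 + beta * cd * x)

def rmse(cd, obs):
    """Root-mean-square error between modelled and 'measured' wave heights."""
    return (sum((wave_height(cd, x) - h) ** 2 for x, h in obs) / len(obs)) ** 0.5

def random_search(obs, lo=0.0, hi=5.0, iters=2000, seed=1):
    """Minimal metaheuristic: random search for the cd minimizing the RMSE."""
    rng = random.Random(seed)
    best_cd, best_err = None, float("inf")
    for _ in range(iters):
        cd = rng.uniform(lo, hi)
        err = rmse(cd, obs)
        if err < best_err:
            best_cd, best_err = cd, err
    return best_cd

# synthetic "measurements" generated with a true drag coefficient of 1.2
obs = [(x, wave_height(1.2, x)) for x in (0.0, 2.0, 4.0, 6.0, 8.0)]
cd_hat = random_search(obs)  # recovers a value close to 1.2
```

The same loop structure carries over to genuine metaheuristics (genetic algorithms, particle swarms): only the rule for proposing the next candidate `cd` changes, while the objective remains the misfit between model and measurement.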
Chapter
An updating scheme with high precision and strong robustness is one of the most important factors affecting a real-time flood forecasting system. The standard Kalman filter algorithm is often used for real-time updating because of its timeliness and strong tracking. However, it is sensitive to outliers: a small number of outliers can cause serious degradation. In order to withstand the disruption of the updating process by outliers, a robust Kalman filter method is put forward. A robust weight function is introduced to adjust the weight of the measured data recursively. By compressing the weight of suspicious observations, resulting in a decreased filter gain, the harmful influence of abnormal observations on the determination of the state variables can be resisted effectively and the robustness of the updating can be achieved. The performance of the proposed method has been compared with the standard Kalman filter on data both with and without outliers. The robust method yields robust results and filters out the impact of the abnormal observations. Keywords: Real-time flood forecasting; Outlier; Robust; Kalman filter
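The down-weighting idea in the chapter above — compressing the weight of suspicious observations so that the filter gain shrinks — can be sketched with a Huber-type weight in a scalar Kalman update. This is an illustrative sketch; the chapter's exact weight function may differ:

```python
def huber_weight(innovation, sigma, c=1.5):
    """Robust weight: 1 for ordinary residuals, down-weighted for outliers."""
    r = abs(innovation) / sigma
    return 1.0 if r <= c else c / r

def robust_kalman_update(x_pred, P_pred, z, R, c=1.5):
    """Scalar Kalman update in which the observation variance is inflated
    for suspicious observations, shrinking the gain (illustrative sketch)."""
    innovation = z - x_pred
    w = huber_weight(innovation, (P_pred + R) ** 0.5, c)
    R_eff = R / w                      # w < 1 inflates R, reducing the gain
    K = P_pred / (P_pred + R_eff)
    return x_pred + K * innovation, (1.0 - K) * P_pred

# an outlier observation is largely ignored...
x_out, _ = robust_kalman_update(x_pred=100.0, P_pred=1.0, z=150.0, R=1.0)
# ...while a plausible observation is assimilated normally
x_ok, _ = robust_kalman_update(x_pred=100.0, P_pred=1.0, z=101.0, R=1.0)
```

For the plausible observation the weight is 1 and the update reduces to the standard Kalman filter; for the outlier the gain collapses, so the state barely moves.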
Article
Full-text available
Water level simulation for complex water river networks is complex, and existing forecasting models are mainly used for single-channel rivers. In this paper, we present a new data assimilation model based on the ensemble Kalman filter (EnKF) for accurate water level simulation in complex river networks. The EnKF-based data model was tested on simulated water level data from a river network hydrodynamic model and optimized through parameter analysis. It was then applied to a real mountainous single-channel river and plain river network and compared with a data assimilation model based on the extended Kalman filter (EKF). The results showed that the EnKF-based model, with a medium ensemble sample size of 100–150, normal observation noise of 0.0001–0.01 m, and a high standard deviation of 0.01–0.1 m, outperformed the EKF-based model, with a 49% reduction in simulation errors, a 45% reduction in calculation cost, and a 43% reduction in filtering time. Furthermore, the EnKF-based data assimilation model predicted the water level in the plain river network better than the mountainous single-channel river. Around 5 to 8 h were required for data assimilation; afterwards, the model could make accurate predictions covering 20 to 30 h. The EnKF-based data assimilation model offers a potential solution for water level predictions in river networks.
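A stochastic EnKF analysis step of the kind the abstract above applies to river water levels can be sketched as follows. The observation operator, noise levels, and ensemble size are illustrative assumptions, not the paper's configuration:

```python
import numpy as np

def enkf_update(ensemble, H, z, obs_std, rng):
    """Stochastic EnKF analysis step. ensemble has shape (n_members,
    n_states); H is a linear observation operator picking the gauged state;
    z is the observation. Illustrative sketch with a single observation."""
    n, _ = ensemble.shape
    hx = ensemble @ H                       # predicted observations, (n,)
    P_hh = np.var(hx, ddof=1)               # forecast variance in obs space
    P_xh = ((ensemble - ensemble.mean(0)).T @ (hx - hx.mean())) / (n - 1)
    K = P_xh / (P_hh + obs_std ** 2)        # Kalman gain, (n_states,)
    perturbed = z + obs_std * rng.standard_normal(n)  # perturbed observations
    return ensemble + np.outer(perturbed - hx, K)

rng = np.random.default_rng(42)
ens = 2.0 + 0.3 * rng.standard_normal((100, 4))   # water levels at 4 nodes
H = np.array([0.0, 0.0, 1.0, 0.0])                # gauge at node 2
analysis = enkf_update(ens, H, z=2.4, obs_std=0.05, rng=rng)
```

Because the gain is built from sample covariances, states correlated with the gauged node are corrected too, which is what lets a few gauges update an entire river network; the ensemble spread at the gauged node also shrinks, reflecting the assimilated information.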
Technical Report
Full-text available
This report presents a review of flash flood forecasting methods and systems as an initial phase of a study aiming to develop a forecasting system that will be used by the Bureau of Meteorology, Australia, to provide a flash flood warning service. In the report, characteristics of flash floods and causal factors such as hydro-meteorological processes, and hydrologic and hydraulic processes are described. Intense rainfall is the most common causal factor for flash flood formation. Advances in rainfall forecasting techniques and the usage of remotely sensed data for flash flood forecasting are presented. Advanced techniques for merging different sources of information (e.g. ground data, radar and satellite observations, numerical weather prediction model outputs) to produce better rainfall estimates and long lead-time rainfall forecasts are discussed. A number of flash flood forecasting models and methods are briefly described. The authors recognize that Physically-based Distributed Hydrological Models (PDHM) are more appropriate for flash flood forecasting than data-driven models and conceptual hydrological models. Their ability to account for the spatial distribution of rainfall and of catchment state variables (e.g. soil moisture), and their potential for application to poorly gauged catchments, are advantages. Real-time updating of hydrological model parameters in the flash flood forecasting system is recommended. The two potential methods for flash flood forecasting in Australia are recognized as the flash flood guidance (FFG) method (section 4.2) and the statistical-distributed modelling (SDM) method (section 4.1.5). Their characteristics and operational feasibility are discussed. Further, a probabilistic forecast approach that better represents the uncertainty associated with forecasts is recommended. Some of the operational flood forecasting systems used in different countries are described, along with their advantages and limitations.
Finally, future research directions to improve the quality of flash flood forecasts are discussed.
Thesis
This thesis is part of a partnership between CEREMA (Centre for Studies and Expertise on Risks, the Environment, Mobility and Urban Planning), EDF R&D, CERFACS (European Centre for Research and Advanced Training in Scientific Computing) and SCHAPI (Central Service for Hydrometeorology and Flood Forecasting Support). To produce the twice-daily flood vigilance maps, SCHAPI and the 19 SPCs (Flood Forecasting Services) distributed across the territory rely, among other tools, on the results of numerical models generally run deterministically (meteorological forecasts, hydrological and hydraulic modelling). The objective of the thesis is the implementation and evaluation of hydrological and hydraulic ensemble forecasts within the state flood vigilance service, in order to better understand and reduce uncertainties in a short- and medium-range forecasting context (24 hours). The originality of this work lies in the hybrid use of physically based models and machine-learning models trained on a large volume of data. To this end, meteorological forecasts force a chained hydrology-hydraulics model to provide discharge and water-level forecasts. To account for the various sources of uncertainty related to the numerical models, the model parameters and the associated data, the deterministic approach is replaced by an ensemble approach, yielding an ensemble of discharge and water-level forecasts. The study area is the Odet catchment in Finistère. The upstream part of the basin is modelled by a hydrological model (GRP or MORDOR-TS).
It provides a discharge forecast that forces the 1D hydraulic model MASCARET, which in turn forecasts water levels at the downstream vigilance stations. First, a global sensitivity analysis (GSA) is carried out on the hydrological and hydraulic models, as a prerequisite to generating the ensemble forecasts. The GSA identifies the main sources of uncertainty, so that the uncertain parameters significant for the representation of forecast discharges and water levels can be perturbed. Propagating these uncertainties yields a raw ensemble for hydrology and for hydraulics, the hydrological ensembles being used to force the hydraulic ensembles. Two ensemble-correction methods are then investigated in the thesis: statistical calibration via the Quantile Regression Forest random-forest method, and calibration by data assimilation via an ensemble Kalman filter (EnKF). Both approaches are shown to significantly improve ensemble performance in terms of reliability and resolution. Finally, the performance of the ensemble forecasts is compared for hydrology and hydraulics, and recommendations are made for the operational generation of ensemble forecasts within the SPCs.
Article
In recent decades, the DC resistivity method has been applied to geophysical monitoring because of its sensitivity to hydrogeological properties. However, existing inversion algorithms cannot give a reasonable image if fluid migration is sudden and unpredictable. Additionally, systematic or measurement errors can severely interfere with accurate object location. To address these issues, we propose an improved time series inversion method for cross-hole electrical resistivity tomography (cross-hole ERT) based on the Extended Kalman Filter (EKF). Traditional EKF includes two steps to obtain the current model state: prediction and correction. We improved the prediction step by introducing the grey time series prediction method to create a new regular model sequence that can infer the potential trend of underground resistivity changes and provide a prior estimation state for reference during the next moment. To include more current information in the prior estimation state and decrease the non-uniqueness, the prediction model needs to be further updated by the least-squares method. For the correction step, we used single time-step multiple filtering to better deal with the case of sudden and rapid changes. We designed three different numerical tests simulating rapid changes in a fluid to validate the proposed method. The proposed method can capture rapid changes in the groundwater transport rate and direction of the groundwater movement for real-time imaging. Model and field experiments were performed. The inversion results of the model experiment were generally consistent with the results of dye tracing, and the groundwater behavior in the field experiment was consistent with the predicted groundwater evolution process.
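The two-step EKF cycle the abstract refers to can be sketched in a few lines (a minimal, generic illustration with an invented one-dimensional model and observation operator, not the paper's grey-prediction variant):

```python
import numpy as np

def ekf_step(x, P, f, jac_f, h, jac_h, Q, R, y):
    """One EKF cycle: predict with the (possibly nonlinear) model f,
    then correct with observation y through the observation operator h."""
    F = jac_f(x)                          # model Jacobian at the prior state
    x_pred = f(x)
    P_pred = F @ P @ F.T + Q              # prediction (prior covariance)
    H = jac_h(x_pred)                     # observation Jacobian at the prediction
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ (y - h(x_pred))  # correction
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Toy 1-D state tracked through a linear model and a nonlinear observation
x, P = np.array([1.0]), np.eye(1)
f = lambda x: 0.9 * x
jac_f = lambda x: np.array([[0.9]])
h = lambda x: x ** 2                      # nonlinear observation operator
jac_h = lambda x: np.array([[2.0 * x[0]]])
for y_obs in [1.1, 0.9, 0.8]:
    x, P = ekf_step(x, P, f, jac_f, h, jac_h,
                    Q=0.01 * np.eye(1), R=0.1 * np.eye(1),
                    y=np.array([y_obs]))
```

The paper improves the prediction step with a grey time-series model and applies multiple filtering per time step; the skeleton above only shows the base predict/correct structure.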
Article
Study region The Sirikit Dam in the Nan River Basin is located on a main tributary of the Chao Phraya River in Thailand. Study focus This study investigates forecasting river flows and real-time optimization of dam releases using a distributed hydrological model with ensemble weather forecasting for reservoir operations providing hydropower and irrigation facilities in Thailand, during a case study of the 2019 drought event. Medium-range ensemble precipitation forecasts were employed with a hydrological model to predict the real-time reservoir inflow. Real-time optimization of the water release strategy, determined a week in advance with an effective initial condition for hydropower generation and irrigation, was conducted for different scenarios using dynamic programming with inflow predictions. New hydrological insights for the region The medium-range ensemble precipitation forecast produced by the European Centre for Medium-Range Weather Forecasts was used to quantify precipitation for the study basin. The ensemble precipitation forecast with the hydrological model was employed for inflow prediction of the study basin, which is located in a tropical climate with distinct wet and dry seasons. The initial conditions of the hydrological model largely influenced the real-time inflow forecast. To determine the initial conditions of the model, empirical data assimilation considering a drainage-area factor was utilized, and observed precipitation data were used as model input forcing during the initial analysis period. This method improved the reservoir inflow prediction, and real-time reservoir optimization using dynamic programming with ensemble forecasts provided more efficient operating decisions than employing historical data. The resulting information will be useful for water resource management and may be adapted to other basins in the study region.
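The backward dynamic-programming recursion used for release optimization can be illustrated with a toy deterministic version (all quantities and the quadratic demand penalty are invented for demonstration; the study's objective also covers hydropower terms and ensemble inflows):

```python
import numpy as np

def optimize_releases(inflows, s0, s_max, demand, n_levels=21):
    """Backward dynamic programming over a discretized storage grid.
    The stage reward penalises the squared deficit from a target demand
    (a stand-in for the hydropower/irrigation objectives)."""
    levels = np.linspace(0.0, s_max, n_levels)
    T = len(inflows)
    V = np.zeros(n_levels)                        # terminal value
    policy = np.zeros((T, n_levels), dtype=int)
    for t in range(T - 1, -1, -1):                # backward recursion
        V_new = np.full(n_levels, -np.inf)
        for i, s in enumerate(levels):
            for j, s_next in enumerate(levels):
                r = s + inflows[t] - s_next       # implied release
                if r < 0:                         # cannot create water
                    continue
                value = -(r - demand) ** 2 + V[j]
                if value > V_new[i]:
                    V_new[i], policy[t, i] = value, j
        V = V_new
    i = int(np.argmin(np.abs(levels - s0)))       # roll the policy forward
    releases = []
    for t in range(T):
        j = policy[t, i]
        releases.append(levels[i] + inflows[t] - levels[j])
        i = j
    return releases

releases = optimize_releases([5.0, 5.0, 5.0], s0=10.0, s_max=20.0, demand=5.0)
```

With inflow exactly matching demand, the optimal policy keeps storage constant and releases the demand at every stage.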
Article
Full-text available
Increasing renewable energy usage puts extra pressure on decision-making in river hydropower systems. Decision support tools are used for near-future forecasting of the water available. Model-driven forecasting used for river state estimation often provides bad results due to numerous uncertainties. False inflows and poor initialization are some of the uncertainty sources. To overcome this, standard data assimilation (DA) techniques (e.g., ensemble Kalman filter) are used, which are not always applicable in real systems. This paper presents further insight into the novel, tailor-made model update algorithm based on control theory. According to water-level measurements over the system, the model is controlled and continuously updated using proportional–integrative–derivative (PID) controller(s). Implementation of the PID controllers requires the controllers’ parameters estimation (tuning). This research deals with this task by presenting sequential, multi-metric procedure, applicable for controllers’ initial tuning. The proposed tuning method is tested on the Iron Gate hydropower system in Serbia, showing satisfying results. HIGHLIGHTS Uncertainty of the boundary and initial conditions affects model-driven forecasting.; Data Assimilation is used to overcome these problems.; Research presents potential of using novel, tailor-made, PID controllers-based data assimilation method for river hydraulic models update.; Method could be used as a decision-support tool for hydropower systems control.; Sequential, multi-metric tuning procedure has been introduced.;
Article
Full-text available
In this study, we proposed and tested a method based on a system response curve (SRC) to extract the error information of areal mean rainfall. These extracted rainfall errors are then used to update the rainfall time series and are propagated through the model to update the streamflow forecast. The method is evaluated via synthetic and real-data cases. After statistical analysis in different application scenarios, we found that the SRC method can effectively improve the accuracy of real-time flood forecasting when an optimal updating width is selected (the average relative error was reduced from 1.65% to 0.86%). In addition, we benchmark our results against a more conventional AR (Auto-Regression) streamflow-updating method; the average accuracy improvement of the SRC method is 6.3% higher than that of the AR method. More importantly, we found that the optimal updating width of the SRC method is highly correlated with the average lead time of the basin, which has guiding significance for selecting the optimal updating width in practical applications.
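The conventional AR updating scheme used as benchmark can be sketched as follows (a minimal AR(1) illustration with made-up numbers; the study's SRC method itself is not reproduced here):

```python
import numpy as np

def ar1_error_forecast(residuals, horizon):
    """Fit e[t] = a * e[t-1] to recent model residuals by least squares,
    then propagate the last observed residual over the forecast horizon."""
    e = np.asarray(residuals, dtype=float)
    a = float(np.dot(e[:-1], e[1:]) / np.dot(e[:-1], e[:-1]))
    return [a ** k * e[-1] for k in range(1, horizon + 1)]

model_fc = [10.0, 11.0, 12.0]          # raw model forecast (invented)
residuals = [2.0, 1.0, 0.5]            # observed minus simulated, up to "now"
corr = ar1_error_forecast(residuals, 3)
corrected = [m + c for m, c in zip(model_fc, corr)]
```

The correction decays geometrically with lead time, which is why pure AR updating loses skill quickly beyond the basin's memory.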
Article
Full-text available
Multiple factors including rainfall and underlying surface conditions make river basin real-time flood forecasting very challenging. It is often necessary to use real-time correction techniques to modify the forecasting results so that they reach satisfactory accuracy. There are many such techniques in use today; however, they tend to have a weak physical conceptual basis, relatively short forecast periods, unsatisfactory correction effects, and other problems. The mechanism that affects real-time flood forecasting error is very complicated; the strongest influencing factors corresponding to this mechanism affect the runoff yield of the forecast model. This paper proposes a feedback correction algorithm that traces the error back to its source, namely the watershed runoff. The runoff yield error is investigated using the principle of least squares estimation, and a unit hydrograph is introduced into the real-time flood forecast correction to form a feedback correction model that traces back to the source of information. The model is established and verified by comparison with an ideal model. The correction effects of the runoff yield errors are also compared over different ranges. The proposed method shows a stronger correction effect and enhanced prediction accuracy compared with the traditional method. It is also simple in structure and has a clear physical concept, without requiring added parameters or forecast period truncation. It is readily applicable in actual river basin flood forecasting scenarios.
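The core idea, inverting discharge errors through a unit hydrograph by least squares to recover runoff-yield errors, can be sketched like this (an illustrative toy; the paper's actual formulation is richer):

```python
import numpy as np

def runoff_error_correction(uh, q_err):
    """Estimate runoff-yield errors from discharge errors by inverting the
    unit-hydrograph convolution with least squares: q_err = A @ dr."""
    n = len(q_err)
    A = np.zeros((n, n))
    for t in range(n):                  # lower-triangular convolution matrix
        for k in range(t + 1):
            if t - k < len(uh):
                A[t, k] = uh[t - k]
    dr, *_ = np.linalg.lstsq(A, np.asarray(q_err, dtype=float), rcond=None)
    return dr

uh = [0.5, 0.3, 0.2]                    # invented unit hydrograph ordinates
dr = runoff_error_correction(uh, [0.5, 0.3, 0.2])
```

Here the discharge-error sequence equals one unit hydrograph, so the inversion attributes it to a single unit of runoff error in the first interval.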
Article
The accurate monitoring and early warning of river water level is an important measure to ensure the safety of life and property of river basin residents, and high-precision forecasting of river water level is a vital prerequisite to realizing this requirement. Therefore, a hybrid model based on the Singular Spectrum Analysis (SSA) method, the Group Method of Data Handling (GMDH) neural network, Weighted Integration based on Accuracy and Diversity (WIAD) and the Kernel Extreme Learning Machine (KELM) algorithm, namely the SSA-WIAD-GMDH-KELM model, is proposed to forecast river water levels. Original data collected continuously and in real time from four monitoring stations on two rivers in China are chosen to demonstrate the quality of the proposed hybrid model. In order to investigate the forecasting performance of the proposed hybrid model, several comparison models are selected, consisting of the single GMDH model, the SSA-GMDH model, the SSA-WIAD-GMDH model and the SSA-GMDH-KELM model. The experimental results show that: (1) the prediction effect of the SSA-WIAD-GMDH-KELM model on river water level is satisfactory, which has been verified on four groups of original water level series, and the proposed model has good generalization ability; (2) in the proposed hybrid model, SSA can efficiently extract the principal component of the original series, GMDH has good prediction stability, and both WIAD and KELM can effectively improve the prediction accuracy of the model.
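The SSA step, extracting the principal component of a water-level series, can be sketched as follows (a minimal textbook SSA; the full SSA-WIAD-GMDH-KELM chain is not reproduced):

```python
import numpy as np

def ssa_leading(series, window, n_components=1):
    """Textbook SSA: embed the series in a Hankel trajectory matrix, take
    the SVD, keep the leading components, and reconstruct a smoothed
    series by anti-diagonal averaging."""
    x = np.asarray(series, dtype=float)
    N, L = len(x), window
    K = N - L + 1
    X = np.column_stack([x[i:i + L] for i in range(K)])   # L x K trajectory
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xr = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components]
    recon, counts = np.zeros(N), np.zeros(N)
    for j in range(K):                  # anti-diagonal (Hankel) averaging
        recon[j:j + L] += Xr[:, j]
        counts[j:j + L] += 1
    return recon / counts

smooth = ssa_leading([1.0] * 10, window=3)   # a constant series is rank one
```

A constant series has a rank-one trajectory matrix, so a single component reconstructs it exactly; on real water levels the leading components carry trend and dominant oscillations.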
Article
Full-text available
Model-driven forecasting, used for flood risks or big hydropower systems management, can produce results of unsatisfying accuracy even with best-calibrated hydrodynamic models. One of the biggest uncertainty sources is the inflow data, either produced by different hydrological models or obtained using unreliable rating curves. To keep the model in the up-to-date state, data assimilation techniques are used. The aim of the assimilation is to reduce the difference between simulated and observed state of selected variables by updating hydrodynamic model state variables according to observed water levels. The widely used data assimilation method applicable for nonlinear hydrodynamic models is Ensemble Kalman Filter (EnKF). However, this method can often increase the computational time due to complexity of mathematical apparatus, making it less applicable in everyday operations. This paper presents the novel, fast, tailor-made data assimilation method, suitable for 1D open channel hydraulic models, based on control theory. Using Proportional-Integrative-Derivative (PID) controllers, the difference between measured levels and simulated levels obtained by hydrodynamic model is reduced by adding or subtracting the flows in the junctions/sections where water levels are measured. The novel PID control-based data assimilation (PID-DA) is compared to EnKF. Benchmarking shows that PID-DA can be used for data assimilation, even coupled with simplified 1D hydraulic models, without significant sacrifice of stability and accuracy, and with reduction of computational time up to 63 times.
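The core PID-DA idea, turning a water-level mismatch into a corrective flow at the measurement section, can be sketched as follows (a toy with a unit-area storage standing in for a 1D hydraulic model; the gains are arbitrary, untuned values):

```python
class PIDAssimilator:
    """PID controller turning the gap between observed and simulated water
    level into a corrective flow injected at the gauge location."""
    def __init__(self, kp, ki, kd, dt=1.0):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = None

    def correction(self, h_obs, h_sim):
        err = h_obs - h_sim
        self.integral += err * self.dt
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Toy surrogate for the hydraulic model: a unit-area storage whose level
# rises by the corrective flow each step; it is steered onto the observation.
pid = PIDAssimilator(kp=0.5, ki=0.1, kd=0.0)
h_sim, h_obs = 0.0, 1.0
for _ in range(50):
    h_sim += pid.correction(h_obs, h_sim)
```

The integral term removes the steady offset that a purely proportional correction would leave; tuning these gains per gauge is exactly the task the sequential, multi-metric procedure addresses.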
Article
Full-text available
The paper presents a classification and a review of the updating procedures currently used in real-time flood forecasting modelling. On the basis of results from the WMO project 'Simulated Real-Time Intercomparison of Hydrological Models', comprising more than 10 commonly used hydrological models and a variety of different updating procedures, an analysis of the relative importance of updating procedures and hydrological simulation models is provided. In particular, an intercomparison is made between two models (NAMS11/MIKE11 and NAMKAL) consisting of the same hydrological model (NAM conceptual rainfall-runoff) but containing different routing modules (linear reservoirs versus hydraulic routing) and different updating procedures (error prediction versus state variable updating based on an extended Kalman filter). A main conclusion is that updating procedures significantly improve the performance of hydrological models for short-range forecasting. Furthermore, there are no clear conclusions regarding which type of updating procedure performs better. However, intercomparison of the NAMS11 and NAMKAL models indicates that the extended Kalman filter is marginally better than an error prediction model in cases where the basic hydrological model simulation is good. Finally, it is concluded that the basic simulation is very essential for accurate forecasts, and that the better the basic simulations are, the better the updating routines generally function. This puts emphasis on the importance of thoroughly calibrating and validating the hydrological simulation models before applying them together with updating routines in operational real-time forecasting.
Article
Full-text available
The successful application of a conceptual rainfall-runoff (CRR) model depends on how well it is calibrated. Despite the popularity of CRR models, reports in the literature indicate that it is typically difficult, if not impossible, to obtain unique optimal values for their parameters using automatic calibration methods. Unless the best set of parameters associated with a given calibration data set can be found, it is difficult to determine how sensitive the parameter estimates (and hence the model forecasts) are to factors such as input and output data error, model error, quantity and quality of data, objective function used, and so on. Results are presented that establish clearly the nature of the multiple optima problem for the research CRR model SIXPAR. These results suggest that the CRR model optimization problem is more difficult than had been previously thought and that currently used local search procedures have a very low probability of successfully finding the optimal parameter sets. Next, the performance of three existing global search procedures are evaluated on the model SIXPAR. Finally, a powerful new global optimization procedure is presented, entitled the shuffled complex evolution (SCE-UA) method, which was able to consistently locate the global optimum of the SIXPAR model, and appears to be capable of efficiently and effectively solving the CRR model optimization problem.
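A drastically simplified shuffled-complex-evolution loop conveys the idea (this is a sketch, not the published SCE-UA algorithm, which uses competitive complex evolution with probabilistic parent selection and several refinements):

```python
import numpy as np

def sce_minimize(f, bounds, n_complexes=2, complex_size=5, n_iter=100, seed=0):
    """Sort the population, deal it into complexes, improve each complex by
    reflecting its worst point through the centroid of the others (falling
    back to contraction, then to a random restart), shuffle, repeat."""
    rng = np.random.default_rng(seed)
    lo = np.array([b[0] for b in bounds], dtype=float)
    hi = np.array([b[1] for b in bounds], dtype=float)
    pop = rng.uniform(lo, hi, size=(n_complexes * complex_size, len(bounds)))
    for _ in range(n_iter):
        pop = pop[np.argsort([f(p) for p in pop])]        # best first
        for c in range(n_complexes):
            idx = np.arange(c, len(pop), n_complexes)     # deal like cards
            cx = pop[idx]
            worst = max(range(len(cx)), key=lambda i: f(cx[i]))
            centroid = np.delete(cx, worst, axis=0).mean(axis=0)
            reflect = np.clip(2 * centroid - cx[worst], lo, hi)
            contract = 0.5 * (centroid + cx[worst])
            if f(reflect) < f(cx[worst]):
                cx[worst] = reflect
            elif f(contract) < f(cx[worst]):
                cx[worst] = contract
            else:
                cx[worst] = rng.uniform(lo, hi)           # random restart
            pop[idx] = cx
        # shuffling happens implicitly via the sort-and-deal above
    return min(pop, key=f)

best = sce_minimize(lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2,
                    bounds=[(-5.0, 5.0), (-5.0, 5.0)])
```

The random restarts are what give the method its robustness against the multiple local optima described in the abstract, while the reflection/contraction moves provide local refinement.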
Article
Full-text available
This paper describes a somewhat alternative approach to combining observations and numerical model results in order to produce a more accurate forecast routine. The approach utilizes artificial neural networks to analyze and forecast the errors created by numerical models. The resulting hybrid model provides very good forecast skills that can be extended over a forecasting horizon of considerable length. The method has been developed for the purpose of operational forecasting of current speeds in the Danish Øresund Strait. The forecast system was used as a planning tool during the construction of the 16 km-long fixed link across the Øresund Strait, linking the countries of Denmark and Sweden.
Article
Full-text available
The NAM rainfall-runoff model (a lumped, conceptual model developed in Denmark) has been reformulated in a state space form, and the Kalman filtering algorithm has been incorporated. Uncertainties on rainfall input and on the measured discharges are taken into account, as well as the uncertainties on the most important model parameters. When the Kalman filtering algorithm is applied as an updating procedure, the model can be used for real time operation. Further, due to the inclusion of the most important sources of uncertainty, the state space model can be used for calculation of uncertainty bands on the simulated streamflows. For instance, the effects of parameter uncertainty and rainfall uncertainty, respectively, can be evaluated and compared. The general approach and the fundamental principles of the modelling are described. Furthermore, the functioning of the updating procedure and the uncertainty analyses are illustrated by simulation results.
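The state-space form with a Kalman filter update and uncertainty bands can be illustrated with a two-reservoir toy model (all coefficients and data are invented; the actual NAM state and noise description are more elaborate):

```python
import numpy as np

# Two linear reservoirs in series as a state-space rainfall-runoff model:
# state x = [s1, s2] (storages), input u (rainfall), discharge y = s2 / k2.
k1, k2, dt = 10.0, 20.0, 1.0
A = np.array([[1 - dt / k1, 0.0],
              [dt / k1, 1 - dt / k2]])
B = np.array([dt, 0.0])
C = np.array([[0.0, 1.0 / k2]])
Q = 0.05 * np.eye(2)       # model and rainfall-input uncertainty
R = np.array([[0.01]])     # discharge measurement uncertainty

x, P = np.array([5.0, 5.0]), np.eye(2)
band = []                  # filter standard deviation of simulated discharge
for u, y_obs in [(1.0, 0.30), (0.0, 0.28), (2.0, 0.35)]:
    x = A @ x + B * u                          # prediction step
    P = A @ P @ A.T + Q
    S = C @ P @ C.T + R                        # update with measured discharge
    K = P @ C.T @ np.linalg.inv(S)
    x = x + K.ravel() * (y_obs - (C @ x).item())
    P = (np.eye(2) - K @ C) @ P
    band.append(np.sqrt((C @ P @ C.T).item()))
```

The same covariance P that drives the update also yields the uncertainty band on simulated streamflow, which is the dual use the abstract highlights.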
Article
Full-text available
A number of algorithms to solve large-scale Kalman filtering problems have been introduced recently. The ensemble Kalman filter represents the probability density of the state estimate by a finite number of randomly generated system states. Another algorithm uses a singular value decomposition to select the leading eigenvectors of the covariance matrix of the state estimate and to approximate the full covariance matrix by a reduced-rank matrix. Both algorithms, however, still require a huge amount of computer resources. In this paper the authors propose to combine the two algorithms and to use a reduced-rank approximation of the covariance matrix as a variance reductor for the ensemble Kalman filter. If the leading eigenvectors explain most of the variance, which is the case for most applications, the computational burden to solve the filtering problem can be reduced significantly (up to an order of magnitude).
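The reduced-rank idea, keeping only the leading eigenvectors of the covariance of the state estimate, can be sketched on synthetic data (the combined reduced-rank/ensemble filter itself is not reproduced):

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic ensemble: 50 states of dimension 200 dominated by 3 modes
modes = rng.standard_normal((200, 3))
states = modes @ rng.standard_normal((3, 50)) + 0.01 * rng.standard_normal((200, 50))
anoms = states - states.mean(axis=1, keepdims=True)

# Reduced-rank approximation: SVD of the anomalies, keep the leading vectors
U, s, _ = np.linalg.svd(anoms, full_matrices=False)
rank = 3
explained = np.sum(s[:rank] ** 2) / np.sum(s ** 2)   # variance captured
P_reduced = (U[:, :rank] * s[:rank] ** 2 / (50 - 1)) @ U[:, :rank].T
```

When, as here, a few modes explain almost all the variance, the rank-3 matrix is a near-exact stand-in for the full 200 x 200 covariance, which is exactly the premise behind using it as a variance reductor.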
Article
This paper investigates the adaptive use of a simple conceptual lumped rainfall-runoff model based on a Probability Distributed Model complemented with a Geomorphological Unit Hydrograph. Three different approaches for updating the model and for its use in real-time flood forecasting are compared: the first two are based on a parameter updating approach; in the third procedure the model is cast into a state-space form and an Extended Kalman Filter is applied for the on-line estimation of the state variables. The comparison shows that the procedure based on the filtering techniques provides more reliable results; acceptable results are also obtained by using a parameter updating approach based on the on-line adjustment of the initial conditions.
Article
Data assimilation in operational forecasting systems is a discipline undergoing rapid development. Despite the ever increasing computational resources, it requires efficient as well as robust assimilation schemes to support online prediction products. The parameter considered for assimilation here is water levels from tide gauge stations. The assimilation approach is Kalman filter based and examines the combination of the Ensemble Kalman Filter with spatial and dynamic regularization techniques. Further, both a Steady Kalman gain approximation and a dynamically evolving Kalman gain are considered. The estimation skill of the various assimilation schemes is assessed in a 4-week hindcast experiment using a setup of an operational model in the North Sea and Baltic Sea system. The computationally efficient dynamic regularization works very well and is to be encouraged for water level nowcasts. Distance regularization gives much improved results in data sparse areas, while maintaining performance in areas with a denser distribution of tide gauges.
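The steady Kalman gain approximation amounts to a fixed linear correction of the state; a sketch (the gain matrix and gauge layout are invented for illustration):

```python
import numpy as np

def steady_gain_update(x, K, H, y):
    """Constant-gain update: correct the state with a precomputed,
    time-invariant Kalman gain instead of evolving the covariance."""
    return x + K @ (y - H @ x)

# Toy: 5 water-level points, gauges at nodes 1 and 3; the fixed gain
# spreads each gauge innovation to neighbours with decaying weights.
H = np.zeros((2, 5)); H[0, 1] = 1.0; H[1, 3] = 1.0
K = np.array([[0.5, 0.1],
              [1.0, 0.2],
              [0.5, 0.5],
              [0.2, 1.0],
              [0.1, 0.5]])
x = steady_gain_update(np.zeros(5), K, H, np.array([0.2, -0.1]))
```

Since K is fixed, each assimilation step costs only a matrix-vector product, which is why steady-gain schemes suit operational nowcasting.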
Article
A digital model has been developed for the simulation of the rainfall-runoff process of rural watersheds. Input data are daily values of precipitation and temperature together with mean monthly potential evapotranspiration. The model produces daily values of streamflow as well as information on the time variation of the soil moisture content. In all, ten model parameters have to be identified, seven of which have a major influence on the performance of the model. The model operates by accounting continuously for the moisture content in four different and mutually interrelated storages representing physical elements in the watershed. It has been applied to three different Danish watersheds. Several statistical measures of accuracy have been utilized for a quantitative evaluation of the simulation results. The simulations demonstrate that the main shortcomings of the model are due to the lack of a procedure accounting for frozen ground during extended periods of frost, which could improve some of the simulation results during winter and spring.
Article
This chapter focuses upon real-time hydrological forecasting and the mathematical apparatus necessary to carry out such forecasting. To successfully design real-time forecasting models, one requires an in-depth understanding of both hydrology and statistics. The chapter emphasizes the statistical and systems theory aspects of forecasting since the rest of this book emphasizes the hydrological aspects. Three aspects are crucial to successful forecasting. One is the importance of representing the hydrological model within the feedback structure of a state vector model. This leads to the second aspect of the chapter, the importance of data collection to model and parameter identification. The third part of this chapter presents a number of useful algorithms for estimating the parameters from noisy data.
Article
A procedure is presented for assimilation of water levels and fluxes in the MIKE 11 Flood Forecasting (FF) system. The procedure implemented is based on the ensemble Kalman filter that provides a cost-effective and efficient updating and uncertainty propagation scheme for real-time applications. Up to the time of forecast, the model is updated according to the Kalman filter algorithm using the available measurements. In forecast mode, the Kalman filter provides an ensemble forecast that is used for estimation of water levels and fluxes in the river system and the associated uncertainties. A test example is presented where the MIKE 11 FF system is applied for flood forecasting in the Piedmont region in the northwestern part of Italy. Application of the ensemble Kalman filter significantly improves the forecast skills as compared to forecasting without data assimilation.
Article
Kalman Filtering with Real-Time Applications presents a thorough discussion of the mathematical theory and computational schemes of Kalman filtering. The filtering algorithms are derived via different approaches, including a direct method consisting of a series of elementary steps, and an indirect method based on innovation projection. Other topics include Kalman filtering for systems with correlated noise or colored noise, limiting Kalman filtering for time-invariant systems, extended Kalman filtering for nonlinear systems, interval Kalman filtering for uncertain systems, and wavelet Kalman filtering for multiresolution analysis of random signals. Most filtering algorithms are illustrated by using simplified radar tracking examples. The style of the book is informal, and the mathematics is elementary but rigorous. The text is self-contained, suitable for self-study, and accessible to all readers with a minimum knowledge of linear algebra, probability theory, and system engineering.
Article
Data assimilation in a two-dimensional hydrodynamic model for bays, estuaries and coastal areas is considered. Two different methods based on the Kalman filter scheme are presented. These include (1) an extended Kalman filter in which the error covariance matrix is approximated by a matrix of reduced rank using a square root factorisation (RRSQRT KF), and (2) an ensemble Kalman filter (EnKF) based on a Monte Carlo simulation approach for propagation of errors. The filtering problem is formulated by utilising a general description of the model noise process related to errors in the model forcing, i.e. open boundary conditions and meteorological forcing. The performance of the two Kalman filters is evaluated using a twin experiment based on a hypothetical bay region. For both filters, the error covariance approximation sufficiently resolves the error propagation in the model at a computational load that is significantly smaller than required by the full Kalman filter algorithm. Furthermore, the Kalman filters are shown to be very robust with respect to defining the errors. Even in the case of a severely biased model error, the filters are able to efficiently correct the model. In general, the use of coloured model noise provides a numerically more efficient algorithm as well as a better performance of the filter. The error covariance approximation in the RRSQRT KF is shown to be more efficient than the error representation in the EnKF. For strongly non-linear dynamics, however, the EnKF is preferable. Copyright © 1999 John Wiley & Sons, Ltd.
Article
A tank model for rainfall-runoff is used for application of the Kalman filter. The state vector, representing the parameters of the tank model and its initial values, was estimated by trial and error. The tank model using the Kalman filter with a recursive algorithm accurately predicted runoff in a basin in Korea. The filter allowed the model parameters to vary in time and reduced the physical uncertainty of the rainfall-runoff process in the river basin.
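Letting the model parameters vary in time, as in this tank-model application, is commonly done by augmenting the state with the parameters and filtering the augmented system; a sketch with an extended Kalman filter on a one-tank toy (all coefficients and the noise levels are invented; the paper uses a multi-tank model):

```python
import numpy as np

# One-tank model with unknown outflow coefficient k, estimated jointly with
# storage by augmenting the state: z = [s, k], observed runoff y = k * s.
def step(z, rain):
    s, k = z
    return np.array([s + rain - k * s, k])   # k follows a random walk

def jac(z, rain):
    s, k = z
    return np.array([[1.0 - k, -s],
                     [0.0, 1.0]])

true_k, s_true = 0.3, 10.0
z = np.array([10.0, 0.1])                    # deliberately poor guess of k
P = np.diag([1.0, 0.1])
Q = np.diag([0.01, 1e-4])                    # small noise keeps k adaptive
R = np.array([[0.01]])
rng = np.random.default_rng(0)
for _ in range(200):
    rain = rng.uniform(0.0, 2.0)
    s_true = s_true + rain - true_k * s_true # synthetic truth
    y = true_k * s_true
    F = jac(z, rain)                         # EKF predict
    z = step(z, rain)
    P = F @ P @ F.T + Q
    H = np.array([[z[1], z[0]]])             # d(k*s)/d[s, k], EKF update
    S = H @ P @ H.T + R
    K = (P @ H.T) / S
    z = z + (K * (y - z[0] * z[1])).ravel()
    P = (np.eye(2) - K @ H) @ P
```

The random-walk noise on k is what allows the filter to track a parameter that drifts in time, which is the behaviour the abstract describes.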
Article
The development and implementation of the Piemonte Region's real-time Flood Forecasting System is described. The area of interest is the Upper Po River basin (Northwest Italy) of approximately 37000 km2, with a river network of about 3000 km and 3 big lakes. FloodWatch, a GIS-based decision support system for real-time flood forecasting, has been developed and operationally used since June 2000 at the Piemonte Region's Room for the Situation of Natural Hazards in Torino, Italy. FloodWatch is based on MIKE 11 modules which provide continuous lumped hydrological modelling of 187 tree-structured subcatchments connected by a 1D distributed hydrodynamic model. It is directly linked to the existing telemetric system, which provides measured data from more than 270 meteorological stations (rainfall and temperature) and about 80 water level gauging stations. In addition, FloodWatch uses quantitative precipitation and temperature forecasts issued daily by the Regional Meteorological Service for the 11 zones into which the study area is subdivided. At present, FloodWatch automatically supplies operational forecasts of water level and discharge at 73 locations for up to 48 hours. The development of a fast and reliable flow forecasting system for this large and heterogeneous river basin required a careful balance between the need for rapid and accurate forecasts and a correct representation of run-off generation, flood propagation, baseflows, and snow accumulation and melting. Strengths and limits of the system are discussed, addressing needs for future development. Some results are presented with particular regard to the October 2000 flood event, when the northwest of Italy experienced one of the largest floods on record. Heavy and prolonged rainfall fell across the entire Po river basin. The flood inundated vast areas, causing widespread damage; thousands of people were warned and alerted to evacuate.
Article
A sequential data assimilation technique (utilising a reduced-rank square-root filter) has been incorporated into the complete non-linear equations of a hydrodynamical module of a standard two-dimensional hydrodynamic modelling tool. The deterministic model output is corrected using the available data and the best model initial conditions are generated. The deterministic model is then used to generate the water level forecast. A description of the general technique that is applied and its adaptation to this particular problem is provided. The application to a real storm that occurred during February 1993 in the North Sea is presented.
Article
Genetic programming (GP), a relatively new evolutionary technique, is demonstrated in this study to evolve codes for the solution of problems. First, a simple example in the area of symbolic regression is considered. GP is then applied to real-time runoff forecasting for the Orgeval catchment in France. In this study, GP functions as an error updating scheme to complement a rainfall-runoff model, MIKE11/NAM. Hourly runoff forecasts of different updating intervals are performed for forecast horizons of up to nine hours. The results show that the proposed updating scheme is able to predict the runoff quite accurately for all updating intervals considered and particularly for updating intervals not exceeding the time of concentration of the catchment. The results are also compared with those of an earlier study, by the World Meteorological Organization, in which autoregression and Kalman filter were used as the updating methods. Comparisons show that GP is a better updating tool for real-time flow forecasting. Another important finding from this study is that nondimensionalizing the variables enhances the symbolic regression process significantly.
Article
A new sequential data assimilation method is discussed. It is based on forecasting the error statistics using Monte Carlo methods, a better alternative than solving the traditional and computationally extremely demanding approximate error covariance equation used in the extended Kalman filter. The unbounded error growth found in the extended Kalman filter, which is caused by an overly simplified closure in the error covariance equation, is completely eliminated. Open boundaries can be handled as long as the ocean model is well posed. Well-known numerical instabilities associated with the error covariance equation are avoided because storage and evolution of the error covariance matrix itself are not needed. The results are also better than what is provided by the extended Kalman filter since there is no closure problem and the quality of the forecast error statistics therefore improves. The method should be feasible also for more sophisticated primitive equation models. The computational load for reasonable accuracy is only a fraction of what is required for the extended Kalman filter and is given by the storage of, say, 100 model states for an ensemble size of 100 and thus CPU requirements of the order of the cost of 100 model integrations. The proposed method can therefore be used with realistic nonlinear ocean models on large domains on existing computers, and it is also well suited for parallel computers and clusters of workstations where each processor integrates a few members of the ensemble.
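A minimal sketch of the Monte Carlo analysis step described above (an ensemble update with perturbed observations, so the error statistics are carried by the ensemble rather than an explicit covariance equation). Ensemble size, observation operator and noise levels are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_analysis(E, y, H, obs_std):
    """One EnKF analysis step. E is an (n_state, n_ens) ensemble; each member
    is updated with a perturbed observation so the analysis ensemble retains
    the correct spread. No full covariance matrix is stored or evolved."""
    n_ens = E.shape[1]
    A = E - E.mean(axis=1, keepdims=True)       # ensemble anomalies
    HE = H @ E
    HA = HE - HE.mean(axis=1, keepdims=True)
    # Innovation covariance: sampled forecast part plus observation noise.
    P_hh = (HA @ HA.T) / (n_ens - 1) + (obs_std**2) * np.eye(len(y))
    K = (A @ HA.T) / (n_ens - 1) @ np.linalg.inv(P_hh)
    # One perturbed-observation realisation per member.
    Y = y[:, None] + obs_std * rng.standard_normal((len(y), n_ens))
    return E + K @ (Y - HE)

# Toy example: 4 state variables, 200 members, the first variable observed.
n_state, n_ens = 4, 200
E = 1.0 + 0.5 * rng.standard_normal((n_state, n_ens))
H = np.zeros((1, n_state)); H[0, 0] = 1.0
y = np.array([2.0])
E_a = enkf_analysis(E, y, H, obs_std=0.1)
```

The storage is just the ensemble itself, and members can be propagated independently between analyses, which is what makes the method parallelise so naturally.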
Article
An initial implementation of an operational system for the Danish waters has been carried out. The system consists of an observational network of water level stations, a two-dimensional model that computes simultaneously areas with different grid resolution, and a sequential data assimilation method. For assimilation of water level measurements an approximate Kalman filter algorithm, the ensemble Kalman filter, has been implemented. The ensemble Kalman filter has a computational cost much lower than the cost associated with a full Kalman filter application, and therefore it is suitable to be applied in an operational system. The error covariance matrix in this specific case tends to a quasi-steady state after a few days of assimilation. Thus, a Kalman filter with a constant weighting matrix has been applied during a 13-day test period to assimilate observed water levels in a model covering the entire North Sea and Baltic Sea area. The corrections achieved by the assimilation procedure are significant in most of the validation stations. Moreover, the error estimation provided by the filter is very accurate, especially in the inner Danish waters.
Article
In this work, we propose a modified form of the extended Kalman filter (KF) for assimilating oceanic data into numerical models. Its development consists essentially of approximating the error covariance matrix by a singular low rank matrix, which amounts in practice to making no correction in those directions for which the error is the most attenuated by the system. This not only reduces the implementation cost but may also improve the filter stability as well. These "directions of correction" evolve with time according to the model evolution, which constitutes the most original feature of this filter and distinguishes it from other sequential assimilation methods based on the projection onto a fixed basis of functions. A method for initializing the filter based on the empirical orthogonal functions (EOF) is also described. An example of assimilation based on the quasi-geostrophic (QG) model for a square ocean domain with a certain wind stress forcing pattern is given. Although this is only a simple test case designed to assess the feasibility of the method, the results are very encouraging.
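The low-rank "directions of correction" idea can be sketched by building an EOF basis from a history of model states and correcting the state only within its span. The least-squares projection used here is a simplification of the full SEEK gain, and all data are illustrative:

```python
import numpy as np

def leading_eofs(state_history, r):
    """EOF initialisation: the leading r left singular vectors of the
    centred state history span the directions in which corrections act."""
    X = state_history - state_history.mean(axis=1, keepdims=True)
    U, s, _ = np.linalg.svd(X, full_matrices=False)
    return U[:, :r]                 # (n_state, r) orthonormal basis

def low_rank_update(x, y, H, U):
    """Correct the state only within the subspace spanned by U; directions
    outside the basis (strongly damped by the dynamics) get no correction."""
    innovation = y - H @ x
    G = H @ U                       # observed part of each basis vector
    # Least-squares coefficients of the correction in the EOF basis.
    coeffs, *_ = np.linalg.lstsq(G, innovation, rcond=None)
    return x + U @ coeffs

# Toy rank-1 history: every state is a multiple of one spatial mode.
v1 = np.ones(5) / np.sqrt(5.0)
history = np.outer(v1, np.array([1.0, 2.0, 3.0, -1.0]))
U = leading_eofs(history, 1)
H = np.zeros((1, 5)); H[0, 0] = 1.0
x_new = low_rank_update(np.zeros(5), np.array([1.0]), H, U)
```

Because the history is rank one, a single point observation is spread coherently over the whole state along the dominant mode.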
Article
This work is directed toward approximating the evolution of forecast error covariances for data assimilation. The performance of different algorithms based on simplification of the standard Kalman filter (KF) is studied. These are suboptimal schemes (SOSs) when compared to the KF, which is optimal for linear problems with known statistics. The SOSs considered here are several versions of optimal interpolation (OI), a scheme for height error variance advection, and a simplified KF in which the full height error covariance is advected. To employ a methodology for exact comparison among these schemes, a linear environment is maintained, in which a beta-plane shallow-water model linearized about a constant zonal flow is chosen for the test-bed dynamics. The results show that constructing dynamically balanced forecast error covariances rather than using conventional geostrophically balanced ones is essential for successful performance of any SOS. A posteriori initialization of SOSs to compensate for model-data imbalance sometimes results in poor performance. Instead, properly constructed dynamically balanced forecast error covariances eliminate the need for initialization. When the SOSs studied here make use of dynamically balanced forecast error covariances, the difference among their performances progresses naturally from conventional OI to the KF. In fact, the results suggest that even modest enhancements of OI, such as including an approximate dynamical equation for height error variances while leaving height error correlation structure homogeneous, go a long way toward achieving the performance of the KF, provided that dynamically balanced cross-covariances are constructed and that model errors are accounted for properly. The results indicate that such enhancements are necessary if unconventional data are to have a positive impact.
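The "height error variance advection" enhancement of OI can be sketched as transporting a variance field with the mean zonal flow. A first-order upwind scheme on a periodic 1-D grid is used here purely for illustration; grid, flow speed and boundary treatment are assumptions:

```python
import numpy as np

def advect_variance(var, u, dx, dt, steps):
    """Advect a height error variance field with a constant zonal flow u > 0
    using a first-order upwind scheme with periodic boundaries. This mimics
    letting the error variance move with the dynamics instead of keeping it
    static, as in conventional OI."""
    c = u * dt / dx
    assert 0.0 < c <= 1.0, "CFL condition for the upwind scheme"
    v = np.asarray(var, dtype=float)
    for _ in range(steps):
        v = v - c * (v - np.roll(v, 1))   # upwind difference, periodic wrap
    return v

# A localised variance bump moves downstream with the flow.
var0 = np.zeros(8)
var0[2] = 1.0
var3 = advect_variance(var0, u=1.0, dx=1.0, dt=1.0, steps=3)
```

With a Courant number of exactly one, the upwind scheme shifts the field by one grid point per step, so the bump ends up three points downstream with its amplitude intact.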
Article
The Kalman filter algorithm can be used for many data assimilation problems. For large systems that arise from discretizing partial differential equations, the standard algorithm has huge computational and storage requirements, making direct use infeasible for many applications. In addition, numerical difficulties may arise if, due to finite-precision computations or approximations of the error covariance, the requirement that the error covariance be positive semi-definite is violated. In this paper an approximation to the Kalman filter algorithm is suggested that solves these problems for many applications. The algorithm is based on a reduced rank approximation of the error covariance using a square root factorization. The use of the factorization ensures that the error covariance matrix remains positive semi-definite at all times, while the smaller rank reduces the number of computations and storage requirements. The number of computations and storage required depends on the problem at hand, but will typically be orders of magnitude smaller than for the full Kalman filter without significant loss of accuracy. The algorithm is applied to a model based on a linearized version of the two-dimensional shallow water equations for the prediction of tides and storm surges. For non-linear models the reduced rank square root algorithm can be extended in a similar way as the extended Kalman filter approach. Moreover, by introducing a finite difference approximation to the Reduced Rank Square Root algorithm it is possible to avoid the use of a tangent linear model for the propagation of the error covariance, which otherwise demands a large implementation effort when an extended Kalman filter is used.
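The key property of the square-root factorization, that truncating the factor still yields a positive semi-definite covariance, can be illustrated directly. The matrix sizes and target rank below are arbitrary:

```python
import numpy as np

def truncate_sqrt_factor(S, r):
    """Reduce an (n, q) square-root factor S of P = S S^T to rank r by
    keeping the leading left singular directions. P_r = S_r S_r^T is
    positive semi-definite by construction, which is what the reduced rank
    square root scheme exploits: indefiniteness from round-off or
    approximation simply cannot occur in the factorised representation."""
    U, s, _ = np.linalg.svd(S, full_matrices=False)
    return U[:, :r] * s[:r]          # (n, r) reduced factor

# Illustrative check: a random wide factor reduced to rank 2.
rng = np.random.default_rng(1)
S = rng.standard_normal((6, 5))
S2 = truncate_sqrt_factor(S, 2)
P2 = S2 @ S2.T
```

Storage drops from an n x n covariance to an n x r factor, and the filter recursions can be written entirely in terms of the factor.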
Madsen, H., Rosbjerg, D., Damgård, J., Hansen, F.S., 2003. Data assimilation in the MIKE 11 flood forecasting system using Kalman filtering. In: Water Resource Systems-Hydrological Risk, Management and Development (Proceedings of symposium HS02b held during IUGG2003 at Sapporo, July 2003). IAHS Publ. no. 281, 75-81.
Hartnack, J., Madsen, H., 2001. Data assimilation in river flow modelling. In: Proceedings of the Fourth DHI Software Conference, June 2001, Helsingør, Denmark. http://www.dhisoftware.com/uc2001/Abstracts_Proceedings/Proceedings/water_resources_track.htm
Havnø, K., Madsen, M.N., Dørge, J., 1995. MIKE 11—a generalized river modelling package. In: Singh, V.P. (Ed.), Computer Models of Watershed Hydrology. Water Resources Publications, Colorado, pp. 733-782.
Madsen, H., Butts, M.B., Khu, S.T., Liong, S.Y., 2000. Data assimilation in rainfall-runoff forecasting. In: Hydroinformatics 2000, Fourth International Conference on Hydroinformatics, Cedar Rapids, Iowa, USA, 23-27 July 2000.
Abbott, M.B., Minns, A.W., 1998. Computational Hydraulics, second ed. Ashgate, Aldershot.
Babovic, V., Cañizares, R., Jensen, H.R., Klinting, A., 2001. Neural networks as routine for error updating of numerical models. J. Hydraul. Eng., ASCE 127 (3), 181-193.
DHI, 2003. MIKE 11. A modelling system for rivers and channels, Reference Manual. DHI Water & Environment.
WMO, 1992. Simulated real-time intercomparison of hydrological models. Operational Hydrology Report No. 38, World Meteorological Organisation, Geneva.