Victor Venema's research while affiliated with University of Bonn and other places

Publications (107)

Article
The homogenization of observational climate series is a necessary step before any study of their internal variability can be confidently undertaken, since changes in the observation methods or in the surroundings of the observatories, for instance, can introduce biases in the data of the same order of magnitude as the underlying climate variations and...
Article
Full-text available
There is considerable import in creating more complete, better understood holdings of early meteorological data. Such data permit an improved understanding of climate variability and long-term changes. Early records are particularly incomplete in the tropics, with implications for estimates of global and regional temperature. There is also a relati...
Preprint
Full-text available
There is considerable import in creating more complete, better understood, holdings of early meteorological data. Such data permit an improved understanding of climate variability and long-term changes. Early records are particularly incomplete in the tropics, with implications for estimates of global and regional temperature. There is also a relat...
Article
We use symbolic regression to estimate daily precipitation amounts at six stations in the Alpine region from a global reanalysis. Symbolic regression only prescribes the set of mathematical expressions allowed in the regression model, but not its structure. The regression models are generated by genetic programming (GP) in analogy to biological evo...
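The core idea can be illustrated with a heavily simplified sketch: regression models as expression trees over a fixed set of operators, improved by mutation-only hill climbing on synthetic data. This is illustrative only, not the genetic-programming system used in the paper (a full GP also uses a population and crossover), and the target relationship below is hypothetical.

```python
import math
import random

# Minimal symbolic-regression sketch: expression trees over {+, -, *, protected /},
# improved by mutation-only hill climbing. Synthetic data, not Alpine stations.
OPS = {
    "+": lambda a, b: a + b,
    "-": lambda a, b: a - b,
    "*": lambda a, b: a * b,
    "/": lambda a, b: a / b if abs(b) > 1e-6 else 1.0,  # protected division
}

def random_tree(depth, rng):
    if depth == 0 or rng.random() < 0.3:
        return "x" if rng.random() < 0.5 else rng.uniform(-2, 2)
    op = rng.choice(list(OPS))
    return (op, random_tree(depth - 1, rng), random_tree(depth - 1, rng))

def evaluate(tree, x):
    if tree == "x":
        return x
    if isinstance(tree, float):
        return tree
    op, left, right = tree
    return OPS[op](evaluate(left, x), evaluate(right, x))

def rmse(tree, xs, ys):
    try:
        err = [(evaluate(tree, x) - y) ** 2 for x, y in zip(xs, ys)]
        return math.sqrt(sum(err) / len(err))
    except OverflowError:
        return float("inf")

def mutate(tree, rng):
    # Replace a randomly chosen subtree with a fresh random tree.
    if rng.random() < 0.3 or not isinstance(tree, tuple):
        return random_tree(2, rng)
    op, left, right = tree
    if rng.random() < 0.5:
        return (op, mutate(left, rng), right)
    return (op, left, mutate(right, rng))

def fit(xs, ys, iters=500, seed=1):
    rng = random.Random(seed)
    best = random_tree(3, rng)
    best_err = rmse(best, xs, ys)
    for _ in range(iters):
        cand = mutate(best, rng)
        err = rmse(cand, xs, ys)
        if err < best_err:  # accept only improvements
            best, best_err = cand, err
    return best, best_err

xs = [i / 10 for i in range(-20, 21)]
ys = [x * x + x for x in xs]  # hypothetical target relationship
model, err = fit(xs, ys)
```

Because only improving mutations are accepted, the final error never exceeds that of the random starting tree; the appeal of symbolic regression is that the discovered model structure itself is free, not prescribed.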
Article
The aim of time series homogenization is to remove non-climatic effects, such as changes in station location, instrumentation, observation practices, etc., from observed data. Statistical homogenization usually reduces the non-climatic effects, but does not remove them completely. In the Spanish MULTITEST project, the efficiencies of automatic homo...
Technical Report
Full-text available
This is a WMO publication, WMO-No. 1245.
Article
Subsurface hydrodynamics are an important component of the hydrological cycle and a key factor in the partitioning of land surface water and energy fluxes. For computational reasons they are often neglected, or strongly simplified, in numerical weather prediction and climate models. Particularly in regions where the water table is shallow, s...
Article
Inhomogeneities in station series are a large part of the uncertainty budget of long‐term temperature trend estimates. This paper introduces two analytical equations for the dependence of the station trend uncertainty on the statistical properties of the inhomogeneities. One equation is for inhomogeneities that act as random deviations (RD) from a...
Article
Full-text available
The early twentieth-century warming (EW; 1910–45) and the mid-twentieth-century cooling (MC; 1950–80) have been linked to both internal variability of the climate system and changes in external radiative forcing. The degree to which either of the two factors contributed to EW and MC, or both, is still debated. Using a two-box impulse response model...
Preprint
Full-text available
Climate data is affected by inhomogeneities due to historical changes in the way the measurements were performed. Understanding these inhomogeneities is important for accurate estimates of long-term changes in the climate. These inhomogeneities are typically characterized by the number of breaks and the size of the jumps or the variance of the brea...
Article
Climate data is affected by inhomogeneities due to historical changes in the way the measurements were performed. Understanding these inhomogeneities is important for accurate estimates of long‐term changes in the climate. These inhomogeneities are typically characterized by the number of breaks and the size of the jumps or the variance of the brea...
Preprint
Full-text available
Draft guidance on the homogenisation of climate station data of the World Meteorological Organisation.
Article
Inhomogeneities in climate data are the main source of uncertainty for secular warming estimates. To reduce the influence of inhomogeneities in station data statistical homogenization compares a candidate station to its neighbours to detect and correct artificial changes in the candidate. Many studies have quantified the performance of statistical...
Conference Paper
Symbolic regression is used to estimate daily time series of local station precipitation amounts from global climate model output with a coarse spatial resolution. Local precipitation is of high importance in climate impact studies. Standard regression, minimizing the RMSE or a similar point-wise error, by design underestimates temporal variability...
Article
Three homogenization methods (ACMANT, MASH and HOMOP) have been evaluated for their efficiency in homogenizing daily relative humidity data. A homogeneous surrogate data set based on Austrian stations was created and perturbed to simulate inhomogeneous, realistic time series ("validation data sets"). Two validation data sets ("simple" and "complex"...
Article
Full-text available
There is overwhelming evidence that the climate system has warmed since the instigation of instrumental meteorological observations. The Fifth Assessment Report of the Intergovernmental Panel on Climate Change concluded that the evidence for warming was unequivocal. However, owing to imperfect measurements and ubiquitous changes in measurement netw...
Preprint
Full-text available
Inhomogeneities in climate data are the main source of uncertainty for secular warming estimates. To reduce the influence of inhomogeneities in station data statistical homogenization compares a candidate station to its neighbors to detect and correct artificial changes in the candidate. Many studies have quantified the performance of statistical b...
Preprint
Full-text available
As part of the COST Action HOME a dataset has been generated that will serve as a benchmark for homogenisation algorithms. Members of the Action and third parties have been invited and are still welcome to homogenise this dataset. The results of this exercise were analysed to obtain recommendations for a standard homogenisation procedure and are des...
Preprint
Full-text available
Instrumental climate records of the last centuries suffer from multiple breaks due to relocations and changes in measurement techniques. These breaks are detected by relative homogenization algorithms using the difference time series between a candidate and a reference. Modern multiple changepoint methods use a decomposition approach where the segm...
Article
Full-text available
Global surface temperature changes are a fundamental expression of climate change. Recent, much-debated variations in the observed rate of surface temperature change have highlighted the importance of uncertainty in adjustments applied to sea surface temperature (SST) measurements. These adjustments are applied to compensate for systematic biases a...
Conference Paper
Coupled models of the soil-vegetation-atmosphere systems are increasingly used to investigate interactions between the system components. Due to the different spatial and temporal scales of relevant processes and computational restrictions, the atmospheric model generally has a lower spatial resolution than the land surface and subsurface models. W...
Article
Full-text available
Long historical climate records usually contain non-climatic changes that can influence the observed behaviour of meteorological variables. The availability of parallel measurements offers an ideal occasion to study these discontinuities as they record the same climate. The transition from manual to automatic measurements has been analysed in this...
Presentation
Full-text available
A description of the main progress made by the Parallel Observations Scientific Team (POST).
Article
Long instrumental climate records suffer from inhomogeneities due to, e.g. relocations of the stations or changes in instrumentation, which may introduce sudden jumps into the time series. These inhomogeneities may mask or strengthen true trends. Relative homogenization algorithms use the difference time series of a candidate station with neighbori...
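The difference-series idea behind relative homogenization can be sketched with a toy single-changepoint detector. Everything below (series length, jump size, noise levels, the least-squares estimator) is an illustrative assumption, not any specific published method such as SNHT or PRODIGE.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic example: candidate and reference share the regional climate signal;
# the candidate additionally has an artificial jump (e.g. a relocation) at t = 60.
n, true_break, jump = 120, 60, 1.0
climate = np.cumsum(rng.normal(0, 0.1, n))  # shared low-frequency signal
candidate = climate + rng.normal(0, 0.1, n)
candidate[true_break:] += jump              # the inhomogeneity
reference = climate + rng.normal(0, 0.1, n)

# The difference series removes the shared climate, leaving noise plus the jump.
diff = candidate - reference

# Least-squares single-changepoint estimator: maximise the weighted squared
# difference of the two segment means over all candidate break positions k.
stats = np.array([
    k * (n - k) / n * (diff[:k].mean() - diff[k:].mean()) ** 2
    for k in range(1, n)
])
detected = int(np.argmax(stats)) + 1
```

Subtracting the reference is what makes the jump detectable at all: in the candidate series alone it would be buried in the climate signal.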
Article
Full-text available
Described herein is the first version release of monthly temperature holdings of a new Global Land Surface Meteorological Databank. Organized under the auspices of the International Surface Temperature Initiative (ISTI), an international group of scientists have spent three years collating and merging data from numerous sources to create a merged h...
Article
Full-text available
The International Surface Temperature Initiative (ISTI) is striving towards substantively improving our ability to robustly understand historical land surface air temperature change at all scales. A key recently completed first step has been collating all available records into a comprehensive open access, traceable and version-controlled databank....
Article
Full-text available
The coupling of models for the different components of the Soil-Vegetation-Atmosphere-System is required to investigate component interactions and feedback processes. However, the component models for atmosphere, land-surface and subsurface are usually operated at different resolutions in space and time owing to the dominant processes. The computat...
Article
As part of the COST Action HOME (Advances in homogenisation methods of climate series: an integrated approach), a dataset was generated that serves as a validation tool for correction of daily inhomogeneities. The dataset contains daily air temperature data and was generated based on the temperature series from the Czech Republic. The validation da...
Article
Coupling models for the different components of the Soil-Vegetation-Atmosphere-System requires up-and downscaling procedures. Subject of our work is the downscaling scheme used to derive high resolution forcing data for land-surface and subsurface models from coarser atmospheric model output. The current downscaling scheme [Schomburg et. al. 2010,...
Article
Daily datasets have become a focus of climate research because they are essential for studying the variability and extremes in weather and climate. However, all long observational climate records are usually affected by changes due to nonclimatic factors and looking at the known physical causes of inhomogeneities, one would even expect that many ca...
Article
Since the computational burden of radiative transfer parameterisations is considerable, operational atmospheric models use various sampling, coarsening and interpolation techniques to reduce this load, which, however, introduce errors. An adaptive radiative transfer scheme combines an accurate with a fast parameterisation. The task of the computati...
Article
Full-text available
Handling complexity to the smallest detail in atmospheric radiative transfer models is unfeasible in practice. On the one hand, the properties of the interacting medium, i.e., the atmosphere and the surface, are only available at a limited spatial resolution. On the other hand, the computational cost of accurate radiation models accounting for thre...
Article
The earth’s surface is characterized by small-scale heterogeneity attributable to variability in land cover, soil characteristics and orography. In atmospheric models, this small-scale variability can be partially accounted for by the so-called mosaic approach, i.e., by computing the land-surface processes on a grid with an explicit higher horizont...
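Why subgrid heterogeneity matters for a nonlinear surface scheme can be shown with a toy Jensen's-inequality example. The quadratic "flux law" and the tile values below are purely illustrative assumptions, not the model physics of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical nonlinear flux law: flux grows quadratically with, say, soil
# moisture. The functional form is illustrative only.
def flux(moisture):
    return moisture ** 2

# One coarse atmospheric grid cell containing 16 heterogeneous land tiles.
tiles = rng.uniform(0.1, 0.9, 16)

aggregate_first = flux(tiles.mean())  # flux of the grid-mean state (no mosaic)
mosaic = flux(tiles).mean()           # mean of the tile fluxes (mosaic approach)

# For a convex flux law the mosaic estimate is strictly larger (Jensen's
# inequality); ignoring subgrid variability therefore biases the flux.
```

The mosaic approach pays the cost of evaluating the surface scheme per tile precisely to avoid this aggregation bias.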
Article
Lack of homogeneity of long-term series of in-situ precipitation observations is a known problem and requires time consuming manual data correction in order to allow for a robust trend analysis. This work is focused on the development of an algorithm for automatic data correction of multiple stations. The algorithm relies on the similarity of clima...
Article
As part of the COST Action HOME (Advances in homogenisation methods of climate series: an integrated approach) a dataset was generated that serves as a benchmark for homogenisation algorithms. This presentation will shortly describe this benchmark dataset and focus on the results and lessons learned. Based upon a survey among homogenisation experts...
Article
Full-text available
In recent years increasing effort has been devoted to objectively evaluate the efficiency of homogenisation methods for climate data; an important effort was the blind benchmarking performed in the COST Action HOME (ES0601). The statistical characteristics of the examined series have significant impact on the measured efficiencies, thus it is diffi...
Article
Full-text available
Handling complexity to the smallest detail in atmospheric radiative transfer models is unfeasible in practice. On the one hand, the properties of the interacting medium, i.e. the atmosphere and the surface, are only available at a limited spatial resolution. On the other hand, the computational cost of accurate radiation models accounting for thre...
Article
The computational burden of radiative transfer parametrization is considerable, and hence operational atmospheric models use various sampling, coarsening and interpolation techniques to reduce this load; this, however, introduces new errors. An adaptive radiative transfer scheme takes advantage of the spatial and temporal correlations in the optica...
Poster
A coupled modeling system integrating the atmosphere with land-surface and groundwater components has become one of the key tools to understand the patterns and structures in land-atmosphere interactions. The COSMO-DE (Consortium for Small Scale Modeling - DE) is currently running with a resolution of 2.8 km but ParFlow and CLM (Community Land Model)...
Article
Full-text available
The paper presents an approach for conditional airmass classification based on local precipitation rate distributions. The method seeks, within the potential region, three-dimensional atmospheric predictor domains with high impact on the local scale phenomena. These predictor domains are derived by an algorithm consisting of a clustering method, na...
Article
Full-text available
The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance...
Article
Full-text available
Precipitation downscaling improves the coarse resolution and poor representation of precipitation in global climate models, and helps end users to assess the likely hydrological impacts of climate change. This paper integrates perspectives from meteorologists, climatologists, statisticians and hydrologists, to identify generic end user (in parti...
Article
For driving soil–vegetation–transfer models or hydrological models, high-resolution atmospheric forcing data is needed. For most applications the resolution of atmospheric model output is too coarse. To avoid biases due to the non-linear processes, a downscaling system should predict the unresolved variability of the atmospheric forcing. For this p...
Article
The earth's surface is characterized by heterogeneity at a broad range of scales. Weather forecast models and climate models are not able to resolve this heterogeneity at the smaller scales. Many processes in the soil or at the surface, however, are highly nonlinear. This holds, for example, for evaporation processes, where stomata or aerodynamic r...
Article
Full-text available
Two main groups of statistical methods used in the Earth sciences are geostatistics and stochastic modelling. Geostatistical methods, such as various kriging algorithms, aim at estimating the mean value for every point as well as possible. In case of sparse measurements, such fields have less variability at small scales and a narrower distribution...
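The smoothing effect described here is easy to demonstrate numerically. In the sketch below, linear interpolation of sparse samples stands in for kriging (both are mean-value estimators and share the smoothing property); the white-noise "truth" and the sampling interval are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

# "True" field with variability at all scales (white noise for simplicity).
x_full = np.arange(200)
field = rng.normal(0, 1, 200)

# Sparse measurements every 10th point; linear interpolation stands in for
# kriging here, as both estimate the mean value between observations.
x_obs = x_full[::10]
interp = np.interp(x_full, x_obs, field[::10])

# The interpolated field has a narrower distribution than the truth; a
# stochastic surrogate would instead be constructed to match the full
# small-scale variability.
var_true = field.var()
var_interp = interp.var()
```

This variance deficit is exactly why smooth estimated fields can bias nonlinear downstream calculations such as radiative transfer.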
Article
Common classification approaches usually operate on both a fixed regular domain, like the NCEP/NCAR reanalysis data, and a predefined number of predictors. Such assumptions are convenient for many traditional large-scale weather type classifications. However, for finding linkages between atmospheric characteristics and local phenomena there is no rigo...
Article
Downscaling provides end users with a means to assess the likely regional impacts of climate change. Although downscaling adds considerable value to projections from general circulation models, crucial gaps are the representation of extreme summer precipitation, sub-daily processes, full precipitation fields and small scale processes and feedback...
Article
As part of the COST Action HOME (Advances in homogenisation methods of climate series: an integrated approach) a dataset was generated that serves as a benchmark for homogenisation algorithms. Members of the Action and third parties have been invited to homogenise this dataset. The results of this exercise are analysed by the HOME Working Groups (W...
Article
We present a novel algorithm for the downscaling of three-dimensional cloud fields. The goal of the algorithm is to add realistic subscale variability to a coarse field taking the resolved variability into account. The method is tested by coarse graining high-resolution sparse cumulus and broken stratocumulus clouds in the horizontal plane, downsca...
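The central constraint of such a downscaling, preserving the coarse-resolution means while adding subscale variability, can be sketched in 1D. The field values, refinement factor and white-noise perturbation below are placeholders; real schemes generate noise with a prescribed spectrum or correlation structure.

```python
import numpy as np

rng = np.random.default_rng(3)

# Coarse field (e.g. liquid water content on 4 coarse pixels), to be refined
# by a factor of 8. Values are illustrative.
coarse = np.array([0.2, 0.5, 0.1, 0.4])
factor = 8

# Step 1: replicate each coarse value onto the fine grid.
fine = np.repeat(coarse, factor)

# Step 2: add small-scale noise, then subtract its per-block mean so that the
# average over each coarse pixel is exactly preserved. (White noise is a
# placeholder for noise with a realistic spectrum.)
noise = rng.normal(0, 0.05, fine.size)
noise -= np.repeat(noise.reshape(-1, factor).mean(axis=1), factor)
fine += noise

block_means = fine.reshape(-1, factor).mean(axis=1)
```

After step 2 the block means reproduce the coarse field exactly, so coarse-scale budgets are untouched while the fine field gains variability.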
Article
In many places, relatively complete series of climatological observations going back to the nineteenth century can be assembled. However, measurement conditions have been profoundly modified over time. Changes in location and in instrumentation translate into corresponding biases in the data series. Yet these...
Conference Paper
New technologies permit going to ever finer spatial scales when observing the Earth's atmosphere from spaceborne instruments. At the same time, new computers are equipped with large main memories that allow bigger data arrays of optical properties to be allocated. Historically, strongly approximative approaches have been used for the compu...
Article
We are developing a new multi-station weather generator, i.e. an algorithm that generates time series for a number of climate stations in a region conditioned on the large-scale circulation. The algorithm is based on the so-called surrogate data approach. It is very similar to the Iterative Amplitude Adjusted Fourier Transform (IAAFT) algorithm use...
Article
We implemented a PI (Physical Initialization) method in the non-hydrostatic limited-area model COSMO (version 4.2) of the DWD (German Meteorological Service). The goal is the improvement of quantitative rain nowcasting with a high resolution NWP model. Input radar data is a DWD product: the national radar composite for 16 radars with a spatial reso...
Article
The COST Action ES0601: Advances in homogenisation methods of climate series: an integrated approach is nearing the end of its second year of life. The action is intended to provide the best possible tools for the homogenization of time series to the climate research community. The involved scientists have done remarkable progress since COST Action...
Article
Fields from dynamical models often have an insufficient resolution and need to be disaggregated. In our case we have atmospheric fields at 2.8 km (coarse) resolution and would like to couple these with a soil module running at 400 m (high) resolution. To avoid biases in the computation of the fluxes between the surface and the atmosphere we need to...
Article
Full-text available
Geostatistical methods (kriging) aim at estimating the average value. In case of sparse measurements, such fields are too smooth. This can lead to biases in radiative transfer calculations on such a kriged field. Stochastic modelling, e.g. surrogate data, aims at reproducing the structure of data. Surrogate clouds from (profiling) measurement...
Article
Cloud fields from dynamical models often have resolutions that are insufficient for exact 3-dimensional radiative transfer calculations. To solve this problem, we have developed a downscaling algorithm that produces higher resolution fields, while preserving the original coarse resolution fields of the mean liquid water content and cloud fraction....
Article
As part of the COST Action HOME (Advances in homogenisation methods of climate series: an integrated approach) a dataset is generated that will serve as a benchmark for homogenisation algorithms. Members of the Action and third parties are invited to homogenise this dataset. The results of this exercise will be analysed by the HOME Working Groups (...
Article
The quality of numerical precipitation prediction depends on the accuracy of the model reproducing the true initial state of the atmosphere prior to the forecast. Typically a numerical model needs a spin-up time of several hours until its hydrological cycle is established. Assimilation of precipitation data can reduce the spin-up time significantly...
Article
Full-text available
Air temperature records are commonly subjected to inhomogeneities, e.g., sudden jumps caused by a relocation of the measurement station or by installing a new type of shelter. We study the effect of these inhomogeneities on the estimation of the Hurst exponent and show that they bias the estimates towards larger values. The Hurst exponent is a para...
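The reported bias can be reproduced with a small numerical experiment: a single step change in otherwise short-memory data inflates a Hurst-exponent estimate. The aggregated-variance estimator, series length and jump size below are illustrative choices, not necessarily those of the paper.

```python
import numpy as np

rng = np.random.default_rng(11)

def hurst_aggvar(x, block_sizes=(2, 4, 8, 16, 32, 64)):
    """Aggregated-variance Hurst estimate: Var of block means ~ m**(2H - 2)."""
    n = len(x)
    log_m, log_v = [], []
    for m in block_sizes:
        means = x[: n - n % m].reshape(-1, m).mean(axis=1)
        log_m.append(np.log(m))
        log_v.append(np.log(means.var()))
    slope = np.polyfit(log_m, log_v, 1)[0]
    return 1.0 + slope / 2.0

n = 1024
noise = rng.normal(0, 1, n)   # short-memory series, true H = 0.5

jumpy = noise.copy()
jumpy[n // 2:] += 1.0         # a single inhomogeneity (e.g. a relocation)

h_clean = hurst_aggvar(noise)
h_jumpy = hurst_aggvar(jumpy)
# The step change mimics long-range dependence: block-mean variances decay
# more slowly with block size, so the estimated H is inflated.
```

Intuitively, the step contributes a scale-independent term to the block-mean variance, flattening the log-log slope that the estimator interprets as persistence.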
Article
Full-text available
Radiative transfer calculations in atmospheric models are computationally expensive, even if based on simplifications such as the δ-two-stream approximation. In most weather prediction models these parameterisation schemes are therefore called infrequently, accepting additional model error due to the persistence assumption between calls. This paper...
Article
Using three cloud generators, three-dimensional (3D) cloud fields are reproduced from microphysical cloud data measured in situ by aircraft. The generated cloud fields are used as input to a 3D radiative transfer model to calculate the corresponding fields of downward and upward irradiance, which are then compared with airborne and ground-based rad...
Article
Most natural complex systems are characterised by variability on a large range of temporal and spatial scales. The two main methodologies to generate such structures are Fourier/FARIMA based algorithms and multifractal methods. The former is restricted to Gaussian data, whereas the latter requires the structure to be self-similar. This work will present so...
Article
Full-text available
In this study, the statistical properties of a range of measurements are compared with those of their surrogate time series. Seven different records are studied, amongst others, historical time series of mean daily temperature, daily rain sums and runoff from two rivers, and cloud measurements. Seven different algorithms are used to generate the su...
Article
Full-text available
A stochastic version of the Iterative Amplitude Adjusted Fourier Transform (IAAFT) algorithm is presented. This algorithm is able to generate so-called surrogate time series, which have the amplitude distribution and the power spectrum of measured time series or fields. The key difference between the new algorithm and the original IAAFT method is t...
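The original (non-stochastic) IAAFT iteration that this abstract builds on is compact enough to sketch: alternately impose the Fourier amplitudes of the measured series and rank-map back onto its sorted values. This is a minimal textbook version, not the stochastic variant introduced in the paper; the test signal is synthetic.

```python
import numpy as np

rng = np.random.default_rng(5)

def iaaft(x, iterations=100, rng=rng):
    """Surrogate with the power spectrum and amplitude distribution of x.

    Minimal Iterative Amplitude Adjusted Fourier Transform: alternately
    impose the Fourier amplitudes of x and rank-map onto its sorted values.
    """
    sorted_x = np.sort(x)
    target_amp = np.abs(np.fft.rfft(x))
    s = rng.permutation(x)  # start from a random shuffle of the data
    for _ in range(iterations):
        # Impose the target power spectrum, keeping the current phases.
        spec = np.fft.rfft(s)
        phases = np.angle(spec)
        s = np.fft.irfft(target_amp * np.exp(1j * phases), n=len(x))
        # Impose the target amplitude distribution by rank ordering.
        ranks = np.argsort(np.argsort(s))
        s = sorted_x[ranks]
    return s

x = np.sin(np.linspace(0, 20, 256)) + rng.normal(0, 0.3, 256)
s = iaaft(x)
```

Ending on the rank-ordering step makes the surrogate's value distribution exactly that of the data, at the cost of a small residual spectral mismatch that shrinks over the iterations.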
Article
Full-text available
A new method of generating two-dimensional and three-dimensional cloud fields is presented; the generated fields share several important statistical properties with real measured cloud fields. Well-known algorithms such as the Fourier method and the Bounded Cascade method generate fields with a specified Fourier spectrum. The new iterative method allows for...
Article
Full-text available
Clouds affect our daily life in many ways. They dominate our perception of weather and, thus, have an enormous influence on our everyday activities and our health. This fact is completely at odds with our knowledge about clouds, their representation in climate and weather forecast models, and our ability to predict clouds. It is their high variabil...
Article
Full-text available
Clouds cause uncertainties in the determination of climate sensitivity to either natural or anthropogenic changes. Furthermore, clouds dominate our perception of the weather, and the relatively poor forecast of cloud and precipitation parameters in numerical weather prediction (NWP) models is striking. In order to improve modeling and forecasting o...
Article
Full-text available
This paper describes two new methods to generate 2D and 3D cloud fields based on 1D and 2D ground-based profiler measurements. These cloud fields share desired statistical properties with real cloud fields. As they are, however, similar to but not the same as real clouds, we call them surrogate clouds. One important advantage of the new methods is th...
Article
Using only lidar or radar, an accurate cloud boundary height estimate is often not possible. The combination of lidar and radar can give a reliable cloud boundary estimate in a much broader range of cases. However, even this combination with standard methods still cannot measure the cloud boundaries in all cases. This will be illustrated with data...
Article
Full-text available
To understand and model the radiative transport in a cloudy atmosphere, information on the cloud structure, optical properties and microphysics is indispensable. In order to obtain a complete data set, four national institutes joined their efforts in the CLARA project. After the start of the preparations for the first campaign, the total number of...
Article
High resolution spectroscopy of the oxygen A-band (760-780 nm) in zenith-scattered light is a powerful tool to infer path length distributions (PDFs) of solar photons transmitted to the ground. Solar photon PDFs may provide information on multiple scattering statistics of cloudy-sky radiative transport (RT), in particular when the method is com...
Article
The 4D-clouds project aims at capturing the radiative influence of inhomogeneous clouds and at implementing these influences in the modelling of transport and exchange processes in dynamical atmospheric models. The measurement component of this project was executed together with the EU-project CLIWA-NET in the Baltex Bridge Campaign (BBC), which wa...
Article
Full-text available
For 3D radiative transfer calculations in the cloudy atmosphere, one needs 3-dimensional cloud fields as input data. One can use cloud models for this, e.g. LES models with cloud physics. However, these LES clouds are not easily usable together with empirical data, as it is hard to model a cloud that is really similar to the observed one. Statistic...
Article
Full-text available