
Investigation of methods for hydroclimatic data homogenization

Authors: E. Steirou and D. Koutsoyiannis

Abstract

We investigate the methods used for the adjustment of inhomogeneities of temperature time series covering the last 100 years. Based on a systematic study of the scientific literature, we classify and evaluate the observed inhomogeneities in historical and modern time series, as well as their adjustment methods. It turns out that these methods are mainly statistical, not well justified by experiments, and rarely supported by metadata. In many of the cases studied, the proposed corrections are not even statistically significant. From the global database GHCN-Monthly Version 2, we examine all stations containing both raw and adjusted data that satisfy certain criteria of continuity and distribution over the globe. In the United States, because of the large number of available stations, stations were chosen by a suitable sampling procedure. In total we analyzed 181 stations globally. For these stations we calculated the differences between the adjusted and non-adjusted linear 100-year trends. It was found that in two thirds of the cases the homogenization procedure increased positive or decreased negative temperature trends. One of the most common homogenization methods, 'SNHT for single shifts', was applied to synthetic time series with selected statistical characteristics, occasionally with offsets. The method was satisfactory when applied to independent, normally distributed data, but not to data with long-term persistence. The above results cast doubt on the use of homogenization procedures and tend to indicate that the global temperature increase during the last century is between 0.4°C and 0.7°C, where these two values are the estimates derived from raw and adjusted data, respectively.
Investigation of methods for hydroclimatic data homogenization
E. Steirou and D. Koutsoyiannis
Department of Water Resources and Environmental Engineering, National Technical University of Athens, Greece
Presentation available online: itia.ntua.gr/1212
European Geosciences Union General Assembly 2012, Vienna, Austria, 22-27 April 2012
Session HS7.4/AS4.17/CL2.10: Climate, Hydrology and Water Infrastructure
Temperature increase during the last century
The dominant view concerning climate change is summarised in the IPCC (Intergovernmental Panel on Climate Change) Assessment Reports.
Fourth Assessment Report (2007): a non-uniform but clear temperature increase of 0.6-0.7°C is estimated over the last hundred years.
Estimates are not based on raw data but on data adjusted in order to remove errors.
[Figure: different estimates of global temperature changes, 1840-2000, in °C (IPCC, 2007)]
The problem
Historical and contemporary climatic time series contain inhomogeneities: errors introduced by changes of instruments, location, etc.
The homogenization of climatic time series is performed mainly with statistical methods that identify and correct recorded and unrecorded inhomogeneities, and it is a subject of debate.
[Figure: De Bilt station, The Netherlands: raw vs. homogenized (adjusted) data; source: GHCN-Monthly Version 2, aggregated to annual]
The difference between the trends of the raw and the homogenized data is often very large.
Aim of our work
1. To classify and evaluate the observed inhomogeneities in historical and modern
time series, as well as their adjustment methods.
2. To investigate if and how the homogenization procedure affected temperature
trends worldwide.
3. To investigate the behaviour of common homogenization methods, when applied
to synthetic time series with specified statistical characteristics.
In this presentation we focus on points 2 and 3.
Inhomogeneities
Different types (shifts, trends, outliers).
Different causes (thermometer/recording errors, changes in measurement conditions, differences in observational hours and in the methods used to calculate the mean temperature).
[Photos: changes of instrument shelters in the USA in the 1980s. A: initial wooden Cotton Region Shelters; B: modern plastic shelters]
[Figure: discontinuities in the air-temperature time series at the National Observatory of Athens: instrument change in June 1995; calibration of the new thermometer in January 1997]
Homogenization methods
The homogenization procedure usually consists of three basic steps:
1. Removal of outliers: values outside a specified range are usually rejected.
2. Corrections to account for the different data/methods used to estimate mean daily temperatures, and corrections for recorded changes of measurement conditions.
3. Application of statistical methods to remove shifts or false trends, identified either in a single time series (absolute methods) or by comparison of a "candidate" time series to one or more "reference" time series (relative methods, the more common approach).
A common assumption of homogenization methods is that temperature data (and hydroclimatic data in general) are independent and normally distributed.
Relative methods require high statistical correlation between candidate and reference series.
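To make the relative approach concrete, here is a minimal Python sketch (our illustration, not code from the study): a shift in the candidate series stands out in the candidate-minus-reference difference series, because the shared regional climate signal cancels out. All names and numbers are illustrative assumptions.

```python
import numpy as np

# Toy data: a shared regional signal plus independent station noise,
# with an artificial 0.5 degree shift in the candidate after year 60.
rng = np.random.default_rng(42)
regional = rng.normal(10.0, 1.0, 100)        # shared climate signal
candidate = regional + rng.normal(0, 0.3, 100)
reference = regional + rng.normal(0, 0.3, 100)
candidate[60:] += 0.5                        # artificial inhomogeneity

# Relative method, step 3: the difference series removes the common
# signal, so the shift becomes visible against much smaller noise.
q = candidate - reference
print(q[:60].mean(), q[60:].mean())          # means differ by ~0.5
```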
Discussion on the homogenization (1)
Homogenization results are usually not supported by metadata or experiments (a known exception in the literature is the experiment at the Kremsmünster Monastery, Austria).
Example: the change of thermometers and shelters in the USA in the 1980s (Quayle et al., 1991). Not a single case is available of an old and a new observation station running together for some time so that the results could be tested! On the contrary, comparison and correction were made using statistics of remote (statistically correlated) stations.
[Map: two neighbouring candidate stations corrected based on two groups of reference stations located at distances of hundreds of km (Quayle et al., 1991)]
Discussion on the homogenization (2)
Homogenization methods do not take into consideration some characteristics of hydroclimatic data (long-term persistence, microclimatic changes, time lags).
Some detected inhomogeneities are statistically non-significant and can lead to false corrections.
Inhomogeneities that do not reflect systematic instrumentation changes in a specific period are expected to have a random character, not introducing a consistent bias in long time series that would need to be corrected.
Example: adjustments of daily summer maximum temperatures in the Greater Mediterranean Region (Kuglitsch et al., 2009):

Region                   Correction
Western Mediterranean    +0.03 ± 0.38 °C
Central Mediterranean    +0.16 ± 0.52 °C
Eastern Mediterranean    +0.19 ± 0.30 °C

Corrections may introduce bigger errors than the errors they try to remove.
Evaluation of homogenization results
Data selection: from the total number of stations in the database GHCN-Monthly Version 2 we examined 163 stations worldwide satisfying certain criteria (a sketch of the selection filter follows below):
They have both raw and adjusted data.
Each time series contains ≥ 100 years of data.
Each time series contains ≤ 4 successive missing values.
In each time series the percentage of missing years does not exceed 10%.
Time series end at or later than 1990.
In the USA, due to the large number of stations satisfying the criteria, we divided the region into 7 sections and selected a number of stations in proportion to their area.

Region           Stations
Africa                  3
Europe                 44
Asia                   40
South America           5
North America          54
Oceania                17
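A minimal sketch of this selection filter in Python; the array layout and the reading of "successive missing values" as missing years are our assumptions (the criterion may instead refer to the monthly series):

```python
import numpy as np

def passes_criteria(values, years):
    """values: annual series with np.nan for missing years (assumed
    layout); years: matching, sorted array of year numbers. Having
    both raw and adjusted data is assumed to be checked upstream."""
    if len(values) < 100 or years[-1] < 1990:
        return False                     # >= 100 years, ending 1990+
    missing = np.isnan(values)
    if missing.mean() > 0.10:            # <= 10% missing years
        return False
    run = longest = 0
    for m in missing:                    # longest run of gaps
        run = run + 1 if m else 0
        longest = max(longest, run)
    return longest <= 4                  # <= 4 successive missing values
```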
Data analysis
We calculated annual values from monthly values (a year with more than 4 missing months in total, or 3 consecutive missing months, was considered 'missing').
We calculated linear trends for both raw and adjusted data.
We calculated the Hurst coefficient in two cases of stations with a large difference between the trends of raw and adjusted data.
(The annual aggregation and trend computation are sketched below.)
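A minimal Python sketch of these computations under the rules stated above; the function names and the layout of the monthly data are our assumptions:

```python
import numpy as np

def annual_series(monthly):
    """monthly: array of shape (n_years, 12), np.nan for gaps.
    A year is 'missing' if it has more than 4 missing months in
    total or 3 consecutive missing months (the rule stated above)."""
    out = np.full(monthly.shape[0], np.nan)
    for i, row in enumerate(monthly):
        miss = np.isnan(row)
        run = longest = 0
        for m in miss:
            run = run + 1 if m else 0
            longest = max(longest, run)
        if miss.sum() <= 4 and longest < 3:
            out[i] = np.nanmean(row)
    return out

def linear_trend(values, years):
    """Least-squares slope, expressed per 100 years, skipping gaps."""
    ok = ~np.isnan(values)
    return 100.0 * np.polyfit(years[ok], values[ok], 1)[0]
```

The trend difference reported below is then simply `linear_trend(adjusted, years) - linear_trend(raw, years)` for each station.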
[Figure: Sulina station, Romania, 1860-2000: raw and adjusted data; source: GHCN-Monthly Version 2]
The Hurst coefficient increased because the homogenization increased the trend of the time series.
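The presentation does not state which Hurst-coefficient estimator was used; for reference, a common choice is the aggregated-variance (climacogram) method sketched below. Note that a strong trend inflates the estimate, consistent with the remark above.

```python
import numpy as np

def hurst_aggvar(x, scales=(1, 2, 4, 8, 16)):
    """Aggregated-variance estimate of H: for fGn the variance of
    k-term means scales as k**(2H - 2), so H is recovered from the
    slope of the log-log plot of variance versus scale."""
    x = np.asarray(x, float)
    logk, logv = [], []
    for k in scales:
        n = len(x) // k
        means = x[:n * k].reshape(n, k).mean(axis=1)
        logk.append(np.log(k))
        logv.append(np.log(means.var(ddof=1)))
    slope = np.polyfit(logk, logv, 1)[0]
    return 1.0 + slope / 2.0
```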
Results
In 2/3 of the stations examined, the homogenization procedure increased positive temperature trends, decreased negative trends or changed negative trends to positive; the expected proportion would be 1/2.
[Pie chart: trend difference due to homogenization, increase vs. decrease]
Homogenization has amplified the estimation of global temperature increase:

Global temperature increase (from the examined series)
Raw data         0.42°C
Adjusted data    0.76°C
Evaluation of the SNHT performance
The Standard Normal Homogeneity Test (SNHT) for single shifts is one of the most common homogenization methods for temperature data (it is used in GHCN Version 3); a version of the method is used for precipitation data.
A time series Q is formed as a function of the candidate (tested) time series Y and a number of reference time series Xj. The time series Q is normalised to a time series Z. The test computes a statistic T(a) which takes its maximum value at the point of a shift.
[Figure: a simplified time series Q with an offset, and the test statistic T(a) peaking at the inhomogeneity point]
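For concreteness, a minimal Python sketch of the single-shift SNHT statistic described above, applied to an already formed comparison series; the function name and loop are ours, and the critical values against which T_max is judged (they depend on series length) are not reproduced here:

```python
import numpy as np

def snht_single_shift(q):
    """Standardize q, then scan every split point a and compute
    T(a) = a*z1**2 + (n - a)*z2**2, where z1 and z2 are the means of
    the standardized series before and after the split. T(a) peaks
    at the most likely shift point; T_max is compared with tabulated
    critical values to decide whether a shift is declared."""
    z = (q - q.mean()) / q.std(ddof=1)
    n = len(z)
    best_a, best_T = 0, -np.inf
    for a in range(1, n):
        z1, z2 = z[:a].mean(), z[a:].mean()
        T = a * z1**2 + (n - a) * z2**2
        if T > best_T:
            best_a, best_T = a, T
    return best_a, best_T
```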
SNHT for single shifts
We created two time series X and Y, each containing 100 elements, and a time series W = κX + λY as a linear function of them.
Time series X, Y: μ = 0 and σ = 1.
Data with long-term persistence: H = 0.85, generated with the SMA model (Koutsoyiannis, 2000).
The coefficients κ, λ were calculated so that ρWY = 0.9 and σW = 1.
W serves as the candidate series and Y as the reference series.
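The values of κ and λ follow from elementary variance algebra; the short derivation below assumes X and Y are generated independently of each other (our reading of the construction):

```latex
% With W = \kappa X + \lambda Y, \ \sigma_X = \sigma_Y = 1, and X, Y independent:
\sigma_W^2 = \kappa^2 + \lambda^2, \qquad
\rho_{WY} = \frac{\operatorname{Cov}(W,Y)}{\sigma_W \sigma_Y} = \frac{\lambda}{\sigma_W}
% Imposing \sigma_W = 1 and \rho_{WY} = 0.9:
\lambda = 0.9, \qquad \kappa = \sqrt{1 - 0.9^2} = \sqrt{0.19} \approx 0.44
```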
The method was applied in three different cases of synthetic time series (one way to generate such series is sketched after this list):
1. independent, normally distributed data with a shift
2. homogeneous data with long-term persistence
3. data with long-term persistence and a shift
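A minimal Python sketch of the synthetic setup. For the long-term persistence cases we substitute a simple Cholesky-based fractional Gaussian noise generator for the SMA scheme of Koutsoyiannis (2000) actually used in the study; both target the same autocorrelation structure, but this is an illustrative stand-in, not the authors' code.

```python
import numpy as np

def fgn(n, H, rng):
    """Fractional Gaussian noise (unit variance) via Cholesky
    decomposition of its autocovariance matrix:
    rho_k = 0.5*(|k+1|**(2H) - 2*|k|**(2H) + |k-1|**(2H))."""
    k = np.arange(n)
    rho = 0.5 * (np.abs(k + 1)**(2*H) - 2*np.abs(k)**(2*H)
                 + np.abs(k - 1)**(2*H))
    C = rho[np.abs(k[:, None] - k[None, :])]   # Toeplitz covariance
    return np.linalg.cholesky(C) @ rng.standard_normal(n)

rng = np.random.default_rng(1)
X, Y = fgn(100, 0.85, rng), fgn(100, 0.85, rng)
lam, kap = 0.9, np.sqrt(1 - 0.9**2)   # from the derivation above
W = kap * X + lam * Y                 # candidate; Y is the reference
W[40:] += 0.5                         # case 3: a shift after time 40
```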
1. Independent, normally distributed data with a shift
We induced a shift of 0.5°C to the candidate time series. SNHT located and corrected the shift of 0.5°C, and the original trend of the time series was recovered.
[Figure: the original and adjusted candidate series W; after the correction the time series is considered homogeneous]
SNHT seems to be satisfactory when applied to independent, normally distributed data.
2. Homogeneous data with long-term persistence
The method detected two false (non-existing) inhomogeneities: the time series was corrected in two steps even though it was already homogeneous.

Time series          Hurst coef.
W (initial)          0.76
W (1st correction)   0.88
W (2nd correction)   0.86

The observed increase of the Hurst coefficient is caused by the increase of the trend of the time series. The homogenization changed the trend of the time series.
[Figure: the successive correction steps applied to the homogeneous series W]
3. Data with long-term persistence and a shift
We induced a shift of 0.5°C after time 40 and applied the homogenization method repeatedly until a homogeneous time series was derived:
1st step: false inhomogeneity
2nd step: real inhomogeneity
3rd step: false inhomogeneity
4th step: false inhomogeneity
The homogenization changed the trend of the time series; the statistical characteristics became similar to those of the homogenized time series of the previous example.
[Figure: five panels showing the successive correction steps, with the one real and three false inhomogeneities marked; after the final step the time series is considered homogeneous]
SNHT does not seem to behave satisfactorily when applied to data with long-term persistence.
Conclusions
1. Homogenization is necessary to remove errors introduced in climatic time series.
2. Homogenization practices used to date are mainly statistical, not well justified by experiments, and rarely supported by metadata. It can be argued that they often lead to false results: natural features of hydroclimatic time series are regarded as errors and are adjusted.
3. While homogenization would be expected to increase or decrease the existing multiyear trends in roughly equal proportions, in 2/3 of the cases the trends increased after homogenization.
4. The above results cast doubt on the use of homogenization procedures and tend to indicate that the global temperature increase during the last century is smaller than 0.7-0.8°C.
5. A new approach to homogenization is needed, based on experiments, metadata and a better understanding of the stochastic characteristics of hydroclimatic time series.
References
Alexandersson, H., Moberg, A. (1997) 'Homogenization of Swedish temperature data. Part I: Homogeneity test for linear trends', International Journal of Climatology, 17, 25-34.
Founda, D., Kambezidis, H.D., Petrakis, M., Zanis, P., Zerefos, C. (2009) 'A correction of the recent air-temperature record at the historical meteorological station of the National Observatory of Athens (NOA) due to instrument change', Theoretical and Applied Climatology, 97 (3-4), 385-389.
IPCC (2007) Summary for Policymakers, Climate Change 2007: The Physical Science Basis. Contribution of Working Group I to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change, Cambridge: Cambridge University Press.
Koutsoyiannis, D. (2000) 'A generalized mathematical framework for stochastic simulation and forecast of hydrologic time series', Water Resources Research, 36 (6), 1519-1533.
Kuglitsch, F.G., Toreti, A., Xoplaki, E., Della-Marta, P.M., Luterbacher, J., Wanner, H. (2009) 'Homogenization of daily maximum temperature series in the Mediterranean', Journal of Geophysical Research D: Atmospheres, 114, D15108.
Quayle, R.G., Easterling, D.R., Karl, T.R., Hughes, P.Y. (1991) 'Effects of recent thermometer changes in the Cooperative Station Network', Bulletin of the American Meteorological Society, 72, 1718-1723.