Article

A semi-parametric model for multivariate extreme values

Abstract

Threshold methods for multivariate extreme values are based on the use of asymptotically justified approximations of both the marginal distributions and the dependence structure in the joint tail. Models derived from these approximations are fitted to a region of the observed joint tail which is determined by suitably chosen high thresholds. A drawback of the existing methods is the necessity for the same thresholds to be taken for the convergence of both marginal and dependence aspects, which can result in inefficient estimation. In this paper an extension of the existing models, which removes this constraint, is proposed. The resulting model is semi-parametric and requires computationally intensive techniques for likelihood evaluation. The methods are illustrated using a coastal engineering application.
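As background for the abstract above, threshold approaches of this type typically use a semi-parametric marginal model: the empirical distribution function below a marginal threshold and a generalized Pareto tail above it. The display below is a generic sketch of that standard construction (the notation u_j, lambda_j, sigma_j, xi_j is illustrative and not taken from the paper):

```latex
\[
\hat F_j(x) =
\begin{cases}
\tilde F_j(x), & x \le u_j,\\[4pt]
1 - \lambda_j \left(1 + \xi_j \dfrac{x - u_j}{\sigma_j}\right)_{+}^{-1/\xi_j}, & x > u_j,
\end{cases}
\qquad \lambda_j = 1 - \tilde F_j(u_j),
\]
```

where \tilde F_j denotes the empirical distribution function of the j-th margin. The extension described in the abstract removes the constraint that these marginal thresholds must coincide, on the quantile scale, with the threshold used for the dependence structure.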


... Simultaneous estimation of all the parameters is then possible and is illustrated later. The related but more complicated scheme of Dixon and Tawn (1995) can be used for cases where separate thresholds are needed for the marginal and dependence modelling. ...
... By construction, our approach ensures that the thresholds that are used in specifying the domain of the GPD coincide on the quantile scale with the threshold u that defines the domain of the dependence structure. This may not always be appropriate and in such cases the techniques of Dixon and Tawn (1995), which allow these thresholds to be separately specified, may be of use. The basis of our estimation is as follows: since the bivariate random variable (X, Y) ≡ (−1/log{F_1(X*)}, −1/log{F_2(Y*)}) has unit-Fréchet-distributed margins and u = −1/log{F_1(u_1)} = −1/log{F_2(u_2)}, we model the joint survivor function of (X*, Y*) for x* > u_1 and y* > u_2 as ...
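As a rough illustration of the probability integral transform described in this excerpt, the sketch below maps two margins to the unit-Fréchet scale via t = −1/log F, using empirical distribution functions as stand-ins for F_1 and F_2. Function names, the plotting-position choice and the simulated data are assumptions for illustration, not details from the cited papers.

```python
import numpy as np

def empirical_cdf(sample):
    """Empirical CDF with the rank/(n + 1) plotting position, so that
    transformed values remain finite at the sample extremes."""
    sorted_x = np.sort(sample)
    n = len(sample)
    def F(x):
        return np.searchsorted(sorted_x, x, side="right") / (n + 1.0)
    return F

def to_unit_frechet(x, F):
    """Map observations to the unit-Frechet scale: z = -1 / log F(x)."""
    return -1.0 / np.log(F(x))

rng = np.random.default_rng(1)
x_star = rng.gumbel(size=500)            # margin 1 (illustrative data)
y_star = rng.gumbel(size=500)            # margin 2 (illustrative data)
X = to_unit_frechet(x_star, empirical_cdf(x_star))
Y = to_unit_frechet(y_star, empirical_cdf(y_star))
# X and Y now have (approximately) unit-Frechet margins; a dependence model
# of the kind discussed in the excerpt would be fitted to the joint tail of (X, Y).
```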
Article
A fundamental issue in applied multivariate extreme value analysis is modelling dependence within joint tail regions. The primary focus of this work is to extend the classical pseudopolar treatment of multivariate extremes to develop an asymptotically motivated representation of extremal dependence that also encompasses asymptotic independence. Starting with the usual mild bivariate regular variation assumptions that underpin the coefficient of tail dependence as a measure of extremal dependence, our main result is a characterization of the limiting structure of the joint survivor function in terms of an essentially arbitrary non-negative measure that must satisfy some mild constraints. We then construct parametric models from this new class and study in detail one example that accommodates asymptotic dependence, asymptotic independence and asymmetry within a straightforward parsimonious parameterization. We provide a fast simulation algorithm for this example and detail likelihood-based inference including tests for asymptotic dependence and symmetry which are useful for submodel selection. We illustrate this model by application to both simulated and real data. In contrast with the classical multivariate extreme value approach, which concentrates on the limiting distribution of normalized componentwise maxima, our framework focuses directly on the structure of the limiting joint survivor function and provides significant extensions of both the theoretical and the practical tools that are available for joint tail modelling. Copyright (c) 2009 Royal Statistical Society.
... The approximation entails that the marginal distributions of excesses over corresponding thresholds are modeled by a generalized Pareto distribution (GPD) and the dependence structure between the two margins is modeled by a bivariate asymmetric logistic copula. Marginal and dependence considerations sometimes generate different thresholds (Dixon and Tawn, 1995); thus, the optimal thresholds should be selected with caution. In search of the optimal threshold for a GPD model, it is known that the shape parameter does not change with the threshold, and the scale parameter has a close relationship with the threshold, that is, σ_0 = σ + ξ(u_0 − u), where u_0 and σ_0 are the new threshold and scale parameter, respectively. ...
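Written out in standard notation, the threshold-stability property alluded to above says that if excesses of a threshold u follow a GPD with shape ξ and scale σ_u, then excesses of any higher threshold u_0 > u are again GPD with the same shape and a linearly adjusted scale:

```latex
\[
\xi_{u_0} = \xi, \qquad \sigma_{u_0} = \sigma_u + \xi\,(u_0 - u).
\]
```

In practice this is the basis of threshold-choice diagnostics: plots of the estimated shape and of the modified scale σ_u − ξ u against candidate thresholds should look roughly constant above a suitable threshold.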
Article
Multi-lane factor (MLF) is a probability reduction reflecting unfavorable traffic loads over multiple lanes acting simultaneously on the most adverse position of a bridge. It is one of the key components of traffic load models for bridges. The most recent research established a multi-coefficient MLF model that clearly illustrated the lane load disparity and the probability reduction of their simultaneous actions. However, it used the block maxima (BM) method for extreme value modeling, which requires a large amount of traffic data. This study aims to adopt the peaks-over-threshold (POT) method to obtain more information from short-term traffic data and model the extreme coincident lane load effects (LLEs) for multi-coefficient MLF calibration. First, the multi-coefficient MLF model was reviewed. Thereafter, the bivariate POT method for coincident LLE modeling using the generalized Pareto distribution was proposed and formulated. Critical issues such as bivariate threshold selection and parameter estimation were addressed. Numerical examples were presented to verify and validate the approach. Finally, the proposed approach was applied to calibrate the MLF of an experimental site with four traffic lanes. The results indicated that coincident LLE modeling using the POT approach was accurate and more effective than the BM method when applied to limited data. The calibrated MLFs from the experimental site effectively revealed the lane load disparity of traffic loads over multiple lanes, which is not reflected in the traffic load models of current bridge design specifications. Furthermore, the influence of other issues, such as weight restriction, on coincident LLE modeling and MLF calibration was discussed. The proposed technique provides a sound approach for multi-coefficient MLF calibration in bridge assessment with short-term site-specific traffic data.
... The approximation entails that the marginal distributions of excesses over corresponding thresholds are modeled by a GP distribution, while the dependence structure between the two margins is modeled by that of a bivariate extreme value distribution. Marginal and dependence considerations sometimes generate different thresholds (Dixon and Tawn, 1995), thus the optimal thresholds should be selected with caution. There is no universal approach to determine such optimal thresholds. ...
Article
Full-text available
Surrogate safety measures have been advocated as a complementary approach to study safety from a broader perspective than relying on crash data alone. This study proposes an approach to incorporate different surrogate safety measures in a unified framework for road safety estimation within the bivariate extreme value theory framework. The model structure, model specification, threshold selection method, and parameter estimation method of the bivariate threshold excess model are introduced. Two surrogate safety measures, post encroachment time (PET) and length proportion of merging (LPM), are chosen to characterize the severity of merging events on freeway entrance merging areas. Based on field data collected along Highway 417 in the City of Ottawa, Ontario, Canada, bivariate modelling methods with seven distribution functions are applied and compared, and the model with the logistic distribution function is selected as the best model. The best bivariate models' estimation results are then evaluated by comparing them to their two marginal (univariate generalized Pareto distribution) models. The results show that the bivariate models tend to generate crash estimates that are much closer to observed crashes than univariate models. A more important finding is that incorporating two surrogate safety measures into the bivariate models can significantly reduce the uncertainty of crash estimates. The efficiency of a bivariate model is not evidently better than that of either of its marginal models, but it is expected to improve with data from a prolonged observation period. This study is also a step forward in the direction of developing multivariate safety hierarchy models, since models of the safety hierarchy have been predominantly univariate.
... 20th and 21st century) sea-level rise that has resulted in increased overtopping events (see Dawson et al., 2016), as the distance between mean sea level and the crest of the defences is gradually reduced. Furthermore, based on observations and analysis, current projections of future sea-level rise will result in further increases in the frequency of these events (Dixon and Tawn, 1995; Haigh et al., 2011). This will result in higher associated repair costs to the network operator, as well as disruption to passenger travel and the prospect of the southwest region of England being left periodically without a main railway line. This study utilises estimates of the impacts of future SLR that incorporate impact/cost data of track incidents derived from an empirically based trend that is extrapolated forward using projections of future SLR (Lowe et al., 2009; Dawson et al., 2016). ...
Article
Full-text available
Traditional methods of investment appraisal have been criticized in the context of climate change adaptation. Economic assessment of adaptation options needs to explicitly incorporate the uncertainty of future climate conditions and should recognise that uncertainties may diminish over time as a result of improved understanding and learning. Real options analysis (ROA) is an appraisal tool developed to incorporate concepts of flexibility and learning that relies on probabilistic data to characterise uncertainties. It is also a relatively resource-intensive decision support tool. We test whether, and to what extent, learning can result from the use of successive generations of real-life climate scenarios, and how non-probabilistic uncertainties can be handled by adapting the principles of ROA in coastal economic adaptation decisions. Using a relatively simple form of ROA on a vulnerable piece of coastal rail infrastructure in the United Kingdom, and two successive UK climate assessments, we estimate the values associated with utilising updated information on sea-level rise. The value of learning can be compared with the capital cost of the adaptation investment, and may be used to illustrate the potential scale of the value of learning in coastal protection and other adaptation contexts.
... In the British context, it is estimated that coastal defence structures protect around 1200 km (roughly one third) of the English and Welsh coastlines, with a particular concentration in southern England (Environment Agency, 1999; Hall et al., 2006; de la Vega-Leinert and Nicholls, 2008). Defence structures are built to a design standard based on the statistical return period of extreme water levels (1 in 50 years, 1 in 200 years, etc.), but it is estimated that even small changes in sea level, of the order of centimetres, can have a significant effect on these return periods and the future probability of coastal flooding (Dixon and Tawn, 1995; Gehrels, 2006; Church et al., 2008; Haigh et al., 2011). It is thus unsurprising that future sea-level projections provide an important tool for strategic coastal planning (Hall et al., 2006; Nicholls et al., 2013). ...
Article
Full-text available
Future climate change is likely to increase the frequency of coastal storms and floods, with major consequences for coastal transport infrastructure. This paper assesses the extent to which projected sea-level rise is likely to impact upon the functioning of the Dawlish to Teignmouth stretch of the London to Penzance railway line, in England. Using a semi-empirical modelling approach, we identify a relationship between sea-level change and rail incidents over the last 150 years and then use model-based sea-level predictions to extrapolate this relationship into the future. We find that days with line restrictions (DLRs) look set to increase by up to 1170%, to as many as 84–120 per year, by 2100 in a high sea-level rise scenario (0.55–0.81 m). Increased costs to the railway industry deriving from maintenance and line restrictions will be small (£ millions) in comparison with damage caused by individual extreme events (£10s of millions), while the costs of diversion of the railway are higher still (£100s of millions to billions). Socio-economic costs to the region are likely to be significant although they are more difficult to estimate accurately. Finally, we explain how our methodology is applicable to vulnerable coastal transport infrastructure worldwide.
... In the federal state of Schleswig-Holstein, the latest policy is to statistically derive design water levels associated with a 200-year return period using an extreme value analysis of the largest value per year superimposed with a projected mean sea-level rise (LKN, 2012). In England, estimates of extreme water level probabilities used to be determined for different stretches of the coastline by the different Environment Agency (EA) regional departments responsible for that area. However, on behalf of the EA, Dixon and Tawn (1994, 1995, 1997) provided a single coherent estimate of extreme still water level probabilities at high resolution all around the UK coastline using their Spatially Revised Joint Probability Method, which was based on both tide gauge data and a multi-decadal predicted water level hindcast. A major update of that study has recently been completed (Environment Agency, 2011; Batstone et al., 2013), which improved the basic statistical assumptions (resulting in the Skew Surge Joint Probability Method) and used longer tide gauge records that are now available. ...
Article
Over the past five decades, several approaches for estimating probabilities of extreme still water levels have been developed. Currently, different methods are applied not only on transnational, but also on national scales, resulting in a heterogeneous level of protection. Applying different statistical methods can yield significantly different estimates of return water levels, but even the use of the same technique can produce large discrepancies, because there is subjective parameter choice at several steps in the model setup. In this paper, we compare probabilities of extreme still water levels estimated using the main direct methods (i.e. the block maxima method and the peaks over threshold method), considering a wide range of strategies to create extreme value datasets and a range of different model setups. We primarily use tide gauge records from the German Bight but also consider data from sites around the UK and Australia for comparison. The focus is on testing the influence of the following three main factors, which can affect the estimates of extreme value statistics: (1) detrending the original data sets; (2) building samples of extreme values from the original data sets; and (3) the record lengths of the original data sets. We find that using different detrending techniques biases the results from extreme value statistics. Hence, we recommend using a 1-year moving average of high waters (or hourly records if these are available) to correct the original data sets for seasonal and long-term sea level changes. Our results highlight that the peaks over threshold method yields more reliable and more stable (i.e. using short records leads to the same results as using long records) estimates of probabilities of extreme still water levels than the block maxima method. In analysing a variety of threshold selection methods, we find that using the 99.7th percentile water level leads to the most stable return water level estimates along the German Bight. This is also valid for the international stations considered. Finally, to provide guidance for coastal engineers and operators, we recommend the peaks over threshold method and define an objective approach for setting up the model. If this is applied routinely around a country, it will help overcome the problem of heterogeneous levels of protection resulting from different methods and varying model setups.
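A minimal sketch of the peaks-over-threshold recipe recommended in this abstract, with the threshold set at a high percentile of the series and a GPD fitted to the exceedances. The percentile follows the abstract's 99.7th-percentile recommendation, but the data handling (detrending, declustering into independent events) is omitted and the event-rate default is purely illustrative.

```python
import numpy as np
from scipy.stats import genpareto

def pot_return_level(levels, pct=99.7, return_period_years=200.0,
                     events_per_year=706.0):
    """Fit a GPD to exceedances of a percentile threshold and return the
    level with the given return period.  `events_per_year` is the average
    number of independent high-water observations per year (illustrative
    default).  Assumes a non-zero fitted shape parameter."""
    u = np.percentile(levels, pct)
    excesses = levels[levels > u] - u
    xi, _, sigma = genpareto.fit(excesses, floc=0.0)   # shape, loc (fixed), scale
    zeta_u = excesses.size / levels.size               # exceedance probability of u
    m = return_period_years * events_per_year          # observations per return period
    return u + (sigma / xi) * ((m * zeta_u) ** xi - 1.0)
```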
Article
Floods are costly to global economies and can be exceptionally lethal. The ability to produce consistent flood hazard maps over large areas could provide a significant contribution to reducing such losses, as the lack of knowledge concerning flood risk is a major factor in the transformation of river floods into flood disasters. In order to accurately reproduce flooding in river channels and floodplains, high spatial resolution hydrodynamic models are needed. Despite being computationally expensive, recent advances have made their continental to global implementation feasible, although inputs for long-term simulations may require the use of reanalysis meteorological products, especially in data-poor regions. We employ a coupled hydrologic/hydrodynamic model cascade forced by the 20CRv2 reanalysis dataset and evaluate its ability to reproduce flood inundation area and volume for Australia during the 1973-2012 period. Ensemble simulations using the reanalysis data were performed to account for uncertainty in the meteorology, and compared with a validated benchmark simulation. Results show that the reanalysis ensemble captures the inundated areas and volumes relatively well, with correlations for the ensemble mean of 0.82 and 0.85 for area and volume respectively, although the meteorological ensemble spread propagates into large uncertainty in the simulated flood characteristics.
Article
Our case study focuses on Milan. Italian law specifies strict guidelines for the permissibility of high levels of a variety of air pollutants in cities. In Milan, a highly sophisticated network of recording stations has been constructed to monitor pollutant levels. The aim of this paper is to obtain a summary of the temporal behaviour of the pollutant series, with particular reference to extreme levels. Simple exploratory analysis reveals a number of sources of stochastic variation and possible dependence on covariate effects, which are subsequently modelled, exploiting recent developments in the modelling and inference for temporal extremes. Using this methodology, we examine the issues of data trends, non-stationarity, meteorological effects and temporal dependence, all of which have substantive implications in the design of pollution control regulations. Moreover, the asymptotic basis of these extreme value models justifies the interpretation of our results, even at levels that are exceptionally high.
Article
Given the importance of studying extreme dependence, we have tried to determine the approach that appears best suited to the study of extreme risks. To this end, we carried out a study of the tail dependence coefficient under three approaches: parametric, semi-parametric and non-parametric. In the empirical study, we estimate the tail dependence coefficient under these different methods, first through a numerical implementation, then through a study of the dependence between the Credit Suisse/Tremont Market Neutral hedge fund and the S&P 500, in order to assess the degree of dependence between these two assets, which are known to be uncorrelated. Few studies have examined the non-linear dependence between hedge funds and the market index. The paper by Denuit and Scaillet (2004) treats a general case of detecting positive quadrant dependence (PQD) between the HFR and CSFB/Tremont Market Neutral indices and the S&P 500 index. The result of this paper is relevant because we find that the level of dependence with the market index is weaker for losses than for gains, whereas hedge funds and the market index are generally considered to be uncorrelated.
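As one concrete instance of the non-parametric route compared in this abstract, the upper tail dependence coefficient lambda_U = lim_{q→1} P(Y > F_Y^{-1}(q) | X > F_X^{-1}(q)) can be estimated empirically at a fixed high quantile level. The sketch below is generic and not taken from the paper.

```python
import numpy as np

def empirical_upper_tail_dependence(x, y, q=0.95):
    """Empirical estimate of lambda_U at level q: the proportion of
    observations with Y above its q-quantile among those with X above
    its q-quantile."""
    x_q, y_q = np.quantile(x, q), np.quantile(y, q)
    above_x = x > x_q
    joint = above_x & (y > y_q)
    return joint.sum() / above_x.sum()
```

In practice the estimate is computed over a grid of levels q approaching 1, to judge whether it stabilises near zero (asymptotic independence) or at a positive value.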
Article
The classical treatment of multivariate extreme values is through componentwise ordering, though in practice most interest is in actual extreme events. Here the point process of observations which are extreme in at least one component is considered. Parametric models for the dependence between components must satisfy certain constraints. Two new techniques for generating such models are presented. Aspects of the statistical estimation of the resulting models are discussed and are illustrated with an application to oceanographic data.
Article
Extreme Values, Regular Variation and Point Processes is a readable and efficient account of the fundamental mathematical and stochastic process techniques needed to study the behavior of extreme values of phenomena based on independent and identically distributed random variables and vectors. It presents a coherent treatment of the distributional and sample path fundamental properties of extremes and records. It emphasizes the core primacy of three topics necessary for understanding extremes: the analytical theory of regularly varying functions; the probabilistic theory of point processes and random measures; and the link to asymptotic distribution approximations provided by the theory of weak convergence of probability measures in metric spaces. The book is self-contained and requires an introductory measure-theoretic course in probability as a prerequisite. Almost all sections have an extensive list of exercises which extend developments in the text, offer alternate approaches, test mastery and provide for enjoyable muscle flexing by a reader. The material is aimed at students and researchers in probability, statistics, financial engineering, mathematics, operations research, civil engineering and economics who need to know about: * asymptotic methods for extremes; * models for records and record frequencies; * stochastic process and point process methods and their applications to obtaining distributional approximations; * pervasive applications of the theory of regular variation in probability theory, statistics and financial engineering. "This book is written in a very lucid way. The style is sober, the mathematics tone is pleasantly conversational, convincing and enthusiastic. A beautiful book!" ---Bulletin of the Dutch Mathematical Society "This monograph is written in a very attractive style. It contains a lot of complementary exercises and practically all important bibliographical reference." ---Revue Roumaine de Mathématiques Pures et Appliquées
Article
Rates of convergence are derived for the convergence in distribution of renormalised sample maxima to the appropriate extreme-value distribution. Related questions which are discussed include the estimation of the principal error term and the optimality of the renormalising constants. Throughout the paper a close parallel is drawn with the theory of slow variation with remainder. This theory is used in proving most of the results. Some applications are discussed, including some models of importance in reliability.
Article
Bivariate extreme value distributions arise as the limiting distributions of renormalized componentwise maxima. No natural parametric family exists for the dependence between the marginal distributions, but there are considerable restrictions on the dependence structure. We consider modelling the dependence function with parametric models, for which two new models are presented. Tests for independence, and discriminating between models, are also given. The estimation procedure, and the flexibility of the new models, are illustrated with an application to sea level data.
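One of the classical parametric families in this setting is the symmetric logistic model; with unit-Fréchet margins its bivariate extreme value distribution function can be written (a standard form, not one of the new models proposed in the paper) as

```latex
\[
G(x, y) = \exp\!\left\{ -\left( x^{-1/\alpha} + y^{-1/\alpha} \right)^{\alpha} \right\},
\qquad 0 < \alpha \le 1,
\]
```

where alpha = 1 corresponds to independence of the margins and alpha → 0 to complete dependence.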
Article
Multivariate extreme value distributions are reviewed. Both the finite- and infinite-dimensional cases are considered. Some estimation procedures are reviewed and a new one proposed by Balkema is presented. There is also a new result on path continuity in the continuous time case.
Article
Bivariate extreme value distributions contain parameters of two types: those that define the marginal distributions, and parameters defining the dependence between suitably standardized variates. As an alternative to full maximum likelihood based on the joint distribution, we consider a "marginal estimation" method in which the margin and dependence parameters are estimated separately. This method is simpler to implement computationally, but may be inefficient. Asymptotic results allow the inefficiency to be quantified. The concepts are relevant to a large class of families of multivariate distributions, but the detailed analysis is restricted to Gumbel's logistic model with Gumbel or Generalized Extreme Value margins.
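A schematic of the two-step "marginal estimation" idea for the logistic model: fit each margin separately, transform to a common unit-Fréchet scale, then maximise the dependence likelihood over the single logistic parameter. This is a sketch under illustrative assumptions (Gumbel margins, SciPy optimisers), not the paper's exact procedure.

```python
import numpy as np
from scipy.stats import gumbel_r
from scipy.optimize import minimize_scalar

def logistic_neg_loglik(alpha, z1, z2):
    """Negative log-density of the bivariate logistic extreme value model
    with unit-Frechet margins, summed over the sample (terms that do not
    involve alpha are dropped)."""
    s = z1 ** (-1.0 / alpha) + z2 ** (-1.0 / alpha)
    logg = (-s ** alpha
            - (1.0 / alpha + 1.0) * (np.log(z1) + np.log(z2))
            + (alpha - 2.0) * np.log(s)
            + np.log(s ** alpha + 1.0 / alpha - 1.0))
    return -np.sum(logg)

def fit_two_step(x, y):
    # Step 1: estimate the marginal parameters separately.
    loc_x, scale_x = gumbel_r.fit(x)
    loc_y, scale_y = gumbel_r.fit(y)
    # Transform each margin to the unit-Frechet scale, z = -1/log F(x),
    # clipping the fitted CDF away from 0 and 1 for numerical safety.
    F1 = np.clip(gumbel_r.cdf(x, loc_x, scale_x), 1e-12, 1 - 1e-12)
    F2 = np.clip(gumbel_r.cdf(y, loc_y, scale_y), 1e-12, 1 - 1e-12)
    z1, z2 = -1.0 / np.log(F1), -1.0 / np.log(F2)
    # Step 2: estimate the dependence parameter alone.
    res = minimize_scalar(logistic_neg_loglik, bounds=(0.01, 1.0),
                          args=(z1, z2), method="bounded")
    return (loc_x, scale_x), (loc_y, scale_y), res.x
```

Full maximum likelihood would instead maximise the joint likelihood over the marginal and dependence parameters simultaneously; the asymptotic results in the paper quantify the efficiency lost by the two-step shortcut.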
Article
For many structural design problems univariate extreme value theory is applied to quantify the risk of failure due to extreme levels of some environmental process. In practice, many forms of structure fail owing to a combination of various processes at extreme levels. Recent developments in statistical methodology for multivariate extremes enable the modelling of such behaviour. The aim of this paper is to demonstrate how these ideas can be exploited as part of the design process.
Article
We discuss the analysis of the extremes of data by modelling the sizes and occurrence of exceedances over high thresholds. The natural distribution for such exceedances, the generalized Pareto distribution, is described and its properties elucidated. Estimation and model-checking procedures for univariate and regression data are developed, and the influence of and information contained in the most extreme observations in a sample are studied. Models for seasonality and serial dependence in the point process of exceedances are described. Sets of data on river flows and wave heights are discussed, and an application to the siting of nuclear installations is described.
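For reference, the generalized Pareto distribution mentioned here has distribution function, for an excess y > 0 over the threshold,

```latex
\[
H(y) = 1 - \left( 1 + \frac{\xi y}{\sigma} \right)_{+}^{-1/\xi}, \qquad \sigma > 0,
\]
```

with the case xi = 0 interpreted as the exponential limit H(y) = 1 − exp(−y/σ); xi < 0 gives a finite upper endpoint and xi > 0 a heavy, Pareto-type tail.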
Article
We propose a multivariate extreme value threshold model for joint tail estimation which overcomes the problems encountered with existing techniques when the variables are near independence. We examine inference under the model and develop tests for independence of extremes of the marginal variables, both when the thresholds are fixed, and when they increase with the sample size. Motivated by results obtained from this model, we give a new and widely applicable characterization of dependence in the joint tail which includes existing models as special cases. A new parameter which governs the form of dependence is of fundamental importance to this characterization. By estimating this parameter, we develop a diagnostic test which assesses the applicability of bivariate extreme value joint tail models. The methods are demonstrated through simulation and by analyzing two previously published data sets.
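The characterization referred to in this abstract is usually stated through the coefficient of tail dependence eta: on a common unit-Fréchet scale the joint survivor function is assumed to satisfy, for a slowly varying function L,

```latex
\[
\Pr(X > z,\; Y > z) \sim L(z)\, z^{-1/\eta}, \qquad z \to \infty, \quad 0 < \eta \le 1,
\]
```

so that eta = 1 with L(z) not tending to zero corresponds to asymptotic dependence, while eta < 1 (or eta = 1 with L(z) → 0) corresponds to asymptotic independence; exact independence gives eta = 1/2.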
Article
In recent research on extreme value statistics, there has been an extensive development of threshold methods, first in the univariate case but subsequently in the multivariate case as well. In this paper, we develop an alternative methodology for extreme values of univariate time series, by assuming that the time series is Markovian and using bivariate extreme value theory to suggest appropriate models for the transition distributions. We develop an alternative form of the likelihood representation for threshold methods, and then show how this can be applied to a Markovian time series. A major motivation for developing this kind of theory, in comparison with existing methods based on cluster maxima, is the possibility of calculating probability distributions for extremal functionals more complicated than the maxima or extreme quantiles of the series. In the latter part of the paper, we develop this theme, showing how a theory of compound Poisson limits for additive functionals can be combined with simulation to obtain numerical solutions for problems of practical interest.
Article
An extension of the threshold method for extreme values is developed, to consider the joint distribution of extremes of two variables. The methodology is based on the point process representation of bivariate extremes. Both parametric and nonparametric models are considered. The simplest case to handle is that in which both marginal distributions are known. For the more realistic case in which the marginal distributions are unknown, a mixed parametric-nonparametric method is proposed. The techniques are illustrated with data on sulphate and nitrate levels taken from a major study of acid rain.
Article
Time series of temperatures during periods of extreme cold display long-term seasonal variability and short-term temporal dependence. Classical approaches to extremes circumvent these issues, but in so doing cannot address questions relating to the temporal character of the process, though these issues are often the most important. In this paper a model is developed with the following features: periodic seasonal effects; consistency with asymptotic extreme value theory; Markov description of temporal dependence. Smith et al. studied the properties of such a model in the stationary case. Here, it is shown how such a model can be fitted to a non-stationary series, and consequently used to estimate temporal aspects of the extremal process of low temperatures which have most practical and scientific relevance.
Chapter
"At last, after a decade of mounting interest in log-linear and related models for the analysis of discrete multivariate data, particularly in the form of multidimensional tables, we now have a comprehensive text and general reference on the subject. Even a mediocre attempt to organize the extensive and widely scattered literature on discrete multivariate analysis would be welcome; happily, this is an excellent such effort, but a group of Harvard statisticians taht has contributed much to the field. Their book ought to serve as a basic guide to the analysis of quantitative data for years to come." -James R. Beninger, Contemporary Sociology "A welcome addition to multivariate analysis. The discussion is lucid and very leisurely, excellently illustrated with applications drawn from a wide variety of fields. A good part of the book can be understood without very specialized statistical knowledge. It is a most welcome contribution to an interesting and lively subject." -D.R. Cox, Nature "Discrete Multivariate Analysis is an ambitious attempt to present log-linear models to a broad audience. Exposition is quite discursive, and the mathematical level, except in Chapters 12 and 14, is very elementary. To illustrate possible applications, some 60 different sets of data have been gathered together from diverse fields. To aid the reader, an index of these examples has been provided. ...the book contains a wealth of material on important topics. Its numerous examples are especially valuable." -Shelby J. Haberman, The Annals of Statistics. © 2007 Springer Science+Business Media, LLC. All rights reserved.
Article
Several methods of analyzing extreme values are now known, most based on the extreme value limit distributions or related families. This paper reviews these techniques and proposes some extensions based on the point-process view of high-level exceedances. These ideas are illustrated with a detailed analysis of ozone data collected in Houston, Texas. There is particular interest in whether there is any trend in the data. The analysis reveals no trend in the overall levels of the series, but a marked downward trend in the extreme values.
Article
Some geometric properties of PD's are established, Kullback's $I$-divergence playing the role of squared Euclidean distance. The minimum discrimination information problem is viewed as that of projecting a PD onto a convex set of PD's and useful existence theorems for and characterizations of the minimizing PD are arrived at. A natural generalization of known iterative algorithms converging to the minimizing PD in special situations is given; even for those special cases, our convergence proof is more generally valid than those previously published. As corollaries of independent interest, generalizations of known results on the existence of PD's or nonnegative matrices of a certain form are obtained. The Lagrange multiplier technique is not used.
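For orientation (reading "PD" here as probability distribution), the I-divergence that plays the role of squared distance in this abstract is, for distributions P and Q on a common discrete space,

```latex
\[
I(P \,\|\, Q) = \sum_{i} p_i \log \frac{p_i}{q_i},
\]
```

and the I-projection of a distribution Q onto a convex set E of distributions is the member P of E minimising I(P || Q).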
Article
A method is presented for making statistical inferences about the upper tail of a distribution function. It is useful for estimating the probabilities of future extremely large observations. The method is applicable if the underlying distribution function satisfies a condition which holds for all common continuous distribution functions.
Ledford, A. and Tawn, J. A. (1994) Statistics for near independence in multivariate extreme values. Submitted.
Pickands, J. (1975) Statistical inference using extreme order statistics. Annals of Statistics, 3, 119-131.