Henry P Wynn

London School of Economics and Political Science | LSE · Department of Statistics

MA, Oxford. PhD Imperial College, London

About

314 Publications
36,578 Reads
16,060 Citations
Additional affiliations
February 2003 - present
London School of Economics and Political Science
Position
  • Professor of Statistics
Education
October 1967 - September 1970
Independent Researcher
Field of study
  • Mathematical Statistics
October 1964 - June 1967
University of Oxford
Field of study
  • Mathematics

Publications

Publications (314)
Article
Full-text available
We describe the notion of stability of coherent systems as a framework to deal with redundancy. We define stable coherent systems and show how this notion can help the design of reliable systems. We demonstrate that the reliability of stable systems can be efficiently computed using the algebraic versions of improved inclusion-exclusion formulas an...
Preprint
We describe the notion of stability of coherent systems as a framework to deal with redundancy. We define stable coherent systems and show how this notion can help the design of reliable systems. We demonstrate that the reliability of stable systems can be efficiently computed using the algebraic versions of improved inclusion-exclusion formulas an...
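The reliability computation referred to in these two abstracts can be illustrated, in a much simpler form, by the classical inclusion-exclusion formula over minimal path sets; the sketch below (with a hypothetical 2-out-of-3 example) shows the quantity being computed, not the improved algebraic formulas developed in the papers.

```python
# Minimal sketch (not the papers' improved algebraic formulas): classical
# inclusion-exclusion over the minimal path sets of a coherent system with
# independent components.
from itertools import combinations
from math import prod

def reliability(min_paths, p):
    """min_paths: list of sets of component labels; p: dict label -> P(component works)."""
    total = 0.0
    for r in range(1, len(min_paths) + 1):
        for subset in combinations(min_paths, r):
            union = set().union(*subset)               # components that must all work
            term = prod(p[c] for c in union)           # P(every component in the union works)
            total += term if r % 2 == 1 else -term     # alternating inclusion-exclusion signs
    return total

# Hypothetical example: a 2-out-of-3 system with minimal path sets {1,2}, {1,3}, {2,3}
p = {1: 0.9, 2: 0.8, 3: 0.7}
print(reliability([{1, 2}, {1, 3}, {2, 3}], p))        # 0.902
```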
Article
Full-text available
Leveraging Artificial Intelligence (AI) techniques to empower decision-making can promote social welfare by generating significant cost savings and promoting efficient utilization of public resources, besides revolutionizing commercial operations. This study investigates how AI can expedite dispute resolution in road traffic accident (RTA) insuranc...
Article
Full-text available
In numerical analysis, sparse grids are point configurations used in stochastic finite element approximation, numerical integration and interpolation. This paper is concerned with the construction of polynomial interpolator models in sparse grids. Our proposal stems from the fact that a sparse grid is an echelon design with a hierarchical structure...
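As a rough illustration of the sparse-grid point configurations mentioned here, the sketch below builds a two-dimensional sparse grid as a union of small tensor grids over a downward-closed index set; the nested dyadic one-dimensional nodes are an assumption for illustration, and the echelon-design interpolation itself is not reproduced.

```python
# Sketch of a two-dimensional sparse grid: the union of tensor-product grids
# over the downward-closed index set {(i, j): i + j <= level}. The nested
# one-dimensional dyadic node sets in [0, 1] are an illustrative assumption.
import numpy as np

def nodes_1d(level):
    # level 0 -> the midpoint; level l -> 2**l + 1 equally spaced points in [0, 1]
    return np.array([0.5]) if level == 0 else np.linspace(0.0, 1.0, 2**level + 1)

def sparse_grid_2d(level):
    points = set()
    for i in range(level + 1):
        for j in range(level + 1 - i):                 # keep only index pairs with i + j <= level
            for x in nodes_1d(i):
                for y in nodes_1d(j):
                    points.add((round(float(x), 10), round(float(y), 10)))
    return sorted(points)

grid = sparse_grid_2d(3)
print(len(grid), "points")   # far fewer than the 81 points of the full 9 x 9 tensor grid
```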
Article
Full-text available
The paper covers the design and analysis of experiments to discriminate between two Gaussian process models with different covariance kernels, such as those widely used in computer experiments, kriging, sensor location and machine learning. Two frameworks are considered. First, we study sequential constructions, where successive design (observation...
Article
After a rich history in medicine, randomized control trials (RCTs), both simple and complex, are in increasing use in other areas, such as web-based A/B testing and planning and design of decisions. A main objective of RCTs is to be able to measure parameters, and contrasts in particular, while guarding against biases from hidden confounders. After...
Preprint
Binary data are highly common in many applications; however, they are usually modelled under the assumption that the data are independently and identically distributed. This is typically not the case in many real-world examples, and as such the probability of a success can depend on the outcomes of past events. The de Bruijn process (DBP) was...
Preprint
Binary data are very common in many applications, and are typically simulated independently via a Bernoulli distribution with a single probability of success. However, this is not always the physical truth, and the probability of a success can depend on the outcomes of past events. Presented here is a novel approach for simulating bi...
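The de Bruijn process itself is not described in these snippets, so the sketch below only illustrates the contrast being drawn: binary outcomes whose success probability depends on recent outcomes rather than being i.i.d. Bernoulli. All parameter values are hypothetical.

```python
# Illustrative sketch only (not the de Bruijn process): simulate binary data in
# which the success probability depends on the last k outcomes, in contrast to
# i.i.d. Bernoulli draws. All parameter values are hypothetical.
import random

def simulate_dependent_binary(n, k=2, base_p=0.5, boost=0.3, seed=0):
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        recent = out[-k:]
        # probability shifts up or down with the fraction of recent successes
        p = base_p if not recent else base_p + boost * (sum(recent) / len(recent) - 0.5)
        out.append(1 if rng.random() < p else 0)
    return out

xs = simulate_dependent_binary(1000)
print(sum(xs) / len(xs))   # marginal success rate stays near base_p despite the dependence
```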
Preprint
Full-text available
The paper covers the design and analysis of experiments to discriminate between two Gaussian process models, such as those widely used in computer experiments, kriging, sensor location and machine learning. Two frameworks are considered. First, we study sequential constructions, where successive design (observation) points are selected, either as a...
Article
Full-text available
The use of statistical and AI methods in civil litigation is an area likely to expand. As with many areas of social science, the data requirements are high but complex, because of the complexity of the legal process and the nature of the causal connections. This paper looks at the early stage of the process where the initial establishment of liabil...
Conference Paper
Computer models are widely used in decision support for energy systems operation, planning and policy. A system of models is often employed, where model inputs themselves arise from other computer models, with each model being developed by different teams of experts. Gaussian Process emulators can be used to approximate the behaviour of complex, co...
Article
Full-text available
Lattice conditional independence models [Andersson and Perlman, Lattice models for conditional independence in a multivariate normal distribution, Ann. Statist. 21 (1993), 1318–1358] are a class of models developed first for the Gaussian case in which a distributive lattice classifies all the conditional independence statements. The main result is...
Article
Full-text available
District heating is expected to play an important role in the decarbonisation of the energy sector in the coming years since low carbon sources such as waste heat and biomass are increasingly being used to generate heat. The design of district heating often has competing objectives: the need for inexpensive energy and meeting low carbon targets. In...
Preprint
Full-text available
Computational models are widely used in decision support for energy system operation, planning and policy. A system of models is often employed, where model inputs themselves arise from other computer models, with each model being developed by different teams of experts. Gaussian Process emulators can be used to approximate the behaviour of complex...
Article
Full-text available
This paper continues the application of circuit theory to experimental design started by the first two authors. The theory gives a very special and detailed representation of the kernel of the design model matrix, called the circuit basis. This representation turns out to be an appropriate way to study the optimality criteria referred to as robustness: t...
Article
Full-text available
Decision making in the face of a disaster requires the consideration of several complex factors. In such cases, Bayesian multi-criteria decision analysis provides a framework for decision making. In this paper, we present how to construct a multi-attribute decision support system for choosing between countermeasure strategies, such as lockdowns, de...
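A toy version of the multi-attribute expected-utility comparison described here, with invented attributes, weights, utilities and scenario probabilities, might look as follows; the papers' actual attribute structure and elicitation are far richer.

```python
# Toy sketch of a multi-attribute expected-utility comparison between two
# hypothetical countermeasure strategies. The attributes, weights, utilities
# and scenario probabilities are invented purely for illustration.
scenarios = {"mild": 0.6, "severe": 0.4}                       # P(scenario)
weights = {"health": 0.5, "economy": 0.3, "wellbeing": 0.2}    # attribute weights (sum to 1)

# utilities[strategy][scenario][attribute], each on a 0-1 scale
utilities = {
    "lockdown":   {"mild":   {"health": 0.9, "economy": 0.3, "wellbeing": 0.4},
                   "severe": {"health": 0.8, "economy": 0.2, "wellbeing": 0.3}},
    "mitigation": {"mild":   {"health": 0.6, "economy": 0.8, "wellbeing": 0.7},
                   "severe": {"health": 0.3, "economy": 0.5, "wellbeing": 0.5}},
}

def expected_utility(strategy):
    # expectation over scenarios of the weighted sum of attribute utilities
    return sum(prob * sum(weights[a] * utilities[strategy][s][a] for a in weights)
               for s, prob in scenarios.items())

for s in utilities:
    print(s, round(expected_utility(s), 3))
```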
Preprint
Full-text available
Recent results in the theory and application of Newton-Puiseux expansions, i.e. fractional power series solutions of equations, suggest further developments within a more abstract algebraic-geometric framework, involving in particular the theory of toric varieties and ideals. Here, we present a number of such developments, especially in relation to...
Preprint
This paper continues the application of circuit theory to experimental design started by the first two authors. The theory gives a very special and detailed representation of the kernel of the design model matrix. This representation turns out to be an appropriate way to study the optimality criteria referred to as robustness: the sensitivity of th...
Article
Full-text available
Urban Waste Heat Recovery, heat recovery from low-temperature urban sources such as data centres and metro systems, has a great deal of potential in terms of meeting domestic and commercial heat demands whilst significantly reducing carbon emissions. Urban sources of heat are advantageous in that they tend to be close to areas of high heat demand a...
Preprint
Full-text available
Lattice Conditional Independence models are a class of models developed first for the Gaussian case in which a distributive lattice classifies all the conditional independence statements. The main result is that these models can equivalently be described via a transitive directed acyclic graph (TDAG) in which, as is normal for causal models, the conditional...
Preprint
After a rich history in medicine, randomised control trials, both simple and complex, are in increasing use in other areas, such as web-based A/B testing and the planning and design of decisions. A main objective is to be able to measure parameters, and contrasts in particular, while guarding against biases from hidden confounders. After careful definition...
Article
Full-text available
This paper derives an explicit formula for a type of fractional power series, known as a Puiseux series, arising in a wide class of applied problems in the physical sciences and engineering. Detailed consideration is given to the gaps which occur in these series (lacunae); they are shown to be determined by a number-theoretic argument involving the...
Article
The probability density function (PDF) of a random variable associated with the solution of a partial differential equation (PDE) with random parameters is approximated using a truncated series expansion. The random PDE is solved using two stochastic finite element methods, Monte Carlo sampling and the stochastic Galerkin method with global polynom...
Preprint
Full-text available
In this paper we present a multi-attribute decision support framework for choosing between countermeasure strategies designed to mitigate the effects of COVID-19 in the UK. Such an analysis can evaluate both the short term and long term efficacy of various candidate countermeasures. The expected utility scores of a countermeasure strategy captures t...
Article
Cognitive bias is thought to play an important role in cost and time overrun in infrastructure projects. It is common for project appraisers to make overly optimistic assessments of project assumptions which, in turn, leads to unrealistic assessments of time and cost. These sorts of appraisals are usually made using simple models that take both tec...
Preprint
Organisations, whether in government, industry or commerce, are required to make decisions in a complex and uncertain environment. The way models are used is intimately connected to the way organisations make decisions and the context in which they make them. Typically, in a complex organisation, multiple related models will often be used in suppor...
Preprint
Full-text available
Majorisation, also called rearrangement inequalities, yields a type of stochastic ordering in which two or more distributions can then be compared. This method provides a representation of the peakedness of probability distributions and is also independent of the location of probabilities. These properties make majorisation a good candidate as a th...
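For a concrete sense of the ordering, the sketch below checks majorisation between two discrete probability vectors via sorted partial sums, which is the standard definition; it illustrates only the ordering used, not the paper's theory.

```python
# Minimal sketch: does probability vector p majorise q? After sorting entries
# in decreasing order, every partial sum of p must dominate the corresponding
# partial sum of q (both vectors sum to 1); a more "peaked" distribution
# majorises a flatter one.
from itertools import accumulate

def majorises(p, q, tol=1e-12):
    ps = list(accumulate(sorted(p, reverse=True)))
    qs = list(accumulate(sorted(q, reverse=True)))
    return len(p) == len(q) and all(a >= b - tol for a, b in zip(ps, qs))

peaked = [0.7, 0.2, 0.1]
flat = [0.4, 0.3, 0.3]
print(majorises(peaked, flat))   # True
print(majorises(flat, peaked))   # False
```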
Article
In this paper, we review different definitions that multi-state k-out-of-n systems have received in the literature and study them in a unified way using the algebra of monomial ideals. We thus obtain formulas and algorithms to compute their reliability and bounds for it. We provide formulas and computer experiments for simple and generalized m...
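For the simplest binary special case, the target quantity can be written down directly; the sketch below computes the reliability of a k-out-of-n:G system with i.i.d. components by enumeration, which is not the monomial-ideal machinery of the paper but shows what is being computed and bounded.

```python
# Direct enumeration (not the monomial-ideal method of the paper) for the
# simplest special case: a binary k-out-of-n:G system with i.i.d. components
# works when at least k of its n components work.
from math import comb

def k_out_of_n_reliability(k, n, p):
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

print(k_out_of_n_reliability(2, 3, 0.9))   # 0.972 = 3 * 0.81 * 0.1 + 0.729
```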
Article
Full-text available
Polarization is a powerful technique in algebra which provides combinatorial tools to study algebraic invariants of monomial ideals. We study the reverse of this process, depolarization which leads to a family of ideals which share many common features with the original ideal. Given a squarefree monomial ideal, we describe a combinatorial method to...
Article
Full-text available
This paper introduces a new way to define a genome rearrangement distance, using the concept of mean first passage time from probability theory. Crucially, this distance provides a genuine metric on genome space. We develop the theory and introduce a link to a graph-based zeta function. The approach is very general and can be applied to a wide vari...
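The mean first passage time underlying this distance is a standard Markov-chain quantity; the sketch below computes expected hitting times of a target state by first-step analysis for a small hypothetical transition matrix (the genome-specific construction is not reproduced).

```python
# Sketch of the underlying mean-first-passage-time computation: expected number
# of steps for a Markov chain with transition matrix P to first reach a target
# state, via first-step analysis (I - Q) h = 1. The 3-state chain is hypothetical.
import numpy as np

def mean_first_passage_times(P, target):
    n = P.shape[0]
    keep = [i for i in range(n) if i != target]
    Q = P[np.ix_(keep, keep)]                                   # transitions among non-target states
    h = np.linalg.solve(np.eye(n - 1) - Q, np.ones(n - 1))      # expected hitting times
    times = np.zeros(n)
    times[keep] = h
    return times

P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.1, 0.4, 0.5]])
print(mean_first_passage_times(P, target=2))   # [5. 5. 0.]: hitting times of state 2
```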
Preprint
Full-text available
The primary objective of this work is to model and compare different exit scenarios from the lock-down for the COVID-19 UK epidemic. In doing so we provide an additional modelling basis for laying out the strategy options for the decision-makers. The main results are illustrated and discussed in Part I. In Part II, we describe the stochastic model...
Article
Full-text available
Urban waste heat recovery, in which low temperature heat from urban sources is recovered for use in a district heat network, has a great deal of potential in helping to achieve 2050 climate goals. For example, heat from data centres, metro systems, public sector buildings and waste water treatment plants could be used to supply 10% of Europe’s heat...
Preprint
Full-text available
Strategies for meeting low carbon objectives in energy are likely to take greater account of the benefits of district heating. Currently, district heating schemes typically use combined heat and power (CHP) supplemented with heat pumps attached to low temperature waste heat sources, powered either by electricity from the CHP itself or from the Nati...
Preprint
Full-text available
In this paper we review different definitions that multi-state k-out-of-n systems have received in the literature and study them in a unified way using the algebra of monomial ideals. We thus obtain formulas and algorithms to compute their reliability and bounds for it.
Article
Full-text available
A methodology is developed for data analysis based on empirically constructed geodesic metric spaces. The population version defines distance by the amount of probability mass accumulated when traveling between two points, and a geodesic metric arises from the shortest-path version. Such metrics are then transformed in a number of ways to produce famili...
Article
Full-text available
We develop algorithms for the analysis of multi-state k-out-of-n systems and their reliability based on commutative algebra.
Preprint
Full-text available
Urban waste heat recovery, in which low temperature heat from urban sources is recovered for use in a district heat network, has a great deal of potential in helping to achieve 2050 climate goals. For example, heat from data centres, metro systems, public sector buildings and waste water treatment plants could be used to supply ten percent of Europ...
Preprint
Full-text available
Scenario Analysis is a risk assessment tool that aims to evaluate the impact of a small number of distinct plausible future scenarios. In this paper, we provide an overview of important aspects of Scenario Analysis including when it is appropriate, the design of scenarios, uncertainty and encouraging creativity. Each of these issues is discussed in...
Chapter
Full-text available
Data can be collected in scientific studies via a controlled experiment or passive observation. Big data is often collected in a passive way, e.g. from social media. In studies of causation great efforts are made to guard against bias and hidden confounders or feedback which can destroy the identification of causation by corrupting or omitting coun...
Article
Full-text available
Approximately 1.2 EJ of energy are potentially available for recovery each year from urban heat sources in the EU. This corresponds to more than 10 percent of the EU’s total energy demand for heat and hot water. There are, however, a number of challenges to be met before urban waste heat recovery can be performed on a wide scale. This paper focuses...
Article
Maximum entropy sampling (MES) criteria provide a useful framework for studying sequential designs for computer experiments in a Bayesian framework. However, there is some technical difficulty in making the procedure fully adaptive in the sense of making proper use of previous output as well as input data. In the simple Gaussian set-up only previou...
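A non-adaptive version of the MES idea is easy to sketch: under a Gaussian process prior, greedily add the candidate point with the largest predictive variance given the points chosen so far, which maximises the entropy (log-determinant) increment. The squared-exponential kernel, length-scale and jitter below are illustrative assumptions, not the paper's setting.

```python
# Sketch of greedy (non-adaptive) maximum entropy sampling under a Gaussian
# process prior: repeatedly add the candidate with the largest predictive
# variance given the points chosen so far, which maximises the log-determinant
# (entropy) increment. Kernel, length-scale and jitter are illustrative choices.
import numpy as np

def kern(a, b, ls=0.3):
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

def greedy_mes(candidates, n_pick, jitter=1e-8):
    chosen = [0]                                   # prior variance is constant, so the start is arbitrary
    for _ in range(n_pick - 1):
        X = candidates[chosen]
        K_inv = np.linalg.inv(kern(X, X) + jitter * np.eye(len(chosen)))
        k_star = kern(candidates, X)               # cross-covariances: candidates vs chosen points
        pred_var = 1.0 - np.einsum("ij,jk,ik->i", k_star, K_inv, k_star)
        pred_var[chosen] = -np.inf                 # never re-select a chosen point
        chosen.append(int(np.argmax(pred_var)))
    return candidates[chosen]

print(greedy_mes(np.linspace(0, 1, 101), 5))       # selected points spread across [0, 1]
```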
Preprint
This paper introduces a new way to define a genome rearrangement distance, using the concept of mean first passage time from control theory. Crucially, this distance estimate provides a genuine metric on genome space. We develop the theory and introduce a link to a graph-based zeta function. The approach is very general and can be applied to a wide...
Article
Full-text available
In previous work the authors defined the k-th order simplicial distance between probability distributions which arises naturally from a measure of dispersion based on the squared volume of random simplices of dimension k. This theory is embedded in the wider theory of divergences and distances between distributions which includes Kullback–Leibler,...
Preprint
The probability density function (PDF) of a random variable associated with the solution of a stochastic partial differential equation (SPDE) is approximated using a truncated series expansion. The SPDE is solved using two stochastic finite element (SFEM) methods, Monte Carlo sampling and the stochastic Galerkin method with global polynomials. The...
Preprint
In previous work the authors defined the k-th order simplicial distance between probability distributions which arises naturally from a measure of dispersion based on the squared volume of random simplices of dimension k. This theory is embedded in the wider theory of divergences and distances between distributions which includes Kullback-Leibler,...
Article
The average squared volume of simplices formed by k + 1 independent copies from the same probability measure μ on R^d defines an integral measure of dispersion ψ_k(μ), which is a concave functional of μ after suitable normalization. When k = 1 it corresponds to tr(Σ_μ) and when k = d we obtain the usual generalized variance det(Σ_μ), with Σ_μ the covariance mat...
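A Monte Carlo check of the statement above, using Gram determinants for squared simplex volumes, might look as follows; the exact normalising constants relating ψ_k to tr(Σ_μ) and det(Σ_μ) are not claimed here, beyond the factor 2 for k = 1 (since E‖X − Y‖² = 2 tr(Σ) for independent copies).

```python
# Monte Carlo sketch of the statement above: estimate the mean squared volume
# of the simplex formed by k + 1 independent copies of a Gaussian on R^2, using
# Gram determinants (squared volume = det(Gram) / (k!)^2). For k = 1 this equals
# 2 tr(Sigma); for k = d it is proportional to det(Sigma) (constant not claimed).
import numpy as np
from math import factorial

rng = np.random.default_rng(0)
Sigma = np.array([[2.0, 0.5], [0.5, 1.0]])
d = Sigma.shape[0]

def mean_sq_simplex_volume(k, n_sim=20000):
    total = 0.0
    for _ in range(n_sim):
        pts = rng.multivariate_normal(np.zeros(d), Sigma, size=k + 1)
        edges = pts[1:] - pts[0]                       # the k edge vectors of the simplex
        total += np.linalg.det(edges @ edges.T) / factorial(k) ** 2
    return total / n_sim

print(mean_sq_simplex_volume(1), 2 * np.trace(Sigma))      # both close to 6
print(mean_sq_simplex_volume(d), np.linalg.det(Sigma))     # proportional to the generalized variance
```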
Preprint
Full-text available
We study the family of depolarizations of a squarefree monomial ideal $I$, i.e. all monomial ideals whose polarization is $I$. We describe a method to find all depolarizations of $I$ and study some of the properties they share and some they do not share. We then apply polarization and depolarization tools to study the reliability of multi-state coh...
Article
Full-text available
Gaussian processes (GP) are widely used as a metamodel for emulating time-consuming computer codes. We focus on problems involving categorical inputs, with a potentially large number L of levels (typically several tens), partitioned in G << L groups of various sizes. Parsimonious covariance functions, or kernels, can then be defined by block covarian...
Article
Full-text available
Data can be collected in scientific studies via a controlled experiment or passive observation. Big data is often collected in a passive way, e.g. from social media. Understanding the difference between active and passive observation is critical to the analysis. For example in studies of causation great efforts are made to guard against hidden conf...
Article
Full-text available
We study multiple simultaneous cut events for k-out-of-n:F and linear consecutive k-out-of-n:F systems in which each component has a constant failure probability. We list the multicuts of these systems and describe the structural differences between them. Our approach, based on combinatorial commutative algebra, allows complete enumeration of the s...
Article
We propose an optimal experimental design for a curvilinear regression model that minimizes the band-width of simultaneous confidence bands. Simultaneous confidence bands for nonlinear regression are constructed by evaluating the volume of a tube about a curve that is defined as a trajectory of a regression basis vector (Naiman, 1986). The proposed...
Article
Full-text available
We consider functionals measuring the dispersion of a d-dimensional distribution which are based on the volumes of simplices of dimension k ≤ d formed by k + 1 independent copies and raised to some power δ. We study properties of extremal measures that maximize these functionals. In particular, for positive δ we characterize their support and for n...
Article
Full-text available
The definition of Good Experimental Methodologies (GEMs) in robotics is a topic of widespread interest due also to the increasing employment of robots in everyday civilian life. The present work contributes to the ongoing discussion on GEMs for Unmanned Surface Vehicles (USVs). It focuses on the definition of GEMs and provides specific guidelines f...
Chapter
The contrast and tension between controlled experiment and passive observation is an old area of debate to which philosophers of science have made contributions. This paper is a discussion of the issue in the context of modern Bayesian optimal experimental design. It is shown with simple examples that a mixture of controlled and less controlled exp...
Article
Confidence nets, that is, collections of confidence intervals that fill out the parameter space and whose exact parameter coverage can be computed, are familiar in nonparametric statistics. Here, the distributional assumptions are based on invariance under the action of a finite reflection group. Exact confidence nets are exhibited for a single par...
Article
A general method of quadrature for uncertainty quantification (UQ) is introduced based on the algebraic method in experimental design. This is a method based on the theory of zero-dimensional algebraic varieties. It allows quadrature of polynomials or polynomial approximands for quite general sets of quadrature points, here called “designs.” The me...
Article
We consider a measure ψ_k of dispersion which extends the notion of Wilks' generalised variance for a d-dimensional distribution, and is based on the mean squared volume of simplices of dimension k ≤ d formed by k + 1 independent copies. We show how ψ_k can be expressed in terms of the eigenvalues of the covariance matrix of the distribution, also...
Article
The present paper studies multiple failure and signature analysis of coherent systems using the theory of monomial ideals. While system reliability has been studied using Hilbert series of monomial ideals, this is not enough to understand in a deeper sense the ideal structure features that reflect the behavior of the system under multiple simultane...
Article
Full-text available
We apply the methods of algebraic reliability to the study of percolation on trees. To a complete $k$-ary tree $T_{k,n}$ of depth $n$ we assign a monomial ideal $I_{k,n}$ on $\sum_{i=1}^n k^i$ variables and $k^n$ minimal monomial generators. We give explicit recursive formulae for the Betti numbers of $I_{k,n}$ and their Hilbert series, which allow...
Article
Almost all appointments of CEOs to major non-financial firms involve at least some headhunter involvement. Final selection always rests with the company board, but frequently the board chooses between candidates most of whom have been suggested for consideration by a headhunter. The existing literature covers such issues as the scope and role of h...
Article
The algebraic approach to the analysis of system reliability associates an algebraic object, a monomial ideal, to a coherent system (CS), and studies the reliability of the system using the Hilbert series of the monomial ideal. New capabilities of the algebraic method in system design are shown, in particular related to enumeration of working state...
Conference Paper
It is known that the curvature of data spaces plays a role in data analysis. For example, the Fréchet mean (intrinsic mean) always exists uniquely for a probability measure on a non-positively curved metric space. In this paper, we use the curvature of data spaces in a novel manner. A methodology is developed for data analysis based on empiri...
Book
Full-text available
The phrase corporate social responsibility (CSR) is almost invariably used to suggest that corporations should pay significant attention to the impact their activities have upon their physical and social environments, usually going beyond the requirements of the law and any competitive pressures. It is often believed that CSR serves as a bulwark ag...
Article
Full-text available
Classical dimensional analysis in its original form starts by expressing the units for derived quantities, such as force, in terms of power products of basic units (mass, length, time, etc.). This suggests the use of toric ideal theory from algebraic geometry. Within this the Graver basis provides a unique primitive basis in a well-defined sense, whic...
Article
Full-text available
We consider a measure ψ_k of dispersion which extends the notion of Wilks' generalised variance, or entropy, for a d-dimensional distribution, and is based on the mean squared volume of simplices of dimension k ≤ d formed by k + 1 independent copies. We show how ψ_k can be expressed in terms of the eigenvalues of the covariance matrix...
Article
Smooth supersaturated models are interpolation models in which the underlying model size, and typically the degree, is higher than would normally be used in statistics, but where the extra degrees of freedom are used to make the model smooth using a standard second derivative measure of smoothness. Here, the solution is derived from a closed-form q...
Article
We define two quantities associated to each of the vertices of a simple graph, based on the collection of minimal vertex covers of the graph. They are called covering degree and covering index. We use them to describe new strategies for measuring the robustness of a network. We study the correlation between the defined quantities and other quantiti...
Chapter
Screening is a term applied to the general approach of detecting active or significant factors, elements, units, and so on when there are a large number of inactive factors. The meaning of ‘active’ depends on the context: effective treatments, presence of a disease, fault, etc. The methods are of considerable importance in medical and industrial ap...
Article
Full-text available
A general approach to Bayesian learning revisits some classical results, which study which functionals on a prior distribution are expected to increase, in a preposterior sense. The results are applied to information functionals of the Shannon type and to a class of functionals based on expected distance. A close connection is made between the latt...
Article
Confidence nets --- that is, collections of confidence intervals that fill out parameter space and whose exact coverage can be computed --- are familiar in nonparametric statistics. Here the distributional assumptions are based on invariance under the action of a finite reflection group. Exact confidence nets are exhibited for a single parameter, b...
Article
Full-text available
A strong link between information geometry and algebraic statistics is made by investigating statistical manifolds which are algebraic varieties. In particular it is shown how first and second order efficiency estimators can be constructed, such as bias corrected Maximum Likelihood and more general estimators, and for which the estimating equations...
Book
Multi-state coherent systems, such as networks, share properties with monomial ideals which are a cornerstone of modern computational algebra. By exploiting this connection it is possible to obtain tight upper and lower bounds on network reliability which can be shown to dominate traditional Bonferroni bounds, at every truncation level. The key o...
Article
If $F$ is a full factorial design and $D$ is a fraction of $F$, then for a given monomial ordering, the algebraic method gives a saturated polynomial basis for $D$ which can be used for regression. Consider now an algebraic basis for the complementary fraction of $D$ in $F$, built under the same monomial ordering. We show that the bas...
Conference Paper
In this paper we propose a new technique for generating optimal designs by means of simulation. The method combines ideas from approximate Bayesian computation and optimal design of experiments and allows great flexibility in the employed criteria and models. We illustrate the idea by a simple expository example.
Article
Full-text available
Classical dimensional analysis is one of the cornerstones of qualitative physics and is also used in the analysis of engineering systems, for example in engineering design. The basic power product relationship in dimensional analysis is identical to one way of defining toric ideals in algebraic geometry, a large and growing field. This paper exploi...
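The power-product step common to this abstract and the earlier one on dimensional analysis can be illustrated with a classical Buckingham-pi kernel computation for a simple pendulum (period T, length L, gravity g); integer vectors in the kernel of the dimension matrix give dimensionless groups. This is not the Graver-basis refinement developed in the papers.

```python
# Buckingham-pi style sketch (not the papers' Graver-basis computation): the
# dimension matrix of a simple pendulum, with basic units length and time and
# quantities period T, length L and gravity g; integer kernel vectors are the
# exponents of dimensionless power products.
from sympy import Matrix

# rows: length, time; columns: T, L, g
D = Matrix([[0, 1, 1],
            [1, 0, -2]])

for v in D.nullspace():
    print(list(v))   # [2, -1, 1]: the dimensionless group T**2 * g / L
```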