Andreas Tsanakas
  • Lecturer at City, University of London

About

  • 99 Publications
  • 19,857 Reads
  • 1,439 Citations
  • Current institution: City, University of London
  • Current position: Lecturer

Publications (99)
Preprint
Full-text available
In this paper, we study the risk sharing problem among multiple agents using Lambda Value-at-Risk as their preference functional, under heterogeneous beliefs, where beliefs are represented by several probability measures. We obtain semi-explicit formulas for the inf-convolution of multiple Lambda Value-at-Risk measures under heterogeneous beliefs a...
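For context, one common definition of Lambda Value-at-Risk (sign and loss/profit conventions vary across the literature) replaces the fixed level of ordinary VaR with a function Λ taking values in (0, 1):

\[ \Lambda\mathrm{VaR}(X) \;=\; \inf\{\, x \in \mathbb{R} \;:\; \mathbb{P}(X > x) \le \Lambda(x) \,\}, \]

so that the constant choice Λ ≡ 1 − α recovers VaR at level α.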
Article
Full-text available
This article summarizes the main topics, findings, and avenues for future work from the workshop Fairness with a view towards insurance held in August 2023 in Copenhagen, Denmark.
Article
Full-text available
In applications of predictive modeling, such as insurance pricing, indirect or proxy discrimination is an issue of major concern. Namely, there exists the possibility that protected policyholder characteristics are implicitly inferred from non-protected ones by predictive models and thus have an undesirable (and possibly illegal) impact on pr...
Preprint
Full-text available
Indirect discrimination is an issue of major concern in algorithmic models. This is particularly the case in insurance pricing where protected policyholder characteristics are not allowed to be used for insurance pricing. Simply disregarding protected policyholder information is not an appropriate solution because this still allows for the possibil...
Article
Full-text available
Two natural and potentially useful properties for capital allocation rules are top-down consistency and shrinking independence. Top-down consistency means that the total capital is determined by the aggregate portfolio risk. Shrinking independence means that the risk capital allocated to a given business line should not be affected by a proportiona...
Preprint
Full-text available
In applications of predictive modeling, such as insurance pricing, indirect or proxy discrimination is an issue of major concern. Namely, there exists the possibility that protected policyholder characteristics are implicitly inferred from non-protected ones by predictive models, and thus have an undesirable (or illegal) impact on prices. A t...
Article
Full-text available
A vast and growing literature on explaining deep learning models has emerged. This paper contributes to that literature by introducing a global gradient-based model-agnostic method, which we call Marginal Attribution by Conditioning on Quantiles (MACQ). Our approach is based on analyzing the marginal attribution of predictions (outputs) to individu...
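A toy sketch of the general idea, gradient-based attributions grouped by output quantile, follows; the model, the finite-difference gradients, and the quartile bucketing are illustrative stand-ins, not the MACQ construction itself.

```python
# Illustrative sketch: first-order "gradient x input" attributions,
# averaged within output-quantile buckets (MACQ's actual definitions
# differ; this only conveys the flavour of the approach).
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    # Placeholder black-box regression function.
    return x[:, 0] ** 2 + np.sin(x[:, 1]) + 0.5 * x[:, 2]

def numerical_grad(f, x, eps=1e-5):
    # Central finite differences as a stand-in for autodiff.
    g = np.zeros_like(x)
    for j in range(x.shape[1]):
        xp, xm = x.copy(), x.copy()
        xp[:, j] += eps
        xm[:, j] -= eps
        g[:, j] = (f(xp) - f(xm)) / (2 * eps)
    return g

x = rng.normal(size=(10_000, 3))
y = model(x)
attr = x * numerical_grad(model, x)  # first-order marginal attribution

# Average attributions within quartiles of the model output.
bucket = np.searchsorted(np.quantile(y, [0.25, 0.5, 0.75]), y)
for q in range(4):
    print("quartile", q, attr[bucket == q].mean(axis=0).round(3))
```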
Article
Full-text available
The choice of a copula model from limited data is a hard but important task. Motivated by the visual patterns that different copula models produce in smoothed density heatmaps, we consider copula model selection as an image recognition problem. We extract image features from heatmaps using the pre-trained AlexNet and present workflows for model sel...
Article
In this short communication, we present a new, simple control-variate Monte Carlo procedure for enhancing the evaluation accuracy of alternative reinsurance strategies that an insurance company might adopt.
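As a reminder of the generic control-variate mechanism (a minimal sketch, not the paper's reinsurance-specific procedure; the capped loss and exponential model are invented for illustration):

```python
# Control-variate Monte Carlo: estimate E[f(X)] using a correlated
# control g(X) whose mean is known in closed form.
import numpy as np

rng = np.random.default_rng(1)
x = rng.exponential(scale=1.0, size=100_000)

f = np.minimum(x, 2.0)   # target: e.g. a loss capped by reinsurance
g = x                    # control variate, with known E[g] = 1.0

beta = np.cov(f, g)[0, 1] / np.var(g, ddof=1)
cv = f.mean() - beta * (g.mean() - 1.0)   # variance-reduced estimate

print("plain MC:", f.mean(), " control variate:", cv)
```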
Article
Current approaches to fair valuation in insurance often follow a two-step approach, combining quadratic hedging with application of a risk measure on the residual liability, to obtain a cost-of-capital margin. In such approaches, the preferences represented by the regulatory risk measure are not reflected in the hedging process. We address this iss...
Article
Full-text available
We consider the following question: given information on individual policyholder characteristics, how can we ensure that insurance prices do not discriminate with respect to protected characteristics, such as gender? We address the issues of direct and indirect discrimination, the latter resulting from implicit learning of protected characteristics...
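The central construction, sketched here from the abstract (exact conditions and the choice of weighting distribution are in the paper): writing μ(X, D) = E[Y | X, D] for the best-estimate price given covariates X and protected characteristics D, a discrimination-free price integrates D out with its unconditional, rather than conditional-on-X, distribution,

\[ h^*(X) \;=\; \int \mu(X, d)\, \mathrm{d}\mathbb{P}(d), \]

which removes the channel through which D could be implicitly inferred from X.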
Article
We introduce an approach to sensitivity analysis of quantitative risk models, for the purpose of identifying the most influential inputs. The proposed approach relies on a change of measure derived by minimising the χ²-divergence, subject to a constraint (‘stress’) on the expectation of a chosen random variable. We obtain an explicit solution of th...
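A minimal sketch of the reweighting idea, under simplifying assumptions (in particular, ignoring the non-negativity constraint on weights that the full solution handles): minimising the χ²-divergence subject to a constraint on the mean of a chosen variable yields scenario weights affine in that variable.

```python
# Chi-squared-divergence "stress": reweight simulated scenarios so a
# chosen variable's mean hits a target; the unconstrained solution is
# affine in that variable (illustrative sketch only).
import numpy as np

rng = np.random.default_rng(2)
x = rng.lognormal(mean=0.0, sigma=0.5, size=100_000)   # model input
loss = x + rng.normal(scale=0.1, size=x.size)          # model output

target = 1.1 * x.mean()                 # stress: +10% on E[x]
b = (target - x.mean()) / x.var()
w = (1 - b * x.mean()) + b * x          # affine scenario weights, mean 1

print("stressed E[x]:   ", np.average(x, weights=w))
print("baseline E[loss]:", loss.mean())
print("stressed E[loss]:", np.average(loss, weights=w))
```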
Article
Full-text available
In risk analysis, sensitivity measures quantify the extent to which the probability distribution of a model output is affected by changes (stresses) in individual random input factors. For input factors that are statistically dependent, we argue that a stress on one input should also precipitate stresses in other input factors. We introduce a novel...
Article
The Scenario Weights for Importance Measurement (SWIM) package implements a flexible sensitivity analysis framework, based primarily on results and tools developed by Pesenti et al. (2019). SWIM provides a stressed version of a stochastic model, subject to model components (random variables) fulfilling given probabilistic constraints (stresses)....
Preprint
Full-text available
Current approaches to fair valuation in insurance often follow a two-step approach, combining quadratic hedging with application of a risk measure on the residual liability, to obtain a cost-of-capital margin. In such approaches, the preferences represented by the regulatory risk measure are not reflected in the hedging process. We address this iss...
Article
Full-text available
We welcome Etzion et al.’s (2019) effort to critically assess the role of cat models in insurance markets, by combining a sociology of finance lens with statistical analysis. Nonetheless, we believe there are two flaws in their analysis. First, their interpretation of the model-as-engine metaphor, as well as the way they test for this metaphor, is...
Preprint
Full-text available
The SWIM package implements a flexible sensitivity analysis framework, based primarily on results and tools developed by Pesenti et al. (2019). SWIM provides a stressed version of a stochastic model, subject to model components (random variables) fulfilling given probabilistic constraints (stresses). Possible stresses can be applied on moments, pro...
Experiment Findings
Full-text available
Briefing note describing how (re)insurers and university-based environmental scientists could best work together to access government (UKRI) funding in order to improve the flow of science into the insurance sector. This study used a session at the Oasis Loss Modelling Framework (LMF) conference on 14th Sept 2018 to collect data objectively documen...
Article
Full-text available
In countries globally there is intense political interest in fostering effective university–business collaborations, but there has been scant attention devoted to exactly how an individual scientist's workload (i.e. specified tasks) and incentive structures (i.e. assessment criteria) may act as a key barrier to this. To investigate this an original...
Article
Full-text available
Sensitivity analysis is an important component of model building, interpretation and validation. A model comprises a vector of random input factors, an aggregation function mapping input factors to a random output, and a (baseline) probability measure. A risk measure, such as Value-at-Risk and Expected Shortfall, maps the distribution of the output...
Preprint
Full-text available
Major (2018) discusses Euler/Aumann–Shapley allocations for non-linear positively homogeneous portfolios. For such portfolio structures, plausibly arising in the context of reinsurance, he defines a distortion-type risk measure that facilitates assessment of ceded and net losses with reference to gross portfolio outcomes. Subsequently, Major (2018)...
Article
Full-text available
In countries globally (e.g. UK, Australia) there is intense political interest in fostering effective university-business collaborations, but there has been scant attention devoted to exactly how individual scientists' workload (i.e. specified tasks) and incentive structures (i.e. assessment criteria) may act as a key barrier to this. To investigat...
Preprint
Full-text available
Major (2018) discusses Euler/Aumann-Shapley allocations for non-linear portfolios. He argues convincingly that many (re)insurance portfolios, while non-linear, are nevertheless positively homogeneous, owing to the way that deductibles and limits are typically set. For such non-linear but homogeneous portfolio structures, he proceeds with defining a...
Preprint
Full-text available
Sensitivity analysis is an important component of model building, interpretation and validation. A model comprises a vector of random input factors, an aggregation function mapping input factors to a random output, and a (baseline) probability measure. A risk measure, such as Value-at-Risk and Expected Shortfall, maps the distribution of the outpu...
Article
Full-text available
This paper presents latest thinking from the Institute and Faculty of Actuaries’ Model Risk Working Party and follows on from their Phase I work, Model Risk: Daring to Open the Black Box. This is a more practical paper and presents the contributors’ experiences of model risk gained from a wide range of financial and non-financial organisations wit...
Article
Full-text available
Existing risk capital allocation methods, such as the Euler rule, work under the explicit assumption that portfolios are formed as linear combinations of random loss/profit variables, with the firm being able to choose the portfolio weights. This assumption is unrealistic in an insurance context, where arbitrary scaling of risks is generally not po...
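For reference, the Euler rule mentioned here allocates, for aggregate loss S = Σ_i X_i and a differentiable, positively homogeneous risk measure ρ,

\[ K_i \;=\; \frac{\mathrm{d}}{\mathrm{d}h}\, \rho(S + h X_i)\,\Big|_{h=0}, \qquad \sum_i K_i = \rho(S), \]

with full allocation following from Euler's homogeneous-function theorem; the paper's criticism targets exactly the linear-scaling assumption behind this construction.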
Article
Full-text available
One of risk measures’ key purposes is to consistently rank and distinguish between different risk profiles. From a practical perspective, a risk measure should also be robust, that is, insensitive to small perturbations in input assumptions. It is known in the literature [14, 39] that strong assumptions on the risk measure’s ability to distinguish...
Article
Taming the beast of uncertainty has been the grand project to which actuaries have dedicated much of their energy and skill over at least the last 50 years – roughly the time since, in Hans Bühlmann's (1989) famous term, “Actuaries of the Second Kind” emerged.
Article
Full-text available
The required solvency capital for a financial portfolio is typically given by a tail risk measure such as value-at-risk. Estimating the value of that risk measure from a limited, often small, sample of data gives rise to potential errors in the selection of the statistical model and the estimation of its parameters. We propose to quantify the effec...
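An illustrative sketch of the problem being addressed (plain bootstrapping of a quantile estimator, not the paper's proposed method): with a small sample, the VaR estimate carries substantial sampling error.

```python
# Bootstrap the sampling error of a 99% VaR estimate from 200 losses.
import numpy as np

rng = np.random.default_rng(3)
sample = rng.lognormal(mean=0.0, sigma=1.0, size=200)  # small data set

def var_estimate(data, level=0.99):
    return np.quantile(data, level)

boot = np.array([
    var_estimate(rng.choice(sample, size=sample.size, replace=True))
    for _ in range(5_000)
])

print("point estimate:", round(var_estimate(sample), 2))
print("bootstrap s.e.:", round(boot.std(), 2))
```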
Article
In a quantitative model with uncertain inputs, the uncertainty of the output can be summarized by a risk measure. We propose a sensitivity analysis method based on derivatives of the output risk measure, in the direction of model inputs. This produces a global sensitivity measure, explicitly linking sensitivity and uncertainty analyses. We focus on...
Technical Report
Full-text available
With the increasing use of complex quantitative models in applications throughout the financial world, model risk has become a major concern. Such risk is generated by the potential inaccuracy and inappropriate use of models in business applications, which can lead to substantial financial losses and reputational damage. In this paper we deal with...
Article
In this paper, we study the extent to which any risk measure can lead to superadditive risk assessments, implying the potential for penalizing portfolio diversification. For this purpose we introduce the notion of extreme-aggregation risk measures. The extreme-aggregation measure characterizes the most superadditive behavior of a risk measure by yi...
Article
The notion of residual estimation risk is introduced in order to study the impact of parameter uncertainty on capital adequacy, for a given risk measure and capital estimation procedure. Residual estimation risk is derived by applying the risk measure on a portfolio consisting of a random loss and a capital estimator, reflecting the randomness inhe...
Article
The potential for superadditivity that a risk measure displays, is directly linked to the potential for penalizing portfolio diversification. In this paper, we study the extent to which any risk measure can lead to superadditive risk assessments. For this purpose we introduce the notion of extreme-aggregation risk measures. The extreme-aggregation...
Article
We consider capital allocation in a hierarchical corporate structure where stakeholders at two organizational levels (e.g. board members vs line managers) may have conflicting objectives, preferences, and beliefs about risk. Capital allocation is considered as the solution to an optimization problem whereby a quadratic deviation measure between ind...
Article
The required solvency capital for a financial portfolio is typically given by a tail risk measure such as Value-at-Risk. Estimating the value of that risk measure from a limited, often small, sample of data gives rise to potential errors in the selection of the statistical model and the estimation of its parameters. We propose to quantify the effec...
Article
Optimal risk transfers are derived within an insurance group consisting of two separate legal entities, operating under potentially different regulatory capital requirements and capital costs. Consistent with regulatory practice, capital requirements for each entity are computed by either a value-at-risk or an expected shortfall risk measure. The o...
Article
We use mean-variance hedging in discrete time, in order to value a terminal insurance liability. The prediction of the liability is decomposed into claims development results, that is, yearly deteriorations in its conditional expected value. We assume the existence of a tradeable derivative with binary pay-off, written on the claims development res...
Article
We propose a sensitivity measure for risk models that is defined as the derivative of an output risk measure, in the direction of input risk factors. This produces a global sensitivity measure and provides an explicit link between sensitivity and uncertainty analyses. We focus on the case of distortion risk measures, defined as weighted averages of...
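For orientation, a distortion risk measure in the weighted-quantile form referred to here is

\[ \rho_\gamma(X) \;=\; \int_0^1 F_X^{-1}(u)\,\gamma(u)\,\mathrm{d}u, \qquad \gamma \ge 0, \quad \int_0^1 \gamma(u)\,\mathrm{d}u = 1, \]

and, under regularity conditions (stated here informally), the directional derivative in the direction of a factor Z takes the explicit form E[Z γ(F_X(X))], which is the kind of closed-form sensitivity this approach works with.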
Article
We introduce a notion of residual estimation risk to measure the impact of parameter uncertainty on capital adequacy, when capital is calculated by a risk measure. The residual estimation risk reflects the volatility of the capital estimator and is positive for standard estimation procedures. For distributions in location-scale families, capital es...
Article
I present a brief practitioner-focused discussion of model error in the context of insurance solvency regulation. Particular focus is on the impact of model error on the interaction of regulators and regulated firms.
Article
In dynamic risk measurement the problem emerges of assessing the risk of a financial position at different times. Sufficient conditions are provided for conditional coherent risk measures, in order that the requirements of acceptance-, rejection- and sequential consistency are satisfied. It is shown that these conditions are often violated for stan...
Article
Full-text available
This article develops a unifying framework for allocating the aggregate capital of a financial firm to its business units. The approach relies on an optimization argument, requiring that the weighted sum of measures for the deviations of the business unit's losses from their respective allocated capitals be minimized. The approach is fair...
Article
Optimal risk transfers are derived within an insurance group consisting of two separate legal entities, operating under potentially different regulatory capital requirements and capital costs. Consistent with regulatory practice, capital requirements for each entity are computed by either a Value-at-Risk or an Expected Shortfall risk measure. The...
Article
The Generalized Pareto Distribution (GPD) is a widely used model in risk management, enabling the calculation of high percentiles (Values-at-Risk). The GPD is typically fitted to excesses over a high threshold. In this simulation study, we examine the performance of the GPD model with threshold selected adaptively by the Gerstengarbe-Werner (GW) me...
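For context, the standard peaks-over-threshold workflow being evaluated looks roughly as follows (this sketch fixes the threshold at a sample quantile; the adaptive Gerstengarbe-Werner selection studied in the paper is not implemented here):

```python
# Fit a GPD to excesses over a threshold and estimate a high VaR.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(4)
losses = rng.pareto(a=2.5, size=5_000) + 1.0   # heavy-tailed toy data

u = np.quantile(losses, 0.90)                   # threshold choice
excesses = losses[losses > u] - u
xi, _, sigma = genpareto.fit(excesses, floc=0)  # shape and scale

p = 0.999
n, n_u = losses.size, excesses.size
var_p = u + sigma / xi * ((n / n_u * (1 - p)) ** (-xi) - 1.0)
print("estimated VaR at 99.9%:", round(var_p, 2))
```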
Article
In dynamic risk measurement the problem emerges of assessing the risk of a financial position at different times. Sufficient conditions are provided for conditional coherent risk measures, in order that the requirements of acceptance, rejection and sequential consistency are satisfied. It is shown that these conditions are often violated for standa...
Article
Actuaries are often faced with the task of estimating tails of loss distributions from just a few observations. Thus estimates of tail probabilities (reinsurance prices) and percentiles (solvency capital requirements) are typically subject to substantial parameter uncertainty. We study the bias and MSE of estimators of tail probabilities and percen...
Article
For solvency purposes insurance companies need to calculate so-called best-estimate reserves for outstanding loss liability cash flows and a corresponding risk margin for non-hedgeable insurance-technical risks in these cashflows. In actuarial practice, the calculation of the risk margin is often not based on a sound model but various simplified me...
Article
For solvency purposes insurance companies need to calculate so-called best-estimate reserves for outstanding loss liability cash flows and a corresponding risk margin for non-hedgeable insurance-technical risks in these cash flows. In actuarial practice, the calculation of the risk margin is often not based on a sound model but various simplified m...
Article
In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain...
Article
Full-text available
This paper develops a unifying framework for allocating the aggregate capital of a financial firm to its business units. The approach relies on an optimisation argument, requiring that the weighted sum of measures for the deviations of the business unit’s losses from their respective allocated capitals be minimised. This enables the association of...
Chapter
This contribution relates to the use of risk measures for determining (re)insurers' economic capital requirements. Alternative sets of properties of risk measures are discussed. Furthermore, methods for constructing risk measures via indifference arguments, representation results, and reweighting of probability distributions are presented. How thes...
Article
A distortion-type risk measure is constructed, which evaluates the risk of any uncertain position in the context of a portfolio that contains that position and a fixed background risk. The risk measure can also be used to assess the performance of individual risks within a portfolio, allowing for the portfolio’s re-balancing, an area where standard...
Article
Convex risk measures were introduced by Deprez and Gerber [Deprez, O., Gerber, H.U., 1985. On convex principles of premium calculation. Insurance: Math. Econom. 4 (3), 179–189]. Here the problem of allocating risk capital to subportfolios is addressed, when convex risk measures are used. The Aumann–Shapley value is proposed as an appropriate alloca...
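The allocation concept proposed here can be stated compactly (a sketch; regularity conditions are in the paper): writing ρ(λ) for the risk of the weighted portfolio Σ_i λ_i X_i, the Aumann–Shapley allocation per unit of line i averages marginal contributions along the diagonal of the portfolio,

\[ K_i \;=\; \int_0^1 \frac{\partial \rho}{\partial \lambda_i}(\gamma \mathbf{1})\,\mathrm{d}\gamma, \]

and reduces to the Euler/gradient allocation when ρ is positively homogeneous.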
Article
An exchange economy is considered, where agents (insurers/banks) trade risks. Decision making takes place under distorted probabilities, which are used to represent either rank-dependence of preferences or ambiguity with respect to real-world probabilities. Pricing formulas and risk allocations, generalising the results of Bühlmann (1980, 1984) are...
Article
This contribution relates to the use of risk measures for determining (re)insurers' economic capital requirements. Alternative sets of properties of risk measures are discussed. Furthermore, methods for constructing risk measures via indifference arguments, representation results and re-weighting of probability distributions are presented. It is sh...
Article
A family of multivariate distributions, based on asymmetric normal mixtures, is introduced in order to model the dependence among insurance and financial risks. The model allows for straightforward parameterisation via a correlation matrix and enables the modelling of radially asymmetric dependence structures, which are often of interest in risk m...
Article
An exchange economy is considered, where agents (insurers/banks) trade risks. Decision making takes place under distorted probabilities, which are used to represent either rank-dependence of preferences or ambiguity with respect to real-world probabilities. Pricing formulas and risk allocations, generalising the results of Bühlmann (1980, 1984) are...
Article
It is shown that for elliptically distributed bivariate random vectors, the riskiness and dependence strength of random portfolios, in the sense of the univariate convex and bivariate concordance stochastic orders respectively, can be simply characterised in terms of the vector's Σ-matrix.
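In symbols, paraphrasing the abstract: for a bivariate elliptical vector X ~ E_2(μ, Σ, ψ) and portfolio weight vectors a, b with equal means a^⊤μ = b^⊤μ, the characterization reads

\[ a^{\top} X \;\preceq_{\mathrm{cx}}\; b^{\top} X \quad \Longleftrightarrow \quad a^{\top} \Sigma\, a \;\le\; b^{\top} \Sigma\, b, \]

with the dependence strength between the two components ordered analogously by the off-diagonal entry of Σ (after standardising by the diagonal).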
Article
The theory and practice of risk measurement provides a point of intersection between risk management, economic theories of choice under risk, financial economics, and actuarial pricing theory. This article provides a review of these interrelationships, from the perspective of an insurance company seeking to price the risks that it underwrites. We e...
Conference Paper
Stochastic generation is expected to take a large share of the energy production in future power systems. Two basic features of this type of generation distinguish it from the traditional centralized, conventional generation: it is highly distributed (large number of small-scale generators) and non-dispatchable (use of an uncontrolled prime mover)....
Conference Paper
A methodology for the modelling and analysis of horizontally-operated power systems, i.e. systems with a high penetration of stochastic renewable generation, is presented. The objective is to obtain insight into the steady state of the transmission system when a high penetration level of stochastic distributed generation (in this study case wind powe...
Conference Paper
A probabilistic methodology for the planning and reliability evaluation of distributed power systems is introduced. The core of the technique is the steady state uncertainty analysis based on Monte Carlo Simulation. In particular, the extreme loading scenarios for the HV/MV transformer link are investigated, by the application of the Stochastic Bou...
Article
Full-text available
Stochastic orderings of elliptically distributed random vectors are studied. It is shown that for random variables in the same elliptical family, riskiness, in the sense of the stop-loss and convex orders, can be simply characterised by the ordering of variances. On the other hand, the strength of dependence between the elements of a bivariate el...
Conference Paper
In this paper a design methodology for 'distributed' energy systems is presented. These are defined as energy systems with unregulated distributed generators connected to the lower voltage levels. The cornerstone in their design is the steady-state analysis of distribution systems under uncertainty in energy generation and consumption. Bas...
Conference Paper
In this paper, a Monte Carlo based design methodology is presented, for the analysis of wind energy distributed power systems. These are systems with a large-scale penetration of wind turbines connected at the lower voltage levels. In the design algorithm, the uncertainty related to the wind power production and energy consumption is modeled, based...
Article
Tsanakas and Barnett [Insurance: Mathematics and Economics 33 (2003) 239] employed concepts from cooperative game theory [Aumann and Shapley, Values of Non-Atomic Games. Princeton University Press, Princeton] for the allocation of risk capital to portfolios of pooled liabilities, when distortion risk measures [Insurance: Mathematics and Economics 2...
Article
The Aumann–Shapley [Values of Non-atomic Games, Princeton University Press, Princeton] value, originating in cooperative game theory, is used for the allocation of risk capital to portfolios of pooled liabilities, as proposed by Denault [Coherent allocation of risk capital, J. Risk 4 (1) (2001) 1]. We obtain an explicit formula for the Aumann–Shapl...
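The explicit formula in question can be sketched as follows (exact integrability and continuity conditions are in the paper): for a distortion risk measure with differentiable distortion g and pooled liability Z = Σ_i X_i with continuous distribution, the Aumann–Shapley allocation to participant i is

\[ K_i \;=\; \mathbb{E}\!\left[\, X_i \, g'\!\big(\bar F_Z(Z)\big) \right], \qquad \bar F_Z = 1 - F_Z, \]

so each liability is weighted by the distortion's slope at the pool's survival probability, and the allocations sum to ρ(Z).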
Article
We discuss classes of risk measures in terms both of their axiomatic definitions and of the economic theories of choice that they can be derived from. More specifically, expected utility theory gives rise to the exponential premium principle, proposed by Gerber (1974), Dhaene et al. (2003), whereas Yaari's (1987) dual theory of risk can be viewed a...
Article
The Aumann-Shapley value from non-atomic (fuzzy) cooperative game theory ([3], [2]) has been employed as a solution concept for cost allocation problems [4], as well as interpreted in the context of equilibrium in a monopolistic production economy [10]. As in [7] we are concerned with the application of the Aumann-Shapley value to the allocation o...
Article
A distortion-type risk measure is constructed, which evaluates the risk of a position in relation to the portfolio that the position belongs to. This allows a threefold interpretation: It can be viewed (i) as a mechanism for determining sensitivities of risk capital allocation and performance measurement to the re-balancing of portfolios; (ii) as a...
Article
Financial firms, such as insurance companies, face uncertainty about future losses from their business activity. Therefore they are forced by regulators to hold safely invested capital, such that the probability of insolvency (losses exceeding capital) is at an acceptably low level. Capital is estimated on the basis of a random sample, often of sma...
