Article

Reward-Risk Ratios


Abstract

We introduce three new families of reward-risk ratios, study their properties and compare them to existing examples. All ratios in the three families are monotone and quasi-concave, which means that they prefer more to less and encourage diversification. Members of the second family are also scale invariant. The third family is a subset of the second one, and all its members only depend on the distribution of a return.
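A minimal sketch of such a ratio, assuming a reward measure given by the sample mean and a risk measure given by expected shortfall (the families in the paper are more general; function names and data here are illustrative). Scale invariance, the property of the second family, means the ratio is unchanged when the position is rescaled by a positive constant:

```python
from statistics import mean

def expected_shortfall(returns, alpha=0.05):
    """Average of the worst alpha-fraction of outcomes, sign-flipped to a loss."""
    worst = sorted(returns)[:max(1, int(len(returns) * alpha))]
    return -mean(worst)

def reward_risk_ratio(returns, alpha=0.05):
    """Reward (sample mean) divided by risk (expected shortfall)."""
    return mean(returns) / expected_shortfall(returns, alpha)

returns = [0.02, -0.01, 0.03, -0.04, 0.01, 0.02, -0.02, 0.05, 0.00, -0.03]
r1 = reward_risk_ratio(returns)
r2 = reward_risk_ratio([2 * x for x in returns])  # the same position, doubled
# r1 == r2: the ratio is invariant under positive rescaling
```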

No full-text available

Request Full-text Paper PDF

To read the full-text of this research,
you can request a copy directly from the authors.

... In another setting, where we restrict our performance measure to portfolios that exclude arbitrage opportunities and irrational behavior, we are able to work with a stronger formulation of suitability and to derive an existence and uniqueness result in this setting. In the following we will work with classes of reward-risk ratios, which were recently introduced in Cheridito and Kromer (2013). These are generalizations of mean-risk ratios from Tasche (2004). ...
... In contrast to Tasche (2004), we will consider reward-risk ratios that were recently introduced in Cheridito and Kromer (2013) as measures of performance. This leads to the next section. ...
... With different choices for ϕ we can model different attitudes of agents towards rewards. For additional information about RRRs induced by coherent or convex risk measures in conjunction with these reward measures, we refer to Cheridito and Kromer (2013). Most important for us in this work are the properties (S) and (P) and the existence of interesting classes of reward measures with these properties. ...
Article
Capital allocation principles are used in various contexts in which a risk capital or a cost of an aggregate position has to be allocated among its constituent parts. We study capital allocation principles in a performance measurement framework. We introduce the notion of suitability of allocations for performance measurement and show, under different assumptions on the involved reward and risk measures, that there exist suitable allocation methods. The existence of certain suitable allocation principles is generally guaranteed only under rather strict assumptions on the underlying risk measure. Therefore we show, with a reformulated definition of suitability and in a slightly modified setting, that there is a known suitable allocation principle that does not require any properties of the underlying risk measure. Additionally, we extend a previous characterization result from the literature from a mean-risk to a reward-risk setting. Formulations of this theory are also possible in a game theoretic setting.
... This paper investigates a class of distributionally robust optimization problems DRR which maximizes a fractional function representing a reward-risk ratio and in which the ambiguity set includes the true probability distribution with a large specified probability. This class of problems has immediate applications in financial optimization, in which asset returns are random variables whose probability distribution is difficult to elicit, and investment strategies are frequently determined by using risk-adjusted return metrics (see, e.g., Bacon (2012), Cheridito and Kromer (2013)). ...
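The cited works study distributionally robust versions of this problem; the sketch below solves only the plain, non-robust case, maximizing a mean-over-expected-shortfall ratio over the weight of a two-asset portfolio by grid search. All names and return scenarios are made up for illustration:

```python
from statistics import mean

# made-up return scenarios for two assets
asset_a = [0.04, -0.02, 0.03, -0.05, 0.06]
asset_b = [0.01, 0.02, -0.01, -0.01, 0.01]

def ratio(returns, alpha=0.2):
    """Reward-risk ratio: sample mean over expected shortfall at level alpha."""
    worst = sorted(returns)[:max(1, int(len(returns) * alpha))]
    return mean(returns) / -mean(worst)

def portfolio(w):
    """Scenario returns of the portfolio holding weight w in asset_a."""
    return [w * a + (1 - w) * b for a, b in zip(asset_a, asset_b)]

# grid search over the weight of asset_a
best_w = max((w / 100 for w in range(101)), key=lambda w: ratio(portfolio(w)))
```

A robust variant would replace `ratio` by its worst case over an ambiguity set of distributions, which is what makes the cited problem class hard.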
... A flurry of performance ratio measures, such as the Sortino (Sortino and Van Der Meer 1991), Sortino-Satchel (Sortino and Satchel 2001), or Omega (Keating and Shadwick 2002) ratios, have been introduced since. Bacon (2012) and Cheridito and Kromer (2013) provide in-depth analyses of ratio performance measures. ...
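As a rough illustration of two of the ratios named in this snippet (definitions vary across the literature; the target rate of zero and the sample data are arbitrary choices here):

```python
import math
from statistics import mean

returns = [0.02, -0.01, 0.03, -0.04, 0.01, 0.02, -0.02, 0.05, 0.00, -0.03]

def sortino(returns, target=0.0):
    """Mean excess return over downside deviation below the target."""
    downside = [min(0.0, r - target) ** 2 for r in returns]
    return (mean(returns) - target) / math.sqrt(mean(downside))

def omega(returns, target=0.0):
    """Total gain above the target divided by total loss below it."""
    gains = sum(max(0.0, r - target) for r in returns)
    losses = sum(max(0.0, target - r) for r in returns)
    return gains / losses
```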
... On the other hand, although sparse data and sparsity have attracted a lot of attention in recent years, concentration, scaling, homogeneous growth and replication were originally applied in 1920 by [2] in a financial setting to measure the inequity of wealth distribution; [18] added bounds for dispersion functions; [30] [33] added symmetry and continuity for sparsity and fairness functions, respectively; and [38] added quasi-convexity for reward-risk ratio functions. ...
... S. Quasi-convexity prefers extremes to averages. [38] H. Quasi-concavity encourages diversification. [13] Lemma 2.2: II-A1, II-A2, II-A3 and II-A4 imply II-B2. ...
Article
Sparsity and entropy are pillar notions of modern theories in signal processing and information theory. However, there is no clear consensus among scientists on the characterization of these notions. Previous efforts have contributed to the understanding of sparsity or entropy individually, from specific research interests. This paper proposes a mathematical formalism, a joint axiomatic characterization, which contributes to a comprehension of (the beauty of) sparsity and entropy. The paper gathers and introduces inherent and first-principles criteria as axioms and attributes that jointly characterize sparsity and entropy. The proposed set of axioms is constructive and allows one to derive simple or "core" functions and further generalizations. Core sparsity generalizes the Hoyer measure, Gini index and pq-means. Core entropy generalizes the Rényi entropy and Tsallis entropy, both of which generalize Shannon entropy. Finally, core functions are successfully applied to compressed sensing and to minimum entropy given sample moments. More importantly, the (simplest) core sparsity adds theoretical support to the ℓ1-minimization approach in compressed sensing.
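For concreteness, minimal implementations of two of the sparsity measures this abstract mentions, the Hoyer measure and the Gini index. These follow common textbook definitions, not necessarily the paper's exact normalizations:

```python
import math

def hoyer(x):
    """Hoyer sparsity: 0 for a flat vector, 1 for a 1-sparse vector."""
    n = len(x)
    l1 = sum(abs(v) for v in x)
    l2 = math.sqrt(sum(v * v for v in x))
    return (math.sqrt(n) - l1 / l2) / (math.sqrt(n) - 1)

def gini(x):
    """Gini index of |x|: 0 for a flat vector, 1 - 1/n for a 1-sparse one."""
    c = sorted(abs(v) for v in x)  # ascending order of magnitudes
    n, l1 = len(c), sum(c)
    return 1 - 2 * sum(v / l1 * (n - k - 0.5) / n for k, v in enumerate(c))
```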
... The distinctive property of performance measures is scale invariance: a rescaled portfolio or cashflow is accepted at the same level. Performance and acceptability indices were studied in [26,13,9,18,11], and they are meant to provide an assessment of how good a financial position is. In particular, [18] gives examples of performance indices that are not acceptability indices. ...
... It needs to be noted that the theory developed in this paper can also be applied to sub-scale invariant dynamic performance indices studied in [55,12]. ...
Article
In this paper we provide a unified and flexible framework for the study of the time consistency of risk and performance measures. The proposed framework integrates existing forms of time consistency as well as various connections between them. In our approach, time consistency is studied for a large class of maps that are postulated to satisfy only two properties -- monotonicity and locality. This makes our framework fairly general. Time consistency is defined in terms of an update rule -- a novel notion introduced in this paper. We design various update rules that allow one to recover several known forms of time consistency, and to study some new forms of time consistency.
... As shown by proponents of behavioral finance [2,3,5-7,17], not all investors are nonsatiable and risk-averse. Thus, it is important to classify the optimal choices for any admissible ordering of preferences. ...
... Several reward-risk ratios can be used in this framework (see, among others, [17]). In this empirical analysis we use the Rachev ratio (see [30]). ...
Article
In this paper, we examine three portfolio-type problems where investors rank their choices considering each of the following: (1) risk, (2) uncertainty, and (3) the distance from a benchmark. For each problem, we analyze possible orderings for the choices and we propose several admissible portfolio optimization problems. Thus, we discuss the properties of several risk measures, uncertainty measures and tracking error measures, and their consistency with investor choices. Furthermore, we propose several linearizable allocation problems consistent with a given ordering and demonstrate how many portfolio selection problems proposed in the literature can be solved. The purpose of this paper is twofold. First, we show how to use the connection between ordering theory and the theory of probability functionals in portfolio selection problems. Second, we discuss the computational complexity of selection problems consistent with the preferences of investors. With these purposes in mind, we review several single-period portfolio problems proposed in the literature, emphasizing those which are computationally simple (portfolio problems that can be reduced at least to linear programming problems).
... The value at risk for the initial margins IM_k, the overall risk measure R, as well as the optimal capital allocation are all positively homogeneous. It follows that RC_k(λX) = RC_k(X) for every λ > 0, that is, the relative risk contribution is scale invariant, as are, for instance, the Sharpe ratio, Minmax ratio or Gini ratio, among others; see Cheridito and Kromer (2013). The scale invariance property allows one to consider the allocation independently of the total size of the default fund. ...
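The mechanism can be sketched with the standard-deviation risk measure swapped in for simplicity (the snippet above works with value at risk): Euler contributions are positively homogeneous, so relative contributions do not change when every position is scaled by the same λ > 0. The member data here are invented for illustration:

```python
def risk_contributions(positions):
    """Relative Euler contributions under the standard-deviation risk measure.

    positions: one P&L scenario list per member, all of equal length.
    Returns Cov(X_k, S) / Var(S) for each member k; the values sum to 1.
    """
    total = [sum(col) for col in zip(*positions)]  # aggregate P&L, S

    def cov(a, b):
        ma, mb = sum(a) / len(a), sum(b) / len(b)
        return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / len(a)

    var = cov(total, total)
    return [cov(p, total) / var for p in positions]

members = [[1.0, 2.0, -1.0], [0.5, -0.5, 1.5]]
rc = risk_contributions(members)
scaled = risk_contributions([[2 * x for x in p] for p in members])  # λ = 2
# rc == scaled: relative contributions ignore the total size of the fund
```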
Thesis
This thesis deals with various issues related to collateral management in the context of centralized trading through central clearing houses. First, we present the notions of cost of capital and funding cost for a bank, placing them in an elementary Black–Scholes framework where the payoff of a standard call is used as the exposure at default of a counterparty. It is assumed that the bank cannot perfectly hedge this call and faces a funding cost higher than the risk-free rate, hence pricing corrections of the FVA and KVA type appear on top of the Black–Scholes price. Then, we look at the different costs that a bank has to face when trading in the CCP context. To this end, we transpose the well-known XVA analysis framework from the bilateral trading world to the central clearing one. The total cost for a member trading through a CCP is thus decomposed into a CVA corresponding to the cost for the member of reimbursing its contribution to the guarantee fund in the event of losses due to the defaults of other members, an MVA, which is the cost of financing its initial margin, and a KVA corresponding to the cost of the capital put at risk by the member in the form of its contribution to the guarantee fund. Afterwards, we question the regulatory assumptions used previously, focusing on alternatives in which members would borrow their initial margin from a third party who would post the margin on the member's behalf in exchange for remuneration. We also consider a method of computing the guarantee fund and its allocation that takes into account the risk of the CCP, in the sense of the fluctuations of its P&L over the following year, as it results from the market risk and the counterparty risk of the members. Finally, we propose the application of multivariate risk measure methodologies for the computation of margins and/or the CCP guarantee fund.
We introduce a notion of systemic risk measures, in the sense that they are sensitive not only to the marginal risks of the components of a financial system (for example, but not necessarily, the positions of the members of a CCP) but also to the dependence between those components.
... Some authors treat acceptability indices as the special subclass of performance measures that satisfy the quasi-concavity axiom. In particular, [CK13] gives examples of performance indices that are not quasi-concave. Nevertheless, in this paper we have decided to use these two names interchangeably. ...
Article
In this paper, we provide a flexible framework allowing for a unified study of time consistency of risk measures and performance measures (also known as acceptability indices). The proposed framework not only integrates existing forms of time consistency, but also provides a comprehensive toolbox for analysis and synthesis of the concept of time consistency in decision making. In particular, it allows for in-depth comparative analysis of (most of) the existing types of time consistency, a feat that has not been possible before and which is carried out in the companion paper by the authors. In our approach, time consistency is studied for a large class of maps that are postulated to satisfy only two properties - monotonicity and locality. We call these maps LM-measures. Time consistency is defined in terms of an update rule. The form of the update rule introduced here is novel, and is perfectly suited for developing the unifying framework that is worked out in this paper. As an illustration of the applicability of our approach, we show how to recover almost all concepts of weak time consistency by means of constructing appropriate update rules.
Article
This is a study of decision problems under two-dimensional risk. We use an existing index of absolute correlation aversion to conveniently classify bivariate preferences, with respect to attitudes toward this risk. This classification seems to be more important than whether decision makers are correlation-averse or correlation-seeking for the study of insurance demand when a loss has a multidimensional impact. On this note, we also re-examine Mossin’s theorem under bivariate preferences, where full insurance is preferred with a fair premium, while less than full coverage is preferred with a proportional premium loading. Furthermore, based on the comparative statics of this two-dimensional insurance model for changes in correlation aversion, we derive testable implications about the classification of bivariate utility functions. For the particular case when the two-dimensional risk can be interpreted as risk on income and health, we identify the form of separable utility functions depending on health status and income that is consistent with household disability insurance decisions.
Article
The aim of this paper is to study the optimal investment problem by using coherent acceptability indices (CAIs) as a tool to measure portfolio performance. We call this problem the acceptability maximization. First, we study the one-period (static) case, and propose a numerical algorithm that approximates the original problem by a sequence of risk minimization problems. The results are applied to several important CAIs, such as the gain-to-loss ratio, the risk-adjusted return on capital and the tail-value-at-risk based CAI. In the second part of the paper we investigate the acceptability maximization in a discrete time dynamic setup. Using robust representations of CAIs in terms of a family of dynamic coherent risk measures (DCRMs), we establish an intriguing dichotomy: if the corresponding family of DCRMs is recursive (i.e. strongly time consistent) and assuming some recursive structure of the market model, then the acceptability maximization problem reduces to just a one-period problem and the maximal acceptability is constant across all states and times. On the other hand, if the family of DCRMs is not recursive, which is often the case, then the acceptability maximization problem ordinarily is a time-inconsistent stochastic control problem, similar to the classical mean-variance criteria. To overcome this form of time-inconsistency, we adapt to our setup the set-valued Bellman principle recently proposed in [23], applied to two particular dynamic CAIs - the dynamic risk-adjusted return on capital and the dynamic gain-to-loss ratio. The obtained theoretical results are illustrated via numerical examples that include, in particular, the computation of the intermediate mean-risk efficient frontiers.
Article
This research is an extension of our previous work [Debnath and Srivastava (2021)]. In that paper, we designed a portfolio based on data taken from the National Stock Exchange (NSE), India, during 1 January 2020 to 31 December 2020, and the performance of that portfolio in a real-life situation was examined during 1 January 2021 to 21 May 2021, assuming investments were made according to the proposed model. We observed that our proposed portfolio was efficient enough in that period to beat the performance of most of the in-demand mutual funds. It was also conjectured that this portfolio would be sustainable after the second wave of COVID-19 in India. In the present paper, our aim is to validate this conjecture. Here, we examine the performance of this portfolio during the period 1 January 2021 to 18 October 2021 using the same data set as before. We also investigate the performance of this portfolio if it was blindly adopted, without applying the stock selection methodology, during 1 January 2019 to 31 December 2019. Using a paired t-test on the difference between the mean performances in 2019 and 2021, we show that the performance in 2021 was significantly enhanced by selecting the stocks according to our proposed model.
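The testing step can be sketched with the standard paired t statistic; the performance figures below are placeholders, not the paper's NSE data:

```python
import math
from statistics import mean, stdev

def paired_t_stat(sample_1, sample_2):
    """t statistic of a paired t-test on the per-period differences."""
    d = [x - y for x, y in zip(sample_1, sample_2)]
    return mean(d) / (stdev(d) / math.sqrt(len(d)))

# placeholder performance figures for matched periods in the two years
perf_2021 = [3.0, 4.0, 5.0, 4.0, 4.0]
perf_2019 = [2.0, 2.0, 2.0, 2.0, 2.0]
t = paired_t_stat(perf_2021, perf_2019)
# significance follows by comparing |t| with the critical value of
# Student's t distribution with len(d) - 1 degrees of freedom
```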