Article

Stress Testing in a Value at Risk Framework

Authors:
Paul H. Kupiec

Abstract

This article proposes a methodology that can be used to parameterize stress test scenarios using the conditional probability distributions that are commonly employed in daily VaR calculations. This approach allows for a complete characterization of the value-change distribution of a portfolio in a stress scenario. Statistical evidence demonstrates that the proposed loss exposure measure is substantially more accurate than the stress exposure measures that financial institutions commonly use. The results also suggest, contrary to popular perception, that historical VaR risk-factor covariances and the assumption of conditional normality can be used to construct reasonably accurate loss exposure estimates in many stressful market environments.
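A minimal numerical sketch of the conditioning idea described in the abstract, assuming jointly normal risk-factor returns and a linear portfolio (the covariances, positions, scenario and the 99% level below are all hypothetical): the "core" factors are pinned to their stress values, the remaining factors follow their conditional normal distribution, and the scenario's full P&L distribution and loss quantile then follow in closed form.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical daily covariance matrix for three risk-factor returns
vols = np.array([0.02, 0.01, 0.015])
corr = np.array([[1.0, 0.5, 0.3],
                 [0.5, 1.0, 0.4],
                 [0.3, 0.4, 1.0]])
cov = np.outer(vols, vols) * corr

core = [0]                   # indices of the stressed ("core") factors
peri = [1, 2]                # remaining ("peripheral") factors
stress = np.array([-0.10])   # core scenario: factor 0 falls 10%

# Conditional distribution of the peripheral factors given the core move
S11 = cov[np.ix_(core, core)]
S12 = cov[np.ix_(core, peri)]
S22 = cov[np.ix_(peri, peri)]
beta = S12.T @ np.linalg.inv(S11)   # regression of peripheral on core factors
mu_cond = beta @ stress             # conditional mean of peripheral factors
cov_cond = S22 - beta @ S12         # conditional covariance

# Linear portfolio exposures (hypothetical, in currency units per unit return)
w = np.array([1e6, 5e5, 7e5])
pnl_mean = w[core] @ stress + w[peri] @ mu_cond
pnl_std = np.sqrt(w[peri] @ cov_cond @ w[peri])

# 99% loss quantile of the scenario P&L distribution (a "stress loss exposure")
loss_99 = -(pnl_mean + pnl_std * norm.ppf(0.01))
print(f"conditional mean P&L:      {pnl_mean:,.0f}")
print(f"99% stress loss exposure:  {loss_99:,.0f}")
```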


... initially proposed by Kupiec (1998), exhibited in equation 2.7. Kupiec (1998) prescribes only the last term, without the multiplier (3+k) and an arbitrary number N in place of the fixed 60. ...
... to at least two other risk factors: interest rates and the exchange coupon. The results confirm the effectiveness of the Stressed VaR approach from Kupiec (1998) adapted to a two-volatility regime-switching model; however, they do not shed light on the BIS Stressed VaR, in which the simultaneous use of high- and (not very) low-volatility parcels can be seen as an over-specification. The next tables (2.7; 2.8) show that few violations occurred when using a maximum historical daily VIX volatility in the VaR for currency swaps: only two days when referring to the overall maximum VIX, or only 9 days when referring to the maximum VIX up to the analyzed period. ...
Conference Paper
Full-text available
Abstract: We analyze whether two of the main recommendations, capital requirements and the use of Stressed VaR, would have mitigated the effects of the Brazilian pre-election crisis of 2002 had they already been implemented. We innovate in three respects: using the VIX as a volatility alternative (proxy) for stress scenarios when no historical data are available; modeling financial time series with SWGARCH and alpha-stable innovations (following Broda et al., 2013); and analyzing market risk with two approaches simultaneously: the Early Warning approach of the International Monetary Fund (IMF) and the capital requirements of the BIS.
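A toy illustration of the VIX-as-volatility-proxy idea mentioned above (this is not the paper's SWGARCH model; the position size, current volatility and VIX level are hypothetical): the daily 99% VaR of a position is recomputed with the volatility replaced by a historical VIX maximum crudely rescaled to a daily figure.

```python
import numpy as np
from scipy.stats import norm

position = 10e6            # hypothetical notional exposure, in currency units
z99 = norm.ppf(0.99)

sigma_daily = 0.012        # hypothetical current daily volatility estimate
vix_max = 80.0             # hypothetical historical maximum of the VIX (annualized, in %)
sigma_stress = vix_max / 100.0 / np.sqrt(252)   # crude conversion to a daily volatility

var_normal = position * z99 * sigma_daily
var_stress = position * z99 * sigma_stress
print(f"daily 99% VaR, current volatility: {var_normal:,.0f}")
print(f"daily 99% VaR, VIX-scaled stress:  {var_stress:,.0f}")
```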
... In our numerical verification, the correlation matrices to be approximated are calculated from sample data from the Shenzhen Stock Exchange and the Shanghai Stock Exchange in China. For the constraints of (6.1), we note that particular restrictions may be associated with a historical stressful event (such as the 1987 stock market crash or the 2008 economic crisis), or may be a set of hypothetical changes related to some possible future stressful market event [13]. Generally, accurately identifying the set of stress events and restricting the correlation coefficients between stress events and other underlying events [13] is very difficult. ...
Article
Full-text available
In this paper, we are concerned with efficient algorithms for solving the least squares semidefinite programming problem, which contains many equality and inequality constraints. Our proposed method is built upon its dual formulation and is a type of active-set approach. In particular, by exploiting the nonnegative constraints in the dual form, our method first uses information from the Barzilai–Borwein step to estimate the active/inactive sets, and within an adaptive framework, it then accelerates convergence by dynamically switching between the L-BFGS iteration and the semi-smooth Newton iteration. We show global convergence under mild conditions and, furthermore, local quadratic convergence under an additional nondegeneracy condition. Various types of synthetic as well as real-world examples are tested, and preliminary but promising numerical experiments are reported.
... For this purpose, practitioners in the financial industry usually seek an approximation of the restricted correlation matrix. In our experiment, the correlation matrices to be approximated are calculated from sample data from the Shenzhen Stock Exchange and the Shanghai Stock Exchange. We remark that the specified restrictions may correspond to a historical stressful event (such as the 1987 stock market crash or the 2008 economic crisis) or may be a set of hypothetical changes corresponding to some possible future stressful market event (Kupiec 1998; Han et al. 2017). But identifying the stress-event set and restricting the correlation coefficients between stress events and other underlying events is a rather complicated and specialized job (Kupiec 1998). ...
... We remark that the specified restrictions may correspond to a historical stressful event (such as the 1987 stock market crash or the 2008 economic crisis) or may be a set of hypothetical changes corresponding to some possible future stressful market event (Kupiec 1998; Han et al. 2017). But identifying the stress-event set and restricting the correlation coefficients between stress events and other underlying events is a rather complicated and specialized job (Kupiec 1998). For simplicity, we therefore randomly generate the constraints and their positions to imitate stress-testing scenarios. ...
Article
This paper proposes an L-BFGS algorithm based on the active-set technique to solve the matrix approximation problem: given a symmetric matrix, find a nearest approximation matrix in the sense of the Frobenius norm that satisfies some linear equality, inequality and positive semidefinite constraints. The problem is a convex optimization problem whose dual is a nonlinear convex optimization problem with non-negative constraints. Under the Slater constraint qualification, it has zero duality gap with the dual problem. To handle the large-scale dual problem, we make use of the active-set technique to estimate the active constraints, and the L-BFGS method is then used to accelerate the free variables. The global convergence of the proposed algorithm is established under certain conditions. Finally, we conduct some preliminary numerical experiments and compare the L-BFGS method with the inexact smoothing Newton method, the projected BFGS method, the alternating direction method and the two-metric projection method based on the L-BFGS. The numerical results show that our algorithm has some advantages in terms of CPU time when a large number of linear constraints are involved.
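The two papers above study constrained nearest-correlation-matrix problems with specialized solvers. As a much simpler baseline (not the active-set/L-BFGS methods described above), the sketch below repairs a stressed, indefinite correlation matrix with Higham-style alternating projections onto the positive semidefinite cone and the unit-diagonal set; the input matrix is hypothetical.

```python
import numpy as np

def nearest_correlation(A, n_iter=200):
    """Approximate nearest correlation matrix via alternating projections
    with Dykstra's correction (a simple baseline, no linear inequality constraints)."""
    Y = A.copy()
    dS = np.zeros_like(A)
    for _ in range(n_iter):
        R = Y - dS                               # Dykstra correction step
        w, V = np.linalg.eigh((R + R.T) / 2.0)
        X = (V * np.clip(w, 0.0, None)) @ V.T    # project onto the PSD cone
        dS = X - R
        Y = X.copy()
        np.fill_diagonal(Y, 1.0)                 # project onto the unit-diagonal set
    return Y

# A stressed "correlation" matrix that is not positive semidefinite
C_bad = np.array([[ 1.0,  0.9, -0.8],
                  [ 0.9,  1.0,  0.7],
                  [-0.8,  0.7,  1.0]])
C_fix = nearest_correlation(C_bad)
print(np.round(C_fix, 3))
print("smallest eigenvalue:", np.linalg.eigvalsh(C_fix).min())
```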
... Nor has the private sector gone further in formulating a palpable definition of stress-testing. Anecdotal evidence suggests that risk managers pursue stress-testing as a way of understanding gamma risk and as a tool for studying portfolio allocation (Kupiec (1999)). Large and complex portfolios containing assets with nonlinear payoffs, such as options, may behave very differently in response to large shocks than would be expected given their valuation in more typical situations. ...
... factors to include in the scenario are basically a judgement call. Kupiec (1999) notes that when constructing a particular scenario, it is common practice to "zero out" all but the primary factors of interest. For example, any exchange rates that are not expected to play a key role in a Middle East crisis scenario would be left unchanged, unlike the basic running of the risk model in which all factors move. ...
Article
In recent months and years both practitioners and regulators have embraced the ideal of supplementing VaR estimates with "stress-testing". Risk managers are beginning to place an emphasis and expend resources on developing more and better stress-tests. In the present paper, we hold the standard approach to stress-testing up to a critical light. The current practice is to stress-test outside the basic risk model. Such an approach yields two sets of forecasts: one from the stress-tests and one from the basic model. The stress scenarios, conducted outside the model, are never explicitly assigned probabilities. As such, there is no guidance as to the importance or relevance of the results of stress-tests. Moreover, how to combine the two forecasts into a usable risk metric is not known. Instead, we suggest folding the stress-tests into the risk model, thereby requiring all scenarios to be assigned probabilities.
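A minimal sketch of the "fold the stress-test into the risk model" idea from this abstract, assuming the base P&L model and the stress-scenario distribution are both normal and that the stress scenario is assigned an explicit probability (all parameters are hypothetical): the reported VaR is then a quantile of the resulting mixture distribution.

```python
from scipy.stats import norm
from scipy.optimize import brentq

# Base model P&L and stress-scenario P&L (currency units, hypothetical)
mu0, s0 = 0.0, 1.0e6        # ordinary market conditions
mu1, s1 = -5.0e6, 2.0e6     # stress scenario
p_stress = 0.02             # probability assigned to the stress scenario

def mixture_cdf(x):
    return (1.0 - p_stress) * norm.cdf(x, mu0, s0) + p_stress * norm.cdf(x, mu1, s1)

# 99% VaR of the combined forecast: the 1% quantile of the mixture distribution
q01 = brentq(lambda x: mixture_cdf(x) - 0.01, -50e6, 50e6)
print(f"99% VaR, base model only:   {-norm.ppf(0.01, mu0, s0):,.0f}")
print(f"99% VaR, stress folded in:  {-q01:,.0f}")
```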
... The paper makes three contributions. First, portfolio credit loss distributions are estimated using a non-parametric, stress-scenario approach (see Kupiec 1998). If loan sizes and LGDs were fixed instead of random, the estimated portfolio loss distributions would approximate transformed binomial distributions in which an aggregate annual borrower default rate is the key parameter, so I describe this paper's approach as involving modified binomial loss distributions. Because aggregate default rates can be related to the severity of economic downturns and (in this paper's setup) loss distribution percentiles represent bank survival rates, policymakers may set capital to limit bank failures to some acceptable estimated rate in an economic scenario of intuitively specified severity. ...
... Of course, in an even worse recession, the bank failure rate would be higher. This paper's way of defining and modeling VaR loss distributions is related to existing stress-test methods of capital allocation (see Jorion 2001, Kupiec 1998, and Shepheard-Walwyn and Rohner 2000). However, a typical credit stress-test analysis specifies default rates for each line of business, or for firms in each geographic region or industry. ...
Article
Resampling implementation of a stress-scenario approach to estimating portfolio default loss distributions is proposed as the basis for estimates of the appropriate absolute level of economic capital allocations for portfolio credit risk. Estimates are presented for stress scenarios of varying severity. Implications of use of different analysis time horizons are analyzed. Results for a numeraire portfolio are quite sensitive to such variations. Although the analysis is framed in terms of recent proposals to revise regulatory capital requirements for banks, the arguments and results are also relevant for bankers making capital structure decisions.
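A stripped-down sketch of the modified-binomial, stress-scenario loss distribution described above (not the paper's resampling implementation; portfolio size, stressed default rate, LGD distribution and the 99.9% capital percentile are all hypothetical): the number of defaults in a stressed year is drawn from a binomial distribution keyed to an aggregate default rate, and a capital figure is read off a high percentile of the simulated losses.

```python
import numpy as np

rng = np.random.default_rng(0)
n_loans = 1000
exposure = 1.0            # equal loan sizes, normalized
pd_stress = 0.08          # aggregate annual default rate in the stress scenario
n_sims = 100_000

defaults = rng.binomial(n_loans, pd_stress, size=n_sims)   # defaults per simulated year
lgd = rng.beta(2.0, 3.0, size=n_sims)                       # one aggregate LGD draw per year (simplification)
losses = defaults * exposure * lgd                           # portfolio loss per simulated year

expected_loss = losses.mean()
loss_999 = np.quantile(losses, 0.999)                        # 99.9th percentile of the loss distribution
print(f"expected loss:             {expected_loss:.1f}")
print(f"99.9th percentile loss:    {loss_999:.1f}")
print(f"capital above expectation: {loss_999 - expected_loss:.1f}")
```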
... ν_{k,t} stands for the k-regime variance at period t, and α_0, α_i, β_j are constants. BIS (2009) recommends a stressed value-at-risk (SVaR), a methodology initially proposed by Kupiec (1998):

RC_t = Max( VaR_{t-1}, (3 + k_t) · (1/60) · Σ_{i=1..60} VaR_{t-i} ) + Max( SVaR_{t-1}, (3 + k_t) · (1/60) · Σ_{i=1..60} SVaR_{t-i} )

where Max, RC, VaR, SVaR and k_t stand for Maximum, Required Capital, Value at Risk, Stressed Value at Risk and a factor k_t defined by the country's financial regulator (usually a central bank). The original formula from Kupiec (1998) specifies only the last term, without the multiplier (3 + k_t) and with a sample size N in place of the fixed 60. ...
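A short sketch of the required-capital rule as reconstructed above (the multiplier k_t and the 60-day VaR and SVaR histories below are hypothetical inputs, not data from the cited study):

```python
import numpy as np

def required_capital(var_series, svar_series, k_t=0.5):
    """Basel 2.5-style market-risk capital as described in the excerpt above:
    the larger of yesterday's measure and (3 + k_t) times its 60-day average,
    summed over the ordinary-VaR and stressed-VaR components."""
    var_term = max(var_series[-1], (3.0 + k_t) * np.mean(var_series[-60:]))
    svar_term = max(svar_series[-1], (3.0 + k_t) * np.mean(svar_series[-60:]))
    return var_term + svar_term

# Hypothetical 60-day histories of daily 99% VaR and stressed VaR (currency units)
rng = np.random.default_rng(1)
var_hist = 1.0e6 * (1.0 + 0.1 * rng.standard_normal(60)).clip(min=0.5)
svar_hist = 2.5e6 * (1.0 + 0.1 * rng.standard_normal(60)).clip(min=0.5)
print(f"required capital: {required_capital(var_hist, svar_hist):,.0f}")
```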
Article
Are the recommendations from the Bank for International Settlements (BIS) effective for a broad set of financial crises? We submitted two of the main Basel III recommendations for market risk to a backtest: the capital requirements and the Value at Risk (VaR) methodology that includes the BIS's Stressed VaR. We tested the main Brazilian currency exchange (U.S. Dollar to Brazilian Reais) and currency exchange swap contracts through volatility-based VaR methodologies in the period that comprises the so-called Brazilian confidence crisis, which occurred in the second half of 2002. While the Stressed VaR proved inapplicable, due to a shortage of historical data, the capital requirements level appeared innocuous, due to the high levels of daily volatility; daily oscillation limits may have a significant role in crisis mitigation. To circumvent the lack of either historical information or an optimal window for stress patterns, we suggest calibrating the Stressed VaR or the recently announced Expected Shortfall with a historical VIX (Volatility Index, Chicago Board Options Exchange), working as a volatility scale. We also suggest modelling with other densities besides the BIS-recommended standard normal.
... Stress testing is a term used in financial practice without any generally accepted definition. It appears in the context of quantifying losses or risks that may arise under special, mostly extreme circumstances [19]. Such circumstances are described by scenarios which may come from historical experience (a crisis observed in the past), giving a historical stress test, or may be judged possible in the future given changes in macroeconomic, socioeconomic or political factors, giving a prospective stress test, etc. ...
... Asymptotic statistics and a detailed analysis of optimal solutions of parametric quadratic programs may help to derive asymptotic results concerning the "estimated" optimal portfolio composition obtained for an asymptotically normal estimate Σ̂ of Σ. Here we follow a suggestion of [19] and rewrite the variance matrix as Σ = DCD, with the diagonal matrix D of "volatilities" (standard deviations of the marginal distributions) and the correlation matrix C. Changes in the covariances may then be modeled by "stressing" the correlation matrix C by a positive semidefinite stress correlation matrix Ĉ ...
Book
Practical use of the contamination technique in stress testing for risk measures Value at Risk (VaR) and Conditional Value at Risk (CVaR) and for optimization problems with these risk criteria is discussed. Whereas for CVaR its application is straightforward, the presence of the simple chance constraint in the definition of VaR requires that various distributional and structural properties are fulfilled, namely, for the unperturbed problem. These requirements rule out direct applications of the contamination technique in the case of discrete distributions, which includes the empirical VaR. On the other hand, in the case of a normal distribution and parametric VaR one may exploit stability results valid for quadratic programs.
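A small sketch in the spirit of the Σ = DCD decomposition quoted above, assuming a linear portfolio and normally distributed factor returns (volatilities, correlation matrices, exposures and the blending weights are all hypothetical): the base correlation matrix C is blended with a positive semidefinite stress correlation matrix Ĉ, which keeps each blend a valid correlation matrix, and the portfolio 99% VaR is recomputed along the way.

```python
import numpy as np
from scipy.stats import norm

# Volatilities (diagonal of D) and base correlation matrix C (hypothetical)
vols = np.array([0.02, 0.015, 0.03])
C = np.array([[1.00, 0.30, 0.10],
              [0.30, 1.00, 0.20],
              [0.10, 0.20, 1.00]])
# Stress correlation matrix: correlations pushed towards 1, as in a crisis (hypothetical)
C_hat = np.array([[1.00, 0.90, 0.80],
                  [0.90, 1.00, 0.85],
                  [0.80, 0.85, 1.00]])
w = np.array([1e6, 1e6, 1e6])     # linear exposures, currency units per unit return
D = np.diag(vols)

for lam in (0.0, 0.25, 0.5, 1.0):
    C_lam = (1.0 - lam) * C + lam * C_hat       # convex blend of two correlation matrices
    sigma_p = np.sqrt(w @ (D @ C_lam @ D) @ w)  # portfolio volatility under the stressed Sigma
    print(f"lambda = {lam:4.2f}   99% VaR: {norm.ppf(0.99) * sigma_p:,.0f}")
```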
... Some systematic scenario-building techniques are described below. Kupiec (1998) introduced a correlation-matrix methodology in which a few risk factors (those playing major roles) are stressed and all the other, peripheral factors are adjusted using historical volatilities and correlations. The stress losses calculated using this method have the benefit of introducing an element of probability into stress-testing. ...
... This may make the stress-test lose plausibility because it is probable that in an actual stressful event, the risk factors will not behave as they did in the past. Nevertheless, as argued by Kupiec (1998), stress scenarios that use historical volatilities and correlations are more plausible than scenarios that ignore these correlations altogether. ...
Article
Full-text available
To stimulate Spatial Data Infrastructure (SDI) development effectively and efficiently, it is key to assess the progress and benefits of the SDI. Currently, several SDI assessment methods exist. However, these are still at an early stage and none of them appears to meet the requirements of practitioners. As a result, SDI decision makers are still without any guidance on the performance of their SDI. In the financial sector, stress testing is commonly used to assess the sustainability and success of the system. This work presents an early stage of a longer research activity by introducing the subject and underlying concepts and proposing a projection of an assessment method from FI to SDI. While this work already identifies a key scenario to begin with, concrete realisations remain part of future work. Based on a review of the nature and concept of the SDI and the Financial Infrastructure (FI), we conclude that the stress test methodology is likely to be an appealing alternative way of assessing SDIs. The multi-factor stress tests (a hypothetical and a non-systematic subjective scenario model) are the most promising basis for SDI assessment. A first draft of the Stress Test for Infrastructure of Geographic information is presented: the STIG.
... Macro stress testing has also been performed by regulators as part of the system-wide stress tests that they themselves have implemented, post the crisis, to assess systemic risk in different countries' banking markets. Berkowitz (1999), Kupiec (1998), Lopez (2005) and Schachter (2001) discuss how stress testing may be used in ways that complement and are more or less integrated with VaR analysis. CGFS (2001), CGFS (2005) and CGFS (2000) present survey information and best practice guidelines on stress testing in financial firms. ...
Article
Full-text available
This paper analyzes the joint distribution of changes in agency credit ratings. We estimate both intra- and inter-industry correlations using Maximum Likelihood techniques. The analysis is performed unconditionally and then conditional on de-trended GDP. The latter estimates may be used for macro stress testing, in which the credit quality of a portfolio is simulated conditional on a hypothesized future path of real output.
... This decade is important in the history of systemic risk development, because it spans a considerable time frame before and after the Global Financial Crisis (GFC) and thus enables discovery of the systemic associative patterns between CEO pay and systemic risk through business cycles that include a couple of recoveries from recessions, reflected in periods of economic growth. (Footnote 1: CoVaR measures the change in the value at risk of the whole financial system conditional on the state of distress of a financial firm relative to the median state of the firm. Value at risk (VaR) is a common method that measures the loss associated with events having a 1-5% chance of occurring on financial markets (Kupiec 1998; Adrian and Brunnermeier 2016). Footnote 2: Dollar CoVaR, CoVaR$, is a measure of systemic risk reflecting the size of a financial firm and is calculated as the size of the firm times the CoVaR, that is, the change in the value at risk of the financial system conditional on a difference in the state of the firm relative to its median state (Adrian and Brunnermeier 2016).) ...
Article
Full-text available
We examine the role of chief executive officers' (CEO) pay in contributing to systemic risk in the USA. In particular, by extending the CoVaR model of Adrian and Brunnermeier (Am Econ Rev 106(7):1705–1741, 2016), we document that the systemic risk measure of dollar delta CoVaR is positively influenced by CEO pay. Differentiation between the types of CEO pay incentives suggests that bonus and option awards make the major contribution to systemic risk. It follows that governance measures aimed at systemic risk management can benefit from distinguishing between short-term and long-term CEO incentives.
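A rough empirical illustration of the CoVaR definition quoted in the excerpt above, using simulated returns rather than the quantile-regression estimator of Adrian and Brunnermeier (the data-generating process and the 5% level are hypothetical): the system's VaR is computed conditional on the firm being in distress and conditional on the firm being near its median state, and the difference gives a Delta CoVaR-style figure.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
firm = 0.02 * rng.standard_normal(n)                      # hypothetical firm returns
system = 0.6 * firm + 0.015 * rng.standard_normal(n)      # system returns, partly driven by the firm

q = 0.05
var_firm = np.quantile(firm, q)                            # firm's 5% return threshold (distress)

distress = firm <= var_firm
near_median = np.abs(firm - np.median(firm)) <= np.quantile(np.abs(firm - np.median(firm)), 0.10)

covar_distress = np.quantile(system[distress], q)          # system VaR given firm distress
covar_median = np.quantile(system[near_median], q)         # system VaR given firm near its median
print(f"CoVaR, firm in distress:  {covar_distress:.4f}")
print(f"CoVaR, firm near median:  {covar_median:.4f}")
print(f"Delta CoVaR:              {covar_distress - covar_median:.4f}")
```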
... Stress-testing cannot guarantee the identification of actual impacts of future events on a portfolio, but it provides another tool in the risk manager's armoury. Stress-tests are designed to determine how a portfolio might respond to adverse developments, including portfolio allocation, and to detect weak spots early, thus facilitating preventative action, typically focusing on key risks such as market risk, credit risk and liquidity risk. Stress-testing covers a range of methodologies. For current purposes it is sufficient to regard stress-tests as being either based on historical data ('historical stress-tests') or on invented scenarios ('artificial stress-tests'). ...
... [36] suggest a conditional density forecast in discrete time, but their approach is also limited to a factor-model assumption. Other papers involving general conditional modelling in a scenario generation setting are given by [37], who analyze ways to model expert judgement as combined probability distributions but provide no testing or inclusion in a forecasting framework; [38], who extend the Black-Litterman model by forecasting asset distributions conditionally on views for both their mean and variance but assume a linear Gaussian model for the asset returns; and [8], who propose a Market-Driven Scenario approach based on [39] for forecasting financial assets conditional on fixed values for some of the considered assets and evaluate it on the P&L distribution of a portfolio given forecasted Brexit scenarios. However, their approach does not consider a multi-period time horizon. ...
Preprint
We introduce the notion of Point in Time Economic Scenario Generation (PiT ESG) with a clear mathematical problem formulation to unify and compare economic scenario generation approaches conditional on forward looking market data. Such PiT ESGs should provide quicker and more flexible reactions to sudden economic changes than traditional ESGs calibrated solely to long periods of historical data. We specifically take as economic variable the S&P500 Index with the VIX Index as forward looking market data to compare the nonparametric filtered historical simulation, GARCH model with joint likelihood estimation (parametric), Restricted Boltzmann Machine and the conditional Variational Autoencoder (Generative Networks) for their suitability as PiT ESG. Our evaluation consists of statistical tests for model fit and benchmarking the out of sample forecasting quality with a strategy backtest using model output as stop loss criterion. We find that both Generative Networks outperform the nonparametric and classic parametric model in our tests, but that the CVAE seems to be particularly well suited for our purposes: yielding more robust performance and being computationally lighter.
... See Kupiec (2002) and Jorion (2007) for overviews. Since we focus on left-tail risk, we set q to be 1%. ...
Article
We investigate the dependency, risk spillovers, and systemic risk between the sectoral indices returns of the Bombay stock exchange (BSE) and oil prices using recently developed empirical techniques. The dependence is modelled using the time varying Stochastic Autoregressive Copulas (SCAR). Conditional value-at-risk (CoVaR), ΔCoVaR and marginal expected shortfall (MES) measures are used to examine the systemic risk. We find rotated Gumbel and normal copulas to be the best fitting in our analysis. Sectors such as energy, power, and industrial exhibit higher persistence in dependence structure compared to other sectors. Our results reveal that the underlying forces of the dependence between oil prices with other industries vary across time, albeit not so much during stable periods, but increase remarkably during turbulent times. All sectors are affected significantly by extreme oil price movements. The average short-run MES is highest for the metals, materials, and industrials sectors. The lowest average short-run MES values are observed for the fast-moving consumer goods, auto, and carbon sectors. Our risk analysis results reveal that Indian stock sectors are not resistant to oil shocks and there exists significant systemic risk between these markets and the crude oil market.
... There are already methods available that were designed to tackle the same problem, see e.g., [1][2][3][4]. Newton-based methods for approximating covariance matrices can be found in [5][6][7]. Furthermore, there exist methods using hyperspherical decomposition [8] and unconstrained convex optimization [9]. ...
Article
Full-text available
Specifying time-dependent correlation matrices is a problem that occurs in several important areas of finance and risk management. The goal of this work is to tackle this problem by applying techniques of geometric integration in financial mathematics, i.e., to combine two fields of numerical mathematics that have not yet been studied jointly. Based on isospectral flows we create valid time-dependent correlation matrices, so-called correlation flows, by solving a stochastic differential equation (SDE) that evolves in the special orthogonal group. Since the geometric structure of the special orthogonal group needs to be preserved, we use stochastic Lie group integrators to solve this SDE. An application example is presented to illustrate this novel methodology.
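To make the isospectral idea above concrete, here is a deterministic toy (not the paper's SDE-based construction): rotating an initial matrix C0 by orthogonal matrices Q(t) = expm(tA), with A skew-symmetric, keeps the eigenvalues of C0 fixed for all t. The paper's correlation flows additionally constrain the evolution so that unit diagonals are preserved, which this bare-bones sketch does not do; C0 and A below are hypothetical.

```python
import numpy as np
from scipy.linalg import expm

C0 = np.array([[1.0, 0.4, 0.2],
               [0.4, 1.0, 0.3],
               [0.2, 0.3, 1.0]])          # initial correlation matrix (hypothetical)
A = np.array([[ 0.0,  0.5, -0.2],
              [-0.5,  0.0,  0.3],
              [ 0.2, -0.3,  0.0]])        # skew-symmetric generator (hypothetical)

for t in (0.0, 0.5, 1.0):
    Q = expm(t * A)                        # element of the special orthogonal group
    Ct = Q @ C0 @ Q.T                      # isospectral evolution of C0
    eig = np.round(np.linalg.eigvalsh(Ct), 4)
    diag = np.round(np.diag(Ct), 3)
    print(f"t = {t:3.1f}   eigenvalues {eig}   diagonal {diag}")
```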
... We estimate median (expected) and 95th percentile (Value at Risk, VaR) annual damage for the sake of comparison with other published work, but we assess the risk of low-probability sea-level rises (Hull, 2018; Wilmott, 2014) using the Expected Shortfall (ES(95%)), which is the mean loss when the 95th percentile is exceeded. This risk measure is commonly used in financial economics to stress test systems and identify risk thresholds (Kupiec, 1998). ...
Article
Full-text available
The high degree of uncertainty associated with the extent of future sea-level rise stems primarily from the potential mass loss of the Greenland and Antarctica ice-sheets. We explore the impact of this uncertainty on economic damage due to sea-level rise for 136 major coastal cities. We compare the probability distribution for damage under the assumption of no adaptation for two relative sea-level projections: the RCP 8.5 scenario from the IPCC Fifth Assessment Report and a High-end scenario that incorporates expert opinion on additional ice-sheet melting. We use the 50th and 95th percentiles to estimate expected damage and one risk measure, the Expected Shortfall ES (95%), which represents the impact of low-probability, high-damage coastal flood risk (above the 95th percentile). Aggregate expected damage by 2050 under RCP 8.5 is US$1,600 billion, while the aggregate risk measure ES(95%) is almost twice as much as the average damage at US$3,082 billion. Under the High-end scenario, ES(95%) figures in Guangzhou and New Orleans by 2050 are twice as high as the expected damage. The city of Guangzhou leads the ranking under both scenarios, followed by Mumbai and New Orleans. Our results suggest that it is critical to incorporate the possibility of High-end scenarios into coastal adaptation planning for future sea-level rise, especially for risk-averse decision-making.
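A short sketch of the ES(95%) definition used above, computed from simulated annual damages (the lognormal damage distribution is hypothetical, chosen only to give a heavy right tail): the 95th percentile gives the VaR-style figure and the mean of the exceedances gives the Expected Shortfall.

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical simulated annual damages (arbitrary units) with a heavy right tail
damages = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)

p = 0.95
var95 = np.quantile(damages, p)                # 95th percentile of damage
es95 = damages[damages > var95].mean()         # mean damage beyond the 95th percentile

print(f"median damage: {np.median(damages):.2f}")
print(f"VaR(95%):      {var95:.2f}")
print(f"ES(95%):       {es95:.2f}")
```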
... Golub et al. (2018) propose a framework for calibrating asset returns to financial scenarios. Their Market-Driven Scenario (MDS) approach follows the conditioning philosophy outlined by Kupiec (2002). The core concept is to consider the joint distribution of factors that drive financial outcomes, and look at the conditional distribution of outcomes given an explicit value for a subset of these factors that capture the scenario. ...
Preprint
Economic Scenario Generators (ESGs) simulate economic and financial variables forward in time for risk management and asset allocation purposes. It is often not feasible to calibrate the dynamics of all variables within the ESG to historical data alone. Calibration to forward-information such as future scenarios and return expectations is needed for stress testing and portfolio optimization, but no generally accepted methodology is available. This paper introduces the Conditional Scenario Simulator, which is a framework for consistently calibrating simulations and projections of economic and financial variables both to historical data and forward-looking information. The framework can be viewed as a multi-period, multi-factor generalization of the Black-Litterman model, and can embed a wide array of financial and macroeconomic models. Two practical examples demonstrate this in a frequentist and Bayesian setting.
... However, as commented in [Rebonato and Jäckel, 2000], the drawback is that other portions of the matrix can be changed in an uncontrolled fashion. The shrinkage method proposed by Kupiec in [Kupiec, 1998] has the main drawback that "there is no way of determining to what extent the resulting matrix is optimal in any easily quantifiable sense", see [Rebonato and Jäckel, 2000]. Furthermore, the hyperspherical decomposition method and the unconstrained convex optimization approach are proposed in [Rebonato and Jäckel, 2000] and [Qi and Sun, 2010], respectively. ...
Article
Full-text available
In many areas of finance and of risk management it is interesting to know how to specify time-dependent correlation matrices. In this work we propose a new methodology to create valid time-dependent instantaneous correlation matrices, which we called correlation flows. In our methodology one needs only an initial correlation matrix to create these correlation flows based on isospectral flows. The tendency of the time-dependent matrices can be controlled by requirements. An application example is presented to illustrate our methodology.
... A key feature of these exercises was a shift away from previous "value at risk" (VaR) approaches. VaR analysis can offer some substantial advantages, including its practical viability and conceptual attractiveness (Kupiec, 1998) and the ability to contrast multiple models and calibrations (see for example Alexander and Sheedy, 2008). But with its decline, stress tests instead became increasingly reliant on a form of scenario analysis: taking unexpected (downside) macro scenarios and estimating how those impact bank capital via loan and securities losses. ...
Article
Full-text available
This paper describes an approach for stress testing banks that is consistent across economies and geographies, in contrast to common “macro scenario” driven approaches. The latter would require economic scenarios to be both equally likely (in a probabilistic sense) and equally stressful (in a conditional loss sense) across countries in order to be comparable. The paper proposes a three-pronged approach for stressing bank solvency, which incorporates recalibrating pre-crisis Basel capital assumptions, adapting the BIS “expected shortfall” approach for securities, and using granular data for income haircuts. Loan losses are quantified using a simple “multiples” approach, starting from expected outcomes, which is derived from the pre-crisis Basel technical proposal. The approach is practical, can be more granular or conducted at a high level, depending on data availability, and offers a simple way for regulators, investors or risk assessors to compare and contrast stresses in different banking systems. Of the eight bank defaults recorded globally during 2017, this approach would have given a better “rank ordering” for seven of them, indicating the approach adds value to traditional solvency metrics.
... This study focuses on the most common measure of tail risk, Value at risk (VaR), which is defined as the worst-case scenario in terms of losses during a specific period. Overviews of VaR can be found in Kupiec (2002) and Jorion (2007). Due to its convenience, VaR is a popular method for measuring tail risk. ...
... Following e.g. Kupiec (1998), we define a stress scenario on one set of ("core") risk factors and, assuming that the given covariance matrix is unaltered by the stress scenario, set the remaining ("peripheral") risk factors to their optimal estimates conditional on the scenario. Let β_s denote the j < m core factor parameters that are stressed directly. ...
Article
In 2012, JPMorgan accumulated a USD 6.2 billion loss on a credit derivatives portfolio, the so-called “London Whale”, partly as a consequence of de-correlations of non-perfectly correlated positions that were supposed to hedge each other. Motivated by this case, we devise a factor model for correlations that allows for scenario-based stress testing of correlations. We derive a number of analytical results related to a portfolio of homogeneous assets. Using the concept of Mahalanobis distance, we show how to identify adverse scenarios of correlation risk. In addition, we demonstrate how correlation and volatility stress tests can be combined. As an example, we apply the factor-model approach to the “London Whale” portfolio and determine the value-at-risk impact from correlation changes. Since our findings are particularly relevant for large portfolios, where even small correlation changes can have a large impact, a further application would be to stress test portfolios of central counterparties, which are of systemically relevant size.
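A tiny sketch of the Mahalanobis-distance idea mentioned in this abstract for ranking the plausibility of joint scenarios (the covariance matrix and the two scenarios below are hypothetical, not the London Whale portfolio): a joint move that runs against the historical correlations gets a larger distance, i.e., is less plausible, than a move of the same individual sizes that is aligned with them.

```python
import numpy as np

# Historical covariance of risk-factor changes (hypothetical; vols 20%, 15%, 10%)
cov = np.array([[0.0400, 0.0180, 0.0060],
                [0.0180, 0.0225, 0.0090],
                [0.0060, 0.0090, 0.0100]])
cov_inv = np.linalg.inv(cov)

def mahalanobis(scenario):
    """Distance of a joint factor move from the origin under the historical covariance;
    larger values indicate less plausible scenarios."""
    return float(np.sqrt(scenario @ cov_inv @ scenario))

aligned = np.array([-0.30, -0.22, -0.15])   # factors move together, as the correlations suggest
against = np.array([-0.30, +0.22, -0.15])   # second factor moves against its correlations

print(f"scenario aligned with correlations: d = {mahalanobis(aligned):.2f}")
print(f"scenario against correlations:      d = {mahalanobis(against):.2f}")
```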
... Financial risk is a challenge for most business firms. This is often due to the lack of necessary resources, with regard to manpower, databases and specialized knowledge, needed to perform standardized and structured risk management (Kupiec 1998). Smaller firms do not ...
Article
Full-text available
This study assessed the financial risk of food vendors in Calabar Metropolis. It specifically sought to identify the types of financial risk in the study area, the level of financial risk, the effect of financial risk factors on vendors' sales, and the strategies used to manage financial risk. The study used a random sampling technique to select 120 restaurants in Calabar Metropolis. Data were obtained from a primary source using a structured questionnaire and analyzed using descriptive and inferential statistics. The results showed that interest rate risk was the most common type of financial risk in the study area, at 46.7%. The results also revealed that the mean financial risk level was 34.93%. Three variables were statistically significant in influencing sales: taxation, variable cost and financial risk. The results further revealed that savings was the most common financial risk management technique, at 37.2%. The study therefore recommends that food vendors should insure their businesses to reduce the effects of financial risk. Food vendors should maintain a balance with lending institutions to curtail financial risk for their viability and sustainability.
... In addition to the estimated bad correlation matrix C, one chooses a target correlation matrix C_0 (as a reference, see e.g. Kupiec [7]). Restricting the solution of (GCP) to the line between C and C_0, problem (GCP) becomes in this case ...
... Following e.g. Kupiec (1998), we define a stress scenario on one set of ("core") risk factors and, assuming that the given covariance matrix is unaltered by the stress scenario, set the remaining ("peripheral") risk factors to their optimal estimates conditional on the scenario (Proposition 3). All graphs show a 99% Value-at-Risk, and the initial and unstressed β is calibrated to an average asset correlation of ρ ≈ 0.3, unless indicated otherwise (cf. ...
Preprint
Full-text available
In 2012, JPMorgan accumulated a USD 6.2 billion loss on a credit derivatives portfolio, the so-called "London Whale", partly as a consequence of de-correlations of non-perfectly correlated positions that were supposed to hedge each other. Motivated by this case, we devise a factor model for correlations that allows for scenario-based stress-testing of correlations. We derive a number of analytical results related to a portfolio of homogeneous assets. Using the concept of Mahalanobis distance, we show how to identify adverse scenarios of correlation risk. As an example, we apply the factor-model approach to the "London Whale" portfolio and determine the value-at-risk impact from correlation changes. Since our findings are particularly relevant for large portfolios, where even small correlation changes can have a large impact, a further application would be to stress-test portfolios of central counterparties, which are of systemically relevant size.
... The two risk measures proposed in this article can be used in conjunction with the concept of ALR to decide on appropriate adaptation. Indeed, these measures are very appropriate for stress testing in a way analogous to the tests done in the financial and banking system to assess resilience (Kupiec 1998). These tests consist of assessing whether a system can or cannot recover (or how much effort it will require to recover) from certain negative events. ...
Article
Full-text available
This addendum adds to the analysis presented in 'Understanding risks in the light of uncertainty: low-probability, high-impact coastal events in cities', Abadie et al (2017 Environ. Res. Lett. 12 014017). We propose to use the framework developed earlier to enhance communication and understanding of risks, with the aim of bridging the gap between highly technical risk-management discussion and the public risk-aversion debate. We also propose that the framework could be used for stress-testing resilience.
... VaR analysis offers some substantial advantages, such as its practical viability and conceptual attractiveness, as presented by Kupiec (1998) among others from a historical context, and the ability to consider and contrast multiple models and calibrations. [Figure source: adapted from "Stress testing the UK banking system: key elements of the 2014 stress test", Bank of England (2014, April), http://www.bankofengland.co.uk/financialstability/Documents/fpc/keyelements.pdf] ...
Article
Forecasts, models and stress tests are important tools for policymakers and business planners. Recent developments in these related spheres have seen greater emphasis placed on stress tests from a regulatory perspective, while at the same time forecasting performance has been criticized. Given the interlinkages between the two, similar limitations apply to stress tests as to forecasts and should be borne in mind by practitioners. In addition, the recent evolution of stress tests, and in particular the increasing popularity of scenario-based approaches, raises concerns about how well the shortcomings of the associated models are understood. This includes the estimated stress cases relative to base cases – the degree of pain – that simple scenario modelling approaches engender. This paper illustrates this phenomenon using simulation techniques and demonstrates that more extreme stress scenarios need to be employed in order to match the inference from simple value-at-risk approaches. Alternatively, complex modelling approaches can address this concern, but they are not widely used to date. Some policymakers seem to be aware of these issues, judging by the severity of some recent stress scenarios.
... The appearance of stress tests in the mathematical finance literature, on the other hand, is relatively new and there is as yet no unified theory of stress testing. The foundations of the link between stress tests and risk models started with Kupiec (1998), who examined cross-market effects resulting from a market shock. In a seminal paper, Berkowitz (2000) for the first time came up with the idea of folding stress tests into a risk model, thereby assigning all scenarios certain probabilities. ...
Purpose: Adverse developments in the global finance industry clearly underlined the importance of stress-testing. One of the key takeaways was the need to strengthen the coverage of the capital framework. Cognisant of this fact, Basel III encapsulates provisions to enhance the financial sector's ability to withstand shocks arising from possible stress events, thereby reducing adverse spillovers into the real economy. Similarly, the IFSB requires Islamic financial institutions to run stress tests as part of capital planning.
Design/methodology/approach: We perform thorough backtests on Islamic and conventional portfolios under widely used risk models, which are characterised by an underlying conditional volatility framework and distribution, to identify the most suitable risk model specification. Associated with an appropriate initial shock and estimation window size, the paper also conducts a model-based stress test to examine whether the stress losses estimated by the selected models compare favourably to the historical shocks.
Findings: The results suggest that the model-based framework, when combined with an appropriate risk model and distribution, can successfully reproduce past stress periods. The conditional empirical risk model is the most effective one in both the long and the short portfolio cases, particularly when combined with a long-enough estimation window. The relative performance of normal vs. heavy-tailed distributions and symmetric vs. asymmetric risk models, on the other hand, is highly dependent on whether the portfolio is long or short. Finally, we find that the Islamic portfolio is generally associated with lower historical stress losses than the conventional portfolio.
Originality/value: The model-based framework eliminates some of the key problems associated with traditional scenario-based approaches and is easily adaptable to Islamic finance.
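A small sketch of one standard backtesting ingredient for the kind of exercise described above: the unconditional-coverage (proportion-of-failures) likelihood-ratio test often attributed to Kupiec (1995), which is a different paper from the 1998 stress-testing article on this page. The exceedance count and sample size below are hypothetical.

```python
import numpy as np
from scipy.stats import chi2

def pof_test(exceedances, n_obs, p=0.01):
    """Proportion-of-failures LR test.
    H0: the true VaR exceedance probability equals the nominal level p."""
    x, n = exceedances, n_obs
    pi_hat = x / n
    eps = 1e-12                               # guard against log(0) in degenerate samples
    def loglik(prob):
        prob = min(max(prob, eps), 1.0 - eps)
        return (n - x) * np.log(1.0 - prob) + x * np.log(prob)
    lr = -2.0 * (loglik(p) - loglik(pi_hat))  # asymptotically chi-squared with 1 df
    return lr, chi2.sf(lr, df=1)

# Hypothetical backtest: 9 exceedances of a 99% VaR over 250 trading days
lr, p_value = pof_test(exceedances=9, n_obs=250, p=0.01)
print(f"LR = {lr:.2f}, p-value = {p_value:.4f}")
```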
... Following this concept, Cherubini and Della Lunga (1999) pointed out that stress testing and Value-at-Risk analysis are polar cases of a continuum of Bayesian risk measures in which different degrees of precision are assumed for the adopted scenarios. In an independent paper, Kupiec (1998) achieved the same results with a different mathematical approach, as well as adding correlation uncertainty. Hence, in both approaches performing stress tests means computing new risk measures (expected losses, VaR, ruin probabilities and the like) for a posterior, conditional distribution rather than using an unconditional one. ...
Article
Full-text available
The aim of this paper is to investigate a range of financial techniques and policy strategies for a Quantitative Easing (QE) program by the European Central Bank (ECB) that would rely on Euro area government bond purchases under various modalities and guises. There is widespread consensus that securities such as covered/corporate bonds and asset-backed securities would not be sufficient for a large asset purchase program in the Euro area given their outstanding amount and market liquidity. However, a successful ECB QE program linked to the purchase of Euro-area (government) bonds would have to overcome a number of significant hurdles. In essence, a QE program based on outright bond purchases is a supply-driven monetary easing strategy, whereas the ECB operational framework - which is structured around a toolbox of "temporary" (normally, short-term) repo market transactions with a set of eligible banks as counterparties - can only accommodate demand-driven (essentially, temporary) liquidity needs; unlike the US Federal Reserve, the ECB does not rely on outright bond purchases as its routine operational tool to inject (or withdraw) liquidity in the economy (on a permanent basis). A supply-driven monetary strategy becomes particularly relevant when a central bank hits the (nominal) interest rate Zero Lower Bound (ZLB) limit in its money creation process. Unless the ECB's lending to the banking sector were to be made at negative rates - i.e. lending would become a cost (a loss) for the ECB balance sheet accounts (negative seignorage) - injecting additional liquidity in the economy by conventional demand-driven operational tools (e.g. repo transactions) could become virtually impossible. Moreover, the ECB's balance sheet concentration on short-term assets - repo transactions normally lasting a few weeks - requires frequent roll-over activity only to maintain a constant level of high-powered money stock (monetary base) in the economy. The issue of using government bonds for QE programs in the Euro area goes well beyond the ECB monetary policy operational toolbox, as it raises several problems in light of the Maastricht Treaty's prohibition of member state debt financing by the central bank. One important issue, oftentimes raised by the Deutsche Bundesbank and other national central banks, is that buying government bonds on a large scale may threaten the ECB balance sheet quality, as worrisome credit risk exposure is likely to be taken up in the process; the same argument would apply to ABS or covered bonds issued by the private sector. Moreover, such a QE program may induce moral-hazard behaviour from the Euro-area member states. Additional problems, not so often acknowledged in the policy debate, would arise if, to counter the credit-risk exposure increase, the ECB's bond purchases were to be endowed with a credit seniority status (de facto, if not de iure) with respect to bonds owned by other investors. If the QE program were to be carried out on a massive scale, the seniority clause is bound to impart downward pressure on bond (market) prices, and that could be reinforced by the scanty level of liquidity of some Euro-area bond markets. Thus, a large QE program under the creditor seniority rule, while curbing the "moral-hazard" implications of government bond purchases through higher market rates, might have a perverse impact on euro-area systemic risk, as bank balance sheets holding a large share of euro-area government bonds could suffer substantial (mark-to-market) losses.
However, a broad-based QE program carried out under a pari-passu creditor rule is also fraught with problems, since it is likely to carry a significant increase in credit risk for the Euro-system central banks. With the introduction of a securitisation framework in a government-bond-based QE program we can avoid both the perverse effect of the seniority clause on banks' balance sheets and the additional credit risk burden on the Euro-system central banks. Also, a securitisation-based QE program can provide a platform for pursuing a policy strategy fostering the creation of a large, liquid market of euro-denominated (credit) risk-free bonds in the Euro area, via a securitisation program based on a Euro-area sovereign bond portfolio as the collateral pool. Our securitisation framework requires the segregation of a large pool of euro-area government bonds in a special purpose vehicle (or agency) that could fund this purchase by issuing two bond tranches. The senior tranche would be designed to satisfy the requirements of a reference asset for the ECB's QE program. In fact, it can be designed so that it could provide the desired low level of (credit) risk with the appropriate level of liquidity. As a by-product of the tranching structure, the junior tranche bonds would carry a meaningful credit spread, which would provide an important market signal about the macroeconomic policy of the member states. The SPV asset pool can be fully or partially "managed", and the management guidelines could be designed so that the investment policy would reflect the macroeconomic condition of individual member states. In the empirical part of the paper we provide quantitative evidence on our securitisation framework proposal and we compare it with a program of direct purchases of Euro-area public debt under two alternative scenarios: i) pari-passu creditor status of the ECB with respect to private investors; ii) preferred creditor status of the ECB. As far as the QE program size is concerned, we set a reference amount of some 2.100 bln euro of ECB bond purchases, possibly varying within a range of [1.800 ; 2.400] bln. Such a range of values for the QE volume of purchases is obtained in our securitisation framework as we consider a collateral pool of government bonds reaching some 3.000 bln euro. We also discuss the implications of a smaller QE program, in the range of [500 ; 1500] billion euro. We consider three cases of "un-managed" (fixed composition) direct purchases, according to which member states' bonds are purchased in proportion to: i) the "capital key" representing equity holdings of ECB shares; ii) a "liquidity key" representing the relative amount of sovereign bonds outstanding; iii) an equally weighted portfolio of sovereign bonds. We evaluate the impact of various SPV characteristics, such as the choice of bond tranche attachment, as well as various financial market scenarios. We provide a risk management analytical framework based on Monte Carlo simulations. It allows us to gauge quantitatively a number of possible Government-Bond-Based (GBB) QE programs for the Euro area. More specifically, our implementation strategy allows us to compute the probability distribution of credit losses under various credit spread and default correlation scenarios, as well as to design appropriate stress-testing procedures to gauge the robustness of our proposed GBB QE solutions. In this version of the paper our modelling strategy relies on the Gaussian copula assumption for default risk correlation.
We intend to relax this somewhat restrictive assumption in future research applications of our proposed risk management framework. Using end-of-October 2014 CDS market quotes for 10 Euro-area sovereign bond markets, as well as an estimate of a Gaussian copula measure based on historical CDS quote time series, we conclude that the expected loss - i.e. the fair-value credit spread - would be about 75 basis points on a 5-year horizon for a senior tranche bond with 30% attachment (that is, absorbing losses on the government bond portfolio beyond 30% of its value). For all practical purposes, such a credit spread would be similar to the risk of the German Bund as measured at the same valuation date. As for the junior tranche bond credit risk, we reckon an expected loss between 200 and 300 basis points, depending on the underlying assumptions, which would be comparable with Portuguese government bond credit risk at the same valuation date. Our stress test analysis, based on historical data recorded in November 2011 (the acme of the Euro-area sovereign debt crisis), shows that the securitisation structure would effectively limit the credit risk increase on the senior tranche to 5-6% of the expected losses estimated in our benchmark case on a 5-year horizon. As one should expect, this fairly solid protection of the senior tranche would come at the cost of a huge credit deterioration for the junior bond tranche, to an expected loss level of 60%. This implication should make the junior tranche bond a financial product suitable only for sophisticated institutional investors. The securitisation strategy compares favourably with direct government bond purchases if the latter are made under the pari-passu credit rule: in this case the expected loss on a 5-year horizon would be equal to some 4.5%, increasing to more than 20% in our stress scenario exercise. Thus, the implied credit spread would equal some 90 bps on an annual basis, jumping to 400 bps in our stress scenario analysis. All in all, the pari-passu rule entails a level of credit risk for the ECB balance sheet that would likely be deemed excessive. On the contrary, the direct purchase solution would hold up much better if the ECB were given preferred status with respect to other investors, as the seniority rule would provide a robust shield against default losses. However, the seniority rule is bound to have an impact on market credit risk as (non-official) investors would shoulder sovereign default losses. Our model-based estimates suggest that such an impact would appear to be reasonably contained for a broadly based, well-diversified QE program invested in the 10 largest Euro-area bond markets. For a QE size of 1,500 bln in purchases the implied market credit spread increase would be in the range of [4 ; 36] bps; raising the QE size to 2,000 bln euro would widen this range to [6 ; 51] bps; for a smaller QE program (500 bln), however, the range of credit spread increases would be restrained to only [1 ; 10] bps. Smaller Euro-area sovereign debt markets, such as Portugal, would bear the largest spread increase, reflecting their reduced market debt recovery rate as a result of the seniority clause attached to the QE purchases.
... Others, e.g., Kupiec (1998), Rebonato and Jäckel (1999), Higham (2001), Rapisarda et al. (2007), have developed various approaches with varying levels of complexity and intuitive appeal to address this challenge, including, e.g., hypersphere decomposition and angular decomposition methods. Rebonato and Jäckel (1999) remark that spectral decomposition can be used as a starting point for a search procedure that identifies an optimal matrix. ...
Article
Full-text available
With contemporary data collection capacity, data sets containing large numbers of different multivariate time series relating to a common entity (e.g., fMRI, financial stocks) are becoming more prevalent. One pervasive question is whether or not there are patterns or groups of series within the larger data set (e.g., disease patterns in brain scans, mining stocks may be internally similar but themselves may be distinct from banking stocks). There is a relatively large body of literature centered on clustering methods for univariate and multivariate time series, though most do not utilize the time dependencies inherent to time series. This paper develops an exploratory data methodology which in addition to the time dependencies, utilizes the dependency information between S series themselves as well as the dependency information between p variables within the series simultaneously while still retaining the distinctiveness of the two types of variables. This is achieved by combining the principles of both canonical correlation analysis and principal component analysis for time series to obtain a new type of covariance/correlation matrix for a principal component analysis to produce a so-called “principal component time series”. The results are illustrated on two data sets.
... Certainly it will not do to take some average over the sizes of single-factor moves as the size of the joint move. This would neglect the dependencies between risk factors, which are crucial for measuring the plausibility of joint moves (Kupiec 1998). A scenario in which risk factors move against their correlations is not plausible, even if every individual risk factor movement is fairly plausible. ...
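One standard way to make the remark above operational, under an assumption of elliptically distributed risk-factor moves, is to score a joint scenario by its Mahalanobis distance under the factor covariance matrix: a move that fights the correlation structure gets a much larger distance (lower plausibility) than a move of the same individual sizes that respects it. The covariance values and moves below are illustrative only.

```python
import numpy as np
from scipy.stats import chi2

# Illustrative covariance of two risk-factor returns (hypothetical values)
cov = np.array([[0.04, 0.03],
                [0.03, 0.09]])

def mahalanobis_plausibility(move, cov):
    """Mahalanobis distance of a joint move and the tail probability of a
    move at least that extreme under a multivariate normal with this cov."""
    d2 = move @ np.linalg.solve(cov, move)
    return np.sqrt(d2), chi2.sf(d2, df=len(move))

with_corr = np.array([-0.4, -0.6])     # both factors fall together
against_corr = np.array([-0.4, 0.6])   # factors move against their correlation

for label, mv in [("with correlation", with_corr),
                  ("against correlation", against_corr)]:
    d, p = mahalanobis_plausibility(mv, cov)
    print(f"{label:>20}: distance = {d:.2f}, tail prob = {p:.2e}")
```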
Article
Full-text available
The analysis and prediction of systemic financial risk in the US during the COVID-19 pandemic is of great significance to the stability of financial markets in the US and indeed the world. This paper aims to predict systemic financial risk in the US before and during the COVID-19 pandemic using copula–GJR–GARCH models with component expected shortfall (CES), and to identify systemically important financial institutions (SIFIs) for the two comparative periods. The empirical results show that overall systemic financial risk increased after the outbreak of the COVID-19 pandemic, especially in the first half of the year. We predicted four extreme-risk episodes and were broadly successful in capturing the periods of high risk in the US financial markets. Second, we identified the SIFIs; among the four financial groups, depository banks made the greatest contribution to systemic risk. Third, after the outbreak of the epidemic, the share of Broker–Dealers and Other Institutions in overall systemic risk increased appreciably. Finally, we recommend that US financial regulators consider macro-prudential guidance for major financial institutions and pay more attention to Broker–Dealers, thereby improving the financial stability of the US and the global financial markets.
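For readers unfamiliar with the CES measure used above, the sketch below shows one common formulation: each institution's contribution is its weight times its expected loss conditional on the aggregate (market) return falling below its alpha-quantile, so the contributions add up to the system's expected shortfall. The return series here are simulated placeholders; the paper's copula–GJR–GARCH forecasts are not reproduced.

```python
import numpy as np

# Minimal sketch of component expected shortfall with placeholder data.
rng = np.random.default_rng(0)
names = ["BankA", "BankB", "Broker", "Insurer"]
weights = np.array([0.40, 0.30, 0.20, 0.10])
returns = rng.multivariate_normal(
    mean=np.zeros(4),
    cov=0.0004 * (0.6 * np.ones((4, 4)) + 0.4 * np.eye(4)),  # hypothetical
    size=2000,
)

alpha = 0.05
market = returns @ weights
tail = market <= np.quantile(market, alpha)      # worst alpha fraction of days

es_system = -market[tail].mean()                 # system expected shortfall
ces = -weights * returns[tail].mean(axis=0)      # component contributions
for name, c in zip(names, ces):
    print(f"{name:8s} CES = {c:.5f}  share = {c / es_system:6.1%}")
print(f"Check: sum of CES = {ces.sum():.5f}  vs system ES = {es_system:.5f}")
```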
Chapter
The Value-at-Risk (VaR) concept was introduced by the American bank JP Morgan at the start of the 1990s to summarize the market risk impacting a portfolio or an assets-and-liabilities position in a single measure with a direct interpretation. The VaR quantifies, at a specified confidence level (typically 95% or 99%), the potential loss which could be sustained by a given isolated position, an entire portfolio, or a bank as a whole, over a short period of time (typically 1 to 10 trading days) in normal market conditions. Whereas the VaR is merely a quantile of the distribution of losses (Sect. 27.1), calculating it may turn out to be complicated for positions that include many different instruments, among them derivatives (Sect. 27.2). Furthermore, the VaR has various shortcomings, and other indicators such as Expected Shortfall (Sect. 27.3) and risk measuring tools (Sect. 27.4) have been developed to overcome these deficiencies.
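Since VaR is just a loss quantile and Expected Shortfall the average loss beyond it, the simplest (historical-simulation) estimators are a few lines of code. The sketch below is a generic illustration on simulated P&L, not the chapter's worked example.

```python
import numpy as np

def hist_var_es(pnl, confidence=0.99):
    """Historical-simulation VaR and Expected Shortfall of a P&L series.
    VaR is returned as a positive loss at the given confidence level;
    ES is the average loss beyond VaR."""
    losses = -np.asarray(pnl)
    var = np.quantile(losses, confidence)
    es = losses[losses >= var].mean()
    return var, es

# Illustrative use on simulated, heavy-tailed daily P&L (placeholder data)
rng = np.random.default_rng(1)
pnl = rng.standard_t(df=4, size=1000) * 10_000
var99, es99 = hist_var_es(pnl, 0.99)
print(f"1-day 99% VaR = {var99:,.0f}, 99% ES = {es99:,.0f}")
```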
Article
Full-text available
With political and economic scenarios changing at an ever faster pace, it is necessary to understand the potential effects on asset prices. Today the topic of rising inflation, in the US as well as in the Eurozone, although still considered temporary by central banks, confronts us with the "unexpected risk" of a deviation from the baseline scenario. This implies the risk of an aggressively restrictive monetary policy in the US, which would be harmful to financial markets. In this context, the question arises: is it possible to contemplate these events beforehand and act in good time? The answer is yes, and good risk management practices are important, using stress testing and scenario analysis techniques to accompany risk measures such as VaR and Expected Shortfall. Putting this concept into practice through stress tests and scenario analysis - Bloomberg Economics Forecast Models® and Bloomberg Factor Models® - the present work seeks to consider plausible adverse scenarios that may arise and to assess the related impacts at portfolio level. The final aim is to improve the investor's information set, allowing them, as far as possible, to avoid market falls that could prevent them from achieving their investment objectives.
Article
There is a large variability in profitability and productivity between farms operating with automatic milking systems (AMS). The objectives of this study were to identify the physical factors associated with profitability and productivity of pasture-based AMS and quantify how changes in these factors would affect farm productivity. We utilised two different datasets collected between 2015 and 2019 with information from commercial pasture-based AMS farms. One contained annual physical and economic data from 14 AMS farms located in the main Australian dairy regions; the other contained monthly, detailed robot-system performance data from 23 AMS farms located across Australia, Ireland, New Zealand, and Chile. We used linear mixed models to identify the physical factors associated with different profitability (Model 1) and partial productivity measures (Model 2). Additionally, we conducted a Monte Carlo simulation to evaluate how changes in the physical factors would affect productivity. Our results from Model 1 showed that the two main factors associated with profitability in pasture-based AMS were milk harvested/robot (MH; kg milk/robot per day) and total labour on-farm (full-time equivalent). On average, Model 1 explained 69% of the variance in profitability. In turn, Model 2 showed that the main factors associated with MH were cows/robot, milk flow, milking frequency, milking time, and days in milk. Model 2 explained 90% of the variance in MH. The Monte Carlo simulation showed that if pasture-based AMS farms manage to increase the number of cows/robot from 54 (current average) to ∼ 70 (the average of the 25% highest performing farms), the probability of achieving high MH, and therefore profitability, would increase from 23% to 63%. This could make AMS more attractive for pasture-based systems and increase the rate of adoption of the technology.
Article
In the shipping market, ports are increasingly interconnected in the global container network. A port disruption in one place can lead to a chain effect on subsequent ports. Nowadays, China's container port throughput accounts for around 30% of global port throughput, manifesting its essential role in the global container trade system. Therefore, any disruption in Chinese ports could have a cascading effect on world container trade. This study explores the potential impact on the main container trading lanes given a stress scenario for a number of Chinese ports. We are particularly interested in the degree of risk spillover to the different trading lanes. The stress scenario used in this study is a severe drop in container throughput in major Chinese ports, reflected in quantiles of the respective throughput distributions that are close to zero. D-vine copula-based quantile regression is used to measure the impact of the stressed Chinese port throughputs, as covariates, on the response container trading routes. Copulas are a powerful tool for measuring the tail dependence structure between the throughputs of major Chinese ports and major container trading routes. Results show that port disruptions in China severely impact the major container trading routes. The Asia to Sub-Saharan Africa route is extremely vulnerable to downside risk given any port disruption event in China. We have also compared the results with historical trade and port call data. The results provide significant implications for container shipping risk management.
Book
Full-text available
This companion book contains the solutions of the tutorial exercises which are included in the Handbook of Financial Risk Management. The table of contents is the following: 1. Introduction. Part I Risk Management in the Financial Sector 2. Market Risk. 3. Credit Risk. 4. Counterparty Credit Risk and Collateral Risk. 5. Operational Risk. 6. Liquidity Risk. 7. Asset Liability Management Risk. 8. Systemic Risk and Shadow Banking System. Part II Mathematical and Statistical Tools 9. Model Risk of Exotic Derivatives. 10. Statistical Inference and Model Estimation. 11. Copulas and Dependence Modeling. 12. Extreme Value Theory. 13. Monte Carlo Simulation Methods. 14. Stress Testing and Scenario Analysis. 15. Credit Scoring Models. Conclusion Appendix A.1 Numerical Analysis. A.2 Statistical and Probability Analysis. A.3 Stochastic Analysis.
Article
The covariance matrix of asset returns can change drastically and generate huge losses in portfolio value under extreme conditions such as market interventions and financial crises. Estimating the covariance matrix under a chaotic market is therefore a pressing task in risk management. Nowadays, stress testing has become a standard procedure for many financial institutions to estimate the capital requirement for their portfolio holdings under various stress scenarios. A possible stress scenario is to adjust the covariance matrix to mimic the situation under an underlying stress event. It is reasonable that when some covariances are altered, other covariances should vary as well. Recently, Ng et al. proposed a unified approach to determine a proper correlation matrix which reflects subjective views of correlations. However, this approach requires matrix vectorization and hence is not computationally efficient for high-dimensional matrices. Besides, it only adjusts correlations, yet it is well known that high correlations often go together with high standard deviations during a crisis period. To address these limitations, we propose a Bayesian approach to covariance matrix adjustment that incorporates subjective views of covariances. Our approach is computationally efficient and can be applied to high-dimensional matrices.
Article
Full-text available
The objective of this paper was to analyze the risk management of a portfolio composed of Petrobras PN, Telemar PN and Vale do Rio Doce PNA stocks. It was verified whether modeling Value-at-Risk (VaR) through plain Monte Carlo simulation with GARCH-family volatility is supported by the efficient market hypothesis. The results show that the static evaluation is inferior to the dynamic one, indicating that the dynamic analysis supports the efficient market hypothesis for the Brazilian equity market, in contrast with some empirical evidence. It was also verified that GARCH volatility models are sufficient to accommodate the variations of the Brazilian equity market, since they are able to capture its considerable dynamics.
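The general recipe behind such a study (a GARCH(1,1) conditional-variance recursion feeding a Monte Carlo draw of next-day returns, from which VaR is read off as a quantile) can be sketched as follows. The GARCH parameters and the return series are hypothetical placeholders; in practice the parameters would be estimated by maximum likelihood on the actual portfolio returns, which is not done here.

```python
import numpy as np

# Minimal sketch: one-day-ahead Monte Carlo VaR from a GARCH(1,1) recursion.
rng = np.random.default_rng(7)
returns = rng.standard_normal(1500) * 0.015       # placeholder daily returns

omega, alpha, beta = 1e-6, 0.08, 0.90             # assumed GARCH(1,1) parameters
sigma2 = np.empty_like(returns)
sigma2[0] = returns.var()
for t in range(1, len(returns)):
    sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]

# One-step-ahead variance forecast, then Monte Carlo over next-day returns
sigma2_next = omega + alpha * returns[-1] ** 2 + beta * sigma2[-1]
sim = rng.standard_normal(100_000) * np.sqrt(sigma2_next)
var_99 = -np.quantile(sim, 0.01)
print(f"1-day 99% Monte Carlo VaR (as a return): {var_99:.4%}")
```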
Article
Full-text available
Stress testing is a simulation technique for evaluating portfolio reactions to several critical situations. In this paper, we review different stress testing methodologies and examine the impacts of different stress scenarios on an Iranian equity portfolio. We identify the extreme tails of all risk factors in our portfolio using extreme value theory and model their dynamic and nonlinear dependence structures with copula functions. We performed three stress tests - historical, hybrid and hypothetical scenarios - to simulate the joint evolution of risk factors over time in a realistic way. According to the empirical findings, the historical scenario method is not a suitable tool for stress testing due to several drawbacks, which shows the importance of forward-looking analyses such as hybrid and hypothetical scenarios. We also indicate that the hypothetical stress approach is superior to the other two scenarios from the perspective of stress testing. Keywords: stress testing; Value at Risk; Expected Shortfall; extreme value theory; t copula; kernel smoothed empirical distribution
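The extreme-value step mentioned above is usually a peaks-over-threshold fit: a generalized Pareto distribution is fitted to losses beyond a high threshold and a tail VaR is backed out from it. The sketch below shows only that step on simulated, heavy-tailed losses; the paper's copula and dependence modelling is not reproduced, and all numbers are placeholders.

```python
import numpy as np
from scipy.stats import genpareto

# Peaks-over-threshold tail fit on a placeholder loss series
rng = np.random.default_rng(3)
losses = -rng.standard_t(df=3, size=5000) * 0.01      # heavy-tailed daily losses

u = np.quantile(losses, 0.90)                         # high threshold
excesses = losses[losses > u] - u
xi, _, beta = genpareto.fit(excesses, floc=0)         # GPD shape and scale

q = 0.99                                              # target VaR level
n, n_u = losses.size, excesses.size
var_q = u + beta / xi * (((n / n_u) * (1 - q)) ** (-xi) - 1)
print(f"GPD shape xi = {xi:.3f}, scale beta = {beta:.4f}, VaR(99%) = {var_q:.4%}")
```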
Contrary to the common stress-testing approach, under which individual banks are evaluated as to whether they are distressed, this empirical study moves from the micro stress test approach to a wider, new macro stress test category. By stress testing the entire economy of the Eurozone, it would permit big banks to fail and, at the same time, open room for new banking players to enter the sector, promoting the essence of healthy destruction. The analysis performs a battery of stress tests, implementing VaR, Cornish-Fisher VaR, Monte Carlo VaR, Expected Shortfall, Cornish-Fisher Expected Shortfall, and Monte Carlo Expected Shortfall. At the same time, it explicitly considers the new regulatory approach of IFRS 9 by incorporating extreme values from forecasted series in the distributions. The analysis also performs two versions of the stress tests, one including TARGET2 and one without it. The results document that future stress tests should include TARGET2 values in order to capture a better picture of the stressed economy. The findings from these stress tests clearly illustrate that, although there was a trough after the distress of 2008, this trough has ended. These results are derived without including the TARGET2 transfers. Including the TARGET2 transfers gives a different picture, suggesting that they possibly act as a protective mechanism against any future crisis. Caution is still advised, possibly due to some lingering imbalances within the Eurozone.
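Of the measures listed above, the Cornish-Fisher variant is the easiest to show compactly: it corrects the Gaussian quantile for sample skewness and excess kurtosis before scaling by the standard deviation. The sketch below uses simulated returns purely for illustration and is not the study's implementation.

```python
import numpy as np
from scipy.stats import norm, skew, kurtosis

def cornish_fisher_var(returns, confidence=0.99):
    """Cornish-Fisher (modified) VaR: adjust the normal quantile for the
    sample skewness S and excess kurtosis K of the return series."""
    r = np.asarray(returns)
    mu, sigma = r.mean(), r.std(ddof=1)
    s, k = skew(r), kurtosis(r)                 # kurtosis() is excess by default
    z = norm.ppf(1 - confidence)
    z_cf = (z
            + (z**2 - 1) * s / 6
            + (z**3 - 3 * z) * k / 24
            - (2 * z**3 - 5 * z) * s**2 / 36)
    return -(mu + z_cf * sigma)                 # reported as a positive loss

# Illustrative comparison with plain Gaussian VaR on placeholder data
rng = np.random.default_rng(11)
rets = rng.standard_t(df=5, size=2500) * 0.01
print("Cornish-Fisher 99% VaR:", round(cornish_fisher_var(rets, 0.99), 4))
print("Gaussian       99% VaR:",
      round(-(rets.mean() + norm.ppf(0.01) * rets.std(ddof=1)), 4))
```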
Chapter
The chapter briefly reviews the case for stress-testing risk models and recognizes the pressing ‘engineering’ problems that stand between the concept of stress testing and actually doing so for a risk model, such as a portfolio that includes a supposedly optimal allocation of assets. The chapter argues that the application of the Bayesian net ‘technology’ to stress testing introduced in the last decade lends itself particularly well to the need for a practical way to stress-test risk models. The chapter presents proposed solutions to the challenges of stress testing a model with particular reference to the use of Bayesian nets.
Chapter
As, in light of the recent financial crises, stress tests have become an integral part of risk management and banking supervision, the analysis and understanding of risk model behaviour under stress has become ever more important. In this paper, we present a general approach to implementing stress scenarios in a multi-factor credit portfolio model and analyse asset correlations, default probabilities and default correlations under stress. We use our results to study the implications for credit reserves and capital requirements and illustrate the proposed methodology by stressing a large investment banking portfolio. Although our stress testing approach is developed in a particular credit portfolio model, the main concept - stressing risk factors through a truncation of their distributions - is independent of the model specification and can be applied to other risk types as well.
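The core idea of stressing through truncation can be illustrated in a one-factor setting (a simplification of the multi-factor model in the chapter): condition the systematic factor on lying in its adverse tail and recompute the obligor default probability. The unconditional PD, asset correlation, and truncation level below are hypothetical placeholders.

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

# One-factor illustration of "stressing by truncation" of the systematic factor.
pd_uncond = 0.02          # unconditional default probability (assumed)
rho = 0.20                # asset correlation with the systematic factor (assumed)
c = norm.ppf(0.05)        # stress: factor confined to its worst 5% region

threshold = norm.ppf(pd_uncond)

def pd_given_z(z):
    # Default probability conditional on a realisation of the factor Z = z
    return norm.cdf((threshold - np.sqrt(rho) * z) / np.sqrt(1 - rho))

# Integrate the conditional PD against the truncated-normal factor density
pd_stressed, _ = quad(lambda z: pd_given_z(z) * norm.pdf(z) / norm.cdf(c),
                      -np.inf, c)

print(f"Unconditional PD: {pd_uncond:.2%}")
print(f"PD with the factor truncated to its worst 5% tail: {pd_stressed:.2%}")
```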
Chapter
The 2004 Basel II framework marks a milestone in the transition from the rules-based 1988 Basel I Accord to a more risk-sensitive and principles-based approach to bank regulation. It significantly expands the regulatory scope by a new capital charge for operational risk and by its new three-pillar structure consisting of minimum capital requirements, supervisory review process, and market discipline. Depending on the risk category, the minimum capital requirements of its most advanced approaches are either built on banks' internal models or on regulatory models requiring banks' own estimates of risk components as input. The evolutionary nature of the Basel II framework will encourage banks to migrate to more risk-sensitive approaches in the future, depending on the complexity of their business and their risk management expertise.
Chapter
Even though risk management is the quality control of finance to ensure the smooth functioning of the business model and the corporate model, this chapter takes a more focused approach to risk management. We begin by describing the methods to calculate risk measures. We then describe how these risk measures may be reported. Reporting provides feedback to the identification and measurements of risks. Reporting enables the risk management to monitor the enterprise risk exposures so that the firm has a built-in, self-correcting procedure that enables the enterprise to improve and adapt to changes. In other words, risk management is concerned with four different phases, which are risk measurement, risk reporting, risk monitoring, and risk management in a narrow sense. We focus on risk measurement by taking a numerical example. We explain three different methodologies for that purpose, and examine whether the measured risk is appropriate based on observed market data.
Article
Indefinite approximations of positive semidefinite matrices arise in various data analysis applications involving covariance matrices and correlation matrices. We propose a method for restoring positive semidefiniteness of an indefinite matrix M0 that constructs a convex linear combination S(α) = αM1 + (1 - α)M0 of M0 and a positive semidefinite target matrix M1 . In statistics, this construction for improving an estimate M0 by combining it with new information in M1 is known as shrinking. We make no statistical assumptions about M0 and define the optimal shrinking parameter as α∗ = min{α ∈ [0, 1] : S(α) is positive semidefinite}. We describe three algorithms for computing α∗. One algorithm is based on the bisection method, with the use of Cholesky factorization to test definiteness; a second employs Newton's method; and a third finds the smallest eigenvalue of a symmetric definite generalized eigenvalue problem. We show that weights that reflect confidence in the individual entries of M0 can be used to construct a natural choice of the target matrix M1. We treat in detail a problem variant in which a positive semidefinite leading principal submatrix of M0 remains fixed, showing how the fixed block can be exploited to reduce the cost of the bisection and generalized eigenvalue methods. Numerical experiments show that when applied to indefinite approximations of correlation matrices shrinking can be at least an order of magnitude faster than computing the nearest correlation matrix.
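The bisection variant described in this abstract is straightforward to sketch: test definiteness of the convex combination S(α) = αM1 + (1 − α)M0 with a Cholesky factorization and bisect on α. The tolerance, iteration cap, and example matrices below are assumptions for illustration, not the paper's implementation or test data.

```python
import numpy as np

def is_psd(a, tol=0.0):
    """Cholesky-based definiteness test (a tiny jitter keeps boundary cases
    from failing for purely numerical reasons)."""
    try:
        np.linalg.cholesky(a + tol * np.eye(a.shape[0]))
        return True
    except np.linalg.LinAlgError:
        return False

def optimal_shrinking_alpha(m0, m1, tol=1e-12, max_iter=100):
    """Bisection for alpha* = min{alpha in [0,1] : alpha*M1 + (1-alpha)*M0 PSD},
    assuming the target M1 is positive semidefinite."""
    if is_psd(m0, tol):
        return 0.0
    lo, hi = 0.0, 1.0            # S(0) = M0 indefinite, S(1) = M1 PSD
    for _ in range(max_iter):
        mid = 0.5 * (lo + hi)
        if is_psd(mid * m1 + (1 - mid) * m0, tol):
            hi = mid
        else:
            lo = mid
        if hi - lo < 1e-10:
            break
    return hi

# Illustrative use: an indefinite "correlation-like" matrix, identity target
m0 = np.array([[1.0, 0.9, 0.7],
               [0.9, 1.0, -0.9],
               [0.7, -0.9, 1.0]])
m1 = np.eye(3)
print(f"alpha* ~ {optimal_shrinking_alpha(m0, m1):.6f}")
```

Choosing the identity as target corresponds to plain shrinking toward no correlation; the paper's weighted targets and fixed-block variant would replace M1 accordingly.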
Chapter
Some ten years ago, Value at Risk (VaR) set out to conquer the risk management community. Originally intended as a reporting tool for senior management, it soon entered other core areas of banking, such as capital allocation, portfolio optimisation and risk limitation. With its increasing importance, regulators also acknowledged VaR as a risk measure, when they allowed the calculation of capital requirements to be based on VaR. In this case, however, they required that a rigorous and comprehensive stress testing program be in place in order to complement the statistical model (Basle Committee on Banking Supervision, 1996). In this context, two questions arise: (1) Why is there a need to complement VaR models? and (2) What can be regarded as a rigorous and comprehensive stress testing program?