Article

FOREX Risk: Measurement and Evaluation using Value-at-Risk

Authors: Don Bredin and Stuart Hyde

Abstract

We measure and evaluate the performance of a number of Value-at-Risk (VaR) methods using a portfolio based on the foreign exchange exposure of a small open economy (Ireland) among its trading partners. The sample period highlights the changing nature of Ireland's exposure to risk over the past decade in the run-up to EMU. Our results indicate the level of accuracy of the various approaches and address the trade-off between models that ensure statistical accuracy and those with more conservative leanings. Our findings suggest that the Orthogonal GARCH model is the most accurate methodology, while the EWMA specification is the more conservative approach. Copyright Blackwell Publishers Ltd, 2004.
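As a concrete illustration of one of the approaches compared in the abstract, the sketch below computes a one-day parametric VaR from a RiskMetrics-style EWMA variance forecast. This is a minimal Python/numpy sketch under assumed conventions (decay factor λ = 0.94, the standard RiskMetrics daily value, and simulated stand-in returns), not the paper's own implementation.

```python
import numpy as np

def ewma_var(returns, lam=0.94, alpha=0.01):
    """One-day parametric VaR from an EWMA (RiskMetrics-style) variance forecast.

    returns : 1-D array of daily portfolio returns (most recent last)
    lam     : decay factor (0.94 is the standard RiskMetrics daily value)
    alpha   : tail probability (0.01 -> 99% VaR)
    """
    var_t = returns[0] ** 2                      # seed the recursion with the first squared return
    for r in returns[1:]:
        var_t = lam * var_t + (1 - lam) * r ** 2
    sigma = np.sqrt(var_t)
    z = {0.01: 2.3263, 0.05: 1.6449}[alpha]      # standard normal quantiles
    return z * sigma                             # VaR expressed as a positive loss fraction

rng = np.random.default_rng(0)
rets = 0.006 * rng.standard_normal(1000)         # stand-in for daily FX portfolio returns
print(f"99% one-day VaR: {ewma_var(rets):.4%}")
```

The Orthogonal GARCH alternative found most accurate in the abstract replaces this single recursion with GARCH dynamics on principal components; a sketch of that structure appears further below.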


... The degree of performance increases because the implied volatility pertaining to index tracking can be directly fed into the market risk model (see studies by Ane, 2005; Lin & Shen, 2006; Papadamou & Stephanides, 2004). Bredin and Hyde (2004) tested the accuracy of six VaR forecasting models by adopting the interval forecast of Christoffersen (1998) and the binary and quadratic loss function applications of Lopez (1999). The models comprised the EQMA and EWMA variance-covariance approaches, three alternative multivariate GARCH methodologies, and a non-parametric estimation model, namely historical simulation (HS). ...
... This is due to the fact that when VaR quantifications involve models that are more flexible in handling fat-tail effects, such as GARCH-t, the volatility asymmetry effect is reduced. Evidence from these findings contributes to the growing but still limited empirical research, such as Bredin and Hyde (2004), Hull and White (1998), Lee and Saltoglu (2002), Lin and Shen (2006) and Su and Knowles (2006). Thus, it can be concluded that allowing for abnormalities (such as fat tails or asymmetries) in the evaluation of the Malaysian non-financial sectors will certainly improve the reliability of risk forecasts. ...
Article
Full-text available
This study examines Value-at-Risk (VaR) models that are integrated with several volatility representations to estimate the market risk for seven non-financial sectors traded on the first board of the Malaysian stock exchange. In a sample that spanned 19 years from 1993 until 2012 for the construction, consumer product, industrial product, plantation, property, trade and services, and mining sectors, the expected maximum losses are quantified at the 95% confidence level. For accuracy determination, assessments using the Kupiec and Christoffersen tests provide evidence that almost every model is accurate for all sets of occurrences. However, using the Lopez test, which takes into consideration the magnitude of the impact of exceptions, the most accurate model is the VaR integrated with GARCH-t. This study found that fat tails and asymmetries are important issues that need to be considered when estimating VaR in managing financial risks.
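The Kupiec and Christoffersen assessments mentioned above are likelihood-ratio backtests on the VaR exception series. Below is a minimal Python/numpy sketch of the Kupiec proportion-of-failures (POF) test; the simulated exception series in the demo is an assumption, and 3.84 is the 95% chi-square critical value with one degree of freedom.

```python
import numpy as np

def kupiec_pof(violations, alpha=0.05):
    """Kupiec proportion-of-failures likelihood-ratio test.

    violations : boolean array, True where the loss exceeded the VaR forecast
    alpha      : tail probability the VaR model claims (0.05 for 95% VaR)
    Returns the LR statistic; compare against 3.84 (chi-square, 1 df, 5% level).
    """
    T = len(violations)
    x = int(np.sum(violations))                  # number of exceptions
    if x in (0, T):                              # degenerate cases: avoid log(0)
        return -2 * T * np.log(1 - alpha) if x == 0 else -2 * T * np.log(alpha)
    pi_hat = x / T                               # observed exception rate
    return -2 * (x * np.log(alpha) + (T - x) * np.log(1 - alpha)
                 - x * np.log(pi_hat) - (T - x) * np.log(1 - pi_hat))

rng = np.random.default_rng(3)
hits = rng.random(250) < 0.08                    # an over-shooting model: ~8% exceptions
print(f"LR_pof = {kupiec_pof(hits, alpha=0.05):.2f}  (reject if > 3.84)")
```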
... Indian banks are showing considerable enthusiasm for foreign exchange business, which is evident from their increasing concentration on forex products. Although foreign exchange business has emerged as a profitable prospect, it also exposes banks to considerable risk (Bredin, 2004) [1]. Forex risk arises where banks hold assets or liabilities in foreign currency and the exchange rates fluctuate. ...
Article
Full-text available
A study regarding the foreign exchange risk management in Indian Commercial Banks is proposed to be conducted in Bengaluru city. The main objective of the study will be to examine the foreign exchange risks faced by Banks and their customers, to understand the different instruments used to hedge those risks and the efficacy of those measures in managing the risks. The study will be considering the scenario from a banker and customer perspective. The research will be entirely quantitative in nature and data will be collected through structured questionnaires. The data so collected will be analysed using various statistical techniques and financial ratios. The proposed study is expected to further the cause of forex risk management.
... A number of researchers have recently used this test (or similar) to examine VaR methods in currency markets, including Bredin and Hyde (2004) and Alexander and Sheedy (2008). In both cases the authors find that the GARCH-based or conditional VaR methods outperform their unconditional counterparts. ...
... A further deficiency in the VaR backtesting literature is the focus only on 1-day risk horizons (see Bredin and Hyde (2004) and Brooks and Persand (2002) for examples). In a market crisis it may not be possible to hedge positions immediately and therefore examining risk over a multi-day risk horizon is important. ...
Article
During the sub-prime crisis of mid-2007, VaR models for market risk at many major financial institutions performed disappointingly. This performance is consistent with the use of VaR measures that fail to account for volatility clustering. This paper presents new backtesting evidence from equity markets to support the use of GARCH-based risk measures as a solution to this problem. The paper also examines many of the objections voiced by experts in relation to GARCH-based risk measures (such as the variability of such risk measures, regulatory issues, accessibility) and discusses possible solutions. Finally the paper examines some implementation issues related to GARCH-based VaR measures.
... Currently, in forex markets, there is a trend of holding a few renowned currencies in reserve for trading. It is noted that the risks of holding reserves in a few currencies are higher than those of holding reserves in all world currencies [12][13][14]. Unfortunately, the latter choice is not feasible for trading, whereas the former carries the associated risks. In this regard, holding appropriate reserves in multiple currencies, although it seems simple, also has the potential to exploit the dynamic nature of the global forex market. ...
Article
Full-text available
In the current complex financial world, paper currencies are vulnerable and unsustainable due to many factors such as the current account deficit, gold reserves, dollar reserves, political stability, security, the presence of war in the region, etc. These vulnerabilities, not limited to the above, result in fluctuation and instability in currency values. Considering the devaluations in some countries such as Pakistan, Sri Lanka, Türkiye, and Ukraine, there is a current tendency of some countries to look beyond the SWIFT system. It is not feasible to have reserves in only one currency, and thus forex markets are likely to see significant growth in their volumes. In this research, we take up the challenge of maintaining sustainable forex reserves in multiple world currencies. The research aims to overcome their vulnerabilities and, instead, exploit their volatile nature to attain sustainability in forex reserves. In this regard, we formulate the problem and propose a forex investment strategy inspired by gradient ascent optimization, a robust iterative optimization algorithm. The dynamic nature of the forex market led us to the formulation and development of the instantaneous stochastic gradient ascent method. Contrary to conventional gradient ascent optimization, which considers the whole population or its sample, the proposed instantaneous stochastic gradient ascent (ISGA) optimization considers only the next time instance to update the investment strategy. We employed the proposed forex investment strategy on forex data containing one year of multiple currencies' values, and the results are quite profitable compared to conventional investment strategies.
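The abstract does not give the ISGA update equations, so the following Python/numpy sketch is purely illustrative of the idea it describes: ascend the gradient of the next-instant portfolio return only. The objective, the step size eta, and the renormalisation are all assumptions, not details taken from the paper.

```python
import numpy as np

def isga_step(weights, r_next, eta=0.05):
    """One hypothetical instantaneous stochastic gradient ascent step.

    weights : current allocation across currencies (sums to 1)
    r_next  : vector of currency returns at the next time instance only
    eta     : step size (an assumed value, not from the paper)

    Assumed objective: next-instant portfolio return w . r_next, whose
    gradient with respect to w is simply r_next.
    """
    w = weights + eta * r_next                   # ascend the instantaneous gradient
    w = np.clip(w, 0.0, None)                    # keep allocations non-negative
    return w / w.sum()                           # renormalise to a full allocation

w = np.ones(4) / 4                               # equal reserves in four currencies
r = np.array([0.002, -0.001, 0.004, 0.000])
print(isga_step(w, r))
```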
... The concept of time diversification refers to the relationship between risk and the holding period. Bredin and Hyde (2004) use a portfolio based on the FOREX exposure of a small open economy among its counterparties. The relative merits of different forecasting techniques are compared under the assumption that expected returns on the assets are zero. ...
Article
Full-text available
Foreign exchange trading is popular among professional traders. There are many online discussion boards where traders wonder what the optimal time interval is. Despite the discussions, there has been no scientific exploration. It is necessary to see whether empirical evidence supports the existence of such an optimal interval. It is also essential to establish a methodology for calculating such intervals for different asset classes. In this paper, we discuss two different models that can be used to find the optimal holding period of foreign exchange for a given target profit. We also conduct numerical tests illustrating that the two models coincide with each other. The research is beneficial for automated trading systems.
... Among the many applications of the VaR model in the FX field, one can cite Al Janabi (2006), who proposes the consistent use of a VaR framework for managing the trading risk exposure of FX securities in emerging and illiquid markets. Bredin and Hyde (2004) review the performance of several VaR methods using a portfolio based on the FX exposure of a small open economy. ...
Article
This paper presents a rule for foreign exchange interventions (FXI), designed to preserve financial stability in floating exchange rate arrangements. The FXI rule addresses a market failure: the absence of a hedging solution for tail exchange rate risk in the market (i.e. high volatility). Market impairment or an overshoot of the exchange rate between two equilibria could generate high volatility and threaten financial stability due to unhedged exposure to exchange rate risk in the economy. The rule uses the concept of Value at Risk (VaR) to define FXI triggers. While it provides the market with a hedge against tail risk, the rule allows the exchange rate to adjust smoothly to new equilibria. In addition, the rule is budget neutral over the medium term, encourages prudent risk management in the market, and is more resilient to speculative attacks than other rules, such as fixed-volatility rules. The empirical methodology is backtested on Banco de Mexico's FXI data between 2008 and 2016.
... The growth of trading and business activities since the 1990s has been matched by an increase in financial instability and losses. For this purpose, the value at risk (VaR) approach to measuring this category of market risk estimates the largest loss on its portfolio that the business in question could incur (Bredin and Hyde, 2004). As in other industries, banks have also had to face exposure to exchange rate risk, and their performance can be affected by currency fluctuations. ...
Article
Full-text available
Keywords: bank-based risk factors; capital ratio; audit quality; financial stability; Pakistan. JEL Classification: G21, G32, G33. The purpose of this study is to examine the effect of bank-based risk measures and country-related and international risk factors, along with capital ratio and audit quality, on stability measures. This article fills a literature gap by addressing two financial stability measures: the Z-score through return on assets and return on equity (ZROA and ZROE). A sample of 28 commercial banks is collected from the national financial market in Pakistan, with annual observations from 2007 to 2016. Panel regression models such as ordinary least squares (OLS), fixed effects and random effects with robust standard errors are applied to examine the effect of risk factors, capital ratio and audit quality on financial stability (FS). The study finds that bank-based risk factors such as liquidity, credit and operational risk have a significant negative influence on both stability measures. An excessive capital ratio also seems to adversely affect financial stability measures. Additionally, higher payments to auditors increase audit quality, resulting in a positive influence on both stability measures. The findings recommend that policy makers, financial analysts and credit officers in banks analyse and review the relationship between risk factors, capital ratio, audit quality, and the FS of Pakistani commercial banks. However, this work is limited to commercial banks, with no consideration of development financial institutions and industrial banks. Additionally, there is no methodological application of advanced techniques like GMM.
... Moreover, the result of the diversification based on different ways of calculating volatility is quite exceptional. Contrary to various previous empirical studies which have shown that GARCH is the best method for predicting the volatility of financial series (Berkowitz and O'Brien 2002; Bredin and Hyde 2004; So and Yu 2006; Orhan and Köksal 2012), our findings indicate that EWMA is what investors might use for volatility computations, at least for volatility in the Taiwan foreign exchange market. ...
Article
Full-text available
The main purpose of this paper is to select the most appropriate technique predicting precisely the exchange rate risk from three main approaches, namely, the Historical Simulation approach, the Variance–Covariance approach and the Monte Carlo Simulation approach. Our main finding shows that the historical simulation approach with exponentially weighted moving average, which exhibits the lowest out-of-sample loss, is the most appropriate method for value at risk estimation with regard to a multi-currency portfolio construction in the Taiwan foreign exchange market. Moreover, results in backtesting lend support to the accuracy of our proposed strategies at the 99% confidence level.
... The choices are informed by the distribution in Figure 3. We took account of the heavy tails in the distribution and the leverage effects normally associated with forex trading (Bredin and Hyde, 2004; Giot & Laurent, 2004). This informed our choice of the student-t and skewed student-t errors in modelling. ...
Article
We studied the regime-switching behaviour of the volatility of returns on the ZAR/USD exchange rate for the period January 4, 2002 to December 31, 2017. The results showed that, contrary to mainstream approaches estimating volatility with GARCH(1,1), there are clear regimes in the returns which necessitate regime-switching models. The results further revealed that the Markov regime-switching GARCH(1,1) model with skewed student-t innovations is superior in capturing the heteroscedasticity of the returns. The deviance information criterion was used as the selection metric among six candidate models.
... Historical simulation is the most commonly used non-parametric VaR technique (Bredin and Hyde, 2004). The historical VaR uses historical returns to calculate VaR via order statistics: let R(1) ≥ R(2) ≥ … ≥ R(T) be the order statistics of the T returns, with losses taken as positive; the portfolio's VaR is then read from the appropriate order statistic of the historical return data. ...
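Restated cleanly, the order-statistics recipe in the excerpt amounts to picking an empirical quantile of the loss distribution. A minimal Python/numpy sketch, with losses taken as positive as in the excerpt and simulated stand-in returns:

```python
import numpy as np

def historical_var(returns, alpha=0.05):
    """Historical-simulation VaR via order statistics.

    returns : array of T historical portfolio returns
    alpha   : tail probability (0.05 -> 95% VaR)

    Losses are taken as positive, L_t = -r_t. Sorting losses in decreasing
    order, L(1) >= L(2) >= ... >= L(T), the VaR is the ceil(alpha * T)-th
    largest loss.
    """
    losses = np.sort(-np.asarray(returns))[::-1]   # losses in decreasing order
    k = int(np.ceil(alpha * len(losses)))
    return losses[k - 1]

rng = np.random.default_rng(1)
rets = 0.01 * rng.standard_normal(500)             # stand-in portfolio returns
print(f"95% historical VaR: {historical_var(rets):.4%}")
```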
Conference Paper
Full-text available
Traditional measures of performance evaluation have long been in vogue; however, Value at Risk (VaR) approaches have been making their place in the portfolio management industry over the last ten years. The VaR approach focuses on the downside volatility of a portfolio, making clear to the investor the maximum possible loss on the portfolio. In today's highly volatile environment, an investor is concerned more about downside risk than upward swings in the portfolio, and would hesitate to invest in a portfolio carrying more downside risk. The present paper attempts to compare the traditional and VaR performance measures and explore whether differences exist in the ranking of funds under the two approaches. Results indicate that Sharpe ratio and normal VaR rankings are the same when used independently, while results differ when using Treynor and Jensen's alpha.
... One of the most widely used non-parametric approaches is the historical simulation approach (Bredin and Hyde, 2004). The historical VaR uses historical returns to calculate VaR using order statistics. ...
... Bredin and Hyde (2004) measured and evaluated the performance of parametric and non-parametric VaR methods using a portfolio based on the foreign exchange exposure of a small open economy, Ireland, among its key trading partners. The results suggest that the orthogonal GARCH model is the most accurate method, while the EWMA specification is the more conservative approach. ...
... For example, Hendricks (1996) calculated VaR measures using the equally weighted moving average approach and the historical simulation approach to evaluate the performance of portfolios. Bredin and Hyde (2004) measured and evaluated the VaR performance of a foreign exchange portfolio using the standard variance-covariance approach, historical simulation, orthogonal GARCH and the exponentially weighted moving average. ...
Article
Full-text available
This study aimed to examine the performance evaluation and persistence of equity mutual funds when the market reverses from a bull market to a bear market. We also attempt to find the appropriate performance evaluation indicators when funds exhibit performance reversal. Such an indicator will be of great assistance to investors in making investment decisions. Therefore, this study investigates the performance of 30 equity mutual funds in Taiwan. The data cover 500 dealing days from November 2006 to October 2008, with an emphasis on the 2007 financial crisis. There were two main findings in our study. First, we found that different performance measures led to significantly diverging rankings of mutual funds in different market climates. Second, the result of the persistence test showed that when the market climate shifts from bull to bear market, using the GRAROC model is the best indicator of future mutual fund performance.
... Having briefly described what Forex is, let us consider its "dynamic" characteristics, i.e. what happens within it [5,6]. From the assumptions made, it appears first that Forex moves a high amount of money. ...
Article
Full-text available
Prediction of various market indicators is an important issue in finance. This can be accomplished through computer models and related applications. It turned out that artificial models have both great advantages and some limitations for learning the data patterns and predicting future values of the financial phenomenon under analysis. In this paper we analyze the particular financial market called Forex and the way computing models are used to automate trading strategies by making affordable predictions on the evolution of exchange rates between currencies.
... In a related study, Angelidis and Degiannakis (2005) evaluate the accuracy of parametric, non-parametric and semi-parametric methods in predicting the one-day-ahead VaR in three types of markets (namely, stock exchanges, commodities and foreign exchange rates) and for both long and short trading positions. In another study, Bredin and Hyde (2004) measure and evaluate the performance of a number of VaR methods that have proved to be popular, using an equally weighted portfolio based on the foreign exchange exposure of a small open economy (Ireland) among its six major trading partners. Accordingly, their findings suggest that the Orthogonal GARCH model is the most accurate methodology, while the exponential weighted moving average (EWMA) specification is the more conservative approach. ...
Article
Full-text available
The aim of this article is to bridge the gap in the equity trading risk management literature, particularly from the perspective of emerging and illiquid markets, such as the Gulf Cooperation Council (GCC) financial markets. In this article, we demonstrate a practical approach for the measurement and control of market risk exposure for financial trading portfolios that contain several illiquid equity securities during the unwinding (close-out) period. This approach is based on the renowned concept of Value-at-Risk (VaR), along with the development of an optimisation risk algorithm utilising a matrix-algebra technique. Our asset market risk modelling algorithm can simultaneously handle VaR analysis under normal and severe market conditions, and takes into account the effects of illiquidity and short sales of traded equity securities. In order to illustrate the proper use of VaR and stress-testing methods, real-world structured modelling techniques of trading risk management are presented for the GCC financial markets. To this end, comprehensive simulation case studies are accomplished with the objective of constructing a realistic framework for trading risk measurement and control, in addition to the instigation of a risk optimisation algorithm for the computation of maximum authorised VaR risk-budgeting limits. JEL Classification: C10, C13, G20, G28
... In their article, Angelidis and Degiannakis (2005) evaluate the accuracy of parametric, non-parametric and semi-parametric methods in predicting the one-day-ahead VaR in three types of markets (namely stock exchanges, commodities and foreign exchange rates) and for both long and short trading positions. In another study, Bredin and Hyde (2004) measure and evaluate the performance of a number of VaR methods that have proven popular, using an equally weighted portfolio based on the foreign exchange exposure of a small open economy (Ireland) among its six major trading partners. Accordingly, their findings suggest that the Orthogonal GARCH model is the most accurate methodology while the exponential weighted moving average (EWMA) specification is the more conservative approach. ...
Article
Full-text available
The aim of this article is to bridge the gap in equity trading risk management literatures and particularly from the perspective of emerging and illiquid markets, such as in the context of the Gulf Cooperation Council (GCC)’s six financial markets. To the authors’ best knowledge, this is the first research paper that addresses the issue of equity trading risk management in the GCC countries with direct applications to their six stock markets. In this paper, the authors demonstrate a practical approach for measurement, management and control of market and liquidity risk exposures for financial trading portfolios that contain several illiquid equity securities. This approach is based on the renowned concept of Liquidity-Adjusted Value at Risk (L-VaR) along with the development of an optimization software tool utilizing matrix-algebra technique under the notion of different correlation factors and liquidation horizons. The comprehensive trading risk model can simultaneously handle L-VaR analysis under normal and severe market conditions besides it takes into account the effects of illiquidity of all traded equity securities. In order to illustrate the proper use of L-VaR and stress-testing methods, real-world examples and feasible reports of equity trading risk management are presented for the six GCC equity financial markets by implementing a daily database of indices’ returns for the period 2004-2008. To this end, several financial modeling studies are achieved with the objective of creating a realistic framework of equity trading risk measurement and control reports in addition to the instigation of a practical iterative optimization technique for the calculation of maximum authorized L-VaR limits subject to real-world optimum operational constraints.
... In their article Angelidis and Degiannakis (2005), enumerate the accuracy of parametric, nonparametric and semi-parametric methods in predicting the one-day-ahead VaR in three types of markets (namely, stock exchanges, commodities and foreign-exchange rates) and for both long and short trading positions. In another study, Bredin and Hyde (2004) measure and evaluate the performance of a number of VaR methods that have proved popular using an equally weighted portfolio based on the foreign-exchange exposure of a small open economy (Ireland) among its six major trading partners. Accordingly, their findings suggest that the Orthogonal GARCH model is the most accurate methodology while the exponential weighted moving average (EWMA) specification is the more conservative approach. ...
Article
Full-text available
Purpose: It is the purpose of this article to empirically test the risk parameters for larger foreign‐exchange portfolios and to suggest real‐world policies and procedures for the management of market risk with the aid of value at risk (VaR) methodology. The aim is to fill a void in the foreign‐exchange risk management literature, particularly for large portfolios that consist of long and short positions in the currencies of numerous developed and emerging economies.
Design/methodology/approach: A constructive approach for the management of risk exposure of foreign‐exchange securities is demonstrated, which takes into account proper adjustments for the illiquidity of both long and short trading/investment positions. The approach is based on the renowned concept of VaR, along with a software tool utilizing matrix algebra and other optimization techniques. Real‐world examples and reports of foreign‐exchange risk management are presented for a sample of 40 distinctive countries.
Findings: A number of realistic case studies are achieved with the objective of setting up a practical framework for market risk measurement, management and control reports, in addition to the inception of a practical procedure for the calculation of an optimum VaR limits structure. The attainment of the risk management techniques is assessed for both long and short proprietary trading and/or active investment positions.
Practical implications: The main contribution of this article is the introduction of a practical risk approach to managing foreign‐exchange exposure in large proprietary trading and active investment portfolios. Key foreign‐exchange risk management methods, rules and procedures that financial entities, regulators and policymakers should consider in setting up their foreign‐exchange risk management objectives are examined and adapted to the specific needs of a model of 40 distinctive economies.
Originality/value: Although a substantial literature has examined the statistical and economic meaning of VaR models, this article provides real‐world techniques and optimum asset allocation strategies for large foreign‐exchange portfolios in emerging and developed financial markets. The objective is to set up the basis of a methodology/procedure for the measurement, management and control of foreign‐exchange exposures in day‐to‐day trading and/or asset management operations.
... Sarma et al. (2001) applied this model to estimate the volatility of the S&P 500 and NSE-50 indexes. More recently, GARCH(1,1) was applied by Bredin and Hyde (2004) to estimate the volatility of currency portfolios. ...
Article
Full-text available
The main purpose of this paper is to compare empirically four Value-at-Risk simulation methods, namely the Variance-Covariance, Historical Simulation, Bootstrapping and Monte Carlo methods. We estimate the VaR associated with three currencies and four currency portfolios in the Tunisian exchange market. The data cover the period between 01-01-1999 and 31-12-2007. Independently of the technique used, the Japanese yen appears to be the most risky currency. Moreover, portfolio diversification reduces exchange rate risk. Lastly, the number of violations, when they exist, does not generally differ between the simulation methods. Recent evaluation tests were applied to select the technique that most precisely predicts exchange rate risk. Results based on these tests suggest that the traditional Variance-Covariance method is the most appropriate.
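The variance-covariance method favoured in the study above reduces to a single matrix expression, VaR = z·sqrt(w'Σw). A minimal Python/numpy sketch for a currency portfolio, in which the weights, the normal quantile and the simulated returns are illustrative assumptions:

```python
import numpy as np

def varcov_var(returns, weights, z=1.6449):
    """Parametric (variance-covariance) portfolio VaR.

    returns : T x N matrix of daily returns for N currencies
    weights : portfolio weights across the N currencies
    z       : standard normal quantile (1.6449 -> 95%, 2.3263 -> 99%)

    Portfolio variance is w' SIGMA w, so VaR = z * sqrt(w' SIGMA w).
    """
    sigma = np.cov(returns, rowvar=False)        # N x N covariance matrix
    port_var = weights @ sigma @ weights
    return z * np.sqrt(port_var)

rng = np.random.default_rng(2)
rets = 0.005 * rng.standard_normal((750, 3))     # stand-in for 3 currencies
w = np.array([0.5, 0.3, 0.2])
print(f"95% variance-covariance VaR: {varcov_var(rets, w):.4%}")
```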
Chapter
Effective liquidity management and risk controls are crucial for maintaining the stability of both the global financial system and individual financial institutions. Despite consensus on this necessity, regulatory approaches have varied. Over recent decades, technological advancements have expanded the tools available for liquidity risk management. However, these innovations have also led to underestimations of actual risk exposure. This chapter aims to propose proactive policies and procedures for managing trading risk exposure and liquidity, particularly focusing on their interplay with counterparty, credit, and market risks. It offers practical solutions and internal regulations for financial operational divisions, emphasizing the unique challenges faced by trading units in both emerging and developed economies. Leveraging the author's background as a global market risk director and head of derivatives trading, this chapter offers valuable insights and guidelines for market participants, regulators, and policymakers to develop robust trading units within a sound regulatory framework. Specific techniques for managing trading risk are reviewed and adapted to the needs of developing markets, with a focus on multiple-asset proprietary portfolios. The chapter also explores the utility of trading risk models, such as Value-at-Risk (VaR), offering practical algorithms for parametric variance–covariance VaR. Guidelines for incorporating asset illiquidity into trading portfolios are provided, along with a thorough explanation of the Risk-Adjusted-Return-On-Capital (RAROC) technique. Additionally, the chapter addresses gaps in statistical data, offering solutions for incomplete and omitted statistics. In this context, this chapter presents comprehensive strategies for managing trading and liquidity risks, aiming to enhance the reliability and efficiency of financial markets in both emerging and developed economies.
Book
Full-text available
INTERNATIONAL CONFERENCE ON BUSINESS TRANSFORMATION DURING COVID-19 PANDEMIC SITUATION IN INDIA, ICBTCPI – 2021
Chapter
Full-text available
Efficient screening of transactions provides an empowering tool for anti-money laundering procedures and actions. Automatic classification and detection of anomalous behaviours and transaction structures enable faster and more effective action on the side of the supervisory authority. This chapter introduces research achievements and tools developed to streamline transaction monitoring and to ease the work of domain experts through automatic and semi-automatic filtering of risky transaction typologies. The presented tools are integrated as part of PAMLS (Platform for Anti-Money Laundering Supervision) to streamline and automate the discovery of risky behaviours in bank transaction data enriched with relevant company information. Enriched transactional data is pseudo-anonymized with respect to the legal and regulatory framework. The screening tool, as part of the PAMLS platform, automatically detects and marks specific predefined scenarios using a newly developed state-of-the-art AI method tailored specifically to time-evolving transaction graphs in transaction data. Easy-to-use tools, an early warning system and subsequent parameterized queries with additional white-listed scenarios provide domain experts with additional data to easily explore suggested dangerous transaction groups and make more informed decisions and further action, be it at the level of a specific financial institution or a cluster of them.
Chapter
Full-text available
The General Data Protection Regulation (GDPR) has been in place since May 2018 to give EU citizens more control over their personal data, applying principles like security and privacy by design. One of the most powerful tools to allow data processing while remaining compliant with data protection regulations is anonymisation, a procedure that consists of transforming data in such a way that re-identification of the data subjects is no longer possible. This chapter describes how anonymisation can be performed at a large scale, addressing common challenges in becoming GDPR compliant.
Chapter
Full-text available
This chapter presents Business Financial Management (BFM) tools for Small Medium Enterprises (SMEs). The presented tools represent a game changer as they shift away from a one-size-fits-all approach to banking services and put emphasis on delivering a personalized SME experience and an improved bank client’s digital experience. An SME customer-centric approach, which ensures that the particularities of the SME are taken care of as much as possible, is presented. Through a comprehensive view of SMEs’ finances and operations, paired with state-of-the-art ML/DL models, the presented BFM tools act as a 24/7 concierge. They also operate as a virtual smart advisor that delivers in a simple, efficient, and engaging way business insights to the SME at the right time, i.e., when needed most. Deeper and better insights that empower SMEs contribute toward SMEs’ financial health and business growth, ultimately resulting in high-performance SMEs.
Chapter
Full-text available
This chapter presents a holistic solution to the issue of data pipelining that ingests data as fast as needed, works with current and historical data, handles aggregates efficiently, and can do so at any scale. This holistic solution minimizes the Total Cost of Ownership (TCO) of the storage systems needed to develop a data pipeline and minimizes the execution time of the data pipeline. In this direction, the chapter presents a range of architectural patterns for data pipelining and illustrates how the presented solution boosts their simplification and optimization.
Chapter
Full-text available
Despite the rapid digital transformation of banks and financial institutions, state-of-the-art Know Your Customer (KYC) processes still require customers to provide multiple artifacts to the different banks they collaborate with. In an era where data sharing is facilitated from a technological and a regulatory point of view, there is huge potential for improving the efficiency of KYC processes. However, this requires a trustful environment for exchanging data across the various stakeholders, including customers, banks, and other financial organizations. This chapter illustrates how blockchain technology can be used to foster such a trusted environment. It also presents the implementation of a decentralized KYC solution over the Hyperledger Fabric permissioned blockchain infrastructure.
Chapter
Risk assessment is of high importance when it comes to trading, investments, and other financial activities, as poor risk monitoring could lead to inefficient investments, loss of capital, and penalties by regulatory authorities. Thus, robust risk models, capable of yielding real-time results, are valuable assets for investment banking. This chapter introduces a financial tool that can provide risk assessment on Forex portfolios in (near) real-time and pre-trade analysis at rest. Financial risk is measured in terms of both Value at Risk and the Expected Shortfall, with the respective models utilizing not only statistical but also deep learning techniques that achieve accurate results. Moreover, the proposed application, based on state-of-the-art data management technologies, provides real-time risk assessments, utilizing the latest market data. These features along with the provided pre-trade analysis make this solution a valuable tool for practitioners in high frequency trading (HFT) and investment banking in general.
Chapter
Full-text available
Fraud in financial services is an ever-increasing phenomenon, and cybercrime generates multimillion revenues; therefore even a small improvement in fraud detection rates would generate significant savings. This chapter arises from the need to overcome the limitations of rule-based systems for blocking potentially fraudulent transactions. After mentioning the limitations of the rule-based approach, this chapter explains how machine learning is able to address many of these limitations and, more effectively, identify risky transactions. A novel AI-based fraud detection system, built on data science and machine learning techniques, is presented: transaction data is pre-processed and the predictive model trained in a batch layer (to periodically retrain the model with new data), while real-time fraud detection is handled in a stream layer based on new input transaction data. The architecture presented makes this solution a valuable tool for supporting fraud analysts and for automating fraud detection processes.
Chapter
Full-text available
Recent advances in Big Data and Artificial Intelligence have created new opportunities for AI-based agents, referred to as Robo-Advisors, to provide financial advice and recommendations to investors. In this chapter, we will introduce the concept of investment recommendation and describe how automated services for this task can be developed and tested. In particular, this chapter covers the following core topics: (1) the legal landscape for investment recommendation systems, (2) what financial asset recommendation is and what data it needs to function, (3) how to clean and curate that data, (4) approaches to build/train asset recommendation models and (5) how to evaluate such systems prior to putting them into production.
Chapter
Full-text available
For over a decade, the Bitcoin network has processed transactions collectively worth billions in a fully decentralized and trustless way. Its introduction as a way of disintermediating financial institutions came at a time of rising dismay against the establishment due to the financial collapse of 2008. Bitcoin's rising popularity gave birth to the realization that its underlying technologies could be utilized for other use cases besides money. This gave rise to imitators, iterators and ultimately a diverse ecosystem of protocols and applications. In most cases, those protocols employed technologically proprietary approaches even when aiming to solve similar problems. Today, those separate networks stand as monolithic structures, with no knowledge of information that might exist on another. As such they are hostages to their non-interoperable nature and bound to hardcoded decisions. Their attempts at change often result in divided communities and further balkanization. This dissolution threatens the integrity of the decentralized space, as desolate systems are susceptible to manipulation. Some believe that the future of the wider decentralised ecosystem will rely on a Web 3.0 internet-like infrastructure that will allow for seamless integration and the free exchange of information between systems. In this chapter, we explore the source of the limitations of blockchain systems, provide a historical overview of current and past interoperability approaches and discuss present and future solution design.
Chapter
Full-text available
In recent years there is a surge in the amount of digital data that are generated by financial organizations, which is driving the development and deployment of novel Big Data and Artificial Intelligence (AI) applications in the finance sector. Nevertheless, there is still no easy and standardized way for developing, deploying and operating data-intensive systems for digital finance. This chapter introduces a standards-based reference architecture model for architecting, implementing and deploying big data and AI systems in digital finance. The model introduces the main building blocks that comprise machine learning and data science pipelines for digital finance applications, while providing structuring principles for their integration in applications. Complementary viewpoints of the model are presented, including a logical view and considerations for developing and deploying applications compliant to the reference architecture. The chapter ends up presenting a few practical examples of the use of the reference model for developing data science pipelines for digital finance.
Chapter
Full-text available
In this chapter, we provide a historic overview of the origin and definitions of Central Bank Digital Currencies (CBDCs), by examining relevant research dating back to the 1990s. We find that digital versions of sovereign money accessible by the private sector were motivated by advancements and challenges emerging from the private sector itself. We present the factors that necessitate their issuance, and especially focus on financial stability, monetary policy, and the increased competition in payments leading to threats in financial and monetary sovereignty. Finally, we assess the appeal of the various technical options for CBDCs against what has emerged as their universally desirable features.
Article
Purpose: It is quite possible that financial institutions, including life insurance companies, will encounter turbulent situations such as the COVID-19 pandemic before policies mature. Constructing models that can generate scenarios for major assets to cover abrupt changes in financial markets is thus essential for a financial institution's risk management.
Design/methodology/approach: The key issues in such modeling include how to manage the large number of risk factors involved, how to model the dynamics of chosen or derived factors and how to incorporate relations among these factors. The authors propose the orthogonal ARMA–GARCH (autoregressive moving-average–generalized autoregressive conditional heteroskedasticity) approach to tackle these issues. The constructed economic scenario generation (ESG) models pass the backtests covering the period from the beginning of 2018 to the end of May 2020, which includes the turbulent situations caused by COVID-19.
Findings: The backtesting covering the turbulent period of COVID-19, along with fan charts and comparisons of simulated and historical statistics, validates our approach.
Originality/value: This paper is the first to attempt to generate complex long-term economic scenarios for a large-scale portfolio from its large-dimensional covariance matrix estimated by the orthogonal ARMA–GARCH model.
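The orthogonal (ARMA–)GARCH idea above, which is also the best-performing method in the Bredin and Hyde abstract, factors a large covariance matrix through principal components and fits a univariate GARCH to each. A minimal Python sketch of the plain orthogonal GARCH structure follows; it assumes the third-party `arch` package for the univariate GARCH(1,1) fits, omits the ARMA mean dynamics, and keeps only `k` components (so the reconstructed covariance is reduced-rank), all as simplifying assumptions.

```python
import numpy as np
from arch import arch_model          # third-party package, assumed available

def orthogonal_garch_cov(returns, k=2):
    """One-step-ahead covariance forecast via a plain orthogonal GARCH.

    returns : T x N matrix of (demeaned) asset returns
    k       : number of principal components retained

    Steps: (1) PCA of the unconditional covariance; (2) univariate
    GARCH(1,1) on each retained component; (3) rebuild the asset-space
    covariance from the component variance forecasts.
    """
    sigma = np.cov(returns, rowvar=False)
    eigval, eigvec = np.linalg.eigh(sigma)
    order = np.argsort(eigval)[::-1][:k]         # leading components
    A = eigvec[:, order]                         # N x k loading matrix
    pcs = returns @ A                            # T x k component series
    d = np.empty(k)
    for i in range(k):
        # scale by 100 for numerical stability, undo the scaling afterwards
        res = arch_model(100 * pcs[:, i], p=1, q=1).fit(disp="off")
        d[i] = res.forecast(horizon=1).variance.values[-1, 0] / 100**2
    return A @ np.diag(d) @ A.T                  # forecast covariance (rank k)

# rng = np.random.default_rng(6)
# fx = 0.005 * rng.standard_normal((1000, 4))    # stand-in FX returns
# print(orthogonal_garch_cov(fx, k=2))
```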
Article
Recent accounting and financial research suggests that IFRS adoption leads to high-quality, faithful financial information. The purpose of this paper is to analyse the impact of IAS/IFRS adoption on key aspects of investment decision-making in emerging stock markets. The paper uses a state-space model combined with a standard GARCH specification and a multidimensional panel data model. The results of our empirical investigation show that IFRS adoption contributed to improving the development and performance of emerging markets. It leads to considerable development, reduced volatility, and prompt convergence towards informational efficiency.
Article
Full-text available
The paper deals with the Monte Carlo simulation method and its application in risk management. The author, with the help of MATLAB 7.0, introduces a new modification of the Monte Carlo algorithm aimed at fast and effective calculation of a financial organization's Value at Risk (VaR), using the example of Parex Bank's FOREX exposure. First published online: 14 Oct 2010.
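The author's MATLAB modification is not reproduced in the abstract; as a generic illustration of Monte Carlo VaR for an FX book, here is a minimal Python/numpy sketch under an assumed multivariate-normal model of daily currency returns (the weights and simulated inputs are stand-ins).

```python
import numpy as np

def monte_carlo_var(returns, weights, n_sims=100_000, alpha=0.01, seed=0):
    """Monte Carlo VaR under a multivariate normal model of FX returns.

    Draws correlated one-day return scenarios from the sample mean and
    covariance, then reads VaR off the simulated loss distribution.
    """
    rng = np.random.default_rng(seed)
    mu = returns.mean(axis=0)
    sigma = np.cov(returns, rowvar=False)
    sims = rng.multivariate_normal(mu, sigma, size=n_sims)
    losses = -(sims @ weights)                   # losses as positive numbers
    return np.quantile(losses, 1 - alpha)

rng0 = np.random.default_rng(4)
rets = 0.004 * rng0.standard_normal((500, 3))    # stand-in FX returns
w = np.array([0.4, 0.35, 0.25])
print(f"99% Monte Carlo VaR: {monte_carlo_var(rets, w):.4%}")
```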
Article
Full-text available
The aim of this study is to fill a gap in the modern risk management literature, especially from the perspective of stock markets in emerging economies. This paper provides pioneering strategic risk assessment techniques that can be applied to investment and trading portfolios in emerging financial markets, such as in the context of the Gulf Cooperation Council (GCC) stock markets. In this work, key equity price risk assessment methods and modeling techniques that financial entities, regulators and policymakers should consider in developing their daily price risk management objectives are examined and tailored to the particular needs of emerging markets. This is with the intent of setting up the basis of a modeling technique for the measurement of strategic equity price exposures in day-to-day financial trading operations. While an extensive literature has reviewed the statistical and economic proposition of Value at Risk (VaR) models, this paper provides real-world modeling techniques that can be applied to financial trading portfolios under the notion of illiquid and adverse financial market circumstances. In this paper, we demonstrate a practical modeling approach for the measurement of stock market risk exposure for financial trading portfolios that contain several illiquid equity asset positions. This approach is based on the renowned concept of liquidity-adjusted VaR (LVaR), along with the innovation of a simulation tool utilizing a matrix-algebra technique. To this end, a matrix-algebra modeling technique is used to create the necessary quantitative infrastructures and trading risk simulations. This tactic is a useful way to avoid mathematical complexity as more and more securities are added to the portfolio of assets. In addition, it can simplify the financial modeling and programming process and can also allow a clear-cut incorporation of short (sell) and long (buy) positions in the daily risk management process. As such, our broad risk model can simultaneously handle LVaR appraisal under normal and severe market conditions, and takes into account the effects of liquidity of the traded equity securities. In order to illustrate the proper use of LVaR and stress-testing methods, real-world paradigms of trading risk assessment are presented for the GCC stock markets. To this end, simulation case studies are attained with the aspiration of bringing about a reasonable framework for the measurement of equity trading risk exposures. The modeling techniques discussed in this paper will aid financial markets' participants, regulators and policymakers in the instigation of meticulous and up-to-date simulation algorithms to handle equity price risk exposure. The suggested analytical methods and procedures can be put into practice in virtually all emerging economies, provided they are tailored to correspond to every market's preliminary level of intricacy.
Article
Full-text available
The research highlights three Value-at-Risk (VaR) representations that are integrated with GARCH-based models to estimate Malaysian stock exchange market risk. The methodology covers the quantification of expected maximum losses at the 95% confidence level for six non-financial sectors, namely construction, consumer product, industrial product, plantation, property, and trade and services, from 1993 until 2006. Further analyses are conducted using the Kupiec, Christoffersen and Lopez backtests. The results, in particular those based on Lopez's quadratic loss function test, prove that when the basic VaR is integrated with a GARCH model under the assumption of a t-distribution, the model is found to be the most accurate. Thus, consideration of the market's non-normal behaviour is important in quantifying financial risk.
Article
Full-text available
Prediction of various market indicators is an important issue in finance. This can be accomplished through computer models and related applications to finance, in particular through Artificial Neural Networks (ANNs), which have been used in stock market and exchange rate prediction during the last decade. The prediction of financial values (such as a stock/exchange rate index, as well as the daily direction of change in the index) with neural networks has been investigated and, in some applications, it turned out that artificial neural networks have both great advantages and some limitations for learning data patterns and predicting future values of the financial phenomenon under analysis. In this paper we analyze the particular financial market called FOREX and the way ANNs can make affordable predictions on the evolution of exchange rates between currencies.
Article
Full-text available
Daily VaR numbers have been calculated using EWMA and GARCH models for the seven currencies. The outcome is that GARCH provides slightly more accurate analysis than EWMA. The results are satisfactory for forecasting volatility at the 95% and 99% confidence levels. These two methods enhance the quality of the VaR models. Interestingly, the VaR calculations predicted the April 1994 and February 2001 devaluations in Turkey. It is also observed that the Turkish lira's volatility was low during the crawling-peg period. However, the free-floating period after February 2001 caused volatility to increase. Therefore, volatility forecasts tend to remain high in the post-crisis period.
Article
Full-text available
This article builds on prior work done in [M. Nwogugu, Towards multifactor models of decision making and risk: a critique of prospect theory and related approaches, part one, Journal of Risk Finance 6 (2) (2005) 150–162; M. Nwogugu, Towards multifactor models of decision making and risk: a critique of prospect theory and related approaches, part two, Journal of Risk Finance 6 (2) (2005) 163–172], and proves that cumulative prospect theory/prospect theory and related models are inaccurate and were derived from improper methods and calculations. Furthermore, evidence from neurobiology shows that the natural mental processes of human beings result in decision-making patterns that differ from what is predicted and implicit in cumulative prospect theory and prospect theory. Decision making and risk assessment are multi-criteria processes that typically require some processing of information, and thus cannot be defined accurately by rigid quantitative models. CPT/PT do not incorporate the many psychological, legal, biological, knowledge, and situational price-dynamic factors inherent in decision making. Thus, there is a need for more realistic decision models.
Article
Full-text available
This article critiques models of market risk (ARMA, GARCH, ARCH, EVT, VAR, stochastic volatility, etc.). The existing metrics for quantifying risk, such as standard deviation and VAR/GARCH/EVT/ARMA/SV, are inaccurate and inadequate, particularly in emerging markets; they do not account for many facets of risk and decision making, and do not incorporate the many psychological, legal, liquidity, knowledge, and price-dynamic factors inherent in markets and asset prices. Areas for further research include: (a) development of dynamic market-risk models that incorporate asset-market psychology, liquidity, market size, frequency of trading, knowledge differences among market participants, information (capabilities and processing), and trading rules in each market; and (b) further development of concepts in belief systems.
Article
Purpose: This paper aims to test empirically the performance of different models in measuring VaR and ES in the presence of heavy tails in returns, using historical data.
Design/methodology/approach: Daily returns of popular indices (S&P500, DAX, CAC, Nikkei, TSE, and FTSE) and currencies (US dollar vs Euro, Yen, Pound, and Canadian dollar) for over ten years are modeled with empirical (or historical), Gaussian, Generalized Pareto (the peak over threshold (POT) technique of extreme value theory (EVT)) and Stable Paretian distributions (both symmetric and non‐symmetric). Experimentation on different factors that affect modeling, e.g. rolling window size and confidence level, has been conducted.
Findings: In estimating VaR, the results show that models that capture rare events can predict risk more accurately than non‐fat‐tailed models. For ES estimation, the historical model (as expected) and the POT method prove to give more accurate estimations. The Gaussian model underestimates ES, while the Stable Paretian framework overestimates ES.
Practical implications: Research findings are useful to investors and the way they perceive market risk, risk managers and the way they measure risk and calibrate their models, e.g. shortcomings of VaR, and regulators in central banks.
Originality/value: A comparative, thorough empirical study on a number of financial time series (currencies, indices) that aims to reveal the pros and cons of Gaussian versus fat‐tailed models and Stable Paretian versus EVT in estimating two popular risk measures (VaR and ES) in the presence of extreme events. The effects of model assumptions on different parameters have also been studied.
Article
Full-text available
The intent of this paper is to set out hands-on policies for trading risk units operating within financial entities. As such, this paper discusses the principal objectives of proprietary trading units and how they should originate adequate internal risk management models. Moreover, it provides a number of viable industry recommendations along with proper internal regulations and policies. The attempt is to provide several guidelines that can assist emerging markets in the establishment of reliable internal trading risk management models within a prudential framework of in-house regulations. The aim of this paper is to share with financial markets' participants, regulators and policy makers some of the author's real-world know-how as a derivatives trader and later as a trading risk manager in emerging economies. A number of key internal risk management rules that should be considered in strengthening trading risk management units are examined and adapted to the specific needs of emerging markets. This is with the objective of setting up a practical framework for trading risk measurement, management and control. The suggested internal regulations can be implemented in almost all emerging economies, if they are tailored to relate to each market's initial level of complexity and the fundamental needs of internal risk management models. The main contribution of this paper is the introduction of practical trading risk management internal regulations that can aid in the proper implementation of the Basel II committee requirements on banking supervision. Although a significant literature has examined the statistical and economic usefulness of Value at Risk and other internal trading risk management models, this paper provides rational and down-to-business guidelines that can be applied to the establishment of adequate trading risk management units in financial markets. The paper will be of significance to those involved in the design of successful and consistent trading risk units in emerging markets. Journal of Banking Regulation (2008) 10, 68–87. doi:10.1057/jbr.2008.18
Article
Full-text available
In this paper, we develop the theoretical and empirical properties of a new class of multivariate GARCH models capable of estimating large time-varying covariance matrices, Dynamic Conditional Correlation Multivariate GARCH. We show that the problem of multivariate conditional variance estimation can be simplified by estimating univariate GARCH models for each asset and then, using the transformed residuals from the first stage, estimating the dynamic conditional correlations. The standard errors for the first-stage parameters remain consistent; only the standard errors for the correlation parameters need be modified. We use the model to estimate the conditional covariance of up to 100 assets using S&P 500 Sector Indices and Dow Jones Industrial Average stocks, and conduct specification tests of the estimator using an industry-standard benchmark for volatility models. The new estimator demonstrates very strong performance, especially considering its ease of implementation.
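The mechanics of the second stage can be sketched in a few lines of numpy, assuming the first stage (one univariate GARCH per asset) has already produced standardized residuals; the fixed parameters a and b and the function name are illustrative, since in practice a and b are estimated by quasi-maximum likelihood.

```python
import numpy as np

def dcc_correlations(eps, a=0.05, b=0.93):
    """Correlation stage of DCC: eps is a T x n matrix of standardized residuals."""
    T, n = eps.shape
    Q_bar = np.cov(eps, rowvar=False)   # unconditional correlation target
    Q = Q_bar.copy()
    R = np.empty((T, n, n))
    for t in range(T):
        d = 1.0 / np.sqrt(np.diag(Q))
        R[t] = Q * np.outer(d, d)       # rescale Q_t into a correlation matrix
        e = eps[t][:, None]
        Q = (1 - a - b) * Q_bar + a * (e @ e.T) + b * Q
    return R
```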
Article
Full-text available
The increased prominence of trading activities at many large banking companies has highlighted bank exposure to market risk – the risk of loss from adverse movements in financial market rates and prices. In response, bank supervisors in the United States and abroad have developed a new set of capital requirements to ensure that banks have adequate capital resources to address market risk. This paper offers an overview of the new requirements, giving particular attention to their most innovative feature: a capital charge calculated for each bank using the output of that bank's internal risk measurement model. The authors contend that the use of internal models should lead to regulatory capital charges that conform more closely to banks' true risk exposures. In addition, the information generated by the models should allow supervisors and market participants to compare risk exposures over time and across institutions.
Article
Full-text available
This paper is intended to address this deficiency by clearly defining what is meant by a "good" interval forecast and describing how to test whether a given interval forecast deserves the label "good". One of the motivations of Engle's (1982) classic paper was to form dynamic interval forecasts around point predictions. The insight was that the intervals should be narrow in tranquil times and wide in volatile times, so that occurrences of observations outside the interval forecast are spread out over the sample and do not come in clusters. An interval forecast that fails to account for higher-order dynamics may be correct on average (have correct unconditional coverage), but in any given period it will have incorrect conditional coverage, characterized by clustered outliers. These concepts are defined precisely below, and tests for correct conditional coverage are suggested. Chatfield (1993) emphasizes that model misspecification is a much more important source of poor interval forecasting than simple estimation error. Thus, our testing criterion and the tests of this criterion are model free. In this regard, the approach taken here is similar to that of Diebold and Mariano (1995). This paper can also be seen as establishing a formal framework for the ideas suggested in Granger, White and Kamstra (1989). Recently, financial market participants have shown increasing interest in interval forecasts as measures of uncertainty. Thus, we apply our methods to the interval forecasts provided by J.P. Morgan (1995). Furthermore, the so-called "Value-at-Risk" measures suggested for risk measurement correspond to tail forecasts, i.e. one-sided interval forecasts of portfolio returns. Lopez (1996) evaluates these types of forecasts applying the procedures develo...
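The backtests this framework implies can be sketched as follows: a Kupiec-type likelihood-ratio statistic for unconditional coverage plus an independence statistic on the sequence of VaR violations, whose sum tests conditional coverage. The names are illustrative, and the sketch assumes every violation and transition count is strictly positive (production code needs guards for the degenerate cases).

```python
import numpy as np
from scipy.stats import chi2

def christoffersen_tests(hits, p=0.05):
    """hits: 0/1 array, 1 where the realized loss exceeded the VaR forecast."""
    hits = np.asarray(hits, dtype=int)
    T, x = len(hits), hits.sum()
    pi = x / T
    # Unconditional coverage (Kupiec): observed hit rate vs nominal p
    lr_uc = -2 * ((T - x) * np.log((1 - p) / (1 - pi)) + x * np.log(p / pi))
    # Independence: transition counts between consecutive hit indicators
    n00 = np.sum((hits[:-1] == 0) & (hits[1:] == 0))
    n01 = np.sum((hits[:-1] == 0) & (hits[1:] == 1))
    n10 = np.sum((hits[:-1] == 1) & (hits[1:] == 0))
    n11 = np.sum((hits[:-1] == 1) & (hits[1:] == 1))
    pi01, pi11 = n01 / (n00 + n01), n11 / (n10 + n11)
    pi1 = (n01 + n11) / (T - 1)
    ll0 = (n00 + n10) * np.log(1 - pi1) + (n01 + n11) * np.log(pi1)
    lla = (n00 * np.log(1 - pi01) + n01 * np.log(pi01)
           + n10 * np.log(1 - pi11) + n11 * np.log(pi11))
    lr_cc = lr_uc - 2 * (ll0 - lla)     # conditional coverage ~ chi2(2)
    return lr_cc, 1 - chi2.cdf(lr_cc, df=2)
```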
Article
To measure the risks involved in their trading operations, major banks are increasingly employing value at risk (VaR) models. In an important regulatory innovation, the Basle Committee has proposed that such models be used in the determination of the capital that banks must hold to back their securities trading. This article examines the empirical performance of different VaR models using data on the actual fixed-income, foreign exchange, and equity security holdings of a large bank. We examine how a bank applying the models would have fared in the past had the proposed rules been in operation.
Article
This article examines the covariance matrices that are often used for internal value at risk models. We first show how the large covariance matrices necessary for global risk management systems can be generated using orthogonalization procedures in conjunction with univariate volatility forecasting methods. We then examine the performance of three common volatility forecasting methods: the equally weighted average of squared returns; the exponentially weighted average; and generalized autoregressive conditional heteroscedasticity (GARCH). Standard statistical evaluation criteria using equity and foreign exchange data with 1996 as the test period give mixed results, although they generally favor the exponentially weighted moving average methodology for all but very short-term holding periods. But these criteria assess the ability to model the center of returns distributions, while value at risk models require accuracy in the tails. Operational evaluation takes the form of backtesting volatility forecasts following the Bank for International Settlements (BIS) guidelines. For almost all major equity markets and U.S. dollar exchange rates, both the equally weighted average and the GARCH models produce results falling within the acceptable "green zone." But on most of the test data, and particularly for foreign exchange, using predictions from exponentially weighted moving average models leads to an unacceptably high number of outliers. Thus value at risk measures calculated using this method would be understated.
Article
The exponentially weighted moving average (EWMA) estimator is widely used to forecast the conditional volatility of short horizon asset returns. The EWMA estimator is appropriate when returns are drawn from a normal conditional distribution, but when the conditional distribution of returns is fat-tailed - as is commonly found in practice - the EWMA estimator will be inefficient in the sense that it will attach too much weight to extreme returns. In this paper, we propose a simple generalisation of the EWMA estimator that nests both the standard EWMA estimator and other EWMA estimators that are robust to leptokurtosis in the conditional distribution of returns. We illustrate the new estimator by forecasting the value at risk of aggregate equity portfolios for the US, the UK and Japan using historical simulation. Backtesting results show that a robust EWMA estimator based on the absolute value of returns rather than their squares offers an improvement over the standard EWMA estimator.
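The contrast between the two estimators is easy to see in code; the decay factor of 0.94 (the familiar RiskMetrics daily value) and the 30-observation seed window are assumptions for illustration, and the sqrt(2) factor converts the Laplace scale (an exponentially weighted mean absolute deviation) into a standard deviation.

```python
import numpy as np

def ewma_vol(returns, lam=0.94):
    """Standard EWMA: recursion on squared returns (normal MLE of variance)."""
    var = np.var(returns[:30])                 # seed with an initial window
    for r in returns:
        var = lam * var + (1 - lam) * r**2
    return np.sqrt(var)

def robust_ewma_vol(returns, lam=0.94):
    """Robust EWMA: recursion on absolute returns (Laplace MLE of scale)."""
    mad = np.mean(np.abs(returns[:30]))
    for r in returns:
        mad = lam * mad + (1 - lam) * abs(r)
    return np.sqrt(2) * mad                    # Laplace scale -> std deviation
```

Because a single extreme return enters the recursion linearly rather than squared, the robust version reacts far less violently to outliers, which is the source of the backtesting improvement reported above.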
Article
The implementation of multivariate GARCH models in more than a few dimensions is extremely difficult: because the model has many parameters, the likelihood function becomes very flat, and consequently the optimization of the likelihood becomes practically impossible. There is simply no way that full multivariate GARCH models can be used to estimate directly the very large covariance matrices that are required to net all the risks in a large trading book. This paper begins by describing the principal component GARCH or 'orthogonal GARCH' (O-GARCH) model for generating large GARCH covariance matrices that was first introduced in Alexander and Chibumba (1996) and subsequently developed in Alexander (2000, 2001b). The O-GARCH model is an accurate and efficient method for generating large covariance matrices that only requires the estimation of univariate GARCH models. Hence, it has many practical advantages, for example in value-at-risk models. It works best in highly correlated systems, such as term structures. The purpose of this paper is to show that, if sufficient care is taken with the initial calibration of the model, equities and foreign exchange rates can also be included in one large covariance matrix. Simple conditions for the final covariance matrix to be positive semi-definite are derived. (J.E.L.: C32, C53, G19, G21, G28).
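A compact sketch of the O-GARCH recipe, under stated assumptions: PCA on standardized returns, a univariate GARCH(1,1) fitted to each principal component (here via the third-party arch package), and reconstruction of the full covariance matrix from the component variance forecasts. The scaling by 100 (percent returns) is a numerical convenience; this illustrates the idea rather than reproducing the paper's calibration.

```python
import numpy as np
from arch import arch_model  # pip install arch

def ogarch_covariance(X):
    """One-step-ahead covariance forecast for a T x n matrix of returns X."""
    mu, sd = X.mean(axis=0), X.std(axis=0)
    Y = (X - mu) / sd                           # standardize each series
    _, W = np.linalg.eigh(np.corrcoef(Y, rowvar=False))
    P = Y @ W                                   # uncorrelated principal components
    d = np.empty(P.shape[1])
    for i in range(P.shape[1]):                 # one univariate GARCH per component
        res = arch_model(100 * P[:, i], p=1, q=1).fit(disp="off")
        d[i] = res.forecast(horizon=1).variance.values[-1, 0] / 100**2
    V_std = W @ np.diag(d) @ W.T                # covariance of standardized returns
    return np.diag(sd) @ V_std @ np.diag(sd)    # back to original units
```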
Article
This paper presents theoretical results in the formulation and estimation of multivariate generalized ARCH models within simultaneous equations systems. A new parameterization of the multivariate ARCH process is proposed and equivalence relations are discussed for the various ARCH parameterizations. Constraints sufficient to guarantee the positive definiteness of the conditional covariance matrices are developed, and necessary and sufficient conditions for covariance stationarity are presented. Identification and maximum likelihood estimation of the parameters in the simultaneous equations context are also covered. * This paper began as a synthesis of at least three UCSD Ph.D. dissertations on various aspects of multivariate ARCH modelling, by Yoshi Baba, Dennis Kraft and Ken Kroner. In fact, an early version of this paper was written by Baba, Engle, Kraft and Kroner, which led to the acronym (BEKK) used in this paper for the new parameterization presented. In the interests of continui...
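For reference, one standard way of writing the BEKK(1,1) recursion for the n x n conditional covariance matrix H_t, with parameter matrices A, B and a triangular C, is

$$
H_t = C'C + A'\,\varepsilon_{t-1}\varepsilon_{t-1}'\,A + B'\,H_{t-1}\,B,
$$

where ε_{t-1} is the vector of lagged residuals; the quadratic forms make H_t positive semi-definite by construction, and positive definite whenever C has full rank.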
Article
The proposed market-risk capital-adequacy framework, to be implemented at the end of 1997, requires Australian banks to hold capital against market risk. A fundamental component of this framework is the opportunity for banks to use their value-at-risk (VaR) models as the basis of the market-risk capital charge. Value-at-risk measures the potential loss on a portfolio for a specified level of confidence if adverse movements in market prices were to occur. This paper examines the VaR measure and some of the techniques available for assessing the performance of a VaR model. The first section of the paper uses a simple portfolio of two spot foreign exchange positions to illustrate three of the approaches used in the calculation of a VaR measure: variance-covariance, historical simulation and Monte-Carlo simulation. It is concluded that, although VaR is a very useful tool, it is not without its shortcomings and so should be supplemented with other risk-management techniques. The second section of the paper focuses on the use of backtesting – the comparison of model-generated VaR numbers with actual profits and losses – for assessing the accuracy of a VaR model. Several statistical tests are demonstrated by testing daily VaR and profit and loss data obtained from an Australian bank. The paper concludes that, although the tests are not sufficiently precise to form the basis of regulatory treatment of banks' VaR results, the tests do provide useful diagnostic information for evaluating model performance.
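The three approaches can be compared on a toy two-position spot FX portfolio like the one used in the paper; the position sizes, return covariance and sample sizes below are invented purely for illustration.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
w = np.array([1_000_000, 500_000])       # two spot FX positions, in base currency
cov = np.array([[1.0e-4, 4.0e-5],        # made-up daily return covariance
                [4.0e-5, 6.4e-5]])
R = rng.multivariate_normal([0, 0], cov, size=1000)   # stand-in return history

# 1. Variance-covariance: normal returns, analytic portfolio volatility
sigma_p = np.sqrt(w @ np.cov(R, rowvar=False) @ w)
var_vc = -norm.ppf(0.01) * sigma_p

# 2. Historical simulation: empirical 1% quantile of past portfolio P&L
var_hs = -np.quantile(R @ w, 0.01)

# 3. Monte-Carlo: resimulate from the fitted model, then take the quantile
sims = rng.multivariate_normal(R.mean(axis=0), np.cov(R, rowvar=False), size=50_000)
var_mc = -np.quantile(sims @ w, 0.01)

print(f"99% 1-day VaR  var-cov: {var_vc:,.0f}  hist-sim: {var_hs:,.0f}  MC: {var_mc:,.0f}")
```

With Gaussian inputs the three numbers agree closely; the methods diverge precisely when returns are fat-tailed or the portfolio is non-linear, which is why backtesting matters.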
Article
A complete theory for evaluating interval forecasts has not been worked out to date. Most of the literature implicitly assumes homoskedastic errors even when this is clearly violated and proceeds by merely testing for correct unconditional coverage. Consequently, the author sets out to build a consistent framework for conditional interval forecast evaluation, which is crucial when higher-order moment dynamics are present. The new methodology is demonstrated in an application to the exchange rate forecasting procedures advocated in risk management. Copyright 1998 by Economics Department of the University of Pennsylvania and the Osaka University Institute of Social and Economic Research Association.
Article
We study the effect of restrictions on dual trading in futures contracts. Previous studies have found that dual trading restrictions can have a positive, negative, or neutral effect on market liquidity. In this paper, we propose that trader heterogeneity may explain these conflicting empirical results. We find that, for contracts affected by restrictions, the change in market activity following restrictions differs between contracts. More important, the effect of a restriction varies among dual traders in the same market. For example, dual traders who ceased trading the S&P 500 index futures following restrictions had the highest personal trading skills prior to restrictions. However, realized bid-ask spreads for customers did not increase following restrictions. Our results imply that securities regulation may adversely affect customers, but in ways not captured by broad-based liquidity measures, such as the bid-ask spread.
Article
Time-varying correlations are often estimated with multivariate GARCH models that are linear in squares and cross-products of returns. A new class of multivariate models called dynamic conditional correlation (DCC) models is proposed. These have the flexibility of univariate GARCH models coupled with parsimonious parametric models for the correlations. They are not linear but can often be estimated very simply with univariate or two-step methods based on the likelihood function. It is shown that they perform well in a variety of situations and give sensible empirical results.
Article
The aim of this article is twofold: first, to test the adequacy of Pareto distributions to describe the tail of financial returns in emerging and developed markets, and second, to study the possible correlation between stock market indices' observed returns and the returns' extreme distributional characteristics, measured by Value at Risk and Expected Shortfall. We test the empirical model using daily data from 41 countries for the period from 1995 to 2005. The findings support the adequacy of Pareto distributions and the use of a log-linear regression estimation of their parameters as an alternative to the usually employed Hill's estimator. We also report a significant relationship between extreme distributional characteristics and observed returns, especially for developed countries.
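For context, here is a sketch of the classical Hill estimator to which the article's log-linear regression is offered as an alternative; the choice of k upper order statistics and the simulated Student-t sample are assumptions.

```python
import numpy as np

def hill_tail_index(returns, k=100):
    """Hill estimator of the Pareto tail index for the loss (left) tail."""
    losses = np.sort(-np.asarray(returns))[::-1]    # losses, largest first
    top = losses[:k + 1]
    return k / np.sum(np.log(top[:k] / top[k]))     # alpha-hat

rng = np.random.default_rng(2)
sample = rng.standard_t(df=3, size=5000)            # true tail index ~= 3
print(f"Hill estimate: {hill_tail_index(sample):.2f}")
```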
Article
Value-at-Risk measures the potential loss on a portfolio, where the potential loss is linked directly to the probability of large, adverse movements in market prices. This paper considers four classes of Value-at-Risk model: variance-covariance models; historical-simulation models; Monte-Carlo simulation models; and extreme-value estimation models. Using portfolio data from all Australian banks over the past ten years, we compare the performance of specific implementations of each of the four Value-at-Risk model classes. Performance assessment is based on a range of measures that address the conservatism, accuracy and efficiency of each model.
Article
Under the market-risk capital requirements introduced at the beginning of this year, Australian banks may choose between two alternatives when measuring their market-risk exposure: a standard regulatory model and their own internally-developed risk-measurement model. The extent to which different banks' models provide differing estimates of risk for given financial instruments impacts directly on the fairness of the capital adequacy regime. To assess the dispersion of banks' market-risk measurements, we conducted a survey asking banks to provide their estimate of the market risk residing in a number of pre-specified portfolios. We found the spread in risk estimates to be broad. However, most of the variation across banks is attributable to a small number of banks that use crude, but conservative, models. No bank was found to systematically underestimate risk. The survey results also suggest that there is no undue disparity between the banks' internal-model based capital charges and the capital charge that would be required if those banks were to use the standard regulatory model. No significant correlation was found between banks' risk estimates and their choice of market-risk modelling approach.
Article
A natural generalization of the ARCH (Autoregressive Conditional Heteroskedastic) process introduced in Engle (1982) to allow for past conditional variances in the current conditional variance equation is proposed. Stationarity conditions and autocorrelation structure for this new class of parametric models are derived. Maximum likelihood estimation and testing are also considered. Finally an empirical example relating to the uncertainty of the inflation rate is presented.
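In the now-standard GARCH(1,1) special case of this model, the conditional variance follows

$$
\sigma_t^2 = \omega + \alpha\,\varepsilon_{t-1}^2 + \beta\,\sigma_{t-1}^2,
\qquad \omega > 0,\ \alpha, \beta \ge 0,
$$

and covariance stationarity requires α + β < 1, so the β term is exactly the 'past conditional variance' extension of Engle's ARCH described above.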
Article
A common approach to estimating the conditional volatility of short horizon asset returns is to use an exponentially weighted moving average (EWMA) of squared past returns. The EWMA estimator is based on the maximum likelihood estimator of the variance of the normal distribution, and is thus optimal when returns are conditionally normal. However, there is ample evidence that the conditional distribution of short horizon financial asset returns is leptokurtic, and so the EWMA estimator will generally be inefficient in the sense that it will attach too much weight to extreme returns. In this paper, we propose an alternative EWMA estimator that is robust to leptokurtosis in the conditional distribution of portfolio returns. The estimator is based on the maximum likelihood estimator of the standard deviation of the Laplace distribution, and is a function of an exponentially weighted moving average of the absolute value of past returns, rather than their squares. We employ the robust EWMA e...
Alexander, C. (2001), 'Orthogonal GARCH', Mastering Risk Volume 2 (FT Prentice Hall), pp. 21–38.
Alexander, C. (2002), 'Principal Component Models for Generating Large GARCH Covariance Matrices', Economic Notes, Vol. 31, No. 2, pp. 337–59.
Alexander, C. and A. Chibumba (1998), 'Orthogonal GARCH: An Empirical Validation on Equities, Foreign-Exchange and Interest Rates', Working Paper (University of Sussex).
Cassidy, C. and M. Gizycki (1997), 'Measuring Market Traded Risk: Value-at-Risk and Backtesting Techniques', Reserve Bank of Australia Discussion Paper No. 9708.
Christoffersen, P. (1998), 'Evaluating Interval Forecasts', International Economic Review, Vol. 39, No. 4, pp. 841–62.
Engel, J. and M. Gizycki (1999), 'Conservatism, Accuracy and Efficiency: Comparing Value-at-Risk Models', Australian Prudential Regulatory Authority Discussion Paper No. 2.
Engle, R. and K. Kroner (1995), 'Multivariate Simultaneous GARCH', Econometric Theory, Vol. 11, pp. 122–50.
Engle, R. and K. Sheppard (2001), 'Theoretical and Empirical Properties of Dynamic Conditional Correlation Multivariate GARCH', Working Paper.
Gizycki, M. and N. Hereford (1998), 'Assessing the Dispersion in Banks' Estimates of Market Risk: The Results of a Value-at-Risk Survey', Australian Prudential Regulatory Authority Discussion Paper No. 1.
Jackson, P., D. Maude and W. Perraudin (1995), 'Capital Requirements and Value-at-Risk Analysis', Institute for Financial Research, Birkbeck College Working Paper IFR1.
Jackson, P., D. Maude and W. Perraudin (1997), 'Bank Capital and Value-at-Risk', Journal of Derivatives, Vol. 4, No. 3, pp. 73–89.
Jackson, P., W. Perraudin and D. Maude (1998), 'Testing Value-at-Risk Approaches to Capital Adequacy', Bank of England Quarterly Bulletin, Vol. 38, No. 3, pp. 256–66.
Lopez, J. (1999), 'Methods for Evaluating Value-at-Risk Estimates', Federal Reserve Bank of San Francisco Economic Review, No. 02, pp. 3–17.
Sarma, M., S. Thomas and A. Shah (2000), 'Performance Evaluation of Alternative VaR Models' (mimeo, Indira Gandhi Institute of Development Research).
Wilmott, P. (1998), Derivatives: The Theory and Practice of Financial Engineering (Wiley, Chichester, UK).