About
Publications: 54
Reads: 26,024
Citations: 410 (since 2017)
Introduction
Option pricing, portfolio allocation, operational risk, VaR estimation, variable selection, factorisation machines
Additional affiliations
December 1997 - present
Publications (54)
The IFRS 9 accounting standard requires the prediction of credit deterioration in financial instruments, i.e., significant increases in credit risk (SICR). However, the definition of such a SICR-event is inherently ambiguous, given its reliance on comparing two subsequent estimates of default risk against some arbitrary threshold. We examine the sh...
Logistic regression is a very popular binary classification technique in many industries, particularly in the financial service industry. It has been used to build credit scorecards, estimate the probability of default or churn, identify the next best product in marketing, and many more applications. The machine learning literature has recently int...
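As an illustration of the kind of probability-of-default (PD) model described above, here is a minimal sketch, not the bank's actual scorecard pipeline: a logistic regression fitted by plain gradient descent on synthetic data, with all variable names and parameter values being illustrative assumptions.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, n_iter=5000):
    """Fit logistic regression weights by batch gradient descent on the log-loss."""
    X1 = np.hstack([np.ones((X.shape[0], 1)), X])  # prepend an intercept column
    w = np.zeros(X1.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X1 @ w))   # predicted probability of default
        w -= lr * X1.T @ (p - y) / len(y)   # average gradient of the log-loss
    return w

def predict_pd(w, X):
    """Return fitted PDs for a feature matrix X."""
    X1 = np.hstack([np.ones((X.shape[0], 1)), X])
    return 1.0 / (1.0 + np.exp(-X1 @ w))

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))            # two synthetic risk drivers
true_w = np.array([-1.0, 2.0, -1.5])     # hypothetical intercept and slopes
p_true = 1 / (1 + np.exp(-(true_w[0] + X @ true_w[1:])))
y = rng.binomial(1, p_true)              # simulated default indicator
w = fit_logistic(X, y)
pd_hat = predict_pd(w, X)
```

In a real scorecard the inputs would be binned and weight-of-evidence transformed; the sketch keeps raw continuous features for brevity.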
Some insights about the applicability of logistic factorisation machines in banking
Trading in binary options is discussed using an approach based on expected profit (EP) and expected loss (EL) as metrics of the reward and risk of trades. These metrics are reviewed, and the role of the EL/EP ratio as an indicator of trade quality, taking risk tolerance into account, is discussed. Formulas are derived for the EP and EL of call and p...
Trading in binary options is discussed using an approach based on expected profit (EP) and expected loss (EL) as metrics of reward and risk of trades. Formulas are derived for the EP and EL of call and put binaries assuming that the price of the underlying instrument follows a geometric Brownian motion. The results are illustrated with practical da...
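To make the EP/EL idea concrete, here is one simple formalisation (not necessarily the papers' exact definitions): for a cash-or-nothing binary call paying 1 if it finishes in the money, bought at premium c, EP is the expected positive part of the profit and EL the expected negative part, with the underlying following a real-world geometric Brownian motion. All parameter values (mu, sigma, T, c) are hypothetical.

```python
from math import log, sqrt, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def binary_call_ep_el(S0, K, mu, sigma, T, c):
    """EP, EL and the EL/EP ratio for a binary call paying 1 if S_T > K,
    bought at premium c, with S_T lognormal under real-world GBM drift mu."""
    d = (log(S0 / K) + (mu - 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    p = norm_cdf(d)               # P(S_T > K) under the GBM assumption
    ep = p * (1.0 - c)            # gain (1 - c) in the winning scenarios
    el = (1.0 - p) * c            # loss c when the option expires worthless
    return ep, el, el / ep        # EL/EP ratio as a trade-quality indicator

ep, el, ratio = binary_call_ep_el(S0=100, K=100, mu=0.08, sigma=0.2, T=0.25, c=0.5)
```

A lower EL/EP ratio indicates a more attractive trade for a given risk tolerance, which is the role the abstracts assign to it.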
Machine learning and statistical models are increasingly used in a prediction context, and in building these models the question of which variables to include often arises. Over the last 50 years, a number of procedures have been proposed, especially in the statistical literature. In this paper a new variable selection procedure is intr...
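For context, a classical baseline from that literature (not the paper's new procedure) is greedy forward selection: repeatedly add the variable that most reduces the residual sum of squares. A minimal sketch on synthetic data, with all names and values illustrative:

```python
import numpy as np

def forward_select(X, y, n_keep):
    """Greedy forward selection for a linear model, ranked by RSS improvement."""
    selected = []
    remaining = list(range(X.shape[1]))
    for _ in range(n_keep):
        best_j, best_rss = None, np.inf
        for j in remaining:
            cols = selected + [j]
            A = np.hstack([np.ones((len(y), 1)), X[:, cols]])  # intercept + chosen vars
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            rss = np.sum((y - A @ beta) ** 2)
            if rss < best_rss:
                best_j, best_rss = j, rss
        selected.append(best_j)
        remaining.remove(best_j)
    return selected

rng = np.random.default_rng(5)
X = rng.normal(size=(200, 6))
y = 3.0 * X[:, 1] - 2.0 * X[:, 4] + rng.normal(scale=0.5, size=200)  # only vars 1 and 4 matter
chosen = forward_select(X, y, n_keep=2)
```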
Since their introduction in 2010, factorisation machines have become a popular prediction technique amongst machine learners, who have applied the method with success in a number of data science challenges such as Kaggle or KDD Cup. Despite these successes, factorisation machines are not often considered as a modelling technique in business, partly becaus...
In the context of variable selection for the standard linear model, a subset of variables is defined to be "good at margin" if it has two properties: its associated error sum of squares will improve in relative terms by less than [a stated threshold] if any variable is added to it, and will deteriorate by at least [that threshold] if any variable inside it is removed from it...
Since their introduction in 2010, factorisation machines have become a popular prediction technique amongst machine learners, who have applied the method with success in several data science challenges such as Kaggle or KDD Cup. Despite these successes, factorisation machines are not often considered as a modelling technique in business, partly because la...
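A minimal sketch of the second-order factorisation-machine model equation (Rendle, 2010) may clarify the technique: the score is an intercept plus linear terms plus pairwise interactions weighted by inner products of latent factor vectors, and the pairwise term can be evaluated in O(k·n) rather than O(k·n²). All numbers below are illustrative.

```python
import numpy as np

def fm_predict(x, w0, w, V):
    """Second-order FM score for one feature vector x.
    V has shape (n_features, k): row i is the latent factor vector v_i."""
    linear = w0 + w @ x
    # Pairwise term sum_{i<j} <v_i, v_j> x_i x_j via the O(k*n) identity
    # 0.5 * sum_f [ (sum_i V_if x_i)^2 - sum_i V_if^2 x_i^2 ].
    s = V.T @ x                      # shape (k,)
    s2 = (V ** 2).T @ (x ** 2)       # shape (k,)
    return linear + 0.5 * np.sum(s ** 2 - s2)

def fm_predict_naive(x, w0, w, V):
    """Reference O(n^2 * k) evaluation, used only to check the identity."""
    n = len(x)
    pair = sum(V[i] @ V[j] * x[i] * x[j]
               for i in range(n) for j in range(i + 1, n))
    return w0 + w @ x + pair

rng = np.random.default_rng(1)
x = rng.normal(size=5)
w0, w, V = 0.3, rng.normal(size=5), rng.normal(size=(5, 2))
```

The O(k·n) rewriting is what makes FMs practical on high-dimensional sparse data such as the competition datasets mentioned above.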
The pricing of options is discussed, using an approach based on expected profit (EP) and expected loss (EL) as measures of the reward and risk of trades, respectively. It is shown that the EL/EP ratio is an important indicator of the quality of trades. Formulas are derived for these measures for European call and put options under the traditional g...
Financial institutions are concerned about various forms of risk that might impact them. The management of these institutions has to demonstrate to shareholders and regulators that they manage these risks in a pro-active way. Often the main risks are caused by excessive claims on insurance policies or losses that occur due to defaults on loan payme...
Background: This article considers whether South African banks should utilise the credit ratings provided by US-based credit rating agencies when assessing the creditworthiness of corporate borrowers.
Aim: A review is conducted of the relevant literature and specifically the methodologies used by the credit rating agencies for ranking corporates i...
The Basel II accord (2006) includes guidelines for financial institutions on the estimation of regulatory capital (RC) for retail credit risk. Under the advanced Internal Ratings Based (IRB) approach, the formula suggested for calculating RC is based on the Asymptotic Single Risk Factor (ASRF) model, which assumes that a borrower will default if the value...
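The core of the ASRF-based IRB formula can be sketched as follows: the PD is stressed at the 99.9% quantile of the single systematic factor, and capital covers the unexpected part of the loss. The sketch omits the maturity adjustment and the regulatory correlation formula; the rho, PD and LGD values are illustrative, not calibrated.

```python
from math import sqrt
from scipy.stats import norm

def conditional_pd(pd, rho, q=0.999):
    """Vasicek/ASRF worst-case PD given asset correlation rho and
    a systematic factor stressed at confidence level q."""
    return norm.cdf((norm.ppf(pd) + sqrt(rho) * norm.ppf(q)) / sqrt(1 - rho))

def capital_requirement(pd, lgd, rho):
    """Unexpected-loss capital: stressed expected loss minus ordinary
    expected loss (maturity adjustment omitted for brevity)."""
    return lgd * (conditional_pd(pd, rho) - pd)

k = capital_requirement(pd=0.02, lgd=0.45, rho=0.15)
```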
The Basel regulatory credit risk rules for expected losses require that banks use downturn loss given default (LGD) estimates because the correlation between the probability of default (PD) and LGD is not captured, even though this correlation has been repeatedly demonstrated by empirical research. A model is examined which captures this correlation using empirical...
Background: With the increasing use of complex quantitative models in applications throughout the financial world, model risk has become a major concern. The credit crisis of 2008–2009 provoked added concern about the use of models in finance. Measuring and managing model risk has subsequently come under scrutiny from regulators, supervisors, banks...
Determining banks’ expected losses (EL) is straightforward because they are calculated using a linear combination of credit risk-related measures. Non-linear metrics, like economic capital (EC), pose considerable implementation challenges including computation complexity and a lack of adequate risk aggregation and attribution techniques when multip...
Many banks currently use the loss distribution approach (LDA) for estimating economic and regulatory capital for operational risk under Basel's advanced measurement approach. The LDA requires, among other things, the modeling of the aggregate loss distribution in each operational risk category (ORC). The aggregate loss distribution is a compound distribut...
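A hedged sketch of the LDA for a single operational risk category: Poisson frequency, lognormal severity, and the 99.9% quantile of the simulated compound (aggregate) loss as the capital estimate. The parameter values are illustrative, not calibrated to any bank's data.

```python
import numpy as np

def simulate_aggregate_losses(lam, mu, sigma, n_sims, rng):
    """Compound Poisson-lognormal annual losses, one value per simulated year."""
    counts = rng.poisson(lam, size=n_sims)      # number of loss events per year
    total = np.zeros(n_sims)
    for i, n in enumerate(counts):
        if n > 0:
            total[i] = rng.lognormal(mu, sigma, size=n).sum()
    return total

rng = np.random.default_rng(42)
losses = simulate_aggregate_losses(lam=20, mu=10.0, sigma=2.0,
                                   n_sims=20_000, rng=rng)
var_999 = np.quantile(losses, 0.999)            # 99.9% VaR of the aggregate loss
```

In practice far more simulations (and variance-reduction or semi-analytic methods) are needed, since the 99.9% quantile is driven by very few tail observations, which is exactly the accuracy problem the abstracts below discuss.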
Standard Bank, South Africa, currently employs a methodology when developing application or behavioural scorecards that involves logistic regression. A key aspect of building logistic regression models entails variable selection which involves dealing with multicollinearity. The objective of this study was to investigate the impact of using differe...
The Basel II regulatory framework significantly increased the resilience of the banking system, but proved ineffective in preventing the 2008/9 financial crisis. The subsequent introduction of Basel III aimed, inter alia, to supplement bank capital using buffers. The countercyclical buffer boosts existing minimum capital requirements when systemic...
Many banks use the loss distribution approach in their advanced measurement models to estimate regulatory or economic capital. This boils down to estimating the 99.9% VaR of the aggregate loss distribution and is notoriously difficult to do accurately. Also, it is well-known that the accuracy with which the tail of the loss severity distribution is...
Many banks use the loss distribution approach in their advanced measurement models to estimate regulatory or economic capital. This boils down to estimating the 99.9% value-at-risk of the aggregate loss distribution and is notoriously difficult to do accurately. Also, it is well-known that the accuracy with which the tail of the loss severity distr...
GARCH models are useful for estimating the volatility of financial return series. Historically, the innovation distribution of a GARCH model was assumed to be standard normal, but recent research emphasizes the need for more general distributions allowing both asymmetry (skewness) and kurtosis in the innovation distribution to obtain better-fitting mode...
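The GARCH(1,1) variance recursion at the heart of such models is sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}. A minimal sketch applying it to a simulated series; parameter values are illustrative, and normal innovations are used only for brevity (the abstract's point is precisely that richer innovation distributions often fit better).

```python
import numpy as np

def garch_filter(returns, omega, alpha, beta):
    """Conditional variance path implied by fixed GARCH(1,1) parameters."""
    sigma2 = np.empty_like(returns)
    sigma2[0] = omega / (1.0 - alpha - beta)   # start at the unconditional variance
    for t in range(1, len(returns)):
        sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2

# Simulate a GARCH(1,1) path with normal innovations (illustrative parameters).
rng = np.random.default_rng(7)
omega, alpha, beta = 0.05, 0.08, 0.9
r = np.empty(1000)
s2 = omega / (1 - alpha - beta)
for t in range(1000):
    r[t] = np.sqrt(s2) * rng.standard_normal()
    s2 = omega + alpha * r[t] ** 2 + beta * s2

sig2_hat = garch_filter(r, omega, alpha, beta)
```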
Universities are academic institutions with the primary objectives of teaching students a particular academic discipline and for conducting research related to that discipline. Traditionally, very little collaboration existed between universities and industry with respect to training and research in the mathematical sciences. Because of funding pre...
Extended stochastic volatility models are studied which use the daily returns as well as the volatility information in intraday price data summarised in terms of a number of realised measures. These extended models treat the logarithm of daily volatility as a latent process with autoregressive structure, relate to daily returns via their variance m...
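A stripped-down simulation of the latent process described above, without the realised-measure equations: log-variance follows a stationary AR(1), and daily returns are scaled by the implied volatility. All parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(11)
T, mu_h, phi, sigma_eta = 1000, -1.0, 0.97, 0.2

h = np.empty(T)                 # latent log-variance with AR(1) dynamics
h[0] = mu_h
for t in range(1, T):
    h[t] = mu_h + phi * (h[t - 1] - mu_h) + sigma_eta * rng.standard_normal()

r = np.exp(h / 2) * rng.standard_normal(T)   # returns with variance exp(h_t)
```

In the extended models, realised measures built from intraday prices supply additional noisy observations of h_t, which is what sharpens the volatility inference.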
The Basel II regulatory framework significantly increased the resilience of the banking system, but proved ineffective in preventing the 2008/9 financial crisis. The subsequent introduction of Basel III aimed, inter alia, to supplement bank capital using buffers. The countercyclical buffer boosts existing minimum capital requirements when systemi...
Standard Bank, South Africa, currently employs a particular methodology when developing application or behavioural scorecards. One of the processes in this methodology involves model building using logistic regression. A key aspect of building logistic regression models entails variable selection which involves dealing with multicollinearity. The o...
The role of operational risk in the 2007/2008 financial crisis is explored. The factors that gave rise to the crisis are examined and it is found that although the event is largely regarded as a credit crisis, operational risk factors played a significant role in fuelling its duration and severity. It is concluded that, from an operational risk per...
P. M. Lildholdt [Estimation of GARCH models based on open, close, high and low prices. Working paper 128, Centre for Analytical Finance, Aarhus School of Business (2002)] introduced a method to fit GARCH models to financial return series using open, close, high and low prices. This method assumes that the price process on each day follows a lognormal...
Since their introduction in 1978, regression quantiles have played an increasingly important role in regression analysis. In particular, they form the basis for quantile regression, a statistical technique intended to draw inferences about conditional quantile functions. Although it is well known that regression quantiles have limiting normal distr...
We studied the finite-sample behaviour of a range of estimators for extreme regression quantiles over a wide range of heavy-tailed error distributions and design matrices containing influential design points. Our main conclusion is that restricted bounded-influence estimators should be the estimators of c...
It may be misleading to estimate value-at-risk (VaR) or other risk measures assuming normally distributed innovations in a model for a heteroscedastic financial return series. Using the t-distribution instead or applying extreme value theory (EVT) have been proposed as possible solutions to this problem. We study the effect on the quality of risk e...
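The size of the effect can be sketched by comparing the 99% VaR quantile under normal and Student-t innovations with the same volatility; the volatility level and degrees of freedom below are illustrative.

```python
from math import sqrt
from scipy import stats

sigma = 0.02                 # assumed daily volatility of the return series
alpha = 0.99                 # one-day VaR confidence level

z_norm = stats.norm.ppf(alpha)
nu = 4                       # heavy-tailed t with 4 degrees of freedom
# Rescale the t quantile so the innovation variance matches 1:
# Var(t_nu) = nu / (nu - 2).
t_scaled = stats.t.ppf(alpha, df=nu) * sqrt((nu - 2) / nu)

var_normal = sigma * z_norm  # VaR under normal innovations
var_t = sigma * t_scaled     # VaR under variance-matched t innovations
```

Even after matching variances, the t-based VaR exceeds the normal one at the 99% level, which is the underestimation risk the abstract warns about.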
During 1988 and 1989, we developed the Future system to help the South African Navy identify threatening events early. The navy currently uses the system daily, improving productivity and reducing risk. The navy uses the system to monitor and interpret shipping movements recorded in a database. It is a real-time knowledge-based system and i...
An ever-increasing number of articles on Neural Networks is appearing, especially in the engineering and computer science journals. Neural Networks are applied to a wide variety of problems, but considerable successes have been obtained when applying Neural Networks to classification problems. Statisticians concern themselves with such pr...
The influence functions of the regression trimmed-mean estimators proposed by Koenker and Bassett (1978) and Welsh (1987) are bounded in the dependent-variable space but not in the independent-variable space. This article follows the approach of Mallows (1973, 1975) and modifies these estimators so that the resulting estimators have bounded-influen...
Trimmed mean type estimators are proposed for estimating the parameters of an AR(1) process. These definitions are then extended to bounded influence trimmed means in analogy to those in the regression case. The behaviour of the estimators is studied numerically under two outlier-generating models.
It has become common practice to fit GARCH models to financial time series by means of pseudo maximum likelihood. In this study we investigate the behaviour of several maximum likelihood based methods for estimating the GARCH model parameters and for estimating volatility and risk measures (VaR and expected shortfall). We consider NIG, skewed-t, t...