Article

The application of the new interest rate shock scenarios proposed by the Basel Committee: what are the implications for Italian banks?

Article
Full-text available
This paper contributes to prior literature and to the current debate concerning the prudential supervisory framework to measure interest rate risk in the banking book (IRRBB), which was significantly changed in April 2016, when the Basel Committee on Banking Supervision (BCBS) published the latest update of its measurement standards. The consultation launched by the European Banking Authority (EBA) in December 2021, aiming at introducing the supervisory outlier test (SOT) on net interest income (NII), presents several issues and policy implications which could influence in the near future banks' asset and liability management strategies, their internal control systems, risk policies and procedures. By analyzing a sample of 28 Italian commercial banks at the end of 2021, representing more than 70% of the Italian banking system's total assets, we observe that the thresholds proposed by the EBA appear very strict and significantly depend on: i) the sample considered, ii) the lower bound applied to interest rates in the downward scenarios and iii) the current level of the interest rate term structure. Our results suggest that the proposed values should be considered with caution, as it seems that their potential impacts have not been thoroughly assessed. Further analyses are therefore necessary to guarantee greater robustness of the methodology used for the calibration of the thresholds, also taking into account a wider sample of banks and longer time series, as well as the correlation between the two approaches.
Book
The AIFIRM Commission on interest rate risk in the banking book (IRRBB) was established in a period of significant changes in the related prudential supervisory framework, which started in April 2016 with the publication of the Basel Committee on Banking Supervision (BCBS)'s new standards. The BCBS confirmed the second-pillar classification of IRRBB and introduced changes in its measurement approach. European regulation has already partially adopted these standards; the European Banking Authority (EBA) will issue specific technical standards and update its guidelines by March 2022. The Commission first analyzed the most significant aspects of the recent changes in IRRBB-related regulation, assessing the potential impacts on models, processes and banks' exposure to IRRBB. Following this analysis, the Commission developed operational proposals that intend to provide support to individual risk managers and their structures in measuring, controlling and managing IRRBB and in adapting bank processes to the new regulatory requirements.
Article
Full-text available
The use of derivative instruments by Italian retail banks, particularly by the less significant ones, has decreased considerably in recent years after the boom recorded in past decades. However, the recent regulatory evolution in terms of IFRS 9 and the EMIR regulation, as well as the current level of interest rates, suggests that banks consider implementing hedging strategies to optimize their interest rate margin profile going forward. Notably, the asset and liability management strategies put in place by banks are also a function of expectations regarding the future dynamics of interest rates, which, in turn, have an impact on the cost of derivatives. In the current market environment, characterized by low expectations of an increase in interest rates, specific hedging strategies can also be implemented at reasonably low cost. Further, this paper analyzes the opportunities arising from the recent regulatory changes from an integrated perspective, taking into account the risk profile, the internal processes, and the business model of Italian retail banks. An adequate level of risk culture shared among the commercial and control functions, and a more proper and accurate use of these instruments, represent necessary conditions for derivatives to become a strategic option to improve banks' profitability and act as fundamental drivers of the shareholders' value creation process.
Article
Full-text available
This paper contributes to prior literature and to the current debate concerning recent revisions of the regulatory approach to measuring bank exposure to interest rate risk in the banking book by focusing on assessment of the appropriate amount of capital banks should set aside against this specific risk. We first discuss how banks might develop internal measurement systems to model changes in interest rates and measure their exposure to interest rate risk that are more refined and effective than are regulatory methodologies. We then develop a backtesting framework to test the consistency of methodology results with actual bank risk exposure. Using a representative sample of Italian banks between 2006 and 2013, our empirical analysis supports the need to improve the standardized shock currently enforced by the Basel Committee on Banking Supervision. It also provides useful insights for properly measuring the amount of capital to cover interest rate risk that is sufficient to ensure both financial system functioning and banking stability.
Article
Full-text available
# AIFIRM Magazine Vol. 10 N. 3 # Properly modelling the term structure of interest rates is extremely important both for correctly monitoring the risk of a portfolio and for planning adequate asset allocation strategies. Parametric modelling of spot rates, in particular, gives the analysed curve an analytical form, using time to maturity as the independent variable. This regression-based approach therefore has the advantage of being suitable for any type of scenario and what-if analysis the practitioner may wish to conduct (parallel stresses, twists, butterflies along the whole curve or portions of it). The drawback of parametric techniques is that they assume a priori a functional form to which the term structure observed on the market must adapt once a set of parameters has been calibrated. The fitting functions proposed in the technical and scientific literature are based on the typical shapes that rate curves have taken in past years. This study aims to show that these approaches cannot always deliver an optimal fit when term structures exhibit irregularities, such as those observed today on financial markets (negative rates, illiquidity, high volatility), and that this shortcoming can be overcome by implementing a machine-learning system based on neural networks and radial basis functions. The article is divided into three parts: the first presents the traditional parametric models reported in the literature (Nelson-Siegel, Svensson and Rezende-Ferreira), while the second illustrates the working principle of a network made up of artificial perceptrons and radial basis functions.
The third part applies the methodologies discussed in the two previous parts to four term structures, demonstrating the greater suitability of neural networks for modelling curves with irregular shapes.
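As a rough illustration of the radial-basis-function idea described above, the sketch below fits a Gaussian RBF expansion (plus an intercept) to a synthetic, irregular spot curve with negative short rates. The maturities, quotes, centre placement and `gamma` are all hypothetical choices for illustration, not values from the article.

```python
import numpy as np

def rbf_fit(maturities, rates, centers, gamma=1.0):
    """Least-squares fit of Gaussian radial basis functions to spot rates."""
    # Design matrix: one Gaussian bump per centre, plus an intercept column.
    Phi = np.exp(-gamma * (maturities[:, None] - centers[None, :]) ** 2)
    Phi = np.hstack([np.ones((len(maturities), 1)), Phi])
    weights, *_ = np.linalg.lstsq(Phi, rates, rcond=None)
    return weights

def rbf_eval(t, weights, centers, gamma=1.0):
    """Evaluate the fitted curve at arbitrary maturities."""
    t = np.atleast_1d(np.asarray(t, dtype=float))
    Phi = np.exp(-gamma * (t[:, None] - centers[None, :]) ** 2)
    Phi = np.hstack([np.ones((Phi.shape[0], 1)), Phi])
    return Phi @ weights

# Synthetic irregular curve: negative short rates, rising long end (in %).
mats = np.array([0.25, 0.5, 1, 2, 3, 5, 7, 10, 15, 20, 30])
rates = np.array([-0.45, -0.40, -0.30, -0.10, 0.05, 0.35, 0.55, 0.80, 0.95, 1.00, 1.05])
centers = np.linspace(0.25, 30, 6)

w = rbf_fit(mats, rates, centers, gamma=0.05)
fitted = rbf_eval(mats, w, centers, gamma=0.05)
print(np.max(np.abs(fitted - rates)))  # fitting error on the quoted points
```

Unlike a fixed parametric family, the centres and width of the basis can be adapted to whatever irregular shape the market curve happens to take.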
Article
Full-text available
# AIFIRM Magazine Vol. 10 N. 2 # It is not uncommon for a pricer to have to value options using volatility surfaces quoted on the market that are incomplete. Indeed, especially for non-ATM (at-the-money) strikes, there is not always a contributor quoting every point of the surface. When the missing entries of the matrix are few and scattered with sufficient uniformity, the task reduces to a simple missing-data problem, which can usually be solved with a two-dimensional interpolation of the values or with well-known specific techniques (regression, ANOVA, Response Surface Methodology). Sometimes, however, quotes are missing for a substantial portion of the surface, so that locally approximating a point from the values in its neighbourhood can no longer be done satisfactorily. Consider, for example, the surface of floor premiums on European inflation rates, from which the volatilities used to value options written on domestic-currency inflation-linked instruments are derived. The sections corresponding to strikes deeply in and out of the money (-2%, -1.5% and 3%) are almost entirely unquoted, as is the section at the -0.5% strike. In these cases, the problem to be solved, in order to reach a reliable pricing of the option, is to reconstruct the missing portions by analysing the surface globally. Feed-forward artificial neural networks can usefully be employed to address this issue. The article can be divided into two parts: the first, theoretical, illustrates how Artificial Neural Networks (ANN) work; the second, applied, describes the procedure for reconstructing the unquoted portions of the volatility surface.
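For the simple case of few, scattered holes mentioned above, a two-dimensional interpolation is usually enough. The sketch below fills a hypothetical premium grid with `scipy.interpolate.griddata`, falling back to nearest-neighbour at the grid edges; all strikes, maturities and premiums are invented for illustration. Reconstructing entire missing sections, as the article proposes, would instead require a global model such as a feed-forward ANN.

```python
import numpy as np
from scipy.interpolate import griddata

# Hypothetical premium grid (rows: strikes in %, cols: maturities in years);
# NaN marks a missing market quote.
strikes = np.array([-2.0, -1.0, 0.0, 1.0, 3.0])
maturities = np.array([1.0, 5.0, 10.0])
grid = np.array([
    [0.10, 0.30, np.nan],
    [0.20, 0.55, 0.90],
    [0.35, np.nan, 1.20],
    [0.50, 0.95, 1.45],
    [np.nan, 1.30, 1.80],
])

K, T = np.meshgrid(strikes, maturities, indexing="ij")
known = ~np.isnan(grid)

# Fill the sparse missing points from their neighbours (piecewise-linear in 2-D);
# points outside the convex hull of quotes get a nearest-neighbour fallback.
filled = griddata((K[known], T[known]), grid[known], (K, T), method="linear")
nearest = griddata((K[known], T[known]), grid[known], (K, T), method="nearest")
filled[np.isnan(filled)] = nearest[np.isnan(filled)]
print(filled)
```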
Article
Full-text available
Because publicly available measures of deposit runoff risk are scarce, regulators' models to measure interest rate risk in the banking book are based on very coarse assumptions about the allocation of nonmaturity deposits within the regulatory maturity ladder. Using well-established statistical approaches, we address this issue by developing a methodology that considers deposits' actual behavior in terms of both price sensitivity to changes in market rates and volume stability over time. Our model extends the current knowledge of public measures of interest rate risk and can be applied to publicly available data in a manner replicable by those outside of banking institutions. The use of different allocation criteria affects not only the size of the risk indicator but also the nature of banks' risk exposure and determines the risk inversion phenomenon, that is, banks exposed to an increase in interest rates can experience a reduction in their equity economic value if interest rates decrease. Overall, our results confirm the importance of accurate modeling of nonmaturity deposits for estimating interest rate risk in the banking book.
Article
Full-text available
Linear principal component analysis (PCA) can be extended to a nonlinear PCA by using artificial neural networks. But the benefit of curved components requires a careful control of the model complexity. Moreover, standard techniques for model selection, including cross-validation and more generally the use of an independent test set, fail when applied to nonlinear PCA because of its inherent unsupervised characteristics. This paper presents a new approach for validating the complexity of nonlinear PCA models by using the error in missing data estimation as a criterion for model selection. It is motivated by the idea that only the model of optimal complexity is able to predict missing values with the highest accuracy. While standard test set validation usually favours over-fitted nonlinear PCA models, the proposed model validation approach correctly selects the optimal model complexity.
Conference Paper
Full-text available
Experimental time courses often reveal a nonlinear behaviour. Analysing these nonlinearities is even more challenging when the observed phenomenon is cyclic or oscillatory. This means, in general, that the data describe a circular trajectory which is caused by periodic gene regulation. Nonlinear PCA (NLPCA) is used to approximate this trajectory by a curve referred to as a nonlinear component, which, in order to analyse cyclic phenomena, must be a closed curve, hence a circular component. Here, a neural network with circular units is used to generate circular components. This circular PCA is applied to gene expression data of a time course of the intraerythrocytic developmental cycle (IDC) of the malaria parasite Plasmodium falciparum. As a result, circular PCA provides a model which describes continuously the transcriptional variation throughout the IDC. Such a computational model can then be used to comprehensively analyse the molecular behaviour over time, including the identification of relevant genes at any chosen time point.
Article
Full-text available
This paper introduces a parametrically parsimonious model for yield curves that has the ability to represent the shapes generally associated with yield curves: monotonic, humped, and S-shaped. The authors find that the model explains 96 percent of the variation in bill yields across maturities during the period 1981-83. The movement of the parameters through time reflects and confirms a change in Federal Reserve monetary policy in late 1982. The ability of the fitted curves to predict the price of the long-term Treasury bond with a correlation of 0.96 suggests that the model captures important attributes of the yield/maturity relation. Copyright 1987 by the University of Chicago.
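The Nelson-Siegel spot-rate function has a simple closed form, so a minimal sketch is straightforward; the parameter values below are purely illustrative, not the ones estimated in the paper.

```python
import numpy as np

def nelson_siegel(t, beta0, beta1, beta2, tau):
    """Nelson-Siegel (1987) spot rate at maturity t > 0.

    beta0: long-run level; beta0 + beta1: short-end limit;
    beta2: size of the hump, located around the tau region.
    """
    t = np.asarray(t, dtype=float)
    x = t / tau
    slope = (1 - np.exp(-x)) / x
    return beta0 + beta1 * slope + beta2 * (slope - np.exp(-x))

# Illustrative parameters: 3% long-run level, 1% short end, a mild hump.
y = nelson_siegel([0.25, 1, 2, 5, 10, 30], beta0=3.0, beta1=-2.0, beta2=1.5, tau=2.0)
print(np.round(y, 3))
```

Depending on the signs and sizes of beta1 and beta2, the same formula produces the monotonic, humped, and S-shaped curves the paper refers to.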
Article
Full-text available
Interest income is the most important source of revenue for most banks. The aim of this paper is to assess the impact of different interest rate scenarios on the banks' interest income. As we do not know the interest rate sensitivity of real banks, we construct for each bank a portfolio with a similar composition of its assets and liabilities, called 'tracking bank'. We evaluate the effect of 260 historical interest rate shocks on the tracking banks of German savings and cooperative banks. It turns out that a sharp decrease in the steepness of the yield curve has the most negative impact on the banks' interest income.
Article
Full-text available
Visualizing and analysing the potential non-linear structure of a dataset is becoming an important task in molecular biology. This is even more challenging when the data have missing values. Here, we propose an inverse model that performs non-linear principal component analysis (NLPCA) from incomplete datasets. Missing values are ignored while optimizing the model, but can be estimated afterwards. Results are shown for both artificial and experimental datasets. In contrast to linear methods, non-linear methods were able to give better missing value estimations for non-linearly structured data. Application: we applied this technique to a time course of metabolite data from a cold stress experiment on the model plant Arabidopsis thaliana, and could approximate the mapping function from any time point to the metabolite responses. Thus, the inverse NLPCA provides greatly improved information for better understanding the complex response to cold stress. Contact: scholz@mpimp-golm.mpg.de.
Article
The paper develops a Value-at-Risk methodology to assess Italian banks’ interest rate risk exposure. By using five years of daily data, the exposure is evaluated through a principal component VaR based on Monte Carlo simulation according to two different approaches (parametric and non-parametric). The main contribution of the paper is a methodology for modelling interest rate changes when underlying risk factors are skewed and heavy-tailed. The methodology is then implemented on a one-year holding period in order to compare the results from those resulting from the Basel II standardized approach. We find that the risk measure proposed by Basel II gives an adequate description of risk, provided that duration parameters are changed to reflect market conditions. Finally, the methodology is used to perform a stress testing analysis.
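A hedged sketch of the principal-component Monte Carlo idea: extract the dominant components of historical rate changes, simulate them, map the simulations back to curve shocks, and take a quantile of the linearized P&L. The history, loadings and sensitivities below are all synthetic, and Gaussian factors are used for brevity, whereas the paper's contribution is precisely the treatment of skewed, heavy-tailed factors.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic history of daily rate changes (in %), 5 key maturities,
# generated from level/slope/curvature factors plus idiosyncratic noise.
T, n = 1250, 5
factors = rng.standard_normal((T, 3))
loadings = np.array([[0.8, 0.9, 1.0, 1.0, 0.9],     # level
                     [-0.6, -0.2, 0.0, 0.3, 0.6],   # slope
                     [0.3, -0.2, -0.4, -0.2, 0.3]]) # curvature
dr = 0.03 * factors @ loadings + 0.005 * rng.standard_normal((T, n))

# 1) Principal components of the historical rate changes.
cov = np.cov(dr, rowvar=False)
eigval, eigvec = np.linalg.eigh(cov)
order = np.argsort(eigval)[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]
k = 3  # level, slope and curvature typically explain most of the variance

# 2) Monte Carlo on the retained components, mapped back to curve shocks.
sims = rng.standard_normal((10_000, k)) * np.sqrt(eigval[:k])
shocks = sims @ eigvec[:, :k].T

# 3) Linear (duration-style) P&L and one-day 99% VaR (hypothetical sensitivities).
sensitivities = np.array([-0.5, -1.8, -4.2, -6.5, -8.0])
pnl = shocks @ sensitivities
var_99 = -np.quantile(pnl, 0.01)
print(round(var_99, 2))
```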
Article
The savings and loan crisis of the 1980s revealed the vulnerability of some depository institutions to changes in interest rates. Since that episode, U.S. bank supervisors have placed more emphasis on monitoring the interest rate risk of commercial banks. Economists at the Board of Governors of the Federal Reserve System developed a duration-based economic value model (EVM) designed to estimate the interest rate sensitivity of banks. The authors test whether measures derived from the Fed's EVM are correlated with the interest rate sensitivity of U.S. community banks. The answer to this question is important because bank supervisors rely on EVM measures for monitoring and risk-scoping bank-level interest rate sensitivity. The authors find that the Federal Reserve's EVM is indeed correlated with banks' interest rate sensitivity and conclude that supervisors can rely on this tool to help assess a bank's interest rate risk. These results are consistent with prior research that finds the average interest rate risk at banks to be modest, though the potential interaction between interest rate risk and other risk factors is not considered here.
Article
In this dissertation, the main goal is the visualisation of financial time series. We expect that visualisation of financial time series will be a useful auxiliary to technical analysis. Firstly, we review technical analysis methods and test our trading rules, which are built on the essential concepts of technical analysis. Next, we compare the quality of linear principal component analysis and nonlinear principal component analysis in financial market visualisation. We compare different methods of data preprocessing for visualisation purposes. Using visualisation, we demonstrate the difference between normal and crisis time periods. Thus, the visualisation of financial markets can be a tool to support technical analysis.
Article
In the current low interest rate environment, the possibility of a sudden increase in rates is a potentially serious threat to financial stability. As a result, analyzing interest rate risk (IRR) is critical for financial institutions and supervisory agencies. We propose a new method for generating yield curve scenarios for stress testing banks’ exposure to IRR based on the Nelson-Siegel (1987) yield-curve model. We show that our method produces yield-curve scenarios with a wider variety of slopes and shapes than scenarios generated by the historical and hypothetical methods typically used in the banking industry and proposed in the literature. We stress test the economic value of equity of a bank balance sheet based on Call Report data from a large U.S. bank. We show that our method provides more information about the bank’s exposure to IRR using fewer yield-curve scenarios than the alternative historical and hypothetical methods.
Article
I examine whether a duration transformation of financial accounting information adequately captures interest rate sensitivity in financial institutions. First, I calculate an ex-ante measure of interest rate sensitivity in complex banking institutions based upon a model developed by bank supervisors at the Federal Reserve Board of Governors; the duration measure is a combination of financial accounting information and duration proxies. Second, I test for an association between this ex-ante measure of interest rate sensitivity and both accounting and capital market ex-post measures of interest rate sensitivity. The results show a predictable relationship between ex-ante modeled interest rate sensitivity and banks’ ex-post accounting performance and stock returns given observed changes in market interest rates. I conclude that an interest rate sensitivity model based upon a combination of financial accounting information and duration proxies can be used to predict interest rate sensitivity in complex banks, including publicly-traded bank holding companies. The issue is important because interest rate risk management is integral to financial institutions, and its reliable measurement using publicly-available accounting data is relevant to bank managers, researchers, market participants, and regulators.
Article
This paper analyses the robustness of the standardised framework proposed by the Basel Committee on Banking Supervision (2004b) to quantify the interest rate risk of banks. We generalise this framework and study the change in the estimated level of interest rate risk if the strict assumptions of the standardised framework are violated. Using data on the German universal banking system, we find that estimates of the interest rate risk are very sensitive to the framework's assumptions. We conclude that the results obtained using the standardised framework in its current specification should be treated with caution when used for supervisory and risk management purposes.
Article
Nonlinear principal component analysis is a novel technique for multivariate data analysis, similar to the well-known method of principal component analysis. NLPCA, like PCA, is used to identify and remove correlations among problem variables as an aid to dimensionality reduction, visualization, and exploratory data analysis. While PCA identifies only linear correlations between variables, NLPCA uncovers both linear and nonlinear correlations, without restriction on the character of the nonlinearities present in the data. NLPCA operates by training a feedforward neural network to perform the identity mapping, where the network inputs are reproduced at the output layer. The network contains an internal “bottleneck” layer (containing fewer nodes than input or output layers), which forces the network to develop a compact representation of the input data, and two additional hidden layers. The NLPCA method is demonstrated using time-dependent, simulated batch reaction data. Results show that NLPCA successfully reduces dimensionality and produces a feature space map resembling the actual distribution of the underlying system parameters.
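A minimal sketch of the autoassociative "bottleneck" idea, using scikit-learn's `MLPRegressor` trained on the identity mapping over synthetic curve data. Note this is only an approximation with off-the-shelf tools: the approach described above uses a dedicated five-layer network rather than a generic regressor.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

# Data lying near a one-dimensional curve embedded in 3-D (a polynomial arc).
t = rng.uniform(-1, 1, 500)
X = np.column_stack([t, t**2, t**3]) + 0.02 * rng.standard_normal((500, 3))

# Autoassociative network: 3 -> 8 -> 1 -> 8 -> 3. Reproducing the inputs at the
# output layer while squeezing through a 1-unit bottleneck forces the network
# to learn a compact, nonlinear 1-D representation of the data.
ae = MLPRegressor(hidden_layer_sizes=(8, 1, 8), activation="tanh",
                  solver="lbfgs", max_iter=5000, random_state=0)
ae.fit(X, X)
X_hat = ae.predict(X)
print(np.mean((X - X_hat) ** 2))  # reconstruction error of the nonlinear PC
```

A linear PCA restricted to one component could only project onto a straight line; the bottleneck network can follow the curvature of the arc.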
Chapter
Nonlinear principal component analysis (NLPCA) as a nonlinear generalisation of standard principal component analysis (PCA) means to generalise the principal components from straight lines to curves. This chapter aims to provide an extensive description of the autoassociative neural network approach for NLPCA. Several network architectures will be discussed including the hierarchical, the circular, and the inverse model with special emphasis to missing data. Results are shown from applications in the field of molecular biology. This includes metabolite data analysis of a cold stress experiment in the model plant Arabidopsis thaliana and gene expression analysis of the reproductive cycle of the malaria parasite Plasmodium falciparum within infected red blood cells.
Article
Summarising a high dimensional data set with a low dimensional embedding is a standard approach for exploring its structure. In this paper we provide an overview of some existing techniques for discovering such embeddings. We then introduce a novel probabilistic interpretation of principal component analysis (PCA) that we term dual probabilistic PCA (DPPCA). The DPPCA model has the additional advantage that the linear mappings from the embedded space can easily be non-linearised through Gaussian processes. We refer to this model as a Gaussian process latent variable model (GP-LVM). Through analysis of the GP-LVM objective function, we relate the model to popular spectral techniques such as kernel PCA and multidimensional scaling. We then review a practical algorithm for GP-LVMs in the context of large data sets and develop it to also handle discrete valued data and missing attributes. We demonstrate the model on a range of real-world and artificially generated data sets.
Article
We propose a procedure for representing a time series as the sum of a smoothly varying trend component and a cyclical component. We document the nature of the co-movements of the cyclical components of a variety of macroeconomic time series. We find that these co-movements are very different than the corresponding co-movements of the slowly varying trend components.
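The trend/cycle decomposition described above is commonly implemented as the Hodrick-Prescott filter: the trend minimises the squared fit error plus a smoothness penalty on its second differences. A compact dense-matrix sketch (fine for short series; the smoothing parameter and the test series are illustrative):

```python
import numpy as np

def hp_filter(y, lam=1600.0):
    """Hodrick-Prescott decomposition of y into (trend, cycle).

    The trend minimises sum((y - tau)^2) + lam * sum((d2 tau)^2); the
    first-order conditions give the linear system (I + lam * K'K) tau = y,
    where K is the second-difference operator.
    """
    y = np.asarray(y, dtype=float)
    T = len(y)
    K = np.zeros((T - 2, T))
    for i in range(T - 2):
        K[i, i:i + 3] = [1.0, -2.0, 1.0]
    trend = np.linalg.solve(np.eye(T) + lam * (K.T @ K), y)
    return trend, y - trend

# A linear trend plus a cyclical component: the filter separates the two.
t = np.arange(200)
series = 0.05 * t + np.sin(2 * np.pi * t / 40)
trend, cycle = hp_filter(series, lam=1600.0)
```

For long series a sparse solver would be preferable, since the system matrix is pentadiagonal.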
Article
The 2008/9 financial crisis highlighted the importance of evaluating vulnerabilities owing to interconnectedness, or Too-Connected-to-Fail risk, among financial institutions for country monitoring, financial surveillance, investment analysis and risk management purposes. This paper illustrates the use of balance sheet-based network analysis to evaluate interconnectedness risk, under extreme adverse scenarios, in banking systems in mature and emerging market countries, and between individual banks in Chile, an advanced emerging market economy.
Article
While conventional farming systems face serious problems of sustainability, organic agriculture is seen as a more environmentally friendly system as it favours renewable resources, recycles nutrients, uses the environment’s own systems for controlling pests and diseases, sustains ecosystems, protects soils, and reduces pollution. At the same time organic farming promotes animal welfare, the use of natural foodstuffs, product diversity and the avoidance of waste, among other practices. However, the future of organic agriculture will depend on its economic viability and on the determination shown by governments to protect these practices. This paper performs panel regressions with a sample of Catalan farms (Spain) to test the influence of organic farming on farm output, costs and incomes. It analyses the cost structures of both types of farming and comments on their social and environmental performance.
Article
The relationship between cointegration and error correction models, first suggested by Granger, is here extended and used to develop estimation procedures, tests, and empirical examples. A vector of time series is said to be cointegrated with cointegrating vector α if each element is stationary only after differencing, while linear combinations α′x_t are themselves stationary. A representation theorem connects the moving average, autoregressive, and error correction representations for cointegrated systems. A simple but asymptotically efficient two-step estimator is proposed and applied. Tests for cointegration are suggested and examined by Monte Carlo simulation. A series of examples are presented. Copyright 1987 by The Econometric Society.
  • D. Curcio, I. Gianfrancesco, La misurazione del rischio di tasso di interesse del portafoglio bancario nell'ambito di Basilea 2: quali le possibili criticità nella ricerca di nuove best practices
  • I. Gianfrancesco, C. Giliberto, La vischiosità dei depositi a vista durante la recente crisi finanziaria: implicazioni in una prospettiva di risk management
  • M. Scholz, Nonlinear PCA based on neural networks, Diploma Thesis, Dep. of Computer Science, Humboldt University Berlin (2002)
  • O. Caligaris, Le reti neurali, Lettera matematica PRISTEM, 61: 20-28 (2007)
  • Basel Committee on Banking Supervision (2004), Principles for Management and Supervision of Interest Rate Risk, Bank for International Settlements