Working Paper

Loss Given Default of Secured Commercial Loans

... However, internal rating models based on the standard linear econometric approach have generally been shown to exhibit poor performance in forecasting loss given default (Altman and Hotchkiss (2010)). Studies of credit risk show that while ML models outperform traditional models, their performance depends on the specific ML model, the environment, and the sample used in the analysis (see Bastos (2010), Qi and Zhao (2011), Loterman et al. (2012), Tows (2016), Bazarbash (2017), and Nazemi et al. (2018)). Motivated by these studies, this paper covers the most common and powerful ML methods applied for credit analysis. ...
... See, for instance, Loterman et al. (2012), among others, who show superior performance of machine learning models, including support vector machines and neural networks, relative to the typical linear models in predicting loss given default. For a recent example, see Bazarbash (2019). For a review of recent academic literature studying the use of digital technology in finance, see Gomber et al. (2017). ...
Article
Recent advances in digital technology and big data have allowed FinTech (financial technology) lending to emerge as a potentially promising solution to reduce the cost of credit and increase financial inclusion. However, machine learning (ML) methods that lie at the heart of FinTech credit have remained largely a black box for the nontechnical audience. This paper contributes to the literature by discussing potential strengths and weaknesses of ML-based credit assessment through (1) presenting core ideas and the most common techniques in ML for the nontechnical audience; and (2) discussing the fundamental challenges in credit risk analysis. FinTech credit has the potential to enhance financial inclusion and outperform traditional credit scoring by (1) leveraging nontraditional data sources to improve the assessment of the borrower's track record; (2) appraising collateral value; (3) forecasting income prospects; and (4) predicting changes in general conditions. However, because of the central role of data in ML-based analysis, data relevance should be ensured, especially in situations when a deep structural change occurs, when borrowers could counterfeit certain indicators, and when agency problems arising from information asymmetry could not be resolved. To avoid digital financial exclusion and redlining, variables that trigger discrimination should not be used to assess credit rating. JEL Classification Numbers: C52, C53, C55, G21, G23
Article
Full-text available
The introduction of the Basel II Accord has had a huge impact on financial institutions, allowing them to build credit risk models for three key risk parameters: PD (probability of default), LGD (loss given default) and EAD (exposure at default). Until recently, credit risk research has focused largely on the estimation and validation of the PD parameter, and much less on LGD modeling. In this first large-scale LGD benchmarking study, various regression techniques for modeling and predicting LGD are investigated. These include one-stage models, such as those built by ordinary least squares regression, beta regression, robust regression, ridge regression, regression splines, neural networks, support vector machines and regression trees, as well as two-stage models which combine multiple techniques. A total of 24 techniques are compared using six real-life loss datasets from major international banks. It is found that much of the variance in LGD remains unexplained, as the average prediction performance of the models in terms of R2 ranges from 4% to 43%. Nonetheless, there is a clear trend that non-linear techniques, and in particular support vector machines and neural networks, perform significantly better than more traditional linear techniques. Also, two-stage models built by a combination of linear and non-linear techniques are shown to have a similarly good predictive power, with the added advantage of having a comprehensible linear model component.
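To make the flavour of such a benchmarking exercise concrete, the following minimal sketch compares a linear model with two non-linear learners on out-of-sample R²; the data, features and model settings are synthetic placeholders rather than anything from the study's proprietary loss datasets.

```python
# Minimal sketch of an LGD benchmarking comparison: OLS vs. two non-linear
# learners, evaluated by out-of-sample R^2. All data are synthetic placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 5000
X = rng.normal(size=(n, 5))                      # loan/borrower covariates (synthetic)
signal = 0.4 * np.tanh(X[:, 0]) + 0.2 * X[:, 1] * X[:, 2]
lgd = np.clip(0.4 + signal + rng.normal(scale=0.25, size=n), 0.0, 1.0)

X_tr, X_te, y_tr, y_te = train_test_split(X, lgd, test_size=0.3, random_state=0)
models = {
    "OLS": LinearRegression(),
    "Random forest": RandomForestRegressor(n_estimators=200, random_state=0),
    "Neural network": MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name:15s} out-of-sample R^2: {r2_score(y_te, model.predict(X_te)):.3f}")
```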
Article
Full-text available
In this research project, we will analyze the recovery rates of defaulted bonds in the US corporate bond market. Our data set has been obtained from the Trade Reporting and Compliance Engine (TRACE) database maintained by the Financial Industry Regulatory Authority (FINRA) and will allow us, for the first time, to analyze the traded prices and volumes of defaulted bonds based on a complete set of transaction data. Analyzing in detail the microstructure of trading will allow us to estimate reliable market-based recovery rates for a broad cross-section of defaulted corporate bonds. Our emphasis in this research will be on investigating the relation between these recovery rates and a comprehensive set of potential determinants suggested by various theoretical models, e.g., bond characteristics, firm fundamentals and indicators of overall macroeconomic conditions. In addition, we will analyze – for the first time – the effect of liquidity on the recovery rates, which is particularly interesting due to the potential illiquidity of bonds following default. The results would offer new insights in the context of the credit risk and liquidity literature, since existing studies of the recovery rate are not as comprehensive in scope as our proposed research project.
Article
Full-text available
The New Basel Accord, which was implemented in 2007, has made a significant difference to the use of modelling within financial organisations. In particular it has highlighted the importance of Loss Given Default (LGD) modelling. We propose a decision tree approach to modelling LGD for unsecured consumer loans where the uncertainty in some of the nodes is modelled using a mixture model, where the parameters are obtained using regression. A case study based on default data from the in-house collections department of a UK financial organisation is used to show how such regression can be undertaken.
Book
Full-text available
Throughout history, rich and poor countries alike have been lending, borrowing, crashing--and recovering--their way through an extraordinary range of financial crises. Each time, the experts have chimed, "this time is different"--claiming that the old rules of valuation no longer apply and that the new situation bears little similarity to past disasters. With this breakthrough study, leading economists Carmen Reinhart and Kenneth Rogoff definitively prove them wrong. Covering sixty-six countries across five continents, This Time Is Different presents a comprehensive look at the varieties of financial crises, and guides us through eight astonishing centuries of government defaults, banking panics, and inflationary spikes--from medieval currency debasements to today's subprime catastrophe. Carmen Reinhart and Kenneth Rogoff, leading economists whose work has been influential in the policy debate concerning the current financial crisis, provocatively argue that financial combustions are universal rites of passage for emerging and established market nations. The authors draw important lessons from history to show us how much--or how little--we have learned. Using clear, sharp analysis and comprehensive data, Reinhart and Rogoff document that financial fallouts occur in clusters and strike with surprisingly consistent frequency, duration, and ferocity. They examine the patterns of currency crashes, high and hyperinflation, and government defaults on international and domestic debts--as well as the cycles in housing and equity prices, capital flows, unemployment, and government revenues around these crises. While countries do weather their financial storms, Reinhart and Rogoff prove that short memories make it all too easy for crises to recur. An important book that will affect policy discussions for a long time to come, This Time Is Different exposes centuries of financial missteps.
Article
Full-text available
In this paper we analyse a comprehensive database of 149,378 recovery rates on Italian bank loans. We investigate a new methodology to compute the recovery percentage, which we suggest treating as a mixed random variable. To estimate the probability density function of such a mixture, we propose the mixture of beta kernels estimator and analyse its performance by Monte Carlo simulations. The application of these proposals to the Bank of Italy's data shows that, even if we remove the endpoints from the support of the recovery rate, the density function estimate is far from being a beta function.
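As a rough illustration of kernel smoothing on the unit interval, the sketch below implements a plain beta kernel density estimator, a simpler relative of the mixture-of-beta-kernels estimator proposed in the paper; the bandwidth and the synthetic recovery data are assumptions for illustration only.

```python
# Sketch of a beta kernel density estimator for recovery rates on [0, 1]
# (Chen-type kernels; the bandwidth b is a free choice, not from the paper).
import numpy as np
from scipy.stats import beta

def beta_kernel_density(x_grid, data, b=0.05):
    """Estimate the density at points x_grid from observations data in [0, 1]."""
    x_grid = np.asarray(x_grid, dtype=float)
    dens = np.empty_like(x_grid)
    for j, x in enumerate(x_grid):
        # Beta kernel whose shape is governed by the evaluation point x
        dens[j] = beta.pdf(data, x / b + 1.0, (1.0 - x) / b + 1.0).mean()
    return dens

# Example with synthetic, bimodal recovery rates (placeholder data)
rng = np.random.default_rng(1)
recoveries = np.concatenate([rng.beta(2, 8, 3000), rng.beta(8, 2, 2000)])
grid = np.linspace(0.0, 1.0, 101)
print(beta_kernel_density(grid, recoveries, b=0.05)[:5])
```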
Article
Full-text available
The aim of credit risk models is to identify and quantify future outcomes of a set of risk measurements. In other words, the model's purpose is to provide as good an approximation as possible of what constitutes the true underlying risk relationship between a set of inputs and a target variable. These parameters are used for regulatory capital calculations to determine the capital needed to serve as a buffer that protects depositors in adverse economic conditions. In order to manage model risk, financial institutions need to set up validation processes so as to monitor the quality of the models on an ongoing basis. Validation is important to inform all stakeholders (e.g. board of directors, senior management, regulators, investors, borrowers, …) and as such to allow them to make better decisions. Validation can be considered from both a quantitative and qualitative point of view. Backtesting and benchmarking are key quantitative validation tools. In backtesting, the predicted risk measurements (PD, LGD, CCF) will be contrasted with observed measurements using a workbench of available test statistics to evaluate the calibration, discrimination and stability of the model. A timely detection of reduced performance is crucial since it directly impacts profitability and risk management strategies. The aim of benchmarking is to compare internal risk measurements with external risk measurements so as to better gauge the quality of the internal rating system. This paper will focus on the quantitative PD validation process within a Basel II context. We will set forth a traffic light indicator approach that employs all relevant statistical tests to quantitatively validate the PD model in use, and document this complete approach with a real-life case study.
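One common building block of such a traffic light approach is a binomial test of PD calibration per rating grade. The sketch below shows this single ingredient with illustrative thresholds; it is not the full workbench of tests or the thresholds used in the paper's case study.

```python
# Hedged sketch of one ingredient of a traffic-light PD backtest: a binomial
# calibration test per rating grade. The green/yellow/red cut-offs are
# illustrative assumptions.
from scipy.stats import binom

def traffic_light(pd_estimate, n_obligors, n_defaults, yellow=0.95, red=0.999):
    """Classify a grade by the binomial probability of seeing at most n_defaults."""
    p = binom.cdf(n_defaults, n_obligors, pd_estimate)
    if p < yellow:
        return "green"
    elif p < red:
        return "yellow"
    return "red"

# Example: a grade with an estimated PD of 2%, 500 obligors, 18 observed defaults
print(traffic_light(0.02, 500, 18))
```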
Article
Full-text available
We examine the effect of US branch banking deregulations on the entry size of new firms using micro-data from the US Census Bureau. We find that the average entry size for startups did not change following the deregulations. However, among firms that survived at least four years, a greater proportion entered either at their maximum size or closer to the maximum size in the first year. The magnitude of these effects was small compared to the much larger changes in entry rates of small firms following the reforms. Our results highlight that this large-scale entry at the extensive margin can obscure the more subtle intensive margin effects of changes in financing constraints.
Article
Full-text available
Customer relationships arise between banks and firms because, in the process of lending, a bank learns more than others about its own customers. This information asymmetry allows lenders to capture some of the rents generated by their older customers; competition, thus, drives banks to lend to new firms at interest rates that initially generate expected losses. As a result, the allocation of capital is shifted toward lower quality and inexperienced firms. This inefficiency is eliminated if complete contingent contracts are written or, when this is costly, if banks can make nonbinding commitments that, in equilibrium, are backed by reputation.
Article
Full-text available
The aim of this paper is to develop an aggregate stability index for the Romanian financial system, which is meant to enhance the set of analyses used by the authorities to assess financial system stability. The index takes into consideration indicators related to financial system development, vulnerability and soundness, as well as indicators which characterise the international economic climate. Another purpose of our study is to forecast the level of financial stability using a stochastic simulation model. The outcome of the study shows an improvement in the stability of the Romanian financial system during the period 1999-2007. The constructed aggregate index captures periods of financial turbulence such as the 1998-1999 Romanian banking crisis and the 2007 subprime crisis. The forecasted values of the index show a deterioration of financial stability in 2009, influenced by the estimated decline in financial and economic activity.
Article
Full-text available
This paper develops a computable general equilibrium model in which endogenous agency costs can potentially alter business-cycle dynamics. A principal conclusion is that the agency-cost model replicates the empirical fact that output growth displays positive autocorrelation at short horizons. This hump-shaped output behavior arises because households delay their investment decisions until agency costs are at their lowest--a point in time several periods after the initial shock.
Article
This paper extends what we know about loss given default (LGD) on commercial loans by studying certain types of these loans that have been excluded from previous research but that may be more representative of loans held by small and mid-sized banks. We use a newly available dataset on commercial loan losses from failed banks that were resolved by the FDIC using loss share agreements. We examine LGD for more than 50,000 distressed loans, broken into three categories: construction and development loans, other commercial real estate loans, and commercial and industrial loans. We compare the characteristics of these loans with those of previous studies and find many similarities as well as significant differences. We explore the relationship between LGD and default date, workout period, loan modification, asset size, bank characteristics, geography, lien status, and other factors that may be related to loss severity. The results inform commercial lenders and regulators about the factors that influence losses on defaulted loans during periods of distress, and provide a useful benchmark for stress testing for smaller banks. To the best of our knowledge, this paper also offers the first published empirical analysis of LGD for construction and development loans.
Article
This paper extends what we know about loss given default (LGD) by examining a newly available dataset on commercial real estate (CRE) loan losses. These data come from 295 failed banks resolved by the FDIC using loss-share agreements between 2008 and 2013. We examine over 14,000 distressed CRE loans to study the relationship between LGD and loan size, workout period, loan seasoning, asset price changes over the life of the loan, and other factors related to losses. We also examine the relationship between LGD and certain bank characteristics. The results inform commercial lenders and regulators about the factors that influence losses on defaulted loans during periods of distress.
Article
We conduct a comprehensive study of some parametric models that are designed to fit the unusual bounded and bimodal distribution of loss given default (LGD). We first examine a smearing estimator, a Monte Carlo estimator and a global adjustment approach to refine transformation regression models that address issues with LGD boundary values. Although these refinements only marginally improve model performance, the smearing and Monte Carlo estimators help to reduce the sensitivity of transformation regressions to the adjustment factor. We then conduct a horse race among the refined transformation methods, five parametric models that are specifically suitable for LGD modeling (two-step, inflated beta, Tobit, censored gamma and two-tiered gamma regressions), fractional response regression and standard linear regression. We find that the sophisticated parametric models do not clearly outperform the simpler ones in either predictive accuracy or rank-ordering ability, in-sample, out-of-sample or out of time. Therefore, it is important for modelers and researchers to choose the model that is appropriate for their particular data set, considering differences in model complexity, computational burden, ease of implementation and model performance.
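To illustrate what a smearing-type refinement of a transformation regression looks like, the following sketch fits an OLS model on a probit-transformed LGD (after an epsilon adjustment at the boundaries) and back-transforms by averaging over residuals; the data, the epsilon value and the choice of transformation are assumptions, not the paper's specification.

```python
# Minimal sketch of a Duan-style smearing retransformation for an LGD model fit
# on a transformed scale. Data and epsilon are illustrative placeholders.
import numpy as np
from scipy.stats import norm
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
n, eps = 4000, 1e-4
X = rng.normal(size=(n, 3))
lgd = np.clip(norm.cdf(0.5 * X[:, 0] + rng.normal(scale=0.8, size=n)), 0.0, 1.0)

y_adj = np.clip(lgd, eps, 1.0 - eps)       # epsilon adjustment at 0 and 1
z = norm.ppf(y_adj)                        # transform to the real line
ols = LinearRegression().fit(X, z)
resid = z - ols.predict(X)

def predict_lgd_smearing(X_new):
    """Back-transform by averaging over in-sample residuals (smearing estimator)."""
    zhat = ols.predict(X_new)
    return norm.cdf(zhat[:, None] + resid[None, :]).mean(axis=1)

print(predict_lgd_smearing(X[:5]))
```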
Article
In credit risk modeling, banks and insurance companies routinely use a single model for estimating key risk parameters. Combining several models to make a final prediction is not often considered. Using an ensemble or a collection of models rather than a single model can improve the accuracy and robustness of prediction results. In this study, we investigate two well-established ensemble learning methods (stochastic gradient boosting and random forest) and propose two new ensembles (ensemble by partial least squares and bag-boosting) in the application of predicting the loss given default. We demonstrate that an ensemble approach significantly increases the discriminatory power of the model compared with a single decision tree. In addition, the ensemble learning methods can be applied directly to predicting the exposure at default and probability of default with some simple modifications. The proposed approaches introduce a novel modeling framework that banks and other financial institutions can use to estimate and validate credit risk parameters based on the internal data of different portfolios. Moreover, the proposed approaches can be readily extended to general portfolio risk modeling in the areas of regulatory capital and economic capital management, loss forecasting, stress testing and pre-provision net revenue projections.
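A minimal sketch of the kind of comparison the study reports follows, contrasting a single regression tree with random forest and gradient boosting on cross-validated R²; the synthetic data and hyperparameters are placeholders, not the portfolios or settings used in the study.

```python
# Sketch comparing a single regression tree with two ensemble learners for LGD
# on synthetic placeholder data.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor

rng = np.random.default_rng(3)
n = 3000
X = rng.normal(size=(n, 6))
lgd = np.clip(0.5 + 0.3 * np.sign(X[:, 0]) * X[:, 1] + rng.normal(scale=0.2, size=n), 0, 1)

for name, model in [
    ("Single tree", DecisionTreeRegressor(max_depth=4, random_state=0)),
    ("Random forest", RandomForestRegressor(n_estimators=300, random_state=0)),
    ("Gradient boosting", GradientBoostingRegressor(random_state=0)),
]:
    score = cross_val_score(model, X, lgd, cv=5, scoring="r2").mean()
    print(f"{name:18s} mean CV R^2: {score:.3f}")
```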
Book
An Introduction to Statistical Learning provides an accessible overview of the field of statistical learning, an essential toolset for making sense of the vast and complex data sets that have emerged in fields ranging from biology to finance to marketing to astrophysics in the past twenty years. This book presents some of the most important modeling and prediction techniques, along with relevant applications. Topics include linear regression, classification, resampling methods, shrinkage approaches, tree-based methods, support vector machines, clustering, and more. Color graphics and real-world examples are used to illustrate the methods presented. Since the goal of this textbook is to facilitate the use of these statistical learning techniques by practitioners in science, industry, and other fields, each chapter contains a tutorial on implementing the analyses and methods presented in R, an extremely popular open source statistical software platform. Two of the authors co-wrote The Elements of Statistical Learning (Hastie, Tibshirani and Friedman, 2nd edition 2009), a popular reference book for statistics and machine learning researchers. An Introduction to Statistical Learning covers many of the same topics, but at a level accessible to a much broader audience. This book is targeted at statisticians and non-statisticians alike who wish to use cutting-edge statistical learning techniques to analyze their data. The text assumes only a previous course in linear regression and no knowledge of matrix algebra.
Article
In many domains, the combined opinion of a committee of experts provides better decisions than the judgment of a single expert. This paper shows how to implement a successful ensemble strategy for predicting recovery rates on defaulted debts. Using data from Moody’s Ultimate Recovery Database, it is shown that committees of models derived from the same regression method present better forecasts of recovery rates than a single model. More accurate predictions are observed whether we forecast bond or loan recoveries, and across the entire range of actual recovery values.
Article
The 2007-09 financial crisis illustrated the importance of healthy banks for the overall stability of the financial system and economy. Because banking is inherently risky, the health of banks depends on their ability to manage risk and exposure to losses. An important component of a strong risk management system is a bank's ability to assess the potential losses on its investments. One factor that determines the extent of losses is the recovery rate on loans and bonds that are in default. For example, the recovery rate is said to be 50 percent if the creditor is able to recover only half the amount of principal and accrued interest due. Drawing on more than 30 years of recovery data on defaulted debt instruments, Mora finds that the state of the economy helps determine creditor recovery rates. Industry distress also drives recovery rates, and evidence suggests industry distress can be triggered by a weak economy.
Article
Economists have long recognized the importance of information veracity in valuing risky securities. Market participants concerned about the credibility of information measures may require additional compensation to entice them to hold stocks with less transparent information. These same securities are expected to display greater sensitivities to measures of market sentiment. We find that investor sentiment sensitivities increase directly with multiple measures of opacity in the cross-section. Next we examine the extent to which sentiment sensitivities are priced in an asset pricing context. Using the Jha et al. (2009) model of conditional performance evaluation, we find an inverse relation between ex ante known investor sentiment and the marginal performance of opaque stocks. In contrast, translucent stocks exhibit relatively little variability in performance across levels of sentiment.
Article
The sample of observed defaults significantly understates the average firm's true expected cost of default due to a sample selection bias. I use a dynamic capital structure model to estimate firm-specific expected default costs and quantify the selection bias. The average firm expects to lose 45% of firm value in default, a cost higher than existing estimates. However, the average cost among defaulted firms in the estimated model is only 25%, a value consistent with existing empirical estimates from observed defaults. This substantial selection bias helps to reconcile the levels of leverage and default costs observed in the data.
Article
This article surveys the macroeconomic implications of financial frictions. Financial frictions lead to persistence and, when combined with illiquidity, to non-linear amplification effects. Risk is endogenous and liquidity spirals cause financial instability. Increasing margins further restrict leverage and exacerbate downturns. A demand for liquid assets and a role for money emerges. The market outcome is generically not even constrained efficient, and the issuance of government debt can lead to a Pareto improvement. While financial institutions can mitigate frictions, they introduce additional fragility and, through their erratic money creation, harm price stability.
Article
Financial innovation, changing regulatory requirements and competitive pressures have fueled recent interest in modeling the payoffs to corporate debt in the event of default. Building on the findings of empirical research, we propose in this paper a simple, flexible approach to forecasting the distribution of defaulted debt recovery outcomes. Our approach is based on mixtures of Gaussian distributions, explicitly conditioned on borrower characteristics, debt instrument characteristics and credit conditions at the time of default. Using Moody's Ultimate Recovery Database, we show that our mixture specification out-performs popular regression based alternatives whilst providing a much richer characterization of how conditioning variables affect distributional properties.
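For a flavour of the mixture approach, the sketch below fits an unconditional two-component Gaussian mixture to synthetic recovery outcomes; the paper's specification additionally conditions the mixture on borrower, instrument and credit-condition covariates, a step omitted here.

```python
# Sketch of fitting an unconditional Gaussian mixture to recovery outcomes.
# All data are synthetic placeholders.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(4)
recoveries = np.concatenate([
    rng.normal(0.15, 0.08, 2000),   # low-recovery cluster (synthetic)
    rng.normal(0.85, 0.10, 1500),   # high-recovery cluster (synthetic)
]).clip(0, 1).reshape(-1, 1)

gm = GaussianMixture(n_components=2, random_state=0).fit(recoveries)
print("weights:", gm.weights_.round(3))
print("means:  ", gm.means_.ravel().round(3))
```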
Article
Using Moody’s Ultimate Recovery Database, we estimate a model for bank loan recoveries using variables reflecting loan and borrower characteristics, industry and macroeconomic conditions, and several recovery process variables. We find that loan characteristics are more significant determinants of recovery rates than are borrower characteristics prior to default. Industry and macroeconomic conditions are relevant, as are prepackaged bankruptcy arrangements. We examine whether a commonly used proxy for recovery rates, the 30-day post-default trading price of the loan, represents an efficient estimate of actual recoveries and find that such a proxy is biased and inefficient.
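A standard way to examine such a proxy for bias is to regress realised recoveries on the 30-day post-default price and jointly test that the intercept is 0 and the slope is 1; the sketch below illustrates this with synthetic data and hypothetical variable names, not the Moody's sample.

```python
# Sketch of a bias test for a recovery-rate proxy: joint test of intercept = 0
# and slope = 1 in a regression of realised recovery on the proxy.
# Data and column names are synthetic placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 800
price = rng.uniform(0.05, 0.95, n)                       # 30-day post-default price
actual = np.clip(0.05 + 0.85 * price + rng.normal(scale=0.1, size=n), 0, 1)

df = pd.DataFrame({"actual": actual, "price": price})
fit = smf.ols("actual ~ price", data=df).fit()
print(fit.params)
print(fit.f_test("Intercept = 0, price = 1"))            # joint unbiasedness test
```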
Article
The concept of robustness started emerging in the financial and economics literature in the late 1990s. In particular, principles from robust control theory have been used by economic decision makers to investigate the fragility of decision rules across a range of economic models. In line with this tendency, our article applies principles from robustness to a situation where the decision maker is a bank owner and the decision rule determines the optimal provisioning strategy for loan losses. In this regard, we recognize that bank provisions are made for debts that have been identified as impaired or non-performing. Our first objective is to formulate a dynamic banking loan loss model involving a provisioning portfolio consisting of provisions for expected losses and loan loss reserves for unexpected losses. Here, unexpected loan losses and provisioning for expected losses are modeled via a compound Poisson process and an exponential Lévy process, respectively. Historical evidence from Organisation for Economic Co-operation and Development countries assists in confirming some of the modeling choices made. This setup naturally leads to a finite-horizon provisioning problem that may be solved via a mixed optimal/robust control approach involving a constraint for risk. Our investigation concludes with a brief analysis of some of the robustness issues and suggestions for topics of possible future research.
Article
This paper reviews empirical research on the use of private and court-supervised mechanisms for resolving default and reorganizing companies in financial distress. Starting with a simple framework for financial distress and a quick overview of the theoretical research in this area, we proceed to summarize and synthesize the empirical research in the areas of financial distress, asset and debt restructuring, and features of the formal bankruptcy procedures in the US and around the world. Studies of out-of-court restructurings (workouts and exchange offers), corporate governance issues relating to distressed restructurings, and the magnitude of the costs and the efficiency of bankruptcy reorganizations are among the topics covered.
Article
This article, based on a 27-year study, describes the characteristics of commercial and industrial loan defaults in Latin America (LA), including in particular the loss in the event of default (LIED). For banks, improved understanding of losses enables lenders to make better pricing decisions, to allocate capital more efficiently, and to obtain more accurate estimates of loan losses and valuations of existing loan portfolios. For investors, the benefit is a more informed decision about portfolio diversification. The article provides characteristics of 1,149 Latin American defaults, examines the distribution of defaults by country, depicts the stability of LA LIED by year, and reports the effects of sovereign events on Latin American corporate loan loss rates.
Article
Structural models of default calibrated to historical default rates, recovery rates, and Sharpe ratios typically generate Baa--Aaa credit spreads that are significantly below historical values. However, this "credit spread puzzle" can be resolved if one accounts for the fact that default rates and Sharpe ratios strongly covary; both are high during recessions and low during booms. As a specific example, we investigate credit spread implications of the Campbell and Cochrane (1999) pricing kernel calibrated to equity returns and aggregate consumption data. Identifying the historical surplus consumption ratio from aggregate consumption data, we find that the implied level and time variation of spreads match historical levels well.
Article
We compare six modeling methods for Loss Given Default (LGD). We find that non-parametric methods (regression tree and neural network) perform better than parametric methods both in and out of sample when over-fitting is properly controlled. Among the parametric methods, fractional response regression has a slight edge over OLS regression. Performance of the transformation methods (inverse Gaussian and beta transformation) is very sensitive to ε, a small adjustment made to LGDs of 0 or 1 prior to transformation. Model fit is poor when ε is too small or too large, although the fitted LGDs have a strongly bi-modal distribution with very small ε. Therefore, models that produce a strongly bi-modal pattern do not necessarily have good model fit and accurate LGD predictions. Even with an optimal ε, the performance of the transformation methods can only match that of OLS.
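The sensitivity to ε can be illustrated with a small experiment: apply the epsilon adjustment, fit the transformation regression, and compare fit on the LGD scale across several values of ε. The sketch below does this for the inverse-Gaussian (probit) transformation on synthetic data; all settings are assumptions for illustration.

```python
# Sketch of the epsilon adjustment: LGDs of exactly 0 or 1 are shifted into
# (0, 1) before a probit transformation, and the fit can be sensitive to the
# size of epsilon. Synthetic placeholder data.
import numpy as np
from scipy.stats import norm
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(6)
n = 3000
X = rng.normal(size=(n, 3))
lgd = np.clip(norm.cdf(0.8 * X[:, 0] + rng.normal(scale=0.6, size=n)), 0, 1)
lgd[rng.random(n) < 0.15] = 0.0        # mass points at the boundaries
lgd[rng.random(n) < 0.10] = 1.0

for eps in (1e-6, 1e-3, 1e-1):
    z = norm.ppf(np.clip(lgd, eps, 1 - eps))      # epsilon-adjusted transform
    model = LinearRegression().fit(X, z)
    pred = norm.cdf(model.predict(X))             # naive back-transform
    print(f"epsilon={eps:g}: in-sample R^2 on the LGD scale = {r2_score(lgd, pred):.3f}")
```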
Article
The empirical literature on credit risk has relied mostly on the corporate bond market to estimate losses in the event of default. The reason for this is that, as bank loans are private instruments, few data on loan losses are publicly available. The contribution of this paper is to apply mortality analysis to a unique set of micro-data on defaulted bank loans of a European bank. The empirical results relate to the timing of recoveries on bad and doubtful bank loans, the distribution of cumulative recovery rates, their economic determinants and the direct costs incurred by that bank on recoveries on bad and doubtful loans.
Article
Using data on defaulted firms in the United States over the period 1982–1999, we show that creditors of defaulted firms recover significantly lower amounts in present-value terms when the industry of defaulted firms is in distress. We investigate whether this is purely an economic-downturn effect or also a fire-sales effect along the lines of Shleifer and Vishny [1992. Liquidation values and debt capacity: a market equilibrium approach. Journal of Finance 47, 1343–1366]. We find the fire-sales effect to be also at work: Creditors recover less if the industry is in distress and non-defaulted firms in the industry are illiquid, particularly if the industry is characterized by assets that are specific, that is, not easily redeployable by other industries, and if the debt is collateralized by such specific assets. The interaction effect of industry-level distress and asset-specificity is strongest for senior unsecured creditors, is economically significant, and robust to contract-specific, firm-specific, macroeconomic, and bond-market supply effects. We also document that defaulted firms in distressed industries are more likely to emerge as restructured firms than to be acquired or liquidated, and spend longer time in bankruptcy.
Article
There are very few studies concerning the recovery rate of bank loans. Prediction models of recovery rates are increasing in importance because of the Basel II framework, their impact on credit risk management, and the calculation of loan rates. In this study, we focus our analyses on the distribution of recovery rates and the impact of the quota of collateral, the creditworthiness of the borrower, the size of the company and the intensity of the client relationship on the recovery rate. All of our hypotheses are confirmed. A higher quota of collateral leads to a higher recovery rate, whereas the risk premium of the borrower and the size of the company are negatively related to the recovery rate. Borrowers with an intense client relationship with the bank exhibit a higher recovery rate.
Article
The New Basel Accord will allow internationally active banking organizations to calculate their credit risk capital requirements using an internal ratings based (IRB) approach, subject to supervisory review. One of the modeling components is loss given default (LGD), the credit loss incurred if an obligor of the bank defaults. The flexibility to determine LGD values tailored to a bank's portfolio will likely be a motivation for a bank to move from the foundation to the advanced IRB approach. The appropriate degree of flexibility depends, of course, on what a bank knows about LGD broadly and about differentiated LGDs in particular; consequently, supervisors must be able to evaluate "what a bank knows." The key issues around LGD are: 1) What does LGD mean and what is its role in IRB? 2) How is LGD defined and measured? 3) What drives differences in LGD? 4) What approaches can be taken to model or estimate LGD? By surveying the academic and practitioner literature, with supportive examples and illustrations from public data sources, this paper is designed to provide basic answers to these questions. The factors that drive significant differences in LGD include place in the capital structure, presence and quality of collateral, industry, and timing of the business cycle.
Article
This paper proposes a dynamic model to estimate the credit loss distribution of the aggregate portfolio of loans granted in a banking system. We consider a sectoral approach distinguishing between corporates and households. The evolution of their default frequencies and the size of the loans portfolio are expressed as functions of macroeconomic conditions as well as unobservable credit risk factors, which capture contagion effects between sectors. In addition, we model the distributions of the Exposures at Default and the Losses Given Default. We apply our framework to the Spanish banking system, where we find that sectoral default frequencies are not only affected by economic cycles but also by a persistent latent factor. Finally, we identify the riskier sectors, perform stress tests and compare the relative risk of small and large institutions.
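As a much-simplified illustration of the final step of such a framework, the sketch below Monte Carlo simulates a portfolio loss distribution from per-loan PD, EAD and LGD inputs; the sectoral factors, macro dynamics and contagion effects that are central to the paper are omitted, and all parameter values are assumptions.

```python
# Minimal Monte Carlo sketch of a portfolio credit loss distribution built from
# per-loan PD, EAD and LGD inputs. Parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(7)
n_loans, n_sims = 2000, 10_000
pd_ = rng.uniform(0.005, 0.05, n_loans)          # per-loan default probabilities
ead = rng.lognormal(mean=11, sigma=1, size=n_loans)
lgd_a, lgd_b = 2.0, 3.0                          # beta-distributed LGD (assumption)

losses = np.empty(n_sims)
for s in range(n_sims):
    defaulted = rng.random(n_loans) < pd_
    lgd_draw = rng.beta(lgd_a, lgd_b, defaulted.sum())
    losses[s] = (ead[defaulted] * lgd_draw).sum()

print("expected loss:", losses.mean().round(0))
print("99.9% loss quantile (a VaR-style measure):", np.quantile(losses, 0.999).round(0))
```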
Article
This paper examines the role of relationship lending using a data set on small firm finance. The abilities to acquire private information over time about borrower quality and to use this information in designing debt contracts largely define the unique nature of commercial banking. Recently, a theoretical literature on relationship lending has appeared which provides predictions about how loan interest rates evolve over the course of a bank-borrower relationship. The study focuses on small, mostly untraded firms for which the bank-borrower relationship is likely to be important. The authors examine lending under lines of credit (L/Cs), because the L/C itself represents a formalization of the relationship and the data are thus more "relationship-driven." They also analyze the empirical association between relationship lending and the collateral decision. Using data from the National Survey of Small Business Finance, the authors find that borrowers with longer banking relationships pay a lower interest rate and are less likely to pledge collateral. Empirical results also suggest that banks accumulate increasing amounts of this private information over the duration of the bank-borrower relationship.
Article
Interest income is the most important source of revenue for most banks. The aim of this paper is to assess the impact of different interest rate scenarios on the banks' interest income. As we do not know the interest rate sensitivity of real banks, we construct for each bank a portfolio with a similar composition of its assets and liabilities, called 'tracking bank'. We evaluate the effect of 260 historical interest rate shocks on the tracking banks of German savings and cooperative banks. It turns out that a sharp decrease in the steepness of the yield curve has the most negative impact on the banks' interest income.
Article
The past twenty years have seen great theoretical and empirical advances in the field of corporate finance. Whereas once the subject addressed mainly the financing of corporations--equity, debt, and valuation--today it also embraces crucial issues of governance, liquidity, risk management, relationships between banks and corporations, and the macroeconomic impact of corporations. However, this progress has left in its wake a jumbled array of concepts and models that students are often hard put to make sense of. Here, one of the world's leading economists offers a lucid, unified, and comprehensive introduction to modern corporate finance theory. Jean Tirole builds his landmark book around a single model, using an incentive or contract theory approach. Filling a major gap in the field, The Theory of Corporate Finance is an indispensable resource for graduate and advanced undergraduate students as well as researchers of corporate finance, industrial organization, political economy, development, and macroeconomics. Tirole conveys the organizing principles that structure the analysis of today's key management and public policy issues, such as the reform of corporate governance and auditing; the role of private equity, financial markets, and takeovers; the efficient determination of leverage, dividends, liquidity, and risk management; and the design of managerial incentive packages. He weaves empirical studies into the book's theoretical analysis. And he places the corporation in its broader environment, both microeconomic and macroeconomic, and examines the two-way interaction between the corporate environment and institutions. Setting a new milestone in the field, The Theory of Corporate Finance will be the authoritative text for years to come.
Article
The authors explore the determinants of liquidation values of assets, particularly focusing on the potential buyers of assets. When a firm in financial distress needs to sell assets, its industry peers are likely to be experiencing problems themselves, leading to asset sales at prices below value in best use. Such illiquidity makes assets cheap in bad times and so, ex ante, is a significant private cost of leverage. The authors use this focus on asset buyers to explain variation in debt capacity across industries and over the business cycle, as well as the rise in U.S. corporate leverage in the 1980s.
Article
We develop a framework for modelling conditional loss distributions through the introduction of risk factor dynamics. Asset value changes of a credit portfolio are linked to a dynamic global macroeconometric model, allowing macro effects to be isolated from idiosyncratic shocks. Default probabilities are driven primarily by how firms are tied to business cycles, both domestic and foreign, and how business cycles are linked across countries. The model is able to control for firm-specific heterogeneity as well as generate multi-period forecasts of the entire loss distribution, conditional on specific macroeconomic scenarios.
Article
This paper empirically examines how ties between a firm and its creditors affect the availability and cost of funds to the firm. The authors analyze data collected in a survey of small firms by the Small Business Administration. The primary benefit of building close ties with an institutional creditor is that the availability of financing increases. The authors find smaller effects on the price of credit. Attempts to widen the circle of relationships by borrowing from multiple lenders increase the price and reduce the availability of credit. In sum, relationships are valuable and appear to operate more through quantities than through prices.
Article
Evidence from many countries in recent years suggests that collateral values and recovery rates (RRs) on corporate defaults can be volatile and, moreover, that they tend to go down just when the number of defaults goes up in economic downturns. This link between RRs and default rates has traditionally been neglected by credit risk models, as most of them focused on default risk and adopted static loss assumptions, treating the RR either as a constant parameter or as a stochastic variable independent from the probability of default (PD). This traditional focus on default analysis has been partly reversed by the recent significant increase in the number of studies dedicated to the subject of recovery‐rate estimation and the relationship between default and RRs. This paper presents a detailed review of the way credit risk models, developed during the last 30 years, treat the RR and, more specifically, its relationship with the PD of an obligor. Recent empirical evidence concerning this issue is also presented and discussed. (J.E.L.: G15, G21, G28).
Article
We document empirically the determinants of the observed recovery rates on defaulted securities in the United States over the period 1982–1999. The recovery rates are measured using the prices of defaulted securities at the time of default and at the time of emergence from default or from bankruptcy. In addition to seniority and security of the defaulted securities, industry conditions at the time of default are found to be robust and important determinants of the recovery rates. In particular, recovery in a distressed state of the industry (median annual stock return for the industry firms being less than -30%) is lower than the recovery in a healthy state of the industry by 10 to 20 cents on a dollar depending on the measure of recovery employed. The determinants of recovery rates appear to be different from the firm-specific determinants of default risk of the firm. Our results underscore the existence of substantial variability in recoveries, in the cross-section of securities as well as in the time-series, and suggest that in order to capture recovery risk, the credit risk models require an industry factor in addition to the factor representing the firm value.
Article
This paper develops a simple neoclassical model of the business cycle in which the condition of borrowers' balance sheets is a source of output dynamics. The mechanism is that higher borrower net worth reduces the agency costs of financing real capital investments. Business upturns improve net worth, lower agency costs, and increase investment, which amplifies the upturn; vice versa for downturns. Shocks that affect net worth (as in a debt-deflation) can initiate fluctuations.
A market based macro stress test for the corporate credit exposures of UK banks
  • M Drehmann
Drehmann, M. (2005, April). A market based macro stress test for the corporate credit exposures of UK banks. In BCBS seminar Banking and Financial Stability: Workshop on Applied Banking Research.
Corporate defaults and macroeconomic shocks: non-linearities and uncertainty
  • M Drehmann
  • A J Patton
  • S Sorensen
Drehmann, M., Patton, A. J., & Sorensen, S. (2006). Corporate defaults and macroeconomic shocks: non-linearities and uncertainty. Bank of England, mimeo.