Modeling data with multiple time dimensions

Article in Computational Statistics & Data Analysis 51(9):4761-4785 · February 2007
Abstract
A large class of problems in time series analysis can be represented by a set of overlapping time series with different starting times. These time series may be treated as different probes of the same underlying process. Such probes may follow a characteristic lifecycle as a function of the time since the series began. They may also be subject to environmental shocks according to calendar time. In addition, the calibration of each probe may be unknown such that each series may show a different magnitude of response to the underlying lifecycles and environmental impacts. This paper describes an approach to analyzing these multiple time series as a single set such that the underlying lifecycles and calendar-based shocks may be measured. Simultaneously, the individual calibrations of the time series are also measured. This technique is referred to as dual-time dynamics, and it applies to many important business problems. Applications to tree ring analysis, the SETI@home project, and retail loan portfolio forecasting are provided. Other areas of possible application include digital media services, insurance, human resource management, health care, and biological systems to name a few.
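As a concrete illustration of the setup in the abstract, the sketch below generates overlapping series, each starting at a different calendar month, driven by a shared lifecycle in age, a shared exogenous shock in calendar time, and an unknown per-series calibration. The functional forms, scales, and the scaled-additive structure are illustrative assumptions, not the paper's estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

n_vintages, n_months = 24, 60
lifecycle = lambda a: 0.02 * (1 - np.exp(-a / 12.0))       # maturation with age (assumed shape)
exogenous = lambda t: 0.005 * np.sin(2 * np.pi * t / 36.0)  # calendar-time shocks (assumed shape)

records = []
for v in range(n_vintages):                     # each series (vintage) starts at calendar month v
    calib = rng.lognormal(mean=0.0, sigma=0.3)  # unknown per-series calibration
    for a in range(n_months):
        t = v + a                               # calendar time = start month + age
        rate = calib * (lifecycle(a) + exogenous(t)) + rng.normal(0, 1e-4)
        records.append((v, a, t, rate))

data = np.array(records)                        # columns: vintage, age, calendar month, observed rate
```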

  • ... The DtD refers to the dual-time dynamics, adopted by Strategic Analytics Inc., for consumer behavior modeling and delinquency forecasting. It was brought to the public domain by Breeden (2007). Using our notations, the DtD model takes the ...
    ... and η⁻¹(·) corresponds to the so-called superposition function in Breeden (2007). ...
    ... Clearly, the DtD model involves the interactions between the vintage effects and (f(m), g(t)). Unlike the sequential GAMEED above, Breeden (2007) suggested to iteratively fit f(m), g(t) and γ ...
    Thesis
    Full-text available
    This research deals with some statistical modeling problems that are motivated by credit risk analysis. Credit risk modeling has been the subject of considerable research interest in finance and has recently drawn the attention of statistical researchers. In the first chapter, we provide an up-to-date review of credit risk models and demonstrate their close connection to survival analysis. The first statistical problem considered is the development of adaptive smoothing spline (AdaSS) for heterogeneously smooth function estimation. Two challenging issues that arise in this context are evaluation of reproducing kernel and determination of local penalty, for which we derive an explicit solution based on piecewise type of local adaptation. Our nonparametric AdaSS technique is capable of fitting a diverse set of `smooth' functions including possible jumps, and it plays a key role in subsequent work in the thesis. The second topic is the development of dual-time analytics for observations involving both lifetime and calendar timescale. It includes "vintage data analysis" (VDA) for continuous type of responses in the third chapter, and "dual-time survival analysis" (DtSA) in the fourth chapter. We propose a maturation-exogenous-vintage (MEV) decomposition strategy in order to understand the risk determinants in terms of self-maturation in lifetime, exogenous influence by macroeconomic conditions, and heterogeneity induced from vintage originations. The intrinsic identification problem is discussed for both VDA and DtSA. Specifically, we consider VDA under Gaussian process models, provide an efficient MEV backfitting algorithm and assess its performance with both simulation and real examples. DtSA on Lexis diagram is of particular importance in credit risk modeling where the default events could be triggered by both endogenous and exogenous hazards. We consider nonparametric estimators, first-passage-time parameterization and semiparametric Cox regression. These developments extend the family of models for both credit risk modeling and survival analysis. We demonstrate the application of DtSA to credit card and mortgage risk analysis in retail banking, and shed some light on understanding the ongoing credit crisis.
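The excerpt above notes that Breeden (2007) fits the maturation function f(m), the exogenous function g(t), and the vintage effects iteratively. A rough backfitting sketch of that idea on aggregated vintage data follows; the column names, the additive log-scale structure, and the simple mean-centering are assumptions for illustration, not the published algorithm.

```python
import pandas as pd

def backfit(df, n_iter=20):
    """df has columns: vintage, age, time, y (e.g. log of the observed rate)."""
    df = df.copy()
    f = pd.Series(0.0, index=sorted(df["age"].unique()))       # maturation f(m)
    g = pd.Series(0.0, index=sorted(df["time"].unique()))      # exogenous g(t)
    q = pd.Series(0.0, index=sorted(df["vintage"].unique()))   # vintage effects
    for _ in range(n_iter):
        resid = df["y"] - df["time"].map(g) - df["vintage"].map(q)
        f = resid.groupby(df["age"]).mean()                    # update f given g, q
        resid = df["y"] - df["age"].map(f) - df["vintage"].map(q)
        g = resid.groupby(df["time"]).mean()
        g -= g.mean()                                          # crude normalization only
        resid = df["y"] - df["age"].map(f) - df["time"].map(g)
        q = resid.groupby(df["vintage"]).mean()
        q -= q.mean()
    return f, g, q
```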
  • ... Stress testing is usually interpreted as predicting the future performance of a portfolio under the influence of a severe recession (Berkowitz, 2000). Many approaches have been proposed (Committee on the Global Financial System, April 2001; Lily and Hong, August 2004; Perli and Nayda, 2004; Breeden, 17 May 2007; Rösch and Scheule, 2007; Bellotti and Crook, 2009; Malik and Thomas, 2010; Breeden, 2009), but those that are dedicated to lending include a quantification of lifecycle, macroeconomic, and credit quality effects (Breeden, 17 May 2007; Bellotti and Crook, 2009; Malik and Thomas, 2008; Banerjee and Canals-Cerdá, June 2012). Although including all three of these factors is essential, it also introduces some important modelling challenges. ...
    ... Of the many models used for stress testing, successful models all have the common features of trying to capture the lifecycle, credit quality, and environmental impacts. These include APC models (Mason and Fienberg, 1985; Glenn, 2005), Dual-time Dynamics (Breeden, 17 May 2007), Survival or Proportional Hazard Functions (Cox and Oakes, 1984; Hosmer and Lemeshow, 1999; Therneau and Grambsch, 2000; Efron, 2002), and Panel Data Methods (Arellano, 2003; Hsiao, 2003). Differences arise in whether parametric or nonparametric functions are chosen, and whether the modelling is done at the account or vintage level, but the same dimensions are present within the models. ...
    Article
    The regulatory and business need to expand the use of macroeconomic-scenario-based forecasting and stress testing in retail lending has led to a rapid expansion in the types and complexity of models being applied. As these models become more sophisticated and include lifecycle, credit quality, and macroeconomic effects, model specification errors become a common, but rarely identified feature of many of these models. This problem was discovered decades ago in demography with Age-Period-Cohort (APC) models, and we bring those insights to the retail lending context with a detailed discussion of the implications here. Although the APC literature proves that no universal, data-driven solution is possible, we propose a domain-specific solution that is appropriate to lending. This solution is demonstrated with an auto loan portfolio. © 2016 Operational Research Society Ltd. All rights reserved. 0160-5682/16.
  • ... With such a model the dynamics of the behavioural score B_t is described by a Kth-order Markov chain where the transitions depend on economic variables and on the length of time the loan has been repaid. This last term does not occur in any corporate credit models but is of real importance in consumer lending (Breeden 2007, Stepanova and Thomas 2002)
    ... As is well known in consumer credit modeling (Breeden 2007, Stepanova and Thomas 2002), the age of the loan (the number of months since the account was opened) is an important factor in default risk. To investigate this we split age into seven segments, namely 0-6 months, 7-12 months, 13-18 months, 19-24 months, 25-36 months, 37-48 months, more than 48 months. ...
    Article
    Full-text available
    The corporate credit risk literature has many studies modelling the change in the credit risk of corporate bonds over time. There is far less analysis of the credit risk for portfolios of consumer loans. However behavioural scores, which are commonly calculated on a monthly basis by most consumer lenders are the analogues of ratings in corporate credit risk. Motivated by studies in corporate credit risk, we develop a Markov chain model based on behavioural scores to establish the credit risk of portfolios of consumer loans. However such a consumer credit model differs in many respects from corporate credit ones based on Markov chains – the need for a second order Markov chain, the inclusion of economic variables and the age of the loan. The model is applied using data on a credit card portfolio from a major UK bank.
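The model described in the preceding entry uses a second-order Markov chain of behavioural-score states whose transitions also depend on the age of the loan. A toy estimator of age-segmented, second-order transition counts might look like the following; the column names and state coding are hypothetical, and only the age bands are taken from the excerpt.

```python
import numpy as np
import pandas as pd

def transition_counts(panel):
    """panel columns: account, month, age_months, state (e.g. score band 0..K-1).
    Returns second-order transition counts keyed by (age_band, prev2, prev1, state)."""
    bands = [0, 6, 12, 18, 24, 36, 48, np.inf]          # age segments from the excerpt
    panel = panel.sort_values(["account", "month"]).copy()
    panel["prev1"] = panel.groupby("account")["state"].shift(1)
    panel["prev2"] = panel.groupby("account")["state"].shift(2)
    panel["age_band"] = pd.cut(panel["age_months"], bands, right=True, include_lowest=True)
    obs = panel.dropna(subset=["prev1", "prev2"])
    # normalizing these counts within (age_band, prev2, prev1) gives the transition matrix
    return obs.groupby(["age_band", "prev2", "prev1", "state"], observed=True).size()
```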
  • ... This provides an exploratory analysis which should help to identify a more parsimonious (and forward predictable) specification at the next stage. Breeden (2007) develops an approach to exactly this kind of nonparametric EMV decomposition, which he calls dual time dynamics because of the dual time scales (calendar-time and maturity) which are used to index the data in Table 1. This extends the approach beyond standard vintage analysis which typically does not explicitly model variability with calendar-time. ...
    ... We describe (4) as nonparametric, because the length of each of the parameter vectors β_M, β_E, and β_V depends on the size of the observed data rather than being fixed as a function of the model. Breeden (2007; equation (19)) also considers models of exactly this form, and as described in Section 2.1, they are exactly analogous to APC models used in other fields. As with any standard factorial model, there is a natural identifiability conflict between a factor effect and the scalar intercept (β_0), which is typically addressed by a single linear constraint on the factor effects (or at least on their estimators). ...
    ... Breeden (2007, Section 3.6) provides a good example of this, where the results of fitting a GAM are found not to closely match the E, M and V functions which were used to generate a set of test data. Although not discussed explicitly by Breeden (2007) this behaviour is largely due to the identifiability conflict, as it can be seen that much closer resolution with the generating model can be achieved by addition of the same linear function to the estimated M and V effects, with a balancing subtraction from the E effect, exactly as implied by (6). ...
    Article
    In this paper, we consider the problem of modelling historical data on retail credit portfolio performance, with a view to forecasting future performance, and facilitating strategic decision making. We consider a situation, common in practice, where accounts with common origination date (typically month) are aggregated into a single vintage for analysis, and the data for analysis consists of a time series of a univariate portfolio performance variable (for example, the proportion of defaulting accounts) for each vintage over successive time periods since origination. An invaluable management tool for understanding portfolio behaviour can be obtained by decomposing the data series nonparametrically into components of exogenous variability (E), maturity (time since origination; M) and vintage (V), referred to as an EMV model. For example, identification of a good macroeconomic model is the key to effective forecasting, particularly in applications such as stress testing, and identification of this can be facilitated by investigation of the macroeconomic component of an EMV decomposition. We show that care needs to be taken with such a decomposition, drawing parallels with the Age-Period-Cohort approach, common in demography, epidemiology and sociology. We develop a practical decomposition strategy, and illustrate our approach using data extracted from a credit card portfolio.
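The identifiability conflict discussed in the excerpts, where a linear trend can be moved between the maturity, vintage, and exogenous components because age = calendar time − vintage, can be verified numerically. The sketch below is a generic illustration of that algebra, not the paper's equation (6).

```python
import numpy as np

ages = np.arange(0, 36)
vintages = np.arange(0, 24)
A, V = np.meshgrid(ages, vintages, indexing="ij")
T = V + A                                   # calendar time = vintage + age

M = 0.1 * np.log1p(A)                       # maturity effect (illustrative)
Q = 0.02 * (V - V.mean())                   # vintage effect (illustrative)
E = 0.05 * np.sin(2 * np.pi * T / 12)       # exogenous effect (illustrative)
fit1 = M + Q + E

c = 0.01                                    # any linear trend
fit2 = (M + c * A) + (Q + c * V) + (E - c * T)   # shifted decomposition
assert np.allclose(fit1, fit2)              # identical fit, different component estimates
```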
  • ... However, they are less well designed for modeling continuous variables such as the exposure at default or loss given default, to name a couple common examples. Therefore, this paper will employ the approach of Dual-time Dynamics (DtD) [2]. DtD shares many of the same concepts with APC, but is generically applicable to the key rates encountered in retail lending. ...
    ... For richer data sets or coarser segmentations, cross-terms between quality and maturation or between quality and exogenous may be appropriate. The estimation process employed here is the same as described in [2], and has strong similarities to the iterative solution method of Generalized Additive Models [10]. For the initial decomposition, no macroeconomic or credit score factors are included. ...
    Article
    Stress testing has become an important topic in retail lending since the introduction of the new Basel II guidelines. The present work uses a scenario-based forecasting approach developed explicitly for retail lending in order to provide a suitable stress testing approach. We first decompose the historical vintage performance data into a maturation function of months-on-books, a quality function of vintage origination date, and an exogenous function of calendar date. In a second step, the exogenous function is modeled with macroeconomic data or factors representing portfolio management impacts. Stress tests are performed by extrapolating the exogenous function using externally provided scenarios for extreme macroeconomic events. The resulting scenario is combined with the known maturation and quality functions. This process is repeated for each of a key set of rates, such as default rate, exposure at default, and loss given default in the context of Basel II. These key rate forecasts are combined to create total portfolio forecasts and stress tests. This approach is demonstrated in an analysis of the US Mortgage markets.
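The two-step procedure in the abstract above, decomposing vintage data and then stressing the exogenous function, can be sketched as follows. The log link, array layout, and scenario handling are placeholder assumptions.

```python
import numpy as np

def stress_forecast(maturation, quality, exog_stressed, vintages, horizon):
    """maturation: array indexed by months-on-book (length >= horizon);
    quality: dict vintage -> level; exog_stressed: array indexed by calendar month
    under the stress scenario (must cover start + horizon); vintages: dict vintage -> start month."""
    forecasts = {}
    for v, start in vintages.items():
        ages = np.arange(horizon)
        months = start + ages
        log_rate = maturation[ages] + quality[v] + exog_stressed[months]
        forecasts[v] = np.exp(log_rate)          # e.g. monthly default rate per vintage
    return forecasts
```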
  • ... The first is the need to validate the probability of default predictions that the scorecard makes rather than the relative ranking of the borrowers, which was what is important in deciding which applicants for credit to accept. So one needs to be confident in the translation of score to probability of default and to use the standard chi square and normal distribution type tests to validate the model by backtesting to compare actual numbers of defaults with predicted ones. ... of consumer loans which can then be used for stress testing (Breeden 2007, Breeden et al 2008b, Rosch and Schuele 2008, Malik and Thomas 2009). Similarly the fourth issue that the Basel Accord has highlighted, the need to model the recovery rate RR (or alternatively the loss given default LGD, where RR=1-LGD) of what percentage of a defaulted loan will subsequently be recovered is also so important that it deserves to be considered as a separate challenge (Challenge 8). ...
    ... can then use such a model to estimate portfolio level default rates. The types of models developed so far include reputation based models (Andrade and Thomas 2007), dual time dynamics (Breeden 2007, Breeden and Thomas 2008a), survival analysis (Bellotti and Crook 2009b, Malik and Thomas), and correlation models with added economic variables (Rosch and Scheule 2003). Given the amount of research that has gone into corporate credit risk models, one suspects that there will be considerably more research into these consumer equivalents given the realisation by bankers now of how much more is being lent to households than to companies. ...
    Article
    Consumer finance has become one of the most important areas of banking, both because of the amount of money being lent and the impact of such credit on global economy and the realisation that the credit crunch of 2008 was partly due to incorrect modelling of the risks in such lending. This paper reviews the development of credit scoring—the way of assessing risk in consumer finance—and what is meant by a credit score. It then outlines 10 challenges for Operational Research to support modelling in consumer finance. Some of these involve developing more robust risk assessment systems, whereas others are to expand the use of such modelling to deal with the current objectives of lenders and the new decisions they have to make in consumer finance.
  • ... However, they are less well designed for modeling continuous variables such as the exposure at default (EAD) or loss given default (LGD), to name a couple of common examples. Therefore, in this paper we employ the approach of dual-time dynamics (DtD) (Breeden (2007)). DtD shares many of the same concepts with APC, but is generically applicable to the key rates encountered in retail lending. ...
    ... For richer datasets or coarser segmentations, cross- terms between quality and maturation, or between quality and exogenous, may be appropriate. The estimation process employed here is the same as described by Breeden (2007), and has strong similarities to the iterative solution method of generalized additive models (Hastie and Tibshirani (1990)). For the initial decomposition, no macroeconomic or credit score factors are included. ...
  • ... However, they are less well designed for modeling continuous variables such as the exposure at default or loss given default, to name a couple common examples. Therefore, this paper will employ the approach of Dual-time Dynamics (DtD) [2]. DtD shares many of the same concepts with APC, but is generically applicable to the key rates encountered in retail lending. ...
    ... For richer data sets or coarser segmentations, cross-terms between quality and maturation or between quality and exogenous may be appropriate. The estimation process employed here is the same as described in [2], and has strong similarities to the iterative solution method of Generalized Additive Models [10]. For the initial decomposition, no macroeconomic or credit score factors are included. ...
    Article
    In this article, we collect consumer delinquency data from several economic shocks in order to study the creation of stress test models. We leverage the Dual-time Dynamics modeling technique to better isolate macroeconomic impacts whenever vintage-level performance data is available. The stress test models follow a framework described here of focusing on consumer-centric macroeconomic variables so that the models are as robust as possible when predicting the impacts of future shocks. We consider the Mexican Peso Crisis / Tequila Effect by examining Argentina; Asian Economic Crisis by considering Thailand, Indonesia, and Singapore; the Hong Kong SARS recession; and the relative lack of recessions in recent data from Canada and Australia.
  • ... However, they are less well designed for modeling continuous variables such as the exposure at default or loss given default, to name a couple common examples. Therefore, this paper will employ the approach of Dual-time Dynamics (DtD) [2]. DtD shares many of the same concepts with APC, but is generically applicable to the key rates encountered in retail lending. ...
    ... For richer data sets or coarser segmentations, cross-terms between quality and maturation or between quality and exogenous may be appropriate. The estimation process employed here is the same as described in [2], and has strong similarities to the iterative solution method of Generalized Additive Models [10]. For the initial decomposition, no macroeconomic or credit score factors are included. ...
    Article
    Stress-testing has become an important topic in retail lending since the introduction of the new Basel II guidelines. Here we use a scenario-based forecasting approach developed explicitly for retail lending in order to provide a suitable stress-testing approach. We first decompose the historical vintage performance data into a maturation function of months-on-books, a quality function of vintage origination date and an exogenous function of calendar date. In a second step, the exogenous function is modeled with macroeconomic data or factors representing portfolio management impacts. Stress tests are performed by extrapolating the exogenous function using externally provided scenarios for extreme macroeconomic events. The resulting scenario is combined with the known maturation and quality functions. This process is repeated for each member of a key set of rates, such as default rate, exposure at default and loss given default in the context of Basel II. These key rate forecasts are combined to create total portfolio forecasts and stress tests. This approach is demonstrated in an analysis of the US mortgage markets.
  • ... These are uniquely appropriate for capturing the dynamics of retail portfolios (Breeden, 2010). This class includes dual-time dynamics (Breeden, 2007), survival and proportional hazards models (Hosmer & Lemeshow, 1999; Lawless, 2003), age period cohort (APC) models (Glenn, 2005), and panel data methods (Frees, 2004). Of these, articles have been published on the use of dual-time dynamics (Breeden, 2007, 2009) and survival models (Bellotti & Crook, 2007; Malik & Thomas, 2008; Stepanova & Thomas, 2001) for producing forecasts and stress tests of retail loan portfolios. ...
    Article
    Full-text available
    Problems in the US mortgage industry have shown weaknesses in the standard regulatory and economic capital approaches. Although a significant amount of discussion is occurring around how to segment portfolios or predict key variables in order to better fit the existing formulas, we believe that a re-examination of existing capital formulas with respect to credit risk is required. In this paper we develop a formula which is specifically tuned to the dynamics of retail loan portfolios and which could be employed for either regulatory capital or economic capital. The key advantages of this approach are that it is based upon a much more accurate model of retail loan defaults, does not require any new data feeds, is based upon readily available modeling frameworks, and can adapt to portfolio changes such as those observed in the US mortgage crisis.
  • ... These are uniquely appropriate for capturing the dynamics of retail portfolios (Breeden, 2010). This class includes dual-time dynamics (Breeden, 2007), survival and proportional hazards models (Hosmer & Lemeshow, 1999; Lawless, 2003), age period cohort (APC) models (Glenn, 2005), and panel data methods (Frees, 2004). Of these, articles have been published on the use of dual-time dynamics (Breeden, 2007, 2009) and survival models (Bellotti & Crook, 2007; Malik & Thomas, 2008; Stepanova & Thomas, 2001) for producing forecasts and stress tests of retail loan portfolios. ...
    Article
    Lifetime loan forecasting has become essential to lender risk management and profitability. Loan-pricing models require forecasts over the life of the loan. Current expected credit loss (CECL) calculations proposed by the US Financial Accounting Standards Board (FASB) (2012) and included in International Financial Reporting Standard 9 (IFRS9) require lifetime forecasts. In both cases, we cannot create forecasts that assume the current or historic environment persists for many years into the future. Instead, a more reasonable approach is to use macroeconomic scenarios for the near term and then relax onto the long-run average for future years. In the current paper, we develop a modeling framework that can incorporate mean-reverting scenarios into any scenario-based forecasting model. Using prior economic conditions, we create an environmental index with which to calibrate a discrete version of an Ornstein-Uhlenbeck (OU) mean-reverting model. OU models are best applied to stationary processes, which is true for the environment function derived from age-period-cohort-type (APC-type) models. The mean-reverting model is used to transition from the near-term macroeconomic scenario to the long-run average to provide stable lifetime estimates for long-duration loans. We demonstrate this framework with a loan-level forecasting model using an age-vintage-time structure for retail loans, in this case, a small auto loan portfolio. The loan-level age-vintage-time model is similar in structure to an APC model, but it is estimated at the loan-level for greater robustness on small portfolios. The environment function of time is correlated to macroeconomic factors, and it is then extrapolated backward in time before the performance data to stabilize the trend of the environment function. This framework is in line with the explicit goals of the new FASB loan-loss accounting guidelines. In addition, this model provides a simple mechanism to facilitate the transition between point-in-time and through-the-cycle economic capital estimates with an internally consistent model.
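The mean-reverting transition described in the abstract, from a near-term macroeconomic scenario back to the long-run average, uses a discrete Ornstein-Uhlenbeck model. A minimal simulation sketch follows; the parameter values and the environment-index scale are arbitrary.

```python
import numpy as np

def ou_extension(last_value, long_run_mean, theta, sigma, n_steps, n_paths, seed=0):
    """Discrete OU (AR(1)) extension: x[t+1] = x[t] + theta*(mu - x[t]) + sigma*eps."""
    rng = np.random.default_rng(seed)
    paths = np.empty((n_paths, n_steps))
    x = np.full(n_paths, float(last_value))
    for t in range(n_steps):
        x = x + theta * (long_run_mean - x) + sigma * rng.standard_normal(n_paths)
        paths[:, t] = x
    return paths

# e.g. relax a stressed environment index of -2.0 back toward 0 over 10 years of months
scenarios = ou_extension(-2.0, 0.0, theta=0.05, sigma=0.1, n_steps=120, n_paths=1000)
```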
  • ... Whittaker et al (2007) update a scorecard over time by using a Kalman filter that monitors and extrapolates time-dependent weights of evidence in a borrower's score. Breeden (2007, 2009) proposes a model structure in which expected default rate of a portfolio cohort is assumed proportional to the product of three (exponential) factors, one of which depends on vintage age, the second on clock time and the third on a risk quality measure for the particular vintage. Surprisingly, the identical mathematical structure is claimed for individual borrower account default probabilities and hazard rates, with individual risk scores substituting for the portfolio quality measure. ...
    Article
    Full-text available
    This paper proposes a proportional odds model to combine systemic and non-systemic risk for prediction of default and prepay performance in cohorts of booked loan accounts. We assume that performance odds is proportional to two independent factors, one based on age-dependent systemic, possibly external, global disruptions to a cohort of individual accounts, the other on traditional non-systemic information odds based on demographic, behavioural and financial payment patterns of the individual accounts. A proportional odds model provides a natural formulation that can combine hazard rate predictions of baseline defaults, prepayments and active accounts with traditional non-systemic risk scores of individuals within the cohort. Theoretical comparisons with proportional hazard models are illustrated. Although our model is developed in terms of Good/Bad performance, it can include late payments, prepayments, defaults, as well as responses to offers and other classifications. We make 60-month default and prepay forecasts under two different systemic risk scenarios for a portfolio of Alt A mortgages with 24-month ‘teaser rates’ originated in 2004.
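The proportional odds combination described in the abstract, where cohort-level systemic odds and individual non-systemic information odds multiply, can be sketched as below. The numbers and function name are purely illustrative.

```python
def combined_good_odds(baseline_odds_age, systemic_multiplier, individual_odds):
    """Good:Bad odds assumed proportional to an age-dependent systemic factor
    and an individual information-odds term from the account-level score."""
    return baseline_odds_age * systemic_multiplier * individual_odds

odds = combined_good_odds(baseline_odds_age=20.0,   # cohort odds at this account age
                          systemic_multiplier=0.6,  # stressed scenario (hypothetical)
                          individual_odds=3.0)      # from the behavioural score (hypothetical)
prob_good = odds / (1.0 + odds)
```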
  • ... In terms of identifying an external time-dependent factor, the TDPH model is similar to the dual-time dynamics (DtD) model of Breeden (2007), among others. The DtD approach represents the collective effect of external time-dependent factors as an additional additive term in a generalized additive model (GAM). ...
    Article
    Full-text available
    In the consumer credit industry, assessment of default risk is critically important for the financial health of both the lender and the borrower. Methods for predicting risk for an applicant using credit bureau and application data, typically based on logistic regression or survival analysis, are universally employed by credit card companies. Because of the manner in which the predictive models are fit using large historical sets of existing customer data that extend over many years, default trends, anomalies, and other temporal phenomena that result from dynamic economic conditions are not brought to light. We introduce a modification of the proportional hazards survival model that includes a time-dependency mechanism for capturing temporal phenomena, and we develop a maximum likelihood algorithm for fitting the model. Using a very large, real data set, we demonstrate that incorporating the time dependency can provide more accurate risk scoring, as well as important insight into dynamic market effects that can inform and enhance related decision making.
  • ... One reason for this is the lack of economic variables in the models. This could be addressed by using the dual time approach of Breeden (2007) which looks at vintage and maturity of the debt as well as economic conditions or by directly including economic variables into the regression (Bellotti & Crook, 2012). Another reason is that the LGD distribution is far from normal and so regression approaches do not work without major modifications. ...
    Article
    Full-text available
    One approach to modelling Loss Given Default (LGD), the percentage of the defaulted amount of a loan that a lender will eventually lose is to model the collections process. This is particularly relevant for unsecured consumer loans where LGD depends both on a defaulter's ability and willingness to repay and the lender's collection strategy. When repaying such defaulted loans, defaulters tend to oscillate between repayment sequences where the borrower is repaying every period and non-repayment sequences where the borrower is not repaying in any period. This paper develops two models - one a Markov chain approach and the other a hazard rate approach to model such payment patterns of debtors. It also looks at simplifications of the models where one assumes that after a few repayment and non-repayment sequences the parameters of the model are fixed for the remaining payment and non-payment sequences. One advantage of these approaches is that they show the impact of different write-off strategies. The models are applied to a real case study and the LGD for that portfolio is calculated under different write-off strategies and compared with the actual LGD results.
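The alternating repayment and non-repayment sequences described in the abstract can be approximated with a two-state Markov chain, and the impact of a write-off rule on LGD can then be simulated. The transition probabilities, payment amount, and write-off trigger below are made-up values for illustration, not the paper's fitted model.

```python
import numpy as np

def simulate_mean_lgd(p_stay_pay=0.8, p_stay_nonpay=0.7, pay_amount=50.0,
                      debt=1000.0, writeoff_after=6, horizon=120, n_sims=10_000, seed=1):
    """Two-state chain: 0 = repaying, 1 = not repaying. Write off after
    `writeoff_after` consecutive non-payment months. Returns the mean LGD."""
    rng = np.random.default_rng(seed)
    losses = np.empty(n_sims)
    for i in range(n_sims):
        state, recovered, run_nonpay = 1, 0.0, 0      # defaulters start in non-payment
        for _ in range(horizon):
            if state == 0:
                recovered += pay_amount
                run_nonpay = 0
                state = 0 if rng.random() < p_stay_pay else 1
            else:
                run_nonpay += 1
                if run_nonpay >= writeoff_after:      # write-off strategy trigger
                    break
                state = 1 if rng.random() < p_stay_nonpay else 0
            if recovered >= debt:
                break
        losses[i] = max(debt - recovered, 0.0) / debt
    return losses.mean()
```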
  • ... Test 3: Bootstrap Validation. DtD as described in [6] uses a hybrid nonlinear / non-parametric, iterative estimation process. This complexity means that the properties of the estimator cannot be solved in closed form. ...
    Article
    Full-text available
    Monte Carlo simulation is a common method for studying the volatility of market traded instruments. It is less employed in retail lending, because of the inherent nonlinearities in consumer behaviour. In this paper, we use the approach of Dual-time Dynamics to separate loan performance dynamics into three components: a maturation function of months-on-books, an exogenous function of calendar date, and a quality function of vintage origination date. The exogenous function captures the impacts from the macroeconomic environment. Therefore, we want to generate scenarios for the possible futures of these environmental impacts. To generate such scenarios, we must go beyond the random walk methods most commonly applied in the analysis of market-traded instruments. Retail portfolios exhibit autocorrelation structure and variance growth with time that requires more complex modelling. This paper is aimed at practical application and describes work using ARMA and ARIMA models for scenario generation, rules for selecting the correct model form given the input data, and validation methods on the scenario generation. We find when the goal is capturing the future volatility via Monte Carlo scenario generation, that model selection does not follow the same rules as for forecasting. Consequently, tests more appropriate to reproducing volatility are proposed, which assure that distributions of scenarios have the proper statistical characteristics. These results are supported by studies of the variance growth properties of macroeconomic variables and theoretical calculations of the variance growth properties of various models. We also provide studies on historical data showing the impact of training length on model accuracy and the existence of differences between macroeconomic epochs. Journal of the Operational Research Society (2010) 61, 399-410. doi: 10.1057/jors.2009.105 Published online 14 October 2009
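The ARMA/ARIMA scenario generation described in the abstract might be prototyped with statsmodels as below; the series is a stand-in for the estimated exogenous function, and the model order is a placeholder rather than the paper's selection rules.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
history = np.cumsum(rng.normal(0, 0.1, size=120))   # stand-in for the fitted exogenous function

result = ARIMA(history, order=(1, 1, 1)).fit()

np.random.seed(42)          # simulate() draws from NumPy's global RNG in this sketch
horizon, n_paths = 36, 500
scenarios = np.stack([result.simulate(horizon, anchor="end") for _ in range(n_paths)])
# each row is one possible future path of the exogenous function for Monte Carlo forecasting
```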
  • ... As a simple example of this approach, Figure 3 shows the modeling residuals from a 6 1/2-year data set of 30-year fixed-rate non-conforming (subprime, jumbo, or low documentation) US mortgages. A Dual-time Dynamics model [3,7,6] was employed which models vintage-level aggregate performance data to explicitly estimate three nonparametric functions: a lifecycle function with months-on-books, a; an exogenous function of calendar date, t; and a quality function with vintage origination date, v. The test was on modeling probability of default. ...
    Article
    As part of the increased regulatory scrutiny of retail lending models, analysts are routinely being asked to show that their models are complete, ie, that they have captured all the structure present in the data. Rather than create increasingly complex models, we consider tests on model residuals to determine whether more structure remains to be modeled. To test for residual structure in one dimension, we discuss using the usual Durbin-Watson and Ljung-Box tests, but consider applying them along other dimensions besides calendar time. Then we expand this correlation-test concept to multiple dimensions in order to test for the presence of cross-terms in the residuals. Null hypotheses can be created by randomizing and reanalyzing the residuals. As an example, these methods are applied to the residuals of a dual-time dynamics model of US mortgage defaults. In practical applications, when residual structure is found, the analyst can then make an informed decision about whether the amount of residual structure is sufficient to warrant further modeling. Simple segmentation is often sufficient to capture the structure.
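The one-dimensional residual tests mentioned above (Durbin-Watson and Ljung-Box applied along calendar time, months-on-book, or vintage) could be run as in this sketch; the residual data frame layout is an assumption.

```python
from statsmodels.stats.stattools import durbin_watson
from statsmodels.stats.diagnostic import acorr_ljungbox

def residual_tests(resid_df, dim):
    """resid_df has columns: age, time, vintage, resid.
    Averages residuals along `dim` and tests the resulting series for structure."""
    series = resid_df.groupby(dim)["resid"].mean().sort_index()
    dw = durbin_watson(series.values)          # near 2 means little lag-1 correlation
    lb = acorr_ljungbox(series.values, lags=[12])
    return dw, lb

# e.g. test along months-on-book rather than calendar time:
# dw_stat, lb_table = residual_tests(residuals, dim="age")
```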
  • ... In cases where vintage-level performance data is available, we can leverage the method of Dual-time Dynamics [2] to remove the impacts of the natural maturing of the portfolio and changes in the marketing plan. DtD studies the rate of events occurring in aggregate rather than whether individual events such as default or early repayment occur at an account level. ...
  • ... Vintage models naturally capture the timing of losses and attrition versus age of the loan, and therefore are an obvious choice for lifetime loss calculations. An Age-Period-Cohort approach is commonly used to estimate such models [19,21,4,8]. Using rates for probability of default (PD), exposure at default (EAD), loss given default (LGD), and probability of attrition (PA), monthly loss forecasts are created and aggregated to a lifetime loss estimate. ...
    Preprint
    Full-text available
    The new guidelines for loan loss reserves, CECL (Current Expected Credit Loss), were initially proposed so that lenders' loss reserves would be forward-looking. Under the previous guidance, loss reserves might only increase after a crisis had already ravaged a lender's portfolio. The goal of being forward-looking was for lenders to accumulate reserves in advance of the crisis, to be prepared. As the industry starts to implement CECL, questions have been raised about whether CECL will be forward-looking enough. Some preliminary studies have suggested that CECL could be procyclical, meaning that loss reserves would peak at the peak of a crisis. Although better than seeing failure only after it has happened, being required to raise liquidity at the peak of a crisis could still fail to save the lender from collapse, or even facilitate it. However, CECL is built on models, and the modeling details are important. These preliminary studies appeared to correlate losses directly to macroeconomic factors, equivalent to the time series models that we tested in Living with CECL: Mortgage Modeling Alternatives. As we saw in that study, time series models fail to capture the credit cycle and thus are among the least accurate models in the study. Our recent blog post (Breeden, 8 Aug 2018) showed that the credit cycle is very strong in mortgage and always leads the economic cycle. Earlier research showed that the credit cycle is partly driven by underwriting practices, but also strongly influenced by consumer loan demand (Breeden, J.L. and J.J. Canals-Cerdá, 2018). That is why Age-Period-Cohort (vintage) and survival models were the winners on long-range accuracy. We did not, however, test for procyclicality. The study tests a range of models for procyclicality. The scenarios were obtained by purchasing reports from Consensus Economics published in the month preceding each quarter's forecast, so these CECL estimates use the real economic assumptions available at the time.
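The aggregation of monthly PD, EAD, LGD, and attrition rates into a lifetime loss estimate, referenced in the excerpt above, can be sketched as follows. The survival-weighting of monthly rates is one plausible bookkeeping choice, not a prescribed CECL formula.

```python
import numpy as np

def lifetime_expected_loss(pd_m, pa_m, ead_m, lgd_m):
    """pd_m, pa_m: monthly default / attrition probabilities; ead_m: exposure at
    default per month; lgd_m: loss given default per month. All arrays share length."""
    surv = np.cumprod(np.insert(1.0 - pd_m - pa_m, 0, 1.0))[:-1]   # share still on book
    monthly_loss = surv * pd_m * ead_m * lgd_m
    return monthly_loss.sum()

# toy 36-month example with flat rates
pd_m = np.full(36, 0.004); pa_m = np.full(36, 0.02)
ead_m = np.full(36, 10_000.0); lgd_m = np.full(36, 0.85)
lel = lifetime_expected_loss(pd_m, pa_m, ead_m, lgd_m)
```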
  • Article
    Motivated by a real problem, this study aims to develop models to conduct stress testing on credit card portfolios. Two modelling approaches were extended to include the impact of lenders’ actions within the model. The first approach was a regression model of the aggregate losses based on economic variables with autocorrelations of the errors. The second approach was a set of vintage-level models that highlighted the months-on-book effect on credit losses. A case study using the models was described using South African credit card data. In this case, the models were used to stress test the credit card portfolio under several economic scenarios.
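The first approach in the abstract above, a regression of aggregate losses on economic variables with autocorrelated errors, could be prototyped with statsmodels GLSAR; the synthetic variables below are placeholders.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 96
unemployment = rng.normal(6.0, 1.0, n)
rate_change = rng.normal(0.0, 0.5, n)
X = sm.add_constant(np.column_stack([unemployment, rate_change]))
losses = 0.5 + 0.3 * unemployment - 0.1 * rate_change + rng.normal(0, 0.2, n)

model = sm.GLSAR(losses, X, rho=1)          # regression with AR(1) errors
results = model.iterative_fit(maxiter=8)    # alternates between rho and coefficient estimates
print(results.params, model.rho)
```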
  • Article
    Statistical Theories and Methods with Applications to Economics and Business highlights recent advances in statistical theory and methods that benefit econometric practice. It deals with exploratory data analysis, a prerequisite to statistical modelling and part of data mining, and provides recently developed computational tools useful for data mining, analysing the reasons to do data mining and the best techniques to use in a given situation. It also provides a detailed description of computer algorithms and features examples with real-life data, with accompanying software featuring DASC (Data Analysis and Statistical Computing). Essential reading for practitioners in any area of econometrics, business analysts involved in economics and management, and graduate students and researchers in economics and statistics.
  • Article
    The corporate credit risk literature has many studies modelling the change in the credit risk of corporate bonds over time. There is far less analysis of the credit risk for portfolios of consumer loans. However behavioural scores, which are commonly calculated on a monthly basis by most consumer lenders are the analogues of ratings in corporate credit risk. Motivated by studies in corporate credit risk, we develop a Markov chain model based on behavioural scores to establish the credit risk of portfolios of consumer loans. We motivate the different aspects of the model – the need for a second order Markov chain, the inclusion of economic variables and the age of the loan – using data on a credit card portfolio from a major UK bank.
  • Conference Paper
    Intelligent Transportation Systems produce a new kind of complicated time series data, such as traffic flow, average speed, or other traffic condition information for the same time period. These data are useful and important for the traffic system, including traffic flow prediction, tendency analysis, and clustering. With the development of time series analysis models and their applications, it is important to focus on how to extract useful, real-time traffic information from the Intelligent Transportation System. Building such models for the Intelligent Transportation System is a way to address the traffic prediction problem and to control a massive traffic network.
  • Bad debt rising. Federal Reserve Bank of New York
    • D P Morgan
    • I Toll
    Morgan, D.P., Toll, I., 1997. Bad debt rising. Federal Reserve Bank of New York. Curr. Issues Econom. Finance 3 (4), 1-5.
  • Detection of Climate Signal in Dendrochronological Data Analysis: A Comparison of Tree-Ring Standardization Methods
    • S Helama
    • M Lindholm
    • M Timonen
    • M Eronen
    Helama, S., Lindholm, M., Timonen, M., Eronen, M., 2004. Detection of Climate Signal in Dendrochronological Data Analysis: A Comparison of Tree-Ring Standardization Methods. Springer, Berlin.
  • The Customer Value Imperative: Creating Shareholder Value Through Consumer Credit Portfolio Management: An Industry Best Practices Report
    • Risk Management Association
    • Oliver, Wyman & Company, LLC
    Risk Management Association, Oliver, Wyman & Company, LLC, 1999. The Customer Value Imperative. Creating Shareholder Value Through Consumer Credit Portfolio Management: An Industry Best Practices Report, RMA.
  • Search for Extra-Terrestrial Intelligence
    SETI@home. http://setiathome.ssl.berkeley.edu/.
  • SW United States drought: tree-ring perspectives, In: The Physical Basis of Climate Change
    • M K Hughes
    • N Graham
    • D M Meko
    • F Ni
    • G Funkhouser
    Hughes, M.K., Graham, N., Meko, D.M., Ni, F., Funkhouser, G., 2003. SW United States drought: tree-ring perspectives, In: The Physical Basis of Climate Change, Drought Implications Workshop, Intergovernmental Panel on Climate Change, Working Group 1.
  • Taking the risk out of retail credit modelling
    • U Wolf
    Wolf, U., 2001. Taking the risk out of retail credit modelling. Technical Report, ERisk.
  • World Data Center for Paleoclimatology
    World Data Center for Paleoclimatology. http://www.ncdc.noaa.gov/paleo/data.html.
  • Synoptic dendroclimatology: overview and prospectus
    • K Hirschboeck
    • F Ni
    • M Wood
    • C Woodhouse
    Hirschboeck, K., Ni, F., Wood, M., Woodhouse, C., 1996. Synoptic dendroclimatology: overview and prospectus. In: Dean, J., Meko, D., Swetnam, T. (Eds.), Tree-Rings, Environment and Humanity: Proceedings of the International Conference, Tucson, Arizona, 17-21 May 1994, pp. 205-223.
  • Becoming a better vintner
    • J L Breeden
    Breeden, J.L., 2002. Becoming a better vintner. RMA J. 9, 25-31.
  • Cultural interaction in the prehistoric Southwest
    • L S Cordell
    • G J Gumerman
    Cordell, L.S., Gumerman, G.J., 1989. Cultural interaction in the prehistoric Southwest. In: Cordell, L.S., Gumerman, G.J. (Eds.), Dynamics of Southwest Prehistory. Smithsonian Institute Press, Washington, DC, pp. 1-17.
  • Identification of parameters in distributed systems
    • E R Goodson
    • M P Polis
    Goodson, E.R., Polis, M.P., 1978. Identification of parameters in distributed systems. In: Ray, W.H., Lainiotis, D. (Eds.), Distributed Parameter Systems: Identification, Estimation, and Control. Control and Systems Theory, vol. 6. Marcel Dekker, New York.
  • Chapter
    The North American Southwest extends from southeastern Utah and southwestern Colorado into Chihuahua and Sonora, and from central New Mexico to the Grand Canyon and the lower Colorado River (Fig. 1). As delineated, this geographically heterogeneous area is united by an arid to semiarid climate, a condition that has had major impact on cultural manifestations. The archaeological record extends some 11,000 years and encompasses ways of life characterized by highly mobile hunting and gathering, semisedentary and sedentary horticulture, and following the introduction of domestic livestock by Europeans, economies of mixed herding and horticulture. The diversity of lifeways pursued over time, and at any one time, has led anthropologists to debate whether the Southwest is best described as a single culture area, as more than one culture area, or as a regional zone of cultural interaction (see Daifuku 1952; Kirchoff 1954; Kroeber 1939; Martin and Rinaldo 1951). Some scholars have employed a variety of trait lists in support of the general cultural unity of the Southwest (see Jennings and Reed 1956; Rouse 1962). We also favor emphasizing the essential unity of the Southwest, although our view focuses on the dynamics of cultural interactions over time in the area. The unity is reflected in the consistent synchroneity of changes described in the framework of southwestern culture history that emerged from the conference and is described below. This framework also indicates that the greatest similarities in the material remains of the archaeological record of the area occur at the start of the time period under consideration, a time when both horticulture and sedentism began to set the Southwest apart from two adjacent regions, the Great Plains and the Great Basin, where hunting and gathering continued to be the primary mode of subsistence. Despite increased regional variation over time in the Southwest, the complex and changing web of cultural interactions within the area becomes a hallmark of its unity and overshadows interactions with neighboring regions.
    The Southwest enjoys a unique position in American archaeology. Due to the exceptional preservation of archaeological remains, the unmatched precision of temporal control and palaeoenvironmental data for the prehistoric periods, and the continued existence in the area of vital American Indian cultures, the Southwest is often seen as a natural laboratory appropriate for evaluating archaeological method and theories of cultural development and change. With the recent phenomenal growth of public archaeology in the western United States, the pace and scale of southwestern archaeological research has greatly accelerated. The vast quantity of literature of the 1970s and 1980s is difficult for even area specialists to control. Both the general interest in the prehistory of the Southwest and the avalanche of recent information suggested to the conference participants that this volume would be of interest to a broad community of scholars. The Southwest has long been known for its archaeological conferences. The first, and still most famous, was the Pecos Conference of 1927. A. V. Kidder invited scholars to Pecos Pueblo in order to devise a scheme that would reflect the broad outlines of all southwestern prehistoric development, and to resolve issues of nomenclature and terminology (Kidder 1927). The Pecos Conference is now an annual event held at the end of the summer field season and therefore serves primarily as a forum for discussion of current fieldwork with some topically oriented sessions. In addition to the Pecos Conference with its pan-southwestern focus, the recent volume of data recovery has been so great that there are now biannual conferences devoted to the Mogollon (e.g., Benson and Upham 1986), the Hohokam (e.g., Dittert and Dove 1985a, 1985b) and the Anasazi (e.g., Smith 1983). In January of 1988, the first in what may become a regularly scheduled pan-southwestern conference devoted to topical syntheses was held in Tempe, Arizona. Over the past decade, there have also been topically oriented conferences that have brought together specialists from diverse disciplines in order to address a particular problem area. These have included conferences aimed at developing detailed palaeoenvironmental reconstructions and relating these to prehistoric settlement and technological changes (e.g., Dean et al. 1985; Gumerman 1988) and conferences on specific topics in cultural resource management (e.g., Cordell and Green 1983; Green and Plog 1983; Plog and Wait 1982). Of a slightly different nature are conferences that bring southwestern archaeologists together to address broader thematic issues. The 1955 Seminars in Archaeology (Jennings and Reed 1956) focused on determining the extra-southwestern origins of various southwestern culture traits. The School of American Research seminar on prehistoric Pueblo social organization (Longacre 1970) was more restricted geographically, examining only those southwestern societies that could be construed as Pueblo. The seminar grew out of early processual archaeology in the attempt to reconstruct nonmaterial aspects of prehistoric culture and had an impact far beyond the scholarship of the Southwest. The aims of the current seminar were, in some respects, broader than either the 1955 seminar or the seminar on prehistoric social organization. The conference was concerned, first, with synthesizing the culture history of key regions of the Southwest and, second, with describing and explaining underlying patterns of stability and change among the prehistoric cultures represented. In papers prepared for the seminar, participants were asked to provide background on the environment and paleoenvironment of their areas, review the culture history of their areas, and discuss the dynamics behind that culture history. As might be expected, considerable seminar time was spent learning the details of the sequences presented. Nevertheless, the seminar produced two worthwhile results: first, a framework with descriptive nomenclature relating to synchronous periods of stability and change, and second, discussions of cultural dynamics pertinent to periods of apparent isolation and interaction among local areas within the Southwest. The patterns of interaction and isolation described during the conference were diverse. For example, a pattern such as the synchronous appearance of a distinctive ceramic style might suggest interaction that could have been the result of trade, migration, or the political ascendancy of a group whose pottery other cultures replicated. On the other hand, a pattern such as a synchronous shift in settlement distribution might be the result of purely local responses to a regional climatic event. There are also examples of times when distinctive stylistic features were restricted to local areas. These were generally interpreted as reflecting cultural isolation.
    After reviewing the local culture histories, much conference time was spent discussing the processes underlying similarities among the areas that seemed to indicate cultural interaction. It was in conceptualizing, describing, and discussing possible forms of political, social, and religious interactions that there was the most diversity in the vocabulary of the participants. Some used the term "alliances," some "interaction spheres," and some just the term "system." In part, the diversity reflects controversy over the nature of the interactions, and in part just the novelty in approaching these questions. Over the past ten years, southwesternists have moved far from developing the culture histories of single river valleys into discussions of a broadly regional nature. Yet, there is still uncertainty about the way in which regional phenomena should be described. One area of conference discussion involved attributing some of the evidence for interaction to the effects socially complex systems had on broad areas of the Southwest. Traditionally, the prehistoric Southwest has been considered an area that supported only egalitarian societies. Recently, there has been a great deal of discussion about systems that were socially hierarchical. While many investigators continue to be leery of ascribing hierarchical organization to any prehistoric southwestern group, most of the seminar participants acknowledged that at least a few of the prehistoric systems, notably those centered at Chaco Canyon, Casas Grandes, and the Hohokam region, influenced areas well beyond their own borders. Their degree of influence suggests organizational complexity beyond that of egalitarian groups as generally defined. There is continued disagreement among those participating in the conference (and probably among southwestern archaeologists in general) about the degree of social complexity involved. Here we find Gregory Johnson's (this volume) observations and discussion most useful. Johnson's background in studying social hierarchies theoretically and in the early civilizations of the Near East allows us to see that in comparison to the early complex societies of the Old World, the southwestern examples are simple and modular. A key element in this simplicity is the lack of obvious economic stratification in the Southwest that, in turn, is a reflection of the relatively low level of environmental productivity. From our perspective, the limitations of the southwestern environment and fluctuations in climate are crucial components in understanding prehistoric change. This is an issue we address in some detail below. We also believe that we need to develop broader and more innovative approaches to finding appropriate ethnographic analogs for the prehistoric societies of interest. © 2006 by The University of Alabama Press. All rights reserved.
  • Article
    Introduction. Survival distributions. Single sample nonparametric methods. Dependence on explanatory variables. Model formulation. The multiplicative log-linear hazards model. Partial likelihood. Several types of failure. Further problems. Exercises. Bibliography. Index.
  • Article
    Trends in loan delinquencies and losses over time and among credit types contain important information for credit managers and market analysts. The results of this study provide information about the relationship between trends in delinquency rates of portfolios of consumer credit contracts and variables related to lenders' market share, credit market growth, household financial condition and general business conditions. The results of the analysis of monthly delinquency rates for open and closed-end loan portfolios held by commercial banks between 1975 and 1986 indicated that the debt burden measure was significantly and positively associated with delinquency rates, for all types of consumer loans analyzed. Banks' market share of consumer credit outstanding was positively associated with the delinquency rate for the average portfolio of closed-end consumer loans, suggesting that banks increased credit risk to win market share during the analysis period. However, the rate of growth of credit outstanding during the period was negatively associated with delinquency rates for closed-end loans. The rapid growth of revolving credit outstanding in the last 10 years has been statistically associated with a decline in delinquency rates for the average portfolio of revolving credit held by commercial banks. The average delinquency rate for portfolios of revolving credit accounts was significantly positively associated with the household debt burden and with the unemployment rate.
  • Book
    Introduction.- Estimating the Survival and Hazard Functions.- The Cox Model.- Residuals.- Functional Form.- Testing Proportional Hazards.- Influence.- Multiple Events per Subject.- Frailty Models.- Expected Survival.
  • Article
    We describe new reconstructions of northern extratropical summer temperatures for nine subcontinental-scale regions and a composite series representing quasi "Northern Hemisphere" temperature change over the last 600 years. These series are based on tree ring density data that have been processed using a novel statistical technique (age band decomposition) designed to preserve greater long-timescale variability than in previous analyses. We provide time-dependent and timescale-dependent uncertainty estimates for all of the reconstructions. The new regional estimates are generally cooler in almost all precalibration periods, compared to estimates obtained using earlier processing methods, particularly during the 17th century. One exception is the reconstruction for northern Siberia, where 15th century summers are now estimated to be warmer than those observed in the 20th century. In producing a new Northern Hemisphere series we demonstrate the sensitivity of the results to the methodology used once the number of regions with data, and the reliability of each regional series, begins to decrease. We compare our new hemisphere series to other published large-regional temperature histories, most of which lie within the 1σ confidence band of our estimates over most of the last 600 years. The 20th century is clearly shown by all of the palaeoseries composites to be the warmest during this period.
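    The "age band decomposition" named above is only mentioned, not spelled out, in this summary. As a rough sketch of one plausible reading of the idea, the code below builds a chronology by standardizing rings within fixed cambial-age bands and then averaging the band series by calendar year, so that no growth curve is fitted to any individual tree; the DataFrame columns and the band width are hypothetical.
    ```python
    import pandas as pd

    def age_band_chronology(rings, band_width=50):
        """Sketch of an age-band style chronology (one plausible reading).

        rings: one row per measured ring with columns
               'year' (calendar year), 'cambial_age', 'value' (density or width).
        """
        rings = rings.copy()
        rings["band"] = (rings["cambial_age"] // band_width).astype(int)

        # Standardize within each age band so bands are comparable, then
        # average by calendar year to get one sub-chronology per band.
        def zscore(x):
            return (x - x.mean()) / x.std(ddof=1)

        rings["z"] = rings.groupby("band")["value"].transform(zscore)
        band_chron = rings.groupby(["year", "band"])["z"].mean().unstack("band")

        # Final chronology: average whichever band sub-chronologies exist each year.
        return band_chron.mean(axis=1, skipna=True)
    ```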
  • Article
The chronology of interdecadal climatic regime shifts is examined, using instrumental data over the North Pacific, North America and the tropical oceans, and reconstructed climate records for North America. In the North Pacific and North America, climatic regime shifts around 1890 and in the 1920s with alternating polarities are detected, whose spatial structure is similar to that of the previously-known climatic shifts observed in the 1940s and 1970s. Sea-surface temperatures in the tropical Indian Ocean-maritime continent region exhibit changes corresponding to these four shifts. Spectra obtained by the Multi-Taper-Method suggest that these regime shifts are associated with 50–70 year climate variability over the North Pacific and North America. The leading mode of the empirical orthogonal functions of the air-temperature reconstructed from tree-rings in North America exhibits a spatial distribution that is reminiscent of instrumentally observed air-temperature differences associated with the regime shifts. The temporal evolution of this mode is characterized by a 50–70 year oscillation in the eighteenth and nineteenth centuries. This result, combined with the results of the analyses of the instrumental data, indicates that the 50–70 year oscillation is prevalent from the eighteenth century to the present in North America.
  • Article
    A new growth function, which is flexible enough in shape to accommodate most biological growth behavior, is created by adding an expanding factor to the Weibull distribution function. Many monotonically increasing biological growth phenomena can be excellently modelled by this function with various numerical values for the scale, the shape, and the upper asymptote parameters. The function is illustrated with height–age and volume–age curves for single trees and two polymorphic stand volume–age curves.
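    The growth function above is described as a Weibull form with scale, shape, and upper-asymptote parameters. A common curve consistent with that description is y(t) = A·(1 − exp(−b·tᶜ)); the exact parameterization and the height–age values below are illustrative assumptions rather than numbers from the article.
    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def weibull_growth(t, A, b, c):
        """Weibull-type growth curve: upper asymptote A, scale b, shape c."""
        return A * (1.0 - np.exp(-b * np.power(t, c)))

    # Illustrative fit to made-up height-age observations.
    age = np.array([5, 10, 20, 40, 60, 80, 100], dtype=float)
    height = np.array([2.1, 5.0, 11.8, 20.5, 25.2, 27.6, 28.9])

    (A_hat, b_hat, c_hat), _ = curve_fit(weibull_growth, age, height,
                                         p0=[30.0, 0.01, 1.2])
    print(f"A = {A_hat:.1f}, b = {b_hat:.4f}, c = {c_hat:.2f}")
    ```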
  • Article
    Full-text available
    Alaskan salmon stocks have exhibited enormous fluctuations in production during the 20th century. In this paper, we investigate our hypothesis that large-scale salmon-production variability is driven by climatic processes in the Northeast Pacific Ocean. Using a time-series analytical technique known as intervention analysis, we demonstrate that Alaskan salmonids alternate between high and low production regimes. The transition from a high(low) regime to a low(high) regime is called an intervention. To test for interventions, we first fitted the salmon time series to univariate autoregressive integrated moving average (ARIMA) models. On the basis of tentatively identified climatic regime shifts, potential interventions were then identified and incorporated into the models, and the resulting fit was compared with the non-intervention models. A highly significant positive step intervention in the late 1970s and a significant negative step intervention in the late 1940s were identified in the four major Alaska salmon stocks analyzed. We review the evidence for synchronous climatic regime shifts in the late 1940s and late 1970s that coincide with the shifts in salmon production. Potential mechanisms linking North Pacific climatic processes to salmon production are identified.
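    The workflow in that abstract — fit a univariate ARIMA model, then add a step intervention at a candidate regime-shift year and compare the fits — can be sketched as below. The synthetic series, the 1977 candidate shift, and the AR(1) order are illustrative assumptions, and SARIMAX with an exogenous step dummy stands in for a full transfer-function intervention model.
    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    # Hypothetical annual production series with a built-in late-1970s shift.
    years = np.arange(1925, 1996)
    rng = np.random.default_rng(0)
    catch = pd.Series(10 + rng.normal(0, 1, len(years)), index=years)
    catch[years >= 1977] += 4.0

    # Step intervention regressor: 0 before the candidate shift year, 1 afterwards.
    step_1977 = (years >= 1977).astype(float)

    base = SARIMAX(catch, order=(1, 0, 0), trend="c").fit(disp=False)
    shift = SARIMAX(catch, exog=step_1977, order=(1, 0, 0), trend="c").fit(disp=False)

    # A markedly lower AIC with the step supports a regime shift at that year.
    print("AIC, no intervention:", round(base.aic, 1))
    print("AIC, 1977 step      :", round(shift.aic, 1))
    ```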
  • Article
The problem of constructing millennia-long tree-ring chronologies from overlapping segments of cross-dated ring-width series is reviewed, with an emphasis on preserving very low-frequency signals potentially due to climate. In so doing, a fundamental statistical problem coined the 'segment length curse' is introduced. This 'curse' is related to the fact that the maximum wavelength of recoverable climatic information is ordinarily related to the lengths of the individual tree-ring series used to construct the millennia-long chronology. Simple experiments with sine waves are used to illustrate this fact. This is followed by more realistic experiments using a long bristlecone pine series that is randomly cut into a number of 1000-, 500- and 200-year segments and standardized using three very conservative methods. When compared against the original, uncut series, the resulting 'chronologies' show the effects of segment length even when the most conservative and noncommittal method of tree-ring standardization is applied (i.e., a horizontal line through the mean). Alternative schemes of chronology development are described that seek to exorcise the segment length curse. While they show some promise, none is universal in its applicability and this problem still remains largely unsolved.
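    The segment-cutting experiment described above is easy to mimic. The sketch below (a synthetic 1000-year series and illustrative segment lengths) standardizes each segment with a horizontal line through its own mean and shows how variance at wavelengths longer than the segment length drains out of the spliced chronology.
    ```python
    import numpy as np

    # Synthetic "ring width" series: a slow 400-year wave around a mean of 10.
    rng = np.random.default_rng(1)
    t = np.arange(1000)
    rings = 10 + 3 * np.sin(2 * np.pi * t / 400) + 0.5 * rng.normal(size=1000)

    def spliced_indices(x, seg_len):
        """Standardize each segment by a horizontal line through its own mean
        (index = value / segment mean), then splice the segments together."""
        out = np.empty_like(x)
        for start in range(0, len(x), seg_len):
            seg = x[start:start + seg_len]
            out[start:start + seg_len] = seg / seg.mean()
        return out

    for seg_len in (1000, 500, 200):
        idx = spliced_indices(rings, seg_len)
        # The retained low-frequency variance shrinks as segments get shorter.
        print(f"{seg_len:4d}-year segments -> index std: {idx.std():.3f}")
    ```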
  • Article
Efforts to develop computational methods for the identification and optimal control of linear and nonlinear systems governed by distributed parameter systems are reported on. Specifically, approximation methods for determining optimal LQG compensators (feedback control and estimator gains) and functional parameters in linear and nonlinear partial differential equations and hereditary systems were developed, analyzed and tested. The study included theoretical, experimental and numerical components. Convergence theories for spline-based and modal finite element schemes were established and extensive numerical studies on both conventional (serial) and vector supercomputers were carried out. A parameter estimation scheme was tested using experimental data taken from the RPL structure, a laboratory experiment designed to test control algorithms for the large angle slewing of spacecraft with flexible appendages, and other projects involving the identification of flexible structures based upon experimental data were initiated.
  • Article
Prominent and persistent anomalies in the atmospheric flow (troughs and ridges) occur sporadically over the central North Pacific, and can have profound consequences for the weather of North America. We have examined how these events are associated with large scale central North Pacific sea surface temperature (SST) anomalies, using an index for the Pacific Decadal Oscillation (PDO). The anomalies in turbulent air-sea heat fluxes and low-level baroclinity associated with the PDO are manifested differently during troughs than during ridges in their effects on the transient eddies (storms). These effects may help explain why prominent troughs (ridges) occur about 3 (2.5) times more frequently during periods when the PDO is significantly positive (negative) than of opposite sign. Our results suggest that the state of the mid-latitude Pacific Ocean more fundamentally affects the atmosphere than has been thought.
  • Article
    Full-text available
    The two leading patterns of Pacific decadal sea surface temperature (SST) variability are strongly linked to large-scale patterns of warm-season drought and streamflow in the United States, recent analysis shows. The predictive potential of this link may contribute to the development of warm-season hydroclimate forecasts in the United States. Understanding of low-frequency variations in drought and streamflow would be important for both agriculture and water resources management. The two leading patterns are what we call the Pacific Decadal Oscillation (PDO) and the North Pacific mode. Their link with drought and streamflow patterns was notably expressed in the 1960s when severe drought in the northeast (the 1962-66 "Northeastern" drought) and exceptional positive SST anomalies in the North Pacific Ocean (Figures 1a, 1b) both occurred. Analysis of upper tropospheric circulation anomalies showed the North Pacific to be a source region of wave activity affecting the drought area in these summers. The anomalous circulation was vertically coherent and opposed the climatological low-level moisture inflow over the eastern United States associated with the western extension of the Bermuda High.
  • Article
    Full-text available
    Tree-ring standardization methods were compared. Traditional methods along with the recently introduced approaches of regional curve standardization (RCS) and power-transformation (PT) were included. The difficulty in removing non-climatic variation (noise) while simultaneously preserving the low-frequency variability in the tree-ring series was emphasized. The potential risk of obtaining inflated index values was analysed by comparing methods to extract tree-ring indices from the standardization curve. The material for the tree-ring series, previously used in several palaeoclimate predictions, came from living and dead wood of high-latitude Scots pine in northernmost Europe. This material provided a useful example of a long composite tree-ring chronology with the typical strengths and weaknesses of such data, particularly in the context of standardization. PT stabilized the heteroscedastic variation in the original tree-ring series more efficiently than any other standardization practice expected to preserve the low-frequency variability. RCS showed great potential in preserving variability in tree-ring series at centennial time scales; however, this method requires a homogeneous sample for reliable signal estimation. It is not recommended to derive indices by subtraction without first stabilizing the variance in the case of series of forest-limit tree-ring data. Index calculation by division did not seem to produce inflated chronology values for the past one and a half centuries of the chronology (where mean sample cambial age is high). On the other hand, potential bias of high RCS chronology values was observed during the period of anomalously low mean sample cambial age. An alternative technique for chronology construction was proposed based on series age decomposition, where indices in the young vigorously behaving part of each series are extracted from the curve by division and in the mature part by subtraction. Because of their specific nature, the dendrochronological data here should not be generalized to all tree-ring records. The examples presented should be used as guidelines for detecting potential sources of bias and as illustrations of the usefulness of tree-ring records as palaeoclimate indicators.
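    The contrast drawn above between indices derived by division and by subtraction, and the regional-curve idea behind RCS, can be sketched roughly as follows. The column names are hypothetical, and the pooled mean width at each cambial age is a crude stand-in for the smooth regional curve (e.g., a spline or negative-exponential fit) that a real RCS implementation would use.
    ```python
    import pandas as pd

    def rcs_indices(rings, method="division"):
        """Sketch of regional curve standardization (RCS).

        rings: one row per ring with columns 'year', 'cambial_age', 'width'.
        The regional curve is approximated by the mean width at each cambial
        age pooled over all trees; indices are taken from that curve either
        by division or by subtraction, as discussed in the abstract above.
        """
        regional_curve = rings.groupby("cambial_age")["width"].transform("mean")
        if method == "division":
            rings = rings.assign(rwi=rings["width"] / regional_curve)
        else:  # "subtraction"
            rings = rings.assign(rwi=rings["width"] - regional_curve)
        # Chronology: average the indices of all rings formed in each calendar year.
        return rings.groupby("year")["rwi"].mean()
    ```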
  • Article
This article describes flexible statistical methods that may be used to identify and characterize nonlinear regression effects. These methods are called "generalized additive models". For example, a commonly used statistical model in medical research is the logistic regression model for binary data. Here we relate the mean of the binary response, μ = P(y = 1), to the predictors via a linear regression model and the logit link function: log[μ/(1 − μ)] = α + β₁x₁ + … + βₚxₚ.
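    To make the logistic example concrete, the sketch below contrasts the linear logit model with an additive version in which each linear term is replaced by a smooth function of the predictor. The data are synthetic, and the smooths are approximated with B-spline bases inside a statsmodels GLM; this is a rough stand-in for the backfitting-based generalized additive model, not the article's own algorithm.
    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    # Synthetic binary-response data with a nonlinear effect of x1 (illustrative).
    rng = np.random.default_rng(0)
    n = 500
    x1 = rng.uniform(-2, 2, n)
    x2 = rng.uniform(-2, 2, n)
    p = 1 / (1 + np.exp(-(np.sin(2 * x1) + 0.5 * x2)))
    data = pd.DataFrame({"x1": x1, "x2": x2, "y": rng.binomial(1, p)})

    # Ordinary logistic regression: log(mu/(1-mu)) = a + b1*x1 + b2*x2.
    linear = smf.glm("y ~ x1 + x2", data=data,
                     family=sm.families.Binomial()).fit()

    # Additive version: each linear term becomes a smooth function of the
    # predictor, approximated here by a B-spline basis, bs().
    additive = smf.glm("y ~ bs(x1, df=4) + bs(x2, df=4)", data=data,
                       family=sm.families.Binomial()).fit()

    print("linear AIC:  ", round(linear.aic, 1))
    print("additive AIC:", round(additive.aic, 1))
    ```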
  • Article
    Full-text available
    Preserving multicentennial climate variability in long tree-ring records is critically important for reconstructing the full range of temperature variability over the past 1000 years. This allows the putative “Medieval Warm Period” (MWP) to be described and to be compared with 20th-century warming in modeling and attribution studies. We demonstrate that carefully selected tree-ring chronologies from 14 sites in the Northern Hemisphere (NH) extratropics can preserve such coherent large-scale, multicentennial temperature trends if proper methods of analysis are used. In addition, we show that the average of these chronologies supports the large-scale occurrence of the MWP over the NH extratropics.
  • Article
    Full-text available
    The cause of decadal climate variability over the North Pacific Ocean and North America is investigated by the analysis of data from a multidecadal integration with a state-of-the-art coupled ocean-atmosphere model and observations. About one-third of the low-frequency climate variability in the region of interest can be attributed to a cycle involving unstable air-sea interactions between the subtropical gyre circulation in the North Pacific and the Aleutian low-pressure system. The existence of this cycle provides a basis for long-range climate forecasting over the western United States at decadal time scales.
  • Article
This article describes a new and intuitive practical method for tabulating the exact loss distribution arising from correlated credit events for any arbitrary portfolio of counterparty exposures, down to the individual contract level, with the losses measured on a marked-to-market basis that explicitly recognises the potential impact of defaults and credit migrations.
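    The article's own tabulation scheme is not spelled out in this summary. As a generic, hedged illustration of what tabulating an exact loss distribution for correlated credit events can look like, the sketch below conditions on a single systematic factor (making defaults conditionally independent), convolves the per-obligor loss distributions exactly on an integer loss grid, and integrates over the factor with Gauss–Hermite quadrature. The exposures, default probabilities, and factor loading are made up, and the one-factor default-only setup is an assumption, not the article's marked-to-market method.
    ```python
    import numpy as np
    from scipy.stats import norm

    exposures = np.array([3, 5, 2, 4])              # loss per obligor, in grid units
    pd_uncond = np.array([0.02, 0.05, 0.10, 0.03])  # unconditional default probabilities
    rho = 0.2                                       # loading on the common factor

    grid = np.arange(exposures.sum() + 1)           # possible total losses 0..sum
    nodes, weights = np.polynomial.hermite_e.hermegauss(21)
    weights = weights / weights.sum()               # quadrature over the standard normal factor

    loss_dist = np.zeros(len(grid))
    for z, w in zip(nodes, weights):
        # Conditional default probabilities given the systematic factor z.
        p_z = norm.cdf((norm.ppf(pd_uncond) - np.sqrt(rho) * z) / np.sqrt(1 - rho))
        # Exact conditional loss distribution by convolving obligors one at a time.
        dist = np.zeros(len(grid))
        dist[0] = 1.0
        for e, p in zip(exposures, p_z):
            shifted = np.zeros(len(grid))
            shifted[e:] = dist[:len(grid) - e]
            dist = (1 - p) * dist + p * shifted
        loss_dist += w * dist

    print("P(loss = 0) :", round(loss_dist[0], 4))
    print("P(loss >= 8):", round(loss_dist[8:].sum(), 6))
    ```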