Article

Optimal Properties of Exponentially Weighted Forecasts

Journal of the American Statistical Association 01/1960; 55(290):299-306. DOI: 10.1080/01621459.1960.10482064

ABSTRACT The exponentially weighted average can be interpreted as the expected value of a time series made up of two kinds of random components: one lasting a single time period (transitory) and the other lasting through all subsequent periods (permanent). Such a time series may, therefore, be regarded as a random walk with “noise” superimposed. It is also shown that, for this series, the best forecast for the time period immediately ahead is the best forecast for any future time period, because both give estimates of the permanent component. The estimate of the permanent component is imperfect, and so the estimate of a regression coefficient is inconsistent in a relation involving the permanent (e.g. consumption as a function of permanent income). Its bias is small, however.
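The decomposition above can be sketched numerically: a minimal simulation, with assumed shock variances and an assumed smoothing weight (the paper shows the optimal weight depends on the ratio of the permanent and transitory variances), of a random walk with noise superimposed and its exponentially weighted forecast.

```python
import random

random.seed(0)

# Simulate a "random walk plus noise" series: a permanent component that
# follows a random walk, plus a transitory shock each period.
n = 500
permanent = 0.0
series = []
for _ in range(n):
    permanent += random.gauss(0, 1.0)                # permanent (random-walk) shock
    series.append(permanent + random.gauss(0, 0.5))  # transitory shock

# Exponentially weighted average as the one-step-ahead forecast.
# alpha = 0.7 is an assumed smoothing weight, not a value from the paper.
alpha = 0.7
forecast = series[0]
errors = []
for y in series[1:]:
    errors.append(y - forecast)
    forecast = forecast + alpha * (y - forecast)

# Because the forecast estimates the permanent component, the same value
# serves as the forecast at every future horizon.
print(forecast)
```

Since the transitory shocks average out, the exponentially weighted average tracks the permanent component, which is why the one-period-ahead forecast doubles as the forecast for all later periods.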

  • Source
    ABSTRACT: This note presents a simple generalization of the adaptive expectations mechanism in which the learning parameter is time-varying. Expectations generated in this way minimize mean squared forecast errors for any linear state space model.
    Highlights: ► Time-varying adaptive expectations are rational for any linear state space process. ► Rationality is defined as minimization of mean squared forecast errors. ► Previous results are special cases of a more general property of the Kalman filter.
    Economics Letters 04/2012; 115(1):4-6. Impact Factor: 0.45
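The Kalman-filter connection in the abstract above can be sketched for the local-level (random walk plus noise) model: the filter gain at each date is the time-varying learning parameter of the adaptive expectations rule. The variances `q` and `r` and the initial uncertainty `p0` below are assumed illustrative values.

```python
# Time-varying adaptive expectations as Kalman gains for a local-level model.
q = 1.0   # variance of the permanent (state) shock -- assumed value
r = 0.25  # variance of the transitory (observation) shock -- assumed value

def kalman_gains(n, p0=10.0):
    """Return the sequence of Kalman gains, i.e. the time-varying
    learning parameters of the adaptive expectations rule."""
    p = p0
    gains = []
    for _ in range(n):
        p = p + q         # predict: state uncertainty grows by the shock variance
        k = p / (p + r)   # gain = this period's learning parameter
        p = (1 - k) * p   # update: observing the series shrinks uncertainty
        gains.append(k)
    return gains

gains = kalman_gains(20)
```

The gains converge to a constant, so the classic fixed-parameter adaptive expectations rule (and hence the exponentially weighted forecast) emerges as the steady-state special case.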
  • Source
    ABSTRACT: In this paper we propose a unified framework to analyse contemporaneous and temporal aggregation of exponential smoothing (EWMA) models. Focusing on a vector IMA(1,1) model, we obtain a closed form representation for the parameters of the contemporaneously and temporally aggregated process as a function of the parameters of the original one. In the framework of EWMA estimates of volatility, we present an application dealing with Value-at-Risk (VaR) prediction at different sampling frequencies for an equally weighted portfolio composed of multiple indices. We apply the aggregation results by inferring the decay factor in the portfolio volatility equation from the estimated vector IMA(1,1) model of squared returns. Empirical results show that VaR predictions delivered using this suggested approach are at least as accurate as those obtained by applying the standard univariate RiskMetrics™ methodology.
    Journal of Banking &amp; Finance 05/2013. Impact Factor: 1.29
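A minimal sketch of the EWMA volatility recursion the abstract above builds on. The decay factor `lam = 0.94` is the conventional RiskMetrics daily value, used here as an assumption; the paper instead infers the decay factor from an estimated vector IMA(1,1) model of squared returns.

```python
def ewma_variance(returns, lam=0.94):
    """Recursive EWMA variance: sigma2_t = lam * sigma2_{t-1} + (1 - lam) * r_t^2."""
    sigma2 = returns[0] ** 2  # initialize with the first squared return
    for r in returns[1:]:
        sigma2 = lam * sigma2 + (1 - lam) * r ** 2
    return sigma2

# Hypothetical usage with toy returns: a one-day 95% VaR is then a normal
# quantile multiple of the EWMA volatility, e.g. 1.645 * sqrt(sigma2).
returns = [0.01, -0.02, 0.015, -0.005]
var95 = 1.645 * (ewma_variance(returns) ** 0.5)
```

The recursion is an exponentially weighted average of squared returns, which is what ties the RiskMetrics methodology back to the exponentially weighted forecasts of the main article.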
  • Source
    ABSTRACT: With the widespread application of computer and communication technologies, more and more real-time systems are being deployed, and the large amounts of time-stamped data they produce require more efficient processing approaches. For large-scale time series, precise values are often hard or even impossible to predict in limited time at limited cost. Meanwhile, precision is not strictly necessary for humans to think and reason, so credible changing ranges of a time series are satisfactory for some decision-making problems. This study develops fast interval predictors for large-scale, nonlinear time series with noisy data using fuzzy granular support vector machines (FGSVMs). Six information granulation methods are proposed that granulate large-scale time series into subseries. FGSVM predictors are developed to forecast credible changing ranges of large-scale time series, and five performance indicators are presented to measure the quality and efficiency of FGSVMs. Four time series are used to examine the proposed granulation methods and the developed FGSVMs; the results show the effectiveness and advantages of FGSVMs for large-scale, nonlinear time series with noisy data.
    Applied Soft Computing 09/2013; 13(9):3981-4000. Impact Factor: 2.68
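The granulation idea in the abstract above can be illustrated generically. The paper proposes six specific granulation methods that are not reproduced here; this sketch shows only the basic notion of compressing a long series into interval granules, each a credible changing range for its window, which an interval predictor could then be trained on.

```python
def granulate(series, window):
    """Summarize each non-overlapping fixed-length window of the series
    by an interval granule (low, high) = (min, max) of that window."""
    granules = []
    for i in range(0, len(series) - window + 1, window):
        chunk = series[i:i + window]
        granules.append((min(chunk), max(chunk)))
    return granules

# Hypothetical usage: a 12-point series compressed into three interval granules.
series = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8]
granules = granulate(series, 4)
```

Granulation trades pointwise precision for far fewer training instances, which is what makes SVM-based prediction tractable on large-scale series.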