Optimal Properties of Exponentially Weighted Forecasts

Journal of the American Statistical Association (Impact Factor: 1.98). 06/1960; 55(290):299-306. DOI: 10.1080/01621459.1960.10482064


The exponentially weighted average can be interpreted as the expected value of a time series made up of two kinds of random components: one lasting a single time period (transitory) and the other lasting through all subsequent periods (permanent). Such a time series may, therefore, be regarded as a random walk with “noise” superimposed. It is also shown that, for this series, the best forecast for the time period immediately ahead is the best forecast for any future time period, because both give estimates of the permanent component. The estimate of the permanent component is imperfect, and so the estimate of a regression coefficient in a relation involving the permanent component (e.g., consumption as a function of permanent income) is inconsistent. Its bias is small, however.
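The two-component series described in the abstract can be sketched in a few lines: a random-walk permanent component with transitory noise superimposed, forecast by an exponentially weighted average. This is a minimal illustration, not code from the paper; the variance parameters and smoothing constant `alpha` are arbitrary choices for demonstration.

```python
import random

random.seed(0)

# Simulate the two-component series: a permanent component that follows
# a random walk, plus a transitory shock that lasts a single period.
n = 500
sigma_perm, sigma_trans = 0.5, 1.0
perm = 0.0
y = []
for _ in range(n):
    perm += random.gauss(0.0, sigma_perm)            # permanent shock persists
    y.append(perm + random.gauss(0.0, sigma_trans))  # transitory shock does not

def ewma_forecast(series, alpha):
    """One-step-ahead exponentially weighted forecast.

    Because the forecast is an estimate of the permanent component only,
    the same value is the best forecast at every future horizon.
    """
    f = series[0]
    for obs in series[1:]:
        f = alpha * obs + (1 - alpha) * f
    return f

forecast = ewma_forecast(y, alpha=0.3)  # alpha chosen for illustration only
```

The key property from the paper shows up directly: `forecast` does not depend on the horizon, since every future value of the series has the same expected permanent component.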

    • "He also prized simple over complicated theories. It is worth recalling that Muth's (1960) concept of rational expectations was an optimal forecast which is obtained by using the correct (time series) model. At the time expectations were nearly always based on partial or adaptive expectations models which have the uncomfortable property of making forecast errors persistent unless the correct model is used, which in the case of AE is an ARIMA(0,1,1). "
    ABSTRACT: This lecture is about how best to evaluate economic theories in macroeconomics and finance, and the lessons that can be learned from the past use and misuse of evidence. It is argued that all macro/finance models are ‘false’ and so should not be judged solely on the realism of their assumptions. The role of theory is to explain the data, and models should be judged by their ability to do this. Data-mining will often improve the statistical properties of a model but does not improve economic understanding. These propositions are illustrated from the last 50 years of macro and financial econometrics.
    Manchester School 12/2015; 83(S2). DOI:10.1111/manc.12114 · 0.26 Impact Factor
    • "We show that in practice the standard errors can make a difference, especially when the time series is short (such as when stationarity is of concern). Third, we also establish the asymptotic properties of our statistic under several plausible alternative models including a multivariate Muth (1960) "
    ABSTRACT: We propose several multivariate variance ratio statistics. We derive the asymptotic distribution of the statistics and scalar functions thereof under the null hypothesis that returns are unpredictable after a constant mean adjustment (i.e., under the weak form Efficient Market Hypothesis). We do not impose the no leverage assumption of Lo and MacKinlay (1988), but our asymptotic standard errors are relatively simple and in particular do not require the selection of a bandwidth parameter. We extend the framework to allow for a time varying risk premium through common systematic factors. We show the limiting behaviour of the statistic under a multivariate fads model and under a moderately explosive bubble process: these alternative hypotheses give opposite predictions with regards to the long run value of the statistics. We apply the methodology to five weekly size-sorted CRSP portfolio returns from 1962 to 2013 in three subperiods. We find evidence of a reduction of linear predictability in the most recent period, for small and medium cap stocks. The main findings are not substantially affected by allowing for a common factor time varying risk premium.
    04/2015
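As a rough illustration of the univariate building block behind the abstract above: a variance ratio VR(q) compares the variance of q-period overlapping return sums with q times the one-period variance, and is close to 1 when returns are unpredictable. This is a simplified sketch in the spirit of the classical Lo and MacKinlay (1988) statistic, not the multivariate statistic the paper proposes, and it omits their standard-error corrections.

```python
def variance_ratio(returns, q):
    """Univariate variance ratio VR(q).

    Variance of overlapping q-period sums divided by q times the
    one-period variance; near 1 under the random walk null, below 1
    under mean reversion, above 1 under momentum.
    """
    n = len(returns)
    mu = sum(returns) / n
    var1 = sum((r - mu) ** 2 for r in returns) / (n - 1)
    sums = [sum(returns[i:i + q]) for i in range(n - q + 1)]
    varq = sum((s - q * mu) ** 2 for s in sums) / (len(sums) - 1)
    return varq / (q * var1)
```

For example, a strictly alternating return series (+1, -1, +1, ...) has all two-period sums equal to zero, so VR(2) collapses toward zero, the mean-reversion extreme.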
    • "The results as in (6) are the multivariate extension of the univariate results (see for example [Muth, 1960] "
    ABSTRACT: Simple exponential smoothing is widely used in forecasting economic time series, because it is quick to compute and generally delivers accurate forecasts. Its multivariate version, on the other hand, has received little attention because of the complications that arise in estimation: standard multivariate maximum likelihood methods suffer from numerical convergence issues and from computational complexity that grows with the dimensionality of the model. In this paper, we introduce a new estimation strategy for multivariate exponential smoothing, based on aggregating its observations into scalar models and estimating them. The original high-dimensional maximum likelihood problem is broken down into several univariate ones, which are easier to solve. Contrary to the multivariate maximum likelihood approach, the suggested algorithm does not suffer heavily from the dimensionality of the model. The method can be used for time series forecasting. In addition, simulation results show that our approach performs at least as well as a maximum likelihood estimator on the underlying VMA(1) representation, at least in our test problems.
    International Journal of Production Economics 11/2014; 162. DOI:10.1016/j.ijpe.2015.01.017 · 2.75 Impact Factor
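The aggregation strategy above reduces a multivariate estimation problem to several univariate ones. A minimal sketch of one such scalar subproblem, assuming a plain grid search over the smoothing parameter that minimizes one-step-ahead squared forecast errors (the paper's actual estimator and aggregation scheme are more involved):

```python
def ses_sse(series, alpha):
    """Sum of squared one-step-ahead errors for simple exponential smoothing."""
    f = series[0]
    sse = 0.0
    for obs in series[1:]:
        sse += (obs - f) ** 2
        f = alpha * obs + (1 - alpha) * f
    return sse

def estimate_alpha(series, grid=None):
    """Estimate the smoothing parameter for one scalar series.

    Each univariate fit like this stands in for one of the scalar
    subproblems that replace the joint multivariate likelihood.
    """
    if grid is None:
        grid = [i / 100 for i in range(1, 100)]
    return min(grid, key=lambda a: ses_sse(series, a))
```

Intuitively, a noisy series with a stable mean favors a small alpha (heavy averaging), while a smoothly trending series favors a large alpha (track the latest observation); the grid search recovers exactly that behavior.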