"The starting hypothesis of this paper was the actual occurrence of important interactions between demographic and socio-economic factors when trying to reach population forecasts that may be more efficient than those obtained by mere extrapolative methods. In order to be able to implement this approach to the Spanish case it has been necessary to reconstruct first the Spanish population series by age and sex groups from 1910 to 1980. Later, we proceed to obtain population forecasts using alternative modeling strategies and comment on the potential problems that the new demographic situation may have for future public policy."
"This study considers the accuracy of national population forecasts of the Netherlands and the Czechoslovak Socialist Republic.... We look at the demographic components employed in each forecast, the procedure to extrapolate fertility and the level at which assumptions for each component are formulated. Errors in total population size, fertility, mortality and foreign migration, and age structure are considered. We discuss trends in errors and methodology since 1950 and compare the situations in the two countries. The findings suggest that methodology has only a very limited impact on the accuracy of national population forecasts."
"The main theme of this paper is an investigation into the importance of error structure as a determinant of the forecasting accuracy of the logistic model. The relationship between the variance of the disturbance term and forecasting accuracy is examined empirically. A general local logistic model is developed as a vehicle to be used in this investigation. Some brief comments are made on the assumptions about error structure, implicit or explicit, in the literature." The results suggest that "the variance of the disturbance term, when using the logistic to forecast human populations, is proportional to at least the square of population size."
"The role of household projections as a basis for forecasts of households at [the] national and sub-national level is discussed and a number of criteria for such projections are outlined. The projection method used by the Department of the Environment [in the United Kingdom] is examined in the context of these criteria and it is concluded that it is both practical and robust. However, it is open to criticism, first because of its failure to make the best use of the available data and of theoretical knowledge, and secondly because of its 'black box' nature. An alternative two-stage strategy is developed. The first stage involves constructing projections using a new curve-fitting method which takes account of within cohort life-cycle headship rate changes. The second is a method of analysing the resulting projections by modelling transition rates between different household states. Worked examples of both methods are presented."
Forecasts are an inherent part of economic science, and the quest for perfect foresight occupies economists and researchers in many fields. The release of economic forecasts (and their revisions) is a popular and often publicized event, with a multitude of institutions and think-tanks devoted almost exclusively to that task. The European Central Bank (ECB) also publishes forecasts for the euro area, yet the accuracy of those forecasts is not a deeply researched topic. That accuracy is the main focus of this paper, which tries to shed light on the nature of the errors in the ECB forecasts and on how they differ from other projections. What we try to infer is whether the ECB is accurate in its projections, making fewer errors than the others, perhaps because of some informational advantage. We conclude that the ECB seems to consistently underestimate the HICP inflation rate and overestimate GDP growth. Compared with the other forecasters, the ECB shows superior performance, almost always committing fewer errors, which signals a possible informational advantage for the ECB. Since forecasting errors could jeopardize the ECB's credibility, public criticism could be avoided if the ECB simply left forecasting to the others. Naturally, this change should be weighed against the benefits of publishing forecasts.
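As a rough illustration of the kind of comparison involved, the sketch below computes bias (mean error) and root mean squared error for two hypothetical sets of HICP inflation forecasts; all numbers are invented and the calculation is not taken from the paper.

```python
import numpy as np

# Hypothetical one-year-ahead HICP inflation forecasts and outturns (%);
# all values are made up for the example.
outturn        = np.array([2.1, 2.2, 2.1, 3.3, 0.3])
ecb_forecast   = np.array([1.8, 2.0, 1.9, 2.5, 1.4])
other_forecast = np.array([1.7, 1.9, 2.3, 2.2, 1.8])

def mean_error(forecast, actual):
    # Positive values mean the forecaster underestimated the outcome.
    return np.mean(actual - forecast)

def rmse(forecast, actual):
    return np.sqrt(np.mean((actual - forecast) ** 2))

for name, f in [("ECB", ecb_forecast), ("Other", other_forecast)]:
    print(name, "bias:", round(mean_error(f, outturn), 2),
          "RMSE:", round(rmse(f, outturn), 2))
```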
This special issue of the Journal of Forecasting jointly celebrates the 40th anniversary of the publication of George Box and Gwilym Jenkins' highly influential book Time Series Analysis: Forecasting and Control, which introduced a robust and easily implementable strategy for modelling time series, and the 50th anniversary of the appearance of Rudolf Kalman's article 'A new approach to linear filtering and prediction problems' in the Journal of Basic Engineering, which has had an extraordinary impact in many diverse fields, has led to major advances in recursive estimation, and has introduced the term Kalman filter into the lexicon of time series analysis and forecasting. The huge number of papers published in the Journal of Forecasting that reference these two publications bears testament to their seminal status and long-lasting influence, making them a natural choice to base a special issue around.
There exists a large number of quantitative extrapolative forecasting methods which may be applied in research work or implemented in an organizational setting. For instance, the lead article of this issue of the Journal of Forecasting compares the forecasting ability of over twenty univariate forecasting methods. Forecasting researchers in various academic disciplines, as well as practitioners in private or public organizations, are commonly faced with the problem of evaluating forecasting methods and ultimately selecting one. Thereafter, most become advocates of the method they have selected. On what basis are choices made? More specifically, what are the criteria used, or the dimensions judged important? If a survey were taken among academics and practitioners, would the same criteria arise? Would they be weighted equally? Before you continue reading this note, write your criteria on a piece of paper in order of importance and answer the last two questions. This will enable you to see whether or not you share the same values as your colleagues and to test the accuracy of your perception.
We investigate the salary returns to the ability to play football with both feet. The majority of footballers are predominantly right footed. Using two data sets, a cross-section of footballers in the five main European leagues and a panel of players in the German Bundesliga, we find robust evidence of a substantial salary premium for two-footed ability, even after controlling for available player performance measures. We assess how this premium varies across the salary distribution and by player position.
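A minimal sketch of how such a premium might be estimated on simulated data follows; the variable names, controls, and the size of the premium are assumptions made for illustration, not the authors' specification or data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated player-level data; the 0.15 log-salary premium is invented.
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "two_footed": rng.binomial(1, 0.2, n),
    "goals": rng.poisson(5, n),
    "minutes": rng.normal(2000, 400, n),
})
df["log_salary"] = (13 + 0.15 * df["two_footed"] + 0.03 * df["goals"]
                    + 0.0002 * df["minutes"] + rng.normal(0, 0.4, n))

# Mean premium (OLS) and premium at the median of the salary distribution
# (quantile regression), both conditional on the performance controls.
ols = smf.ols("log_salary ~ two_footed + goals + minutes", data=df).fit()
med = smf.quantreg("log_salary ~ two_footed + goals + minutes", data=df).fit(q=0.5)
print("OLS premium:", round(ols.params["two_footed"], 3),
      "Median premium:", round(med.params["two_footed"], 3))
```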
This paper reports on a comprehensive study of the distributions of summary measures of error for a large collection of quarterly multiperiod predictions of six variables representing inflation, real growth, unemployment, and percentage changes in nominal GNP and two of its more volatile components. The data come from surveys conducted since 1968 by the National Bureau of Economic Research and the American Statistical Association and cover more than 70 individuals professionally engaged in forecasting the course of the U.S. economy (mostly economists, analysts, and executives from the world of corporate business and finance). There is considerable differentiation among these forecasts, across the individuals, variables, and predictive horizons covered. Combining corresponding predictions from different sources can result in significant gains; thus the group mean forecasts are on average more accurate over time than most of the corresponding sets of individual forecasts. But there is also a moderate degree of consistency in the relative performance of a sufficient number of the survey members, as evidenced in positive rank correlations among ratios of the individual to group root mean square errors.
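The gain from combining can be illustrated with a small simulation; the sketch below forms a group mean forecast from hypothetical individual forecasts and reports each individual's root mean square error relative to the group's. Everything here is simulated, not the survey data.

```python
import numpy as np

# Four hypothetical survey respondents forecasting six quarters of a series;
# all numbers are simulated for illustration.
rng = np.random.default_rng(1)
actual = np.array([3.1, 2.8, 3.5, 4.0, 3.2, 2.9])
individual = actual + rng.normal(0, 0.6, size=(4, 6))   # individual forecasts with errors
group_mean = individual.mean(axis=0)                    # combined (group mean) forecast

def rmse(forecast):
    return np.sqrt(np.mean((forecast - actual) ** 2))

group_rmse = rmse(group_mean)
ratios = [rmse(f) / group_rmse for f in individual]
# Ratios above 1 mean that individual was less accurate than the group mean forecast.
print("Individual/group RMSE ratios:", [round(r, 2) for r in ratios])
```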