
Abstract

A checklist of requirements for following The Golden Rule of Forecasting. From Armstrong, Green, and Graefe (2015).
Golden Rule Checklist
(with evidence on error reduction and number of comparisons)
Software and checklist are available from goldenruleofforecasting.com.

Guideline                                                                           N*     Error reduction (%)
1.      Problem formulation
1.1     Use all important knowledge and information by…
1.1.1   ☐ selecting evidence-based methods validated for the situation              7          18
1.1.2   ☐ decomposing to best use knowledge, information, judgment                 17          35
1.2     Avoid bias by…
1.2.1   ☐ concealing the purpose of the forecast
1.2.2   ☐ specifying multiple hypotheses and methods
1.2.3   ☐ obtaining signed ethics statements before and after forecasting
1.3     ☐ Provide full disclosure for independent audits, replications, extensions  1
2.      Judgmental methods
2.1     ☐ Avoid unaided judgment                                                    2          45
2.2     ☐ Use alternative wording and pretest questions
2.3     ☐ Ask judges to write reasons against the forecasts                         2           8
2.4     ☐ Use judgmental bootstrapping                                             11           6
2.5     ☐ Use structured analogies                                                  3          57
2.6     ☐ Combine independent forecasts from judges                                18          15
3.      Extrapolation methods
3.1     ☐ Use the longest time series of valid and relevant data
3.2     ☐ Decompose by causal forces                                                1          64
3.3     Modify trends to incorporate more knowledge if the…
3.3.1   ☐ series is variable or unstable                                            8          12
3.3.2   ☐ historical trend conflicts with causal forces                             1          31
3.3.3   ☐ forecast horizon is longer than the historical series                     1          43
3.3.4   ☐ short- and long-term trend directions are inconsistent
3.4     Modify seasonal factors to reflect uncertainty if…
3.4.1   ☐ estimates vary substantially across years                                 2           4
3.4.2   ☐ few years of data are available                                           3          15
3.4.3   ☐ causal knowledge is weak
3.5     ☐ Combine forecasts from alternative extrapolation methods and data         1          16
4.      Causal methods
4.1     ☐ Use prior knowledge to specify variables, relationships, and effects      1          32
4.2     ☐ Modify effect estimates to reflect uncertainty                            1           5
4.3     ☐ Use all important variables                                               5          45
4.4     ☐ Combine forecasts from dissimilar models                                  5          22
5.      ☐ Combine forecasts from diverse evidence-based methods                    15          15
6.      ☐ Avoid unstructured judgmental adjustments to forecasts                    4          64
        Totals and unweighted average                                             109          31

* N: number of papers with findings on effect direction; n: number of papers with findings on effect size; %: average effect size (geometric mean).
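As an illustration of the footnote's definitions, the sketch below shows how per-paper effect sizes could be pooled into a guideline-level geometric mean, and how the totals row's unweighted average could be formed. It is a minimal sketch with made-up numbers: the function name and all values are hypothetical, and it is not the authors' actual computation.

```python
# Sketch of the footnote's aggregation: "%" is a geometric mean of effect
# sizes, and the totals row reports an unweighted average across guidelines.
# All values below are made up for illustration; they are not the paper's data.
import math

def geometric_mean(percent_reductions):
    """Geometric mean of error-reduction percentages (all values must be > 0)."""
    logs = [math.log(p) for p in percent_reductions]
    return math.exp(sum(logs) / len(logs))

# Hypothetical effect sizes from three papers testing one guideline.
per_paper = [40.0, 55.0, 75.0]
print(f"Guideline effect size: {geometric_mean(per_paper):.0f}%")  # ~55%

# Unweighted average across guidelines, as in the checklist's totals row.
per_guideline = [18, 35, 45, 8]  # illustrative subset, not the full column
print(f"Unweighted average: {sum(per_guideline) / len(per_guideline):.0f}%")
```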
Related article (full-text available)
This article introduces the Special Issue on simple versus complex methods in forecasting. Simplicity in forecasting requires that (1) method, (2) representation of cumulative knowledge, (3) relationships in models, and (4) relationships among models, forecasts, and decisions are all sufficiently uncomplicated as to be easily understood by decision-makers. Our review of studies comparing simple and complex methods, including those in this special issue, found 97 comparisons in 32 papers. None of the papers provide a balance of evidence that complexity improves forecast accuracy. Complexity increases forecast error by 27 percent on average in the 25 papers with quantitative comparisons. The finding is consistent with prior research to identify valid forecasting methods: all 22 previously identified evidence-based forecasting procedures are simple. Nevertheless, complexity remains popular among researchers, forecasters, and clients. Some evidence suggests that the popularity of complexity may be due to incentives: (1) researchers are rewarded for publishing in highly ranked journals, which favor complexity; (2) forecasters can use complex methods to provide forecasts that support decision-makers’ plans; and (3) forecasters’ clients may be reassured by incomprehensibility. Clients who prefer accuracy should accept forecasts only from simple evidence-based procedures. They can rate the simplicity of forecasters’ procedures using the questionnaire at simple-forecasting.com.
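Four of the checklist guidelines (2.6, 3.5, 4.4, and 5) call for combining forecasts, and they illustrate the abstract's point that evidence-based procedures are simple: the standard combination is an unweighted average of the component forecasts. A minimal sketch, with hypothetical forecast values and method names:

```python
# Sketch of guideline 5 (combine forecasts from diverse evidence-based
# methods): the simple, widely validated combination is an unweighted average.
# The component forecasts below are hypothetical.
forecasts = {
    "structured_analogies": 120.0,  # hypothetical judgmental forecast
    "extrapolation": 104.0,         # hypothetical trend-based forecast
    "causal_model": 112.0,          # hypothetical regression-based forecast
}
combined = sum(forecasts.values()) / len(forecasts)
print(f"Combined forecast: {combined:.1f}")  # 112.0
```

Equal weighting is the conservative default here; differential weights would require strong evidence that some methods are reliably more accurate for the situation.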