Are global warming forecasts scientific? Evidence from a forecasting audit and a validation study

Kesten C Green
kesten@me.com
University of South Australia
RMIT, 12:30PM July 29, 2011
Green & Armstrong
forecasting audit
“Global Warming: Forecasts by Scientists versus
Scientific Forecasts” in E&E (2007) 18(7+8)*
• Experts on forecasting methods, not climate
• No funding for research on climate forecasting
• Sponsorship from the International Institute of
Forecasters for publicpolicyforecasting.com
and theclimatebet.com
*available at publicpolicyforecasting.com
The Forecasting Problem
For policy recommendations based on
global warming, forecasts must be
accurate for each of the following:
1. Long-term temperature change
2. Effects of temperature changes
3. Effects of feasible policy changes
Do we know enough to forecast
climate changes?
(Or, is the science really settled… this time?)
“It is once for all clear… that the earth is in
the middle of the world.”
Ptolemy, 2nd Century A.D.
“There is… growing consensus among
leading climatologists that the world is
undergoing a cooling trend.”
CIA report, 1974
What are scientific forecasts?
“Forecasts derived from evidence-based
methods.”
Evidence from over half a century of research in:
Economics, Psychology, Finance, Marketing,
Weather, Production and inventory, Sociology,
Engineering, Medicine, Demography, etc.
Summarized as 140 principles
forecastingprinciples.com (or ForPrin.com)
(first in Google search for “Forecasting”)
Principles of Forecasting (Armstrong 2001)
(39 authors and 123 reviewers)
Examples of principles
Avoid experts’ unaided* judgments
Avoid complex models
Be conservative when uncertainty is high
* Unaided by scientific forecasting principles
Is climate change immune
from forecasting principles?
People who assert yes have
been unable to provide any
supporting evidence.
Forecasts from climate modelers
Some climate modelers claim that their models
do not make forecasts.
However, other climate modelers do claim to
make forecasts:
“forecast” and derivatives occurred 37 times, and
“predict” and derivatives 90 times in Chapter 8 of the
2007 IPCC report.
Climate experts use models to express their
judgments: the assumptions that go in, “tuning”,
and adjustments to what comes out. Thus, they
make “expert forecasts.”
“Today’s scientists have substituted
mathematics for experiments, and
they wander off through equation
after equation and eventually build a
structure which has no relation to
reality.”
Nikola Tesla, inventor and electrical engineer,
1934.
Can experts make useful
climate forecasts?
Armstrong (1978) summarized studies to date:
people with much expertise are no better at
forecasting than those with little expertise.
Tetlock (2005): evaluated 82,361 forecasts made
over 20 years by 284 professional commentators
and advisors on politics and economics and found
that expertise did not lead to better forecasts.
(Fortunately for pundits, the Seer-sucker theory offers hope:
“No matter how much evidence exists that seers do
not exist, seers will find suckers.” (Armstrong 1978))
Identifying key papers on
forecasting climate change
Sent requests to 240 climate experts (70% were
IPCC authors or reviewers):
“We want to know which forecasts people regard
as the most credible and how those forecasts were
derived…
In your opinion, which scientific article is the
source of the most credible forecasts of global
average temperatures over the rest of this
century?”
51 people sent responses, of which
42 included references, of which
30 referred to the latest IPCC report
Scientific Literature
in IPCC Chapter 8
Of the roughly 650 references cited in
IPCC Ch. 8, none had any obvious
relationship to scientific forecasting
methods
Forecasting audit process
All elements of the forecasting process
were examined independently by two
people to:
identify relevant principles
assess whether proper
procedures were used.
Audit of IPCC Chapter 8
127 of the 140 principles in the Forecasting
Audit were relevant.
Authors rated the forecasting procedures
independently, then resolved differences
via email.
89 principles rateable.
72 principles contravened.
13% of principles properly applied.
Some contraventions
Use simple forecasting methods: ✗
Test on out-of-sample data: ✗
Provide easy access to data: ✗
Full disclosure &
open peer review
Our audit is fully disclosed at
publicpolicyforecasting.com
All invited to apply the Forecasting Audit
to Ch. 8, or to another climate
forecasting paper, and publish on
publicpolicyforecasting.com
We welcome commentary and continuing
peer review of our paper.
Validation study
Green, Armstrong, & Soon
validation study
GAS (2009). Validity of climate change forecasting
for public policy decision making. International
Journal of Forecasting, 25, 826-832.*
• Experts on forecasting methods, and climate
• No funding for this research
• Sponsorship from the International Institute of
Forecasters for publicpolicyforecasting.com
and theclimatebet.com
*available at publicpolicyforecasting.com
Characteristics of
temperature series
Temperature varies over time, but
No persistent trend
Apparent short and long trends
Of varied length
That reverse without warning
Antarctic temperature changes [graph]
Hadley annual temperature 1850-2008 [graph]
Satellite-based temperature anomalies, 1979 - June 2011 [graph from drroyspencer.com]
The question is, can we forecast
what will happen over the 21st
Century?
“A trend is a trend is a trend
But the question is, will it bend?
Will it alter its course
Through some unforeseen force
And come to a premature end?”
Cairncross (1969)
No scientific forecasts to date
Climate is complex.
Uncertainty is high:
causes of changes are disputed,
causal factors are difficult to forecast,
data are subject to error.
In such conditions, climate models, even if
developed as proper forecasting models, will
struggle to beat the simple naïve model,
which assumes complete ignorance about
climate.
Conditions favor
conservatism
Many opinions by experts, but no
evidence that the climate is
different now
No-change benchmark model
Temp(year + h) = Temp(year),  h = 1, 2, …, 100
Test of the benchmark
Used the HadCRUT3 “best estimate” of global
mean temperatures from 1850 to 2007
Each year’s temperature is a forecast for up to
100 subsequent years:
157 one-year-ahead forecasts…
58 hundred-year-ahead forecasts
10,750 forecasts across all horizons
Absolute errors calculated vs HadCRUT3
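As an illustration of the benchmark test just described, here is a minimal sketch (not the authors' code): each year's observed temperature serves as the no-change forecast for the following 1 to 100 years, and absolute errors are collected by horizon. The `temps` mapping and the function name are assumptions for illustration; the study used the HadCRUT3 annual series.

```python
# Minimal sketch of the no-change benchmark test; `temps` is assumed to be a
# dict mapping year -> annual mean temperature (e.g., parsed from the
# HadCRUT3 "best estimate" series).

def no_change_errors(temps, max_horizon=100):
    """Absolute errors of no-change forecasts, keyed by forecast horizon."""
    years = sorted(temps)
    errors = {h: [] for h in range(1, max_horizon + 1)}
    for origin in years:
        forecast = temps[origin]              # the no-change forecast
        for h in range(1, max_horizon + 1):
            if origin + h in temps:
                errors[h].append(abs(temps[origin + h] - forecast))
    return errors

# With annual data for 1850-2007 this yields 157 one-year-ahead errors,
# 58 hundred-year-ahead errors, and 10,750 errors across all horizons,
# matching the counts above.
```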
Validity of IPCC projection: 1851-1975*
IPCC/No-change error ratio** < 1 means forecast errors
are smaller (better) than no-change errors

Horizon                   IPCC/No-change error ratio        n
Rolling (1-100 years)                    7.7          10,750
1-10 years                               1.5           1,205
41-50 years                              6.8             805
91-100 years                            12.6             305***
* Green, Armstrong & Soon (2009).
** A.k.a. Cumulative Relative Absolute Error or CumRAE
*** Covers only 1941 through 2007
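The error ratio in the table (CumRAE) can be thought of as the cumulated absolute errors of the IPCC projection divided by the cumulated absolute errors of the no-change benchmark over the same forecasts. A hedged sketch follows; `ipcc_errors` is a hypothetical dictionary built the same way as the no-change errors above, and the paper's exact computation may differ in detail.

```python
# Sketch of a CumRAE-style ratio from two sets of absolute errors keyed by
# horizon (e.g., a hypothetical `ipcc_errors` dict and the output of
# `no_change_errors` above).

def error_ratio(model_errors, benchmark_errors, horizons):
    """Sum of model absolute errors divided by the benchmark's sum."""
    model_total = sum(sum(model_errors[h]) for h in horizons)
    bench_total = sum(sum(benchmark_errors[h]) for h in horizons)
    return model_total / bench_total          # < 1 means better than no-change

# e.g. error_ratio(ipcc_errors, benchmark_errors, range(91, 101))
# would correspond to the 91-100 year row of the table.
```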
Summary
Policy decisions require scientific long-term
forecasts of temperature and effects of
policies
Such forecasts do not exist
Climate data and knowledge are uncertain,
and climate is complex
The situation calls for simple methods and
conservative forecasts
The no-change benchmark performs well
IPCC projections do not compare well
Scientific forecasting suggests appropriate
policy decision is “don’t just do something,
stand there!”
Conclusions with respect to
the forecasting process
There are no scientific forecasts of:
(1) manmade global warming, or
(2) net harmful effects due to warming, or
(3) net beneficial effects from proposed policies.
Forecasts of dangerous manmade global warming are
the product of an anti-scientific political movement.
Enormous expenditures on forecasting lead to a
loss of objectivity*.
*See Kealey’s “Economic laws of scientific research” or “Sex,
science, & profits”
“Czech President Klaus: Global Warming
Not Science, but a ‘New Religion’”
"Politicians and their fellow travelers, the media and
the business community, simply understood that this
is a very good topic to take on. It's an excellent idea
to escape from the current reality. Not to solve the
crisis, but to talk about the world in 2050, 2080,
2200. This is for them an excellent job. They will not
be punished by the voters for making a totally wrong
decision, a wrong forecast."
President Vaclav Klaus
Graduate of University of Economics, Prague
Gene J. Koprowski, FoxNews.com, 18 December 2009
http://www.foxnews.com/scitech/2009/12/18/czech-president-klaus-global-warming-science-new-religion/
Analogies in forecasting
Analogies are commonly used
in an unstructured manner, and
after making a forecast in order to support the
forecast.
Analogies do contain useful information and
can aid forecasts
if identified and analyzed by experts…
in a structured and unbiased manner.
Has anything similar happened
before?*
Alarms based on predictions of serious
harm that could only be averted at
great cost.
*Green & Armstrong (2007). Structured
Analogies in Forecasting, International Journal
of Forecasting, 23, 365-376: provides evidence
on validity of forecasting with “structured
analogies”
List of the 26 relevant analogies
Population growth and famine (Malthus) 1798
Timber famine economic threat 1865
Uncontrolled reproduction and degeneration (Eugenics) 1883
Lead in petrol and brain and organ damage 1928
Soil erosion agricultural production threat 1934
Asbestos and lung disease 1939
Fluoride in drinking water health effects 1945
DDT and cancer 1962
Population growth and famine (Ehrlich) 1968
Global cooling (through to 1975) 1970
Supersonic airliners, the ozone hole, & skin cancer, etc. 1970
Environmental tobacco smoke health effects 1971
Population growth and famine (Meadows) 1972
Industrial production and acid rain 1974
Organophosphate pesticide poisoning 1976
Electrical wiring and cancer, etc. 1979
CFCs, the ozone hole, and skin cancer, etc. 1985
Listeria in cheese 1985
Radon in homes and lung cancer 1985
Salmonella in eggs 1988
Environmental toxins and breast cancer 1990
Mad cow disease (BSE) 1996
Dioxin in Belgian poultry 1999
Mercury in fish effect on nervous system development 2004
Mercury in childhood inoculations and autism 2005
Cell phone towers and cancer, etc. 2008
Government intervention was called for
in 25 of the 26 analogous situations
Government actions typically called for:
• Increased government taxes
• Increased government spending
• Restricting individual liberties
How accurate were
the alarming forecasts?
Preliminary coding of the forecasts made in the 26
analogous situations revealed:
categorically wrong   19
wrong in degree        7
accurate               0
Did government intervention help?
Among the 23 analogous situations in which
government policies were implemented:
                                       n
Harm was caused                       20
Policies were ineffective/uncertain    3
Policies were effective                0
Conclusions about the
global warming alarm movement
We predict that the global warming alarm movement will
follow the same paths as those traced out by the 26
analogies:
1. The alarming forecasts will be seen to be unreliable.
2. The imposition of costs will be unpopular.
3. Governments will avoid or cheat on agreements in order
to reduce costs.
4. Government actions will nevertheless continue to cause
widespread harm.
Alarms based on bad
forecasting
are a familiar social
phenomenon
“As soon as one predicted disaster doesn't
occur, the doomsayers skip to another... why
don't [they] see that, in the aggregate, things
are getting better? Why do they always think
we're at a turning point, or at the end of the
road?”
Julian Lincoln Simon, 1990
“On what principle is it that when we see
nothing but improvement behind us, we are to
expect nothing but deterioration before us?”
Thomas Babington Macaulay, 1830
The working paper and
analogy list are available
at publicpolicyforecasting.com
“Effects of the global warming alarm: A forecasting project
using the structured analogies method”
Green and Armstrong (2011)
We welcome your suggestions and analogy
codings.
kesten@me.com
QUESTIONS?
What causes temperature change?
[Graph: Hadley Temperature Series]
Does the increase in the consumer price
index cause global temperatures to rise?
[Scatter plot: Hadley Temperature Series vs. Price Index,
with fitted line y = 0.0003x - 0.324, R² = 0.72]
Correlations between global temperatures
and upwardly mobile time series
Series                                 Correlation
Atmospheric CO2 1850-2008                     0.86
U.S. Postal rates 1885-2009                   0.85
U.S. Price Index 1850-2009                    0.85
NOAA* expenditure 1970-2006                   0.82
Books published in U.S. 1881-2008             0.73
No change (naïve model)                       0.00
*National Oceanic and Atmospheric Administration
Causal model out-of-sample*
forecasting performance
“Causal” variable                     WtdCumRAE**
U.S. Price Index 1850-2009                    0.6
Naïve model                                   1.0
NOAA expenditure 1970-2006                    1.1
Atmospheric CO2 1850-2008                     1.9
Books published in U.S. 1881-2008             2.1
U.S. Postal rates 1885-2009                  14.0
*Models estimated using 1st half of data series (e.g. 1850-1929 for the U.S.
Price Index series), then models used to forecast the 2nd half temperatures
(e.g. 1930-2009 for the U.S. Price Index series) using actual values of the
“causal” variable.
**Weighted Cumulative Relative Absolute Error; relative to no-change
benchmark, weighted so that errors for each forecasting horizon are
counted equally. (Note: WtdCumRAE < 1 means more accurate than
benchmark.)
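To make the out-of-sample procedure in the footnote concrete, here is a hedged sketch: fit a simple linear model of temperature on a candidate “causal” series over the first half of the data, forecast the second half using actual values of that series, and compare the cumulated absolute errors with those of a no-change forecast made at the end of the estimation period. Names such as `temps` and `driver` are illustrative, and the study's per-horizon weighting is not reproduced here.

```python
# Illustrative only: `years` is a sorted list of years; `temps` and `driver`
# are dicts mapping year -> temperature and year -> "causal" variable value.

def fit_ols(x, y):
    """Ordinary least squares slope and intercept for paired lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

def out_of_sample_ratio(years, temps, driver):
    """Cumulated absolute error of the causal model relative to no-change."""
    half = len(years) // 2
    fit_years, test_years = years[:half], years[half:]
    slope, intercept = fit_ols([driver[y] for y in fit_years],
                               [temps[y] for y in fit_years])
    no_change = temps[fit_years[-1]]          # last in-sample temperature
    model_err = sum(abs(temps[y] - (slope * driver[y] + intercept))
                    for y in test_years)
    bench_err = sum(abs(temps[y] - no_change) for y in test_years)
    return model_err / bench_err              # < 1 beats the benchmark
```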
Fit not related to forecast accuracy
Results from this validation study are
consistent with research on time-series
forecasting:

              Correlation with temp   Error ratio*
Naïve model                    0.00            1.0
IPCC                           0.86            7.7
* Averaged over all forecast horizons
The Precautionary Principle
It is a political principle… if the government is persuaded
that there is a risk with a high possible cost, there is no
need for a rational analysis.
Contrary to scientific analyses of costs and benefits.
Brings to mind the slogan on the Ministry of Truth building
in George Orwell’s 1984: “Ignorance is Strength.”
Scientific forecasting suggests appropriate policy decision is
“don’t just do something, stand there!”
For more see “Evidence-based forecasting for climate
change: Uncertainty, the Precautionary Principle, and
Climate Change” on theclimatebet.com Sept 1, 2008
What is needed to forecast the effects
of policies to stop climate change?
Scientific forecasts for alternative possible
policies:
how they would actually be implemented
all their effects
all the costs and benefits of all their effects
References
ARMSTRONG, J. S. (1978). Long-range forecasting: From crystal ball to computer. New York: Wiley-
Interscience.
BRAY, D. & VON STORCH, H. (2007). Climate scientists’ perceptions of climate change science.
GKSS Forschungszentrum Geesthacht GmbH.
CAIRNCROSS (1969). Economic Forecasting, Economic Journal, 79, 797-812.
CAMERER, C. (1998). Can Asset Markets Be Manipulated? A Field Experiment with Racetrack
Betting, Journal of Political Economy, 106, 457-482.
GORE, A. (2006). An inconvenient truth: The planetary emergency of global warming and what we can
do about it. Emmaus, PA: Rodale Press.
GORE, A. (2007). The assault on reason. New York: Penguin.
GREEN, K. C. & ARMSTRONG J. S. (2007). Global Warming: Forecasts by Scientists versus
Scientific Forecasts, Energy and Environment, 18, No. 7+8, 995-1019.
GREEN, K. C. & ARMSTRONG J. S. (2007). Structured Analogies in Forecasting, International
Journal of Forecasting, 23, 365-376.
HANSEN, J., SCHMIDT, C. & STROBEL, M. (2004). Manipulation in Political Stock Markets -
Preconditions and Evidence, Applied Economics Letters, 11, 459-463.
HANSON, R., OPREA, R. & PORTER, D. (2006). Information Aggregation and Manipulation in an
Experimental Market, Journal of Economic Behavior & Organization, 60, 449-459.
ORESKES, N. (2004). The Scientific Consensus on Climate Change. Science, 306, 1686.
PEISER, B. (2005). The letter Science Magazine refused to publish. Available at
http://www.staff.livjm.ac.uk/spsbpeis/Scienceletter.htm
RHODE, P. W. & STRUMPF, K. S. (2004). Historical Presidential Betting Markets, Journal of
Economic Perspectives, 18, 127-141.
SCHULTE K. M. (2008). Scientific consensus on climate change? Energy & Environment, 19, 281-
286.
TETLOCK, P. E. (2005). Expert political judgment: How good is it? How can we know? Princeton, NJ:
Princeton University Press.
References for the Armstrong and Green talks
[Papers available at http://publicpolicyforecasting.com unless otherwise indicated]
Armstrong, J. S., Green, K. C., & Soon, W. (2008). Polar Bear Population Forecasts: A
Public-Policy Forecasting Audit. Interfaces, 38, 5, 382-405. (includes commentary &
response).
Green, K.C., & Armstrong, J. S. (2007). Global Warming: Forecasts by Scientists versus
Scientific Forecasts. Energy and Environment, 18, No. 7+8, 995-1019.
Green, K.C., & Armstrong, J. S. (2007). Structured Analogies in Forecasting. International
Journal of Forecasting, 23, 365-376.
Green, K.C., Armstrong, J. S., & Soon, W. (2009). Validity of Climate Change Forecasting for
Public Policy Decision Making. International Journal of Forecasting, 25, 826-832.
Tetlock, P. E. (2005). Expert Political Judgment: How Good Is It? How Can We Know?
Princeton University Press, Princeton, NJ.
Details on Gore-Armstrong bet at http://theclimatebet.com
References
Armstrong, J. S. (1978; 1985), Long-Range Forecasting: From Crystal Ball to
Computer. New York: Wiley-Interscience.
Armstrong, J. S. (1980), “The Seer-Sucker Theory: The Value of Experts in
Forecasting,” Technology Review, 83 (June/July), 18-24.
Armstrong, J. S., Green, K.C., & Soon, W. (2008), “Polar Bear Population Forecasts:
A Public-Policy Forecasting Audit,” Interfaces, 38, No. 5, 382-405. [Includes
commentary and response]
Green, K. C. & Armstrong, J. S. (2007a), “Global warming: Forecasts by scientists
versus scientific forecasts,” Energy and Environment, 18, No. 7+8, 995-1019.
Green, K. C. & Armstrong, J. S. (2007b), “Structured analogies for forecasting,”
International Journal of Forecasting, 23, 365-376.
Green, K. C. & Armstrong J. S. (2011), “Effects of the global warming alarm: A
forecasting project using the structured analogies method,” Working Paper.
Green, K. C., Armstrong, J. S. & Soon W. (2009), “Validity of Climate Change
Forecasting for Public Policy Decision Making,” International Journal of Forecasting,
25, 826-832.
Gregory & Duran (2001), “Scenarios and acceptance of forecasts.” In J. S.
Armstrong, Principles of Forecasting. Kluwer Academic Publishers (Springer).
Tetlock, P. E. (2005), Expert Political Judgment. Princeton, NJ: Princeton University
Press.
References
Althuizen, Niek A.P. and Wierenga, Berend, The Value of Analogical Reasoning for the Design of
Creative Sales Promotion Campaigns: A Case-Based Reasoning Approach (December 2008, 02).
ERIM Report Series Reference No. ERS-2008-006-MKT. Available at SSRN:
http://ssrn.com/abstract=1333466
Crittenden & Woodside (2007). Building Skills in Thinking: Toward a Pedagogy in Metathinking.
Journal of Education for Business, 83(1), 37-43.
Green, K. C. & Armstrong, J. S. (2007). Global warming: Forecasts by scientists versus scientific
forecasts. Energy and Environment, 18, 997-1021.
Savio, N. and Nikolopoulos, K. (2009). Forecasting the economic impact of new policies. Foresight,
11(2), 7-18.
Savio, N. and Nikolopoulos, K. (2009). Forecasting the effectiveness of policy implementation
strategies: working with semi-experts. Foresight, 11 (6), 86-93.
Savio, N. and Nikolopoulos, K. (2010). Forecasting the effectiveness of policy implementation
strategies. International Journal of Public Administration, 33(2), 88-97.
Simon, J. L. (1996). The ultimate resource 2. Princeton, NJ: Princeton University Press.
Tierney, J. (1990). Betting on the planet. New York Times Magazine, 2 December, pp 52-81.
Wildavsky, A. (1995). But is it true? A citizen’s guide to environmental health and safety issues.
Harvard University Press: Cambridge, MA.
Xue, H-D. & Zhu, Q-X. (2010). Time Series Prediction Algorithm Based on Structured Analogy.
Computer Engineering, 36(1), 211-214.
http://www.ecice06.com/CN/article/downloadArticleFile.do?attachType=PDF&id=8788