Project

Markets for Forecasting (marketsforforecasting.com)

Goal: To assess the predictive validity and usefulness of prediction markets under different conditions.


Project log

Kesten Green
added 5 research items
Earlier versions of this Tree appear in various publications and presentations. Note that the presence of a method in the Tree does not signal its predictive validity. For evidence on which methods are valid, see "Forecasting Methods and Principles: Evidence-based Checklists."
Problem
How to help practitioners, academics, and decision makers use experimental research findings to substantially reduce forecast errors for all types of forecasting problems.

Methods
Findings from our review of forecasting experiments were used to identify methods and principles that lead to accurate forecasts. Cited authors were contacted to verify that summaries of their research were correct. Checklists were developed to help forecasters and their clients undertake and commission studies that adhere to principles and use valid methods. Leading researchers were asked to identify errors of omission or commission in the analyses and summaries of research findings.

Findings
Forecast accuracy can be improved by using one of 15 relatively simple evidence-based forecasting methods. One of those methods, knowledge models, provides substantial improvements in accuracy when causal knowledge is good. On the other hand, data models (developed using multiple regression, data mining, neural nets, and "big data analytics") are unsuited for forecasting.

Originality
Three new checklists are presented: for choosing validated methods, for developing knowledge models, and for assessing uncertainty. A fourth checklist, based on the Golden Rule of Forecasting, was improved.

Usefulness
Combining forecasts within individual methods and across different methods can reduce forecast errors by as much as 50%. Forecast errors from currently used methods can be reduced by increasing their compliance with the principles of conservatism (the Golden Rule of Forecasting) and simplicity (Occam's Razor). Clients and other interested parties can use the checklists to determine whether forecasts were derived using evidence-based procedures and can, therefore, be trusted for making decisions. Scientists can use the checklists to devise tests of the predictive validity of their findings.
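The error reduction from combining forecasts can be illustrated with a minimal sketch. This is not the authors' procedure, just a simple unweighted average of point forecasts with hypothetical numbers: when individual methods err in different directions, their errors partly cancel in the combined forecast.

```python
# Minimal sketch: combining point forecasts by unweighted averaging.
# All numbers are hypothetical, chosen only to illustrate error cancellation.

def combine_forecasts(forecasts):
    """Unweighted average of point forecasts from different methods."""
    return sum(forecasts) / len(forecasts)

def absolute_error(forecast, actual):
    return abs(forecast - actual)

# Hypothetical forecasts of the same quantity from three methods.
forecasts = [110.0, 90.0, 104.0]
actual = 100.0

combined = combine_forecasts(forecasts)

errors = [absolute_error(f, actual) for f in forecasts]
mean_individual_error = sum(errors) / len(errors)  # (10 + 10 + 4) / 3 = 8.0
combined_error = absolute_error(combined, actual)  # |101.33... - 100| = 1.33...

print(f"mean individual error: {mean_individual_error:.2f}")
print(f"combined forecast error: {combined_error:.2f}")
```

Because two of the hypothetical methods overshoot while one undershoots, the combined forecast's error is far smaller than the average individual error; when all methods err in the same direction, combining still cannot do worse than the worst individual forecast's error.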
Kesten Green
added a project goal
To assess the predictive validity and usefulness of prediction markets under different conditions.