Article

Quantifying the Effects of Expert Selection and Elicitation Design on Experts' Confidence in Their Judgments About Future Energy Technologies

Wiley
Risk Analysis
Authors:
  • RFF-CMCC European Institute on Economics and the Environment

Abstract

Expert elicitations are now frequently used to characterize uncertain future technology outcomes. However, their usefulness is limited, in part because: estimates across studies are not easily comparable; choices in survey design and expert selection may bias results; and overconfidence is a persistent problem. We provide quantitative evidence of how these choices affect experts' estimates. We standardize data from 16 elicitations, involving 169 experts, on the 2030 costs of five energy technologies: nuclear, biofuels, bioelectricity, solar, and carbon capture. We estimate the determinants of experts' confidence using survey design, expert characteristics, and the public R&D investment levels on which the elicited values are conditional. Our central finding is that when experts respond to elicitations in person (vs. online or by mail), they ascribe lower confidence (larger uncertainty) to their estimates but give more optimistic assessments of best-case (10th percentile) outcomes. The effects of expert affiliation and country of residence vary by technology, but in general: academics and public-sector experts express lower confidence than private-sector experts; and E.U. experts are more confident than U.S. experts. Finally, extending previous technology-specific work, we find that higher R&D spending increases experts' uncertainty rather than resolving it. We discuss how these findings should be taken into account when interpreting the results of existing elicitations and when designing new ones.
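As an illustration of the standardization and regression steps the abstract describes, the following minimal Python sketch computes the normalized uncertainty implied by elicited 10th/50th/90th percentile cost estimates and fits a simple linear model of that measure on survey-design and expert-characteristic indicators. All numbers and regressors below are hypothetical placeholders, not the paper's data or specification.

import numpy as np

# Hypothetical elicited 2030 cost percentiles (10th, 50th, 90th) for four experts.
percentiles = np.array([
    [30.0, 55.0, 90.0],
    [40.0, 60.0, 75.0],
    [20.0, 50.0, 110.0],
    [35.0, 45.0, 60.0],
])

# Normalized uncertainty: (90th - 10th) / 50th; larger values mean lower confidence.
norm_uncertainty = (percentiles[:, 2] - percentiles[:, 0]) / percentiles[:, 1]

# Hypothetical regressors: intercept, in-person elicitation (1/0), academic
# affiliation (1/0), and the log of the conditioning public R&D level.
X = np.array([
    [1.0, 1.0, 1.0, np.log(100.0)],
    [1.0, 0.0, 0.0, np.log(100.0)],
    [1.0, 1.0, 0.0, np.log(200.0)],
    [1.0, 0.0, 1.0, np.log(200.0)],
])

# Ordinary least squares fit of the normalized uncertainty on the regressors.
coef, *_ = np.linalg.lstsq(X, norm_uncertainty, rcond=None)
print(dict(zip(["const", "in_person", "academic", "log_rd"], coef.round(3))))

A smaller normalized range corresponds to higher stated confidence, which is the sense in which the abstract speaks of in-person respondents ascribing lower confidence to their estimates.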


... This could explain why studies involving small groups of experts are not uncommon for anticipating technological progress (Wiser et al. 2016). Nemet et al. conducted a large survey of 166 "experts" but went further to identify a subset of "leading experts" (Nemet et al. 2017), which suggests that greater value was placed on the opinions of the smaller group. In that study, efforts were made to reduce overconfidence by asking for the 10%, 50%, and 90% quantiles and by administering a module on overconfidence. ...
... This study attempted to address this limitation with the naïve calibration technique, but this technique was not shown to be particularly effective, and a standard calibration could have proved more useful. The study also did not include instructions informing participants of common mistakes in forecasting associated with overconfidence or other cognitive biases, as one study identified in the literature did (Nemet et al. 2017). Yet another limitation could lie in the ordering of the questions, which may have had the unintended effect of priming participants for later responses (Kahneman 2011). ...
Article
Full-text available
While labor-displacing AI has the potential to transform critical aspects of society in the near future, previous work has ignored the possibility of the extreme labor displacement scenarios that could result. To explore this, we surveyed attendees of three AI conferences in 2018 about near-to-mid-term AI labor displacement as well as five more extreme labor-displacing AI scenarios. Practitioners indicated that a median of 22% of tasks that humans are currently paid to do could be automated with existing AI; they anticipate this figure rising to 40% in 5 years and 60% in 10 years. Median forecasts indicated a 50% probability of AI systems being capable of automating 90% of human tasks in 25 years and 99% of human tasks in 50 years. Practitioners surveyed at the different conferences had similar forecasts for AI labor displacement this decade, but attendees of the Human-level AI Conference had significantly shorter and more precise forecasts for the more extreme labor-displacing AI scenarios. Interestingly, median forecasts of a 10% probability of 90% and 99% of human tasks being automated were 10 years and 15 years, respectively. We conclude that future-of-work researchers should more carefully consider these relatively high likelihoods of extreme labor-displacing AI scenarios.
... As this elicitation aims at better characterizing the expert agreements and disagreements for a hypothetical EGS scenario rather than eliciting consensual probability distributions, we have invited a diverse group of experts, varying country, sector, discipline, and experience 33,34 . ...
... Following the standard procedures of expert selection 25,31,33, we have first identified and approached the key academic experts that worked in the last several years on EGS induced seismicity hazard and risk in different countries. In the interviews with these experts, we have used snowball sampling and asked them to identify others who have the relevant expertise, especially in other countries, disciplines and in industry or public administration. ...
Article
Full-text available
Induced seismicity is a concern for multiple geoenergy applications, including low-carbon enhanced geothermal systems (EGS). We present the results of an international expert elicitation (n = 14) on EGS induced seismicity hazard and risk. Using a hypothetical scenario of an EGS plant and its geological context, we show that expert best-guess estimates of annualized exceedance probabilities of an M ≥ 3 event range from 0.2%–95% during reservoir stimulation and 0.2%–100% during operation. Best-guess annualized exceedance probabilities of an M ≥ 5 event span 0.002%–2% during stimulation and 0.003%–3% during operation. Assuming that tectonic M7 events could occur, some experts do not exclude induced (triggered) events of up to M7 as well. If an induced M = 3 event happens at 5 km depth beneath a town with 10 000 inhabitants, most experts estimate a 50% probability that the loss is contained within 500 000 USD without any injuries or fatalities. In the case of an induced M = 5 event, there is a 50% chance that the loss is below 50 million USD, with the most likely outcome of 50 injuries and one fatality or none. As we observe a vast diversity in quantitative expert judgements and underlying mental models, we conclude with implications for induced seismicity risk governance. That is, we suggest documenting individual expert judgements in induced seismicity elicitations before proceeding to consensual judgements, convening larger expert panels in order not to cherry-pick the experts, and aiming for multi-organization, multi-model assessments of EGS induced seismicity hazard and risk.
... Instead we asked experts to ... [Footnote 7] For an example, see http://rincon.lbl.gov/lcoe_v2/lcoe_calculator.html. [Footnote 8] Some research shows that elicitations relying on self-administered, web-based surveys yield results different from those relying on in-person interviews (Nemet et al. 2016), whereas other research shows less evidence of such differences (Anadon et al. 2013; Baker et al. 2015); where differences exist, the relative accuracy of the two methods remains unclear, though it is generally believed that in-person interviews represent the "gold standard." On the other hand, Baker et al. (2015) and Nemet et al. (2016) also suggest there is value in including diverse and relatively large groups of experts when conducting elicitations, suggesting (all else being equal, and given the resource intensity of in-person elicitation) that there is value to online elicitations. ...
... [Footnote 9] Our interest in assessing the effect of respondent type on elicitation results follows related work conducted by [...], who identify which advancements they believe would be larger contributors to cost reductions. ...
Article
Full-text available
Wind energy supply has grown rapidly over the last decade. However, the long-term contribution of wind to future energy supply, and the degree to which policy support is necessary to motivate higher levels of deployment, depends—in part—on the future costs of both onshore and offshore wind. Here, we summarize the results of an expert elicitation survey of 163 of the world’s foremost wind experts, aimed at better understanding future costs and technology advancement possibilities. Results suggest significant opportunities for cost reductions, but also underlying uncertainties. Under the median scenario, experts anticipate 24–30% reductions by 2030 and 35–41% reductions by 2050 across the three wind applications studied. Costs could be even lower: experts predict a 10% chance that reductions will be more than 40% by 2030 and more than 50% by 2050. Insights gained through expert elicitation complement other tools for evaluating cost-reduction potential, and help inform policy and planning, R&D and industry strategy.
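To make the reported reduction percentages concrete, here is a small sketch that converts elicited cost-reduction percentiles into 2030 cost levels; the 2020 baseline cost and the illustrative reduction figures are assumed placeholders rather than values from the survey.

# Hypothetical baseline levelized cost and illustrative reduction estimates.
baseline_lcoe = 40.0                                    # $/MWh, assumed 2020 onshore baseline
reductions_2030 = {"median_scenario": 0.27, "ten_pct_chance_at_least": 0.40}

costs_2030 = {k: round(baseline_lcoe * (1.0 - r), 1) for k, r in reductions_2030.items()}
print(costs_2030)   # {'median_scenario': 29.2, 'ten_pct_chance_at_least': 24.0}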
... We thereby leverage the strengths of experience curves to expand the toolkit for technology forecasting. Rather than asking experts for notoriously difficult cost predictions [77], we use expert interviews to empirically assess two inherent characteristics of existing technology components (factors that experts can judge more easily). We then leverage recent advancements in understanding inherent technology characteristics to derive component-level experience rates to perform probabilistic cost extrapolations using multi-component experience curves. ...
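A minimal sketch of the multi-component experience-curve extrapolation idea described in this excerpt; the component costs, deployment figures, and learning rates are invented, not those of the cited study.

import numpy as np

def experience_curve_cost(c0, q0, q, learning_rate):
    # Single-factor experience curve: cost falls by learning_rate per doubling
    # of cumulative deployment.
    b = -np.log2(1.0 - learning_rate)
    return c0 * (q / q0) ** (-b)

# Hypothetical two-component technology with different assumed learning rates.
components = [
    {"c0": 600.0, "lr": 0.20},   # fast-learning, modular component
    {"c0": 400.0, "lr": 0.05},   # slow-learning, construction-heavy component
]
q0, q_future = 50.0, 400.0       # cumulative deployment today vs. projected (e.g., GW)

total = sum(experience_curve_cost(c["c0"], q0, q_future, c["lr"]) for c in components)
print(round(total, 1))           # projected total unit cost at q_future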
... Uncertainty is an issue that is widely acknowledged in the literature but rarely assessed [5]. Nemet et al. highlight that experts tend to be overly optimistic regarding the field or technology they are involved in, which leads to an overconfidence bias in future cost reductions and rate of deployment [118]. There are analytical approaches for addressing uncertainty in key model parameters such as Monte-Carlo or Global Sensitivity Analysis [5,119,120]. ...
Article
Full-text available
Energy system models are important tools to guide our understanding of current and future carbon dioxide emissions as well as to inform strategies for emissions reduction. These models offer a vital evidence base that increasingly underpins energy and climate policies in many countries. In light of this important role in policy formation, there is growing interest in, and demand for, energy modellers to integrate more diverse perspectives on possible and preferred futures into the modelling process. The main purpose of this is to ensure that the resultant policy decisions are both fairer and more reflective of people's concerns and preferences. However, while there has been a focus in the literature on efforts to bring societal dimensions into modelling tools, there remain few examples of well-structured participatory energy systems modelling processes and no available how-to guidance. This paper addresses this gap by providing good-practice guidance for integrating stakeholder and public involvement in energy systems modelling, based on the reflections of a diverse range of experts from this emergent field. The framework outlined in this paper offers multiple entry points for modellers to incorporate participatory elements either throughout the process or in individual stages. Recognising the messiness of both fields (energy systems modelling and participatory research), the good-practice principles are not comprehensive or set in stone, but rather pose important questions to steer this process. Finally, the reflections on key issues provide a summary of the crucial challenges and important areas for future research in this critical field.
... Recommendations on the involved steps, such as the identification of elicitation variables, selection of experts, conducting the elicitation, and the post-elicitation analysis, vary depending on the type of guidance and whether it is domain-specific (Bojke et al., 2021). Relying on expert judgments is common practice in several applications, such as in medicine and pharmacology (e.g., Bennett et al., 2005; Grigore et al., 2013; Walley et al., 2015), environmental studies (e.g., Kotra et al., 1996; Choy et al., 2009; Nemet et al., 2017), economics (e.g., Leal et al., 2007; Iglesias et al., 2016), as well as in PSHA (Budnitz et al., 1997; McBride et al., 2012). ...
Thesis
Full-text available
In seismic hazard assessments, the importance of knowing different input parameters accurately depends on their weight within the hazard model. Many aspects of such assessments require inputs based on knowledge and data from experts. When it comes to decisions about data collection, facility owners and seismic hazard analysts need to balance the possible added value brought by acquiring new data against the budget and time available for its collection. In other words, they need to answer the question “Is it worth paying to obtain this information?”. Assessing the value of information (VoI) before data collection should lead to optimising the time and money that one is willing to invest. This thesis presents a method that combines available data and expert judgment to facilitate the decision-making process within the site-response component of seismic hazard assessments. The approach integrates influence diagrams and decision trees to map the causal relationships between input parameters in site-response analysis, and Bayesian inference to update the model when new evidence is considered. Here, the VoI is assessed for univariate, bivariate and multivariate uncertain parameters to infer an optimal seismic design for typical buildings and critical facilities. For the first time in the field of seismic hazard assessment and earthquake engineering, a framework is developed to integrate prior knowledge, the characteristics of ground investigation techniques, and design safety requirements. The consistent findings across different applications show that VoI is highly sensitive to prior probabilities and to the accuracy of the test to be performed. This highlights the importance of defining those from available data, as well as of only considering tests that are suitable for our needs and budget. The developed VoI framework constitutes a useful decision-making tool for hazard analysts and facility owners, enabling not only the prioritisation of data collection for key input parameters and the identification of optimal tests, but also the justification of the associated decisions. This approach can enhance the accuracy and reliability of seismic hazard assessments, leading to more effective risk management strategies.
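A toy value-of-information calculation in the spirit of the thesis, using an invented two-state site condition and two design options; all probabilities and costs are placeholders.

# Hypothetical decision problem: choose a seismic design before knowing the site class.
p_soft = 0.3                                   # prior probability the site is "soft"
cost = {                                       # design cost plus expected losses, arbitrary units
    ("robust", "soft"): 10.0, ("robust", "stiff"): 10.0,
    ("light",  "soft"): 18.0, ("light",  "stiff"):  6.0,
}

def expected_cost(design, p):
    return p * cost[(design, "soft")] + (1 - p) * cost[(design, "stiff")]

# Decision under the prior: pick the design with the lowest expected cost.
prior_best = min(["robust", "light"], key=lambda d: expected_cost(d, p_soft))
ec_prior = expected_cost(prior_best, p_soft)

# Expected value of perfect information: decide after learning the true state.
ec_perfect = (p_soft * min(cost[("robust", "soft")], cost[("light", "soft")])
              + (1 - p_soft) * min(cost[("robust", "stiff")], cost[("light", "stiff")]))
evpi = ec_prior - ec_perfect
print(prior_best, round(ec_prior, 2), round(evpi, 2))

Extending the sketch to an imperfect test would apply Bayes' rule to update p_soft for each possible test outcome before re-optimizing the design, which is the mechanism the thesis uses to value specific ground-investigation techniques.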
... Over the years, others have demonstrated the use of training for improving accuracy of point forecasts (Goodwin & Fildes, 1999), probability forecasts (Benson & Onkal, 1992) and judgmental prediction intervals (Bolger & Önkal-Atay, 2004). Moreover, training to reduce overconfidence has been used for a variety of elicitation formats, from forecasting tournaments to surveys (Nemet et al., 2017). most recently demonstrated the use of online training and pedagogical materials regarding calibration and probabilistic judgments to be effective at reducing overconfidence, while also improving calibration and accuracy, in under an hour. ...
Article
We describe an exploratory study examining the effectiveness of an interactive app and a novel training process for improving calibration and reducing overconfidence in probabilistic judgments. We evaluated the training used in the app by conducting an American college football forecasting tournament involving 153 business school students making 52 forecasts over 11 weeks. A coarsened exact matching analysis found statistical evidence that, in under 30 min, the more challenging training was able to modestly reduce overconfidence, improve calibration and improve the accuracy of probabilistic judgments (measured by the Brier score). The experimental results also suggest that the generic training can generalize across domains and that effective calibration training is possible without expert facilitators or pedagogical training materials. Although no previous studies have reported similar results, due to the modest effect, we conclude that these results should only be interpreted as a proof of concept and that further evaluation and validation of mechanisms of the app's effect is necessary.
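For reference, the Brier score used to measure accuracy in this study is simply the mean squared error of probability forecasts against binary outcomes; the forecasts and outcomes below are made up for illustration.

def brier_score(probs, outcomes):
    # Mean squared difference between forecast probabilities and 0/1 outcomes
    # (lower is better; 0.25 corresponds to always forecasting 50%).
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

pre_training  = brier_score([0.9, 0.8, 0.7, 0.6], [1, 0, 1, 1])
post_training = brier_score([0.8, 0.4, 0.7, 0.6], [1, 0, 1, 1])
print(pre_training, post_training)   # a drop would indicate better-calibrated forecasts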
... Bias is indeed one of the most important challenges faced when expert panels are used in research (in this case, for validation and quantification of the model). At times, bias on the experts' side can be caused by overconfidence in the subjects the experts know well (Nemet, Anadon and Verdolini, 2017). In order to better understand bias, a definition of it is needed from the relevant literature. ...
Thesis
Full-text available
Significant gaps in the practical transformation of clinical knowledge into practice, increasing healthcare costs, costly medical errors, and healthcare institutions' obligations towards improving safety, clinical outcomes, and efficacy of care on the one hand, and the rise of disruptive innovations, the adoption of electronic health records and novel diagnostic tools, and the plethora of data on the other, have made the need for a new approach to managing U.S. healthcare systems imperative. Continuous learning has been utilized in healthcare organizations to mitigate some of these issues. Continuous learning is especially important in the research centers that act as innovation hubs within university hospitals. These centers focus on learning about and improving current systems and practices in a specific area of healthcare, with the goal of better serving the population in need of those specific services or treatments. Maturity models are organizational management tools that have been used as a way of responding to the constant pressure of trying to achieve and maintain competitive advantage through concurrent innovation, quality improvement, and cost reduction. In the context of continuous learning in healthcare organizations, a mature system can be defined as one that generates timely actions from the information it derives from internal and external data, creating meaningful measurement of system learning and increased efficacy and effectiveness in health outcomes. However, there is a lack of a systematic, multi-criteria, validated, quantifiable, and repeatable maturity model that allows managers and decision-makers to assess and enhance health organizations' performance in continuous learning and technology management. This research proposes a multi-criteria model to assess technology management maturity and continuous learning in research centers within university hospitals using a Hierarchical Decision Model (HDM). The model can help these research centers pinpoint their strengths and opportunities in terms of continuous learning from the data they have access to, while giving them organizational self-awareness and guiding them in setting their strategies and resource allocation. The model will serve as a much-needed technology management tool for healthcare organizations to assess their technology management maturity and continuous learning efforts and assist them in creating more effective roadmaps.
... Understanding the range of future costs of energy technologies is essential for the design of cost-effective and robust energy and decarbonization policies (20). As previously mentioned, the two most commonly used classes of methods to make technology cost forecasts in general, and in the energy sector in particular, are expert-based methods and model-based methods (16, 32-34). ...
Article
Full-text available
Significance: Forecasting is essential to design efforts to address climate change. We conduct a systematic comparison of probabilistic technology cost forecasts produced by expert elicitation and model-based methods. We assess their performance by generating probabilistic cost forecasts of energy technologies rooted at various years in the past and then comparing these with observed costs in 2019. Model-based methods outperformed expert elicitations both in terms of capturing 2019 observed values and producing forecast medians that were closer to the observed values. However, all methods underestimated technological progress in almost all technologies. We also produce 2030 cost forecasts and find that elicitations generally yield narrower uncertainty ranges than model-based methods and that model-based forecasts are lower for more modular technologies.
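The evaluation logic described here can be illustrated with a small sketch that checks whether an observed cost falls inside each method's forecast interval and how far the forecast median lies from the observation; the numbers are invented, not the study's.

# Hypothetical probabilistic forecasts, rooted in some past year, for a 2019 cost.
forecasts = {
    "elicitation": {"p5": 45.0, "median": 70.0, "p95": 95.0},
    "model_based": {"p5": 25.0, "median": 60.0, "p95": 110.0},
}
observed_2019 = 40.0   # hypothetical observed cost, same units as the forecasts

for method, f in forecasts.items():
    covered = f["p5"] <= observed_2019 <= f["p95"]
    median_error = abs(f["median"] - observed_2019)
    print(method, "covers observed:", covered, "| median abs. error:", median_error)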
... First, our estimate of the climate value of wind energy is based on an expert elicitation on the future cost of wind energy, analyzed using a specific IAM (GCAM, PNNL 2020) and a simplified damage function from DICE. According to Nemet et al. (2017), expert elicitation's usefulness could be limited in part because choices in survey design and expert selection may bias results, leading to over- or underconfidence. The literature on expert elicitation highlights the critical issues of properly designing an elicitation protocol that minimizes expert biases as much as possible and of how to present, analyze, and aggregate the data collected (Verdolini et al. 2018). ...
Article
Full-text available
We conduct uncertainty analysis on the impacts of the future cost of wind energy on global electricity generation and the value of wind energy to climate change mitigation. We integrate data on global onshore and offshore wind energy cost and resources into the Global Climate Assessment Model (GCAM), and then propagate uncertainty based on distributions derived from an expert elicitation study on the future cost of onshore and offshore wind energy. The share of wind energy electricity generation in 2035, without a global policy on CO2 emissions, ranges between 4% and more than triple the 2019 share of 5.3%. Under a 1.5°C cap, this range is wider, with shares up to 34%. This range of uncertainty implies the need for flexible systems and policies, allowing large amounts to be deployed if needed. We explore whether a breakthrough in wind energy could prevent the demand for natural gas as a bridge technology to a low carbon economy, and find that uncertainty in wind energy is only pertinent for medium-stringency policies, such as a $60/t carbon tax. Under this scenario, there is a 95% chance that the cost of wind energy will be low enough to lead to an immediate reduction in the share of natural gas. In contrast, under a business-as-usual scenario without a breakthrough in cost, natural gas is highly likely to continue increasing in share of electricity generation. Under a 1.5°C cap, natural gas will decrease in share regardless of wind energy cost.
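A stripped-down sketch of the uncertainty-propagation step: sample a cost distribution standing in for the elicited one and push it through a toy response function for wind's generation share. The distribution parameters and the response function are invented; the actual analysis runs GCAM rather than a one-line formula.

import numpy as np

rng = np.random.default_rng(0)

# Placeholder lognormal cost distribution for 2035 onshore wind ($/MWh).
cost_samples = rng.lognormal(mean=np.log(45.0), sigma=0.25, size=10_000)

# Toy reduced-form response: lower wind cost -> higher wind share, bounded at 4%-34%.
share_samples = np.clip(0.40 - 0.005 * cost_samples, 0.04, 0.34)

print(np.percentile(share_samples, [5, 50, 95]).round(3))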
... Increased country and technology overlap can help improve understanding of the uncertainty in results for a particular method, such as expert estimates (cf. Nemet et al., 2017), and compare the precision delivered by different methods. ...
Article
Full-text available
Many models in energy economics assess the cost of alternative power generation technologies. As an input, the models require well-calibrated assumptions for the cost of capital or discount rates to be used, especially for renewable energy for which the cost of capital differs widely across countries and technologies. In this article, we review the spectrum of estimation methods for the private cost of capital for renewable energy projects and discuss appropriate use of the methods to yield unbiased results. We then evaluate the empirical evidence from 46 countries for the period 2009–2017. We find a globally consistent rank order among technologies, with the cost of capital increasing from solar PV to onshore wind to offshore wind power. On average, the cost of capital in developing countries is significantly higher than in industrialized countries, with large heterogeneity also within the groups of industrialized or developing countries.
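A minimal sketch of why the cost of capital matters in such comparisons: annualizing an assumed capital cost with a capital recovery factor at different discount rates. All input values are placeholders, not estimates from the article.

def lcoe(capex, annual_output, lifetime, discount_rate, annual_opex=0.0):
    # Levelized cost = annualized capex plus O&M, divided by annual output.
    crf = (discount_rate * (1 + discount_rate) ** lifetime /
           ((1 + discount_rate) ** lifetime - 1))          # capital recovery factor
    return (capex * crf + annual_opex) / annual_output

capex = 1_500_000.0        # $ per installed MW (assumed)
output = 3_000.0           # MWh per MW-year (assumed capacity factor of about 34%)
for wacc in (0.04, 0.08, 0.12):    # e.g., industrialized vs. developing-country financing
    print(wacc, round(lcoe(capex, output, 25, wacc), 1), "$/MWh")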
... Large groups are also less conducive to information exchange and consensus building. Studies have investigated the ideal size of expert panels [12,13]. The Handbook of Decision Analysis recommends targeting panels of 6-12 experts to balance the need for gaining enough perspective while still enabling meaningful information exchange [14]. ...
Conference Paper
Full-text available
The formulation of science-driven space mission concepts is challenging, possibly even more so than the development and production of the space systems themselves. The formulation of these missions involves defining science objectives, surveying the state of the art of instrument capabilities, documenting the Program of Record and forecasting satellite lifetimes, defining feasible alternatives for spacecraft platforms and access to space, and identifying potentially enabled applications, to cite only some of the tasks faced by mission design teams. The trade space is vast, especially in an era of novel platform concepts where constellations of SmallSats are changing the current paradigm of spaceborne observations. A crucial component of the formulation of science mission concepts is the assessment of the alternatives defined in this trade space. The assessment of the concepts is so complex that a heuristic approach does not sufficiently articulate the benefits of the alternatives under consideration. This complexity can be attributed to several factors. Science missions have to satisfy multiple science goals and their associated science objectives, therefore entering the realm of multi-criteria decision problems. In addition, multiple instruments, platforms, launchers, and ground system options are combined to define the architectures. The alternatives under assessment in these multi-criteria decision problems are numerous, as are the possible components of the segments that make up the architectures. Finally, stakeholders involved in the design and assessment of these science mission concepts have varying value systems: priorities relevant to stakeholders vary from group to group based on interests, objectives, and experiences. The complexity is such that the assessment requires a deliberate and structured approach to provide a comprehensive assessment of the mission concepts. This paper presents an approach that enables the assessment of the science benefits achieved by a space mission concept in the formulation phase. The approach combines Utility and Quality assessments provided by Subject Matter Experts to produce a Science Benefit score for each identified science objective. The paper discusses how this approach was tailored for the assessment of Observing Systems in the Aerosols, Cloud, Convection, and Precipitation (ACCP) study. In this Earth Science application, Utility quantifies how important a given geophysical variable is to addressing an identified science objective, while Quality quantifies how well an architecture obtains a geophysical variable with respect to Minimum levels listed in the Science Traceability Matrix. The resulting Benefit score articulates the science capability of a given architecture to address a given objective. This paper also presents the processes implemented to obtain the assessments from Subject Matter Experts in the ACCP study.
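A schematic sketch of the Utility-times-Quality aggregation described above; the objectives, scores, and equal weighting are placeholders rather than values or rules from the ACCP study.

# Placeholder science objectives with assumed Utility and Quality scores in [0, 1].
objectives = {
    "convective_updrafts": {"utility": 0.9, "quality": 0.6},
    "aerosol_profiles":    {"utility": 0.7, "quality": 0.8},
    "precip_rates":        {"utility": 0.8, "quality": 0.5},
}

# Benefit per objective = Utility * Quality; architecture score = simple average here.
benefit = {name: v["utility"] * v["quality"] for name, v in objectives.items()}
architecture_score = sum(benefit.values()) / len(benefit)
print(benefit, round(architecture_score, 3))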
... ; Connor and Siegrist 2016; van Dijk et al. 2017; Dixon et al. 2016; Graaff, Bröer, and Wester 2017; Kojo and Innola 2017; Moyer and Song 2016; Sütterlin and Siegrist 2016; Tumlison, Moyer, and Song 2017). With an increased emphasis on evidence-based policy-making, scholars have also examined the perceptions of professionals and experts (Bertoldo et al. 2016; Falahee et al. 2016; Nemet, Anadon, and Verdolini 2017), particularly as they apply to technologies affecting health and energy. ...
Article
Full-text available
The societal impacts of scientific and technological advances – whether desirable or undesirable – have been one of the primary foci of contemporary policy research, much of which employs distinct interdisciplinary approaches. In this paper, we seek to characterize the topical, methodological, theoretical, and geographical trends of recent science and technology policy studies. Utilizing reputable scholarly journals within the related subfield of policy science, we review the recent research articles on science and technology policy as they relate to policy development, while paying particular attention to capturing any systemic patterns of trends. We conclude with a discussion of the various ramifications of our findings for future research directions and, in the process, highlight issues including technological innovation and diffusion, science and technology for sustainable energy development, evidence-based decision-making, and information technology and cyber security.
... If learning is taken to be an increase in knowledge, it holds that learning corresponds to a reduction in epistemic uncertainty. If probability distributions are used to represent an epistemic uncertainty, then a definition of 'reducing uncertainty' is required, because an investment in learning may not result in a reduction in uncertainty if it is measured using a normalised measure, such as the (90th − 10th)/50th percentile ratio (Nemet et al., 2016).
Thesis
Full-text available
In this thesis, a sensitivity analysis is used to systematically classify and rank parametric uncertainties in an energy system optimisation model of the United Kingdom, ETI-ESME. A subset of the most influential uncertainties are then evaluated in a model which investigates the process of resolving uncertainty over time — learning. The learning model identifies strategies and optimal pathways for staged investment in these critical uncertainties. By soft-linking the learning model to an energy system optimisation model, the strategies also take into account the system-wide trade-offs for investment across individual or portfolios of technologies. A global sensitivity analysis method, the Method of Morris, was used to efficiently analyse the model over the full range and combination of input parameter values covering technology costs and efficiencies, resource costs, and technology/infrastructure build-rate and resource constraints. The results of the global sensitivity analysis show that very few parameters are responsible for the majority of variation in the outputs from the model. These critical uncertainties can be separated into two groups according to their suitability for learning. Some of the important uncertainties identified, such as the price of fossil fuel resources available to the UK, are not amenable to learning and must be managed through risk-based approaches. The parameters which are amenable to learning, the availability of domestic biomass, and the rate at which carbon capture and storage technologies can be deployed, are then investigated using the learning model. The learning model is formulated as a stochastic mixed-integer programme, and gives insights into the dynamic trade-offs between competing learning options within the context of the whole energy system. A UK case study shows that, if the resources are known to be available, the total discounted net benefit of the availability of 150TWh/year of domestic biomass is £30bn, while the ability to build CCS plant at a rate of 2GW/year is worth up to £34bn. Together, the value increases non-linearly to a maximum of £59bn. This represents up to 17% of the UK's discounted total energy system cost over the next four decades as quantified by the ETI-ESME model. The learning model quantifies the cost threshold below which investment in an uncertain learning project is optimal. The threshold is a proxy for maximum no-regret investment over the aggregate total of research, commercialisation and deployment and could be of use to research funding agencies. The results show that when the likelihood of success of the project is 20%, one-stage learning projects of £10bn or below are always undertaken. For the same likelihood of success, dividing a project into two stages more than doubles the investment threshold to £22bn, as it allows strategies in which investment switches away from a project if it fails. Dividing a project into multiple stages is particularly beneficial if most of the uncertainty is front-loaded, enabling switching to an alternative. The precise strategy to follow is a complex function of the cost, duration, net benefit and probability of success of each learning project, as well as the interactions between the project outcomes.
Article
Introducing a price floor in emissions trading schemes (ETS) theoretically stabilizes expectations on future carbon prices and thus fosters low-carbon investment. Yet, ex post evidence on high carbon prices is scant and the relevance of carbon pricing for investment decisions is frequently contested. We provide empirical ex ante evidence on how a price floor in the EU ETS would impact the size and portfolio of energy firms' investments. Analyzing survey responses of high-level managers in 113 German energy and industry companies, we find that the level of the price floor is crucial. A low price floor trajectory only provides insurance against downward price fluctuations and would leave investments largely unchanged, except for industries receiving electricity price compensation, which reduce their investments. A high floor, significantly increasing the price level beyond current expectations, leads to higher investment by the majority of firms, especially by green firms, while investment in fossil energy would partially be abolished. Our study implies that price floors can be important design components of ETS. However, policymakers need to ensure that they are at sufficiently high levels to affect investment decisions in a meaningful way.
Article
Full-text available
The COVID-19 lockdown has increased the use of flexible workplace practices (FWP), especially work from home, demonstrating their importance to the resilience of transportation systems and regional economies. This study compares experiences and perceptions of FWP and related policy interventions before and during the COVID-19 shutdown, using a mixed-methods approach focusing on the South Bay region of Los Angeles County, to inform projections about the use of FWP and policy implications post-COVID. Pre-shutdown surveys and focus group interviews confirmed that major obstacles to FWP expansion were a combination of managerial and executive resistance, alongside occupational constraints. Pre-shutdown interviews suggested that costs associated with manager training and cultural transition are major concerns for executives. A small sample of follow-up interviews with executives, managers, and staff, conducted during the shutdown period, has revealed some of the practical issues with full-time FWP such as work-life balance, childcare, productivity, IT hardware and software, and network connectivity. Although organizations have been forced into flexible arrangements, many are considering continuing to utilize the practices after the pandemic settles down. In terms of policy interventions, pre-COVID participants perceived government subsidies and incentives as the most desirable government programs. However, in a resource-constrained post-COVID world, policy makers might instead focus on training programs and promotional campaigns tied to public health messaging, and the implications of reduced commuting for transportation system design and commercial zoning and land use.
Article
Background: Many decisions in health care aim to maximise health, requiring judgements about interventions that may have higher health effects but potentially incur additional costs (cost-effectiveness framework). The evidence used to establish cost-effectiveness is typically uncertain, and it is important that this uncertainty is characterised. In situations in which evidence is uncertain, the experience of experts is essential. The process by which the beliefs of experts can be formally collected in a quantitative manner is structured expert elicitation. There is heterogeneity in the existing methodology used in health-care decision-making. A number of guidelines are available for structured expert elicitation; however, it is not clear if any of these are appropriate for health-care decision-making.
Objectives: The overall aim was to establish a protocol for structured expert elicitation to inform health-care decision-making. The objectives are to (1) provide clarity on methods for collecting and using experts' judgements, (2) consider when alternative methodology may be required in particular contexts, (3) establish preferred approaches for elicitation on a range of parameters, (4) determine which elicitation methods allow experts to express uncertainty, and (5) determine the usefulness of the reference protocol developed.
Methods: A mixed-methods approach was used: systematic review, targeted searches, experimental work, and narrative synthesis. A review of the existing guidelines for structured expert elicitation was conducted. This identified the approaches used in existing guidelines (the 'choices') and determined whether dominant approaches exist. Targeted review searches were conducted for selection of experts, level of elicitation, fitting and aggregation, assessing accuracy of judgements, and heuristics and biases. To sift through the available choices, a set of principles that underpin the use of structured expert elicitation in health-care decision-making was defined using evidence generated from the targeted searches, quantities to elicit, experimental evidence, and consideration of constraints in health-care decision-making. These principles, including fitness for purpose and reflecting individual expert uncertainty, were applied to the set of choices to establish a reference protocol. An applied evaluation of the developed reference protocol was also undertaken.
Results: For many elements of structured expert elicitation, there was a lack of consistency across the existing guidelines. In almost all choices, there was a lack of empirical evidence supporting recommendations, and in some circumstances the principles are unable to provide sufficient justification for discounting particular choices. It is possible to define reference methods for health technology assessment. These include a focus on gathering experts with substantive skills, eliciting observable quantities, and individual elicitation of beliefs. Additional considerations are required for decision-makers outside health technology assessment, for example at a local level, or for early technologies. Access to experts may be limited, and in some circumstances group discussion may be needed to generate a distribution.
Limitations: The major limitation of the work conducted here lies not in the methods employed in the current work but in the evidence available from the wider literature relating to how appropriate particular methodological choices are.
Conclusions: The reference protocol is flexible in many choices. This may be a useful characteristic, as it is possible to apply this reference protocol across different settings. Further applied studies, which use the choices specified in this reference protocol, are required.
Funding: This project was funded by the NIHR Health Technology Assessment programme and will be published in full in Health Technology Assessment, Vol. 25, No. 37. See the NIHR Journals Library website for further project information. This work was also funded by the Medical Research Council (reference MR/N028511/1).
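As an illustration of the 'fitting and aggregation' choices such a protocol covers, the sketch below fits a lognormal distribution to each expert's elicited 10th/50th/90th percentiles and combines experts with an equal-weight linear opinion pool. The elicited values are invented, and other fitting or pooling choices are of course possible.

import numpy as np

rng = np.random.default_rng(1)
Z90 = 1.2816                      # standard-normal 90th percentile

def fit_lognormal(p10, p50, p90):
    # Fit a lognormal to elicited 10th/50th/90th percentiles (exact only if the
    # elicited quantiles are consistent with a lognormal shape).
    mu = np.log(p50)
    sigma = (np.log(p90) - np.log(p10)) / (2 * Z90)
    return mu, sigma

# Hypothetical experts; equal-weight linear opinion pool via mixture sampling.
experts = [(30.0, 55.0, 90.0), (40.0, 60.0, 75.0), (20.0, 50.0, 110.0)]
samples = np.concatenate([
    rng.lognormal(*fit_lognormal(*e), size=5_000) for e in experts
])
print(np.percentile(samples, [10, 50, 90]).round(1))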
Article
A considerable amount of research has been conducted on the performance of project delivery methods and applications in project development within the transportation industry. However, decision makers are still seeking a trusted decision model that considers all options and impacts of risk needed for successful project delivery. This study developed a comprehensive multicriteria decision making (MCDM) model capable of evaluating and choosing the most effective project assessment tool for measuring the success of project delivery performance and outcomes. A case study approach was adopted to explore and assess the assessment tools and innovative processes that are used in the development phase within transportation project life cycle phases. A scenario analysis technique was applied, and 62 subject-matter experts from the transportation industry were interviewed. The data they provided were quantified, and they validated the study results. The results ranked the VE-RACRDAM as the most important project assessment alternative. This study significantly contributes to project management knowledge, decision modeling, project development, and delivery success.
Chapter
This chapter presents a methodological approach to measure an organization's technology transfer capabilities. The integrated approach is a combination of action research in the first phase and hierarchical decision modeling (HDM) in the second phase and, rather than focusing on assessing a single technology or project/program, focuses on assessing the organization as a whole; i.e., the model brings insights into how ready the organization is to successfully transfer technologies from the research stage into an operational stage. The following sections provide a detailed explanation of action research as a research approach and of HDM as a decision-making method, as well as a presentation of the assessment framework with the necessary steps to build the model and to apply it.
Article
The formulation and use of scenarios is now a fundamental part of national and global efforts to assess and plan for climate change. While scenario development initially focused on the technical dimensions of energy, emissions and climate response, in recent years parallel sets of shared socio-economic pathways have been developed to portray the values, motivations, and sociopolitical and institutional dimensions of these systems. However, integrating the technical and social aspects of evolving energy systems is difficult, with transitions dependent on highly uncertain technological advances, social preferences, political governance, climate urgency, and the interaction of these elements to maintain or overcome systemic inertia. A broad range of interdisciplinary knowledge is needed to structure and evaluate these processes, many of which involve a mix of qualitative and quantitative factors. To structure and facilitate the necessary linkages, this paper presents an approach for generating a plausible range of scenarios for an emerging energy technology. The method considers influences among technical and social factors that can encourage or impede necessary improvements in the performance and cost of the technology, as well as the processes affecting public acceptance and the establishment of governance structures necessary to support effective planning and implementation. A Bayesian network is used to capture relationships among the technological and socioeconomic factors likely to affect the probability that the technology will achieve significant penetration and adoption. The method is demonstrated for carbon capture and storage (CCS): a potential technology on the pathway to deep decarbonization. A preliminary set of expert elicitations is conducted to illustrate how relationships between these factors can be estimated. This establishes a prior or baseline network that can be subsequently analyzed by choosing either optimistic or pessimistic assumptions for respective groups of technical and social variables, identifying sets of key factors that limit or encourage successful deployment.
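A minimal numerical sketch of the kind of Bayesian-network calculation described above, marginalizing a placeholder conditional probability table over invented technical and social parent factors.

# Invented marginal probabilities for two parent factors.
p_tech_progress = 0.6        # P(technical factors turn out favourable)
p_social_accept = 0.5        # P(public acceptance and governance are supportive)

# Placeholder CPT: P(significant CCS deployment | technical, social).
cpt = {(1, 1): 0.7, (1, 0): 0.25, (0, 1): 0.15, (0, 0): 0.02}

# Marginal probability of significant deployment, assuming independent parents.
p_deploy = sum(
    cpt[(t, s)]
    * (p_tech_progress if t else 1 - p_tech_progress)
    * (p_social_accept if s else 1 - p_social_accept)
    for t in (0, 1) for s in (0, 1)
)
print(round(p_deploy, 3))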
Article
Expert elicitation is a structured approach for obtaining judgments from experts about items of interest to decision makers. This method has been increasingly applied in the energy domain to collect information on the future cost, technical performance, and associated uncertainty of specific energy technologies. This article has two main objectives: (1) to introduce the basics of expert elicitations, including their design and implementation, highlighting their advantages and disadvantages and their potential to inform policymaking and energy system decisions; and (2) to discuss and compare the results of a subset of the most recent expert elicitations on energy technologies, with a focus on future cost trajectories and implied cost reduction rates. We argue that the data on future energy costs provided by expert elicitations allows for more transparent and robust analyses that incorporate technical uncertainty, which can then be used to support the design and assessment of energy and climate change mitigation policies.
Article
Public energy research and development (R&D) is recognized as a key policy tool for transforming the world’s energy system in a cost-effective way. However, managing the uncertainty surrounding technological change is a critical challenge for designing robust and cost-effective energy policies. The design of such policies is particularly important if countries are going to both meet the ambitious greenhouse-gas emissions reductions goals set by the Paris Agreement and achieve the required harmonization with the broader set of objectives dictated by the Sustainable Development Goals. The complexity of informing energy technology policy requires, and is producing, a growing collaboration between different academic disciplines and practitioners. Three analytical components have emerged to support the integration of technological uncertainty into energy policy: expert elicitations, integrated assessment models, and decision frameworks. Here we review efforts to incorporate all three approaches to facilitate public energy R&D decision-making under uncertainty. We highlight emerging insights that are robust across elicitations, models, and frameworks, relating to the allocation of public R&D investments, and identify gaps and challenges that remain. Correction online 22 November 2017
Article
Full-text available
Mitigating climate change will require innovation in energy technologies. Policy makers are faced with the question of how to promote this innovation, and whether to focus on a few technologies or to spread their bets. We present results on the extent to which public R&D might shape the future cost of energy technologies by 2030. We bring together three major expert elicitation efforts carried out by researchers at UMass Amherst, Harvard, and FEEM, covering nuclear, solar, Carbon Capture and Storage (CCS), bioelectricity, and biofuels. The results show experts believe that there will be cost reductions resulting from R&D, with reported median cost reductions around 20% for most of the technologies at the R&D budgets considered. Although the improvements associated with solar and CCS R&D show some promise, the lack of consensus across studies and the larger magnitude of the R&D investment involved in these technologies call for caution when defining which technologies would benefit the most from additional public R&D. In order to make R&D funding decisions to meet particular goals, such as mitigating climate change or improving energy security, or to estimate the social returns to R&D, policy makers need to combine the information provided in this study on cost reduction potentials with an analysis of the macroeconomic implications of these technological changes. We conclude with recommendations for future directions on energy expert elicitations.
Article
Full-text available
This responds to an “evaluation” of the classical model for structured expert judgment by Bolger and Rowe in this issue. This response references extensive expert judgment performance data in the public domain which played no role in their evaluation.
Article
Full-text available
This article presents the synthesis of results from the Stanford Energy Modeling Forum Study 27, an inter-comparison of 18 energy-economy and integrated assessment models. The study investigated the importance of individual mitigation options such as energy intensity improvements, carbon capture and storage (CCS), nuclear power, solar and wind power and bioenergy for climate mitigation. Limiting the atmospheric greenhouse gas concentration to 450 or 550 ppm CO2 equivalent by 2100 would require a decarbonization of the global energy system in the 21st century. Robust characteristics of the energy transformation are increased energy intensity improvements and the electrification of energy end use coupled with a fast decarbonization of the electricity sector. Non-electric energy end use is hardest to decarbonize, particularly in the transport sector. Technology is a key element of climate mitigation. Versatile technologies such as CCS and bioenergy are found to be most important, due in part to their combined ability to produce negative emissions. The importance of individual low-carbon electricity technologies is more limited due to the many alternatives in the sector. The scale of the energy transformation is larger for the 450 ppm than for the 550 ppm CO2e target. As a result, the achievability and the costs of the 450 ppm target are more sensitive to variations in technology availability.
Article
Full-text available
This paper examines some of the science communication challenges involved when designing and conducting public deliberation processes on issues of national importance. We take as our illustrative case study a recent research project investigating public values and attitudes toward future energy system change for the United Kingdom. National-level issues such as this are often particularly difficult to engage the public with because of their inherent complexity, derived from multiple interconnected elements and policy frames, extended scales of analysis, and different manifestations of uncertainty. With reference to the energy system project, we discuss ways of meeting a series of science communication challenges arising when engaging the public with national topics, including the need to articulate systems thinking and problem scale, to provide balanced information and policy framings in ways that open up spaces for reflection and deliberation, and the need for varied methods of facilitation and data synthesis that permit access to participants' broader values. Although resource intensive, national-level deliberation is possible and can produce useful insights both for participants and for science policy.
Article
Full-text available
Residential photovoltaic (PV) systems were twice as expensive in the United States as in Germany (median of $5.29/W vs. $2.59/W) in 2012. This price discrepancy stems primarily from differences in non-hardware or "soft" costs between the two countries, which can only in part be explained by differences in cumulative market size and associated learning. A survey of German PV installers was deployed to collect granular data on PV soft costs in Germany, and the results are compared to those of a similar survey of U.S. PV installers. Non-module hardware costs and all analyzed soft costs are lower in Germany, especially for customer acquisition, installation labor, and profit/overhead costs, but also for expenses related to permitting, interconnection, and inspection procedures. Additional costs occur in the United States due to state and local sales taxes, smaller average system sizes, and longer project-development times. To reduce the identified additional costs of residential PV systems, the United States could introduce policies that enable a robust and lasting market while minimizing market fragmentation. Regularly declining incentives offering a transparent and certain value proposition—combined with simple interconnection, permitting, and inspection requirements—might help accelerate PV cost reductions in the United States.
Article
Full-text available
Characterization of the anticipated performance of energy technologies to inform policy decisions increasingly relies on expert elicitation. Knowledge about how elicitation design factors impact the probabilistic estimates emerging from these studies is, however, scarce. We focus on nuclear power, a large-scale low-carbon power option, for which future cost estimates are important for the design of energy policies and climate change mitigation efforts. We use data from three elicitations in the USA and in Europe and assess the role of government research, development, and demonstration (RD&D) investments on expected nuclear costs in 2030. We show that controlling for expert, technology, and design characteristics increases experts' implied public RD&D elasticity of expected costs by 25%. Public sector and industry experts' cost expectations are 14% and 32% higher, respectively, than those of academics. US experts are more optimistic than their EU counterparts, with median expected costs 22% lower. On average, a doubling of public RD&D is expected to result in an 8% cost reduction, but the uncertainty is large. The difference between the 90th and 10th percentile estimates is on average 58% of the experts' median estimates. Public RD&D investments do not affect uncertainty ranges, but US experts are less confident about costs than Europeans.
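The implied elasticity reported here can be recovered from two conditional cost estimates, as in the short sketch below; the cost values are invented and chosen only to echo the roughly 8% reduction per doubling mentioned in the abstract.

import numpy as np

rd_levels = np.array([1.0, 2.0])            # relative public RD&D: baseline and doubled
expected_cost = np.array([100.0, 92.0])     # an expert's expected 2030 cost at each level

# Arc elasticity of expected cost with respect to RD&D (log-log slope).
elasticity = np.log(expected_cost[1] / expected_cost[0]) / np.log(rd_levels[1] / rd_levels[0])
print(round(elasticity, 3))                 # about -0.12, i.e., ~8% lower cost per doubling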
Article
Full-text available
Resilient infrastructure systems are essential for cities to withstand and rapidly recover from natural and human-induced disasters, yet electric power, transportation, and other infrastructures are highly vulnerable and interdependent. New approaches for characterizing the resilience of sets of infrastructure systems are urgently needed, at community and regional scales. This article develops a practical approach for analysts to characterize a community's infrastructure vulnerability and resilience in disasters. It addresses key challenges of incomplete incentives, partial information, and few opportunities for learning. The approach is demonstrated for Metro Vancouver, Canada, in the context of earthquake and flood risk. The methodological approach is practical and focuses on potential disruptions to infrastructure services. In spirit, it resembles probability elicitation with multiple experts; however, it elicits disruption and recovery over time, rather than uncertainties regarding system function at a given point in time. It develops information on regional infrastructure risk and engages infrastructure organizations in the process. Information sharing, iteration, and learning among the participants provide the basis for more informed estimates of infrastructure system robustness and recovery that incorporate the potential for interdependent failures after an extreme event. Results demonstrate the vital importance of cross-sectoral communication to develop shared understanding of regional infrastructure disruption in disasters. For Vancouver, specific results indicate that in a hypothetical M7.3 earthquake, virtually all infrastructures would suffer severe disruption of service in the immediate aftermath, with many experiencing moderate disruption two weeks afterward. Electric power, land transportation, and telecommunications are identified as core infrastructure sectors.
Article
Full-text available
Assessing the uncertainty due to possible systematic errors in a physical measurement unavoidably involves an element of subjective judgment. Examination of historical measurements and recommended values for the fundamental physical constants shows that the reported uncertainties have a consistent bias towards underestimating the actual errors. These findings are comparable to findings of persistent overconfidence in psychological research on the assessment of subjective probability distributions. Awareness of these biases could help in interpreting the precision of measurements, as well as provide a basis for improving the assessment of uncertainty in measurements.
Article
Full-text available
Analysts and decision makers frequently want estimates of the cost of technologies that have yet to be developed or deployed. Small modular reactors (SMRs), which could become part of a portfolio of carbon-free energy sources, are one such technology. Existing estimates of likely SMR costs rely on problematic top-down approaches or bottom-up assessments that are proprietary. When done properly, expert elicitations can complement these approaches. We developed detailed technical descriptions of two SMR designs and then conducted elicitation interviews in which we obtained probabilistic judgments from 16 experts who are involved in, or have access to, engineering-economic assessments of SMR projects. Here, we report estimates of the overnight cost and construction duration for five reactor-deployment scenarios that involve a large reactor and two light water SMRs. Consistent with the uncertainty introduced by past cost overruns and construction delays, median estimates of the cost of new large plants vary by more than a factor of 2.5. Expert judgments about likely SMR costs display an even wider range. Median estimates range from $4,000 to $16,300/kWe for a 45 megawatts-electric (MWe) SMR and from $3,200 to $7,100/kWe for a 225-MWe SMR. Sources of disagreement are highlighted, exposing the thought processes of experts involved with SMR design. There was consensus that SMRs could be built and brought online about 2 y faster than large reactors. Experts identify more affordable unit cost, factory fabrication, and shorter construction schedules as factors that may make light water SMRs economically viable.
Article
Full-text available
There are at least three motivations for government intervention in GHG mitigation: (1) inducing the private sector to reduce GHG emissions directly by setting a price on emissions, (2) increasing the amount of innovative activity in GHG mitigation technology development, and (3) educating the public regarding GHG-reducing investment opportunities, allowing consumers to make better private decisions. This paper discusses the pros and cons of policy instruments that might be used to respond to these motivations and makes recommendations for an appropriate mix of policy instruments over time, given both economic and political/institutional considerations.
Article
As scientific and observational evidence on global warming piles up every day, questions of economic policy in this central environmental topic have taken center stage. But as author and prominent Yale economist William Nordhaus observes, the issues involved in understanding global warming and slowing its harmful effects are complex and cross disciplinary boundaries. For example, ecologists see global warming as a threat to ecosystems, utilities as a debit to their balance sheets, and farmers as a hazard to their livelihoods. In this important work, William Nordhaus integrates the entire spectrum of economic and scientific research to weigh the costs of reducing emissions against the benefits of reducing the long-run damages from global warming. The book offers one of the most extensive analyses of the economic and environmental dynamics of greenhouse-gas emissions and climate change and provides the tools to evaluate alternative approaches to slowing global warming. The author emphasizes the need to establish effective mechanisms, such as carbon taxes, to harness markets and harmonize the efforts of different countries. This book not only will shape discussion of one of the world's most pressing problems but will provide the rationales and methods for achieving widespread agreement on our next best move in alleviating global warming.
Article
Many decisions are based on beliefs concerning the likelihood of uncertain events such as the outcome of an election, the guilt of a defendant, or the future value of the dollar. Occasionally, beliefs concerning uncertain events are expressed in numerical form as odds or subjective probabilities. The subjective assessment of probability resembles the subjective assessment of physical quantities such as distance or size: these judgments are all based on data of limited validity, which are processed according to heuristic rules. For example, the apparent distance of an object is determined in part by its clarity, and reliance on this rule leads to systematic errors in the estimation of distance. This chapter describes three heuristics that are employed in making judgments under uncertainty. The first is representativeness, which is usually employed when people are asked to judge the probability that an object or event belongs to a class or process; the second is the availability of instances or scenarios, which is often employed when people are asked to assess the frequency of a class or the plausibility of a particular development; and the third is adjustment from an anchor, which is usually employed in numerical prediction when a relevant value is available. In general, these heuristics are quite useful, but sometimes they lead to severe and systematic errors.
Article
Solar photovoltaic (PV) system prices in the United States display considerable heterogeneity both across geographic locations and within a given location. Such heterogeneity may arise due to state and federal policies, differences in market structure, and other factors that influence demand and costs. This paper examines the relative importance of such factors on equilibrium solar PV system prices in the United States using a detailed dataset of roughly 100,000 recent residential and small commercial installations. As expected, we find that PV system prices differ based on characteristics of the systems. More interestingly, we find evidence suggesting that search costs and imperfect competition affect solar PV pricing. Installer density substantially lowers prices, while regions with relatively generous financial incentives for solar PV are associated with higher prices.
Article
Research indicates that uncertainty in science news stories affects public assessment of risk and uncertainty. However, the form in which uncertainty is presented may also affect people's risk and uncertainty assessments. For example, a news story that features an expert discussing both what is known and what is unknown about a topic may convey a different form of scientific uncertainty than a story that features two experts who hold conflicting opinions about the status of scientific knowledge of the topic, even when both stories contain the same information about knowledge and its boundaries. This study focuses on audience uncertainty and risk perceptions regarding the emerging science of nanotechnology by manipulating whether uncertainty in a news story about potential risks is attributed to expert sources in the form of caveats (individual uncertainty) or conflicting viewpoints (collective uncertainty). Results suggest that the type of uncertainty portrayed does not impact audience feelings of uncertainty or risk perceptions directly. Rather, the presentation of the story influences risk perceptions only among those who are highly deferent to scientific authority. Implications for risk communication theory and practice are discussed. © 2015 Society for Risk Analysis.
Article
Behavioral decision research has demonstrated that judgments and decisions of ordinary people and experts are subject to numerous biases. Decision and risk analysis were designed to improve judgments and decisions and to overcome many of these biases. However, when eliciting model components and parameters from decisionmakers or experts, analysts often face the very biases they are trying to help overcome. When these inputs are biased they can seriously reduce the quality of the model and resulting analysis. Some of these biases are due to faulty cognitive processes; some are due to motivations for preferred analysis outcomes. This article identifies the cognitive and motivational biases that are relevant for decision and risk analysis because they can distort analysis inputs and are difficult to correct. We also review and provide guidance about the existing debiasing techniques to overcome these biases. In addition, we describe some biases that are less relevant because they can be corrected by using logic or decomposing the elicitation task. We conclude the article with an agenda for future research. © 2015 Society for Risk Analysis.
Article
Expert elicitations of future energy technology costs can improve energy policy design by explicitly characterizing uncertainty. However, the recent proliferation of expert elicitation studies raises questions about the reliability and comparability of the results. In this paper, we standardize disparate expert elicitation data from five EU and US studies, involving 65 experts, of the future costs of photovoltaics (PV) and evaluate the impact of expert and study characteristics on the elicited metrics. The results for PV suggest that in-person elicitations are associated with more optimistic 2030 PV cost estimates and in some models with a larger range of uncertainty than online elicitations. Unlike in previous results on nuclear power, expert affiliation type and nationality do not affect central estimates. Some specifications suggest that EU experts are more optimistic about breakthroughs, but they are also less confident in that they provide larger ranges of estimates than do US experts. Higher R&D investment is associated with lower future costs. Rather than increasing confidence, high R&D increases uncertainty about future costs, mainly because it improves the base case (low cost) outcomes more than it improves the worst case (high cost) outcomes.
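The kind of cross-study standardization described here ultimately feeds a regression of elicited metrics on design and expert characteristics. Below is a minimal sketch of one plausible specification using statsmodels with invented data; the variable names, coding, and data are assumptions for illustration, not the authors' actual model or dataset.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical standardized elicitation data: one row per expert, with a
# normalized elicited 2030 cost and expert/study characteristics (all invented).
df = pd.DataFrame({
    "median_cost_2030": [0.8, 1.1, 0.9, 1.3, 0.7, 1.0, 1.2, 0.85],
    "in_person":        [1, 0, 1, 0, 1, 0, 0, 1],          # elicitation mode
    "eu_expert":        [1, 1, 0, 0, 1, 0, 1, 0],          # residence / nationality
    "academic":         [1, 0, 0, 1, 1, 0, 0, 1],          # affiliation
    "rd_scenario":      [1.0, 2.0, 1.0, 2.0, 0.5, 1.0, 2.0, 0.5],  # R&D multiplier
})

# OLS of the elicited central estimate on design and expert characteristics.
model = smf.ols(
    "median_cost_2030 ~ in_person + eu_expert + academic + rd_scenario", data=df
).fit()
print(model.summary())
```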
Article
In the present paper we take the output of multiple expert elicitation surveys on the future cost of key low-carbon technologies and use it as input to three Integrated Assessment models: GCAM, MARKAL_US, and WITCH. By means of a large set of simulations we aim to assess the implications of these subjective distributions of technological costs for key model outputs. We are able to detect which sources of technology uncertainty are more influential, how this differs across models, and whether and how results are affected by the time horizon, the metric considered, or the stringency of the climate policy. In unconstrained emission scenarios, within the range of future technology performances considered in the present analysis, the cost of nuclear energy is shown to dominate all others in affecting future emissions. Climate-constrained scenarios stress the relevance, in addition to nuclear energy, of biofuels, which represent the main source of decarbonization of the transportation sector, and of bioenergy, since the latter can be coupled with Carbon Capture and Storage (CCS) to produce negative emissions.
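Feeding elicited judgments into integrated assessment models requires turning a handful of elicited percentiles into full input distributions that can be sampled across model runs. A minimal sketch of one common approach is given below; the lognormal fit and the percentile values are illustrative assumptions, not the study's actual inputs or method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Elicited 10th/50th/90th percentiles of a 2030 technology cost ($/kW);
# illustrative values only.
p10, p50, p90 = 2500.0, 4000.0, 7000.0

# Fit a lognormal that matches the median and the 10-90 spread (one simple
# choice among several ways to turn three quantiles into a full distribution).
z90 = 1.2816  # standard-normal 90th percentile
mu = np.log(p50)
sigma = (np.log(p90) - np.log(p10)) / (2 * z90)

# Draw cost samples that could feed repeated integrated-assessment model runs.
cost_samples = rng.lognormal(mean=mu, sigma=sigma, size=1000)
print(f"sampled median ~ {np.median(cost_samples):.0f}, "
      f"10th-90th ~ {np.percentile(cost_samples, 10):.0f}-"
      f"{np.percentile(cost_samples, 90):.0f}")
```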
Article
In this book we describe how to elicit and analyze expert judgment. Expert judgment is defined here to include both the experts' answers to technical questions and their mental processes in reaching an answer. It refers specifically to data that are obtained in a deliberate, structured manner that makes use of the body of research on human cognition and communication. Our aim is to provide a guide for lay persons in expert judgment. These persons may be from physical and engineering sciences, mathematics and statistics, business, or the military. We provide background on the uses of expert judgment and on the processes by which humans solve problems, including those that lead to bias. Detailed guidance is offered on how to elicit expert judgment ranging from selecting the questions to be posed of the experts to selecting and motivating the experts to setting up for and conducting the elicitation. Analysis procedures are introduced and guidance is given on how to understand the data base structure, detect bias and correlation, form models, and aggregate the expert judgments.
Article
Good policy making should be based on available scientific knowledge. Sometimes this knowledge is well established through research, but often scientists must simply express their judgment, and this is particularly so in risk scenarios that are characterized by high levels of uncertainty. Usually in such cases, the opinions of several experts will be sought in order to pool knowledge and reduce error, raising the question of whether individual expert judgments should be given different weights. We argue—against the commonly advocated “classical method”—that no significant benefits are likely to accrue from unequal weighting in mathematical aggregation. Our argument hinges on the difficulty of constructing reliable and valid measures of substantive expertise upon which to base weights. Practical problems associated with attempts to evaluate experts are also addressed. While our discussion focuses on one specific weighting scheme that is currently gaining in popularity for expert knowledge elicitation, our general thesis applies to externally imposed unequal weighting schemes more generally.
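The equal-versus-unequal weighting debate can be made concrete with a linear opinion pool. Here is a minimal sketch comparing equal weights with performance-based weights of the kind the classical method would assign; all probabilities and weights below are invented for illustration, not derived from real seed questions.

```python
import numpy as np

# Three experts' subjective probabilities over the same discrete outcome bins
# (e.g., cost falls in "low", "medium", "high"); invented numbers.
expert_probs = np.array([
    [0.2, 0.5, 0.3],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
])

equal_weights = np.full(3, 1 / 3)
# Placeholder performance-based weights of the kind a calibration exercise
# on seed questions might produce.
performance_weights = np.array([0.6, 0.3, 0.1])

# Linear opinion pool: weighted average of the experts' probability vectors.
pooled_equal = equal_weights @ expert_probs
pooled_perf = performance_weights @ expert_probs
print("equal-weight pool:      ", pooled_equal)
print("performance-weight pool:", pooled_perf)
```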
Article
The elicitation of scientific and technical judgments from experts, in the form of subjective probability distributions, can be a valuable addition to other forms of evidence in support of public policy decision making. This paper explores when it is sensible to perform such elicitation and how that can best be done. A number of key issues are discussed, including topics on which there are, and are not, experts who have knowledge that provides a basis for making informed predictive judgments; the inadequacy of only using qualitative uncertainty language; the role of cognitive heuristics and of overconfidence; the choice of experts; the development, refinement, and iterative testing of elicitation protocols that are designed to help experts to consider systematically all relevant knowledge when they make their judgments; the treatment of uncertainty about model functional form; diversity of expert opinion; and when it does or does not make sense to combine judgments from different experts. Although it may be tempting to view expert elicitation as a low-cost, low-effort alternative to conducting serious research and analysis, it is neither. Rather, expert elicitation should build on and use the best available research and analysis and be undertaken only when, given those, the state of knowledge will remain insufficient to support timely informed assessment and decision making.
Article
Energy scenarios suggest that CO2 capture and storage (CCS) from power plants might contribute significantly to global greenhouse gas emission reduction. Since CCS from power generation is an emerging technology that has not been demonstrated on a commercial scale, related cost and performance information is still uncertain. This paper presents a detailed analysis of the impact of adding CO2 capture and compression process equipment to fossil-fuelled power plants. For coal-fired power generation, no single capture technology outperforms available alternative capture processes in terms of cost and performance.
Article
Probabilistic estimates of the cost and performance of future nuclear energy systems under different scenarios of government research, development, and demonstration (RD&D) spending were obtained from 30 U.S. and 30 European nuclear technology experts. We used a novel elicitation approach which combined individual and group elicitation. With no change from current RD&D funding levels, experts on average expected current (Gen. III/III+) designs to be somewhat more expensive in 2030 than they were in 2010, and they expected the next generation of designs (Gen. IV) to be more expensive still as of 2030. Projected costs of proposed small modular reactors (SMRs) were similar to those of Gen. IV systems. The experts almost unanimously recommended large increases in government support for nuclear RD&D (generally 2-3 times current spending). The majority expected that such RD&D would have only a modest effect on cost, but would improve performance in other areas, such as safety, waste management, and uranium resource utilization. The U.S. and E.U. experts were in relative agreement regarding how government RD&D funds should be allocated, placing particular focus on very high temperature reactors, sodium-cooled fast reactors, fuels and materials, and fuel cycle technologies.
Article
Recently, several authors have presented interesting contributions on how to meet deep or severe uncertainties in a risk analysis setting. In this article, we provide some reflections on some of the foundational pillars that this work is based on, including the meaning of concepts such as deep uncertainty, known probabilities, and correct models, the aim being to contribute to a strengthening of the scientific platform of the work, as well as providing new insights on how to best implement management policies meeting these uncertainties. We also provide perspectives on the boundaries and limitations of analytical approaches for supporting decision making in cases of deep uncertainties. A main conclusion of the article is that deep uncertainties call for managerial review and judgment that sees beyond the analytical frameworks studied in risk assessment and risk management contexts, including those now often suggested to be used, such as robust optimization techniques. This managerial review and judgment should be seen as a basic element of the risk management.
Article
This article reviews the concept of an energy technology innovation system (ETIS). The ETIS is a systemic perspective on innovation comprising all aspects of energy transformations (supply and demand); all stages of the technology development cycle; and all the major innovation processes, feedbacks, actors, institutions, and networks. We use it as an analytical framework to describe key features and drivers of energy innovation. A global snapshot of the ETIS is provided using investments as the main indicator. Rationales for government policy in energy innovation are discussed, and policy design guidelines for an effectively functioning ETIS are presented. The proposed guidelines are based on a survey of the literature and empirical case studies; they diverge substantially from policies implied by partial perspectives on innovation. Key research, information, and data needs are also identified.
Book
The authors explain the ways in which uncertainty is an important factor in the problems of risk and policy analysis. This book outlines the source and nature of uncertainty, discusses techniques for obtaining and using expert judgment, and reviews a variety of simple and advanced methods for analyzing uncertainty.
Article
The purpose of this book is to help people make better decisions. Written in a clear and non-technical way it deals with the basis of intuitive judgement, demonstrates the limitations on the human ability to make judgements, and suggests the means of overcoming potential shortcomings. At the same time it stresses the importance of learning the limits to one's judgemental ability. (PsycINFO Database Record (c) 2012 APA, all rights reserved)
Article
The mental models approach, a leading strategy to develop risk communications, involves a time- and labor-intensive interview process and a lengthy questionnaire to elicit group-level risk perceptions. We propose that a similarity ratings approach for structural knowledge elicitation can be adopted to assist the risk mental models approach. The LinkIT game, inspired by games with a purpose (GWAP) technology, is a ludic elicitation tool designed to elicit group understanding of the relations between risk factors in a more enjoyable and productive manner when compared to traditional approaches. That is, consistent with the idea of ludic elicitation, LinkIT was designed to make the elicitation process fun and enjoyable in the hopes of increasing participation and data quality in risk studies. Like the mental models approach, the group mental model obtained via the LinkIT game can hence be generated and represented in the form of influence diagrams. In order to examine the external validity of LinkIT, we conducted a study to compare its performance with respect to a more conventional questionnaire-driven approach. Data analysis indicates that the two group mental models elicited via the two approaches are similar to an extent. Yet, LinkIT was more productive and enjoyable than the questionnaire. However, participants commented that the current game has some usability concerns. This presentation summarizes the design and evaluation of the LinkIT game and suggests areas for future work.
Book
This is an extensive survey and critical examination of the literature on the use of expert opinion in scientific inquiry and policy making. Cooke considers how expert opinion is being used today, how an expert’s uncertainty is or should be represented, how people do or should reason with uncertainty, how the quality and usefulness of expert opinion can be assessed, and how the views of several experts might be combined. He argues for the importance of developing practical models with a transparent mathematical foundation for the use of expert opinion in science, and presents three tested models. Detailed case studies illustrate how they can be applied to a diversity of real problems in engineering and planning.
Article
The relationship between R&D investments and technical change is inherently uncertain. In this paper we combine economics and decision analysis to incorporate the uncertainty of technical change into climate change policy analysis. We present the results of an expert elicitation on the prospects for technical change in nuclear power. We then use the results of the expert elicitations as inputs to the MiniCAM integrated assessment model, to derive probabilistic information about the impacts of R&D investments on the costs of emissions abatement. We find that nuclear R&D appears to be a risk-complement with R&D into Carbon Capture, providing large benefits at lower levels of abatement; and that investments in improving Light Water Reactors have the greatest expected return.
Article
Risk assessors attempting to use probabilistic approaches to describe uncertainty often find themselves in a data-sparse situation: available data are only partially relevant to the parameter of interest, so one needs to adjust empirical distributions, use explicit judgmental distributions, or collect new data. In determining whether or not to collect additional data, whether by measurement or by elicitation of experts, it is useful to consider the expected value of the additional information. The expected value of information depends on the prior distribution used to represent current information; if the prior distribution is too narrow, in many risk-analytic cases the calculated expected value of information will be biased downward. The well-documented tendency toward overconfidence, including the neglect of potential surprise, suggests this bias may be substantial. We examine the expected value of information, including the role of surprise, test for bias in estimating the expected value of information, and suggest procedures to guard against overconfidence and underestimation of the expected value of information when developing prior distributions and when combining distributions obtained from multiple experts. The methods are illustrated with applications to potential carcinogens in food, commercial energy demand, and global climate change.
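The core quantity here, the expected value of (perfect) information, can be illustrated with a two-state, two-action decision problem; notice how an overly narrow prior shrinks the computed value, which is the downward bias the abstract describes. The losses and priors below are invented for illustration and are not taken from the paper's applications.

```python
import numpy as np

# Two possible states (e.g., "contaminant is potent" / "is not") and two
# actions (regulate / do nothing), with invented losses.
loss = np.array([
    # regulate, do nothing
    [10.0, 100.0],   # potent
    [10.0,   0.0],   # not potent
])

def evpi(prior):
    """Expected value of perfect information under a prior over states."""
    expected_loss = prior @ loss                 # expected loss of each action
    best_without_info = expected_loss.min()     # act now, using only the prior
    best_with_info = prior @ loss.min(axis=1)   # learn the state, then act
    return best_without_info - best_with_info

print(evpi(np.array([0.3, 0.7])))    # broad prior: information looks valuable (7.0)
print(evpi(np.array([0.02, 0.98])))  # overconfident prior: EVPI looks small (1.8)
```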
Article
The relationship between R&D investments and technical change is inherently uncertain. In this paper we combine economics and decision analysis to incorporate the uncertainty of technical change into climate change policy analysis. We present the results of an expert elicitation on the prospects for technical change in carbon capture and storage. We find a significant amount of disagreement between experts, even over the most mature technology; and this disagreement is most pronounced in regards to cost estimates. We then use the results of the expert elicitations as inputs to the MiniCAM integrated assessment model, to derive probabilistic information about the impacts of R&D investments on the costs of emissions abatement. We conclude that we need to gather more information about the technical and societal potential for Carbon Storage; cost differences among the different capture technologies play a relatively smaller role.
Article
The monetized value of avoided premature mortality typically dominates the calculated benefits of air pollution regulations; therefore, characterization of the uncertainty surrounding these estimates is key to good policymaking. Formal expert judgment elicitation methods are one means of characterizing this uncertainty. They have been applied to characterize uncertainty in the mortality concentration-response function, but have yet to be used to characterize uncertainty in the economic values placed on avoided mortality. We report the findings of a pilot expert judgment study for Health Canada designed to elicit quantitative probabilistic judgments of uncertainties in Value-per-Statistical-Life (VSL) estimates for use in an air pollution context. The two-stage elicitation addressed uncertainties in both a base case VSL for a reduction in mortality risk from traumatic accidents and in benefits transfer-related adjustments to the base case for an air quality application (e.g., adjustments for age, income, and health status). Results for each expert were integrated to develop example quantitative probabilistic uncertainty distributions for VSL that could be incorporated into air quality models.
Article
We develop and apply a judgment-based approach to selecting robust alternatives, which are defined here as reasonably likely to achieve objectives, over a range of uncertainties. The intent is to develop an approach that is more practical in terms of data and analysis requirements than current approaches, informed by the literature and experience with probability elicitation and judgmental forecasting. The context involves decisions about managing forest lands that have been severely affected by mountain pine beetles in British Columbia, a pest infestation that is climate-exacerbated. A forest management decision was developed as the basis for the context, objectives, and alternatives for land management actions, to frame and condition the judgments. A wide range of climate forecasts, taken to represent the 10-90% levels on cumulative distributions for future climate, were developed to condition judgments. An elicitation instrument was developed, tested, and revised to serve as the basis for eliciting probabilistic three-point distributions regarding the performance of selected alternatives, over a set of relevant objectives, in the short and long term. The elicitations were conducted in a workshop comprising 14 regional forest management specialists. We employed the concept of stochastic dominance to help identify robust alternatives. We used extensive sensitivity analysis to explore the patterns in the judgments, and also considered the preferred alternatives for each individual expert. The results show that two alternatives that are more flexible than the current policies are judged more likely to perform better than the current alternatives on average in terms of stochastic dominance. The results suggest judgmental approaches to robust decision making deserve greater attention and testing.
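The stochastic dominance screening described here can be illustrated with elicited three-point distributions. Below is a minimal sketch of a first-order stochastic dominance check; the alternatives, outcome grid, and probabilities are invented for illustration rather than drawn from the workshop judgments.

```python
import numpy as np

# Elicited three-point distributions of a performance score (higher is better)
# for two management alternatives; probabilities over (low, mid, high) outcomes.
# The outcome grid is ordered low to high so cumulative sums give comparable CDFs.
outcomes = np.array([1.0, 2.0, 3.0])
alt_a = np.array([0.2, 0.5, 0.3])   # invented judgments
alt_b = np.array([0.4, 0.4, 0.2])

def dominates_fosd(p, q):
    """True if p first-order stochastically dominates q: p's CDF lies at or
    below q's everywhere, with at least one strict gap."""
    cdf_p, cdf_q = np.cumsum(p), np.cumsum(q)
    return bool(np.all(cdf_p <= cdf_q) and np.any(cdf_p < cdf_q))

print("A dominates B:", dominates_fosd(alt_a, alt_b))
print("B dominates A:", dominates_fosd(alt_b, alt_a))
```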
Article
This paper contributes to the induced innovation literature by extending the analysis of supply and demand determinants of innovation in energy technologies to account for international knowledge flows and spillovers. We select a sample of 38 innovating countries and study how knowledge related to energy-efficient and environmentally friendly technologies flows across geographical and technological space. We demonstrate that higher geographical and technological distances are associated with lower probabilities of knowledge flow. Next, we use previous estimates to construct internal and external knowledge stocks for a panel of 17 countries. We then present an econometric analysis of the supply and demand determinants of innovation accounting for international knowledge spillovers. Our results confirm the role of demand-pull effects, proxied by energy prices, and of technological opportunity, proxied by the knowledge stocks. Our results show that spillovers between countries have a significant positive impact on further innovation in energy-efficient and environmentally friendly technologies.
Article
We present a reactor-by-reactor analysis of historical busbar costs for 99 nuclear reactors in the United States, and compare those costs with recent projections for next-generation US reactors. We argue that cost projections far different from median historical costs require more justification than estimates that lie close to those medians. Our analysis suggests that some recent projections of capital costs, construction duration, and total operations and maintenance costs are quite low—far enough from the historical medians that additional scrutiny may be required to justify using such estimates in current policy discussions and planning.
Article
In this paper we structure, obtain, and analyze results of an expert elicitation on the relationship between U.S. government Research & Development funding and the likelihood of achieving advances in cellulosic biofuel technologies. While there was disagreement among the experts on each of the technologies, the patterns of disagreement suggest several distinct strategies. Selective Thermal Processing appears to be the most promising path, with the main question being how much funding is required to achieve success. Thus, a staged investment in this path looks promising. With respect to gasification, there remains fundamental disagreement over whether success is possible even at higher funding levels. Thus, basic research into the viability of the path makes sense. The Hydrolysis path induced the widest range of responses from the experts, indicating there may be value in collecting more information on this technology.