Book

Simulating Nature: A Philosophical Study of Computer-Simulation Uncertainties and Their Role in Climate Science and Policy Advice

Authors:
... It is not surprising to see computer simulation as a basis of, or information resource for, political decision-making being heavily criticized: a lack of trust in models and modellers, the questionable accuracy of simulation results and the inadequacy of the computing process itself are only some points of criticism that have been raised (e.g. Hellström 1996; Petersen 2006; Brugnach et al. 2007; Ivanovic and Freer 2009; Fisher et al. 2010; Wagner et al. 2010). ...
... Deficits that have been identified highlight both aspects of the simulation tool itself and of the people dealing with it. On the side of the simulation tool, deficits refer to deficient levels of simulation quality and the lack of adequate outside communication of the role of complexity and uncertainty (Hellström 1996; Ivanovic and Freer 2009; Petersen 2006). The issue of complexity and uncertainty points to the important question of whether simulations adequately match real-world phenomena (King and Kraemer 1993; Pilkey and Pilkey-Jarvis 2007). ...
... These uncertainties have been reported and analysed extensively in the literature, e.g. Morgan and Henrion (1990) and Petersen (2006). One of these works, for instance, developed an uncertainty categorisation using the following terminology: determinism, statistical uncertainty, scenario uncertainty, recognised ignorance and total ignorance, while applying this approach to the risk estimation of brine migration resulting from CO2 injection into saline aquifers. ...
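As a purely illustrative sketch of how such an ordinal categorisation (from determinism through to total ignorance) might be encoded for a screening exercise, the Python snippet below is our own construction, with hypothetical names and example entries, and is not taken from any of the cited works:

```python
from enum import IntEnum

class UncertaintyLevel(IntEnum):
    """Ordinal scale of uncertainty levels, using the terminology listed in the excerpt
    above; numeric order reflects increasing uncertainty, not a quantitative measure."""
    DETERMINISM = 0
    STATISTICAL_UNCERTAINTY = 1
    SCENARIO_UNCERTAINTY = 2
    RECOGNISED_IGNORANCE = 3
    TOTAL_IGNORANCE = 4

def most_uncertain(levels):
    """Return the highest (most severe) uncertainty level among the assessed elements."""
    return max(levels)

# Hypothetical screening of a brine-migration risk model: each model element is tagged
# with the level that best describes what is known about it (values are invented).
assessment = {
    "injection rate": UncertaintyLevel.STATISTICAL_UNCERTAINTY,
    "caprock permeability": UncertaintyLevel.SCENARIO_UNCERTAINTY,
    "fault reactivation": UncertaintyLevel.RECOGNISED_IGNORANCE,
}
print(most_uncertain(assessment.values()).name)  # RECOGNISED_IGNORANCE
```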
Book
This book provides a broad overview of essential features of subsurface environmental modelling at the science-policy interface, offering insights into the potential challenges in the field of subsurface flow and transport, as well as the corresponding computational modelling and its impact on the area of policy- and decision-making. The book is divided into two parts: Part I presents models, methods and software at the science-policy interface. Building on this, Part II illustrates the specifications using detailed case studies of subsurface environmental modelling. It also includes a systematic research overview and discusses the anthropogenic use of the subsurface, with a particular focus on energy-related technologies, such as carbon sequestration, geothermal technologies, fluid and energy storage, nuclear waste disposal, and unconventional oil and gas recovery.
Chapter
Hydraulic fracturing—fracking, for short—refers to the stimulation of rock via the injection of fluids, typically water with additives, aiming at increasing the rock’s permeability. For hydrocarbon-bearing rocks, this facilitates the extraction and production of natural gas or oil. It is also common to denote such hydrocarbons as shale gas or shale oil when it is stored in shale with such low permeabilities that it cannot be produced with conventional technologies.
Chapter
Coding a discretised mathematical model like the ones presented in Chaps. 2 and 3 yields a computational model in the form of software. Performing a numerical simulation corresponds to running a computational model for a specific scenario.
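As a toy illustration of that distinction (not an example from the book), the sketch below codes a discretised one-dimensional diffusion model and runs it for one specific scenario; the grid size, time step, diffusion coefficient and boundary values are arbitrary choices made for this sketch:

```python
import numpy as np

def run_scenario(n_cells=50, n_steps=200, D=1e-6, dx=1.0, dt=1000.0, c_left=1.0):
    """Computational model: explicit finite-difference discretisation of 1D diffusion,
    dc/dt = D * d2c/dx2. Running it for one set of inputs is one numerical simulation."""
    # Stability check for the explicit scheme: D*dt/dx^2 must stay below 0.5.
    assert D * dt / dx**2 < 0.5, "time step too large for explicit scheme"
    c = np.zeros(n_cells)          # initial condition: no solute anywhere
    c[0] = c_left                  # boundary condition: fixed concentration at the left edge
    for _ in range(n_steps):
        c[1:-1] += D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
        c[0], c[-1] = c_left, 0.0  # re-impose boundary conditions each step
    return c

concentration = run_scenario()     # one scenario, one simulation run
print(round(float(concentration[5]), 4))
```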
Chapter
A field of subsurface environmental engineering which is likely to (re-)receive enormous societal attention is the fate of vast amounts of nuclear waste world-wide. One of the most likely options, given today’s state of knowledge, is permanent geological storage. Thus, we have another topic of the subsurface where flow and transport processes play a dominant role in risk assessments. The public opinion on nuclear energy is not easy to evaluate. However, when it comes to hosting a nuclear waste repository in the neighbourhood, local opposition has to be expected.
Chapter
Modelling activities in science are not limited to the scientific community itself, but relate to and impact other domains of society. With this chapter, we conceptually explore matters of modelling at the science-policy interface. Understanding scientific modelling as a tool and school to provide evidenced-based knowledge, there are several particularities when modelling enters the sphere of policy-making and public debate.
Chapter
The energy transition has become a key political issue in many countries. The main emphasis of responses to climate change challenges is in transforming the energy system from high to low carbon energy supply and in decoupling energy demand from economic growth. The general principles of energy policy objectives comprise the three paradigms of economic efficiency, security of energy supply and environmental compatibility.
... For example, simulation models of ecological systems can give an impression of how such systems behave and, as such, they can suggest reasons for taking political measures, although due to their complexity they cannot reliably predict the future states of these systems. 62 In order for policymakers to evaluate the results of the simulation, it is important that the modeling assumptions be transparent. However, this situation should still be considered an ideal. ...
... Those in charge of formulating scientific policies, and other actors of society, generally, ask the scientific advisors to communicate the results of the simulations but not to stop at the uncertainties resulting from them. 62 Therefore, for a more authentic image of contemporary scientific research, educational simulations should be used in class not only to collect, manipulate, manage, and present data, 57 but also to analyze data using the models generated by the simulations. It is essential that students know the general methodology of computational simulations and how to evaluate them, since many of the results of these simulations, for example in the area of climate science, are used for the elaboration of environmental public policies. ...
... Parameterizations (mathematical models within simulation models that represent the net effect of unresolved processes on resolved processes) often contain adjustable parameters whose configuration is not based on theory. 62 An example of this is atmospheric modeling. Given the large size and complexity of the atmosphere, the direct application of general theory such as the Navier-Stokes equations of fluid dynamics to the study of practical atmospheric problems is unfeasible, even with the fastest computers. ...
Article
Full-text available
Computer simulations are currently used in diverse scientific disciplines as well as in science teaching. The simulations proposed for physics teaching are designed for specific purposes and allow studying natural phenomena through exploration and/or modeling. In this work, we present an analysis that, based on the theoretical frame provided by the nature of science (NOS), considers some aspects of present discussions on computer simulations that should be taken into account to promote scientifically literate citizens. We examine different types of simulations used, specifically, in the teaching of physics for high school in the past few years. We also explore some of the current philosophical thoughts on scientific simulations and which of their aspects could become part of the teaching-learning process when using simulations in science teaching—for example, the role of the model in simulations, the relationship between simulations and experimentation, the validity of knowledge resulting from simulations, and the role of animation in relation to the simulation used. Discussing these aspects from the viewpoint of NOS would provide students with an updated vision of scientific activity so that they could analyze the information they daily receive and, as members of society, take a stand accordingly.
... Uncertainty is not equally found in all sources, or by all researchers, nor does it have an equivalent impact on the ability to produce knowledge. As Petersen wrote, "uncertainty takes many forms, whether epistemic, statistical, methodological, or sociocultural" [7]. Outside of the humanities, many models and tools have been developed, either for capturing uncertainty in data (such as Petersen's "Uncertainty Matrix" [7]) or for capturing aspects of processing that could introduce uncertainty (such as the NASA EOSDIS [8]). ...
... As Petersen wrote, "uncertainty takes many forms, whether epistemic, statistical, methodological, or sociocultural" [7]. Outside of the humanities, many models and tools have been developed, either for capturing uncertainty in data (such as Petersen's "Uncertainty Matrix" [7]) or for capturing aspects of processing that could introduce uncertainty (such as the NASA EOSDIS [8]). These would, however, be cumbersome if not impossible to adapt to and apply in the humanities, an aspect of the overall environment discussed further in Section 4 below. ...
... Historians do not have the leisure to simply undertake further data gathering, and, as a result, the W3C category of 'empirical uncertainty' would seem to collapse from their perspective into other categories in the taxonomy, such as 'incompleteness' or 'inconsistency.' Similarly, the long recognised and powerful Uncertainty Matrix developed by Petersen [7] introduces both a level of detail to its model and a conceptual framework so foreign to the work of the historian that even if some instances could be mapped to it, widespread adoption would be so resource intensive (given how central uncertainty is to the humanities) as to create more questions than answers. This is not to say that there could not be benefits inherent in such an exercise, only that asking a researcher whose discipline is not based on such models to adapt one from another discipline is like asking someone to translate into English text from a language without verbs: an already challenging exercise is rendered nearly futile by turning it from a means into an end in itself. ...
Article
Full-text available
This paper takes a high-level view of both the sources and status of uncertainty in historical research and the manners in which possible negative effects of this omnipresent characteristic might be managed and mitigated. It draws upon both the experience of a number of digital projects and research into the many-faceted concept of uncertainty in data, and in particular, it explores the conflicting strategies for the management of uncertainty in historical research processes that are reflected in the historiographical and digital humanities literature. Its intention is to support a dialogue between the humanities and computer science, able to realise the promise of digital humanities without a reversion to a new positivism in disciplines such as history and literary studies and it therefore concludes with recommendations for the developers of research tools and environments for digital history.
... The advent of computer technology allowed data to be used and developed through modelling and parameterisation to create global knowledge in ways that revolutionised how the concept of climate was interpreted and understood. Although the precise details of how these models and modelling techniques developed are beyond the scope of this book (see instead, Petersen, 2012;Edwards, 2010;Miller and Edwards, 2001), it is worth noting that the term 'modelling' refers to a number of different practices. Modelling involves the interpretation and manipulation of readings from weather instruments, the simulation of climate systems, the simulation of Earth's bio-geophysical systems more generally, and the integration of weather forecasting methods, climate simulations and socio-economic data in ways that allow climate scientists to present internally consistent simulations of future climate. ...
... These questions are asked by, or of, the climate science community and are engaged at the point of presenting and drawing conclusions about climate simulations. They require not just a range of subjective assumptions and assertions requiring epistemic value judgements (like those described for trans-science modes one and two above) as a result of the many uncertainties about the complex ways in which bio-geophysical and social-ecological systems interact (Petersen, 2012;Edwards, 2010), they also require or may otherwise be influenced by nonepistemic judgements regarding how things are (or ought to be) and how we envision the future (Hulme, 2009). ...
... Climate experts develop a body of trans-scientific evidence in the form of climate change models about the future trajectory of global and regional climates. This evidence is based on consensus expert understandings of the planet's bio-geophysical systems, the anthropogenic influences on them and a range of assertions and expert assumptions about how those influences will affect climate in the future (Petersen, 2012;Edwards, 2010). Importantly, this trans-scientific component is developed within the climate modelling community itself, with minimal input from a broader range of experts or non-expert evidence users. ...
... For example, in the IPCC government-appointed scientists, diplomats representing national governments, NGOs and business representatives interact in varying configurations. 22,23 Dual accountability: the leadership or management is simultaneously accountable to representatives of both science and politics. For example, the European Environment Agency has a Management Board to deal with political issues like salience and legitimacy, and a Scientific Board to attend to issues of scientific credibility. ...
... In IPCC, key texts like the Statement for Policymakers and the Synthesis Reports, but also methods of calculating anthropogenic carbon dioxide emissions are typical examples of boundary objects since they are the result of procedural and substantive intertwinement of scientific and political considerations. 22,23 ...
... 15 The SBSTA delimited discussions between the IPCC and the COP based on whether particular issues were considered political or value-based decisions (best dealt with by the COP) and scientific issues (best dealt with by the IPCC). Such interorganisational orchestration is not limited to the SBSTA however 23,22,45 ; for example Fogel 49 illustrates the complex mix of 'puzzling and powering' that occurs in both the SBSTA and the IPCC around issues such as defining the terms of reference of an IPCC special report (which occurred at the SBSTA), to struggles around the precise distinction between policy relevance and policy prescriptiveness (which occurred in the IPCC), to debates and struggles about the presentation and management of uncertainty (which occurred in the IPCC). ...
... Glossary definitions: climate prediction: best guess of future climate; climate projection: a conditional climate prediction; climate scenario: a plausible, not necessarily likely, evolution of the climate in the future; climate system: the highly complex system consisting of five major components (the atmosphere, the hydrosphere, the cryosphere, the lithosphere and the biosphere) and the interactions between them (IPCC 2013, Glossary); climate sensitivity: see equilibrium climate sensitivity; deep uncertainty: hard-to-quantify uncertainty; epistemic uncertainty: the incompleteness and fallibility of knowledge (Petersen 2012 [2006], p. 52); equilibrium climate sensitivity: the equilibrium (steady-state) change in the annual global mean surface temperature following a doubling of the atmospheric equivalent carbon dioxide concentration (IPCC 2013, Glossary); ontic uncertainty: the intrinsic indeterminate or variable character of the system under study (Petersen 2012 [2006], p. 52), also referred to as aleatory uncertainty (NRC 1996, p. 107); paradigm: ideas and traditions of scientific practice; robust decision: strategies that perform relatively well under a wide range of plausible futures and are relatively insensitive to unforeseen circumstances and (broken) assumptions. ...
Thesis
Full-text available
The strong global warming of the last 150 years and its expected continuation at an even higher rate have raised concern among decision makers. The political debate and decisions are, however, complicated by the large disagreements and 'deep' uncertainties involved. Several decades ago, the Intergovernmental Panel on Climate Change (IPCC) was established to provide a common frame to streamline this process. This was originally done by periodic assessments of all scientific efforts. In the course of time, the influence of the IPCC on scientific research and national assessments has gradually increased and a common practice based on General Circulation Model simulations has become dominant. Fully relying on this common practice does not, however, necessarily support the 'robust' strategies and infrastructure that many decision makers aim for. In this thesis, the tenability of this 'climate modelling paradigm' is explored. In part I, it is argued that a 'paradigm shift' from scientific certainty to a full exploration of what might be possible better fits the objective of robust decisions without sacrificing scientific soundness. In part II, four peer-reviewed articles are presented, of which three (implicitly) rely on the above-mentioned climate modelling paradigm.
... Recent research has identified some of the issues associated with the use of uncertainty typologies; for example, that their creation can rely on potentially subjective expert judgement (Knol et al. 2009), that their successful implementation can depend on the skill and experience of the end-user (Gillund et al. 2008) and that no typology exists which 'includes all of its meanings in a way that is clear, simple, and adequate for each potential use of such a typology' (Petersen 2006). However, the full extent of these problems and their potential impacts are not clear. ...
... Aleatory uncertainty (Bedford and Cooke 2001; Ascough II et al. 2008) represents the inherent randomness displayed in human and natural systems. It is also referred to as physical (Vesely and Rasmuson 1984), stochastic (Helton 1994), variability (Hoffman and Hammonds 1994; Janssen et al. 2003; Walker et al. 2003; Hayes et al. 2006), random (Bevington and Robinson 2002; Regan, Colyvan, and Burgman 2002) or ontic (Petersen 2006; Knol et al. 2009). Aleatory uncertainty cannot be reduced, although additional research may help to better understand the complexities of the system(s) of interest. ...
... Epistemic uncertainty (Bedford and Cooke 2001; Walker et al. 2003; Petersen 2006; Ascough II et al. 2008; Knol et al. 2009) is the imperfection of knowledge concerning a system of interest. Epistemic uncertainty is also termed completeness (Vesely and Rasmuson 1984; Rowe 1994), subjective (Helton 1994), knowledge-based (Hoffman and Hammonds 1994; Janssen et al. 2003) or systematic (Bevington and Robinson 2002). ...
Article
Uncertainties, whether due to randomness or human or system errors, are inherent within any decision process. In order to improve the clarity and robustness of risk estimates and risk characterisations, environmental risk assessments (ERAs) should explicitly consider uncertainty. Typologies of uncertainty can help practitioners to understand and identify potential types of uncertainty within ERAs, but these tools are yet to be reviewed in earnest. Here, we have systematically reviewed 30 distinct typologies and the uncertainties they communicate and demonstrate that they: (1) use terminology that is often contradictory; (2) differ in the frequencies and dimensions of uncertainties that they include; (3) do not uniformly use systematic and robust methods to source information; and (4) cannot be applied, on an individual basis, to the domain of ERA. On the basis of these observations, we created a summary typology - consisting of seven locations (areas of occurrence) of uncertainty across five distinct levels (magnitude of uncertainty) - specifically for use with ERAs. This work highlights the potential for confusion, given the many versions of uncertainty typologies which exist for closely related risk domains and, through the summary typology, provides environmental risk analysts with information to form a solid foundation for uncertainty analysis (based on improved understanding) to identify uncertainties within an ERA.
... Being able to communicate uncertainties presupposes knowing what they are and this is no simple matter in the domain of climate change. The climate models that are used to produce projections of climate change are extremely complex and associated with many different uncertainties (Petersen 2012). In addition, 'there are significant differences in opinion amongst modellers, indeed what could be termed different cultures of doing climate modelling [..] These different cultures result in different sets of standards by which climate change science is evaluated. ...
... As a benchmark for assessing the latter, we refer to Petersen's (2012) typology of uncertainty in climate simulations: ...
... Van Asselt and Rotmans 2002) but they do not usually cover the whole range of paradigmatic uncertainty. Petersen (2012) asserts that each type of uncertainty can occur in five locations: in the conceptual model, the mathematical model (structure and parameters), the technical model, model inputs and in output data and interpretation. Only some of these uncertainties are statistically quantifiable; most can only be assessed through qualitative judgement for which 'the (variable) judgement and best practice of a scientific community provides a reference' (Petersen 2012 p.58). ...
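Purely as an illustrative sketch (the container and example entries are ours, not Petersen's), the location-by-type structure described in this excerpt can be represented as a small matrix in Python; the "nature" columns use the forms quoted elsewhere on this page ("epistemic, statistical, methodological, or sociocultural") and are not the full published matrix:

```python
from itertools import product

# Locations of uncertainty as listed in the excerpt above.
LOCATIONS = [
    "conceptual model",
    "mathematical model (structure and parameters)",
    "technical model",
    "model inputs",
    "output data and interpretation",
]
# Illustrative set of natures of uncertainty, taken from the quoted phrase above.
NATURES = ["epistemic", "statistical", "methodological", "sociocultural"]

# Hypothetical, partially filled uncertainty matrix for one climate-simulation study.
matrix = {cell: None for cell in product(LOCATIONS, NATURES)}
matrix[("model inputs", "statistical")] = "observational error ranges"
matrix[("conceptual model", "epistemic")] = "processes known to be missing"
matrix[("output data and interpretation", "sociocultural")] = "value-laden framing of results"

for (location, nature), note in matrix.items():
    if note is not None:
        print(f"{location} | {nature}: {note}")
```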
Article
Full-text available
The quantification of uncertainty is an increasingly popular topic, with clear importance for climate change policy. However, uncertainty assessments are open to a range of interpretations, each of which may lead to a different policy recommendation. In the EQUIP project researchers from the UK climate modelling, statistical modelling, and impacts communities worked together on 'end-to-end' uncertainty assessments of climate change and its impacts. Here, we use an experiment in peer review amongst project members to assess variation in the assessment of uncertainties between EQUIP researchers. We find overall agreement on key sources of uncertainty but a large variation in the assessment of the methods used for uncertainty assessment. Results show that communication aimed at specialists makes the methods used harder to assess. There is also evidence of individual bias, which is partially attributable to disciplinary backgrounds. However, varying views on the methods used to quantify uncertainty did not preclude consensus on the consequential results produced using those methods. Based on our analysis, we make recommendations for developing and presenting statements on climate and its impacts. These include the use of a common uncertainty reporting format in order to make assumptions clear; presentation of results in terms of processes and trade-offs rather than only numerical ranges; and reporting multiple assessments of uncertainty in order to elucidate a more complete picture of impacts and their uncertainties. This in turn implies research should be done by teams of people with a range of backgrounds and time for interaction and discussion, with fewer but more comprehensive outputs in which the range of opinions is recorded.
Article
In this article, we explore how climate change science is connected to climate change governance. When formally institutionalized, as in the Intergovernmental Panel on Climate Change (IPCC) or the UN Framework Convention on Climate Change (UNFCCC), these sites may be referred to as boundary organizations. These institutions engage not only in the quality assessment of scientific research, but also in the design of innovative policy instruments, or evaluation of policy impacts—activities that we refer to as boundary work. Boundary work is inherently 'tricky business'. Science and politics are normally demarcated spheres with different sacred stories. Scientists aspire to 'speak truth to power', while policymakers want 'politics on top and science on tap'. Boundary work endeavors to coordinate these apparently incompatible aspirations. In this article, we describe, analyze, and assess whether, to what extent, and how the major international and some national boundary organizations in climate change governance have been able to avoid over-politicization and over-scientization. We demonstrate that the nature and success of boundary organizations and the ways they work depend on: (1) the degree to which the climate change problem is defined as 'wicked' or unstructured, or as (relatively) 'tame' and structured; (2) the stage of the policy process; and (3) characteristics of the policy network and the socio-political context: the degree to which relevant players insist on strict separation and a linear relation from science to politics, or, alternatively, are tolerant of a blurring of the boundaries and hence a two-way, coproductive relation between science and politics. Anna Wesselink's contribution to this article was financially supported by the European Union (European Commission, European Reintegration Grant PERG08-GA-2010-276934). WIREs Clim Change 2013, 4:283–300. doi: 10.1002/wcc.225
... Computer simulations involve mathematical models implemented on a computer imitating one or more natural processes. Models are based on general theories and fundamental principles, idealizations, approximations, mathematical concepts, metaphors, analogies, facts, and empirical data [36,37]. Judgments and arbitrary choices must be made in model construction to apply fundamental laws to describe turbulent fluid flow. ...
... Models of the atmosphere rely on parameterizations of physical processes that cannot be directly simulated. A parameterization is a separate mathematical model calculating the net effects of unresolved processes on the processes that can be directly simulated [36]. ...
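A deliberately simplified sketch of this idea follows; the function, variable names and tuning constant are invented for illustration and are not drawn from any actual atmospheric model:

```python
def subgrid_flux(resolved_wind, resolved_moisture_gradient, c_tune=1.3e-3):
    """Hypothetical bulk parameterization: the net moisture flux produced by unresolved
    turbulence is approximated from resolved (grid-mean) quantities; c_tune is an
    adjustable parameter whose value is calibrated, not given by first principles."""
    return c_tune * resolved_wind * resolved_moisture_gradient

# The host model only sees the net effect, applied as a tendency to its resolved state.
grid_mean_wind = 8.0          # m/s, resolved by the model grid (illustrative value)
moisture_gradient = 2.5e-3    # kg/kg per m, resolved by the model grid (illustrative value)
print(subgrid_flux(grid_mean_wind, moisture_gradient))
```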
Article
Full-text available
Civil infrastructure provides the physical backbone of all societies. Water supply, wastewater treatment, transportation systems, and civil structures must be sustainable over multiple decades (e.g. 20, 30, 50 years) for human populations to survive and flourish. Over such a long time period, climate changes are inevitable. The global atmospheric system is dynamic. Weather and climates are constantly adjusting. To date the effects of carbon dioxide have been evaluated almost exclusively using a global reference frame. However, civil infrastructure is stationary and local in nature. A locational reference frame is introduced here as an alternative framework for evaluating the effect of carbon dioxide on civil infrastructure. Temperature data from the City of Riverside, California from 1901 to 2017 are analyzed to illustrate application of a local reference frame. No evidence of significant climate change beyond natural variability was observed in this temperature record. Using a Climate Sensitivity best estimate of 2°C, the increase in temperature resulting from a doubling of atmospheric CO2 is estimated at approximately 0.009°C/yr, which is insignificant compared to natural variability.
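The quoted per-year figure can be reproduced with back-of-the-envelope arithmetic; the CO2 growth rate below is an assumption on our part, chosen only to show how an equilibrium sensitivity per doubling converts to a per-year rate, and is not a value stated in the abstract:

```python
import math

ECS = 2.0       # deg C per doubling of CO2 (the abstract's "best estimate")
growth = 0.003  # assumed fractional CO2 increase per year (~0.3 %/yr, our assumption)

# If temperature scales with log2 of CO2 concentration, the implied warming rate is:
rate = ECS * math.log(1.0 + growth) / math.log(2.0)
print(f"{rate:.3f} deg C per year")  # about 0.009, in line with the figure quoted above
```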
... The manner in which simulations can inform decision-making processes depends in turn on their validation and evaluation. In complex systems it is practically impossible to adequately represent or make predictions about such systems, primarily because of their complex nature, the implausibility of simulating open worlds in a (necessarily) closed simulation and the lack of availability of comprehensive datasets that describe such systems (Batty, 2012; Petersen, 2012). Under such circumstances, it is hard for a modeller to design and build a model without understanding exactly how the model is going to be used. ...
... They prefer to test the assumptions in the model, explore it from different perspectives, watch how the model evolves over time and compare different scenarios and starting points for the model. This could be because of the intractable nature of complex systems, and policy makers prefer an understanding and exploration of the dynamics and processes in the system over predictions of future states (Lee, 1994; Petersen, 2012). They "like to play with" and "like to play in" the model and do not perceive the two as different: ...
... Sometimes the negative side effects of entrenched technologies cause a U-turn from structured to unstructured problem. Issues like the car mobility problem (Hendriks, 1999; Hoppe & Grin, 2000), the building of nuclear power plants in the Netherlands in the 1980s (Hisschemöller, 1993: 71-78), contemporary planning for a nuclear phase-out in Belgium (Laes et al., 2004), and anthropogenic global warming (Peterse, 2006) belong in this category. Sometimes it is the unbridled research and innovation drive that leads to new, unstructured problems. ...
... Given uncertain knowledge, and thus uncertain effectiveness and efficiency of interventions, moderately structured problems (ends) also frequently raise issues of bargaining about who will be responsible for expenditures in financing or otherwise enabling certain interventions; and for risks in case of ineffectiveness or negative side effects. Issues like traffic safety (Hoppe and Grin, 2000), ambient particulate matter (Peterse, 2006), fighting obesity (VWS/Department of Public Health, 2009) and many issues of policies for routinely agreed-upon socio-economic goals like maximizing gross domestic product and minimizing inflation (Halffman & Hoppe, 2005) belong to this problem type. ...
Chapter
This chapter introduces the main themes of this book. It discusses the governance of problems, puzzling and powering, governance as a quest for political participation and institutional alignment, and implications for policy analysis. The major thesis of the book is that contemporary democracies, in order to maintain a sufficiently responsive system for the governance of problems, ought to develop more reflexive institutions and practices of policy-oriented and polity-oriented problem structuring. A structural mismatch between problem perception and structuring by larger segments, if not a majority, of the citizenry and their proximate policy makers, is a real possibility; and thus a threat. Better governance implies political sensitivity to different types of problems; and more and better reflexive problem structuring through better institutional, interactive, and deliberative designs for public debate and political choice.
Chapter
This chapter addresses the socio-cognitive aspects of problem structuring in defining political environments or task fields from the viewpoint of policy players. From this perspective, the discussion shows it to be useful to distinguish between four types of problem structures: structured, unstructured, and two types of moderately structured problems. The first section develops the perspective of a politics of meaning. It views politics as the collective attempt to control a polity's shared response to the adversities and opportunities of the human condition. The second section gives an overview of how others have approached the social and political analysis of policy issues or problems. The third section discusses properties of structured versus unstructured problems; and analyses how unstructured problems get to be structured through problem decomposition and constraint sequencing.
... With wizards and scientists together, the fourth section contrasts their strengths and challenges when forecasting the future. The fifth section continues this discussion in the context of imperfect models (Berger and Smith 2019; Petersen [2006] 2012; Judd and Smith 2004). The sixth section then explores the intentions of wizards and scientists and clarifies the discussion of opacity. ...
Article
Full-text available
Myths inform real-world decision making. Scientific simulation models can do the same. Neither reflects their real-world targets perfectly. They are most useful in an apophatic sense: employing them as the best tools available without confusing their indications with Truth. The actions and choices of wizards often reflect those of scientists; drawing parallels is informative and several questions will be explored within this framework. How is a climate scientist to respond when offered a Faustian agreement promising limited insights? As scientists, how might we better communicate scientific limits regarding which aspects of the future we can see clearly and which we cannot? Should we risk casting doubt on the as-good-as-it-gets science underlying anthropogenic climate change? If an electorate requires certainty of a threat before it will vote for action, are lies of omission or misrepresentation justified? Is it ethical for scientists whose research is relevant to the policy process to pause their typical vigorous scientific criticism of overinterpretation by others (particularly in sciences downstream from physical science) when their science is thought not adequate for purpose? Should scientists merely advise, presenting the relevant science as neutrally as they can, or advocate by emphasizing evidence that supports their preferred course of action, or become activists overselling their science to achieve well-motivated policy ends by whatever means required?
... (There have been important changes in the procedures; unless I specify otherwise, I will be talking about how things are done now. An excellent, brief introduction to the IPCC and how its procedures as well as conclusions have been challenged and have changed over time is Petersen (2012).) ...
... However, more detailed physical representations of the processes that shape the Earth's surface involve a larger number of parameters that are typically estimated from proxy data or theoretical considerations, or are completely unknown (Oreskes et al., 1994; Petersen, 2012). If LEMs are to be operationally used for prediction or as decision-making tools in the future, their outputs must be evaluated against the uncertainty in the input parameters - a task that is increasingly difficult for a large number of parameters. ...
Article
Full-text available
The evaluation and verification of landscape evolution models (LEMs) has long been limited by a lack of suitable observational data and statistical measures which can fully capture the complexity of landscape changes. This lack of data limits the use of the objective-function-based evaluation that is prolific in other modelling fields, and restricts the application of sensitivity analyses in the models and the consequent assessment of model uncertainties. To overcome this deficiency, a novel model function approach has been developed, with each model function representing an aspect of model behaviour, which allows for the application of sensitivity analyses. The model function approach is used to assess the relative sensitivity of the CAESAR-Lisflood LEM to a set of model parameters by applying the Morris method sensitivity analysis for two contrasting catchments. The test revealed that the model was most sensitive to the choice of the sediment transport formula for both catchments, and that each parameter influenced model behaviours differently, with model functions relating to internal geomorphic changes responding in a different way to those relating to the sediment yields from the catchment outlet. The model functions proved useful for providing a way of evaluating the sensitivity of LEMs in the absence of data and methods for an objective function approach.
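For orientation, here is a minimal sketch of the Morris (elementary effects) screening idea applied to a toy stand-in model; the toy function, sample count and step size are placeholders of our own and have nothing to do with the CAESAR-Lisflood configuration:

```python
import numpy as np

def toy_model(x):
    """Stand-in for one expensive model run (e.g. a landscape evolution simulation):
    returns a single scalar 'model function' value for a parameter vector x in [0, 1]^3."""
    return 3.0 * x[0] ** 2 + 0.5 * x[1] + 0.1 * x[0] * x[2]

def morris_mu_star(model, k, r=50, delta=0.25, seed=0):
    """One-at-a-time screening: for r random base points, perturb each of the k
    parameters by delta and record the elementary effect; mu* is the mean absolute
    effect per parameter (larger mu* suggests a more influential parameter)."""
    rng = np.random.default_rng(seed)
    effects = np.zeros((r, k))
    for j in range(r):
        x = rng.uniform(0.0, 1.0 - delta, size=k)   # keep x + delta inside [0, 1]
        base = model(x)
        for i in range(k):
            x_pert = x.copy()
            x_pert[i] += delta
            effects[j, i] = (model(x_pert) - base) / delta
    return np.abs(effects).mean(axis=0)

print(morris_mu_star(toy_model, k=3))  # parameter 0 dominates this toy model's behaviour
```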
... As Petersen writes, "Uncertainty takes many forms, whether epistemic, statistical, methodological, or sociocultural." [14] Outside of the humanities, many models and tools have been developed, either for capturing uncertainty in data (such as Petersen's "Uncertainty Matrix" [14]) or for capturing aspects of processing that could introduce uncertainty (such as the NASA EOSDIS [12]). These would, however, be cumbersome to adapt to and apply in the humanities. ...
Conference Paper
This paper takes a high-level view of both the sources and status of uncertainty in humanities research and the attributes a digital system would ideally have. It draws upon both the experience of a number of digital projects and research into the many-faceted concept of uncertainty in data. Its intention is to support a dialogue between the humanities and computer science, able to realise the promise of digital humanities without a reversion to a new positivism in disciplines such as history and literary studies.
... They prefer to test the assumptions in the model, explore it from different perspectives, watch how the model evolves over time and compare different scenarios and starting points for the model. This could be because of the intractable nature of complex systems, and policy makers prefer an understanding and exploration of the dynamics and processes in the system over predictions of future states (Lee, 1994; Petersen, 2012). They like to "play with" and to "play in" the model (which is the game) and do not perceive the two as different. ...
Article
Background. The increasing cognizance of complexity in systems has brought into focus important questions about the methods and tools we use to address them. Games for design, where games and computer simulations are used together to create concrete and tangible designs in a pluralistic way, with multiple stakeholders within the game, is a new area for simulation gaming. Aim. In this article about gaming for design, embedded in the design science approach towards game science, we raise important philosophical questions about this new area, as well as attempt to address practical questions at the application level. We attempt to bridge the analytical science and design science approaches to games, and analyze them through meta-constructs of games such as fidelity, abstraction and resolution. Results. Results from two applications, COMPLEX and ProtoWorld, are gathered through analysis of game play and debriefing of game sessions, and analyzed to understand the representational requirements for simulations and games. Conclusion. Results point to the need for rigor in gaming, particularly when modeling reference systems, and rigor in assessing effects, both during game play and while debriefing. Results also point to expanded definitions of meta-constructs of games, as well as to their linked nature.
... Related terms for the same phenomena are "ontological" (Lane & Maxwell, 2005), "deep" (Petersen, 2006), "Knightian" (Knight, 1921), or "model" (Chatfield, 1995). It is a form of uncertainty in which the system model generating outcomes and the input parameters to the ...
Article
Full-text available
We propose conviction narrative theory (CNT) to broaden decision-making theory in order to better understand and analyse how subjectively means–end rational actors cope in contexts in which the traditional assumptions in decision-making models fail to hold. Conviction narratives enable actors to draw on their beliefs, causal models, and rules of thumb to identify opportunities worth acting on, to simulate the future outcome of their actions, and to feel sufficiently convinced to act. The framework focuses on how narrative and emotion combine to allow actors to deliberate and to select actions that they think will produce the outcomes they desire. It specifies connections between particular emotions and deliberative thought, hypothesising that approach and avoidance emotions evoked during narrative simulation play a crucial role. Two mental states, Divided and Integrated, in which narratives can be formed or updated, are introduced and used to explain some familiar problems that traditional models cannot.
... 2 Related terms for the same phenomena are "ontological" (Lane and Maxwell, 2005), "deep" (Petersen, 2006), "Knightian" (Knight, 1921), or "model" (Chatfield, 1995). It is a form of uncertainty in which the system model generating outcomes and the input parameters to the system model are not known or widely agreed on by the stakeholders to a decision (Lempert, 2002). ...
Article
Full-text available
We propose conviction narrative theory (CNT) to broaden decision-making theory so that it can better understand and analyse how subjectively means-end rational actors cope in contexts in which the traditional assumptions in decision-making models fail to hold. Conviction narratives enable actors to draw on their beliefs, causal models and rules of thumb to identify opportunities worth acting on, to simulate the future outcome of their actions and to feel sufficiently convinced to act. The framework focuses on how narrative and emotion combine to allow actors to deliberate and to select actions that they think will produce the outcomes they desire. It specifies connections between particular emotions and deliberative thought, hypothesizing that approach and avoidance emotions evoked during narrative simulation play a crucial role. Two mental states, Divided and Integrated, in which narratives can be formed or updated, are introduced and used to explain some familiar problems that traditional models cannot.
... The variability in each of the stages of the workflow results in ambiguity, and, if not articulated, makes it even harder to reproduce results. Overall, moments of choice add an uncertainty margin to the results [19,27]. Last but not least, we can ask ourselves whether clear delineations exist between topics in practice. ...
Article
This paper describes how semantic indexing can help to generate a contextual overview of topics and visually compare clusters of articles. The method was originally developed for an innovative information exploration tool, called Ariadne, which operates on bibliographic databases with tens of millions of records. In this paper, the method behind Ariadne is further developed and applied to the research question of the special issue "Same data, different results" - the better understanding of topic (re-)construction by different bibliometric approaches. For the case of the Astro dataset of 111,616 articles in astronomy and astrophysics, a new instantiation of the interactive exploring tool, LittleAriadne, has been created. This paper contributes to the overall challenge to delineate and define topics in two different ways. First, we produce two clustering solutions based on vector representations of articles in a lexical space. These vectors are built on semantic indexing of entities associated with those articles. Second, we discuss how LittleAriadne can be used to browse through the network of topical terms, authors, journals, citations and various cluster solutions of the Astro dataset. More specifically, we treat the assignment of an article to the different clustering solutions as an additional element of its bibliographic record. Keeping the principle of semantic indexing on the level of such an extended list of entities of the bibliographic record, LittleAriadne in turn provides a visualization of the context of a specific clustering solution. It also conveys the similarity of article clusters produced by different algorithms, hence representing a complementary approach to other possible means of comparison.
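The general pattern described in this abstract, building vector representations of articles from their associated entities, clustering them, and attaching the cluster label back onto the bibliographic record, can be sketched as follows; the scikit-learn stand-ins and toy records are our own and do not reflect the Ariadne code or its actual semantic-indexing scheme:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Toy bibliographic records: each article is reduced to a bag of associated entities
# (terms, authors, journals). Real systems would use far richer semantic indexing.
records = [
    {"id": "A1", "entities": "dark matter galaxy rotation halo"},
    {"id": "A2", "entities": "galaxy rotation curve halo simulation"},
    {"id": "A3", "entities": "solar flare magnetic reconnection corona"},
    {"id": "A4", "entities": "coronal mass ejection magnetic flare"},
]

vectors = TfidfVectorizer().fit_transform(r["entities"] for r in records)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

# Treat the clustering solution as one more element of each bibliographic record.
for record, label in zip(records, labels):
    record["cluster_solution_1"] = int(label)
    print(record["id"], record["cluster_solution_1"])
```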
... The future will surprise. 1 Related terms for the same phenomena are 'ontological' or 'ontic' (Lane and Maxwell, 2005; Petersen, 2006), 'deep', 'Knightian' (Knight 1931), or 'model' uncertainty (Chatfield, 1995). It is a form of uncertainty in which the system model generating outcomes and the input parameters to the system model are not known or widely agreed on by the stakeholders to a decision (Lempert, 2002). ...
Conference Paper
Full-text available
A general framework is proposed to broaden existing decision-making theory to enable it better to understand and analyse how subjectively means-end rational actors cope when the assumptions of traditional optimising models fail to hold. The framework, termed Conviction Narrative Theory (CNT), focuses on the power of narrative and emotion to combine to facilitate action. In CNT agents are able to act because they draw on cognitive and affective resources to form preferred narratives of the outcomes of their planned actions. Such narratives, conviction narratives, establish preference and enable action readiness specifically by evoking feelings of approach and avoidance. They play a necessary but not sufficient role in decisions to act under radical uncertainty, however these decisions turn out. Through developing conviction narratives, actors, in objectively uncertain conditions, become certain enough to act, despite the possibility of serious loss. CNT integrates many research findings over disparate domains. We then introduce the concepts of Divided and Integrated States to represent two different emotional contexts in which narratives are evaluated. Finally, we report two algorithmic techniques that have proved useful for identifying these states and linking them to economic outcomes, such as the outbreaks of financial instability observed prior to the financial crisis.
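The abstract mentions algorithmic techniques for identifying Divided and Integrated states without spelling them out; the snippet below is therefore only a hypothetical illustration of the general idea of scoring narrative text by the relative frequency of approach versus avoidance vocabulary. The word lists and the scoring formula are assumptions for illustration, not the authors' published method.

```python
# Hypothetical illustration (not the authors' published algorithm): score a
# narrative by the relative frequency of approach vs avoidance vocabulary.
import re

APPROACH = {"gain", "opportunity", "growth", "confident", "excited"}   # assumed word list
AVOIDANCE = {"loss", "risk", "fear", "anxious", "doubt"}               # assumed word list

def relative_sentiment(text: str) -> float:
    """Return (approach - avoidance) / total emotion words, or 0.0 if none occur."""
    words = re.findall(r"[a-z']+", text.lower())
    n_app = sum(w in APPROACH for w in words)
    n_avo = sum(w in AVOIDANCE for w in words)
    total = n_app + n_avo
    return 0.0 if total == 0 else (n_app - n_avo) / total

print(relative_sentiment("Analysts were confident about growth despite some doubt."))
```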
... The TorMC model comprises four general steps: (1) study region and model parameter definition; (2) tornado footprint creation; (3) tornado cost assessment; and (4) output production (Figure 1). The TorMC model was designed to be highly modular (Petersen, 2012) in order to provide a user with as many simulation options as possible. Model parameter choices are selected prior to executing the program and allow the user to control the type of output generated by the TorMC model. ...
Article
Full-text available
Determining the likelihood and severity of tornado disasters requires an understanding of the dynamic relationship between tornado risk and vulnerability. As population increases in the future, it is likely that tornado disaster frequency and magnitude will amplify. This study presents the Tornado Impact Monte Carlo (TorMC) model, which simulates tornado events atop a user-defined spatial domain to estimate the possible impact on people, the built environment or other potentially vulnerable assets. Using a Monte Carlo approach, the model employs a variety of sampling techniques on observed tornado data to provide greater insight into the tornado disaster potential for a location. Simulations based on 10 000 years of significant tornado events for the relatively high-risk states of Alabama, Illinois and Oklahoma are conducted to demonstrate the model processes, and its reliability and applicability. These simulations are combined with a fine-scale (100 m), residential built-environment cost surface to illustrate the probability of housing unit impact thresholds for a contemporary year. Sample results demonstrate the ability of the model to successfully depict tornado risk, residential built-environment exposure and the probability of disaster. Additional outcomes emphasize the importance of developing versatile tools that better capture tornado risk and vulnerability attributes in order to provide precise estimates of disaster potential. Such tools can provide emergency managers, planners, insurers and decision makers a means to advance mitigation, resilience and sustainability strategies.
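The four-step workflow described above can be sketched, in heavily simplified form, as a Monte Carlo loop over synthetic tornado footprints intersected with a gridded exposure surface. This is not the TorMC code; the grid size, event-frequency and path-length distributions, and the impact threshold are illustrative assumptions.

```python
# Minimal Monte Carlo sketch of a TorMC-style workflow (illustrative only):
# (1) define study region, (2) create tornado footprints, (3) assess cost on a
# gridded exposure surface, (4) summarise simulated impacts.
import numpy as np

rng = np.random.default_rng(42)

# (1) Study region: a 100 x 100 grid of exposure values (e.g. housing units per cell).
cost_surface = rng.poisson(lam=2.0, size=(100, 100))

def simulate_year() -> int:
    """Simulate one year of tornado impacts on the study region."""
    impact = 0
    n_tornadoes = rng.poisson(lam=3.0)              # assumed annual event count
    for _ in range(n_tornadoes):
        # (2) Footprint: random placement, path length drawn from an assumed distribution.
        row = rng.integers(0, 100)
        col = rng.integers(0, 100)
        length = min(int(rng.lognormal(mean=2.0, sigma=0.5)), 100 - col)
        width = 1
        # (3) Cost assessment: sum exposure inside the footprint.
        impact += int(cost_surface[row:row + width, col:col + length].sum())
    return impact

# (4) Output: distribution of annual impacts over many simulated years.
impacts = np.array([simulate_year() for _ in range(10_000)])
print("P(impact >= 50 units) =", float((impacts >= 50).mean()))
```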
... Being able to communicate uncertainties presupposes knowing what they are, and this is no simple matter: ecosystem models are often extremely complex and associated with many different uncertainties. There are many different classifications of uncertainty, but a simple example relevant to marine ecosystem modelling that has been used in the context of climate change includes: ontological (related to underlying processes); epistemological (related to observations and model predictions); methodological (related to model structure); and axiological (related to the world view of the researcher) [47]. Ontological uncertainties are often accounted for through expert knowledge, and epistemological uncertainties are generally incorporated through assessment of model predictions and knowledge of the observing system. ...
... The future will surprise. 1 Related terms for the same phenomena are 'ontological' or 'ontic' (Lane and Maxwell, 2005; Petersen, 2006), 'deep', 'Knightian' (Knight 1931), or 'model' uncertainty (Chatfield, 1995). It is a form of uncertainty in which the system model generating outcomes and the input parameters to the system model are not known or widely agreed on by the stakeholders to a decision (Lempert, 2002). ...
Research
Full-text available
Draft account of conviction narrative theory
... The future will surprise. 1 Related terms for the same phenomena are 'ontological' or 'ontic' (Lane and Maxwell, 2005; Petersen, 2006), 'deep', 'Knightian' (Knight 1931), or 'model' uncertainty (Chatfield, 1995). It is a form of uncertainty in which the system model generating outcomes and the input parameters to the system model are not known or widely agreed on by the stakeholders to a decision (Lempert, 2002). ...
Article
Full-text available
Financial assets are abstract entities. Their value depends on beliefs which are inherently social and, we argue, emotional. Recent events have revealed profound uncertainty at the heart of financial markets, the manifest existence of emotion and the way confidence is crucial to orderly market functioning. Using findings from two interview studies, supported by ethnographic observation, we elaborate on the irreducible cognitive and emotional conflicts which face actors engaged in financial markets and threaten their daily operations. We introduce the term conviction narrative to analyse how they manage these conflicts on a day-to-day basis, and with what collective consequences. Our thesis is that expertise and conviction in financial markets have constantly to be created and renewed through a combination of psychological and social action with the implication at the macro level that while financial markets can be orderly they are so in an intrinsically fragile way.
... These interpretations and assessments are influenced by personal knowledge of the evidence, but also by weighing the competing credibility of different experts and of different explanations. 12,13 Thus, the "networks of trust" of survey respondents and their views on the "consilience of evidence" will impact any survey results. As the available evidence converges over time, scientists' aggregate opinion can be expected to reflect this convergence, resulting in a broadly, though not necessarily unanimously, shared consensus. ...
Article
Full-text available
Results are presented from a survey held among 1868 scientists studying various aspects of climate change, including physical climate, climate impacts and mitigation. The survey was unique in its size, broadness and level of detail. Consistent with other research, we found that, as the level of expertise in climate science grew, so too did the level of agreement on anthropogenic causation. 90% of respondents with more than 10 climate-related peer-reviewed publications (about half of all respondents) explicitly agreed with anthropogenic greenhouse gases (GHGs) being the dominant driver of recent global warming. The respondents' quantitative estimate of the GHG contribution appeared to strongly depend on their judgment or knowledge of the cooling effect of aerosols. The phrasing of the IPCC attribution statement in its fourth assessment report (AR4), providing a lower limit for the isolated GHG contribution, may have led to an underestimation of the GHG influence on recent warming. The phrasing was improved in AR5. We also report on the respondents' views on other factors contributing to global warming; of these, Land Use and Land Cover Change (LULCC) was considered the most important. Respondents who characterized human influence on climate as insignificant reported having had the most frequent media coverage regarding their views on climate change.
... These processes are quite different in different soils, which is also true for the adsorption of various biocides (van Alphen and Stoorvogel, 2002). Overall, there are many models available to study certain aspects of ecosystem functioning, relating, e.g., to water quality and availability, food production, ecosystem functioning, effects on climate, etc., and these models play a key role in policy issues and management as they allow expressions of uncertainties and risks (Petersen, 2012). However, overall models that can functionally characterize ecosystems in terms of interrelated physical, chemical and biological processes have not yet been developed, but the separate models can be linked as an expression of narratives, to be discussed later. ...
Article
Full-text available
The United Nations effort to define Sustainable Development Goals (SDGs), emphasizing local goals and capacity building, offers a unique opportunity for soil science to demonstrate the role it can play when focusing on these goals. Several strategic reports have presented key issues for sustainable development: food security, freshwater and energy availability, climate change and biodiversity loss are the issues most frequently listed, not soil degradation. Focusing on soil contributions towards interdisciplinary studies of these key issues, rather than emphasizing soils by themselves, is therefore bound to be more effective for the soil science profession. But this is still inadequate when studying land-related SDGs, which require a broader ecosystem approach that can be achieved by a direct link between soil functions and corresponding ecosystem services. Thus, the key issues are not considered separately but linked as part of a dynamic ecosystem characterization following a narrative, as is demonstrated for food security, which can be well addressed by precision agriculture. As all key issues and at least five of the ten SDGs are directly land-related, soil science can potentially play an important role in the suggested interdisciplinary studies. In addition, the current information society with knowledgeable stakeholders requires innovative and interactive transdisciplinary scientific approaches, focusing not only on knowledge generation but also on co-learning with stakeholders and, importantly, on implementation. The soil science discipline can become more effective in the transdisciplinary context by: (1) reconnecting the knowledge chain, linking tacit with scientific knowledge both ways; (2) simplifying soil terminology; (3) learning to deal with "wicked" environmental problems for which no single solutions exist but only a series of alternative options for action, balancing economic, social and environmental considerations; (4) educating "knowledge brokers", linking science with society in land-related issues, acting within a "Community of Scientific Practice"; and (5) modernizing soil science curricula. Transdisciplinary approaches are crucial to achieve SDGs, linking science and society. There is a need for specific results on the ground illustrating with hard data the key role soils can play in realizing SDGs. Key words: inter- and transdisciplinarity / precision agriculture / soil nexus / knowledge brokers / Communities of Scientific Practice
Chapter
In this book, we have focused on the theory and practice of subsurface environmental modelling at the science-policy interface from an interdisciplinary perspective.
Chapter
The objective of science-based risk assessment is to protect public health by supporting well-founded decisions. Health risk analysis involves numerous uncertainties and highly variable parameters: multiple routes (ingestion, dermal, and inhalation), complex environmental contaminants, various pathways, and different exposures across populations, which makes the risk estimation procedure extremely challenging and rigorous. The uncertainties in risk assessment result mainly from two sources: first, a lack of knowledge of input variables (mostly random), and second, data obtained from expert judgment or subjective interpretation of available information (non-random). The NRC (1994) states that ignoring uncertainty in any step of the risk assessment process is tantamount to leaving critical parts of the process incompletely examined, and therefore increases the probability of generating a risk estimate that is incorrect, incomplete, or misleading. Each step of the risk assessment process involves various assumptions, both quantitative and qualitative, which must be evaluated through uncertainty analysis. The risk evaluation process must therefore treat uncertainty and variability scientifically and robustly. Moreover, addressing uncertainties in health risk assessment is a critical issue when evaluating the effects of environmental contaminants on public health. The uncertainty propagation in health risk can be assessed and quantified using probability theory, possibility theory, or a combination of both. This chapter will systematically report the development of various methodologies and frameworks to address the uncertainties that are intrinsic to health risk estimation.
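As a sketch of the probabilistic route mentioned above, the snippet below propagates assumed input distributions through a standard chronic ingestion hazard-quotient calculation by Monte Carlo sampling. The distributions, parameter values and reference dose are illustrative assumptions, not the chapter's framework or any real exposure data.

```python
# Minimal sketch (illustrative assumptions only): propagate parameter uncertainty
# through an ingestion-exposure hazard quotient with Monte Carlo sampling.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Uncertain/variable inputs (assumed distributions, not site-specific data).
C = rng.lognormal(mean=np.log(0.5), sigma=0.4, size=n)      # concentration, mg/L
IR = rng.normal(loc=2.0, scale=0.3, size=n).clip(min=0.5)   # intake rate, L/day
BW = rng.normal(loc=70.0, scale=10.0, size=n).clip(min=30)  # body weight, kg
EF, ED, AT = 350.0, 30.0, 30.0 * 365.0                      # days/yr, yr, days
RfD = 0.02                                                  # reference dose, mg/kg/day

# Average daily dose and hazard quotient (standard chronic, non-carcinogenic form).
ADD = (C * IR * EF * ED) / (BW * AT)
HQ = ADD / RfD

# Report the spread of the risk estimate rather than a single point value.
print("median HQ:", np.median(HQ))
print("95th percentile HQ:", np.percentile(HQ, 95))
print("P(HQ > 1):", float((HQ > 1).mean()))
```

Reporting a percentile band and an exceedance probability, rather than one number, is one simple way of making the uncertainty in the risk estimate explicit.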
Chapter
This chapter conceptualizes computer simulation and policy-making at the science-policy interface, exploring the boundaries between scientific knowledge production and political action orientation. The conceptualization entails four layers. First, the compatibility of scientific simulations with the policy-making system relies on key characteristics of modelling meeting policy's reasoning, forward-looking and decision-oriented needs. Simulations meet these needs with their capability to reduce complexity, compare options, analyse intervention effects, deliver results in numbers, and carry out trial without error. Second, from a systemic perspective, simulations serve as a knowledge instrument contributing to secure and uncertain knowledge and the known unknowns. Simulations also enable, amplify and feed back communication. Third, taking an impact perspective, the policy use of simulations differentiates into instrumental, conceptual, strategic and procedural use patterns. Finally, evaluation and assessment of simulations by decision-makers follows several simulation-inherent and simulation-contextual criteria.
Article
Anthropogenic climate change has been presented as the archetypal global problem, identified by the slow work of assembling a global knowledge infrastructure, and demanding a concertedly global political response. But this ‘global’ knowledge has distinctive geographies, shaped by histories of exploration and colonialism, by diverse epistemic and material cultures of knowledge-making, and by the often messy processes of linking scientific knowledge to decision-making within different polities. We suggest that understanding of the knowledge politics of climate change may benefit from engagement with literature on the geographies of science. We review work from across the social sciences which resonates with geographers’ interests in the spatialities of scientific knowledge, to build a picture of what we call the epistemic geographies of climate change. Moving from the field site and the computer model to the conference room and international political negotiations, we examine the spatialities of the interactional co-production of knowledge and social order. In so doing, we aim to proffer a new approach to the intersections of space, knowledge and power which can enrich geography’s engagements with the politics of a changing climate.
Chapter
For a long time the so-called World Climate Council, the Intergovernmental Panel on Climate Change (IPCC), has been regarded as a relatively successful example of policy advice, and it has also set the central course for the scientific and political discussions on adaptation to the consequences of climate change. This contribution presents how the IPCC frames the topic of climate adaptation and integrates it into its assessment reports. Using the example of IPCC Working Group 2, which is responsible for adaptation, it is shown that this problem is structured differently from that of climate change mitigation and accordingly confronts research and policy advice with particular challenges. The contribution argues that the IPCC model, which was designed for questions of the causes and mitigation of climate change, cannot simply be applied to questions of climate adaptation; such a simple transfer narrows the discussion in a way that proves empirically wrong and politically dangerous. The contribution shows that, and why, the IPCC model must itself be 'adapted' to the particular characteristics and requirements of adaptation. The final section argues that the discussion about the future of the IPCC should be used as an opportunity to rethink the definition of adaptation and the underlying relationship between science and politics, and to revise them accordingly.
Chapter
This introductory essay aims to introduce the chapters in the book, presenting some aspects of the theoretical and conceptual framework necessary to consider the advantages computer simulation techniques and technologies offer to historical disciplines, but also quoting from the hundreds of examples in current scientific literature to give a context within which the individual contributions can be better understood. We argue that historical simulations should be much more than vivid illustrations of what scholars in the present believe existed in the past. A simulation is basically the computer representation of a "mechanism", representing how social intentions, goals and behaviors were causally connected in the past. This can be done by formulating a "generative model", that is, a model of a set of mechanisms. In this chapter, it is suggested that computer simulation may act as a Virtual Laboratory that helps to study how human societies have undergone relevant transformations and in which ways the consequences of those transformations in technology, activities, behavior, organization or knowledge were transmitted to other social agents or groups of social agents. Building artificial societies inside a computer allows us to understand that social reality is not capricious. It has been produced somehow, although not always the same cause produces the same effect, because social actions are not performed in isolation but in complex and dialectical frameworks, which favor, prevent, or modify the capacity, propensity, or tendency of an action to produce or to determine a concrete effect. This way of studying social dynamics in the past by means of computer simulation is beginning to leave its infancy. Archaeologists and historians have started to convert social theories into computer programs, trying to simulate social processes and to experiment with different explanations of known archaeological societies. Our book is just one additional example of a current trend among archaeologists and historians: historical events occurred only once and many years ago, but within a computer, surrogates of those events can be artificially repeated here and now in order to understand how and why they happened.
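As a toy illustration of what a "generative model" of a social mechanism can look like in code (an assumption for illustration, not an example from the chapter), the sketch below lets agents copy a practice from randomly met neighbours, with occasional innovation, so that the population-level pattern is generated by repeated micro-interactions rather than assumed.

```python
# Illustrative toy "generative model": a minimal mechanism of cultural
# transmission in which agents copy a practice from a randomly met neighbour,
# with occasional innovation; the macro pattern emerges from micro-interactions.
import random

random.seed(0)
N_AGENTS, N_STEPS, INNOVATION_RATE = 200, 1000, 0.01
traits = [random.randint(0, 4) for _ in range(N_AGENTS)]    # five initial practices

for _ in range(N_STEPS):
    learner, model = random.sample(range(N_AGENTS), 2)
    if random.random() < INNOVATION_RATE:
        traits[learner] = random.randint(0, 4)               # innovation
    else:
        traits[learner] = traits[model]                      # social transmission

# The macro pattern (how common each practice is) is generated, not assumed.
for t in range(5):
    print(f"practice {t}: {traits.count(t)} agents")
```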
Chapter
A Google search for the keyword ‘climate’ on a cold summer day in August 2010 delivered more than 150 million links in 0.23 s, and ‘climate change’ brought another 58 million. Obviously it is no problem to find floods of information about these topics on the net, yet understanding the scientific concept of climate and climate modelling is not so easy. The trouble with ‘climate’ starts when it is mixed up with the idea of weather, and when extreme weather events and short-term trends in temperature or precipitation are interpreted as effects of climate change. Usually, these interpretations are linked to an individual’s memory of experiences in childhood and other periods of life. But the trouble results not from this individual definition, which does not accord with the World Meteorological Organization’s official definition of climate as the statistics of weather. The trouble is raised by the scientific concept of climate as a mathematical construct that cannot be experienced directly. This problem is hitting science now that socio-political demands are coming into play. To respond to such demands, science has to break down its statistical and general concepts into individual and local conclusions, but this is, at the moment at least, not possible. The reason lies in the top-down approach of modern science, which uses globally valid equations to achieve increasingly higher resolution. The great challenge for meteorology during the next years and decades will be to translate statistical and general results into individual and local knowledge. In other words, science has to connect its global view with local circumstances. Regional modelling and downscaling are just the beginning, although these methods are still far removed from any particular individual or local view of a particular city or area. Of course, one can ask why humans do not simply get used to the scientific concept of climate. But when concrete environmental activities are required, individual needs and local effects play the main role, not the annual mean global temperature.
Chapter
The Intergovernmental Panel on Climate Change (IPCC) is a body of the United Nations established in 1988 which has the responsibility to provide policy-relevant assessments of knowledge pertaining to climate change. While the IPCC does not advise on which climate policies should be agreed upon by the world’s nations, it does provide succinct Summaries for Policymakers (SPMs) on the state of knowledge on the causes and effects of human-induced climate change, on mitigation of the causes and on adaptation to the effects. If we are interested in how climate-simulation uncertainty is dealt with in policy advice, the IPCC is a prime location for study.
Article
When attending to the agency of models and the general context in which they perform, climate models can be seen as instrumental policy tools that may be evaluated in terms of their adequacy for purpose. In contrast, when analysed independently of their real-world usage for informing decision-making, the tendency can be to prioritise their representative role rather than their instrumental role. This paper takes as a case study the development of the UK Climate Projections 2009 in relation to its probabilistic treatment of uncertainties and the implications of this approach for adaptation decision-making. It is considered that the move towards ensemble-based probabilistic climate projections has the benefit of encouraging organisations to reshape their adaptation strategies and decisions towards a risk-based approach, where they are confronted definitively with climate modelling uncertainties and drawn towards a more nuanced understanding of how climate impacts could affect their operations. This is further illustrated through the example of the built environment sector, where it can be seen that the probabilistic approach may be of limited salience for the urban heat island in the absence of a corresponding effort towards a more place-based analysis of climate vulnerabilities. Therefore, further assessment of the adequacy-for-purpose of climate models might also consider the usability of climate projections at the urban scale.
Article
Full-text available
This paper applies an ontological politics approach to studying how complexity, uncertainty, and ignorance are being dealt with in the Netherlands, by looking at how knowledges are produced and incorporated in decision-making on uncertain climate change. On the basis of work done in the Netherlands, this paper shows two things in particular. First, how decision-making responses have historically been subject to change under the influence of floods, and how the emergence of climate change has significantly changed these floods. Second, based on the analysis of processes dealing with a blue-green algae problem in a lake, climate change not only changed decision-making responses but also changed the very reality that is being enacted. Consequently, this brings an ethical dimension to the fore, related to the intrinsic tension between the growing awareness that “all is interconnected” on the one hand and the realization that we cannot take all into account on the other.
Article
Full-text available
Accelerating future water shortages require the development of operational water governance models, as illustrated by three case studies: (1) upstream–downstream interactions in the Aral Sea basin, where science acts as problem recognizer, emphasizing scoping policies; (2) impact and adaptation of climate change on water and food supply in the Middle East and North Africa, where science acts as a mediator between perspectives, emphasizing scoping and a start of implementation policies; and (3) green water credits in Kenya, where science acts as advocate, emphasizing scoping and implementation policies in close interaction with stakeholders, including impulses from applied to basic research.
Article
The integrated modelling environment MyM integrates design of mathematical models, execution, data analysis and visualization with the explicit purpose to facilitate the interactive communication about model structure and data between modellers, policy analysts and decision makers. The graphical user interface of MyM makes it possible to easily interact with the input data and analyze the output data. MyM offers a software environment in which development, simulation and visualization of mathematical models are truly integrated. MyM is already applied in several environmental sectors and is suitable for a broad range of problem areas.
Article
Drawing appropriate conclusions from simulation results requires a correct understanding of the accuracy and context of those results. Simulation communities often assess simulation results without fully considering uncertainties that might impact the accuracy and context of those results. This creates the potential for inappropriate conclusions from simulation results. Much useful work has been done in uncertainty quantification, but most of those efforts have addressed uncertainty only in particular parameters and areas. Unfortunately, they have not addressed all areas of potential uncertainty that might impact simulation results. A paradigm exists that facilitates consideration of all potential sources of simulation uncertainty. This paper examines simulation uncertainties using that paradigm and indicates the potential magnitude of uncertainties in simulation results in various areas. A comprehensive approach to simulation uncertainty not only reduces the likelihood of drawing inappropriate conclusions from simulation results, but also provides information that can help determine where it is most useful to invest verification and validation resources in efforts to reduce uncertainty in simulation results (i.e., to improve the accuracy of simulation results). Comprehensive assessment of simulation uncertainty may have drawbacks. When addressed comprehensively, simulation uncertainty tends to be larger than desired, and those announcing it run the risk of being bearers of bad news. Realistic appreciation of the uncertainty associated with simulation results can also decrease the importance of those simulation results in decision processes. On the positive side, such realistic and comprehensive appreciation of simulation uncertainty provides a solid factual and logical basis for how to proceed, whether by improving simulation capabilities or by developing alternative approaches to support decision processes. A perspective gained from comprehensive consideration of simulation uncertainty helps to ensure a proper context for simulation results.
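A minimal numerical illustration of the argument above, under assumed component magnitudes and an assumed root-sum-square combination rule: a comprehensive uncertainty assessment comes out larger than a partial one, and ranking the components suggests where verification and validation effort is best invested. The component names and values are invented for illustration.

```python
# Illustrative sketch (assumed numbers, root-sum-square combination): compare a
# partial uncertainty assessment with a comprehensive one, and rank where
# verification/validation effort would reduce total uncertainty most.
import math

# Assumed standard-uncertainty contributions, as a fraction of the result.
components = {
    "input parameters": 0.04,
    "numerical discretisation": 0.03,
    "model form": 0.08,
    "scenario/context": 0.06,
}

def combined(names):
    """Combine independent components by root-sum-square (an assumption)."""
    return math.sqrt(sum(components[n] ** 2 for n in names))

print("parameters only: +/-%.1f%%" % (100 * combined(["input parameters"])))
print("all sources:     +/-%.1f%%" % (100 * combined(components)))

# Largest contributors first: a rough guide to where V&V investment pays off.
for name, u in sorted(components.items(), key=lambda kv: -kv[1]):
    print(f"{name:26s} {100*u:4.1f}%")
```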
Article
Full-text available
When can macroscopic data about a system be used to set parameters in a microfoundational simulation? We examine the epistemic viability of tweaking parameter values to generate a better fit between the outcome of a simulation and the available observational data. We restrict our focus to microfoundational simulations—those simulations that attempt to replicate the macrobehavior of a target system by modeling interactions between microentities. We argue that tweaking can be effective but that there are two central risks. First, tweaking risks overfitting the simulation to the data and thus compromising predictive accuracy; and second, it risks compromising the microfoundationality of the simulation. We evaluate standard responses to tweaking and propose strategies to guard against these risks.
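A minimal sketch of the tweaking problem under stated assumptions: a toy micro-level random-walk simulation has one free parameter that is grid-searched to fit macroscopic calibration data, and the tweaked value is then checked against held-out observations as a rough guard against overfitting. The toy model and the data are invented for illustration; the paper's own strategies are broader than this single check.

```python
# Minimal sketch of parameter "tweaking" with a guard against overfitting:
# calibrate a free parameter of a toy micro-level simulation against observed
# macro data, then check the tweaked value on held-out observations.
import numpy as np

rng = np.random.default_rng(1)

def macro_output(drift: float, n_agents: int = 500, n_steps: int = 50) -> float:
    """Toy microfoundational simulation: mean final position of drifting random walkers."""
    steps = rng.normal(loc=drift, scale=1.0, size=(n_agents, n_steps))
    return float(steps.sum(axis=1).mean())

# Hypothetical macro observations, split into calibration and hold-out sets.
observed = np.array([5.2, 4.8, 5.1, 5.0, 4.9, 5.3])
calib, holdout = observed[:4], observed[4:]

# "Tweaking": grid-search the parameter for the best fit to the calibration data.
candidates = np.linspace(0.0, 0.3, 31)
errors = [abs(macro_output(d) - calib.mean()) for d in candidates]
best = float(candidates[int(np.argmin(errors))])

# Guard: a tweak that only fits the calibration set is suspect.
print("tweaked drift parameter:", round(best, 3))
print("hold-out error:", round(abs(macro_output(best) - holdout.mean()), 3))
```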