Article

in Model-Based Reasoning: Science, Technology, Values,


Abstract

This paper examines several analogies employed in computational data analysis techniques: the analogy to the brain for artificial neural networks, the analogy to statistical mechanics for simulated annealing and the analogy to evolution for genetic algorithms. After exploring these analogies, we compare them to analogies in scientific models and highlight that scientific models address specific empirical phenomena, whereas data analysis models are application-neutral: they can be used whenever a set of data meets certain formal requirements, regardless of what phenomenon these data pertain to.
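The claim that data analysis models are application-neutral can be made concrete with a minimal, hypothetical sketch of simulated annealing (the function names and parameters are illustrative, not taken from the paper). The routine depends only on a cost function and a neighbor-generating function meeting formal requirements; nothing in it refers to the phenomenon the data pertain to:

```python
import math
import random

def simulated_annealing(cost, neighbor, state, t0=10.0, cooling=0.95, steps=200):
    """Minimize `cost` over states. The only formal requirements are a cost
    function and a neighbor function; the domain is irrelevant."""
    temperature = t0
    best = state
    for _ in range(steps):
        candidate = neighbor(state)
        delta = cost(candidate) - cost(state)
        # Always accept improvements; accept worse states with a probability
        # that falls as the "temperature" drops -- the statistical-mechanics
        # analogy of a cooling system settling into a low-energy state.
        if delta < 0 or random.random() < math.exp(-delta / temperature):
            state = candidate
            if cost(state) < cost(best):
                best = state
        temperature *= cooling
    return best

# Application-neutral: the same routine minimizes any cost function,
# here an arbitrary quadratic over the reals.
random.seed(0)
result = simulated_annealing(
    cost=lambda x: (x - 3.0) ** 2,
    neighbor=lambda x: x + random.uniform(-1, 1),
    state=0.0,
)
```

Swapping in a different `cost` and `neighbor` (say, over permutations rather than reals) changes nothing in the routine itself, which is exactly the application-neutrality the abstract describes.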


... mathematical or mechanical, to draw inferences about phenomena. In the model-based view, scientific practice is understood in terms of constructing and reasoning with models and idealized phenomena (Freudenthal 1961; Magnani and Nersessian 2002; Bailer-Jones 2002; Nersessian 2008). That is, scientific models are concerned with idealized phenomena rather than actual phenomena. ...
Article
Full-text available
It is argued that iterative computations which are attested in Mesopotamian and other ancient sources can be productively analyzed and interpreted in a simulation-based framework. Ancient Mesopotamia presents us with a rich body of textual evidence for computational practices over a period of more than three millennia. This paper is concerned with Mesopotamian iterative computations of empirical phenomena, where each iteration updates the values of certain quantities from one state to the next state. It will be argued that these computations can be fruitfully interpreted in the so-called simulation-based framework, which was recently developed by philosophers of science in order to better account for the role of simulations in modern science. This is exemplified on the basis of a text from the Ur III period (2100–2000 BCE) about the growth of a cow herd. Other Mesopotamian sources with iteratively computed sequences, in particular various types of mathematical tables, are ignored here, because they do not directly correspond to any phenomena. Section 1 briefly addresses some developments in the philosophy and historiography of science in order to introduce the simulation-based framework. Section 2 discusses the textual example. Section 3 contains the conclusions.
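The kind of iterative state-updating computation the abstract describes can be sketched in a few lines (the growth rate and starting figures below are illustrative assumptions, not values from the Ur III text): each pass of the loop advances the herd's quantities from one yearly state to the next.

```python
def grow_herd(cows, years, growth_rate=0.25):
    """Iteratively update a herd count year by year, recording each state."""
    history = [cows]
    for _ in range(years):
        # Each iteration maps the current state to the next:
        # yearly calves are added, with fractions rounded down.
        cows = cows + int(cows * growth_rate)
        history.append(cows)
    return history

states = grow_herd(40, 4)
```

The sequence of intermediate states, not just the final figure, is what makes such a computation interpretable as a simulation of the phenomenon's development over time.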
... However, while constructivist epistemologies are better suited to describe their own research practices, it appears that teachers in academia often express themselves in a vocabulary closer to the traditional empiricist view of science, even when designing educational approaches such as project-based learning (PjBL) that focus on learning to conduct scientific research. Examples of contributions to constructivist epistemologies: the critical evaluation of laws of nature, initiated by Cartwright (1983, 1989, 1999); the emphasis on the role of interventions in scientific research by Hacking (1983); the issue of applying science (Boon, 2006; Cartwright, 1974); the roles in scientific reasoning of analogies (Hesse, 1966; Nersessian, 2009b), concepts and formation of concepts (Rheinberger, 1997; Feest, 2008; Andersen, 2012; Nersessian, 2009b; Boon, 2012; Rouse, 2011), conceptual change (Andersen, 2012; Andersen & Nersessian, 2000; Kuhn, 1970; Nersessian, 1992), scientific understanding (De Regt et al., 2009), models (Bailer-Jones, 2009; Morrison & Morgan, 1999), modeling and model-based reasoning (Boon & Knuuttila, 2009; Giere, 1988, 1999, 2006, 2010; Knuuttila & Boon, 2011; Magnani, 2014; Magnani & Bertolotti, 2017; Nersessian, 2009a; Nersessian & Patton, 2009), epistemic and pragmatic criteria (Chang, 2009, 2014, 2017, 2020; Hacking, 1992; Kuhn, 1970), and inductive risk and values (Biddle, 2016; Douglas, 2000; Kukla, 2016; Wilholt, 2009, 2013); the role of context in deriving phenomena from data (Bogen & Woodward, 1988; Leonelli, 2011, 2014, 2019; Leonelli & Boumans, 2020); the roles of perspectives through theories, concepts, and technological instruments (Boon, 2020a; Giere, 2006; Van Fraassen, 2008); the challenges of interdisciplinarity (MacLeod, 2018); and the role of experimentation and technological instruments (Hansson, 2015; Radder, 2003; Rheinberger, 1997).
Therefore, our educational angle concerns educational approaches in teaching scientific research (Section 2). ...
... In our contribution to redesigning PjBL in a biomedical engineering bachelor program, a constructivist epistemology guides our vocabulary for discussing scientific research. Thus, instead of explaining scientific research firstly in terms of hypotheses and tests, we propose that modeling and model-based reasoning are central to the construction of knowledge in practice-oriented scientific research practices (Bailer-Jones, 2009; Boon, 2020b; Boon & Knuuttila, 2009; Magnani, 2014; Magnani & Bertolotti, 2017; Morrison & Morgan, 1999; Nersessian, 2009a; Newstetter, 2005). In particular, we focus on conceptual modeling (rather than mathematical modeling, which is much more common as a learning objective). ...
Article
Full-text available
The complex societal challenges of the 21st Century require scientific researchers and academically educated professionals capable of conducting scientific research in complex problem contexts. Our central claim is that educational approaches inspired by a traditional empiricist epistemology insufficiently foster the required deep conceptual understanding and higher-order thinking skills necessary for epistemic tasks in scientific research. Conversely, we argue that constructivist epistemologies (developed in the philosophy of science in practice) provide better guidance to educational approaches to promote research skills. We also argue that teachers adopting a constructivist learning theory do not necessarily embrace a constructivist epistemology. On the contrary, in educational practice, novel educational approaches that adopt constructivist learning theories (e.g., project-based learning, PjBL) often maintain traditional empiricist epistemologies. Philosophers of science can help develop educational designs focused on learning to conduct scientific research, combining constructivist learning theory with constructivist epistemology. We illustrate this by an example from a bachelor's program in Biomedical Engineering, where we introduce conceptual models and modeling as an alternative to the traditional focus on hypothesis testing in conducting scientific research. This educational approach includes the so-called B&K method for (re-)constructing scientific models to scaffold teaching and learning conceptual modeling.
... EOCS has debated different topics including the definition of what a computer simulation is (Hartmann, 1996), the nature of computational models in simulative sciences and their relations with the simulated systems (Humphreys, 2004), the role of confirmation theory in computer simulations (Winsberg, 1999), the problem of the relation between verification and validation of computational models (Winsberg, 2010), the epistemological status of computer simulations with respect to scientific experiments (Guala, 2002; Morgan et al., 2002; Parker, 2009), the nature of abstraction and idealization of computational models (Suárez, 2008), and the philosophical novelty of computer simulations with respect to traditional science (Frigg and Reiss, 2009). ...
... In the end, performing wet experiments on a mammalian cell, to argue about the regulations of some chemical species of all mammalian cells of some type, means utilizing the experimented cell as a model. Guala (2002) and Morgan et al. (2002) go on to highlight that, in those cases, experiments are nevertheless performed over object systems (such as a single mammalian cell) bearing material similarities with their target systems (mammalian cells of some type). By contrast, computer simulations are performed over object systems (programs) bearing formal similarities with their target systems. ...
Article
Full-text available
The Epistemology Of Computer Simulation (EOCS) has developed as an epistemological and methodological analysis of simulative sciences using quantitative computational models to represent and predict empirical phenomena of interest. In this paper, Executable Cell Biology (ECB) and Agent-Based Modelling (ABM) are examined to show how one may take advantage of qualitative computational models to evaluate reachability properties of reactive systems. In contrast to the thesis, advanced by EOCS, that computational models are not adequate representations of the simulated empirical systems, it is shown how the representational adequacy of qualitative models is essential to evaluate reachability properties. Justification theory, even if it does not play an essential role in EOCS, is shown to be involved in the process of advancing and corroborating model-based hypotheses about empirical systems in ECB and ABM. Finally, the practice of evaluating model-based hypotheses by testing the simulated systems is shown to constitute an argument in favour of the thesis that computer simulations in ECB and ABM can be put on a par with scientific experiments.
... Because they can be kept liberal, analogical models have played important roles in scientific research, as they can give rise to questions and suggest new hypotheses. Specifically, the heuristic role that analogies play in theory construction and creative thought is recognized [2,3], and so is the cognitive functioning of models, because they allow for surrogative learning [4] and model-based reasoning [5,6]. This paper combines music and mental illness, specifically heavy metal music and the symptomatology of bipolar disorder, in an analogous model. ...
... The analogous model presented here could potentially be used in psychotherapy or cognitive behavioral therapy to provide patients with a novel opportunity to assess if and how representatively the music reflects their sentiments during depression or (hypo)mania episodes. This could provide opportunities for self-reflection and model-based reasoning [5,6]. Further research could expand on this assumption for the potential application of this analogous model in clinical therapy. ...
Article
Full-text available
This paper builds a link between isolated domains within the arts and sciences, specifically between music and psychiatry. An analogous model is presented that associates heavy metal music with bipolar disorder, a form of mental illness. Metal music consists of a variety of subgenres with distinct manifestations of song, rhythm, instrumentation, and vocal structure. These manifestations are analogous to the symptomatology of bipolar disorder, specifically the recurrent episodes of (hypo)mania and depression. Examples of songs are given which show these analogies. Besides creating a subjective link between apparently unconnected knowledge domains, these analogies could play a heuristic role in clinical applications and education about the disorder and mental illnesses at large.
... Rodrik argues that models function like experiments in this sense. They are like thought experiments (Mäki, 1992, 2005; Morgan, 2002; Rodrik, 2015, pp. 24, 114). ...
... An important step for seeing how models are used in explanations is to understand that economic models help us answer what-if-things-had-been-different questions (Woodward, 1984, 2003; Morgan, 1999, 2001, 2002; Ylikoski and Aydinonat, 2014). Hence, models help us produce a menu of causal scenarios, or a menu of possible explanations (cf. ...
Preprint
Full-text available
In Economics Rules, Dani Rodrik (2015) argues that what makes economics powerful despite the limitations of each and every model is its diversity of models. Rodrik suggests that the diversity of models in economics improves its explanatory capacities, but he does not fully explain how. I offer a clearer picture of how models relate to explanations of particular economic facts or events, and suggest that the diversity of models is a means to better economic explanations.
... Many experiments are run with an eye to systems different from the system experimented on. Morgan (2002) reports classroom experiments by Chamberlin and Smith that were used to understand the behavior of humans in real markets outside the classroom. If the experiment is run in a laboratory, but supposed to be interesting for a system outside the lab, it is tempting to mark the contrast between both systems by speaking of the lab and the real world. ...
... Some authors, e.g. Guala (2002), Morgan (2002, 2003), and Winsberg (2009b), seem to reject the claim that CSs are experiments. At least they wish to maintain a distinction between CSs and experiments, as does Winsberg (2009b). ...
Article
Computer simulations and experiments share many important features. One way of explaining the similarities is to say that computer simulations just are experiments. This claim is quite popular in the literature. The aim of this paper is to argue against the claim and to develop an alternative explanation of why computer simulations resemble experiments. To this purpose, experiment is characterized in terms of an intervention on a system and of the observation of the reaction. Thus, if computer simulations are experiments, either the computer hardware or the target system must be intervened on and observed. I argue against the first option using the non-observation argument, among others. The second option is excluded by e.g. the over-control argument, which stresses epistemological differences between experiments and simulations. To account for the similarities between experiments and computer simulations, I propose to say that computer simulations can model possible experiments and do in fact often do so.
... This complexity and the desire for improved availability, reliability and dependability require the development of systematic diagnosis approaches to detect and isolate a fault. Various diagnosis approaches have been proposed, including, for example, fault trees, expert systems, neural networks, fuzzy logic, and Bayesian networks [1,2,3,4]. Diagnosing a system means providing fault candidates that can explain the observations collected during system operation, within a bounded delay. ...
... First, diagnosers are introduced as observers which reconstruct information about the process and help users in their decisions. One of the major distinctions among diagnosis approaches is whether Model-Based Reasoning is used or not [1]. Modeling the system behavior is often computationally expensive but brings advantages, such as formalization or instantiation of equipment. ...
Article
Full-text available
In Discrete Event Systems (DES), there are two basic approaches to diagnosis: the first is diagnosers, and the second is Causal Temporal Signatures (CTS) and chronicles. The first approach has limitations, including the issue of combinatorial explosion; on the other hand, it offers tools to study the diagnosability of the constructed models. CTS are easier to write but pose the problem of guaranteeing the completeness of a given base, i.e., that there is at least one CTS in the set of CTS for every fault in the monitored system. This study proposes a method to guarantee the completeness of a set of CTS. The method is based on a translation of the formalism and model of a diagnoser into CTS. From these CTS, a recognition algorithm based on the concept of a 'world' is used, where a 'world' is defined as a set of coherent hypotheses of assignment of the events received by the diagnostic task.
... In our approach, we consider a digital reconstruction as one of the tools that humanities scholars have at their disposal to enhance their understanding of past (built) environments. In this sense, they can be equated to simulations and to the visual expressions of a model-based reasoning that in other fields is already seen as central to the process of knowledge building (Magnani & Nersessian, 2002;Magnani, Nersessian, & Thagard, 1999). Especially, the possibility to replicate real-world spatial properties in the 3D reconstruction opens up a range of analytical opportunities. ...
Article
Full-text available
This paper presents our ongoing work in the Virtual Interiors project, which aims to develop 3D reconstructions as geospatial interfaces to structure and explore historical data of seventeenth-century Amsterdam. We take the reconstruction of the entrance hall of the house of the patrician Pieter de Graeff (1638–1707) as our case study and use it to illustrate the iterative process of knowledge creation, sharing, and discovery that unfolds while creating, exploring and experiencing the 3D models in a prototype research environment. During this work, an interdisciplinary dataset was collected, various metadata and paradata were created to document both the sources and the reasoning process, and rich contextual links were added. These data were used as the basis for creating a user interface for an online research environment, taking design principles and previous user studies into account. Knowledge is shared by visualizing the 3D reconstructions along with the related complexities and uncertainties, while the integration of various underlying data and Linked Data makes it possible to discover contextual knowledge by exploring associated resources. Moreover, we outline how users of the research environment can add annotations and rearrange objects in the scene, facilitating further knowledge discovery and creation.
... Standard logic underlies, rather, the construction of simplified models which fail to capture the essential dynamics of biological and cognitive processes, such as reasoning [11]. LIR does not replace classical binary or multi-valued logics but reduces to them for simple systems and situations. ...
... (A similar view of application appears to be endorsed by Morgan 2002.) To see how this works, consider a standard auction theory model, say, one that claims that first-price auctions lead to bids lower than bidders' true valuations. ...
Article
The 1994 US spectrum auction is now a paradigmatic case of the successful use of microeconomic theory for policy-making. We use a detailed analysis of it to review standard accounts in philosophy of science of how idealized models are connected to messy reality. We show that in order to understand what made the design of the spectrum auction successful, a new such account is required, and we present it here. Of especial interest is the light this sheds on the issue of progress in economics. In particular, it enables us to get clear on exactly what has been progressing, and on exactly what theory has – and has not – contributed to that. This in turn has important implications for just what it is about economic theory that we should value.
... Standard logic underlies, rather, the construction of simplified models which fail to capture the essential dynamics of biological and cognitive processes, such as reasoning (Magnani, 2002). LIR does not replace classical binary or multi-valued logics but reduces to them for simple systems and situations. ...
Article
The recent history of information theory and science shows a trend in emphasis from quantitative measures to qualitative characterizations. In parallel, aspects of information are being developed, for example by Pedro Marijuan, Wolfgang Hofkirchner and others, that are extending the notion of qualitative, non-computational information in the biological and cognitive domain to include meaning and function. However, there is as yet no consensus on whether a single acceptable definition or theory of the concept of information is possible, leading to many attempts to view it as a complex, a notion with varied meanings or a group of different entities. In my opinion, the difficulties in developing a Unified Theory of Information (UTI) that would include its qualitative and quantitative aspects and their relation to meaning are a consequence of implicit or explicit reliance on the principles of standard, truth-functional bivalent or multivalent logics. In reality, information processes, like those of time, change and human consciousness, are contradictory: they are regular and irregular; consistent and inconsistent; continuous and discontinuous. Since the indicated logics cannot accept real contradictions, they have been incapable of describing the multiple but interrelated characteristics of information. The framework for the discussion of information in this paper will be the new extension of logic to real complex processes that I have made, Logic in Reality (LIR), which is grounded in the dualities and self-dualities of quantum physics and cosmology. LIR provides, among other things, new interpretations of the most fundamental metaphysical questions present in discussions of information at physical, biological and cognitive levels of reality including, especially, those of time, continuity vs. discontinuity, and change, both physical and epistemological.
I show that LIR can constitute a novel and general approach to the non-binary properties of information, including meaning and value. These properties subsume the notion of semantic information as well-formed, meaningful and truthful data as proposed most recently by Luciano Floridi. LIR supports the concept of ‘biotic’ information of Stuart Kauffmann, Robert Logan and their colleagues and that of meaningful information developed by Christophe Menant. Logic in Reality does not pretend to the level of rigor of an experimental or mathematical theory. It is proposed as a methodology to assist in achieving a minimum scientific legitimacy for a qualitative theory of information. My hope is that by seeing information, meaning and knowledge as dynamic processes, evolving according to logical rules in my extended sense of logic, some of the on-going issues on the nature and function of information may be clarified.
... Only relatively recently, in the second half of the twentieth century, have philosophers of science more specifically identified some forms of scientific thinking as idealizing and modeling, since "model-based reasoning" has become extremely important in contemporary science: "idealization is indispensable in theoretical science and the result of idealization is the construction of models that may not obtain in nature" (Liu 2004: 382; cf. Elgin 2007: 33; Magnani and Nersessian 2002; Nowak 1992). Theorists of all kinds regularly operate what they call "distorted models," "Galilean idealizations," and "Galilean experiments" (cf. ...
Article
Full-text available
Waldemar Hanasz, "Machiavelli's Method Revisited: The Art and Science of Modeling". Abstract: Machiavelli emphasizes that successful political action is impossible without proper methods of reasoning. However, while most readers agree that his “new modes and orders” changed the values and goals of political thinking, there is no such agreement concerning the novelty of his methodological approach. This article explores Machiavelli’s method in more detail and provides an entirely new reading demonstrating how advanced his method of theoretical reasoning was. He constructed a theoretical framework which corresponded to the political reality in very peculiar ways because it was not really based on facts and observations. In fact, his constructs systematically deformed the reality. They were abstract models methodically selecting some aspects of politics and neglecting others in order to grasp the nature of political mechanisms. The article’s sections reconstruct the main components of Machiavelli’s technique: the steps of universalization, reduction, and idealization constructing the models of individual actors and political bodies. Such methods of theoretical thinking were entirely new in his time – although firmly rooted in the Renaissance of arts and sciences – and developed only much later. Today scientists and philosophers of science recognize such modeling methods as significant steps in the development of scientific thinking and such models are commonly used in contemporary science. Machiavelli made some pioneering steps in that important direction.
... Many philosophers approach the difference in ontological terms. Perhaps experimental subjects are materially continuous with their targets, while simulations and their targets are 'made of different stuff', and perhaps this makes a difference to the kinds of epistemic tasks they can perform (see Morgan 2002, Harre 2003, Guala 2002, Parker 2009, Winsberg 2010, Parke 2014). Others, such as Parker (2008) and Winsberg (2003, 2009), compare the two via their epistemic capacities. ...
Article
I develop an account of productive surprise as an epistemic virtue of scientific investigations which does not turn on psychology alone. On my account, a scientific investigation is potentially productively surprising when (1) results can conflict with epistemic expectations, (2) those expectations pertain to a wide set of subjects. I argue that there are two sources of such surprise in science. One source, often identified with experiments, involves bringing our theoretical ideas in contact with new empirical observations. Another, often identified with simulations, involves articulating and bringing together different parts of our knowledge. Both experiments and simulations, then, can surprise.
... Since models are vehicles for learning about the world, studying a model makes it possible to discover the system characteristics that it describes. This cognitive function of models is well known and has given rise to 'model-based reasoning' (Magnani and Nersessian, 2002). Mental models act as inferential frameworks (Gentner and Gentner, 1983) and influence decision-making, which takes place through feedback loops (Forrester, 1961). ...
Article
Full-text available
In project management research, it is acknowledged that two perspectives on project performance must be considered: project efficiency (delivering efficient outputs) and project success (delivering beneficial outcomes). The first perspective is embedded in a deterministic paradigm of project management, while the second appears more naturally connected to the emerging non-deterministic paradigm. Complexity and uncertainty are key constructs frequently associated with the non-deterministic paradigm. This conceptual paper suggests that these two concepts could very well explain and define particularities of both paradigms, and seeks to articulate both perspectives in a contingent model. First, the constructs of complexity and uncertainty are clarified. Second, the role of project managers' mental models in managerial decision-making is considered. In the third part of this article, we propose a theoretical model suggesting that project managers should consider contingent variables to differentiate managerial conditions of regulation from managerial conditions of emergence.
... Model-based reasoning is applied to, among others, thought experiments, visual representations, and analogical reasoning (Magnani, Nersessian, & Thagard, 1999; Magnani & Nersessian, 2002). As Giere (1999) emphasizes, models are not only tools but play a central role in the construction of knowledge. "Models are important, not as expressions of belief, but as vehicles for exploring the implications of ideas (McClelland, 2010)" (Rogers & McClelland, 2014). One of the ways of acquiring knowledge, besides deduction and induction, is abduction, which leads to knowledge discovery. ...
Chapter
Full-text available
Computational models and tools provide increasingly solid foundations for the study of cognition and model-based reasoning, with knowledge generation in different types of cognizing agents, from the simplest ones like bacteria to the complex human distributed cognition. After the introduction of the computational turn, we proceed to models of computation and the relationship between information and computation. A distinction is made between mathematical and computational (executable) models, which are central for biology and cognition. Computation as it appears in cognitive systems is physical, natural, embodied, and distributed computation, and we explain how it relates to the symbol manipulation view of classical computationalism. As present day models of distributed, asynchronous, heterogeneous, and concurrent networks are becoming increasingly well suited for modeling of cognitive systems with their dynamic properties, they can be used to study mechanisms of abduction and scientific discovery. We conclude the chapter with the presentation of software modeling with computationally automated reasoning and the discussion of model transformations and separation between semantics and ontology.
... The experimental system subject to direct intervention in simulation studies is a material/physical system, namely a computer with its software; hence computer experiments are material experiments in the literal sense. Morgan (2002, 2003), while agreeing with the view that many simulation studies can be classified as material experiments, nevertheless points to their 'degree of materiality' (Morgan, 2003, p. 231), which gives traditional experiments greater epistemic value than computer simulation. Parker (2009) disputes this view, arguing that even if the target and experimental systems are made of the same materials, they are not always similar in all relevant respects, and thus conclusions about the target system may be unjustified. ...
... Significant parts of scientific research are carried out on models rather than on the real phenomena, because by studying a model we can discover features of and ascertain facts about the system the model stands for. This cognitive function of models has been widely recognized in the literature, and some researchers even suggest that models give rise to a new form of reasoning, the so-called "model-based reasoning" (Magnani & Nersessian, 2002), while modelling ability is also associated with model-based reasoning (Chittleborough & Treagust, 2007). It is well known that scientific theories are developed through a process of continuous elaboration and modification in which scientific models are developed and transformed to account for new phenomena that are uncovered. ...
Article
Full-text available
The computational experiment approach considers models as the fundamental instructional units of Inquiry-Based Science and Mathematics Education (IBSE) and STEM education, where the model takes the place of the "classical" experimental setup and simulation replaces the experiment. Argumentation in IBSE and STEM education is related to the engagement of students/learners in a process where they make claims and use models and data to support their conjectures and justify or disprove their ideas. Threshold Concepts (TCs) are of particular interest to IBSE and STEM education, representing a "transformed way of understanding, or interpreting or viewing something without which the learner cannot progress." The purpose of this study is to explore the effects of the IBSE teaching approach on university students': (a) argumentation; (b) involvement in the use of modelling indicators; and (c) acquisition of certain threshold concepts in physics and mathematics. 79 pre-service engineering school university students participated in this research, and results indicate that the computational experiment can help students' acquisition of threshold concepts and improve their level of argumentation as well as the use of modelling indicators.
... Stable macrophysical objects and simple situations, which can be discussed within binary logic, are the result of processes going in the direction of a "non-contradictory" identity. Standard logic underlies, rather, the construction of simplified models which fail to capture the essential dynamics of biological and cognitive processes, such as reasoning (Magnani, 2002). LIR does not replace standard bivalent or multivalent logics but reduces to them for simple systems. ...
Article
Full-text available
The conjunction of the disciplines of computing and philosophy implies that discussion of computational models and approaches should include explicit statements of their underlying worldview, given the fact that reality includes both computational and non-computational domains. As outlined at ECAP08, both domains of reality can be characterized by the different logics applicable to them. A new "Logic in Reality" (LIR) was proposed as best describing the dynamics of real, non-computable processes. The LIR process view of the real macroscopic world is compared here with recent computational and information-theoretic models. Proposals that the universe can be described as a mathematical structure equivalent to a computer or by simple cellular automata are deflated. A new interpretation of quantum superposition as supporting a concept of paraconsistent parallelism in quantum computing and an appropriate ontological commitment for computational modeling are discussed.
... In transformational learning situations that require cognitive struggle, learning is aided by the use of external representations (analogies, metaphors, diagrams, and/or visual models) that enable offloading of cognitive effort. The process of using external representations of internal mental models as a mechanism for reasoning about complex subjects has been termed "model-based reasoning" in cognitive science (Magnani and Nersessian 2002). Externalizations facilitate offloading, abstraction, and summarizing of complex information so that learners can grasp and transform more information. ...
Article
Full-text available
A well-known barrier to successful interdisciplinary work is the difficulty of integrating knowledge across disciplines. Integrated conceptualizations must leverage the combined knowledge of team members in productive ways for a given problem. The process of knowledge integration has been investigated from a variety of disciplinary perspectives, including organizational science, team psychology, social science, and the learning sciences. These various perspectives are converging on a few key processes that mediate successful knowledge integration: ability to learn each other’s perspectives, participatory processes, and flexible, adaptive problem formulation. This article summarizes key findings from the research literature on knowledge integration and presents a new conceptual model for developing interdisciplinary conceptualizations that links individual, group, and system factors. The model provides clarity regarding the interactions between individual learning and group processes and the challenges these present, identifies strategies for overcoming those challenges, and frames the problem as one of developing a new distributed cognitive system.
Article
Full-text available
In the epistemology of computer simulations there exists an argumentative strategy that has come to be known as the "materiality principle" (J. Durán). The strategy consists in ascribing to computer simulations in science epistemic characteristics of one level or another on the basis of their ontological similarity (degree of "materiality") to material experiments. This comparison is partly motivated by linguistic intuitions: computer modelling is often called a "computational experiment", and the like. If this is correct, then computer simulations should be regarded as a subspecies of scientific modelling and assigned a corresponding intermediate position between the theoretical and experimental levels of scientific knowledge (as is done within contemporary approaches to the structure of scientific knowledge, namely the semantic and the pragmatic). Four variants of implementing the "materiality principle" are considered, arguing both for and against the reliability of computer simulations. Computer simulations blur the boundary between material and virtual (computational) experimentation, and the "materiality principle" can serve as an argument both for and against this distinction. Two main arguments against the materiality principle, conventionally called the "opacity argument" and the "multiple realizability argument", are presented and strengthened. It is shown that the materiality principle is insufficient to provide a well-grounded justification of the epistemological status of computer simulations.
Preprint
Full-text available
Superposition, i.e. the ability of a particle (electron, photon) to occur in different states or positions simultaneously, is a hallmark of the subatomic world of quantum mechanics but nonsensical from the perspective of macro-systems such as ecosystems and other complex systems of people and nature. Using time series and spatial analysis of bird, phytoplankton and benthic invertebrate communities, this paper shows that superposition can occur analogously in redundancy analysis (RDA), a form of canonical ordination frequently used by ecologists. Specifically, we used correlation analysis to show that species can be associated simultaneously with different orthogonal axes in RDA models, a pattern reminiscent of superposition. We discuss this counterintuitive result in relation to the statistical and mathematical features of RDA and the recognized limitations of current traditional species concepts based on vegetative morphology. We suggest that such 'quantum weirdness' is reconcilable with classical ecosystems logic when the focus of research shifts from morphological species to cryptic species that consist of genetically and ecologically differentiated subpopulations. We support our argument with theoretical discussions of eco-evolutionary interpretations that should become testable once suitable data are available.
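The RDA procedure this abstract refers to can be sketched numerically: regress the (centred) community matrix on the explanatory variables, then take a PCA of the fitted values; the resulting canonical axes are orthogonal, yet a single species can still load on several of them at once. The following is a minimal illustrative sketch, not the authors' code; the function name, data shapes and variables are assumptions for illustration:

```python
import numpy as np

def rda(Y, X):
    """Minimal redundancy analysis (RDA) sketch.

    Y: (n_sites, n_species) community matrix
    X: (n_sites, n_predictors) explanatory matrix
    Returns site scores and species loadings on the canonical axes.
    """
    # Centre both matrices column-wise
    Yc = Y - Y.mean(axis=0)
    Xc = X - X.mean(axis=0)
    # Step 1: multivariate linear regression of Y on X
    B, *_ = np.linalg.lstsq(Xc, Yc, rcond=None)
    Y_fit = Xc @ B
    # Step 2: PCA (via SVD) of the fitted values -> constrained axes
    U, s, Vt = np.linalg.svd(Y_fit, full_matrices=False)
    site_scores = U * s        # site scores on the canonical axes
    species_scores = Vt.T      # species loadings on the canonical axes
    return site_scores, species_scores
```

Because the axes are mutually orthogonal by construction, a species that correlates with two axes at the same time is a structural feature of the ordination rather than a computational artefact, which is the "superposition-like" pattern the abstract describes.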
Preprint
Full-text available
(1) An introduction to the principles of conceptual modelling, combinatorial heuristics and epistemological history; (2) the examination of a number of perennial epistemological-methodological schema: (a) conceptual spaces and conceptual blending; (b) ars inveniendi and ars demonstrandi as two pillars of rational science; (c) two modes of analysis and synthesis – sequential and compositional; (d) taxonomies and typologies as two fundamental epistemic structures; (e) extended cognition, operative symbolism and model-based reasoning.
Article
Full-text available
When one wants to use citizen input to inform policy, what should the standards of informedness on the part of the citizens be? While there are moral reasons to allow every citizen to participate and have a voice on every issue, regardless of education and involvement, designers of participatory assessments have to make decisions about how to structure deliberations as well as how much background information and deliberation time to provide to participants. After assessing different frameworks for the relationship between science and society, we use Philip Kitcher's framework of Well-Ordered Science to propose an epistemic standard on how citizen deliberations should be structured. We explore what potential standards follow from this epistemic framework focusing on significance versus scientific and engineering expertise. We argue that citizens should be tutored on the historical context of why scientific questions became significant and deemed scientifically and socially valuable, and if citizens report that they are capable of weighing in on an issue then they should be able to do so. We explore what this standard can mean by looking at actual citizen deliberations tied to the 2014 NASA ECAST Asteroid Initiative Citizen forums. We code different vignettes of citizens debating alternative approaches for Mars exploration based upon what level of information seemed to be sufficient for them to feel comfortable in making a policy position. The analysis provides recommendations on how to design and assess future citizen assessments grounded in properly conveying the historical value context surrounding a scientific issue and trusting citizens to seek out sufficient information to deliberate.
Preprint
If a specific theory has not remained constant over time, then neither have the models related to that theory.
Article
Full-text available
[full article, abstract in English; abstract in Lithuanian] The article examines the modern computer-based educational environment and the requirements of the possible cognitive interface that enables the learner’s cognitive grounding by incorporating abductive reasoning into the educational process. Although the main emphasis is on cognitive and physiological aspects, the practical tools for enabling computational thinking in a modern constructionist educational environment are discussed. The presented analytical material and developed solutions are aimed at education with computers. However, the proposed solutions can be generalized in order to create a computer-free educational environment. The generalized paradigm here is pragmatism, considered as a philosophical assumption. By designing and creating a pragmatist educational environment, a common way of organizing computational thinking that enables constructionist educational solutions can be found.
Article
The purpose of this essay is to summarize and critically evaluate the epistemological and pragmatic questions with regard to computer simulations as a new technological-scientific format as put forth in current philosophical debates. Computer simulation practices are situated in the broader context of model-building practices and experimentation; the scope and limits of knowledge generated by computer simulations are considered.
Article
Full-text available
The question of where, between theory and experiment, computer simulations (CSs) locate on the methodological map is one of the central questions in the epistemology of simulation (cf. Saam Journal for General Philosophy of Science, 48, 293–309, 2017). The two extremes on the map have them either be a kind of experiment in their own right (e.g. Barberousse et al. Synthese, 169, 557–574, 2009; Morgan 2002, 2003, Journal of Economic Methodology, 12(2), 317–329, 2005; Morrison Philosophical Studies, 143, 33–57, 2009; Morrison 2015; Massimi and Bhimji Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, 51, 71–81, 2015; Parker Synthese, 169, 483–496, 2009) or just an argument executed with the aid of a computer (e.g. Beisbart European Journal for Philosophy of Science, 2, 395–434, 2012; Beisbart and Norton International Studies in the Philosophy of Science, 26, 403–422, 2012). There exist multiple versions of the first kind of position, whereas the latter is rather unified. I will argue that, while many claims about the ‘experimental’ status of CSs seem unjustified, there is a variant of the first position that seems preferable. In particular I will argue that while CSs respect the logic of (deductively valid) arguments, they neither agree with their pragmatics nor their epistemology. I will then lay out in what sense CSs can fruitfully be seen as experiments, and what features set them apart from traditional experiments nonetheless. I conclude that they should be seen as surrogate experiments, i.e. experiments executed consciously on the wrong kind of system, but with an exploitable connection to the system of interest. Finally, I contrast my view with that of Beisbart (European Journal for Philosophy of Science, 8, 171–204, 2018), according to which CSs are surrogates for experiments, arguing that this introduces an arbitrary split between CSs and other kinds of simulations.
Thesis
To what degree are how and what we think determined by how and what we see? What exactly do we mean when we say "I see"? And what do we really see? Thinking is primarily (solely, most would claim) framed by language, but it is also in good part due to our ability to project mental images -concepts- into a flow of consciousness. Ever since Plato's cave story, man has imprisoned himself in the problematic of images. The invention of photography has made matters more complex by trapping him in another set of ideas about representation, no longer of the order of the imaginary but of the indexical. The shift from analog to digital imaging and the vast possibilities of discovery and transmission that have emerged with these new technologies and networks have exposed us to a completely new set of images to look at, and thus demand of us a complete rethinking of the way we see. In light of Vilém Flusser's hypothesis on the quantum quality of photographic images, I investigate the claim that a single image holds all other images. I explore and discuss the idea that photography is neither a fixed medium, nor does it fix anything; rather, it is transient, fluid, undetermined, floating. It does not respond to the Barthesian logic of "ça a été" (it has been) but to a constantly renewed "c'est" (it is), which is always in becoming, hence never a witness. I review and study different historical, scientific and philosophical modes of thinking the image, taking into account the positions of Jonathan Crary, Jacques Rancière, Vilém Flusser and Paul Virilio, among others, along with understanding cognitive constructions and proposing parallels with the quantum theories of François Martin, which are based on Jung's conception of archetypes, and explore what the image means today as a carrier of information.
I further discuss how it is no longer so distinct where images happen, whether in our mind in the form of mental images or out there in front of our eyes; and that indeed there is no longer a dialectical logic of the in and out, but the presence of one permeable volumatic field of correspondences that I call fasciae -in reference to the connective tissue- composed of evening news, memories, snapshots, synthetic images, dreams, fantasies, artworks, etc., which belongs to our collective memory, responds to a quantum logic, and impacts how we perceive and think. This shift from the materiality to the immateriality of the photograph, from paper-based to electricity-based images, has completely changed the way visual information is presented to us, how we process it and relate to it, and how these images form our consciousness and our thoughts and are shared. Our flow of consciousness is matched by an equivalent flow of images that we can tap into uninterrupted, so that it is becoming increasingly difficult to tell which is which.
Article
The Digital Maps Metaphor (DMM) is suggested as a transdisciplinary research tool to overcome some of the challenges that are potentially inherent in research projects that involve multiple aims, objectives, knowledge claims, and methodologies. Based on the understanding of metaphors as embodied concepts, it is argued that the DMM can be used to structure mappings of the different rationalities within transdisciplinary projects. The advantage of the DMM is illustrated by the metaphor's application to a comprehensive transdisciplinary intervention study in Swedish preschools. The structural mapping brings forth a number of contradictions between the Childhood Map, the Critical and Micro-political map, and the Developmental map. Potentials for new emergent understandings, facilitated by the metaphor, are suggested for the benefit of children's learning and development. Empirical studies of the effectiveness of the metaphor should be the next step in order to assess its usefulness in different educational and scientific contexts.
Article
Student engagement in learning science is both a desirable goal and a long-standing teacher challenge. Moving beyond engagement understood as transient topic interest, we argue that cognitive engagement entails sustained interaction in the processes of how knowledge claims are generated, judged, and shared in this subject. In this paper, we particularly focus on the initial claim-building aspect of this reasoning as a crucial phase in student engagement. In reviewing the literature on student reasoning and argumentation, we note that the well-established frameworks for claim-judging are not matched by accounts of creative reasoning in claim-building. We develop an exploratory framework to characterise and enact this reasoning to enhance engagement. We then apply this framework to interpret two lessons by two science teachers where they aimed to develop students’ reasoning capabilities to support learning.
Chapter
This chapter discusses the apparent paradox that the rise of modeling is at the same time the rise of imagery. It not only gives an extensive overview on the state of the art literature, but it also examines changes induced by ubiquitous computing, shows different forms and functions of design models, investigates their epistemic potential, and discusses the new role of imagery. As shown in the chapters of this volume, it is striking that computer-based modeling indeed does not marginalize image practices. Rather, the reverse is the case. Traditional image practices are modified and complemented by new forms of imagery which strengthen their overall relevance even more. On the operative level, images hence constitute crucial instruments of reflection to develop the design in architecture and engineering science – especially in the age of modeling.
Article
Full-text available
This paper defends the naïve thesis that the method of experiment has per se an epistemic superiority over the method of computer simulation, a view that has been rejected by some philosophers writing about simulation, and whose grounds have been hard to pin down by its defenders. I further argue that this superiority does not depend on the experiment’s object being materially similar to the target in the world that the investigator is trying to learn about, as both sides of dispute over the epistemic superiority thesis have assumed. The superiority depends on features of the question and on a property of natural kinds that has been mistaken for material similarity. Seeing this requires holding other things equal in the comparison of the two methods, thereby exposing that, under the conditions that will be specified, the simulation is necessarily epistemically one step behind the corresponding experiment. Practical constraints like feasibility and morality mean that scientists do not often face an other-things-equal comparison when they choose between experiment and simulation. Nevertheless, I argue, awareness of this superiority and of the general distinction between experiment and simulation is important for maintaining motivation to seek answers to new questions.
Article
Models are central constructs of science teaching and learning. This research aims to report on seven pre-service science teachers' perceptions of and attitudes towards models and the rationale for using models in science teaching. Semi-structured in-depth interviews, an open-item questionnaire, and a five-point Likert scale questionnaire were used to obtain data from the participants. No evidence of negative attitudes towards the use of models was observed among the participants. Although the pre-service science teachers (PSTs) valued the idea that scientific models are important aspects of science teaching and learning, they were hesitant to use and build models as teaching tools. Four categories related to perceptions about the rationale for using models were identified from the data, namely: promoting interest and attention; promoting understanding due to the illustrative and representative nature of models; promoting concretization as an instrumental tool; and promoting theoretical understanding in science. The findings indicated that the PSTs showed positive attitudes towards the use of models in their teaching, but certain factors like level of learner, time, lack of modeling experience, and limited knowledge of models appeared to be affecting their perceptions and attitudes negatively.
Thesis
Full-text available
Beyond professions? Work in the age of social media. This thesis deals mainly with the changes in the professional practices of the workers of a team in an American high-tech multinational, induced by digital information and communication technologies. In the course of the research, an attempt was made to develop a dynamic and practical description of the everyday life of the workers, observed in the "technologically dense environments" (Bruni 2005; Bruni & Gherardi 2007) that they habitually inhabit during their work activity. The "narration" of these work environments facilitated an understanding of the functioning of the "technological infrastructure" (Gherardi 2007) of the team that manages the multinational's presence on social media. It was observed that this technological infrastructure differs completely from those observed in the 1990s by many scholars (Joseph 1994; Heath and Luff 1992; Suchman 1997; 2000; Star 1999; Grosjean 2004). This difference amounts almost to an anthropological and social mutation, evident in the way of working and in the way of self-representation at work.
Thesis
Full-text available
Beyond professions? Work in the age of social media. This thesis concerns, mainly, the changes in the professional practices of workers in a team of an American high-tech multinational, equipped with digital information and communication technologies. In the course of the research, an attempt was made to develop a dynamic and practical description of the everyday life of the workers observed in "technologically dense environments" (Bruni 2005; Bruni & Gherardi 2007), which they habitually inhabit during their work activity. The "narration" of these work environments facilitated an understanding of the functioning of the "technological infrastructure" (Gherardi 2007) of the team that manages the multinational's presence on social media. It is observed, therefore, that this technological infrastructure differs completely from those observed in the 1990s by a number of researchers (Joseph 1994; Heath and Luff 1992; Suchman 1997; 2000; Star 1999; Grosjean 2004). This difference translates almost into an anthropological and social mutation, evident in the way of working and in the manner of self-representation at work.
Chapter
All model concepts in the social sciences can be situated, grounded, and questioned from the standpoint of philosophy of science. The contribution therefore presents cornerstones of the philosophy-of-science debate on scientific models, insofar as they are relevant to the social sciences and can contribute to understanding and reflecting on social-scientific model building. Against this background, we attempt to systematize the research field by means of two overarching aims of social-scientific modelling, in the sense of a shaping structure. In our view, the field can be described in terms of two scientific communities, for whose self-description the concept of mathematical sociology and the theory of rational action, respectively, are central. Corresponding models are briefly presented for illustration.
Chapter
Philosophical analysis of the historical development of modelling, as well as the programmatic statements of the founders of modelling, support three different functions for modelling: for fitting theories to the world; for theorizing; and as instruments of investigation. Rather than versions of data or of theories, models can be understood as complex objects constructed out of many resources that defy simple description. These accounts also suggest a kinship between the ways models work in economics and various kinds of experiment, found most obviously in simulation but equally salient in older traditions of mathematical and statistical modelling.
Article
What are scientific idealizations, what is their epistemic contribution to science, and how is it obtained? In this paper, I introduce three different philosophical analyses of scientific idealization that have been offered as possible answers to the questions above. The differences between them consist in how they understand the role of idealization in the construction of scientific models and the epistemic contribution that they ascribe to scientific models. Nonetheless, a restriction applies to their arguments: although they conceive of idealizations in terms of processes, they tend to attribute their epistemic contribution to their products. In the last section of the paper, I argue that in order to recognize the process-based epistemic contribution of idealizations, the analysis of scientific practices must be taken seriously into account. That said, from a pluralistic point of view, I suggest regarding idealizations as situated in scientific practices; this view may prove useful for their philosophical analysis.
Chapter
Scientific popularizer and railway economist, Lardner was born in Dublin on 3 April 1793 and died on 29 April 1859. He was educated at Trinity College, Dublin, between 1817 and 1827 and is probably best known for his Cabinet Cyclopaedia of 133 volumes, published between 1829 and 1849. Although Lardner’s series was graced by a number of distinguished contributors, he was satirized in the scientific community as ‘Dionysius Diddler’. An astronomer as well as an essayist on numerous scientific topics, Lardner often took side trips into other fields. He studied railway engineering in Paris, and was probably well acquainted with the econo-engineering work at the Ecole des Ponts et Chaussées at a time when Jules Dupuit was actively pursuing economic topics. His sole work relating to economics, Railway Economy (1850), was filled with the kind of factual work and analysis being undertaken by the French engineers and by an American pupil of the Ecole, Charles Ellet. Lardner’s work caught the eye of W.S. Jevons, who claimed that a reading of Railway Economy in 1857 led him to investigate economics in mathematical terms.
Chapter
Full-text available
The main purpose of this paper is to investigate some important aspects of the relationship between thought experiment (hereafter TE) and computer simulation (hereafter CS), from the point of view of real experiment (RE). In the first part of this paper, I shall pass in critical review four important approaches concerning the relationship between TE and CS. None of these approaches, though containing some important insights, has succeeded in distinguishing between CS and TE, on the one hand, and REs, on the other. Neither have they succeeded in distinguishing TEs and REs (Sect. 1–4). In Sect. 5, the paper briefly outlines an account of CSs as compared with TEs that takes REs as a central reference point. From the perspective of the analysis of the empirico-experimental intensions of the concepts of TE, CS, and RE—considering their empirical content and actual performance within a discipline—the attempts to find a distinction in logical kind between TEs, CSs and REs break down: for every particular characteristic of one of these notions there is a corresponding characteristic in the two others. From an epistemological-transcendental point of view, the only difference in kind between TEs and CSs consists in the fact that any simulation, even a computer one, involves a kind of real execution, one that is not merely psychological or conceptual. In TEs the subject operates concretely by using mental concepts in the first person; in contrast, real experiments and simulations involve an 'external' realisation. As shown in Sect. 6, this manifests itself in the higher degree of complexity often found in CSs as compared with TEs.
Chapter
Full-text available
I begin with a typology of reasoning and cross it with types of processes. I demonstrate that the thrust of Plato’s Republic is theory-building. This involves the critical and dialectic processes which are paradigms of Platonic methodology. Book I displays abductive analogical reasoning joined by an induction that is embedded in a deduction; hence there is a deduction–induction–abduction chain. In Book VI, Plato constructs a visual model of the divided line, which also displays model-based and abductive hypothesis generation that is essential to theory building. Book VII provides an abductive metaphor model of the allegory of the cave. Both models depict degrees of reality and the ascendency of knowledge. The multimodal model-based allegory has far reaching applications from criminal justice to information systems. I conclude by capturing the narrative of the Republic as a critical and dialectic process of theory building (of justice) using deductive–inductive–abductive chains, an abductive visual model and an abductive metaphor model. Hence, the Republic is simultaneously a masterpiece of deductive reasoning and a marvel of complex model-based abduction, involving visual models, analogies and metaphors.
Chapter
This paper sets out to show how mathematical modelling can serve as a way of ampliating knowledge. To this end, I discuss the mathematical modelling of time in theoretical physics. In particular I examine the construction of the formal treatment of time in classical physics, based on Barrow's analogy between time and the real number line, and the modelling of time resulting from the Wheeler-DeWitt equation. I will show how mathematics shapes physical concepts, like time, acting as a heuristic means, a discovery tool, which enables us to construct hypotheses on certain problems that would be hard, and in some cases impossible, to understand otherwise.
Chapter
Abduction is reasoning which produces explanatory hypotheses. Models are one basis for such reasoning, and language use can function as a model. I treat children’s early use of mental verbs as a model for dealing with a problem from developmental psychology, namely, how children’s early non-referential use of mental verbs might give children an early grasp of the mental realm. The present paper asks what practical knowledge of mental actions accompanies children’s competent use of mental verbs. I begin with examples of non-referential verb uses and some theories from Diessel and Tomasello (Cognit Linguist 12:97–141, 2001) as bases for discussion. I argue that in using mental verbs non-referentially children understand several kinds of relations which people have to situations. Children learn how to use mental verbs to request someone to search for a situation in the past or in the physical surroundings; they learn to express hopes so as to affect their future; they learn to vouch strongly or weakly for the existence of a situation. In all of these cases it appears that the children’s main focus is on interactions with people, in which one person’s mental action in relation to a situation described in a COMP-clause is intended to have an effect on the other person. Children do not understand the nature or mechanisms of any of these mental actions, but instead focus on practical matters: how to use the verbs to perform certain actions in relation to other people and various situations. It appears that in these early uses children do not view mind as at all separate from the interactional and physical world.
Chapter
Human perception, experience, consciousness, feeling, meaning, thought, and action all require a functioning human brain operating in and through a live body that is in ongoing engagement with environments that are at once physical, interpersonal, and cultural. This embodied perspective demands an explanation of how all of the wondrous aspects of human mind – from our ability to have unified, intelligible experience all the way up to our most stunning achievements of theoretical understanding, imaginative thought, and artistic creativity – can emerge from our bodily capacities. I want to examine how the intricate intertwining of perception and action might provide the basis for our so-called “higher” acts of cognition and communication. In other words, I will explore how important parts of our abstract conceptualization and reasoning appropriate structures and processes of our most basic sensory–motor operations.
Chapter
As Luciano Floridi states in the Introduction to his Philosophy of Information (PI), Information and Communication Technologies (ICTs) have achieved the status of the characteristic technology of our time. The computer and its related devices constitute a “culturally defining technology”, and Information and Communications Systems (ICSs) and ICT applications are among the most strategic factors governing science, the life of society, and its future directions of development. The concept of levels enters inevitably into the Philosophy of Information: one is concerned with their nature, content, and the relations between them, starting from the ‘lowest’ levels of information constituted by physical electronic data themselves. The question of levels of analysis also arises in the Philosophy of Technology, which has emerged as a separate field of study, not coextensive with PI. However, since other papers in this Volume will address the Philosophy of Technology specifically, mine will be limited to aspects of levels in PI. As a tool for the analysis of informational issues, Floridi has constructed in his PI an epistemological framework of Levels of Abstraction (LoAs), defined as non-empty sets of observables. In applying LoAs in various fields, Floridi correctly critiques other uses of ‘levels’ in philosophy (levelism), especially the lack of a satisfactory concept of ontological levels. This paper approaches the problem of levels from a novel perspective, namely, that of an extension of logic to complex real processes, including those of information production and transfer. This non-propositional, non-truth-functional logic (Logic in Reality; LIR) is grounded in the fundamental dualism (dynamic opposition) inherent in energy and accordingly present in all real phenomena.
I show that Floridi’s theory of Levels of Abstraction (LoAs), Gradients of Abstraction (GoAs) and Levels of Organization (LoOs) can be supported by the concept of ontological Levels of Reality (LoRs) based on LIR, defined in terms of the different but isomorphic laws applicable to them. Applications of LoAs can be made ‘jointly’ with LoRs to describe the informational component present in all phenomena. The Floridi concepts are compared to ontological, other epistemological and systems concepts of levels: Levels of Reality and Complexity in the categorical approach of Poli; the nested hierarchical levels of Salthe, which can be related to Floridi’s Gradients of Abstraction (GoAs); and the concept of Levels of Logical Openness of Minati and Licata, which are applied in a systems context. The utility of this new logical perspective on the generalization of applying LoAs and LoRs conjointly to on-going problems in the philosophy and metaphysics of information is suggested.
Chapter
Nowadays, the role of values in the configuration of technology appears as a crucial topic. (1) What technology is and ought to be depends on values. These values can be considered in a twofold framework: the structural dimension and the dynamic perspective. (2) Axiology of technology takes into account the existence of these values—structural and dynamic—of its configuration, because technology is not a value-free undertaking and it has an “internal” side as well as an “external” part. Thus, axiology of technology studies the role of the “internal” values of technology (those characteristic of technology itself) and the task of “external” values of technology (those around this human undertaking). (3) Subsequently, the ethics of technology deals with ethical values. In this regard, the ethical analysis of technology is also twofold: there is an endogenous perspective (as a free human undertaking related to the creative transformation of reality) and an exogenous viewpoint, insofar as technology is related to other human activities within a social setting.
Chapter
Systems biology has been framed as a newly emerging paradigm in biology conceived to overcome the theoretical and methodological shortcomings of previous approaches such as molecular biology. Framed as an approach, its history has to date rarely been addressed, which means the historical analysis of its theoretical roots and ancestors still remains in the dark. This chapter aims at partly filling this gap by analyzing the imagined presents, pasts, and futures of systems biology as seen through the systems biologist’s eyes. For this to be done, a narrative analysis is applied to written sources and expert interviews conducted with systems biologists in Germany. The analysis reveals considerably different pictures of imagined presents, pasts, and futures between the written and interview data. It becomes apparent that, despite current attempts to establish a common definition of systems biology, considerable differences in what it represents exist. More important, however, is the fact that an ahistoric perspective prevails among many of the systems biologists interviewed. Albeit historical references to so-called predecessors appear now and then, we discuss the danger of a prevailing ahistoric narrative in systems biology. A solution to this problem is a still missing conceptual historiography of systems biology that holds the potential to provide clarification of definitional fuzziness and the relevance of a historically grounded understanding of its conceptual importance in current biology. Only the knowledge about imagined presents, pasts, and futures can help us better understand the present condition of systems biology and contribute to substantiating its conceptual deficits.
Article
Introduction: Adopting a conceptual change perspective yields information not only about the organization of students' conceptions and the mechanisms behind their changes, but also about the most effective teaching interventions for promoting conceptual change. In experimental science, modelling constitutes a basic activity for acquiring and using scientific concepts, and a key method for eliciting conceptual change. The aim of this study was to investigate how modelling activities can elicit conceptual changes concerning the notion of energy. Method: 40 students aged 16-17 years, working in pairs, had to construct symbolic representations of three materially present experiments (Battery-bulb, Falling object and Rising object), drawing on a simple model that introduced them to the properties of energy. In order to track changes in their cognitive processes, we defined a number of specific modelling categories. Results: Results showed that students implemented increasingly complex cognitive processes to solve the three problems. Modelling activities enhance the ability to process the material world and the world of theories and models simultaneously, even when there is no isomorphism between the two. Discussion: The modelling activities we administered to students promoted efficient learning, insofar as the conceptual change mechanism was put in place. Solving the three problems allowed students to draw on their prior knowledge but also to develop new knowledge about the material and theoretical worlds. They acquired the ability to process representations simultaneously from concrete and conceptual worlds and to move freely between them, despite their lack of isomorphism.
Article
During the optical revolution, there were different styles of operating optical instruments, and their impact on the dispute between the two rival theories of light was evident. The differences in the use of optical instruments during the optical revolution originated from two incompatible instrumental traditions. This chapter begins with a brief historical review of these instrumental traditions. In their early years, optical instruments functioned primarily as visual aids to the eye, which was regarded as an ideal optical instrument. But when more and more optical instruments were used as measuring devices, the reliability of the eye came into question. In this context, there emerged two incompatible instrumental traditions, each of which endorsed a body of practices, both articulated and tacit, that defined how optical instruments should be operated, and particularly, how the eye should be used in optical experiments.