Conceptual Revolutions
... Interpretation involves understanding the nature and strength of the relationships between the evidence and competing claims and models, and is, thus, a core aspect of working with evidence (Bogen, 2017; Chapman & Wylie, 2016; Galison, 1997; Haack, 2007). (4) Integration: Reasoning with evidence involves a variety of processes for identifying bodies of relevant evidence, considering how types of research can be fit together to support one theory or model over another, and weighing evidence in various ways (Cole, 1992; Rasmussen, 1993, 2001; Solomon, 2001, 2015; Thagard, 1992). (5) Laypeople's use of evidence: Laypeople, by definition, do not have the disciplinary knowledge to engage as experts in analyzing, evaluating, interpreting, and integrating evidence (Barzilai & Chinn, in press; Chinn & Duncan, in press). ...
... Indeed, on some accounts of evidence, data become evidence only when they are linked to such theoretical constructs (Walton, Reed, & Macagno, 2008). Research on scientific reasoning has highlighted the importance of considering how different theoretical perspectives can lead to different interpretations of evidence (Thomm, Barzilai, & Bromme, 2017; Thagard, 1992), as well as considering the strength and diagnosticity of evidence in supporting or refuting alternative models (Haack, 2007; Rinehart, Duncan, Chinn, Atkins, & DiBenedetti, 2016). Evidence interpretation encompasses both determining how evidence is related to models (i.e., how the evidence is explained by the models) and appraising how strongly the evidence supports the models. ...
... One type of body of evidence is a group of studies that are conceived of as a line of evidence. Scientists often aim to explain these lines of evidence that establish general phenomena, as opposed to trying to explain individual studies (Samarapungavan, in press; Thagard, 1992; Woodward, 1989). For example, the distribution of volcanoes on the surface of the earth provides one line of evidence for the plate tectonics theory. ...
Practices of generating, analyzing, and using evidence play a central role in the Framework for K‐12 Science Education and the NGSS. However, the construct of evidence remains largely underspecified in these documents, providing insufficient guidance on how to engage students with the broad and complex nature of evidentiary reasoning. This creates a risk of perfunctory and simplified implementation of evidence‐based practices that misses the intent of the standards and does little to prepare students for reasoning with the complex, varied, and contentious evidence encountered in popular media or in advanced education. To address these challenges, we propose a theoretical framework, which we call Grasp of Evidence, that complexifies the concept of evidence in ways that facilitate introducing more authentic forms of evidence and more sophisticated ways of engaging with evidence in science classrooms. Our approach focuses on promoting a lay grasp of evidence needed by competent outsiders as they engage with science in their everyday lives. The framework posits five dimensions. The first four dimensions capture what students should understand about how experts work with evidence: evidence analysis, evidence evaluation, evidence interpretation, and evidence integration. The fifth dimension focuses on how laypeople can use evidence reports themselves. Each of these dimensions of practice involves specific epistemic aims, epistemic ideals, and reliable epistemic processes for reasoning with and about evidence. We discuss these dimensions and their contribution to the conceptualization of evidence as well as provide some initial instructional implications and potential directions for future research.
... We built a connectionist model as described in Thagard (2002) in order to compare Coherencer to a locally parallel algorithm that was competitive in the coherence literature (see Fig. 3). The model was built as follows. ...
... Research on generative cognitive faculties traditionally focuses on determining constraints that will exclude combinations that humans do not generate from the total set of options that could be generated, much like Coherencer. We will describe three prominent models that select different constraints to achieve this goal: a top-down conceptual approach to constraints in word-pair concept combinations (the C³ model; Costello & Keane, 2000), a modal approach to constraints through the use of simulations in perceptual symbol systems (Barsalou, 1999), and an amodal approach to constraint satisfaction from the literature on coherence (Thagard, 2002). ...
... Costello and Keane (2000), Barsalou (1999), and Coherencer use a similar approach with a few differences. First, the incremental algorithms described in Thagard (2002) build their initial pool one element at a time, whereas all the other models seed their initial pool with the strongest associations. Second, the space within the incremental algorithm's "working memory" can be of any size: It could literally contain the entire set of possible elements if that was what maximized coherence. ...
An incoherent visualization is one in which aspects of different senses of a word (e.g., the biological “mouse” vs. the computer “mouse”) are present in the same visualization (e.g., a visualization of a biological mouse in the same image with a computer tower). We describe and implement a new model of creating contextual coherence in the visual imagination called Coherencer, based on the SOILIE model of imagination. We show that Coherencer is able to generate scene descriptions that are more coherent than SOILIE's original approach as well as a parallel connectionist algorithm that is considered competitive in the literature on general coherence. We also show that co-occurrence probabilities are a better association representation than holographic vectors and that better models of coherence improve the resulting output independent of the association type that is used. Theoretically, we show that Coherencer is consistent with other models of cognitive generation. In particular, Coherencer is similar to, but more cognitively plausible than, the C³ model of concept combination created by Costello and Keane (2000). We show that Coherencer is also consistent with both the modal schematic indices of perceptual symbol systems theory (Barsalou, 1999) and the amodal contextual constraints of Thagard's (2002) theory of coherence. Finally, we describe how Coherencer is consistent with contemporary research on the hippocampus, and we show evidence that the process of making a visualization coherent is serial.
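As a rough illustration of the serial, incremental strategy attributed to Coherencer above (candidate scene elements are added one at a time and tested against the current pool), the following sketch uses invented scene labels, invented co-occurrence probabilities, and an invented acceptance threshold; it is not the published Coherencer or SOILIE code.

```python
# Illustrative sketch of a serial, incremental coherence search in the spirit
# of Coherencer (not the authors' published code). The scene labels, the toy
# co-occurrence probabilities, and the acceptance threshold are all invented.
COOC = {
    frozenset(("mouse", "cheese")): 0.62,
    frozenset(("mouse", "keyboard")): 0.55,
    frozenset(("cheese", "keyboard")): 0.04,
    frozenset(("mouse", "whiskers")): 0.48,
    frozenset(("whiskers", "cheese")): 0.37,
    frozenset(("whiskers", "keyboard")): 0.02,
}

def cooc(a, b):
    """Pairwise co-occurrence probability of two labels (0 if never seen together)."""
    return COOC.get(frozenset((a, b)), 0.0)

def coherencer_sketch(query, candidates, pool_size=3, threshold=0.3):
    """Add candidates one at a time, keeping only those whose average
    association with the current pool clears the threshold."""
    pool = [query]
    for cand in sorted(candidates, key=lambda c: cooc(c, query), reverse=True):
        if len(pool) > pool_size:
            break
        mean_assoc = sum(cooc(cand, p) for p in pool) / len(pool)
        if mean_assoc >= threshold:
            pool.append(cand)
    return pool

# 'keyboard' is strongly associated with 'mouse' alone, but it is rejected
# once 'cheese' is in the pool, so the description stays contextually coherent.
print(coherencer_sketch("mouse", ["cheese", "keyboard", "whiskers"]))
```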
... Already in 1992 Margaret Boden [26] illustrated the distinction between classical programs able to re-generate historical cases of scientific discovery in physical science (Simon's five BACON systems and GLAUBER, Langley [12]) and systems able to build new discoveries (DENDRAL and AM, cited above). Other authors (for example, Schunn and Klahr [27], who wrote the program ACT-R) stressed the further distinction between computational systems that address the process of abductive hypothesis formation and those that address evaluation: PHINEAS [28]; AbE [29]; ECHO [30,31]; TETRAD [32]; and the already quoted MECHEM. Other programs addressed the abductive nature of experimental procedures (DEED [33]; DIDO [34]), and, finally, we have to remember the ones that addressed both the processes of hypothesis formation and evaluation and of experiment (KEKADA [35]; SDDS [36]; LIVE [37]). ...
... They proposed semantic spaces as a computational approximation of Gärdenfors' conceptual space. Abductive hypotheses generated from semantic spaces do not have a proof-theoretic ... [Footnote 7: A simple neural network has been used to build the computational program ECHO (Explanatory Coherence), regarding that part of abduction that concerns the process of hypothesis evaluation [30,31].] ...
Locked and unlocked strategies are at the center of this article, as ways of shedding new light on the cognitive aspects of deep learning machines. The character and the role of these cognitive strategies, which occur both in humans and in computational machines, are indeed strictly related to the generation of cognitive outputs, which range from weak to strong levels of knowledge creativity. I maintain that these differences lead to important consequences when we analyze computational AI programs, such as AlphaGo, which aim at performing various kinds of abductive hypothetical reasoning. In these cases, the programs are characterized by locked abductive strategies: they deal with weak (even if sometimes amazing) kinds of hypothetical creative reasoning, because they are limited in what I call eco-cognitive openness, which instead qualifies human cognizers who are performing higher kinds of abductive creative reasoning, where cognitive strategies are instead unlocked.
... Normative philosophy of science is represented by P. Thagard [15,16], N. Nersessian [10,11], and P. Giere [4], who see new possibilities for studying the historical development of scientific research programs with the help of models developed in the field of artificial intelligence. In addition, L. Laudan [8] proposed a project of a naturalized philosophy of science, whose success lies in the fact that its normative character can be preserved if one abandons the metamethodological task of providing a reconstruction of the rationality of the choice of scientific theories by the scientific elite of the past. ...
... Naturalized epistemology, a direction initiated by W. Quine [16], sees the only possibility for epistemology to remain on the "stage" in abandoning its normative character and becoming an empirical discipline. The project of naturalized epistemology assumes that epistemology and the philosophy of science should become a branch of descriptive psychology, and perhaps of neurophysiology, whose business is to investigate the process of obtaining the kind of knowledge that is called science. ...
... The differential equations in physical theory that assumed continuous quantities could be approximated by difference equations expressed in computer instructions. The new method replaced crude estimates of criticality by simulations that enable physicists to determine how detonations occur. Even the very primitive early computers could carry out calculations that would have taken humans hundreds of years. ...
... Most philosophical discussions of coherence have only vaguely suggested how it can be objectively assessed. However, coherence can be made much more precisely calculable by considering it as a kind of constraint satisfaction problem of the sort naturally approached using neural network algorithms [37,38,40]. Moreover, coherence from this perspective can be formalized to an extent that enables proof that the problem of coherence is NP-hard, i.e. in a class of problems for which a guaranteed solution is unlikely to be found [55]. ...
Computer models provide formal techniques that are highly relevant to philosophical issues in epistemology, metaphysics, and ethics. Such models can help philosophers to address both descriptive issues about how people do think and normative issues about how people can think better. The use of computer models in ways similar to their scientific applications substantially extends philosophical methodology beyond the techniques of thought experiments and abstract reflection. For formal philosophy, computer models offer a much broader range of representational techniques than are found in traditional logic, probability, and set theory, taking into account the important roles of imagery, analogy, and emotion in human thinking. Computer models make possible investigation of the dynamics of inference, not just abstract formal relations.
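To make the constraint-satisfaction reading of coherence in the excerpt above concrete, here is a minimal sketch with invented elements and constraints: coherence is treated as a partition of elements into accepted and rejected sets, and the brute-force search over all partitions hints at the exponential cost behind the NP-hardness result. It is not Thagard's ECHO implementation.

```python
# Minimal sketch of coherence as discrete constraint satisfaction (toy element
# names and constraints; not Thagard's code). A positive constraint is
# satisfied when its two elements get the same status (both accepted or both
# rejected); a negative constraint is satisfied when their statuses differ.
from itertools import product

elements = ["H1", "H2", "E1", "E2"]                      # hypothetical hypotheses/evidence
positive = [("H1", "E1"), ("H1", "E2"), ("H2", "E1")]    # e.g. explanatory links
negative = [("H1", "H2")]                                # e.g. contradiction

def satisfied(assign):
    """Count satisfied constraints for an accept/reject assignment."""
    pos = sum(assign[a] == assign[b] for a, b in positive)
    neg = sum(assign[a] != assign[b] for a, b in negative)
    return pos + neg

# Brute force over all 2^n partitions: feasible here, exponential in general.
best = max(
    (dict(zip(elements, bits)) for bits in product([True, False], repeat=len(elements))),
    key=satisfied,
)
print(best, "constraints satisfied:", satisfied(best))
```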
... Lastly, students may make use of different "rules" to explain phenomena, which may be grouped into a set of "explanatory relations" (Thagard, 1992). Where this is helpful, it will be applied to the data being analyzed. ...
... Using the analysis proposed by Thagard (1992), we can identify eleven "rules" used by students (see numbers) to describe energy associated with Tasks A and B, which may be grouped into five "explanatory relations" (see Roman numerals): ...
In this article, we examine first-year chemical engineering students' conceptions of the energy changes taking place in dissolution. Students were individually interviewed with three tasks in which three different salts were dissolved in water, and 17 transcripts were analyzed using a phenomenographic methodology. Four descriptive categories of energy in dissolution were discerned: (a) you give energy (n = 1); (b) water gives energy (n = 17); (c) salt gives off energy (n = 13); and (d) reaction gives off energy (n = 7). Four students gave the same explanation for all three tasks, but more students used the same explanation for two of the tasks: four for Tasks A and B, four for Tasks B and C, and eight for Tasks A and C. Moreover, "salt gives off energy" was the most common explanation for Tasks A and B (n = 3), "reaction gives off energy" for Tasks B and C (n = 3), and "water gives energy" for Tasks A and C (n = 8). Four of the students showed variations of conception within tasks. Students described the solution process of all three tasks using a range of concepts, including previously learned chemical concepts. Even where students used the same chemical concepts in each of the tasks, they did not always give the same meaning to the concepts they used. The phenomenographic categories of explanations given by students were used as a basis for developing an approach to teaching energy in solution processes. It is argued that this approach of using phenomenographic categories, described at a collective level, as a basis for discourse for constructing common knowledge should be used in teaching. It is proposed that a future study be conducted to explore the trajectories students take to arrive at common knowledge and to understand how to move learners from their personal conceptions to plausible models in solution chemistry.
... With the realisation that concepts are central to the success of communication, efforts have been made by various authors to shed light on what concepts are and how useful they are in communication (Novak & Gowin 1984, Thagard 1992, Novak 1998, Robinson 1999, Kinchin, Hay & Adams 2000, and so on). An understanding of concept relations or concept maps, for instance, equips language users with better understanding of how terms are interconnected and organised in a given subject field. ...
... This is because of the intertwined relationship that exists between a term and a concept. Thagard (1992: 21) describes concepts as mental structures representing what words stand for. He continues that concepts are normally products of ideas and thoughts that are like images, with some of these ideas being innate, some deriving from external sources, and others being constructed by the thinker. ...
... There are also exhaustive, well documented and rigorous versions of it, like the book published by Oreskes (1999) or the four volumes ... always ambiguous, so closure of controversies is not just a matter of unequivocal evidence but a process of social agreements between experts (Collins and Pinch 1998, 145-146). [Footnote 1: Rupke adds another possible contributory factor to this argument: Eurocentric continentalism in continental drift theory may have played a role in causing the American distaste for Wegener's theory (Rupke 1996, 268). Footnote 2: Similar expressions can be found in other authors: "the arguments for continental motions did not gel until the 1960s, when a drastic expansion of geophysical research, driven by the cold war, produced evidence that reopened and eventually settled the debate" (Oreskes 2013, 27). See also Giere (1988), Thagard (1992), and Laudan and Laudan (1989).] ...
The continental drift controversy has been deeply analysed in terms of rationalist notions, which seem to find there a unique topic to describe the weight of evidence for reaching consensus. In that sense, many authors suggest that Alfred Wegener’s theory of the original supercontinent Pangea and the subsequent continental displacements finally reached a consensus when irrefutable evidence became available. Therefore, rationalist approaches suggest that evidence can be enough by itself to close scientific controversies. In this article I analyse continental drift debates from a different perspective which is based on styles of thought. I’ll argue that the continental drift debate took much longer than is usually recognized, with two styles of thought coexisting for hundreds of years. These were fixism and mobilism, and they were always confronting their own evidence and interpretations and functioning as general frameworks for the acceptability of a specific theory. Therefore, this text aims to bring much broader sociological elements than are usually involved into the analysis of the continental drift theory.
... (2001), also states that some learners show an unadapted response in their learning tasks, in which they are unaware of any conflict. Therefore, several things can be done to help learners recognize conflict and to invite their interest in learning, among them presenting contradictory information or anomalous data (Chinn & Malhotra, 2002; Limon, 2001; Thagard, 1992). According to Limon and Carretero (1997), presenting contradictory information or anomalous data, on the other hand, helps learners to reflect further on their ideas in order to explain the phenomena being studied, and may activate their curiosity about the phenomena being taught. ...
This research aims to develop an active-based-inquiry (ABI) learning model to improve the critical thinking skills of prospective physics teachers. The ABI learning model was developed against three criteria, namely validity, practicality, and effectiveness. Validation of the model involved nine validators through a focus group discussion mechanism. The validation tests showed that the ABI learning model is valid and can be used as a reference for developing the learning tools and materials that accompany the ABI learning model. In the implementation phase of the model, practicality was evaluated using observation-sheet instruments involving two observers. Results showed that the ABI learning model was implemented at the 'very good' level. Furthermore, the effectiveness of the model in improving the critical thinking skills of prospective physics teachers was assessed with a critical thinking skills test covering six indicators, namely interpretation, analysis, evaluation, inference, explanation, and self-regulation. The test results show that critical thinking skills reached the 'critical' criterion, with an average score of 76.4. Generally, it can be concluded that the ABI learning model can improve the critical thinking skills of prospective physics teachers.
... Piaget also stated that several learners show an unadapted response in learning tasks, in which they are unaware of any conflict (Lee et al., 2003; Limon, 2001). Therefore, some things can be done to facilitate learners in recognizing conflict and to stimulate students' interest in learning, like presenting contradictory information or anomalous data (Chinn & Malhotra, 2002; Thagard, 1992). According to Limon & Carretero (1997), presenting contradictory information or anomalous data, on the other hand, helps learners to reflect more about their ideas in order to explain the learned phenomena, and may be able to activate their curiosity about those phenomena. ...
Teaching critical thinking (CT) to prospective teachers has been a concern for a long time, and prospective teacher training is an appropriate period for interventions that promote CT ability. Therefore, it is necessary to develop a model of learning that accommodates aspects of prior knowledge, motivation, and CT. This study aims to develop a Critical-Inquiry-Based-Learning (CIBL) model to promote the CT ability of prospective teachers of physics (PTP). This study is based on Nieveen's theory about the criteria of product quality (validity, practicality, and effectiveness) and the theory of Borg and Gall about development research. The CIBL model embraced three criteria, namely validity, practicality, and effectiveness. The CIBL model was validated by experts through the mechanism of the focus group discussion (FGD) (for the validity aspect), the implementation of the model in class was observed by a number of observers (for the practicality aspect), and the assessment of CT ability was done after the learning process (for the effectiveness aspect) and then analyzed. The findings of the research showed that the CIBL model is feasible because of its validity, practicality, and effectiveness. This means that the CIBL model was able to promote the CT ability of PTP.
... He shares the view of many scientists and philosophers who note that scientific explanations should be broad (Kuhn, 1977; Whewell, 1840). A reasonable psychological prediction is that people should prefer explanations with broad scope as well (Thagard, 1992), and they often do. In the aforementioned studies by Read and Marcus-Newhall (1993), participants learned a few facts about an arbitrary woman, for example, that she has nausea, weight gain, and fatigue. ...
Reasoning concerns the cognitive processes by which people draw conclusions from the salient, meaningful pieces of information that they comprehend or observe. Reasoning processes are challenging to investigate because both their initiation and their final product (the inference) can be nonverbal and unconscious. This chapter summarizes recent developments in the science of reasoning. It briefly reviews the differences between “core” patterns of inference, that is, deduction, induction, and abduction: Deductions are inferences that are true in those cases in which the premises are true. Inductions concern all other sorts of reasoning. And abductions are special types of inductions that yield explanatory hypotheses. The chapter then addresses three fundamental debates that engage contemporary reasoning researchers. The first addresses how to separate rational from irrational deductions. The second concerns the relation between deduction and induction. And the third focuses on how people create explanations. The chapter concludes by addressing ways of making progress to a general, unified account of higher‐level reasoning.
... Critical realist methodologists Danermark and colleagues (2002) and Haig (2005) both identify a stage in explanatory research and theory construction where comparison and assessment of the identified theories and abstractions are undertaken. Haig (2005) argued for Thagard's (1992) formulation of Inference to the Best Explanation, which included the seven principles of symmetry, explanation, analogy, data priority, contradiction, competition, and acceptability, and three criteria: consilience, simplicity, and analogy (Thagard, 1988). Ward (2009) argued that the commonly used Bradford Hill "criteria" are an Inference to Best Explanation within a realist epistemology. ...
We have previously reported on the findings of a critical realist concurrent triangulated mixed method multilevel study that sought to identify and explain complex perinatal contextual social and psychosocial mechanisms that may influence the developmental origins of health and disease. That study used both emergent and construction phases of a realist explanatory theory building method. The purpose of this article is to present the thesis, theoretical framework, propositions, and models explaining neighborhood context, stress, depression, and the developmental origins of health and disease. The analysis draws on an extensive extant literature; intensive (qualitative), extensive (quantitative), and multilevel studies used for phenomena detection, description, and emergent phase theory development; and the abductive and retroductive analysis undertaken for the theory construction phase. Global, economic, social, and cultural mechanisms were identified that explain maternal stress and depression within family and neighborhood contexts. There is a complex intertwining of historical, spatial, cultural, material, and relational elements that contribute to the experiences of loss and nurturing. Emerging is the centrality of social isolation and “expectation lost” as possible triggers of stress and depression not only for mothers but possibly also for others who have their dreams shattered during life’s transitions. The thesis: In the neighborhood spatial context, in keeping with critical realist ontology, global-economic, social, and cultural-level generative powers trigger and condition maternal, psychological, and biological-level stress mechanisms, resulting in the phenomenon of maternal depression and alteration of the infants’ developmental trajectory.
... Among other extra-scientific factors, scholars mention: gender (Harding 1991), geography (Livingstone 2010), ideology (Žižek 1989), memory (Nora 1989), politics (Latour 1993) and language (Pinker 2003). Still later developments have also sought to extend scientific knowledge production to include cognitive and other psychological contingents (Feist and Gorman 2013; Klahr 2002; Proctor and Capaldi 2012; Thagard 1992) to better understand how our own minds can 'extra-scientifically' influence scientific knowledge production. ...
The concepts ‘rural’ and ‘urban’ have long been criticized by geographers for their lack of analytical and explanatory power, yet have remained a vital source for conceptual guidance in human geography. Realizing that the continued use of questionable concepts inadvertently runs the risk of compromising communication, misdirecting resources and downgrading social theory, the current status of ‘rural/urban’ creates a paradoxical epiphenomenon of progress-making in geography. We disentangle this paradox in two dimensions. Firstly, we show how a conflation between meaning and utility is what renders us desensitized to the problem. Secondly, we outline 12 extra-scientific factors likely to actuate the binary’s persistent retention. We finally sketch a sensuous template set out to minimize its undesired impact. We concede that the confusion surrounding ‘rural/urban’ in human geography cannot be understood unless the influence of extra-scientific factors is fully taken into account, revealing the concepts’ vestigiality. This, we argue, is the only way forward if we truly want to embrace the rationale of the scientific approach. The principal contribution of our paper is laying the groundwork for this particularly underresearched dimension of ‘rural/urban’ amidst an exceptionally rich conceptual literature on what ‘rural/urban’ ‘are’ or ‘mean’.
... laws and models). Learning such knowledge is also learning to construct and map possible conceptual connections in that system [1][2][3]. The structure of the knowledge system also affects how concepts are introduced in teaching scientific knowledge and how they are acquired in formal teaching and learning [4][5][6][7][8]. ...
Learning scientific knowledge is largely based on understanding what are its key concepts and how they are related. The relational structure of concepts also affects how concepts are introduced in teaching scientific knowledge. We model here how students organise their knowledge when they represent their understanding of how physics concepts are related. The model is based on assumptions that students use simple basic linking-motifs in introducing new concepts and mostly relate them to concepts that were introduced a few steps earlier, i.e. following a genealogical ordering. The resulting genealogical networks have relatively high local clustering coefficients of nodes but otherwise resemble networks obtained with an identical degree distribution of nodes but with random linking between them (i.e. the configuration-model). However, a few key nodes having a special structural role emerge and these nodes have a higher than average communicability betweenness centralities. These features agree with the empirically found properties of students’ concept networks.
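The following sketch, with invented growth parameters, illustrates the kind of genealogically ordered linking and the two network statistics mentioned in the abstract (average clustering compared against a configuration-model null, and communicability betweenness centrality); it is not the authors' model, and the networkx calls are only one way to compute these quantities.

```python
# Sketch of a genealogically ordered concept network (toy parameters, not the
# authors' published model): each new concept links to a couple of concepts
# introduced shortly before it, and its clustering is compared against a
# degree-preserving random null model (the configuration model).
import random
import networkx as nx

random.seed(0)
G = nx.Graph()
G.add_node(0)
for new in range(1, 40):
    # link the new concept to up to two concepts introduced in the last five steps
    recent = list(range(max(0, new - 5), new))
    for anchor in random.sample(recent, k=min(2, len(recent))):
        G.add_edge(new, anchor)

degrees = [d for _, d in G.degree()]
R = nx.Graph(nx.configuration_model(degrees, seed=0))   # random null model
R.remove_edges_from(nx.selfloop_edges(R))

print("clustering (genealogical):", round(nx.average_clustering(G), 3))
print("clustering (configuration):", round(nx.average_clustering(R), 3))

# Key nodes: highest communicability betweenness centralities.
cbc = nx.communicability_betweenness_centrality(G)
print("top nodes:", sorted(cbc, key=cbc.get, reverse=True)[:3])
```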
There has been little investigation to date of the way metacognition is involved in conceptual change. It has been recognised that analytic metacognition is important to the way older children (c. 8–12 years) acquire more sophisticated scientific and mathematical concepts at school. But there has been barely any examination of the role of metacognition in earlier stages of concept acquisition, at the ages that have been the major focus of the developmental psychology of concepts. The growing evidence that even young children have a capacity for procedural metacognition raises the question of whether and how these abilities are involved in conceptual development. More specifically, are there developmental changes in metacognitive abilities that have a wholescale effect on the way children acquire new concepts and replace existing concepts? We show that there is already evidence of at least one plausible example of such a link and argue that these connections deserve to be investigated systematically.
This paper illustrates how the combination of teacher and computer guidance can strengthen collaborative revision and identifies opportunities for teacher guidance in a computer-supported collaborative learning environment. We took advantage of natural language processing tools embedded in an online, collaborative environment to automatically score student responses using human-designed knowledge integration rubrics. We used the automated explanation scores to assign adaptive guidance to the students and to provide real-time information to the teacher on students' learning. We study how one teacher customizes the automated guidance tools and incorporates them into her in-class monitoring system to guide 98 student pairs in meaningful revision of two science explanations embedded in an online plate tectonics unit. Our study draws on video and audio recordings of teacher-student interactions during instruction as well as on student responses to pretest, embedded, and posttest assessments. The findings reveal five distinct strategies the teacher used to guide student pairs in collaborative revision. The teacher's strategies draw on the automated guidance to personalize guidance of student ideas. The teacher's guidance system supported all pairs to engage in two rounds of revision for the two explanations in the unit. Students made more substantial revisions on the posttest than on the pretest, yet the percentage of students who engaged in revision overall remained small. Results can inform the design of teacher professional development for guiding student pairs in collaborative revision in a computer-supported environment.
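As a purely hypothetical sketch of how automated scores might be mapped to adaptive guidance of the kind described above, the score bands and prompt texts below are invented; they are not the study's knowledge-integration rubrics or its actual guidance messages.

```python
# Hypothetical sketch of score-triggered adaptive guidance; the 1-4 score
# bands and the prompt texts are invented for illustration only.
GUIDANCE = {
    1: "Revisit the evidence pages and add an observation that supports your claim.",
    2: "Explain how your evidence connects to the motion of the plates.",
    3: "Link the two ideas you mention so the mechanism is explicit.",
    4: "Test your explanation against a new case and revise if it falls short.",
}

def assign_guidance(ki_score):
    """Map an automated knowledge-integration score to a revision prompt."""
    banded = min(max(int(ki_score), 1), 4)     # clamp to the illustrative 1-4 scale
    return GUIDANCE[banded]

for pair, score in [("pair-07", 2), ("pair-19", 4)]:
    print(pair, "->", assign_guidance(score))
```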
In this article I am going to reconstruct Stephen Toulmin’s procedural theory of concepts and explanations in order to develop two overlooked ideas from his philosophy of science: methods of representations and inferential techniques. I argue that these notions, when properly articulated, could be useful for shedding some light on how scientific reasoning is related to representational structures, concepts, and explanation within scientific practices. I will explore and illustrate these ideas by studying the development of the notion of instantaneous speed during the passage from Galileo’s geometrical physics to analytical mechanics. At the end, I will argue that methods of representations could be considered as constitutive of scientific inference; and I will show how these notions could connect with other similar ideas from contemporary philosophy of science like those of models and model-based reasoning.
This empirical qualitative research study examines teacher education candidates' development throughout their experience in a community-based teacher education program. Utilizing conceptual change theory, and conceptual frameworks of learning created digitally by teacher education candidates, researchers examined how candidates changed their existing conceptions of working with diverse children, families, and communities different from their own. Results indicate that development of conceptual frameworks provided candidates opportunities to challenge existing conceptual understandings about teaching in diverse contexts, and to go through the accommodation process where they were able to construct alternative conceptions for teaching in diverse classrooms.
The purpose of this study was the development and adaptation of three instruments for the measurement of scientific reasoning, motivation, and interest of students towards learning science. Sixteen students of the Bachelor in Aeronautical Engineering, varying in age and gender, answered the questionnaires. The first tool, a questionnaire to measure student motivation toward science learning (MAAC), obtained an overall Cronbach alpha of 0.771. A second instrument, for measuring scientific reasoning (PRC), obtained a Kuder-Richardson formula 20 (KR-20) reliability estimate of .751. The survey of student interest in issues related to science obtained a Cronbach alpha of .845. The study findings confirm the validity and reliability of all instruments. The implications of using these instruments as supports for measuring conceptual change in the students are discussed in the document.
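For readers unfamiliar with the reliability coefficients reported above, the following sketch computes Cronbach's alpha (and KR-20, which is the same formula applied to dichotomous items) on made-up item-response matrices; the data are illustrative only and unrelated to the study's instruments.

```python
# Worked sketch of the reliability coefficients mentioned above, computed on
# made-up item-response matrices (rows = students, columns = items).
import numpy as np

def cronbach_alpha(scores):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# KR-20 is the same formula applied to dichotomous (0/1) items.
likert = np.random.default_rng(1).integers(1, 6, size=(16, 10))  # toy 5-point items
binary = np.random.default_rng(2).integers(0, 2, size=(16, 12))  # toy right/wrong items

print("Cronbach's alpha (toy Likert items):", round(cronbach_alpha(likert), 3))
print("KR-20 (toy dichotomous items):", round(cronbach_alpha(binary), 3))
```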
This chapter primarily deals with the conceptual prospects for generalizing the aim of abduction from the standard one of explaining surprising or anomalous observations to that of empirical progress or even truth approximation. It turns out that the main abduction task then becomes the instrumentalist task of theory revision aiming at an empirically more successful theory, relative to the available data, but not necessarily compatible with them. The rest, that is, genuine empirical progress as well as observational, referential and theoretical truth approximation, is a matter of evaluation and selection, and possibly new generation tasks for further improvement. The chapter concludes with a survey of possible points of departure, in AI and logic, for computational treatment of the instrumentalist task guided by the ‘comparative evaluation matrix’.
In this chapter I give a naturalistic-cum-formal analysis of the relation between beauty, empirical success, and truth. The analysis is based on the one hand on a hypothetical variant of the so-called ‘mere-exposure effect’ which has been more or less established in experimental psychology regarding exposure-affect relationships in general and aesthetic appreciation in particular. It was initiated by Robert Zajonc (Attitudinal effects of mere exposure. J Pers Soc Psychol Monogr Suppl 9:1–27, 1968). On the other hand it is based on the formal theory of truthlikeness and truth approximation as presented in my From instrumentalism to constructive realism (2000).
Grounded theory methodology is the most influential perspective on how to conduct qualitative research in the behavioural and social sciences.
In this chapter, a broad abductive theory of scientific method is described that has particular relevance for the behavioural sciences. This theory of method assembles a complex of specific strategies and methods that are used in the detection of empirical phenomena and the subsequent construction of explanatory theories. A characterization of the nature of phenomena is given, and the process of their detection is briefly described in terms of a multistage model of data analysis. The construction of explanatory theories is shown to involve their generation through abductive, or explanatory, reasoning, their development through analogical modelling, and their fuller appraisal in terms of judgments of the best of competing explanations. The nature and limits of this theory of method are discussed in the light of relevant developments in scientific methodology.
The eighth chapter undertakes a philosophical examination of four prominent quantitative research methods that are employed in the behavioural sciences. It begins by outlining a scientific realist methodology that can help illuminate the conceptual foundations of behavioural research methods. Typically, these methods contribute to either the detection of empirical phenomena or the construction of explanatory theory. The methods selected for critical examination are exploratory data analysis, Bayesian confirmation theory, meta-analysis, and causal modelling. The chapter concludes with a brief consideration of directions that might be taken in future philosophical work on quantitative methods. Two additional quantitative methods, exploratory factor analysis and tests of statistical significance, are examined in more detail in separate chapters.
This chapter is concerned with the methodological foundations of evolutionary psychology. Evolutionary psychologists have offered adaptation explanations for a wide range of human psychological characteristics. Critics, however, have argued that such endeavours are problematic because the appropriate evidence required to demonstrate adaptation is unlikely to be forthcoming. More specifically, doubts have been raised over both the methodology employed by evolutionary psychologists for studying adaptations and about the possibility of ever developing acceptably rigorous evolutionary explanations of human psychological phenomena. In this chapter, it is argued that by employing a wide range of methods for inferring adaptation and by adopting an inference to the best explanation strategy for evaluating adaptation explanations, these two doubts can be adequately addressed.
This chapter examines the methodological foundations of exploratory factor analysis (EFA) and suggests that it is properly construed as a method for generating explanatory theories. In the first half of the chapter, it is argued that EFA should be understood as an abductive method of theory generation that exploits an important precept of scientific inference known as the principle of the common cause. This characterization of the inferential nature of EFA coheres well with its interpretation as a latent variable method. The second half of the chapter outlines a broad theory of scientific method in which abductive reasoning figures prominently. It then discusses a number of methodological features of EFA in the light of that method. It is concluded that EFA is a useful method of theory generation that can be profitably employed in tandem with confirmatory factor analysis and other methods of theory evaluation.
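A minimal illustration of the common-cause idea behind EFA described above: synthetic indicators generated from a single latent variable are fed to an off-the-shelf factor-analysis routine, which recovers large loadings only for the indicators that share the common cause. The data, loadings, and noise level are invented; this is not an endorsement or reconstruction of the chapter's own analyses.

```python
# Synthetic illustration of exploratory factor analysis recovering a latent
# common cause from correlated indicators (all parameters are invented).
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
latent = rng.normal(size=(500, 1))                    # one unobserved common cause
loadings = np.array([[0.9, 0.8, 0.7, 0.1, 0.05]])     # first three items load on it
observed = latent @ loadings + rng.normal(scale=0.5, size=(500, 5))

fa = FactorAnalysis(n_components=1, random_state=0).fit(observed)
print(np.round(fa.components_, 2))   # estimated loadings: large for items 1-3 only
```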
This chapter discusses the nature of philosophical naturalism and its relation to scientific method. The discussion takes its cue from an interdisciplinary examination of the naturalization of the philosophy of mind by Kievit et al. (Psychol. Inq. 22:67–87, 2011), who employ statistical methods to construct psychometric models of both the identity and supervenience theories of the mind-body relation. For the most part, the focus of the chapter is on methods of inquiry. After a brief discussion of two different attitudes to naturalized philosophy, two well-known views of naturalism in the philosophy of mind are presented and considered in relation to the naturalism of Kievit et al. Thereafter, some limitations of structural equation modelling, which is the authors’ method of choice, are noted, as is the useful but neglected method of inference to the best explanation. Philosophers and psychologists are encouraged to use one another’s methods, to the benefit of both.
This chapter presents a framework for clinical reasoning and case formulation that is largely based on the abductive theory of scientific method presented in chapter three. Clinical reasoning has traditionally been understood in terms of the hypothetico-deductive method. Occasionally, Bayesian methods have been used as a resource. However, it is suggested that clinical psychology requires an organizing framework that goes beyond the strictures of these two methods and characterizes the full range of reasoning processes involved in the description, understanding, and formulation of the difficulties presented by clients. In the abductive theory of method, the processes of phenomena detection and theory construction are articulated and combined. Both of these processes are applied to clinical reasoning and case formulation, and a running case example is provided to illustrate the application.
The cognitive science of science studies the cognitive processes involved in carrying out science: How do scientists reason? How do scientists develop new theories? How do scientists deal with data that are inconsistent with their theories? How do scientists choose between competing theories? Research on these issues has been carried out by investigators in a number of cognitive science disciplines, particularly psychology, philosophy, and artificial intelligence. More detailed accounts of work in this area can be found in two recent conference volumes.
Much of the attention of philosophy of science, history of science, and psychology in the twentieth century has focused on the nature of conceptual change. Conceptual change in science has occupied pride of place in these disciplines, as either the subject of inquiry or the source of ideas about the nature of conceptual change in other domains. There have been numerous conceptual changes in the history of science, some more radical than others. One of the most radical was the chemical revolution. In the seventeenth century, chemists believed that the processes of combustion and calcination involved the absorption or release of a substance called phlogiston. On this theory, when an ore is heated with charcoal, it absorbs phlogiston to produce a metal; when a metal is burned, it releases phlogiston and leaves behind a residue, or calx. The concept of phlogiston derived from a quite complex Aristotelian/medieval structure that included three concepts central to chemical theory: sulphur, the principle of inflammability; mercury, the principle of fluidity; and salt, the principle of inertness. All material substances were believed to contain these three principles in the form of earths. The phlogiston theory held that in combustion, the sulphurous earth (phlogiston) returns to the substance from which it escaped during some earlier burning process in its history, and that in calcination the process is reversed. However, chemists also knew that a calx is heavier than the metal from which it was derived. So, the theory implies that phlogiston has a negative weight, or a positive lightness. This did not present a problem, though, because it was compatible with the Aristotelian elements of fire and air (the others being earth and water), which were not attracted towards the center of the earth. The development of the oxygen theory of combustion and calcination by Lavoisier in the late eighteenth century has been called the chemical revolution because it required replacing the whole conceptual structure with, for example, different concepts of substance and element and new concepts of oxygen and caloric. In the new system, it was no longer possible to believe in the existence of substances with negative weight. According to the oxygen theory, oxygen gas is released in combustion and absorbed in calcination. Thus calx is metal (substance) plus oxygen, rather than metal minus phlogiston. The concept of phlogiston was eliminated from the chemical lexicon. The reconceptualization of chemical phenomena that took place in the chemical revolution made possible the atomic theory of matter, which, as we know, posits quite different constituents of material substances from the principles central to the earlier conceptual structure. Just what constitutes conceptual change, how it relates to theory change, and how it relates to changes in belief continues to be a subject of much debate. Clearly, though, as the preceding example demonstrates, the three are significantly interrelated.
A great deal of research has indicated that teaching is rarely a matter of introducing learners to material that simply replaces previous ignorance, but is more often a matter of presenting ideas that are somewhat at odds with existing understanding. In subjects such as chemistry, learners at school and university come to their studies already holding misconceptions or ‘alternative conceptions’ of subject matter. This has implications for subsequent learning, and so for teaching. This article reviews a number of key issues: (i), the origins of these alternative conceptions; (ii), the nature of these ideas; and, (iii), how they influence learning of the chemistry curriculum. These issues are in turn significant for guidance on (a) how curriculum should be selected and sequenced, and (b) on the pedagogy likely to be most effective in teaching chemistry. A specific concern reported in chemistry education is that one source of alternative conceptions seems to be instruction itself.
The purpose of this study was to describe the effect of multiple knowledge representations of physical and chemical changes on the development of primary pre-service teachers' cognitive structures. The study took place in an introductory general chemistry laboratory course in a four-year teacher education program. Multiple knowledge representations in chemistry refer to the macroscopic (visible), sub-microscopic (invisible), and symbolic (formulas and equations) levels. The study adopted a one-group pretest-posttest design supported by qualitative data. Forty primary pre-service teachers participated in this study. The results revealed that enabling the primary pre-service teachers to learn multiple representations of physical and chemical changes was effective in developing the cognitive structures of both groups of pre-service teachers, those with low-level and those with high-level understanding of the particulate nature of matter, with the latter benefitting the most. This finding was instructive because it emphasizes the difficulty that some primary pre-service teachers had with the particulate and symbolic representations of physical and chemical changes. It also indicates an improvement in primary pre-service teachers' cognitive structures of physical and chemical change through the use of multiple representations.
As emphasized by Larry Laudan in developing the notion of non-refuting anomalies (Laudan 1977; Nola and Sankey 2000), traditional analyses of empirical adequacy have not paid enough attention to the fact that the latter does not only depend on a theory’s empirical consequences being true but also on them corresponding to the most salient phenomena in its domain of application. The purpose of this paper is to elucidate the notion of non-refuting anomaly. To this end, I critically examine Laudan’s account and provide a criterion to determine when a non-refuting anomaly can be ascribed to the applicative domain of a theory. Unless this latter issue is clarified, no proper sense can be made of non-refuting anomalies, and no argument could be opposed to those cases where an arbitrary restriction in a theory’s domain of application dramatically reduces the possibilities for its empirical scrutiny. In arguing for the importance of this notion, I show how several semanticist resources can help to reveal its crucial implications, not only for theory ...
Mental health professionals such as psychiatrists and psychotherapists assess their patients by identifying disorders that explain their symptoms. This assessment requires an inference to the best explanation that compares different disorders with respect to how well they explain the available evidence. Such comparisons are captured by the theory of explanatory coherence that states 7 principles for evaluating competing hypotheses in the light of evidence. The computational model ECHO shows how explanatory coherence can be efficiently computed. We show the applicability of explanatory coherence to mental health assessment by modelling a case of psychiatric interviewing and a case of psychotherapeutic evaluation. We argue that this approach is more plausible than Bayesian inference and hermeneutic interpretation.
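A toy sketch in the spirit of the ECHO-style settling process described in this abstract, applied to an invented two-hypothesis clinical vignette: explanatory links are excitatory, competing hypotheses inhibit one another, and a clamped evidence unit implements data priority. The vignette, link weights, and decay value are assumptions for illustration, not the authors' implementation or parameters.

```python
# Toy connectionist settling in the spirit of ECHO (vignette, weights, and
# decay are invented; this is not the authors' model of the reported cases).
units = ["depression", "hyperthyroidism", "low_mood", "weight_loss", "EVIDENCE"]
EXCIT, INHIB, DATA, DECAY = 0.04, -0.06, 0.05, 0.05

links = {
    ("depression", "low_mood"): EXCIT,          # hypothesis explains symptom
    ("depression", "weight_loss"): EXCIT,
    ("hyperthyroidism", "weight_loss"): EXCIT,
    ("depression", "hyperthyroidism"): INHIB,   # competing hypotheses inhibit each other
    ("EVIDENCE", "low_mood"): DATA,             # data priority: evidence unit feeds symptoms
    ("EVIDENCE", "weight_loss"): DATA,
}

def weight(a, b):
    return links.get((a, b), links.get((b, a), 0.0))

act = {u: 0.01 for u in units}
act["EVIDENCE"] = 1.0                           # evidence unit is clamped at 1
for _ in range(200):                            # iterate until the network settles
    new = {"EVIDENCE": 1.0}
    for u in units:
        if u == "EVIDENCE":
            continue
        net = sum(weight(u, v) * act[v] for v in units if v != u)
        if net > 0:
            a = act[u] * (1 - DECAY) + net * (1.0 - act[u])
        else:
            a = act[u] * (1 - DECAY) + net * (act[u] + 1.0)
        new[u] = max(-1.0, min(1.0, a))
    act = new

print({u: round(a, 2) for u, a in act.items() if u != "EVIDENCE"})
# The hypothesis that explains more of the evidence settles at higher activation.
```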
The English National Curriculum (for 5–16 year olds) for the science taught in English schools has had unintended as well as planned effects. There has been extensive government involvement in the professional work of teachers through inspection regimes, offering direction on the nature of formal assessment, and emphasising the outcomes of high status tests as public markers of educational quality. The chapter considers where these efforts have supported teachers in meeting widely accepted aims of science education, and where they have—often inadvertently—restricted good teaching practice and undermined efforts to teach in accordance with the principles of constructivist educational theory: working against teachers’ flexibility to respond to the needs of students, undermining meaningful enquiry teaching, and restricting effective teaching about socio-scientific issues.
Chinese characters have always been a difficulty in teaching Chinese as a foreign language, especially for learners from non-Chinese cultural circles. Based on the characteristics of Chinese characters and the cognitive principles of "linguistic concept" and "categorization" of Chinese characters, the article aims to design a set of radical "typical attributes", with the radicals as the "prototype" and with "typical members" and "atypical members" forming a hierarchical system, so as to provide a teaching model that allows Chinese learners to master Chinese characters more systematically and more reasonably, and to offer some reference for subsequent Chinese character teaching.
Whether abduction is treated as an argument or as an inference, the mainstream view presupposes a tight connection between abduction and inference to the best explanation (IBE). This paper critically evaluates this link and supports a narrower view on abduction. Our main thesis is that merely the hypothesis-generative aspect, but not the evaluative aspect, is properly abductive in the sense introduced by C. S. Peirce. We show why equating abduction with IBE (or understanding them as inseparable parts) unnecessarily complicates argument evaluation by levelling the status of abduction as a third reasoning mode (besides deduction and induction). We also propose a scheme for abductive argument along with critical questions, and suggest retaining abduction alongside IBE as related but distinct categories.
In this work, I would like to suggest that an adequate legal epistemology must rest, fundamentally, on two great pillars: the idea of coherence and the idea of virtue. My argument will proceed as follows. In the first section, I will maintain that coherentism is a promising theory for analysing the justification of conclusions about facts in law. More specifically, as I will argue in the second section, it is explanationist coherentism that, it seems to me, can offer an adequate paradigm for analysing the problem of the justification of factual statements in law. In the following section, I will describe a problem that afflicts coherentism, despite its initial plausibility. My central argument, as I will develop in the fourth section, is that this problem can be solved by appealing to the idea of virtue. I will conclude by pointing out, in the last section, some lines of future research that open up once we introduce the idea of virtue into legal epistemology.
Macroscopic theories of scientific change are holistic views of what drives the creation and acceptance of scientific knowledge. At the grand scale of scientific communities, such theories offer a conceptual framework for analyzing the development of a scientific discipline through philosophical, sociological, and problem solving perspectives. As mental models, however, these models of scientific processes are subject to pitfalls and biases that may hinder our analytic reasoning. Integrating theoretical and empirical studies has the potential to help us reach a new level of understanding the dynamics of scientific knowledge. Three major theories of scientific change are presented from philosophical, sociological, and problem-solving perspectives to highlight distinct concepts and expectations as well as shared characteristics.
Students of history and philosophy of science courses at my University are either naive robust realists or naive relativists in relation to science and technology. The first group absorbs from culture stereotypical conceptions, such as the value-free character of the scientific method, that science and technology are impervious to history or ideology, and that science and religion are always at odds. The second believes science and technology were selected arbitrarily by ideologues to have privileged world views of reality to the detriment of other interpretations. These deterministic outlooks must be challenged to make students aware of the social importance of their future roles, be they as scientists and engineers or as science and technology policy decision makers. The University as Decision Center (DC) not only reproduces the social by teaching standard solutions to well-defined problems but also provides information regarding conflict resolution and the epistemological, individual, historical, social, and political mechanisms that help create new science and technology. Interdisciplinary research prepares students for roles that require science and technology literacy, but raises methodological issues in the context of the classroom as it increases uncertainty with respect to apparently self-evident beliefs about scientific and technological practices.
The innovation systems approach has swiftly spread out worldwide in the last three decades and stood as an important framework for policy-making in the fields of science, technology, and innovation. At the same time, there have been serious and untreated concerns in the literature about the theoretical soundness of this approach. Our discussion in this paper is based on the belief that a detailed analysis of the epistemological foundations of the approach could shed a judgmental light on the aforementioned concerns. To provide that analysis, we reconstructed and studied the nature and evolution of the innovation systems approach as a case of conceptual revolution against the more established traditions in the economics of innovation. Our analyses show that the rise of the systemic view of innovation includes radical changes in conceptual structures pertaining to the orthodox view of innovation. These developments also go so far as to include significant shifts in the epistemological foundations of the old paradigm. These later shifts provide us with an epistemological base from which we can shelter the innovation systems approach from the hard criticisms rooted in a quasi-positivist point of view. That epistemological base also shows the path for the future development of the approach towards more robustness and precision.
Nicholas Rescher has developed “pragmatic idealism” as a philosophical system. In this original system, the problem of scientific prediction appears as an element that is part of a whole. Following the idea of a system as a backdrop, this chapter offers the philosophico-methodological coordinates for the analysis of scientific prediction in Rescher’s thought. Several steps are followed: (1) There is a description of his academic and intellectual trajectory. (2) His system of pragmatic idealism is analyzed within the contemporary context, with emphasis on its originality. (3) The idealistic aspect of his philosophy is seen in regard to the role of concepts in the articulation of knowledge, and the pragmatic aspect is considered with the problem of scientific progress at stake. (4) The main philosophico-methodological characters of scientific prediction are addressed. This includes paying attention to the semantic, logical, epistemological, methodological, ontological, axiological and ethical features of prediction, which in Rescher are closely related.
The standard by which the reasoning process of geology is assessed holds that it lacks a methodology of its own. On the contrary, geology is described as a derivative science, based on logical techniques as exemplified by physics. I argue that this assessment is insufficient and distorts our understanding both of geology and of the scientific process in general. Far from simply taking over and applying the logical techniques of physics, geological reasoning has developed its own distinct set of logical procedures. I begin with a review of contemporary philosophy of science as it relates to geology. I then discuss the two distinctive characteristics of geological reasoning, namely its nature as (1) an interpretive science and (2) a historical science. I conclude that geological reasoning offers us the best model of the kind of reasoning needed to confront the sorts of problems that will emerge in the 21st century.
The paper explores possible influences that some developments in the branches of AI called automated discovery and machine learning systems might have upon some aspects of the old debate between Francis Bacon’s inductivism and Karl Popper’s falsificationism. Donald Gillies facetiously calls this controversy ‘the duel of two English knights’, and claims, after some analysis of historical cases of discovery, that Baconian induction had been used in science very rarely, or not at all, although he argues that the situation has changed with the advent of machine learning systems. (Some clarification of the terms machine learning and automated discovery is required here. The key idea of machine learning is that, given data with associated outcomes, software can be trained to make those associations in future cases, which typically amounts to inducing some rules from individual cases classified by the experts. Automated discovery (also called machine discovery) deals with uncovering new knowledge that is valuable for human beings, and its key idea is that discovery is like other intellectual tasks and that the general idea of heuristic search in problem spaces applies also to discovery tasks. However, since machine learning systems discover (very low-level) regularities in data, throughout this paper I use the generic term automated discovery for both kinds of systems. I will elaborate on this later on.) Gillies’s line of argument can be generalised: thanks to automated discovery systems, philosophers of science have at their disposal a new tool for empirically testing their philosophical hypotheses. Accordingly, in the paper, I will address the question of which of the two philosophical conceptions of scientific method is better vindicated in view of the successes and failures of systems developed within three major research programmes in the field: machine learning systems in the Turing tradition, the normative theory of scientific discovery formulated by Herbert Simon’s group, and the programme called HHNT, proposed by J. Holland, K. Holyoak, R. Nisbett and P. Thagard.
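As a concrete (and deliberately trivial) illustration of the "machine learning" idea defined in this abstract, inducing a rule from expert-classified cases, the following sketch trains a small decision tree on synthetic data; it is not a reconstruction of BACON, HHNT, or any other system discussed in the paper.

```python
# Toy rule induction from expert-labelled cases (synthetic data; illustrative only).
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical cases: [temperature, pressure] with an expert-assigned class label.
cases = [[10, 1.0], [12, 1.1], [30, 0.9], [35, 1.2], [11, 0.8], [33, 1.0]]
labels = [0, 0, 1, 1, 0, 1]

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(cases, labels)
print(export_text(tree, feature_names=["temperature", "pressure"]))  # the induced rule
print("new case ->", tree.predict([[31, 1.05]]))                     # apply it to a new case
```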
Vine and Matthews [1] suggest that the pattern of local magnetic anomalies on the flanks of a mid-oceanic ridge is strongly lineated parallel to the ridge, and that these magnetic 'stripes' represent strips of material in the upper mantle the directions of permanent magnetization of which are alternately parallel and anti-parallel to the present local geomagnetic field. Vine and Matthews suggest that mantle material cools as it rises convectively under a ridge and then spreads [2] horizontally outward. As the material cools through its Curie point it is magnetized parallel to the contemporary local geomagnetic field. Because this field reverses quasi-periodically [3,4] with a period 2T, T being of the order of 0.5-1.0 million years, stripes of alternate permanent magnetization are produced the width of which is vT, v being the local horizontal velocity with which material at the surface of the mantle spreads away from the centre of the ridge. The stripes are observed [1] to have widths of the order of 20 km. If T is 0.5 million years, v is 4 cm/yr. Convective velocities of this order are also indicated by palæomagnetic data [5].
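A one-line check of the stripe-width arithmetic in the excerpt, using the numbers given there (20 km stripes, T = 0.5 million years); only the unit conversion is added.

```python
# Check of the arithmetic above: stripe width = v * T, so with 20 km stripes
# and T = 0.5 million years the implied spreading velocity is v = width / T.
width_cm = 20.0 * 1e5        # 20 km in centimetres
T_years = 0.5e6              # 0.5 million years
v_cm_per_yr = width_cm / T_years
print(v_cm_per_yr)           # -> 4.0 cm/yr, as stated in the excerpt
```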