Article

ConnectedScience: Learning biology through constructing and testing computational theories - An embodied modeling approach

... The complex systems literature shows that learners can develop a deep understanding of emergent phenomena by focusing on the individual or the agent level, especially by using ABMs (Danish, 2014;Dickes, Sengupta, Farris, & Basu, 2016;Goldstone & Wilensky, 2008;Sengupta & Wilensky, 2009;Wilensky & Reisman, 2006). Complex emergent phenomena can often be counterintuitive (e.g., although cars move in the forward direction, traffic jams propagate backward), and difficult to understand in the absence of appropriate pedagogical support (Chi, Roscoe, Slotta, Roy, & Chase, 2012;Goldstone & Wilensky, 2008). ...
... ABMs provide learners with an intuitive pathway for reasoning about emergent outcomes by enabling learners to think like an agent and, consequently, bootstrap their embodied and intuitive knowledge of individual-level actions and interactions (Dickes, Sengupta, Farris, & Basu, 2016;Sengupta & Wilensky, 2009). By learning to take on the perspectives of different agents in the system, and by reasoning about their relationships, often using multiple forms of representations of these interactions and relationships, learners can, indeed, develop a deep understanding of how the aggregate-level outcome emerges from these interactions (Dickes, Sengupta, Farris, & Basu, 2016;Wilensky & Reisman, 2006). Similarly, we thought that reasoning about the local conditions, motives, and interactions of individual actors in the ViMAP simulation and the one-dot map of race would enable the preservice teachers to develop a deep understanding of ethnocentrism and ethno-racial segregation in an intuitive manner. ...
... However, designing and creating meaningful connections across these two forms of activities requires further empirical work. Such investigations would enable us to develop a deeper understanding of perspectival thinking, which is a hallmark of reasoning using ABMs in general (Dickes, Sengupta, Farris, & Basu, 2016; Wilensky & Reisman, 2006; Farris & Sengupta, 2014), and of historical thinking, in which perspective-taking is a central component (Seixas, 2015). ...
Article
Full-text available
In this article, we argue that when complex sociopolitical issues such as ethnocentrism and racial segregation are represented as complex, emergent systems using agent-based computational models (in short agent-based models or ABMs), discourse about these representations can disrupt social studies teacher candidates' dispositions of teaching social studies without engaging in critical conversations about race and power. Our study extends the literature on agent-based computing to the domain of social studies education, and demonstrates how preservice teachers' participation in agent-based modeling activities can help them adopt a more critical stance toward designing learning activities for their future classrooms.
... Individual-based modelling has the advantage of giving a bottom-up approach to the model, which can validate or test a top-down theoretical approach. In this approach, prey and predators are represented as individuals located in their ecosystem (Wilensky and Reisman, 2006; Wilson, 1998). Each individual is considered with its specific characteristics and can make decisions (feeding, movement, etc.) through a set of rules or algorithms. ...
... Most of the individual-based models reproducing predator-prey interactions share two characteristics that affect the understanding of their results. Firstly, authors tend to add processes or parameters in order to obtain coexisting populations through time, for example adding a food resource for the prey or limiting the number of individuals per patch (Wilensky and Reisman, 2006). Adding various parameters to individual-based models can have a significant impact on their dynamics and make it difficult to understand the influence of each of these parameters on the final results (Myung, 2000). ...
... The results of the simulations showed that this approach of mixing population-based and individual-based processes is able to reproduce the same behaviour with this study's PDIB model as with the standard Lotka-Volterra PB model, with some local fluctuations that do not affect its overall stability. Several papers have discussed how it is possible to obtain the same behaviour in both kinds of models (Bascompte et al., 1997; Wilson, 1996, 1998; Wilensky and Reisman, 2006). Wilson obtains coexistence over time of both prey and predator populations in his individual-based model by limiting the number of agents allowed in each cell of the landscape to one. ...
... Multiple representations provide scaffolding by allowing users to construct, interpret, and switch between multiple perspectives of a domain [1]. Multi-Agent Based Models (MABMs) [8] provide multiple representations through concrete representations of biological entities and abstract, aggregate representations such as graphs that capture global temporal properties [1]. However, learning by linking the multiple representations of a MABM is not an easy task and requires appropriate scaffolding. ...
... This is because learners find it difficult to integrate and coordinate representations, necessitating adequate scaffolding. MABM simulation environments like NetLogo [8] with multiple representations provide effective design scaffolds for teaching ecology concepts, especially to novices. MABMs, rather than describing relationships between properties of populations, require students to primarily focus on individuals and their interactions [8], thereby engaging in "agent-level thinking" that is intuitive for novices. ...
... MABM simulation environments like NetLogo [8] with multiple representations provide effective design scaffolds for teaching ecology concepts, especially to novices. MABMs, rather than describing relationships between properties of populations, require students to primarily focus on individuals and their interactions [8], thereby engaging in "agent-level thinking" that is intuitive for novices. In contrast, studies show that non-MABM based approaches to teach complex biological phenomena have met with limited success. ...
Conference Paper
Full-text available
This paper combines Multi-Agent based simulation with causal modeling and reasoning to help students learn about ecological processes. Eighth grade students who took part in the study showed highly significant pre to post test gains on learning domain content and causal reasoning ability. Moreover, students’ success in reasoning with a causal model of the ecosystem was strongly correlated with higher learning gains. This work provides the foundations for designing scaffolded multi-agent, simulation-based intelligent learning environments with modeling and reasoning tools to help students learn science topics.
... The amount of energy a predator gained from each prey eaten, and the probability that agents reproduce at each time step, were left at their constant default values. Once eaten, producers regrew only after a fixed delay [37]. Producer-consumer systems with different numbers of consumers were studied. ...
... 5b), which complements the work in [54]. In this case, population sizes tend to oscillate at a predictable rate when the parameters are held constant [37]. ...
Article
Lotka-Volterra predator-prey models are used to study community ecology, but their ability to generate ecological pyramids compared to field data has not been investigated in detail. In this paper, agent-based modeling (ABM) was used instead of systems of ordinary differential equations (ODE). It was shown that the two-component producer-consumer system is unstable, whereas the three-component system with consumers of the 1st and 2nd order is stable under prolonged simulation. Time slices as the program progresses can generate both ecological pyramids and cascades. Simulation results are consistent with experiments on separation of the Black Sea plankton from the area of Cape Fiolent (Crimea) into fractions ranging in size from 2 mm to 2 microns. Although biodiversity in individual samples at different points in time as well as abundance vary widely, both predictably decline with rising trophic levels in cases where the number of tests increases over time.
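The two-component Lotka-Volterra ODE system contrasted with the ABM above can be sketched with a simple Euler integration in pure Python. The rate constants and initial populations below are illustrative placeholders, not values taken from the article:

```python
# Toy Euler integration of the two-species Lotka-Volterra ODEs:
#   dx/dt = a*x - b*x*y        (prey / producer)
#   dy/dt = c*b*x*y - d*y      (predator / consumer)
# All parameter values here are illustrative assumptions.
def lotka_volterra(x0, y0, a=1.0, b=0.1, c=0.5, d=0.5, dt=0.001, steps=20000):
    x, y = x0, y0
    trajectory = [(x, y)]
    for _ in range(steps):
        dx = (a * x - b * x * y) * dt
        dy = (c * b * x * y - d * y) * dt
        x, y = x + dx, y + dy
        trajectory.append((x, y))
    return trajectory

traj = lotka_volterra(10.0, 5.0)
```

With these parameters the trajectory orbits the equilibrium (x* = d/(c·b), y* = a/b), producing the cyclic population dynamics that the agent-based versions reproduce only with additional stabilising mechanisms.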
... What does make sense is to say that phenomena within the human world of science have computationally describable aspects. For instance, in the human practice of biological science, different species' populations (phenomena) display interdependencies (aspects) that are computationally describable in models such as the ones introduced in educational practice by Wilensky and colleagues (Wilensky et al., 2014; Wilensky and Reisman, 2006). ...
... The arguments of Dreyfus and Schön do not clearly address the latter, and empirical studies diverge. Some show that computer visualisations can help students understand connections in complex systems which they are otherwise barred from engaging with (Swanson et al., 2019; Wilensky and Reisman, 2006). Other studies indicate the opposite, that students do not develop the required conceptual understanding of the domain in question if they only work with visualisations and do not engage with the underlying mathematical models (Grønbaek et al., 2017). ...
Article
‘Computational thinking’ (CT) is highlighted in research literature, societal debates, and educational policies alike as being of prime significance in the 21st century. It is currently being introduced into K–12 (primary and secondary education) curricula around the world. However, there is no consensus on what exactly CT consists of, which skills it involves, and how it relates to programming. This article pinpoints four competing claims as to what constitutes the defining traits of CT. For each of the four claims, inherent philosophical presuppositions are identified concerning the nature of thinking, problem-solving, and human–computer relationships. It is argued that most of these philosophical presuppositions are very similar to ones that Hubert Dreyfus and Donald Schön addressed forty to fifty years ago. Therefore, claims about the power of CT raise old discussions in a new disguise. Dreyfus and Schön argued that the presuppositions were wrong. Do their arguments hold and thus defeat contemporary claims about CT? Alternatively, has the technological development since the time of their writings proven them wrong? This article argues that it is necessary to heed the arguments of Dreyfus, Schön, and—later—Bent Flyvbjerg to ensure that CT curricula are built in accord with the value-rational nature of human practice, rather than on misconceived rationalizations of problem-solving, computer use, and computational aspects in the world. However, the pervasive integration of information technology in today's world raises new issues concerning human–machine collaborations that sidetrack the arguments of Dreyfus and Schön. A revised view of CT is required which focusses on articulating how humans can design for, partake in, and critically appraise human–machine collaborations.
... According to research, academic success significantly impacts CT (Durak & Saritepeci, 2018; Kalelioglu et al., 2016; Lei et al., 2020). In the literature, the impact of CT on STEM (Günbatar & Bakırcı, 2019; Repenning et al., 2017; Wilensky & Reisman, 2006) and the importance of CT in STEM have been well documented (García-Peñalvo & Mendes, 2018; Henderson et al., 2007; Sengupta et al., 2013; Wilensky & Reisman, 2006). Studies showed CT deepens students' understanding of STEM (Grover & Pea, 2013; Riley & Hunt, 2014; Sırakaya et al., 2020). ...
Article
Full-text available
With the advent of the rapid technological advancement of the Fourth Industrial Revolution (4IR), computational thinking is recognised as an essential skill in the 21st century across all disciplines, especially in STEM, as it trains students to have the cognitive flexibility to deal with complex problem-solving. Computational thinking (CT) is naturally embedded in STEM practices in the reflection of creativity, algorithmic thinking, critical thinking, problem solving and cooperation skills. This study aimed to measure the level of computational thinking in science matriculation students and examine the effect of gender and academic achievement in STEM on CT. A convenient sampling strategy was used to identify one matriculation college in the northern region of Malaysia to participate in the study. The Computational Thinking Scale (CTS) instrument was employed with 153 science students. Descriptive analysis was used to evaluate the level of CT. One-way multivariate analysis of variance (MANOVA) was performed to analyse the main effect of academic achievement in STEM on CT, followed by univariate analysis of variance (ANOVA) to determine the effect on each of the dimensions of CT. The results indicate that students have a medium-high level of CT, with an overall mean of 3.51. In addition, the findings showed a statistically significant effect of academic achievement in STEM on CT. The mean scores by academic achievement revealed that good students scored the highest, followed by average and weak students, in all dimensions of CT except cooperation. This study provides insight into the impact of STEM learning outcomes on the development of CT to inform instructional design.
... According to research, academic success significantly impacts CT (Durak & Saritepeci, 2018; Kalelioglu et al., 2016; Lei et al., 2020). In the literature, the impact of CT on STEM (Günbatar & Bakırcı, 2019; Repenning et al., 2017; Wilensky & Reisman, 2006) and the importance of CT in STEM have been well documented (García-Peñalvo & Mendes, 2018; Henderson et al., 2007; Sengupta et al., 2013; Wilensky & Reisman, 2006). Studies showed CT deepens students' understanding of STEM (Grover & Pea, 2013; Riley & Hunt, 2014; Sırakaya et al., 2020). ...
Article
Full-text available
With the advent of the rapid technological advancement of the Fourth Industrial Revolution (4IR), computational thinking is recognised as an essential skill in the 21st century across all disciplines, especially in STEM, as it trains students to have the cognitive flexibility to deal with complex problem-solving. Computational thinking (CT) is naturally embedded in STEM practices in the reflection of creativity, algorithmic thinking, critical thinking, problem solving and cooperation skills. This study aimed to measure the level of computational thinking in science matriculation students and examine the effect of academic achievement in STEM on CT. A convenient sampling strategy was used to identify one matriculation college in the northern region of Malaysia to participate in the study. The Computational Thinking Scale (CTS) instrument was employed with 153 science students. Descriptive analysis was used to evaluate the level of CT. One-way multivariate analysis of variance (MANOVA) was performed to analyse the main effect of academic achievement in STEM on CT, followed by univariate analysis of variance (ANOVA) to determine the effect on each of the dimensions of CT. The results indicate that students have a medium-high level of CT, with an overall mean of 3.51. In addition, the findings showed a statistically significant effect of academic achievement in STEM on CT. The mean scores by academic achievement revealed that good students scored the highest, followed by average and weak students, in all dimensions of CT except cooperation. This study provides insight into the impact of STEM learning outcomes on the development of CT to inform instructional design.
... Interactions can occur in multiple ways: agents can interact among themselves, with other agents or with the environment. For example, in a predator-prey model (Reisman, 2006;Wilensky & Reisman, 1998), wolves and sheep (agents) are spatially located (environment). If wolves see a sheep nearby, they move toward it to hunt it (interaction). ...
... The first step is to identify a clear purpose, the questions the model is intended to answer, and the hypotheses. For example, in the well-known predator-prey model (Reisman, 2006;Wilensky & Reisman, 1998), the objective is to explore the populational stability of predator-prey ecosystems. The second step is to identify components, interactions, and relevant data sources that allow us to codify the model; i.e., it is necessary to define what the agents are, and how and with whom they interact. ...
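The two design steps quoted above (fix the model's purpose, then define the agents and their interactions) can be made concrete with a minimal sketch. The class names, fields, and the one-cell movement rule are illustrative assumptions, not the cited model's actual code:

```python
from dataclasses import dataclass

# Step 1 (purpose): explore population stability in a predator-prey ecosystem.
# Step 2 (components): define the agents and how/with whom they interact.

@dataclass
class Sheep:
    x: int
    y: int

@dataclass
class Wolf:
    x: int
    y: int
    energy: int

    def sees(self, sheep: Sheep, radius: int = 1) -> bool:
        """A wolf 'sees' a sheep within a small neighbourhood of grid cells."""
        return abs(self.x - sheep.x) <= radius and abs(self.y - sheep.y) <= radius

    def move_toward(self, sheep: Sheep) -> None:
        """Interaction rule: step one cell toward a nearby sheep to hunt it."""
        self.x += (sheep.x > self.x) - (sheep.x < self.x)
        self.y += (sheep.y > self.y) - (sheep.y < self.y)
```

Keeping the agent types and their interaction rules this explicit is what lets the bottom-up dynamics be compared, later, against a top-down theoretical model.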
Chapter
One of the most nefarious consequences of violent conflicts is forced displacement. Refugee crises have impacts on both the refugees’ place of origin (e.g., loss of human capital) and the places where they resettle (e.g., demands on health systems). Humanitarian information technologies (IT) can be used to collect, process and analyze information that may contribute to improving the livelihoods of refugees. This chapter summarizes the role of humanitarian IT in assisting refugees or organizations that provide services to them in the four steps of the refugee pathway: displacement (e.g., collecting information about the current situation), journey (e.g., providing information about the closest services on a map), temporary settlement (e.g., monitoring health programs of refugee camps), and permanent settlement (e.g., processing refugee resettlement in a new country).
... Interactions can occur in multiple ways: agents can interact among themselves, with other agents or with the environment. For example, in a predator-prey model (Reisman, 2006;Wilensky & Reisman, 1998), wolves and sheep (agents) are spatially located (environment). If wolves see a sheep nearby, they move toward it to hunt it (interaction). ...
... The first step is to identify a clear purpose, the questions the model is intended to answer, and the hypotheses. For example, in the well-known predator-prey model (Reisman, 2006;Wilensky & Reisman, 1998), the objective is to explore the populational stability of predator-prey ecosystems. The second step is to identify components, interactions, and relevant data sources that allow us to codify the model; i.e., it is necessary to define what the agents are, and how and with whom they interact. ...
Chapter
This chapter explores the use of agent-based modeling (ABM) as a methodology to analyze decision-making in humanitarian operations. We show how the individual decision-making of various stakeholders in humanitarian settings (e.g. non-governmental organizations, military, governmental organizations, United Nations) affects the overall progress of relief work. ABM takes a bottom-up approach to model individual agents with specific characteristics and show how their individual actions and interactions define the overall behavior of the system. This chapter provides researchers with an interesting tool for the analysis of humanitarian operations. It also acts as a guide for practitioners to start thinking about the evaluation of policies that could improve decision-making processes leading to a timely distribution of aid.
... (4) Wolf-Sheep Predation (Wilensky and Reisman, 1999) This model is used to explore the stability of predator-prey ecosystems. It has two main variations. ...
... In each step wolves must eat sheep in order to replenish their energy, or they die from running out of energy. To maintain the population, each wolf/sheep has a fixed probability of reproducing at each step (Wilensky and Reisman, 1999). This variation produces a population that is ultimately unstable. ...
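The energy-and-reproduction rules described in that excerpt can be sketched in a few lines of Python. The numeric values (energy gain, move cost, reproduction probability) are illustrative placeholders, not the model's published defaults:

```python
import random

ENERGY_FROM_SHEEP = 20   # illustrative placeholder values,
MOVE_COST = 1            # not the published model defaults
REPRODUCE_P = 0.05       # fixed per-step reproduction probability

def wolf_step(wolf, sheep_here, wolves):
    """One tick for a single wolf: pay the move cost, eat a co-located
    sheep if there is one, die on running out of energy, and otherwise
    reproduce with a fixed probability."""
    wolf["energy"] -= MOVE_COST
    if sheep_here:                        # eat one sheep, gaining energy
        sheep_here.pop()
        wolf["energy"] += ENERGY_FROM_SHEEP
    if wolf["energy"] <= 0:               # starvation
        wolves.remove(wolf)
        return
    if random.random() < REPRODUCE_P:     # fixed reproduction chance
        wolves.append({"energy": wolf["energy"] // 2})
        wolf["energy"] //= 2
```

As the excerpt notes, running rules like these without further stabilising mechanisms tends to produce populations that are ultimately unstable.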
Book
Full-text available
Description: As the first comprehensive title on network biology, this book covers a wide range of subjects, including the scientific fundamentals (graphs, networks, etc.) of network biology, construction and analysis of biological networks, methods for identifying crucial nodes in biological networks, link prediction, flow analysis, network dynamics, evolution, simulation and control, ecological networks, social networks, molecular and cellular networks, network pharmacology and network toxicology, big data analytics, and more. Across 12 parts and 26 chapters, with Matlab codes provided for most models and algorithms, this self-contained title provides an in-depth and complete insight into network biology. It is a valuable read for high-level undergraduates and postgraduates in the areas of biology, ecology, environmental sciences, medical science, computational science, applied mathematics, and social science. Sample chapter: 1. Fundamentals of Graph Theory. Contents: Mathematical Fundamentals (Fundamentals of Graph Theory; Graph Algorithms; Fundamentals of Network Theory; Other Fundamentals); Crucial Nodes/Subnetworks/Modules, Network Types, and Structural Comparison (Identification of Crucial Nodes and Subnetworks/Modules; Detection of Network Types; Comparison of Network Structure); Network Dynamics, Evolution, Simulation and Control (Network Dynamics; Network Robustness and Sensitivity Analysis; Network Control; Network Evolution; Cellular Automata; Self-Organization; Agent-based Modeling); Flow Analysis (Flow/Flux Analysis); Link and Node Prediction (Link Prediction: Sampling-based, Structure- and Perturbation-based, and Node-Similarity-based Methods; Node Prediction); Network Construction (Construction of Biological Networks); Pharmacological and Toxicological Networks (Network Pharmacology and Toxicology); Ecological Networks (Food Webs); Microscopic Networks (Molecular and Cellular Networks); Social Networks (Social Network Analysis); Software (Software for Network Analysis); Big Data Analytics (Big Data Analytics for Network Biology). Readership: Advanced undergraduates, graduate students and researchers in biology, ecology, pharmacology, applied mathematics, computational science, etc.
... The original Tumor model was contributed by Prof. Gershom Zajicek. (4) Wolf-Sheep predation (Wilensky and Reisman, 1999) This model is used to explore the stability of predator-prey ecosystems (Fig. 6). It has two main variations. ...
... In each step, wolves must eat sheep in order to replenish their energy, or they die from running out of energy. To maintain the population, each wolf/sheep has a fixed probability of reproducing at each step (Wilensky and Reisman, 1999). This variation produces a population that is ultimately unstable. ...
Book
Full-text available
This invaluable book is the first of its kind on "selforganizology", the science of self-organization. It covers a wide range of topics, such as the theory, principle and methodology of selforganizology, agent-based modelling, intelligence basis, ant colony optimization, fish/particle swarm optimization, cellular automata, spatial diffusion models, evolutionary algorithms, self-adaptation and control systems, self-organizing neural networks, catastrophe theory and methods, and self-organization of biological communities, etc. Readers will have an in-depth and comprehensive understanding of selforganizology, with detailed background information provided for those who wish to delve deeper into the subject and explore research literature. This book is a valuable reference for research scientists, university teachers, graduate students and high-level undergraduates in the areas of computational science, artificial intelligence, applied mathematics, engineering science, social science and life sciences. https://doi.org/10.1142/9685
... Using the R-package "strucchange", we identified the breakpoints that signify regime shifts in the system components and their long-term co-evolution. Breakpoints are the last observations in one segment of a classical regression model, after which the regression coefficients shift from one stable regression relationship to another (Wilensky and Reisman, 1999). In the strucchange approach, specifying the breakpoints in advance is not a prerequisite. ...
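The breakpoint idea in that excerpt (the last observation of one segment before the coefficients shift) can be illustrated with a toy search: try every split, fit a mean to each candidate segment, and keep the split that minimises the total squared error. This is a simplified sketch, not the Bai-Perron procedure that strucchange actually implements:

```python
def one_breakpoint(series):
    """Return the index of the last observation in the first segment,
    choosing the split that minimises within-segment squared error."""
    def sse(seg):
        if not seg:
            return 0.0
        mean = sum(seg) / len(seg)
        return sum((v - mean) ** 2 for v in seg)

    best_split, best_cost = None, float("inf")
    for k in range(1, len(series)):          # k = size of the first segment
        cost = sse(series[:k]) + sse(series[k:])
        if cost < best_cost:
            best_split, best_cost = k - 1, cost
    return best_split

# A series with an obvious regime shift after index 4:
data = [1, 1, 1, 1, 1, 9, 9, 9, 9, 9]
```

Here each segment is modelled by its mean (an intercept-only regression); the full method generalises this to arbitrary regression coefficients and multiple breakpoints.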
... Evolving cooperation and sustainability for common pool resources has been studied in science (e.g., Aktipis et al., 2011; Ghorbani & Bravo, 2016; Waring et al., 2017) and in education (e.g., Dickes et al., 2016; Wilensky & Reisman, 2006). We have developed a range of models of social-ecological systems to help students understand the mechanisms that influence the evolution of cooperation around CPR use. ...
Chapter
Full-text available
Since knowledge about evolution—and especially human evolution—is insufficient, we aimed to design three student centered online activities. These activities deal with human evolution and are intended to expose high school biology students and pre-service science teachers to issues concerning human evolution in order to enhance their knowledge of evolution and human evolution whilst also potentially enhancing their acceptance of evolution. The activities deal with lactose tolerance, celiac disease and starch consumption affecting diabetes. Additionally, we describe the principles that guided the design of these three activities: issues connecting to students’ lives; noncontentious topics regarding human evolution; human evolution examples that occurred in the not-too-distant past; unambiguous genetic frame stories including simple genetic mutations that affect known traits; and examples that expose students to basic bioinformatics tools for facing authentic scientific issues dealing with genetic evidence of evolution. Furthermore, we present the results of pre-service science teachers’ experiences with one of the activities, which demonstrate that a significant proportion of these teachers used more evolution key concepts after experiencing the activity. Notably, a significant proportion of these teachers showed an increase in evolution acceptance. In-service teachers who experienced one of the activities recommended the introduction of genetic evidence of human evolution via the activity and did not predict opposition among their students. Thus, we recommend the use of these activities among high school biology students since dealing with a relevant topic that includes clear and straightforward evidence of evolution may lead to better knowledge, a greater acceptance of evolution and human evolution, and the improved negotiation of evolution related socioscientific issues (SSIs).
... Programming with krABMaga: the Wolf, Sheep, and Grass Model. This section describes the process of designing and implementing an ABM with krABMaga, using the Wolf, Sheep, and Grass (WSG) model as a use case. This model is a typical example of the effective use of ABMs and has been widely studied (Wilensky & Reisman, 1998). ...
... We also used agent-based modeling [44][45][46][47] to construct source codes for our simulations. Modeling flexibility, inherent dynamics, the ability to model individual behavior, spatial considerations, and the natural way complexity and noise enter the system are some advantages of mimicking biological processes with agent-based modeling. ...
Article
Full-text available
Previous studies have revealed the extraordinarily large catalytic efficiency of some enzymes. High catalytic proficiency is an essential accomplishment of biological evolution. Natural selection led to the increased turnover number, kcat, and enzyme efficiency, kcat/KM, of uni–uni enzymes, which convert a single substrate into a single product. We added or multiplied random noise with chosen rate constants to explore the correlation between dissipation and catalytic efficiency for ten enzymes: beta-galactosidase, glucose isomerase, β-lactamases from three bacterial strains, ketosteroid isomerase, triosephosphate isomerase, and carbonic anhydrase I, II, and T200H. Our results highlight the role of biological evolution in accelerating thermodynamic evolution. The catalytic performance of these enzymes is proportional to overall entropy production—the main parameter from irreversible thermodynamics. That parameter is also proportional to the evolutionary distance of β-lactamases PC1, RTEM, and Lac-1 when natural or artificial evolution produces the optimal or maximal possible catalytic efficiency. De novo enzyme design and attempts to speed up the rate-limiting catalytic steps may profit from the described connection between kinetics and thermodynamics.
... Evolving cooperation and sustainability for common pool resources has been studied in science (e.g., Aktipis et al., 2011; Ghorbani & Bravo, 2016; Waring et al., 2017) and in education (e.g., Dickes et al., 2016; Wilensky & Reisman, 2006). We have developed a range of models of social-ecological systems to help students understand the mechanisms that influence the evolution of cooperation around CPR use. ...
Chapter
Full-text available
Addressing the complex and controversial problems we face today requires education to empower citizens with competencies in sustainability that allow them to contribute to more just and sustainable societies. Many sustainability problems are strongly linked to evolutionary processes. When complex problems can be informed by science, these are known as socioscientific issues (SSI). Educational approaches that explore SSI have been shown to contribute to the development of functional scientific literacy and character development. Together, this suggests that evolution education through the SSI approach may contribute to the development of key competencies in sustainability. To test this hypothesis and understand how evolution education has been explored through SSI approaches, we performed a systematic literature review to identify the key competencies in sustainability developed in papers addressing evolution through SSI. Our results indicate that a few studies have addressed evolution education through SSI and support the potential of this approach since all key competencies in sustainability were found in these studies; however, some of these competencies (e.g., strategic and anticipatory competencies) were not frequently observed. Our results also support the interest in this approach to evolution education since all evolution education dimensions were found. However, the analysed studies show little diversity in terms of the explored SSI, with the majority being related to biotechnology. The implications of these findings and important highlights for educational practices and research are discussed.
... This place simulates a virtual platform or a physical place in which agents can meet and trade, but only the ones that are there at the same time are allowed to trade with each other. Moreover, it allows more interaction between agents than, for instance, the random wandering through the simulation world seen in the classic wolf-sheep predation model (Wilensky and Reisman, 1998), although it maintains random interactions. When the agents are selected to buy (randomly from among those in the market-house) they will buy all the water they can from the other agents until their WTP is lower than the sellers' WTA or they have contacted all the sellers they can. ...
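The market-house rule quoted above (buyers purchase until willingness-to-pay drops below every seller's willingness-to-accept) can be sketched in a few lines. This is a minimal illustration, not the cited model's implementation; the agent fields (`wtp`, `wta`, `water`) and the split-the-surplus pricing rule are assumptions.

```python
import random

def trade_round(buyers, sellers, rng=random.Random(1)):
    """One market-house round: randomly ordered buyers purchase water,
    unit by unit, from randomly ordered sellers until their
    willingness-to-pay (WTP) is below the seller's willingness-to-accept
    (WTA) or every seller has been contacted. Returns trade prices."""
    trades = []
    for buyer in rng.sample(buyers, len(buyers)):
        for seller in rng.sample(sellers, len(sellers)):
            # keep buying while the deal is still mutually acceptable
            while buyer["wtp"] >= seller["wta"] and seller["water"] > 0:
                seller["water"] -= 1
                buyer["water"] += 1
                trades.append((buyer["wtp"] + seller["wta"]) / 2)  # split surplus
    return trades
```

A buyer with WTP 5 facing sellers with WTA 3 and WTA 6 would, under this sketch, buy out the first seller at a price of 4 per unit and refuse the second.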
... in science (e.g., Aktipis et al., 2011; Ghorbani & Bravo, 2016; Waring et al., 2017) and education (e.g., Dickes et al., 2016; Wilensky & Reisman, 2006). We have developed a range of models of social-ecological systems to help students understand the mechanisms that influence the evolution of cooperation around CPR use. ...
Book
Full-text available
EuroScitizen and EvoKE offer you the ebook "Learning evolution through socioscientific issues". Produced by professionals from 15 different countries, this book includes 6 theoretical chapters on the teaching and learning of biological evolution and on the socioscientific issues pedagogical approach, and 6 chapters with practical activities on a variety of topics and grade levels, which you can use to inspire your own research.
... Similar aspirations to encode and articulate problems so that we can attempt to solve them with calculations remain strong today in computer science and its engagements with disciplines such as biology, sociology or urban planning. For example, in the field of animal population dynamics, the relationship between the size of a wolf pack, the well-being and reproduction patterns of a herd of sheep and the growth pattern of the grass field can be encoded in a computational model that explores the stability of this ecosystem depending on changing parameters (Wilensky & Reisman 1998). Another well-known example is the Segregation Model, which establishes a computable relationship between race, life-style preferences and the resulting habitation patterns in a neighbourhood (Schelling 1971). ...
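The Segregation Model mentioned above can be illustrated with a very small sketch of Schelling's (1971) rule: an agent relocates when too few of its neighbours share its group. The 30% threshold, the toroidal neighbourhood, and the move-to-random-empty-cell rule are illustrative choices, not Schelling's exact setup.

```python
import random

def schelling_step(grid, threshold=0.3, rng=random.Random(0)):
    """One sweep of a minimal Schelling segregation model on an n x n
    list of lists. Cells hold None (empty) or a group label; an agent
    whose fraction of like-typed neighbours (8-cell torus neighbourhood)
    falls below `threshold` moves to a random empty cell.
    Returns the number of unhappy agents found this sweep."""
    n = len(grid)
    def unhappy(r, c):
        me = grid[r][c]
        neigh = [grid[(r + dr) % n][(c + dc) % n]
                 for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                 if (dr, dc) != (0, 0)]
        occupied = [x for x in neigh if x is not None]
        if not occupied:
            return False
        return sum(x == me for x in occupied) / len(occupied) < threshold
    movers = [(r, c) for r in range(n) for c in range(n)
              if grid[r][c] is not None and unhappy(r, c)]
    empties = [(r, c) for r in range(n) for c in range(n) if grid[r][c] is None]
    for r, c in movers:
        if not empties:
            break
        er, ec = empties.pop(rng.randrange(len(empties)))
        grid[er][ec], grid[r][c] = grid[r][c], None
        empties.append((r, c))
    return len(movers)
```

Iterating this step from a random mixed grid is what produces the familiar emergent clustering even under mild individual preferences.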
Conference Paper
Full-text available
We engage in a conversation with critical ecofeminism, which proposed to transform the colonialism-racism-capitalism-patriarchalism induced environmental crisis by non-essentialist countering of oppressions and hyper-separations produced by human/nature dualism. We modulate the critical ecofeminist approach by countering a similar dualism, namely that of nature/technology. Furthermore, our theoretical balance-act has a praxis-oriented side: we believe that computation can be included in ecofeminist action. By providing alternative forms of engagement to instrumentalization, we trace pathways to different futures, countering the binary narratives of technology but also its moralizing of socio-cultural mediation. We take an intersectional approach to outcomes of computational modelling (simulations, visualisations, forecasts) and discuss the ecofeminist method of synthesis as a way to include different perspectives into computational processes. We work with two ‘modulated models’ that pay attention to assumptions, observations and thinking about urban commoning initiatives, and amateur knowledge of radio telecommunications. We aspire to provoke discussions about different modes of inclusion in communities and archives that are centred on shared, environment-friendly, solidarity oriented life-style and mutual care. Our approach engages with feminist arguments and inquiries into ways patriarchalism is embedded in our relationship to technoscience and engineering. We explore modes of resistance by proposing skilled and alternative uses of these techniques.
... While constructionism can be used as a general pedagogy, e.g., to guide the design of programming environments for children and youth (Maloney et al., 2010), there is rich empirical evidence demonstrating how building simulation models can help people learn about complicated real-world systems (e.g. Barab et al., 2000; Blikstein and Wilensky, 2009; Jonassen et al., 2005; Klopfer, 2003; Klopfer et al., 2005; Louca and Zacharia, 2012; Smetana and Bell, 2012; Stieff and Wilensky, 2003; Wilensky and Reisman, 1999; Wilensky and Resnick, 1999), although there can be challenges. Learners often need scaffolding to engage in the modeling process (Sins et al., 2005). ...
Article
Complex systems simulations can support collaborative water planning by allowing stakeholders to jointly see hidden effects of land- and water-use decisions on groundwater flow. We adopted a participatory modeling progression where stakeholders learned to modify and use increasingly sophisticated models to assess policy impacts on groundwater levels. Stakeholders' shared understanding of the problem and the novelty, concreteness, and richness of proposed solutions evolved alongside the models’ degree of realism, but up to a certain point. More realistic models became a distraction and stymied efforts to plan for water shortages. The reflective learning required to plan for complex environmental problems is best supported by models that strike a balance between representational fidelity and end-user intelligibility. Complicated models and high-resolution data may overwhelm model users, preventing them from acting on the useful planning insights they derived from the exploratory modeling, particularly within social contexts that exhibit strong power dynamics and favor prediction.
... To simulate characteristic properties of predator-prey population dynamics at the level of individuals, the Lotka-Volterra equations were enhanced with other theories to increase stability, and Lotka-Volterra-like behaviour was implemented in StarLogoT, simulating the population dynamics of wolves and sheep on grass (Wilensky, 1997; Wilensky & Reisman, 1998). Blaauw et al. (2010) simulated fossil proxy time series using the Random Walk algorithm, and found that such simple procedures can generate results resembling the patterns usually identified in palaeoecological data, such as abrupt events, long-term trends, quasi-cyclic behaviour, extinction and immigration. ...
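The random-walk procedure of the kind Blaauw et al. (2010) used to mimic proxy records can be sketched in a few lines; the function name and the Gaussian-increment choice are illustrative assumptions, not the authors' exact algorithm.

```python
import random

def random_walk_series(steps, sigma=1.0, rng=random.Random(3)):
    """Random-walk time series: the cumulative sum of Gaussian
    increments with standard deviation `sigma`. Despite containing no
    mechanism at all, such series can show apparent trends, abrupt
    events and quasi-cycles resembling palaeoecological records."""
    x, series = 0.0, []
    for _ in range(steps):
        x += rng.gauss(0.0, sigma)
        series.append(x)
    return series
```

Plotting a few independent runs side by side is a quick way to see how easily pure noise imitates structured-looking proxy data.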
Thesis
Environmental problems caused by lake eutrophication have become more widespread at a global level, threatening the safety of water, food and other daily needs of people living in the vicinity of lakes. Although lake types, physical conditions and causes of degradation vary from region to region, all threatened lake ecosystems face the same problem of identifying the mechanisms that underlie the deterioration of water quality or cause algal blooms in lakes, and discovering pathways to recovery is becoming an increasingly urgent matter. Theoretical and mathematical models based on Alternative Stable States (ASS) can be used to model and explain the deterioration and restoration of shallow lakes, while multiple patterns of ecosystem state response to external drivers have been validated by a large number of observations in European and North American lakes. In eastern China, eutrophic lakes account for 86.7% and heavily eutrophic lakes for 12.2% of more than 100 lakes in the Middle and Lower Reaches of the Yangtze River plain. Large lakes, such as Chaohu and Taihu, have experienced large cyanobacterial outbreaks in the early twenty-first century, and these disasters will probably be replicated in other lakes under similar conditions. I therefore need to understand the mechanism underlying such catastrophic shifts in lake ecosystems. Natural and anthropogenic influences on the ecological trajectories of lake ecosystems, response mechanisms, and how to avoid subsequent catastrophes are the main questions addressed by this thesis. To discover the external conditions that predominantly affect lake ecosystems in the lower Yangtse River, I used palaeolimnological tools and selected a typical lake in the middle and lower reaches of the Yangtze River, Lake Taibai, for a case study. 
The study used information relating to the species composition of subfossil diatom genera as indicators of the reconstructed ecosystem state as well as lake area, depth, chlorophyll content, transparency, ion concentration and nitrogen and phosphorus content as environmental factors to analyse the correlation between environmental changes. In terms of anthropogenic impacts, historical data on lake hydrodynamics, fish farming and nutrient loading in the basin over past decades were recovered using historical records, literature research and proxy reconstruction to calculate the magnitude of the correlation between ecosystem state driven by human and natural factors and to analyse possible responses by examining feedback mechanisms. Since it is difficult to use palaeoecological data alone to reveal the dynamic ecosystem changes and emergent mechanisms under the influence of various external factors, I developed an agent-based model (ABM) to simulate the influence of environmental and human activities on lake ecosystems to help analyse how past patterns of shifts in state occurred, the influence of external conditions on these changes and how to avoid the development of catastrophic ecosystem failures. The ABM was constructed on the basis of the predator-prey relationship and other interactions like competition and the provision of refuge, and on known ecological theories, to simulate population dynamics in aquatic food webs in response to external drivers. The developed ABM, LAKEOBS_MIX, achieves a reasonable balance between generality and realism, providing insights into how ecosystems were affected by various drivers in MLYB-like lakes. The effects of the external environment are implemented as sub-models to the biotic interactions, and the currently available factor models are lake nutrient levels, temperature, water depth, area and changes in the number of fishes in the lake.
In-silico experiments were designed with multiple factors to simulate different conditions and measure ecosystem response patterns. The simulations show that the patterns of equilibria developed from the same initial state can be very different due to the stochasticity of spatial distribution and decision making of individuals in functional groups. Decades are needed for the modelled ecosystem to form a dynamic equilibrium without external disturbance, which can be altered by any sudden extinction of a functional group. In experiments where nutrient loading constantly impacts the formed equilibrium, results show regime shifts that occur at different times and in different patterns depending on the amount, pace and timing of nutrient loading and the presence of other influencing factors. In addition, recovery pathways are simulated from a hyper-eutrophic system state, confirming that moderate natural fishing and nutrient removal are efficient approaches to lowering the total nutrient level in lake ecosystems. Combining these two approaches provided us with an improved understanding of the development trajectory of lake ecosystems in the MLYB over the last 100 years. I discovered links between changes in conditions and their corresponding response mechanisms, and according to the results of the simulations I was able to develop possible regulatory approaches to avoid abrupt system degradation.
... The WSP model explores the stability of predator-prey relationships [21]. The construction of this model is described in two principal articles [45,46]. In our investigation, we used a variation of the model which includes grass in addition to wolves and sheep. ...
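The wolf-sheep-grass variation referred to above can be caricatured in a deliberately minimal, one-dimensional sketch: agents move, pay a metabolic cost, sheep graze grown grass, wolves eat co-located sheep, the starved die, and eaten grass regrows. Every parameter name and value here is illustrative; this is not the NetLogo Wolf Sheep Predation implementation.

```python
import random

def step(sheep, wolves, grass, params, rng=random.Random(42)):
    """One tick of a minimal wolf-sheep-grass sketch on a ring of
    patches. Agents are [patch_index, energy] pairs; `grass` maps each
    patch index to a regrowth countdown (0 means grown)."""
    n_patches = len(grass)
    for agent in sheep + wolves:       # move one patch, pay metabolism
        agent[0] = (agent[0] + rng.choice((-1, 1))) % n_patches
        agent[1] -= 1
    for s in sheep:                    # sheep graze grown grass
        if grass[s[0]] == 0:
            grass[s[0]] = params["regrowth"]
            s[1] += params["sheep_gain"]
    for w in wolves:                   # each wolf eats one co-located live sheep
        prey = next((s for s in sheep if s[0] == w[0] and s[1] > 0), None)
        if prey is not None:
            prey[1] = 0
            w[1] += params["wolf_gain"]
    sheep[:] = [s for s in sheep if s[1] > 0]   # starvation / predation deaths
    wolves[:] = [w for w in wolves if w[1] > 0]
    for p in range(n_patches):         # eaten grass regrows over time
        if grass[p] > 0:
            grass[p] -= 1
```

Even this toy version reproduces the qualitative point of the cited articles: with grass included, energy flows through three coupled levels, which is what stabilises the oscillations relative to the two-species model.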
Article
Full-text available
Regime shifts can abruptly affect hydrological, climatic and terrestrial systems, leading to degraded ecosystems and impoverished societies. While the frequency of regime shifts is predicted to increase, the fundamental relationships between the spatial-temporal scales of shifts and their underlying mechanisms are poorly understood. Here we analyse empirical data from terrestrial (n = 4), marine (n = 25) and freshwater (n = 13) environments and show positive sub-linear empirical relationships between the size and shift duration of systems. Each additional unit area of an ecosystem provides an increasingly smaller unit of time taken for that system to collapse, meaning that large systems tend to shift more slowly than small systems but disproportionately faster. We substantiate these findings with five computational models that reveal the importance of system structure in controlling shift duration. The findings imply that shifts in Earth ecosystems occur over ‘human’ timescales of years and decades, meaning the collapse of large vulnerable ecosystems, such as the Amazon rainforest and Caribbean coral reefs, may take only a few decades once triggered. Little is known about how the speed of ecosystem collapse depends on ecosystem size. Here, Cooper, Willcock et al. analyse empirical data and models finding that although regime shift duration increases with ecosystem size, this relationship saturates and even large ecosystems can collapse in a few decades.
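The positive sub-linear size-duration relationship described in this abstract is a power law, duration proportional to area raised to an exponent below one, which is conventionally estimated by least squares in log-log space. A sketch of that fit (pure Python; the data in the example are invented for illustration, not the paper's):

```python
import math

def fit_power_law(areas, durations):
    """Least-squares fit of duration = c * area**b in log-log space.
    Returns (c, b); b < 1 indicates the sub-linear scaling whereby
    larger systems take disproportionately less time per unit area."""
    xs = [math.log(a) for a in areas]
    ys = [math.log(d) for d in durations]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    c = math.exp(my - b * mx)
    return c, b
```

Fitting synthetic data generated with exponent 0.5 recovers that exponent exactly, which is a handy sanity check before applying the fit to real ecosystem data.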
... There are several approaches that promote creative theory development (DiSessa 2000; Wilensky & Reisman 2006; Lipman 1988; Bereiter & Scardamalia 2010). Bereiter & Scardamalia's theory about knowledge building combines a focus on students as producers of ideas with a focus on the elaboration and development of these ideas and theories. ...
... DOE was first applied in agriculture and biology, and its general potential for analyzing simulation models has been recognized (see, e.g., Antony ...). ... has the potential to address the validity aspect. They focus on conducting subexperiments to better understand the behavior of submodels. However, they do not outline a process to tackle the possibly complex interactions between submodels. ...
Preprint
Individual-based modeling is considered an important tool in ecology and other disciplines. A major challenge of individual-based modeling is that it addresses complex systems that include a large number of entities, hierarchical levels, and processes. To represent these, individual-based models (IBMs) usually comprise a large number of submodels. These submodels might be complex by themselves and interact with each other in many ways, which in turn can affect the overall system behavior in ways that are not always easy to understand. As a result, both the validity and credibility of IBMs can be limited. We here demonstrate how a cascaded design of simulation experiments (cDOE) may support the validity and efficiency of the analysis of IBMs and other ecological simulation models. We take a systematic approach that adopts a divide-and-conquer strategy. In a preparatory phase, submodels and their parameters are configured in “subexperiments”. Consequently, the “top-level experiments” of the simulation model can assess the research questions in a more valid and efficient way. Our strategy thus supports the structural realism of individual-based models because both the behavior of their main components and the relationships between these components are explicitly addressed. Keywords: sensitivity analysis; design of experiments; ecological theory; computational modeling; model analysis; validation
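The subexperiment-then-top-level workflow of a cascaded design can be sketched as below. The full-factorial design and the "freeze the best submodel level" selection rule are simplifying assumptions for illustration, not the authors' cDOE procedure in full.

```python
from itertools import product

def run_factorial(model, factors):
    """Full-factorial design: evaluate `model` at every combination of
    factor levels; returns a list of (settings, response) pairs."""
    names = sorted(factors)
    runs = []
    for levels in product(*(factors[n] for n in names)):
        settings = dict(zip(names, levels))
        runs.append((settings, model(**settings)))
    return runs

def cascaded_doe(submodel, toplevel, sub_factors, top_factors):
    """Cascaded design sketch: a preparatory subexperiment picks the
    best-scoring submodel configuration, which is then frozen while the
    top-level experiment varies only the top-level factors."""
    sub_runs = run_factorial(submodel, sub_factors)
    best_sub, _ = max(sub_runs, key=lambda r: r[1])
    top_runs = run_factorial(lambda **kw: toplevel(sub=best_sub, **kw),
                             top_factors)
    return best_sub, top_runs
```

The divide-and-conquer benefit is visible even in this toy: the top-level experiment runs over far fewer combinations than a single design crossing all submodel and top-level factors at once.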
... The model gives a stylized example of the possible dynamics of resource management for crime control, building on a predator-prey ecological model developed by Wilensky & Reisman (1999). Starting from that model, the present ABM was programmed, aiming for the greatest possible abstraction, including a minimum of variables and simplifying the model as much as possible, applying the KISS principle, Keep It Simple, Stupid! (Axelrod, 1997), which urges that ABMs be as simple as possible so that they help us understand the phenomenon they model. ...
Article
Full-text available
This paper presents Agent-based Modeling (ABM) as a useful tool for analyzing social systems and the impact of possible interventions on such systems. To illustrate the power of ABM, the paper uses a didactic example of resource management applied to delinquency control. Using the ABM's results, the paper shows the analysis of possible resource management for public security. The model shows how four types of dynamics appear as the value of the reward for catching a thief increases, with abrupt thresholds between them: a) police extinction; b) coexistence of police and thieves; c) police and thief extinction; and d) thief extinction. The paper concludes that ABM may help in the planning of resources in social interventions and/or research aimed at explaining the basis of non-linear social dynamics.
... Wilensky and colleagues have advocated for thinking in levels as key to building explanations: being able to move between an emergent phenomenon, such as evaporation, and an agent level, such as that of water particles and their individual behaviors and properties (first described in Wilensky & Resnick, 1999). In addition, they have argued that levels slippage, or conflating the behaviors and properties of agents at one level with those at a different scalar level, underlies many errors in student reasoning (e.g., Levy & Wilensky, 2009; Sengupta & Wilensky, 2009; Wilensky & Reisman, 2006). Building from this work, we identify this strategy, considering what is going on at the scalar level below the level of the observed phenomenon, as an important epistemic heuristic guiding students' construction of mechanistic accounts. ...
Article
Mechanistic reasoning, or reasoning systematically through underlying factors and relationships that give rise to phenomena, is a powerful thinking strategy that allows one to explain and make predictions about phenomena. This article synthesizes and builds on existing frameworks to identify essential characteristics of students’ mechanistic reasoning across scientific content areas. We argue that these characteristics can be represented as epistemic heuristics, or ideas about how to direct one’s intellectual work, that implicitly guide mechanistic reasoning. We use this framework to characterize middle school students’ written explanatory accounts of two phenomena in different science content areas using these heuristics. We demonstrate evidence of heuristics in students’ accounts and show that the use of the heuristics was related to but distinct from science content knowledge. We describe how the heuristics allowed us to characterize and compare the mechanistic sophistication of account construction across science content areas. This framework captures elements of a crosscutting practical epistemology that may support students in directing the construction of mechanistic accounts across content areas over time, and it allows us to characterize that progress.
... Agent-based models and modeling environments are a specific type of simulation that has gained much traction in middle and high school science education research. These simulations are particularly well suited for exploring emergent systems, whereby a system comprises many elements (such as atoms, electrons, or organisms; see Figure 7) which, when they interact with one another and their immediate environment, create an often unexpected outcome that is observable at a different level than the elements themselves (liquid diffusion, electrical current, or the SIR pattern of disease spread; for a recent review see Wilensky & Jacobson, 2014; Wilensky & Reisman, 2006). ... Figure 7.
Technical Report
Full-text available
What it means to work with data has changed significantly since the preparation and publication of America’s Lab Report (Singer, Hilton, & Schweingruber, 2006) in ways that are impacting students, educators, and the very practice of science. This change is expressing itself most obviously in the abundance of data that can be collected and accessed by students and teachers. There are also notable changes in the types of data (e.g., GPS data, network data, qualitative/verbal data) that are now readily available, and the purposes for which data are collected and analyzed. These shifts have both generated enthusiasm and raised a number of questions for K-12 science educators as new science standards are being adopted across the United States. The questions driving this paper are: In this age of data abundance, what is the state of research on data use to support middle and secondary students’ learning? And, how might science and engineering education and educational research for those grade levels adapt to the changes in data availability and use observed in the past 10 years?
... By engaging in these practices, learners are expected to discover the inner workings of scientific and mathematical systems: first elaborating their understandings of a given system through constructing a computer model, then "debugging" that knowledge by testing and refining the model (Papert, 1980; Penner, 2000). With proper facilitation and support, computational modeling is generally understood to be an effective way to engage learners in model-based inquiry and knowledge construction (van Joolingen, de Jong, & Dimitrakopoulou, 2007; Wilensky & Reisman, 2006). Here, we argue that it can also be transformative at the classroom level by providing a framework for students to collectively contribute, evaluate, and synthesize scientific ideas through the production and refinement of shared playable artifacts. ...
Conference Paper
Full-text available
Although computational modeling is noted as a powerful way to engage students in scientific knowledge construction, many studies focus on individuals or small groups. Here, we explore computational modeling as an infrastructure to support classroom level knowledge building. We present data from a two-week study where two fifth grade classrooms modeled evaporation and condensation. We focus our analysis on one group that experienced success with the activity, and another that struggled; these groups’ intended models emphasized random motion and aggregation respectively, two important but complementary molecular behaviors. Both groups’ ideas were incorporated into a collective model designed in consultation with the entire class. We show that computational modeling (1) often required explicit support, but when leveraged productively (2) served a representational role by supporting the elaboration of student ideas about physical mechanism, and (3) served an epistemic role by allowing students to compare, synthesize, and build on other’s contributions.
... Conversely, its actions can modify this environment. Being centred on the individual, a multi-agent system allows the user of a simulation to assume the role of an agent and thus to "think like a wolf, a sheep, or a firefly" [3]. In recent years, the development of Cormas has taken an innovative direction oriented towards participatory modelling, namely the collective design of models and interactive simulation. ...
Conference Paper
Full-text available
This paper presents the new functionalities of Cormas, a multi-agent modeling platform dedicated to the management of renewable resources. As free software, Cormas is intended to facilitate the design of ABMs as well as the monitoring and analysis of simulation scenarios. Today Cormas has taken an innovative direction oriented towards the collective design of models and interactive simulation. These hybrid simulations mix decisions taken by stakeholders with decisions taken by the model. This allows the user to interact with a simulation by changing the behavior of agents and the way they use resources. Thus, it is possible to collectively explore medium- and long-term scenarios to better understand how a desired situation may be reached. In turn, this feeds back into the collective design of the model.
After explaining the philosophy of companion modeling, this paper presents how the Cormas functionalities are put into practice through three experiments with stakeholders facing actual environmental challenges.
... NetLogo, a multi-agent programmable modelling environment, is used because its simple language structure allows relatively rapid development of models that can incorporate large numbers of static 'patch' agents or moving individual agents. Time and movement can be modelled, and extensions allow input and output of GIS-readable files (Wilensky and Reisman 2006). The dispersal model is built in NetLogo as this is a relatively straightforward modelling package which, if the model were used as a tool by lynx conservationists, would aid transparency in terms of its flaws and assumptions. ...
Conference Paper
Published version available at: Philips, I. (2019) An Agent Based Model to Estimate Lynx Dispersal if Re-Introduced to Scotland. Applied Spatial Analysis and Policy https://doi.org/10.1007/s12061-019-09297-4 Re-introduction of Eurasian Lynx to Scotland is being considered. Work by others has identified areas of suitable habitat, but it is not contiguous. This model examines the potential for lynx to successfully disperse from a release point. Movement rules derived from observation of wild lynx in Europe have been established in the literature. These are used in the current model to assess whether lynx are able to bridge the gap between noncontiguous areas of habitat and whether their typical movement patterns suggest that they will explore enough habitat in the months following release. The model is built using Netlogo software. Results, based on observed lynx movement rules from several European studies suggest a number of sites which would enable released lynx to access sufficient habitat to establish territories. The visual nature of the model and its mapped outputs may also have a discursive function as a tool to help people understand a landscape which includes lynx and people. The model could be easily adapted to include modified movement rules specific to the Scottish case and further developed to model human interactions such as risks from road traffic and disturbance from recreation.
... Agent-based modeling (hence ABM) has been increasingly used by natural scientists to study a wide range of phenomena such as the interactions of species in an ecosystem, the interactions of molecules in a chemical reaction, the percolation of oil through a substrate, and the food-gathering behavior of insects (e.g., Bonabeau, Dorigo, & Théraulaz, 1999; Wilensky & Reisman, 1998). Such phenomena, in which the elements within the system (molecules, or ants) have multiple behaviors and a large number of interaction patterns, have been termed complex and are collectively studied in a relatively young interdisciplinary field called complex systems or complexity studies (e.g., Holland, 1995). ...
Conference Paper
Full-text available
We have been exploring the potential of agent-based modeling methodology for social-science research and, specifically, for illuminating theoretical complementarities of cognitive and socio-constructivist conceptualizations of learning (e.g., Abrahamson & Wilensky, 2005a). The current study advances our research by applying our methodology to pedagogy research: we investigate individual and social factors underlying outcomes of implementing collaborative-inquiry classroom practice. Using bifocal modeling (Blikstein & Wilensky, 2006a), we juxtapose agent-based simulations of collaborative problem solving with real-classroom data of students' collaboration in a demographically diverse middle-school mathematics classroom (Abrahamson & Wilensky, 2005b). We validate the computer model by comparing outcomes from running the simulation with outcomes of the real intervention. Findings are that collaboration pedagogy emphasizing group performance may forsake individual learning, because stable division-of-labor patterns emerge due to utilitarian preference of short-term production over long-term learning (Axelrod, 1997). The study may inform professional development and pedagogical policy (see interactive applet: http://ccl.northwestern.edu/research/conferences/CSCL2007/CSCL2007.html).
... That the equation-based model can be given an agent-based interpretation has also been frequently seen in other social sciences. For example, Uri Wilensky, the creator of NetLogo, frequently used the famous Lotka-Volterra equation, a prominent equation in ecology, as an example to illustrate how the same kind of phenomena can be generated by agent-based models (Wilensky and Reisman, 2006) (see NetLogo Models Library: Sample Models/Biology, Wolf Sheep Predation). However, because of the inclusion of geographical specifications, the Lotka-Volterra equation can only be considered an approximation of the resulting dynamics generated by the agent-based model. ...
Article
Full-text available
In this paper, the effect of the social network on macroeconomic stability is examined using an agent-based, network-based DSGE (dynamic stochastic general equilibrium) model. While the authors' primitive (first-stage) examination has the network generation mechanism as its main focus, their more in-depth second-stage analysis is based on a few main characteristics of network topologies, such as the degree, clustering coefficient, length, and centrality. Based on their econometric analysis of the simulation results, the authors find that the betweenness centrality contributes to the GDP instability and average path length contributes to the inflation instability. These results are robust under two augmentations, one taking into account non-linearity and one taking into account the shape of the degree distribution as an additional characteristic. Through these augmentations, the authors find that the effect of network topologies on economic stability can be more intriguing than their baseline model may suggest: in addition to the existence of non-linear or combined effects of network characteristics, the shape of the degree distribution is also found to be significant. © Author(s) 2014. Licensed under the Creative Commons License - Attribution 3.0.
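Two of the network characteristics this study regresses on, average path length and the clustering coefficient, can be computed directly; a pure-Python sketch for small undirected graphs follows (degree is trivial and betweenness centrality is omitted for brevity; the `{node: set(neighbours)}` representation is an illustrative choice).

```python
from collections import deque

def avg_path_length(adj):
    """Mean shortest-path length over all ordered node pairs of a
    connected undirected graph given as {node: set(neighbours)},
    using one breadth-first search per source node."""
    total, pairs = 0, 0
    for src in adj:
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs

def clustering(adj, node):
    """Local clustering coefficient: realized links among the node's
    neighbours divided by the number of possible such links."""
    nbrs = adj[node]
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for u in nbrs for v in adj[u] if v in nbrs) / 2
    return 2 * links / (k * (k - 1))
```

On a triangle both measures equal 1; on a three-node path the centre node has clustering 0 and the mean path length rises above 1, matching the intuition that chains are "longer" and less clustered than cliques.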
... For these reasons, simulation is a way of developing theory and making preliminary estimates that can later inform decisions about possible empirical studies (Axelrod, 1997). In this manuscript we use Agent-Based Modeling (ABM) as a simulation tool (Maguire, McKelvey, Mirabeau & Oztas, 2006; Wilensky & Reisman, 1999). ...
Article
Full-text available
Gender stereotypes are sets of characteristics that people believe to be typically true of a man or woman. We report an agent-based model (ABM) that simulates how stereotypes disseminate in a group through associative mechanisms. The model consists of agents that carry one of several different versions of a stereotype, which share part of their conceptual content. When an agent acts according to his/her stereotype, and that stereotype is shared by an observer, then the latter's stereotype strengthens. Contrarily, if the agent does not act according to his/her stereotype, then the observer's stereotype weakens. In successive interactions, agents develop preferences, such that there will be a higher probability of interaction with agents that confirm their stereotypes. Depending on the proportion of shared conceptual content in the stereotype's different versions, three dynamics emerge: all stereotypes in the population strengthen, all weaken, or a bifurcation occurs, i.e., some strengthen and some weaken. Additionally, we discuss the use of agent-based modeling to study social phenomena and the practical consequences that the model's results might have on stereotype research and their effects on a community.
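The strengthening/weakening rule this abstract describes might be sketched as a single encounter, below. The probability-of-acting rule, the update size `delta`, and the clamping to [0, 1] are illustrative assumptions, not the published model's specification.

```python
import random

def interact(actor, observer, delta=0.05, rng=random.Random(7)):
    """One encounter in a minimal stereotype-dissemination sketch:
    the actor behaves stereotype-consistently with probability equal
    to its own stereotype strength; a confirming act strengthens the
    observer's (shared) stereotype by `delta`, a disconfirming act
    weakens it, with strength clamped to [0, 1]."""
    consistent = rng.random() < actor["strength"]
    change = delta if consistent else -delta
    observer["strength"] = min(1.0, max(0.0, observer["strength"] + change))
    return consistent
```

Repeating such encounters over a population, with interaction preferences updated toward confirming partners, is what would let the all-strengthen, all-weaken, or bifurcation dynamics emerge.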
... The right part of the screen is called the Aggregate Model and its evolution is only graphically depicted. This part of the screen describes the evolution of the two populations based strictly on the mathematical treatment provided, in turn, by a form of the famous set of two Lotka-Volterra equations [28][29]. ...
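For reference, the classic two-population Lotka-Volterra system mentioned above is commonly written as (standard textbook form, not reproduced from the cited work):

```latex
\frac{dx}{dt} = \alpha x - \beta x y, \qquad
\frac{dy}{dt} = \delta x y - \gamma y
```

where x is the prey population, y the predator population, α the prey growth rate, β the predation rate, δ the predator reproduction rate, and γ the predator death rate.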
Article
Full-text available
In this paper, the effectiveness of the NetLogo programming environment is investigated, as regards assisting students (of various levels of achievement) of the Greek Higher Secondary Education, to understand how some simple ecosystems are structured and to model the systemic behaviour of such ecosystems by conceptualising their Complexity features. This paper is part of a wider research on teaching Ecosystem Complexity to high-school students, with the use of Information and Communication Technologies (ICTs). Specific models from the NetLogo Models' Library were used and students of the 2nd class of the Greek Lyceum (aged between 16 and 17) participated in the investigation. Apart from oral instruction, the students were asked to run the NetLogo simulations and do specific things with the models, answering simultaneously questions on worksheets provided to them. The studying and the evaluation of the worksheets by the researchers, as well as the post-instructional evaluation of the students, both oral (by means of cassettes) and written, through the use of an evaluation sheet, gave research findings that proved to be encouraging in that: a) students developed a greater understanding of the complex/systemic behaviour of ecosystems and b) they were capable, to a certain extent, of analyzing the systemic relations within simple ecosystems and built analogous relations in other, also simple, ecosystems.
Article
Full-text available
Research exploring the relationship between computational thinking and domain-specific knowledge gains (i.e. biological evolution) is becoming more common in science education research. The mechanisms behind these relationships are not well understood, particularly between computational practices and biological evolution content knowledge. Increased computational complexity (i.e. simple to complex) may support a greater comprehension of scales or levels of biological organization (i.e. micro to macro) within the context of biological evolution learning. We applied quantitative methods to qualitative data, in the form of coding and relational analysis, to identify which biological levels of organization students addressed, how students made connections between these levels, and the level of computational complexity displayed during evolution learning across two computational interventions. The aim of this study was not only to explore the biological levels and level connections made during the computational thinking interventions, but also to analyze the differences between the two interventions. The results showed that the use of specific biological levels, connections between levels, and differences in computational complexity were distinguishable, and that there were significant differences between the interventions. These factors may contribute to a better understanding of biological evolution knowledge gains.
Poster
Full-text available
College is a critical time when changes in students' attitudes, knowledge, personality characteristics, and self-concepts are affected by their face-to-face and online interactions with educators, peers, and the campus climate (Astin, 1997). The growing use of big data and analytics in higher education has fostered research that supports human judgement in the analysis of information about learning and the application of interventions that can aid students in their development and improve retention rates (Siemens & Baker, 2012). This information is often displayed in the form of learning analytics dashboards (LADs), which are individual displays with multiple visualizations of indicators about learners, their learning activities, and/or features of the learning context both at the individual and group levels (Schwendimann et al., 2017). The information presented in LADs is intended to support students' learning competencies that include metacognitive, cognitive, behavioral, and emotional self-regulation (Jivet et al., 2018). We investigated the impact of a student-facing LAD on students' self-concepts and viewing preferences to address the following questions: What are students' viewing preferences (i.e., for individual vs. comparative performance feedback)? How does viewing performance information affect the development of students' metacognitive skills and self-concepts? And, what are students' perceptions about the usability of LADs? In an end-of-term survey, 111 students at a large research university responded to 10 Likert scale and three open-ended questions. Overall, the students reported understanding the information that was presented to them through the LAD and that it was useful, although some students expressed concerns about its accuracy and wanted more detailed information. 
Students also reported that they preferred to view comparisons to other students over viewing only their own performance information, and that LAD use increased positive affect about performance. Students also reported that dashboard use affected how much they believed they understood the course material and the time and effort they were willing to put into the course, and that it lessened their anxiety. We concluded that course-specific or program-specific outcomes may require different LAD design and evaluation approaches, and that nonuse of the LAD may be linked to self-confidence, forgetfulness, and a lack of innovative dashboard features. Our study was limited by the analysis of survey data (without trace data) and by the sample size. This research contributes to the literature on student-facing learning analytics dashboards (LADs) by investigating students' reasons for interacting with dashboards, their viewing preferences, and how their interactions affect their performance, and by tying these insights to educational concepts that were part of the LAD design. Further research is needed to determine whether presenting students with the option to turn on the dashboard for any or all of their courses over the course of the semester is important.
Chapter
Full-text available
When embarking on a new model, a programmer working with scholars in the humanities is often tasked with helping likely non-programmers make critical decisions about how to set about modelling the theory at hand. I argue that, in these early stages of development, the goals of the researcher and epistemological considerations are of paramount importance to the development of valid computational models. To ground this discussion in a real-world example, this chapter outlines a mistake, made by myself, at a critical stage early in the modelling process. Specifically, based on early discussions with the theorist, I suggested modelling the theory as an agent-based model. After critical reflection following substantial development, I came to the conclusion that the theory is better modelled as a system dynamics model. In the chapter, I reflect on what drove me to make the original mistake, what caused me to realize the error, and what the result of correcting the error was. I share this mistake for two reasons: (1) so that others in similar situations might not fall into the same pitfalls, and (2) to open a dialogue concerning the epistemology of the social sciences and humanities insofar as it relates to modelling and simulation. My general conclusion is that the thinking contributed by the social scientist or humanities scholar should be fully fleshed out at the early stages of model development, as their strength is attention to theoretical nuance. This is of utmost importance to model development; if unaddressed, it will cause issues later during model validation and verification.
Article
Full-text available
Learners often struggle to grasp the important, central principles of complex systems, which describe how interactions between individual agents can produce complex, aggregate-level patterns. Learners have even more difficulty transferring their understanding of these principles across superficially dissimilar instantiations of the principles. Here, we provide evidence that teaching high school students an agent-based modeling language can enable students to apply complex system principles across superficially different domains. We measured student performance on a complex systems assessment before and after 1 week training in how to program models using NetLogo (Wilensky, 1999a). Instruction in NetLogo helped two classes of high school students apply complex systems principles to a broad array of phenomena not previously encountered. We argue that teaching an agent-based computational modeling language effectively combines the benefits of explicitly defining the abstract principles underlying agent-level interactions with the advantages of concretely grounding knowledge through interactions with agent-based models.
Article
Full-text available
Human activity often generates effluents that pollute waterways and tend to cause serious environmental problems. A very useful tool to help assess these issues is mathematical modeling. A specific class of models that is very promising for water resources and pollution control is agent-based models. This paper presents an agent-based model developed and used to simulate plumes of conservative and non-conservative constituents in watercourses. The assumptions made to represent advection, dispersion, and first-order decay phenomena in the model are presented, and these solutions are evaluated against the analytical solution of the one-dimensional advection-dispersion equation. The results of the comparisons show a good representation by the agent-based model and show the importance of how the released constituent's mass discretization is defined. The solutions developed and presented are promising for the application of agent-based models to the simulation of the impacts of discharges into watercourses, and the developed model also presents itself as an educational tool for a simple understanding of the simulated phenomena. Keywords: water quality; simulation; agent-based modeling
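For reference, the one-dimensional advection-dispersion equation with first-order decay that the abstract refers to is commonly written as (standard form, not reproduced from the cited work):

```latex
\frac{\partial C}{\partial t} = D \frac{\partial^2 C}{\partial x^2} - u \frac{\partial C}{\partial x} - k C
```

where C is the constituent concentration, D the longitudinal dispersion coefficient, u the flow velocity, and k the first-order decay rate (k = 0 for a conservative constituent).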
Article
Full-text available
Nonlinear biochemical processes and feedback play a crucial role in the physiology of living organisms; nonlinear models are powerful tools that allow us to understand complex biochemical processes and their dynamics. Nonlinear biochemistry has been used to explain the mechanism of regulation of thyroid hormone, the Krebs cycle, and blood pH. Another example is the process of cell death in its various forms. There are mathematical models that show the dynamics of protein synthesis in the cell; such models demonstrate the nonlinearity of the mechanism, showing how the concentrations of intracellular and extracellular proteins vary depending on the metabolic needs of the cell. These models can be used to describe the dynamics of different biochemical mechanisms involved in cell death. In particular, in this paper we employ nonlinear dynamics to model, in a closed system, two cells, one normal and one cancerous.
Article
Full-text available
Components of complex systems apply across multiple subject areas, and teaching these components may help students build unifying conceptual links. Students, however, often have difficulty learning these components, and limited research exists to understand what types of interventions may best help improve understanding. We investigated 32 high school students’ understandings of complex systems components and whether an agent-based simulation could improve their understandings. Pretest and posttest essays were coded for changes in six components to determine whether students showed more expert thinking about the complex system of the Chesapeake Bay watershed. Results showed significant improvement for the components Emergence (r = .26, p = .03), Order (r = .37, p = .002), and Tradeoffs (r = .44, p = .001). Implications include that the experiential nature of the simulation has the potential to support conceptual change for some complex systems components, presenting a promising option for complex systems instruction.
Article
Individual-based modeling is considered an important tool in ecology and other disciplines. A major challenge of individual-based modeling is that it addresses complex systems that include a large number of entities, hierarchical levels, and processes. To represent these, individual-based models (IBMs) usually comprise a large number of submodels. These submodels might be complex by themselves and interact with each other in many ways, which in turn can affect the overall system behavior in ways that are not always easy to understand. As a result, both the validity and credibility of IBMs can be limited. We here demonstrate how a cascaded design of simulation experiments (cDOE) may support the validity and efficiency of the analysis of IBMs and other ecological simulation models. We take a systematic approach that adopts a divide-and-conquer strategy. In a preparatory phase, submodels and their parameters are configured in “subexperiments”. Consequently, the “top-level experiments” of the simulation model can assess the research questions in a more valid and efficient way. Our strategy thus supports the structural realism of individual-based models because both the behavior of their main components and the relationships between these components are explicitly addressed.
Article
Habitat fragmentation has a harmful impact on wild animals. In this paper, we propose an agent-based simulation model to estimate the fragmentation impacts (FIs) on wild animals caused by road networks. We employ a predator-prey system as a simplified ecological system and investigate FIs on wildlife by adding roads gradually into the habitat. Simulation results are consistent with ecological observations from previous research. We find that (1) as road density increases, but remains below a critical threshold, the wolf population decreases monotonically; (2) when road density steps over the critical threshold, wolves go extinct; and, surprisingly, (3) after the extinction of the wolves, the sheep cannot recover to the carrying capacity of the initial wolf-free state. Our simulation model provides not only a way to estimate the ecological impacts caused by an existing road network, but also a tool for considering alternative regional policies for road network planning.
Article
The multidisciplinary study of complex systems in the physical and social sciences over the past quarter of a century has led to the articulation of important new conceptual perspectives and methodologies that are of value both to researchers in these fields as well as to professionals, policymakers, and citizens who must deal with challenging social and global problems in the 21st century. The main goals of this article are to (a) argue for the importance of learning these ideas at the precollege and college levels; (b) discuss the significant challenges inherent in learning complex systems knowledge from the standpoint of learning sciences theory and research; (c) discuss the "learnability issue" of complex systems conceptual perspectives and review a body of literature that has been exploring how learning sciences pedagogical approaches can lead to student learning of important dimensions of complex systems knowledge; (d) argue that the cognitive and sociocultural factors related to learning complex systems knowledge are relevant and challenging areas for learning sciences research; and (e) consider ways that concepts and methodologies from the study of complex systems raise important issues of theoretical and methodological centrality in the field of the learning sciences itself.
Conference Paper
A novel analysis method for systems engineering based on wargaming technology is proposed. First, the essential factors and procedures of wargaming are introduced. Then, the systems engineering method based on wargaming is analyzed. Finally, the systems modeling, simulating, and analyzing procedure is described in detail through a worked example. The characteristics and advantages of this method are summarized by comparison with the general multi-agent modeling method.
Article
Placed in the larger context of broadening the engagement with systems dynamics and complexity theory in school-aged learning and teaching, this paper is intended to introduce, situate, and illustrate—with results from the use of network supported participatory simulations in classrooms—a stance we call ‘embedded complementarity’ as an account of the relations between two major forms of systems-related learning and reasoning. The two forms of systems reasoning discussed are called ‘aggregate’ and ‘agent-based.’ These forms of reasoning are presented as distinct yet we also outline how there are forms of complementarity, between and within these approaches, useful in analyzing complex dynamic systems. We then explore specific ways in which the embedded complementarity stance can be used to analyze how learner understandings progress in science, technology, engineering, and mathematics-related participatory simulations supported by the HubNet (Wilensky and Stroup 1999c) learning environment developed with support from the National Science Foundation. We found that the learners used and built on the interdependence of agent and aggregate forms of reasoning in ways consistent with the discussion of embedded complementarity outlined in the early parts of the paper.
Book
Full-text available
This guide provides broad coverage of computational Artificial Life, a field encompassing the theories and discoveries underpinning the invention and study of technology-based living systems. The book focusses specifically on Artificial Life realised in computer software. Topics covered include the pre-history of Artificial Life, artificial chemistry, artificial cells, organism growth, locomotion, group behaviour, evolution and ecosystem simulation.
Article
The general aim is to promote the use of individual-based models (biological agent-based models) in teaching and learning contexts in life sciences and to make their progressive incorporation into academic curricula easier, complementing other existing modelling strategies more frequently used in the classroom. Modelling activities for the study of a predator–prey system for a mathematics classroom in the first year of an undergraduate program in biosystems engineering have been designed and implemented. These activities were designed to put two modelling approaches side by side, an individual-based model and a set of ordinary differential equations. In order to organize and display this, a system with wolves and sheep in a confined domain was considered and studied. With the teaching material elaborated and a computer to perform the numerical resolutions involved and the corresponding individual-based simulations, the students answered questions and completed exercises to achieve the learning goals set. Students’ responses regarding the modelling of biological systems and these two distinct methodologies applied to the study of a predator–prey system were collected via questionnaires, open-ended queries and face-to-face dialogues. Taking into account the positive responses of the students when they were doing these activities, it was clear that using a discrete individual-based model to deal with a predator–prey system jointly with a set of ordinary differential equations enriches the understanding of the modelling process, adds new insights and opens novel perspectives of what can be done with computational models versus other models. The complementary views given by the two modelling approaches were very well assessed by students.
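The aggregate (ODE) side of the wolf-sheep comparison described above can be sketched numerically. The parameter values and forward-Euler scheme below are illustrative assumptions, not the course's actual teaching material:

```python
def lotka_volterra(sheep, wolves, alpha=1.0, beta=0.1, delta=0.075, gamma=1.5):
    """Right-hand side of the classic predator-prey ODEs."""
    d_sheep = alpha * sheep - beta * sheep * wolves     # prey growth minus predation
    d_wolves = delta * sheep * wolves - gamma * wolves  # predation gain minus death
    return d_sheep, d_wolves

def euler_integrate(sheep0, wolves0, dt=0.001, steps=20000):
    """Forward-Euler integration; returns the (sheep, wolves) trajectory."""
    sheep, wolves = sheep0, wolves0
    history = [(sheep, wolves)]
    for _ in range(steps):
        ds, dw = lotka_volterra(sheep, wolves)
        sheep += dt * ds
        wolves += dt * dw
        history.append((sheep, wolves))
    return history

trajectory = euler_integrate(10.0, 5.0)
```

A discrete individual-based version of the same system replaces these smooth population-level rates with per-agent rules (move, eat, reproduce, die), which is precisely the contrast the activities above ask students to explore.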