Paper presented at the annual meeting of the Jean Piaget Society, June 2006, Baltimore, MD.
Minsky, Mind, and Models: Juxtaposing Agent-Based Computer Simulations and Clinical-
Interview Data as a Methodology for Investigating Cognitive-Developmental Theory
Paulo Blikstein1, Dor Abrahamson2, and Uri Wilensky1
1Northwestern University
2University of California, Berkeley
paulo@northwestern.edu
dor@berkeley.edu
uri@northwestern.edu
We discuss an innovative application of computer-based simulations in the study of cognitive development. Our
work builds on previous seminal contributions to the field, in which theoretical models of cognition were implemented
in the form of computer programs in an attempt to predict human reasoning (Newell & Simon, 1972; Fischer & Rose,
1999). Our computer model can serve both to illustrate the Piagetian theoretical model and to simulate it
starting from clinical-interview data. We focused on the Piagetian conservation experiment, and collected and
analyzed data from actual (not simulated) interviews.
The interviews were videotaped, transcribed, and coded in terms of parameters of the computer simulation. The
simulation was then fed these coded data. We were able to perform different kinds of experiments:
1) Play back the interview and the computer model side by side, trying to identify behavior patterns;
2) Model validation: investigate whether the child’s decision-making process can be predicted by the model.
We conclude that agent-based simulation, run alongside real data, offers powerful methods for exploring the
emergence of self-organized hierarchical organization in human cognition. We are currently exploring the entire
combinatorial space of hypothetical children’s initial mental states and running the simulation for each of these
states. From that perspective, our data from real participants become cases within the combinatorial space.
Introduction
We discuss an innovative application of computer-based simulations in the study of cognitive
development. Our work builds on previous seminal contributions to the field, in which theoretical models of
cognition were implemented in the form of computer programs in an attempt to predict human reasoning
(Newell & Simon, 1972; Fischer & Rose, 1999). Computers offer powerful methods for exploring the
emergence of self-organized hierarchical organization in human cognition. In particular, agent-based
modeling (ABM; e.g., ‘NetLogo,’ Wilensky, 1999; ‘Swarm,’ Langton & Burkhardt, 1997; ‘Repast,’
Collier & Sallach, 2001) enables theoreticians to assign rules of behavior to computer “agents,”
whereupon these entities act independently but with awareness of local contingencies, such as the
behaviors of other agents. ABM has been used to illustrate aspects of cognitive development (see
Abrahamson & Wilensky, 2005, for a previous JPS paper on Piagetian–Vygotskian perspectives on
individual learning in social contexts). We, too, propose to use ABM to simulate human reasoning, yet
we move forward by juxtaposing our simulation with real data. We chose the theory of Minsky (1985),
the Society of Mind, because it is dynamical, hierarchical, and emergent; we chose a Piagetian
conservation task, because Minsky modeled this task with his theory; finally, we worked with children
in both transitional and stable phases so as to elicit richer data. Our full paper and presentation provide
step-by-step hybrid narratives – computer simulation vs. videography – of children’s performance on a
conservation task. In the remainder of this paper, we will introduce Minsky's theory, explain our
experiment (a variation on the classical conservation-of-volume task, Piaget, 1952), and present case
studies where simulation and real data are juxtaposed.
Minsky’s model of Piagetian experiments stresses the importance of structure, and especially of its
reorganization, to cognitive development. Younger children, for instance, would have ‘one-level’
priority-based structures: one aspect would always be more dominant (tall would take priority over thin and over
JPS 2006 PAPER Submission Form ID: _____
2 of 6
confined; see Figure 1), and compensation is nonexistent.
Figure 1. A one-level model for evaluating “who has more.”
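The one-level structure of Figure 1 can be sketched in a few lines of code. The sketch below is an illustrative reconstruction, not our NetLogo model: the agent names, the dominance order, and the default “same” answer are assumptions made for the example.

```python
# A minimal sketch of the 'one-level' priority model: each perceptual
# agent may report a verdict, and a fixed dominance order decides which
# verdict wins -- no coordination or compensation between agents.

# Fixed dominance order: Tall dominates Thin, which dominates Confined.
PRIORITY = ["tall", "thin", "confined"]

def one_level_decision(reports):
    """reports maps agent name -> 'mine', 'yours', or is absent (agent
    not salient). The most dominant salient agent alone decides."""
    for agent in PRIORITY:
        verdict = reports.get(agent)
        if verdict is not None:
            return verdict
    return "same"  # no agent salient: default judgment (an assumption)

# My block is taller but thinner: Tall dominates, so 'I have more'.
print(one_level_decision({"tall": "mine", "thin": "yours"}))  # prints mine
```

Note that in this structure a conflict between Tall and Thin is never even registered: the dominant agent simply preempts the others, which is exactly the inability to coordinate two measures that the priority model attributes to younger children.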
Later, the child develops a new “administrative” layer that allows for more complex decisions: in Figure
2, for example, if “more tall” and “less thin” are in conflict, the history administrator can take over the
decision.
Figure 2. The new administrative layer.
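The arbitration in Figure 2 can be sketched as one extra level of agents. Again, this is an illustrative reconstruction rather than the NetLogo model; the administrator names follow Minsky’s terminology, but the code and its defaults are assumptions.

```python
# A sketch of the two-level model: an Appearance administrator aggregates
# the perceptual agents and abstains when they conflict, in which case a
# History administrator takes over the decision.

def appearance(reports):
    # Appearance administrator: unanimous verdict of Tall/Thin, or None
    # when the sub-agents conflict (it "says nothing").
    verdicts = {v for v in (reports.get("tall"), reports.get("thin"))
                if v is not None}
    return verdicts.pop() if len(verdicts) == 1 else None

def history(reports):
    # History administrator: e.g. 'nothing added or taken away' -> same.
    return reports.get("history")

def two_level_decision(reports):
    # Appearance decides when it can; otherwise history takes control.
    return appearance(reports) or history(reports) or "same"

# Tall reports 'mine' while Thin reports 'yours': appearance is in
# conflict, so the history administrator ('re-joinable, hence same') wins.
print(two_level_decision({"tall": "mine", "thin": "yours",
                          "history": "same"}))  # prints same
```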
The Experiment
Two elongated blocks of clay of the same shape but different colors are laid before the child. One is “the
child’s,” and the other is “the experimenter’s.” After the child agrees that both are the same size, the
experimenter cuts one block in two, lengthwise, and joins the two parts so as to form a block twice as
long, then cuts the other block in two, widthwise, to form a block twice as thick as before. The child is
asked whether the blocks are still “the same” or whether either person has more than the other.
According to the child’s response, the interaction then becomes semi-clinical, with the experimenter
pursuing the child’s reasoning and challenging him/her with further questions.
The interviews were videotaped and transcribed, and the data were coded in terms of parameters of the
computer simulation (see below). The simulation was then fed these coded data. We were able to
perform different kinds of experiments:
1) Play back the interview and the computer model side by side, trying to identify behavior
patterns and couch them in terms of the simulated model;
2) Model validation: investigate whether the child’s decision-making process can be predicted by
the model. We set the model to the child’s initial responses, “run” it through to completion,
and try to identify whether the simulated cognitive development matches the processes observed.
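The validation experiment (option #2) can be expressed as a small loop over coded interview turns. The sketch below is hypothetical: the coding scheme, agent names, and example turns are invented for illustration and are not our actual coded data.

```python
# A sketch of model validation: each interview turn is coded as a set of
# agent reports plus the child's observed answer; the model's predicted
# answer is compared with the observed one, turn by turn.

PRIORITY = ["number", "tall", "thin"]  # assumed dominance order

def priority_model(reports):
    # One-level priority decision, as in the Child 1 case below.
    for agent in PRIORITY:
        if reports.get(agent) is not None:
            return reports[agent]
    return "same"

def validate(model, coded_turns):
    """coded_turns: list of (agent_reports, observed_answer) pairs.
    Returns the fraction of turns the model predicts correctly."""
    hits = sum(model(reports) == observed
               for reports, observed in coded_turns)
    return hits / len(coded_turns)

# Hypothetical coded turns from a Child-1-like interview.
turns = [
    ({"number": "mine", "tall": "mine", "thin": "yours"}, "mine"),
    ({"tall": "mine", "thin": "yours"}, "mine"),
    ({"number": "yours"}, "yours"),
]
print(validate(priority_model, turns))  # prints 1.0
```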
Below, we demonstrate option #1, using hybrid data (computer simulation alongside human behavior).
We will show how the different models (programmed with data from the interviews with three children)
yield probabilistic clusters of responses surprisingly similar to those in the interviews themselves.
Figure 3. A screenshot of the computer model and its main components. Like the
child, the computer ‘sees’ blocks of clay and tries to determine which block is ‘more.’
The case studies below juxtapose the computer model (screen captures) with transcriptions and pictures from the interviews.
Child 1
From Child 1’s (age 6) interview, we inferred the simple
model below. Cognitive agents presumed to be active are
marked with a green outline. Dominance is represented in
the model by the vertical distance to the top. For this child,
whenever Number (the cardinal dimension of the stimulus)
is contextually salient, it dominates the decision-making
process. Tall also appears to dominate Thin.
“Because you cut in half, so there is two pieces,
but… It's not as fat as that. This is kind of fat, but this
is taller. I have more”.
Number is absent from this second interaction. Even
when two other measurements conflict, one is always
dominant. In this case, tall is more salient.
Researcher: Who has more?
Child 1: It's hard to tell now. [tries to measure the fat
one with his fingers, then compares his fingers with
the thin and tall one]. This one [the taller].
In the third interaction, the experimenter reintroduces
Number by cutting his piece in four: as predicted by the
model, number takes priority again over tall and thin.
“You have more, because you have four quarters, I
have only two halves.”
Interpretation: The ‘priority’ model can account for the responses of Child 1: he cannot coordinate two or more
measures. Neither can two measures be coordinated in the computer model. Given the same inputs, the computer
model and the interview data yield comparable results.
Child 2
Child 2 (age 8) has a model with Minsky’s “administrators”
(one for the appearance and one for the history of the
transformations). When one is in conflict, the other takes
control. If the Tall agent reports ‘more’ and the Thin agent
reports ‘less,’ then the Appearance administrator will say
nothing: it is in conflict and cannot decide.
“If you put them back together, you’ll have the same”
Child 2 has a new level of administrators, which enables him to background the appearance and focus on the
history of the objects. The blue block is ‘re-joinable,’ so both blocks are the same. During the interview, Child 2
occasionally said that nothing was added or taken away. The model, again, correctly determines the combinatorial
space and predicts the response frequency distribution.
Child 3
For Child 3 (age 10), material taken away? was far more
dominant than joinable? or appearance.
“It’s the same, because you still have the same
amount, even if you cut in half in different ways,
because it’s still in half.”
Child 3 backgrounds appearance from the start (see, in the model, that these agents are lower than others) and
focuses on confinement (nothing was taken away or added), and thus concludes that the blocks are still the same.
Conclusions
The computer model can be a useful vehicle both to illustrate the Piagetian theoretical model and to
simulate it starting from interview data. Through the lens of agent-based models, new properties of
Minsky’s model are revealed. Namely, the mature, hierarchical structure of the cognitive model is
stochastically determined, in the sense that across combinatorial initial conditions, and over sufficient
interactions, the same meta-structures ultimately emerge. Collecting and analyzing data from actual (not
simulated) interviews is an essential phase in the ongoing improvement of the computer simulation of a
theoretical model, such as Minsky’s model: The data sensitize us to the crucial components and
dimensions of the interactions and to the nature of the transformations. We are currently exploring the
entire combinatorial space of hypothetical children’s initial mental states and running the
simulation for each of these states. From that perspective, our data from real participants become cases
within the combinatorial space. At the conference, we will demonstrate the several strands of our
methodology, including simulation, prediction, and stochastic exploration of combinatorial spaces.
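The combinatorial exploration described above can be sketched as an exhaustive enumeration of hypothetical initial states. The sketch below is an illustration under strong simplifying assumptions: a three-agent one-level priority model and a three-valued verdict set, whereas the actual space of mental states in our model is larger.

```python
# A sketch of exploring the combinatorial space of initial mental states:
# enumerate every assignment of verdicts to the perceptual agents and
# tally the decisions a one-level priority model produces.
from itertools import product
from collections import Counter

PRIORITY = ["number", "tall", "thin"]   # assumed dominance order
VERDICTS = [None, "mine", "yours"]      # None = agent not salient

def priority_model(reports):
    for agent in PRIORITY:
        if reports.get(agent) is not None:
            return reports[agent]
    return "same"

# Every hypothetical initial state: 3 verdicts ^ 3 agents = 27 states.
space = [dict(zip(PRIORITY, vs))
         for vs in product(VERDICTS, repeat=len(PRIORITY))]

# Response-frequency distribution over the whole combinatorial space;
# real participants' coded states are individual cases within `space`.
distribution = Counter(priority_model(state) for state in space)
```

Under these assumptions the 27 states split into 13 “mine,” 13 “yours,” and a single “same” response, giving the kind of response-frequency distribution against which interview data can be compared.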
References
Collier, N., & Sallach, D. (2001). Repast. Chicago, IL: University of Chicago. http://repast.sourceforge.net
Fischer, K. W., & Rose, S. P. (1999). Rulers, models, and nonlinear dynamics. In G. Savelsbergh, H.
van der Maas, & P. van Geert (Eds.), Nonlinear developmental processes (pp. 197-212).
Amsterdam: Royal Netherlands Academy of Arts and Sciences.
Langton, C., & Burkhardt, G. (1997). Swarm. Santa Fe, NM: Santa Fe Institute. www.swarm.org/release.html
Minsky, M. (1985). The society of mind. London, England: Heinemann.
Newell, A., & Simon, H. (1972). Human problem solving. Englewood Cliffs, NJ: Prentice-Hall.
Piaget, J. (1952). The child's conception of number. London: Routledge & Kegan Paul.
Wilensky, U. (1999). NetLogo. Evanston, IL: Center for Connected Learning and Computer-Based
Modeling, Northwestern University. http://ccl.northwestern.edu/netlogo