Evaluating the Quality of Dialogical Argumentation in CSCL:
Moving Beyond an Analysis of Formal Structure
Douglas Clark, College of Education, Payne 203F, Arizona State University, Tempe, AZ 85287-0911, USA,
Douglas.B.Clark@asu.edu,
Victor Sampson, College of Education, Payne 203F, Arizona State University, Tempe, AZ 85287-0911, USA,
victor.sampson@asu.edu
Armin Weinberger, Ludwig-Maximilians-Universität, Leopoldstr. 13, 80802 Munich, Germany,
armin.weinberger@psy.lmu.de,
Gijsbert Erkens, Research Centre Learning in Interaction, Utrecht University, Heidelberglaan 1, 3584 CS Utrecht,
NL, G.Erkens@fss.uu.nl
Abstract: Over the last decade, researchers have developed sophisticated online learning
environments to promote argumentative discourse between students. This symposium examines
some of the diverse ways researchers have attempted to examine how students engage in
argumentation and to assess the effectiveness of CSCL environments in fostering productive
argumentation. The papers presented as part of this symposium will focus on four different
categories of analytic frameworks: (1) nature and function of contributions within the dialog, (2)
nature of reasoning, (3) conceptual quality, and (4) patterns and trajectories of participant
interaction. Example analytic frameworks from each category are presented in detail rich enough
to illustrate their nature and structure. Synthetic discussions of each category consider the
frameworks in light of the underlying theoretical perspectives on argumentation, pedagogical
goals, and online environmental structures.
Supporting and Promoting Argumentation in CSCL Environments
Online learning environments that engage and support students in dialogic argumentation provide excellent
opportunities for students to productively propose, support, evaluate, critique, and refine ideas. Over the last decade,
a number of sophisticated environments have been developed to support students engaging in this type of
knowledge-building discourse. Measuring the nature and quality of the dialogic argumentation that takes place
within these environments, however, has proven challenging. This is due, in part, to the context-specific nature of
argumentation (Andriessen, Baker, & Suthers, 2003). As a result, argumentation quality cannot be defined solely on
the basis of what it is; it must also be defined by what it is used for, who does it, and how it unfolds. Thus, in order
to facilitate research and the development of new CSCL environments, the papers presented as part of this
symposium highlight the foci, affordances, and constraints of several different analytic methods for assessing
dialogic argumentation that are currently available to researchers. In addition to providing an overview of available
methods, a major goal of this symposium is to highlight the benefits and limitations of using the different
frameworks for assessing the quality of argumentation in different contexts. The different contexts that we will
examine include: (a) the object (or subject) of the discussion, (b) the purpose for engaging in the discussion (e.g., to
persuade or to co-construct a better solution), (c) the norms that will govern how participants will distinguish
between ideas (e.g., fit with evidence or plausibility), and (d) the medium (the types of tools that have been
incorporated into the environment to support argumentation).
Analytic Frameworks Presented
Early work measuring students’ argumentation within CSCL environments relied heavily on analytic
frameworks that emphasized argument structure and the presence or absence of different structural components of
an argument as a way to assess quality (e.g., Toulmin, 1958). However, over the last decade, researchers interested
in supporting and promoting argumentation as part of CSCL environments have developed a broad range of methods
to assess the nature or quality of dialogic argumentation that better reflect the context-specific nature of
argumentation. These methods have enabled researchers to focus on specific aspects of argumentation and to
evaluate the impact of specific pedagogical goals or tools as a way to foster productive argumentation in CSCL
environments. In order to facilitate the comparison of these analytic frameworks, all of the papers presented as part
of this symposium evaluate the same short segment of student argumentation (see Table 1). The students in the
example are arguing within a customized asynchronous threaded discussion forum about their interpretations of the
data they have collected in an earlier part of the project (Clark & Sampson, 2005). At the heart of their argument is
the scientific principle of thermal equilibrium.
Table 1: A short sample of dialogical argumentation to facilitate comparisons.
Individual: Comment
Fran: I think objects in the same room remain different temperatures because some
objects are good conductors and some are bad. This determines how much heat
energy is allowed in and out of the object.
Amy: I disagree; I think all objects in the same room are the same temperature.
Conductivity only determines how quickly an object will reach room temperature.
Fran: No, good conductors let in more heat energy than poor conductors, so objects that
let in more heat will get hotter. For example, when I put a piece of metal and a
piece of plastic in hot water the metal was a higher temperature after 30 seconds.
Amy: I guess you’re right. Maybe objects are different temperatures.
How should researchers of CSCL environments interpret our student example in terms of argumentation
quality? In answering this question, researchers must choose a valid and reliable analytic method that (a) takes into
account the context-specific nature of argumentation and (b) is compatible with their theoretical perspectives on
argumentation, pedagogical goals, and the structure of their online learning environment. For example, researchers
interested in promoting argumentation where individuals attempt to negotiate meaning by “proposing and accepting
information in an effort to modify and build on each other’s knowledge” are likely to adopt different pedagogical
goals and online structures than researchers who are trying to promote argumentation where individuals attempt to
“convince each other of their own viewpoint” (Andriessen, Erkens, Van de Laak, Peters, & Coirier, 2003, p. 82).
These differences not only influence the nature of the argumentation that takes place between the participants in a
CSCL environment but also affect what counts as a productive conversation.
The analytic methods discussed in this symposium were chosen to represent a range of promising
approaches for analyzing dialogic argumentation in online learning environments. The selection process focused on
each method’s capabilities for assessing dialogic argumentation within online environments independent of whether
or not the method had been originally developed for application in online or offline environments. As previously
mentioned, the categories of analytic focus include (1) nature and function of contributions within the dialog, (2)
nature of reasoning, (3) conceptual quality, and (4) patterns and trajectories of participant interaction. Each of the
papers presented in this symposium focuses on one of these categories and uses the example of dialogical
argumentation provided above to illustrate the constraints and affordances of the different frameworks. Each paper
then concludes with a discussion of the suitability of the frameworks for examining the quality of argumentation in
different contexts. The purpose of this discussion is not to identify some frameworks as being “better” than others;
rather it is intended to provide researchers with a way to choose a framework that is compatible with their theoretical
perspectives on argumentation, pedagogical goals, and the structure of their online learning environment.
Analytic Frameworks that Focus on the Nature and Function of Contributions
within a Dialog in CSCL Environments
Gijsbert Erkens
Research Centre Learning in Interaction, Utrecht University
Analytic frameworks that focus on the nature and function of participants’ contributions examine the types
of dialog in which students engage as well as the proportion of conceptually and argumentatively productive dialog.
An example of this type of framework has been developed by deVries, Lund, and Baker (2002) to examine ways to
promote epistemic dialogue in online learning environments. As defined by deVries, Lund, and Baker, epistemic
dialog (1) takes place in a collaborative problem-solving situation, (2) can be characterized as argumentation or
explanation, and (3) concerns the knowledge and concepts underlying the problem-solving rather than the execution
of problem-solving actions. The analytic framework specifies four main categories (explanation, argumentation,
problem resolution, and management), subdivided into a total of 13 different coding categories. To foster epistemic
discourse between students, deVries, Lund, and Baker integrate structures that promote collaboration,
asynchronous communication, dynamic visualizations, socio-cognitive structuring, and awareness heightening tools
into the CONNECT environment. In this environment, students work together in order to produce a piece of text that
explains a puzzling phenomenon through a process of collaboration and negotiation.
Another example of this type of framework is Rainbow. Rainbow, which was developed by Baker,
Andriessen, Lund, van Amelsvoort, and Quignard (submitted) to analyze computer-mediated pedagogical debates,
comprises seven principal analytic categories. The primary focus is on the epistemic nature of the contributions that
students make during collaboration. The framework was developed to allow the researchers to investigate what it
means for participants to achieve conceptually deeper levels of interaction. At the most basic level, the Rainbow
framework distinguishes between assignment-related activity and outside-activity (any interaction that is not
concerned with carrying out the prescribed task). From there, Rainbow differentiates assignment-related activity as
either task-focused or non-task-focused. Non-task-focused activity is categorized as either social relation (interaction
that is concerned with managing students’ social relations with respect to the task) or interaction management
(interaction concerned with managing the interaction itself). Task-focused activity is categorized as task
management (management of the progression of the task itself), opinions (interaction concerned with expressing
opinions with regard to the topic under debate), argumentation (expression of arguments and counterarguments
directly related to a thesis), and explore and deepen (interaction concerned with arguments and counterarguments
linked together, their relations, and the meaning of the arguments themselves including elaboration, definition, and
extension). Baker and colleagues ground the rationale for each of these seven categories carefully in the research on
collaborative learning, task-oriented dialogues, verbal interactions, and argumentation theory.
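To make this hierarchy of distinctions concrete, the sketch below walks an utterance through the nested decisions just described. It is our own illustrative Python rendering, not Baker and colleagues' implementation; the predicate functions are hypothetical placeholders for judgments that a human coder (or a trained classifier) would have to supply.

```python
# Illustrative sketch of the Rainbow decision hierarchy (not the authors' code).
# Each predicate stands in for a coder's judgment about one distinction.

def rainbow_category(utterance,
                     is_assignment_related,
                     is_task_focused,
                     is_social_relation,
                     is_task_management,
                     is_opinion,
                     is_argumentation):
    """Return one of the seven principal Rainbow categories for an utterance."""
    if not is_assignment_related(utterance):
        return "outside activity"
    if not is_task_focused(utterance):
        # Non-task-focused activity concerns either social relations or
        # the management of the interaction itself.
        return "social relation" if is_social_relation(utterance) else "interaction management"
    # Task-focused activity.
    if is_task_management(utterance):
        return "task management"
    if is_opinion(utterance):
        return "opinions"
    if is_argumentation(utterance):
        return "argumentation"
    return "explore and deepen"
```

The analytic work, of course, lies in the judgments behind the predicates; the value of the hierarchy is that it forces those judgments to be made in a fixed order.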
Janssen, Erkens, Jaspers, & Kanselaar (2006) have developed a Dialogue Act coding framework that
focuses on the communicative instead of the epistemic nature of the contributions within a dialog. The framework
first identifies the communicative function of each utterance typed by the students during their online collaboration
and communication. The five main communicative functions include: argumentative (indicating a line of
argumentation or reasoning), responsive (e.g., confirmations, denials, and answers), informative (transfer of
information), elicitative (questions or proposals requiring a response), and imperative (commands). The framework
specifies twenty-nine different dialogue acts within these five main functions. Seven of the twenty-nine focus on
argumentative dialog. Dialogue Acts are recognized by specific ‘discourse markers’ that indicate the communicative
function of the utterance (e.g., the connective ‘because’ signals an argumentative reason). The use of discourse
markers facilitates reliable hand coding, but also opens up the possibility of automatic coding.
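As a rough sketch of how such discourse markers could drive automatic coding, the fragment below tags an utterance with one of the five main communicative functions. The marker lists and the default to "informative" are illustrative assumptions, not the actual rule set used by Janssen and colleagues.

```python
# Illustrative marker-based tagger, loosely inspired by the idea that discourse
# markers (e.g., "because") signal communicative functions. The marker lists
# below are assumptions for demonstration only.

MARKERS = {
    "argumentative": ["because", "therefore", "so ", "for example"],
    "responsive":    ["yes", "i agree", "i disagree", "you're right"],
    "elicitative":   ["?", "what do you think", "could you"],
    "imperative":    ["let's", "you should"],
}

def code_utterance(utterance: str) -> str:
    """Assign one of the five main communicative functions to an utterance."""
    text = utterance.lower()
    for function, markers in MARKERS.items():
        if any(marker in text for marker in markers):
            return function
    return "informative"  # default: plain transfer of information

if __name__ == "__main__":
    print(code_utterance("I disagree; I think all objects in the same room are the same temperature."))
    # -> "responsive" (the marker "i disagree" fires before any other rule)
```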
Analysis of the Sample Argument
From the perspective of deVries, Lund, and Baker’s framework, the example represents desirable epistemic
discourse because all four contributions to the discussion can be characterized as either explanation or
argumentation. As previously mentioned, de Vries, Lund, and Baker suggest that explanation and argumentation are
“potentially powerful mechanisms by which students can collaboratively construct new meaning” (2002, p.64).
Similarly, Janssen, Erkens, Jaspers, and Kanselaar’s Dialogue Act coding framework indicates that the
student example represents an extended sequence of argumentation and is therefore of high quality. The student
example also represents quality argumentation from the perspective of the Rainbow framework because the example
involves conceptual deepening and exploration of the topic.
Constraints and Affordances
Frameworks that examine the nature and function of contributions within the dialog focus, by definition,
on ongoing discourse. They are therefore best suited for coding synchronous or asynchronous forums rather than
than environments focusing on the juxtaposition of a small number of crafted responses or the interpretation of
dialogic artifacts. That said, however, frameworks such as Rainbow can be adapted to other formats as discussed by
Baker and colleagues. Of the three frameworks discussed, de Vries, Lund, and Baker’s framework is noteworthy for
its consideration of the types of discourse moves that students may make; the Rainbow framework is grounded
theoretically and is parsimonious enough to simplify application and analysis. Both focus on the epistemic nature of
task-oriented discourse. Janssen, Erkens, Jaspers, and Kanselaar’s framework focuses on the communicative nature
of task-oriented discourse and offers potential in terms of its automated capabilities, but is inappropriate for judging
the quality of contributions. In sum, these frameworks provide different approaches for researchers interested in
assessing the nature of students’ contributions and the overall effectiveness of online environments designed to
encourage substantive discussions about the knowledge and concepts underlying problem solving. An overview of
the suitability of these three frameworks for assessing argumentation in different contexts is provided in Table 2.
Table 2: Suitability of the analytic frameworks that focus on the nature and function of contributions
Columns (left to right), grouped under Nature of the Argumentation and Medium/Tools Used in the Environment. Subject of the Discussion: well-defined problem with one solution; complex problem with multiple solutions; wicked problems with no right answer. Goal of the Discussion: reach consensus or persuade others; learn more about the topic; develop a solution. Rules for Judging Ideas: empirical; plausibility or logic; moral or ethics. Tools used in the Environment: easily accessible and accessed information; asynchronous communication; representations of subject matter; dynamic visuals of student arguments; socio-cognitive structuring; awareness heightening tools. Each row lists a framework and its ratings across these columns.
deVries, Lund, & Baker (2002): Epistemic Dialog •• ••• ••• •• ••• ••• •• •• •• •• •• •• •• ••
Baker et al. (submitted): Types of Contributions ••• ••• ••• •• ••• •• •• •• •• •• •• •• •• ••
Janssen et al. (2006): Dialogue Acts Scoring •• ••• ••• •• ••• •• •• ••• •• •• •••
Note: ••• indicates that the framework is well suited for use in this context, •• indicates that this framework can be used in this context but
provides no specific affordances, • indicates that the framework may be inappropriate for this type of context without some modification
Analytic Frameworks that Focus on the Nature of Reasoning during
Argumentation in CSCL Environments
Victor Sampson
College of Education, Arizona State University
Analytic frameworks that examine the epistemic nature of students’ reasoning focus on the types of
reasoning students use to support their claims or to challenge the claims of others. Both Jimenez-Aleixandre,
Rodriguez, & Duschl (2000) and Duschl (2000) have developed analytic methods designed to address this question
using Walton’s (1996) argumentation schemes for presumptive reasoning as a theoretical framework. Walton
suggests that dialectical argumentation is grounded in burden of proof, presumption, and plausibility rather than in
structural form alone. Walton details twenty-five different argumentation schemes that focus on how presumptions
are brought forward in arguments as kinds of premises or as kinds of inferences that link premises to conclusions in
a context of argumentative dialog. Examples of these schemes include an argument from evidence to hypothesis
(e.g., the data we gathered indicates…) and an argument from analogy (e.g., this is just like…). The function of
these schemes is to shift the weight of presumption from one side of a dialog to the other. An opposing voice can
then respond with questions or statements that shift the weight of presumption back upon the original participant.
Analysis with this type of framework focuses on categorizing the types of reasoning employed within an argument.
Jimenez-Aleixandre, Rodriguez, & Duschl’s framework applies a standard Toulmin model (e.g., data,
warrants, and qualifiers) to identify instances when students attempt to support their ideas during small group and
whole class discussions. Once these instances are identified, they examine how students elaborate, reinforce, or
oppose the ideas of each other by classifying claims and warrants using epistemic operations based on Walton’s
categories of presumptive reasoning. Analysis then compares the proportion of these instances to the total amount of
dialog and the types of epistemic moves that are most often used during the discussion or debate. More recently,
Duschl (in press) has developed an innovative way to apply Walton’s framework to scientific argumentation in the
classroom. Duschl first narrows Walton’s twenty-five categories down to the nine categories that he and his group found to have
strong relevance to scientific argumentation in the classroom. Distinguishing between even these nine categories,
however, proves difficult in coding students’ work. Duschl and his group therefore collapsed the nine categories into
four categories including requests for information, expert opinion, inference, and analogy. They then apply these
coding categories at the level of the reasoning sequence, which is approximately at the level of each of the students’
comments in our example. Analysis then focuses on the number and proportion of each of these epistemic discourse
types in students’ discussions.
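The analysis this implies is essentially a tallying exercise. The sketch below counts Duschl's four collapsed categories over a set of already-coded reasoning sequences and reports their proportions; the example codes are hypothetical and are not an actual coding of the sample argument.

```python
from collections import Counter

# Duschl's four collapsed categories (as listed above).
CATEGORIES = ["request for information", "expert opinion", "inference", "analogy"]

def epistemic_profile(coded_sequences):
    """Given one category per reasoning sequence, return (count, proportion) per category."""
    counts = Counter(code for code in coded_sequences if code in CATEGORIES)
    total = sum(counts.values()) or 1  # avoid division by zero on empty input
    return {category: (counts[category], counts[category] / total) for category in CATEGORIES}

# Hypothetical codes assigned by a rater, for illustration only.
print(epistemic_profile(["inference", "inference", "analogy", "request for information"]))
```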
Analysis of the Sample Argument
The potential benefits of examining the epistemic nature of contributions to a discussion or debate become
evident when the student example is analyzed using Jimenez-Aleixandre et al. and Duschl’s frameworks. Rather
than simply documenting that the students are making claims and supporting their ideas with data, warrants, or
qualifiers, these frameworks enable us to identify the nature of their reasoning. For example, Jimenez-Aleixandre et
al.’s framework suggests that these students are attempting to justify their ideas with reasons that focus on causality,
consistency, and appeals to instances rather than relying on plausibility or appeals to authority. Similarly, Duschl’s
framework suggests that the students are relying on desirable epistemic moves such as inferences from evidence to
hypothesis (the metal was a higher temperature after 30 seconds) and inferences from cause to effect (conductivity
determines how quickly an object will reach room temperature) in order to support or refute an idea. The student
example therefore represents fairly high quality argumentation from the perspective of these frameworks.
Constraints and Affordances
Frameworks that focus on the epistemic nature of reasoning are designed to provide valuable information
about how students determine ‘what counts’ as warranted knowledge and how students determine which ideas
should be accepted, rejected, or modified. Rather than assessing conceptual quality of students’ contributions, this
focal category revolves around the types of reasoning that students use when they propose, support, evaluate, and
challenge ideas. In terms of specific affordances and constraints, Jimenez-Aleixandre, Rodriguez, & Duschl’s
framework is valuable because it integrates an assessment of reasoning type with structural quality. In practice,
however, differentiating between students’ epistemic operations can prove difficult, but this framework’s
consideration of the nature of students’ reasoning and argumentation structure may prove particularly fruitful for
those interested in scaffolding students as they engage in argumentation. Duschl’s framework, in turn, is noteworthy
for its distillation and synthesis of Walton’s framework into a manageable discipline-specific coding scheme.
Overall, these frameworks (and this categorical focus for analysis) apply well for those interested in
helping students to improve their discourse skills, reasoning, and ability to evaluate arguments by helping students
learn specific discourse goals (e.g., securing commitments from an opponent or undermining the opponent’s
argument) and effective strategies to help them meet these goals (e.g., justifying claims with evidence, requiring
opponents to justify their claims with evidence). These frameworks also are applicable to almost any type of
environment structure because they focus on a core attribute of all argumentation. Generally speaking, they rely on
frequency counts, so they are better suited to environments supporting free-flowing dialog, such as asynchronous and
synchronous discussions, than to the micro-analysis of smaller segments. One advantage of this categorical focus,
however, involves the relative content independence afforded in comparison to frameworks focusing specifically on
the conceptual quality of ideas. Frameworks focusing on the epistemic nature of reasoning therefore require little
modification when applying them across related topic areas. An overview of the suitability of these two frameworks
for assessing argumentation in different contexts is provided in Table 3.
Table 3: Suitability of the analytic frameworks that focus on the nature of reasoning
Columns (left to right), grouped under Nature of the Argumentation and Medium/Tools Used in the Environment. Subject of the Discussion: well-defined problem with one solution; complex problem with multiple solutions; wicked problems with no right answer. Goal of the Discussion: reach consensus or persuade others; learn more about the topic; develop a solution. Rules for Judging Ideas: empirical; plausibility or logic; moral or ethics. Tools used in the Environment: easily accessible and accessed information; computer mediated communication; representations of subject matter; dynamic visuals of student arguments; socio-cognitive structuring; awareness heightening tools. Each row lists a framework and its ratings across these columns.
Jimenez-Aleixandre et al. (2000): Structure and Nature of Reasoning •• •• •• •• •• •• ••• ••• •• •• •• ••
Duschl (in press): Application of Walton to Dialogic Argumentation •• •• •• •• •• •• ••• ••• •• •• •• ••
Analytic Frameworks that Focus on Conceptual Quality in CSCL Environments
Douglas Clark
College of Education, Arizona State University
Analytic frameworks that focus on conceptual quality examine the content or substance of the contributions
that are made during a discussion. Clark and Sampson’s framework (2005), for example, focuses on analyzing the
relationships between levels of opposition that take place during discourse episodes and the nature, conceptual
quality, and grounds quality of constituent student contributions. Kuhn and Udell’s (2003) framework, on the other
hand, focuses on the logical coherence and the relevance of the arguments generated by students as a way to
measure the conceptual quality of the ideas proposed by students. The content component is domain-specific,
involving specified hierarchical sets of arguments for (pro) and against (con) the topic being debated (which is
capital punishment in their study). The lowest level comprises Nonjustificatory Arguments, which have little or no
argumentative force. The middle tier comprises Nonfunctional Arguments, which focus on tangential aspects of the
problem rather than core issues. At the highest level, Functional Arguments address core aspects of the problem.
This type of focus is especially well-suited for online environments where students are encouraged to debate and
discuss issues without clear “right” or “wrong” answers (such as capital punishment). In addition to these dialogic-
oriented frameworks, excellent rhetorical-oriented frameworks by Sandoval and others exist.
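One simple way to operationalize Kuhn and Udell's three-tier content hierarchy is as an ordinal scale over rater-assigned labels, as in the sketch below; the numeric values and the averaging are our own illustration rather than Kuhn and Udell's scoring procedure.

```python
# Ordinal reading of Kuhn and Udell's argument tiers (values are illustrative).
TIERS = {
    "nonjustificatory": 0,  # little or no argumentative force
    "nonfunctional": 1,     # addresses tangential aspects of the problem
    "functional": 2,        # addresses core aspects of the problem
}

def mean_tier(coded_arguments):
    """Average tier level of a list of rater-assigned tier labels."""
    return sum(TIERS[label] for label in coded_arguments) / len(coded_arguments)

print(mean_tier(["functional", "nonfunctional", "functional"]))  # about 1.67
```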
Analysis of the Sample Argument
The application of Clark and Sampson’s framework to the example of argumentation indicates the
discourse is oppositional in nature because it involves a distinct rebuttal against the grounds of an idea as well as a
rebuttal against the thesis of an idea. However, in terms of conceptual quality the argumentation is considered poor
because the students reach an inaccurate conclusion. Moreover, this episode illustrates how students can distort
evidence to match claims. In this example, Fran convinces Amy to abandon her normative idea that objects sitting in
the same room are in thermal equilibrium by providing inappropriate evidence in support of a non-normative idea.
From the perspective of Kuhn and Udell’s framework, we would view the example as exceedingly short but
representing quality argumentation. The arguments presented by Fran and Amy are functional in terms of conceptual
quality, which indicates that these students address key aspects of the problem. Moreover, the discourse moves used
by the students in this example heavily emphasize argumentative moves (e.g., challenging the ideas of others) rather
than exposition (e.g., proposing or clarifying one’s own ideas).
Constraints and Affordances
Overall, the analytic frameworks that focus on conceptual quality are well suited for online environments
for those interested in the relationship between argumentation and learning. For example, when the pedagogical goal
of an online environment is to help students learn how to engage in argumentation (e.g., proposing, justifying, and
challenging ideas), the analytic framework can focus on the structure of students’ contributions to the discussion and
still be sufficient. However, if the goal of the online environment is to provide an opportunity for students to learn
from argumentation (e.g., develop a more in-depth understanding of the content that is being discussed), the analytic
framework must also be able to examine the normative quality of students’ ideas in order to assess the overall
effectiveness of the environment. In choosing an analytic framework, researchers must determine the importance of
the relationship between the normativity of a comment and the relative time of its contribution. Non-normative
content at the onset of dialog followed by increasing normativity by the conclusion of the dialog might represent
something entirely different than the reverse trajectory. Kuhn and Udell address the temporal issue by measuring the
normativity of students’ arguments before and after the dialog, for example, but do not examine the trajectories
within the dialog itself.
A focus on conceptual quality of contributions or products fits well with environments that include easily
accessible and indexed knowledge bases and enriched representations of focal subject matter because these types of
functionalities are often integrated into online environments designed to help students achieve specific content
learning goals that are associated with the databases and enriched representations. In addition, environments that
integrate asynchronous communication and awareness heightening tools can also benefit from this type of focus. By
examining the content of student ideas and how students interact with each other, researchers can better support
students as they attempt to negotiate meaning or validate ideas in online environments. One challenge, however, is
that rubrics with a focus on normativity become very topic-specific and thus require significant modification for
application across contexts. An overview of the suitability of these two frameworks for assessing argumentation in
different contexts is provided in Table 4.
Table 4: Suitability of the analytic frameworks that focus on conceptual quality
Columns (left to right), grouped under Nature of the Argumentation and Medium/Tools Used in the Environment. Subject of the Discussion: well-defined problem with one solution; complex problem with multiple solutions; wicked problems with no right answer. Goal of the Discussion: reach consensus or persuade others; learn more about the topic; develop a solution. Rules for Judging Ideas: empirical; plausibility or logic; moral or ethics. Tools used in the Environment: easily accessible and accessed information; computer mediated communication; representations of subject matter; dynamic visuals of student arguments; socio-cognitive structuring; awareness heightening tools. Each row lists a framework and its ratings across these columns.
Clark & Sampson (2005): Conceptual Quality of Comments ••• ••• •• ••• ••• ••• ••• ••
Kuhn & Udell (2003): Argumentation Quality and Types of Comments •• ••• ••• ••• •• ••• •• ••• ••• ••
Analytic Frameworks that Focus on Patterns and Trajectories of Participant
Interaction during Argumentation in CSCL Environments
Armin Weinberger
Knowledge Media Research Center (KMRC), Tübingen
Analytic frameworks focusing on patterns and trajectories of participant interaction consider argumentation
as a primarily social activity. Examples of frameworks with that focus are Leitão (2000), Hogan, Nastasi, and
Pressley (2000), Baker (2003), and Weinberger and Fischer (2006). Leitão (2000) considers a specific sequence of
argumentation to be particularly fruitful for knowledge building. Based on Piaget’s (1985) work and his idea of
socio-cognitive conflict, Leitão envisions argumentation as a social activity in which students confront each other
with opposing views and build knowledge by resolving this conflict in a specific manner. In what Leitão calls a
knowledge building cycle, students (1) construct an argument, which consists of a position and its justification, (2)
construct a counterargument in response to the first argument, and (3) create a reply that captures the participants’
immediate and secondary reactions to the counterargument. Through these patterns of argumentation, the initial
arguments may be preserved, revised or withdrawn. Leitão argues that these patterns of argumentation optimally
shape the process of social knowledge construction.
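A schematic check for Leitão's knowledge-building cycle over a sequence of rater-coded dialogue moves might look like the sketch below; the move labels, including the "concession" label for the final turn, are our own illustrative coding rather than Leitão's procedure.

```python
# Count complete knowledge-building cycles (argument -> counterargument -> reply)
# in a sequence of coded dialogue moves. Codes are assumed to be rater-assigned.

CYCLE = ("argument", "counterargument", "reply")

def count_cycles(moves):
    """Count non-overlapping argument/counterargument/reply triples."""
    count, i = 0, 0
    while i <= len(moves) - 3:
        if tuple(moves[i:i + 3]) == CYCLE:
            count += 1
            i += 3
        else:
            i += 1
    return count

# The sample argument read as four moves (Fran, Amy, Fran, Amy):
sample = ["argument", "counterargument", "reply", "concession"]
print(count_cycles(sample))  # -> 1 complete knowledge-building cycle
```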
Hogan, Nastasi, and Pressley’s (2000) framework examines discourse components, interaction patterns, and
reasoning complexity. The framework focuses on (1) how students work to improve weak or incomplete ideas, (2)
the patterns of verbal interactions that take place between individuals in scientific sense-making activities, and (3)
the relationships between discourse patterns and the sophistication of scientific reasoning in discussions. Analysis
begins with the assignment of macro-codes to the major modes of a group’s discussion at the level of conversational
turns. Macro-codes include Knowledge Construction, Logistical, and Off-Task. Micro-codes are then assigned at the
level of statement or phrase, including Conceptual, Metacognitive, Question-Query, Nonsubstantive, and Other.
Micro-codes include multiple subcategories. Researchers then create discourse maps illustrating the patterns of
interactions between students based on these codes. Patterns of interaction include consensual (where a student
proposes an idea and another student agrees), responsive (where a student asks a question and another student
answers), and elaborative (where students discuss and revise each other’s ideas). Researchers next assess reasoning
complexity and compare this information to the interactional patterns.
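The discourse-map step can be pictured as a small labeled graph of who responds to whom. The sketch below uses our own data structure and a hypothetical labeling of the sample argument; it is not Hogan, Nastasi, and Pressley's instrument.

```python
# A minimal discourse map: each edge links a turn to the turn it responds to,
# labeled with a rater-assigned interaction pattern (consensual, responsive, elaborative).

discourse_map = [
    # (responding speaker, responded-to speaker, interaction pattern)
    ("Amy", "Fran", "elaborative"),  # Amy challenges and revises Fran's idea
    ("Fran", "Amy", "elaborative"),  # Fran defends and extends her position
    ("Amy", "Fran", "consensual"),   # Amy agrees with Fran's proposal
]

def pattern_counts(edges):
    """Tally how often each interaction pattern occurs in the map."""
    counts = {}
    for _, _, pattern in edges:
        counts[pattern] = counts.get(pattern, 0) + 1
    return counts

print(pattern_counts(discourse_map))  # {'elaborative': 2, 'consensual': 1}
```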
Baker’s framework examines the standpoints adopted by individuals during argumentation, how ideas
change over time, and the pragmatic function of language. The framework focuses on argumentation as a way to
facilitate collaborative learning. According to the framework, argumentation transforms the epistemic status of
solutions by establishing relations between the proposed solutions and other knowledge or by promoting the
negotiation of new meaning. The epistemic status indicates to what extent solutions are being approved. Arguments
strengthen the epistemic status of a solution. Counter-arguments weaken the epistemic status of a solution. As a
discursive activity, argumentation establishes relations between possible solutions and other sources of knowledge.
As a dialogic activity, argumentation incorporates aspects of formal and pragmatic dialectics. Through the analyses,
this framework measures the strengthening and weakening of the epistemic status of various claims as well as the
progression of dialectic moves.
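A toy bookkeeping of this idea, in which arguments raise and counter-arguments lower the epistemic status of a candidate solution, is sketched below; the +1/-1 scoring is our own simplification and not Baker's analysis procedure.

```python
from collections import defaultdict

def epistemic_trajectory(moves):
    """moves: list of (solution_id, move_type) pairs, where move_type is
    'argument' or 'counterargument'. Returns the status of every solution
    after each move."""
    status = defaultdict(int)
    trajectory = []
    for solution, move_type in moves:
        status[solution] += 1 if move_type == "argument" else -1
        trajectory.append(dict(status))
    return trajectory

# Hypothetical reading of the sample: A = "objects remain different temperatures",
# C = "objects in the same room reach the same temperature".
sample = [("A", "argument"),         # Fran proposes and supports idea A
          ("A", "counterargument"),  # Amy challenges A while proposing C
          ("C", "counterargument")]  # Fran rebuts C with her metal/plastic example
for step in epistemic_trajectory(sample):
    print(step)
```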
Weinberger and Fischer’s (2006) framework examines the process through which knowledge is constructed
as students engage in argumentation in online environments. Their framework assesses argumentation along four
independent dimensions. The participation dimension analyzes the amount of participation by each student and the
heterogeneity of participation within the learning group. The epistemic dimension identifies how and which
theoretical concepts students use in their argumentation and evaluates them in terms of the environment’s learning goals. On the
formal argumentative dimension, Weinberger and Fischer analyze the construction of single arguments through a
simplified version of Toulmin’s scheme (1958) as well as through the argumentation sequences outlined in Leitão’s
(2000) work. Finally, on the dimension of social modes of co-construction, Weinberger and Fischer analyze the
transactivity of students’ arguments (Teasley, 1997), i.e. to what extent students refer to the arguments and operate
on the reasoning of their learning partners. Different ways to build consensus correspond with different degrees of
transactivity. Students can establish consensus by agreeing with the ideas proposed by their peers (relatively low
transactivity), integrating peers’ arguments into their own line of argumentation (relatively high transactivity), or by
engaging in a conflict-oriented negotiation of different perspectives (relatively high transactivity).
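The participation dimension, at least, lends itself to a simple quantitative sketch: count contributions per learner and summarize how evenly they are distributed. The coefficient of variation used below is our own choice of heterogeneity measure, not necessarily the statistic Weinberger and Fischer report.

```python
from collections import Counter
from statistics import mean, pstdev

def participation_summary(messages):
    """messages: list of (author, text) pairs from an online discussion.
    Returns per-author counts and a heterogeneity index (0.0 = perfectly even)."""
    counts = Counter(author for author, _ in messages)
    values = list(counts.values())
    heterogeneity = pstdev(values) / mean(values) if len(values) > 1 else 0.0
    return counts, heterogeneity

sample = [("Fran", "..."), ("Amy", "..."), ("Fran", "..."), ("Amy", "...")]
print(participation_summary(sample))  # (Counter({'Fran': 2, 'Amy': 2}), 0.0)
```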
Analysis of the Sample Argument
From the perspective of Leitão’s framework, our student example represents a complete knowledge
building cycle. The episode begins with Fran contributing her initial argument. Amy then counters by bringing the
truth of the claim into question. Fran replies by dismissing Amy’s counterargument, which enables Fran to preserve
her initial viewpoint. In this case, Amy accepts Fran’s ideas and withdraws her initial viewpoint. From Leitão’s
perspective, both this type of outcome and outcomes that result in a revised argument represent successful outcomes
of argumentation. Hogan, Nastasi, and Pressley’s framework would describe the sample argument as an elaborative
interaction pattern. They suggest that elaborative interaction patterns are characteristic of quality argumentation
because they prolong discussions and lead to higher levels of reasoning. Although there is no elaboration present,
the student example’s macro-code represents Knowledge Construction from the perspective of this framework. The
example also represents fairly high quality argumentation from the perspective of Baker’s framework. Although
brief, the discourse changes the epistemic status of Idea A (objects remain different temperatures) and Idea C
(objects become the same temperature) which indicates productive argumentation. Applying Weinberger and
Fischer’s framework shows that the learners participate homogeneously (participation dimension). With respect to
the epistemic dimension, both Fran and Amy engage in on-task talk and construct relations between the target
conceptual space (rather than prior knowledge) and the problem space. However, some of the concepts are being
applied inadequately. On the formal argumentative dimension, Amy and Fran build relatively complete arguments
and argumentation sequences. Finally, on the social modes of co-construction dimension, Amy and Fran clearly
engage in conflict-oriented consensus building as they refer to each other’s contributions and attempt to negotiate
meaning.
Constraints and Affordances
This analytic category increases the unit of analysis from an individual comment or fragment to an entire
knowledge building cycle. As such it allows us to focus on the actual processes of co-construction of knowledge
rather than focusing on frequency counts of elements that correlate to desirable interaction. Leitão, for example,
emphasizes the social nature of knowledge building as opposed to online contexts in which students hardly interact
with the activities of their learning partners (e.g., by composing elaborate, essay-like replies in discussion boards).
This approach thus emphasizes the coherence of argumentative talk between students. One interesting dichotomy,
however, involves the presence or absence of a pedagogical goal state within the framework to inform the
development of practice. In other words, does the framework provide a road map for instruction in terms of
desirable student practice? For example, Baker’s analytic framework provides ways to track the evolution and
change in status of the ideas discussed by students and how (or if) they are challenged, but the framework provides
us less concrete guidance for instruction. What do we want students to know or to be able to do? Other frameworks
are more prescriptive in this regard. Weinberger and Fischer (2006) have applied different kinds of computer-
supported collaboration scripts to successfully facilitate learners’ interaction with respect to the single dimensions of
their framework. Their line of research indicates that scripts facilitating the transactivity of learners in
CSCL environments, in particular, have also facilitated individual knowledge acquisition (Weinberger, Stegmann, Fischer, &
Mandl, in press).
This type of analytic focus may be applied across most collaborative online argumentation environments
independent of environment structure or the nature of the artifacts created, because this analysis can focus at
microgenetic scales as well as broad scales. Increased complexity of application accompanies this increased power,
however. The challenge of this analytic category manifests itself in terms of increased amount and complexity of
work required to reliably apply these types of analyses across larger samples. An overview of the suitability of these
four frameworks for assessing argumentation in different contexts is provided in Table 5.
Table 5: Suitability of the analytic frameworks that focus on patterns and trajectories of participant interaction
Columns (left to right), grouped under Nature of the Argumentation and Medium/Tools Used in the Environment. Subject of the Discussion: well-defined problem with one solution; complex problem with multiple solutions; wicked problems with no right answer. Goal of the Discussion: reach consensus or persuade others; learn more about the topic; develop a solution. Rules for Judging Ideas: empirical; plausibility or logic; moral or ethics. Tools used in the Environment: easily accessible and accessed information; computer mediated communication; representations of subject matter; dynamic visuals of student arguments; socio-cognitive structuring; awareness heightening tools. Each row lists a framework and its ratings across these columns.
Leitão (2000): Knowledge Building •• ••• ••• ••• ••• ••• •• •• •• •• ••• ••
Hogan et al. (2000): Interactional Patterns ••• ••• •• ••• •• •• •• •• •• •• ••• ••
Baker (2003): How Ideas Change •• ••• ••• ••• ••• •• •• •• •• ••• ••
Weinberger & Fischer (2006): Co-construction of Knowledge •• ••• •• ••• ••• ••• •• •• ••• •• •••
Synthesis
In this symposium we consider several frameworks for analyzing dialogic argumentation in online learning
environments. These analytic frameworks vary significantly in terms of their focus and affordances. (Each presenter
in our symposium will go into greater detail about each focal category.) Although most of the frameworks discussed
here would assess the student example as representing fairly desirable argumentative discourse, they each do so for
very different reasons. In building online environments to support argumentation, researchers therefore need to be
clear and specific in terms of their theoretical commitments about argumentation and the pedagogical goals they
wish to foster (and concomitantly measure) through the environment. These decisions are foundational in the
subsequent adoption or development of an appropriate analytic framework.
Another issue that becomes apparent when reviewing these frameworks involves the potential to
synergistically integrate multiple categories of analytic focus within a single framework. Although each paper in this
symposium examines a single focal category, all of the frameworks consider additional foci beyond their focal
categories. By coordinating the analyses of multiple categories simultaneously, we can potentially learn more about
students’ performance in terms of each individual category. Integrating other analyses within the analysis of the
patterns and trajectories of participant interaction seems the most promising. Most of the other categories of analytic
focus rely on frequency counts of various components as correlational markers for argumentation quality. Careful
tracking of participant interaction and the evolution of ideas would align our analyses more directly, and therefore
potentially more validly, with the processes of argumentation we wish to foster. The challenge, of course, rests in
the increased accompanying complexity of conducting such analyses.
Online learning environments offer strong affordances for grappling with these challenges and realizing
these gains. Online learning environments incorporate the potential to closely log students’ actions and interactions.
As we develop technologies to more carefully track and analyze student data, we will have the capability to track
interactions and quality more accurately in real time. Based on this information, we could then modify supports for
argumentation in real time. Dönmez, Rosé, Stegmann, Weinberger, and Fischer (2005) have made early progress in
this regard by harnessing latent text analysis technology to score the quality of students’ argumentation products.
Similarly, the Multiple Episode Protocol Analysis (MEPA) system (Janssen, Erkens, Jaspers, & Kanselaar, 2006) can score
extended dialogs and messages using a complex rules system instantaneously. In both of these examples, analyses
were not conducted in real time, but the potential is staggering. As we develop more sophisticated methods for
analyzing argumentation, we should therefore continue to monitor the possibilities for embedding these analytic
methods directly as real time functionality within online learning environments. These analytic models would
therefore not only improve our research capabilities but also facilitate higher levels of interactivity and customized
scaffolding for students engaging in argumentation in our schools. The discussion at the conclusion of our
symposium will also consider the implications of the frameworks beyond research in terms of these other
applications.
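As a closing sketch of what such embedded, real-time functionality might look like, the loop below codes each incoming message and triggers a scaffold under a simple adaptive rule. The coder and the scaffolding hook are placeholders standing in for environment-specific components, not the systems cited above.

```python
# Hypothetical real-time monitoring loop for an online discussion environment.
# `code_utterance` could be any automated coder (e.g., a marker-based tagger);
# `prompt_for_justification` stands in for an environment-specific scaffold.

def monitor_discussion(message_stream, code_utterance, prompt_for_justification):
    history = []
    for author, text in message_stream:
        code = code_utterance(text)
        history.append((author, code))
        # Example adaptive rule: if none of the last three turns contains an
        # argumentative move, prompt the current author to justify the claim.
        recent_codes = [c for _, c in history[-3:]]
        if "argumentative" not in recent_codes:
            prompt_for_justification(author)
    return history
```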
References
Andriessen, J., Baker, M., & Suthers, D. (Eds.). (2003). Arguing to learn: Confronting cognitions in computer-
supported collaborative learning environments. Dordrecht: Kluwer.
Baker, M. (2003). Computer-mediated argumentative interactions for the co-elaboration of scientific notions. In J.
Andriessen, M. Baker & D. Suthers (Eds.), Arguing to learn: Confronting cognitions in computer-
supported collaborative learning environments (pp. 47-78). The Netherlands: Kluwer Academic Publishers.
Baker, M., Andriessen, J., Lund, K., van Amelsvoort, M., & Quignard, M. (submitted). Rainbow: A framework for
analyzing computer-mediated pedagogical debates. International Journal of Computer Supported
Collaborative Learning.
Clark, D. B., & Sampson, V. D. (2005). Analyzing the quality of argumentation supported by personally-seeded
discussions. Paper presented at the international conference on Computer Support for Collaborative
Learning (CSCL '05), Taipei, Taiwan.
deVries, E., Lund, K., & Baker, M. (2002). Computer-mediated epistemic dialogue: Explanation and argumentation
as vehicles for understanding scientific notions. Journal of the Learning Sciences, 11(1), 63-103.
Dönmez, P., Rosé, C. P., Stegmann, K., Weinberger, A., & Fischer, F. (2005). Supporting CSCL with automatic
corpus analysis technology. Paper presented at the International Conference on Computer Supported
Collaborative Learning - CSCL 2005, Taipei, TW.
Hogan, K., Nastasi, B. K., & Pressley, M. (2000). Discourse patterns and collaborative scientific reasoning in peer
and teacher-guided discussions. Cognition and Instruction, 17(4), 379-432.
Janssen, J., Erkens, G., Jaspers, J., & Kanselaar, G. (2006, June/July). Visualizing participation to facilitate
argumentation. Paper presented at the 7th International Conference of the Learning Sciences, Bloomington,
IN.
Jiménez-Aleixandre, M. P., Rodríguez, A. B., & Duschl, R. A. (2000). "Doing the lesson" or "doing science":
Argument in high school genetics. Science Education, 84, 757-792.
Kuhn, D., & Udell, W. (2003). The development of argument skills. Child Development, 74(5), 1245-1260.
Leitão, S. (2000). The potential of argument in knowledge building. Human Development, 43, 332-360.
Piaget, J. (1985). The equilibrium of cognitive structures: The central problem of intellectual development. Chicago:
University of Chicago Press.
Teasley, S. (1997). Talking about reasoning: How important is the peer in peer collaboration? In L. B. Resnick, R.
Säljö, C. Pontecorvo & B. Burge (Eds.), Discourse, tools and reasoning: Essays on situated cognition (pp.
361-384). Berlin: Springer.
Toulmin, S. (1958). The uses of argument. Cambridge: Cambridge University Press.
Walton, D. N. (1996). Argument structure: A pragmatic theory. Toronto: University of Toronto Press.
Weinberger, A., & Fischer, F. (2006). A framework to analyze argumentative knowledge construction in computer-
supported collaborative learning. Computers & Education, 46, 71-95.
Weinberger, A., Stegmann, K., Fischer, F., & Mandl, H. (in press). Scripting argumentative knowledge construction
in computer-supported learning environments. In F. Fischer, H. Mandl, J. Haake & I. Kollar (Eds.),
Scripting computer-supported communication of knowledge - cognitive, computational and educational
perspectives.