PSYCHOLOGY OF LANGUAGE LEARNING AND TEACHING: 11
Student Engagement in the Language Classroom
Edited by Phil Hiver, Ali H. Al-Hoorie and Sarah Mercer
MULTILINGUAL MATTERS
Bristol • Blue Ridge Summit
Library of Congress Cataloging in Publication Data
A catalog record for this book is available from the Library of Congress.
Names: Hiver, Phil, 1984- editor. | Al-Hoorie, Ali H., 1982- editor. |
Mercer, Sarah, editor.
Title: Student Engagement in the Language Classroom/Edited by Phil Hiver,
Ali H. Al-Hoorie and Sarah Mercer.
Description: Bristol, UK; Blue Ridge Summit, PA : Multilingual Matters,
2021. | Series: Psychology of Language Learning and Teaching: 11 |
Includes bibliographical references and index. | Summary: “Through a mix of
conceptual and empirical chapters, this book defines engagement for the field of
language learning. It serves as an authoritative guide for anyone wishing to
understand the unique insights engagement can give into language learning and
teaching, or anyone conducting their own research on engagement within and
beyond the classroom”—Provided by publisher.
Identifiers: LCCN 2020033226 (print) | LCCN 2020033227 (ebook) | ISBN
9781788923590 (paperback) | ISBN 9781788923606 (hardback) | ISBN
9781788923613 (pdf) | ISBN 9781788923620 (epub) | ISBN 9781788923637 (kindle
edition)
Subjects: LCSH: Language and languages—Study and teaching—Psychological
aspects. | Second language acquisition—Psychological aspects. |
Motivation in education.
Classification: LCC P53.7 .S78 2021 (print) | LCC P53.7 (ebook) | DDC
418.0071—dc23 LC record available at https://lccn.loc.gov/2020033226
LC ebook record available at https://lccn.loc.gov/2020033227
British Library Cataloguing in Publication Data
A catalogue entry for this book is available from the British Library.
ISBN-13: 978-1-78892-360-6 (hbk)
ISBN-13: 978-1-78892-359-0 (pbk)
Multilingual Matters
UK: St Nicholas House, 31-34 High Street, Bristol BS1 2AW, UK.
USA: NBN, Blue Ridge Summit, PA, USA.
Website: www.multilingual-matters.com
Twitter: Multi_Ling_Mat
Facebook: https://www.facebook.com/multilingualmatters
Blog: www.channelviewpublications.wordpress.com
Copyright © 2021 Phil Hiver, Ali H. Al-Hoorie, Sarah Mercer and the authors of
individual chapters.
All rights reserved. No part of this work may be reproduced in any form or by any
means without permission in writing from the publisher.
The policy of Multilingual Matters/Channel View Publications is to use papers that
are natural, renewable and recyclable products, made from wood grown in
sustainable forests. In the manufacturing process of our books, and to further
support our policy, preference is given to printers that have FSC and PEFC Chain of
Custody certification. The FSC and/or PEFC logos will appear on those books
where full certification has been granted to the printer concerned.
Typeset by Nova Techset Private Limited, Bengaluru and Chennai, India.
Printed and bound in the UK by the CPI Books Group Ltd.
Printed and bound in the US by NBN.
5 Measuring L2 Engagement: A Review of Issues and Applications
Shiyao (Ashlee) Zhou, Phil Hiver and Ali H. Al-Hoorie
Introduction
Engagement is considered a ‘new kid on the block’ (Reschly &
Christenson, 2012: 4), particularly compared with other more mature and
established constructs such as motivation. Yet, despite the relatively short
history of engagement research, it has seen an exponential increase in
popularity (Sinatra et al., 2015). Many stakeholders would agree that
engagement is a leading indicator of performance and ultimate attain-
ment, and thus a key contribution classroom instruction can make to stu-
dents’ ultimate learning is enhancing their engagement (Mercer, 2019;
Philp & Duchesne, 2016).
Engagement has achieved this popularity with researchers, policy-
makers and practitioners for several reasons. First, engagement plays a
critical role in educational outcomes and in learning success (Hattie,
2009). Second, the nature of engagement as a ‘meta-construct’ combining
observable behaviors, internal cognitions, emotions and sociocultural
interactions (Fredricks et al., 2016) makes it appealing to many scholars.
Third, practitioners seem to both recognize and readily grasp the phe-
nomenological manifestations of engagement and disengagement, given
their clear behavioral dimensions (although see Chapter 8 by Mercer
et al., this volume). Finally, its potential as a target for interventions
remains strong. The idea that engagement may be malleable and respon-
sive to intervention draws attention from all sides as evidence builds for
promoting engagement across social and academic contexts (Appleton
et al., 2008).
In educational psychology, studies on engagement have centered
around four broad contexts: community, school, classrooms and learning
activity (Skinner & Pitzer, 2012). In the community, engagement concerns
learners’ degree of participation and active membership in school and
other community organizations. At the school level, outcomes of engage-
ment are routinely measured through attendance, dropout or retention
rates (Finn, 1989). In foreign and second language (L2) classroom set-
tings, relevant indicators of engagement are associated with interaction,
involvement or participation in class and with outcomes related to lan-
guage use and development (Philp & Duchesne, 2016). Within a learning
activity, engagement refers to the quality and intensity of learners’ contri-
bution to completing a specific task during class time. Because engage-
ment can occur in these and other settings, a point further illustrated by
the various empirical chapters of this volume, definitions and operation-
alizations of engagement are rich and varied (Reschly & Christenson,
2012). Consensus in the academic literature is that student engagement is
a multifaceted construct with multiple, at times interwoven, dimensions
(see also Chapter 2 by Sang & Hiver, this volume). Empirical studies of
engagement might focus on the cognitive, emotional, social or behavioral
facets that lead to effective learning (Philp & Duchesne, 2016).
Additionally, because of the important role engagement appears to play in
the student learning process across contexts and within numerous learn-
ing subdomains, the need for reliable, valid and domain-specific measures
of student engagement is imperative (Anderson, 2017).
In this chapter, our objective is to explore the past, present and future
of measuring the construct of engagement. We first introduce some of the
more prominent approaches to measuring student engagement from gen-
eral education, including student self-report, experience sampling, teacher
ratings of students, interviews and observations (Hofkens & Ruzek,
2019). We describe how each approach has been applied to measuring
engagement, examine their validity and reliability and discuss the
strengths and weaknesses of each measurement approach for L2 research-
ers. We also examine several widely used self-report measures in student
engagement research with reference to their operational definitions, use,
samples and psychometric properties. We elaborate on considerations
related to the measurement of engagement in L2 learning, such as the dif-
ferentiation between L2 engagement and related constructs, the variety of
purposes for measuring L2 engagement, and measuring general versus
domain-specific L2 engagement (e.g. task- and skill-specific engagement).
Finally, we summarize the limitations of currently available instruments
for eliciting engagement data and discuss directions for future develop-
ment in the field.
Defining the Meta-construct of Engagement
Engagement is action (‘energy in action’ according to Lawson &
Lawson, 2013: 435). Previous work has posited that engagement is mani-
fested not only in its behavioral facet (e.g. active participation), but also
in demonstrations of action through the cognitive (e.g. tracking a speaker
for attention) and social dimensions (e.g. back channeling in interaction),
as well as in students’ emotional responses to learning activities and sub-
jective perceptions (Baralt et al., 2016; Henry & Thorsen, 2018; Lambert
et al., 2017).
Cognitive engagement refers to mental processes such as the deliberate
allocation and maintenance of attention and intellectual effort (Helme &
Clarke, 2001). This cognitive dimension also implicates the active use of
relevant self-regulated strategies that facilitate these mental processes
(Philp & Duchesne, 2016). In L2 classroom settings, research on cognitive
engagement has focused primarily on verbal manifestations, including
peer interactions, asking questions, volunteering answers, exchanging
ideas, offering feedback, providing direction, informing and explaining
(Helme & Clarke, 2001). Non-verbal communication, private speech and
exploratory talk (i.e. discourse that occurs as learners attempt to
make sense of learning) are also seen by some as further indicators of this
dimension (see e.g. Mercer & Hodgkinson, 2008).
Behavioral engagement corresponds with the amount and quality of
learners’ in-class participation and time spent on task (Reschly &
Christenson, 2012). For their part, Philp and Duchesne (2016) propose
that learners’ degree of eort, persistence and active involvement are lead-
ing indicators of behavioral engagement. Action is key, and the degree and
quality of time students spend in active participation repeatedly appear
as a positive predictor of academic achievement (Gettinger & Walter,
2012). Of course, even with this clear link between behavioral engagement
and desired learning outcomes, the true potential of engagement lies in the
interaction of its different facets, not in any one dimension in isolation (see
also Chapter 6 by Sulis & Philp, this volume; Chapter 7 by Carver et al.,
this volume).
Definitions of emotional engagement vary with the focus of research,
from school level to specific learning activity (Skinner & Pitzer, 2012). In
most instances, emotional engagement refers to the affective character of
learners' involvement (see also Chapter 9 by Phung et al., this volume).
Enthusiasm, interest and enjoyment – essentially markers of one's affective
involvement during class time – have been identified as critical indicators
of emotional engagement in the classroom (Skinner et al., 2009). Perhaps
unsurprisingly, these emotions – both positive (e.g. enthusiasm, interest)
and negative (e.g. anxiety, hopelessness) – that are elicited by the learning
context, by peers and by instructional tasks and activities are assumed to
play a key role in learners' effort (Pekrun & Linnenbrink-Garcia, 2012).
Although the social dimension is not included in all models of engage-
ment, an increasing number of scholars agree that social interactions play
an essential role in the types of engagement that foster student learning
(Fredricks et al., 2016). This aspect of engagement is defined in light of the
social forms of activity and involvement that are prevalent in communities
of language learning and use (see also Chapter 10 by Fukuda et al., this
volume), including participation and interaction with interlocutors, and
the quality of such social interactions (Linnenbrink-Garcia et al., 2011).
The social dimension is clearly prominent both within and outside of lan-
guage classrooms (Philp & Duchesne, 2016), and this dimension may be
distinguished from other forms of engagement when considering that it is
explicitly relational in nature and its purpose is interaction with and sup-
port of others.
One domain-specific type of engagement has been prominent in the
work of Agneta Svalberg (see Chapter 3). Svalberg (2009) has described
engagement with language (EWL) as the process through which Language
Awareness is developed. In her work on the topic, she offers the following
definition:
In the context of language learning and use, Engagement with Language
is a cognitive, aective, and/or social process in which the learner is the
agent and language is the object (and sometimes vehicle). The learner is
engaged:
• Cognitively: the engaged individual is alert, pays focused attention
and constructs their own knowledge.
• Affectively: the engaged individual has a positive, purposeful, willing
and autonomous disposition towards the language and/or what it
represents.
• Socially: the engaged individual is interactive and initiating. (Svalberg,
2009: 247; see also Svalberg, 2012)
Related to this, but developed in separate lines of research is the notion of
engagement in task-based interaction (Dao, 2019; Lambert et al., 2017).
Within language learning research, the level of granularity is at times nar-
rower even than the classroom and often focused more precisely on mean-
ing- or language-focused classroom tasks. Task engagement, within these
settings, has been described as the degree to which language learners iden-
tify with the objectives of the task, relate to its content, and make effective
use of the sources available to carry it out (e.g. Bygate & Samuda, 2009).
Put differently, task engagement is learners' energy in action observable
during the course of exchanging ideas and information with an interlocu-
tor or while completing a language-related task.
Given these multiple dimensions and the diverse topical areas of con-
cern that engagement touches on (Philp & Duchesne, 2016; Svalberg,
2009), engagement can be positioned as a meta-construct that unites
many separate lines of research within the field.
Researching Engagement
Let us now segue to considerations regarding how these definitions are
used for engagement research. The educational research community has
witnessed a burst of interest and activity around the construct of engage-
ment over the last two decades. This points to a clear desire to probe the
nature of engagement, capture the necessary conditions for engagement,
explore the development of engagement over time, maintain and sustain
learners’ engagement as well as re-engage disaected students. However,
several unsettled issues that would facilitate this program of research have
yet to be resolved. The first and foremost of these is related to the fuzzi-
ness surrounding how engagement is operationalized. Reschly and
Christenson (2012) caution that engagement still suers from a jingle (i.e.
dierent terms being used to refer to identical notions or constructs) and
jangle (i.e. the same terminology being used to describe distinct notions
and constructs) in the way it is defined and operationalized. This state of
aairs is no dierent in our field (Hiver etal., in preparation). Given the
variety of operational definitions used across studies, it is not uncommon
to discover, for instance, that one researcher’s conceptualization of cogni-
tive engagement is used as another’s measurement of behavioral engage-
ment (Christenson et al., 2012). Some see engagement as an outcome that
predicts learning, while others see it as a resource progressively built
during the process of learning (e.g. Janosz, 2012). This state of affairs is
puzzling to an observer and may be due to the broad conceptual defini-
tions of engagement, the overlap of its dimensions and the theoretical
starting point and perspectives of various scholars. The consequence of
this lack of clarity is that the unique contribution of engagement to stu-
dent learning and development has yet to reach its full potential (Eccles &
Wang, 2012).
In addition to these operational issues, a number of methodological
challenges related to eliciting, measuring and analyzing this multidimen-
sional construct remain. As experts have argued, ‘one of the challenges
with research on student engagement is the large variation in the measure-
ment of this construct, which has made it challenging to compare findings
across studies’ (Fredricks & McColskey, 2012: 763). To date, the most
frequently used approach to evaluating engagement is self-report.
However, as some reviews of designs and measurement techniques for
engagement show, very few valid and psychometrically sound measures of
student engagement exist with which to assess the multidimensional
nature of engagement (Hofkens & Ruzek, 2019). Exacerbating matters,
similarly worded items (e.g. 'I work hard and contribute to class') in some
of the most widely used instruments are classified inconsistently or appear
across the behavioral, emotional, cognitive and social engagement
scales. To assess engagement across these dimensions as accurately as pos-
sible, it is essential to take stock of existing tools for measuring engage-
ment and to evaluate the most commonly used data collection methods.
This may be the first step to developing more systematic and better inte-
grated quantitative and qualitative methods (Glanville & Wildhagen,
2007) that can also accommodate scholars who wish to assess longer-term
engagement and variations across learning tasks and conditions, as well
as engagement in both individual and group settings (see e.g. Hiver &
Al-Hoorie, 2020, for an extended discussion on such designs).
Data Collection Methods in Engagement Research
Various approaches to measuring student engagement are found
throughout education and the learning sciences more broadly. These
include surveys and questionnaires, direct observations, expert (e.g.
teacher, caretaker) ratings, interviews and experience sampling methods.
Each of these approaches to measuring student engagement comes with its
own strengths and limitations. In this section, we examine each approach
in turn. Table 5.1 provides a helpful summary of these.
Engagement surveys and questionnaires
Self-report surveys and questionnaires are the most frequently used
methods for measuring student engagement. In this type of measurement,
students are presented with items describing dierent facets of engage-
ment and are directed to choose the response from a range of possibilities
that best describes them. Anchors might include the degree of agreement
(i.e. ‘strongly agree/disagree’), or the extent to which an item describes the
respondent (i.e. 'very much/not at all like me') or is true of them (i.e. 'very
true/untrue of me’). Surveys and questionnaires can be used to assess vari-
ous domains of engagement. To date there is no single instrument that is
accepted for use across contexts – just as there is none that is accepted as
a field-specific measure of engagement. Yet, the state of affairs in educa-
tional psychology and the learning sciences is striking when considering
that surveys and questionnaires (some administered to over 350,000 stu-
dents!) continue to be used as a main source of data. Fredricks and
McColskey’s (2012) review provides a comprehensive introduction to such
instruments.
Although there are a small number of such instruments that are con-
tent- or domain-specific, such as math engagement (Kong et al., 2003),
science (Sinatra et al., 2015) or reading engagement (Wigfield et al., 2008),
the majority of these surveys assess learning engagement in a domain-
general way. In second language acquisition (SLA) specifically, Hiver
et al. (2020) recently developed and piloted a set of survey scales to assess
learners' engagement in the language classroom across the cognitive,
affective, behavioral and social domains (see Appendix 5.1). More recently,
this questionnaire has been used in a sequence of studies investigating the
role of learners’ (N > 400, from both China and Colombia) engagement
and persistence on their task performance (i.e. their syntactic and lexical
complexity, rate and amount of language production, accuracy, time on
task) under various task conditions.
Table 5.1 Commonly employed approaches to eliciting engagement data

Surveys and questionnaires
Advantages: Suitable for psychometric testing and validation (e.g. item analysis, factor analysis and item response theory); simple and straightforward administration; measures can be standardized.
Disadvantages: Limited to self-report; lack of real-time data collection; participant bias and other drawbacks of self-report.
Sample studies: Wang et al. (2016): 4-scale survey of students' math and science engagement and its psychometric properties. Wang et al. (2019): domain-general survey of secondary school learners' (N = 3632) engagement and disengagement. Hiver et al. (2020): 3-scale survey (adapted from Wang et al., 2016) of learners' language learning engagement, translated into multiple languages.

Observations and expert ratings
Advantages: Spans quantitative or qualitative techniques; results are detailed and descriptive; able to capture real-time data; can link contextual factors to student engagement levels; measures can be standardized.
Disadvantages: Results in individual or small samples at a time; time-consuming to assess; not easily generalizable without large N; lacks ability to clearly measure affective and/or cognitive aspects of engagement.
Sample studies: Järvelä et al. (2016): classroom observations (84 hours) of collaborative engagement and self-regulated learning in different task conditions. Baralt et al. (2016): expert ratings of L2 learners' (N = 40) engagement in face-to-face and online task-based interaction. Lambert et al. (2017): expert ratings of L2 students' (N = 32) cognitive, behavioral and social engagement on learner-generated vs teacher-generated tasks.

Interviews
Advantages: Good for collecting cognitive processing data; identifies contextual and background factors of student engagement; able to collect in-depth information on student engagement.
Disadvantages: One interview at a time; time-consuming to assess; socially desirable responses; interviewer training dependent; difficult to generalize idiosyncratic findings to a population.
Sample studies: Fredricks et al. (2016): in-depth interviews with math and science students (N = 106) to explore multiple dimensions of their engagement. Han and Hyland (2015): interviews (one of multiple data sources) examining L2 students' (N = 4) cognitive, behavioral and affective engagement with teacher written corrective feedback (WCF). Hiver et al. (2019): longitudinal life-story interviews of whether, how and why L2 learners (N = 8) engaged in opportunities for language learning and use.

Experience sampling
Advantages: Real-time engagement ratings; tracks length and intensity of engagement; observations recorded without interference from an observer; multiple students' data collected simultaneously; many data points over time, able to trace changes in development.
Disadvantages: Time-consuming and resource-intensive; quality of data depends on participation of student respondents; struggle to include a range of items that represent the multidimensional nature of constructs in each sampling moment; not suitable for younger children/student participants.
Sample studies: Salmela-Aro et al. (2016): signal-contingent study of the associations of situational demands and resources with students' (N = 487) emotional engagement. Shernoff et al. (2016): signal-contingent study of associations between quality of the learning environment and student (N = 108) engagement in six subject areas. Schmidt et al. (2018): signal-contingent study of relations between students' (N = 244) momentary engagement and science learning activities/choices.

Source: Adapted from Anderson, 2017: 23–24
The strengths of these self-report designs for measuring engagement
relate to the practicality of administering them and the modest investment
of resources needed for data elicitation purposes. First, multi-item
response scales present far fewer practical challenges than other methods
when administering such instruments in classroom settings (Dörnyei &
Taguchi, 2010). Such surveys can be administered to large and diverse
samples of students simultaneously, particularly using online distribution
methods, with a relatively low expenditure of time and effort. This makes
it possible to collect data across various levels, at multiple time points, and
compare results from different institutions with broad geographic repre-
sentations. In addition, the quantitative nature of surveys and question-
naires means they readily lend themselves to the necessary psychometric
testing and validation through, for example, reliability and validity testing
(Anderson, 2017).
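As a concrete illustration of this kind of psychometric screening, the short sketch below computes Cronbach's alpha for a hypothetical five-item engagement subscale. The item responses and the function name are invented for illustration, and alpha is only one of several reliability indices a researcher might report.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of scale scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items in the subscale
    item_vars = items.var(axis=0, ddof=1)      # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 1-5 Likert responses from six learners on five engagement items;
# real studies would use far larger samples.
responses = np.array([
    [4, 5, 4, 4, 5],
    [3, 3, 4, 3, 3],
    [5, 5, 5, 4, 5],
    [2, 3, 2, 2, 3],
    [4, 4, 3, 4, 4],
    [3, 4, 3, 3, 4],
])

print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```

Values around .80 or higher are conventionally read as acceptable internal consistency for research purposes, though such thresholds should not be applied mechanically.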
Another reason why self-report instruments may highlight important
information is that, in contrast to the purely objective data on behavioral
indicators such as tardiness, instances of active participation, or percent-
age of assignment completion that are features of measurement at the
school engagement level, self-report can be useful to elicit data from stu-
dents’ viewpoint and acknowledge the inherent subjectivity of engage-
ment (Appleton et al., 2006; Garcia & Pintrich, 1996). Self-report methods
are especially useful for measuring emotional and cognitive engagement,
which tend to be elusive and less easily observable or inferred from exter-
nal behaviors.
Despite the advantages of surveys and questionnaires highlighted in
the literature, there are still some concerns about this approach to measur-
ing engagement. These include, among other things, the very nature of
self-report that risks participant bias and may skew results by drawing a
less accurate picture of student engagement than desired, and the lack of
real-time data, given that such response scales are typically retrospective
in their time reference and are situation-general in their context/domain
specificity (Fredricks & McColskey, 2012).
Observations and expert ratings of engagement
Observations and teacher or caretaker ratings of engagement are
useful for eliciting data at both the individual and whole classroom levels.
The data elicited from such observations and expert ratings can be quan-
titative, qualitative or a mixture of the two. For instance, some observa-
tional measures evaluate individual students' on- and off-task behavior
(e.g. attentiveness, note-taking, body language) as an individual-level
indicator of behavioral engagement (Volpe et al., 2005). These data might
result in dichotomous ratings (i.e. whether a learner is on task or not),
percentages (i.e. time spent on-task), ordinal scales (i.e. the extent to
which a learner is on task) or descriptively detailed summaries (i.e. details
of how a learner exhibited on-task behavior). The point of entry for many
scholars who have adopted observations and expert ratings to examine
student engagement is some form of predetermined categories of behav-
iors that encompass either engagement or disengagement (Fredricks &
McColskey, 2012). A more interpretivist tradition of engagement research
drawing on observations tends to employ chiefly qualitative methods to
collect descriptive and narrative data that assess student engagement
(Anderson, 2017). When compared with data elicited exclusively from
observations, expert rating methods are used primarily to measure behav-
ioral and emotional engagement (Finn et al., 1991; Wigfield et al., 2008).
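To make the behavioral side of such observational data concrete, the sketch below (our illustration, not a published coding scheme) converts momentary time-sampling codes into the kinds of summaries described above: a dichotomous code per interval, the percentage of intervals on task and a coarse ordinal band.

```python
from typing import List

def summarize_on_task(codes: List[int]) -> dict:
    """Summarize momentary time-sampling codes (1 = on task, 0 = off task)."""
    pct_on_task = 100 * sum(codes) / len(codes)
    # Collapse the percentage into an ordinal rating, purely for illustration.
    if pct_on_task >= 80:
        band = "high"
    elif pct_on_task >= 50:
        band = "moderate"
    else:
        band = "low"
    return {"intervals": len(codes), "pct_on_task": round(pct_on_task, 1), "band": band}

# Hypothetical codes for one learner observed every 30 seconds over 10 minutes.
observation_codes = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1, 1, 0, 1, 1, 1, 1, 1, 0, 1, 1]
print(summarize_on_task(observation_codes))
```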
Observations and expert ratings of language learners’ on-task perfor-
mance or qualitative aspects of classroom discourse (e.g. language-related
episodes (LREs)) have been the primary source of L2-specific measurements
of engagement. This is particularly the case in research adopting Svalberg’s
(2009, 2012) EWL model as a framework for engagement in the classroom.
For example, Baralt et al. (2016) draw on expert ratings of L2 learners'
engagement in face-to-face and online task-based interaction in various task
conditions. Lambert et al. (2017) also use expert ratings of L2 students'
cognitive, behavioral and social engagement to examine differences in task
performance on learner-generated versus teacher-generated tasks.
The principal strength of observations and expert ratings is their ability
to capture contextual factors that are intertwined with indicators of stu-
dent engagement (Fredricks & McColskey, 2012). For instance, it would
be important to know if learner engagement wanes toward the end of a
learning task, how distractions can interfere with maintaining engage-
ment, or whether engagement peaks during certain interactional events in
a classroom – observation allows the researcher to record such aspects.
Observation and expert ratings can also be used to triangulate and verify
data collected from self-report methods such as surveys and interviews.
Although observations and expert ratings work well when linking
contextual factors or specific instructional events with student engage-
ment levels, neither observations nor ratings can capture a complete pic-
ture of student engagement. The limitations of both methods for data
elicitation are that they are subject to observer or rater biases, and are often
dependent on applying clear observation schemes or rater codes – an issue
particularly when such data collection is done by third-party observers or
raters. In addition, participant bias may be a concern in instances when
students are aware of being observed (Fredricks & McColskey, 2012).
There is also the question of whether observations and ratings are capable
of documenting engagement levels and behaviors for more than one spe-
cific student or classroom at a time (see also Chapter 8 by Mercer et al.,
this volume) even with very recent advances in video recording technol-
ogy. These drawbacks, while minor in nature, make this data elicitation
method time-consuming and the data that results from it not easily
generalizable.
Engagement interviews
Interviews are another common data collection technique used to
assess student engagement. As with most interviewing techniques, inter-
views that examine student engagement vary, from tightly structured
questions to more open-ended question types. These serve as a flexible
tool with the capability to collect in-depth information regarding both
emotional and cognitive engagement. In the field of language learning,
such interviews have been used to examine learners’ engagement with
written corrective feedback (WCF) (e.g. Han & Hyland, 2015). They have also been used to examine
which contextual factors appear to connect to student engagement, how
these factors are situated in time and place, and to elicit meaningful epi-
sodes or instances pertaining to how engagement relates to student learn-
ing experiences (Hiver et al., 2019). Another strength of examining
engagement through interviews, particularly stimulated recall interviews,
is that this data can provide a window into learners’ cognitive processing
and thus shed light on the cognitive component of engagement (Reschly &
Christenson, 2012).
As a way of tapping into engagement, interviews have some limitations
as well. The interviewer's technique has a major influence
on the data elicited and the outcome of interviews. For example, if the inter-
viewer appears to be an authority figure interrogating the participants, there
may be a tendency for interviewees to respond in socially desirable ways
when confronted with the interview questions rather than with more factual
and transparent answers about their engagement. Existing concerns about
the objectivity of what people say about themselves all apply to this data
elicitation method. After all, how individuals portray and speak about
themselves and their engagement to others is susceptible to social desirabil-
ity and selective self-censoring (Al-Hoorie, 2016a, 2016b). In addition,
interviews as a data collection technique seem to work best when conducted
with a small population of respondents, and even so, are likely to result in
large amounts of textual data that require judicious and systematic analysis
in order for any meaningful conclusions about engagement to be drawn.
This obviously decreases the generalizability of the results.
Real-time sampling methods for engagement
The experience sampling method (ESM) originates in studies of flow,
which could itself be seen as a special, intense instance of engagement that
occurs while an individual is so immersed in an activity that they lose
awareness of time and space (Hektner et al., 2007). When used as a
data elicitation tool, ESM treats the tangible experience of engagement as
complex, emerging through interconnections between the individual and
the learning environment, and as structurally dynamic, undergoing adap-
tive change (for a more detailed overview, see Hiver & Al-Hoorie, 2020).
The characteristic feature of using ESM to assess engagement is that
individuals are prompted to respond to data elicitation stimuli at regular
intervals (e.g. to indicate their affective state, expenditure of effort or level
of alertness throughout a learning episode each time they are signaled to
do so) (Reis & Gable, 2000). ESM data are elicited by asking individuals
to provide systematic self-reports either at regular intervals (i.e. interval-
contingent sampling), when they are signaled (i.e. signal-contingent sam-
pling), or following a particular event of interest such as after every task
ends (i.e. event-contingent sampling) (Goetz et al., 2016). In the most
commonly used technique, individuals provide a response to these stimuli
whenever a signaling device prompts them to respond. In previous
decades, this was done through electronic beepers or pagers; however,
such signals can now be fully automated and sent to any one of the latest
smart devices or wearables (see Kubiak & Krog, 2012, for one review).
The signal is a cue to complete the data elicitation measures at that precise
moment, and intervals can be scheduled as regularly or infrequently as
desired. Some variations include every 5 or 10 minutes in an hour-long
class, between every stand-alone activity in a group meeting or project,
every two hours over a day-long period, or three times a day over a week.
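As a rough illustration of how signal-contingent prompts might be scheduled, the sketch below draws randomized signal times within a single class period. The session length, number of signals and minimum gap are invented parameters, and an actual ESM study would deliver the prompts through a dedicated app or wearable rather than a standalone script.

```python
import random
from datetime import datetime, timedelta

def schedule_signals(start: datetime, minutes: int, n_signals: int, min_gap_min: int = 8) -> list:
    """Draw randomized signal times within a session, keeping a minimum gap between prompts."""
    while True:
        offsets = sorted(random.uniform(0, minutes) for _ in range(n_signals))
        if all(b - a >= min_gap_min for a, b in zip(offsets, offsets[1:])):
            return [start + timedelta(minutes=m) for m in offsets]

# Hypothetical 60-minute class starting at 09:00, with four signal-contingent prompts.
signals = schedule_signals(datetime(2021, 3, 1, 9, 0), minutes=60, n_signals=4)
for t in signals:
    print(t.strftime("%H:%M"))
```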
While no L2-specific study has investigated engagement using an ESM
template, owing perhaps to the time- and resource-intensive nature of this
method, a new wave of studies in educational psychology use ESM as a
primary method for examining the association between qualities of the
learning environment and student engagement and learning behavior (e.g.
Salmela-Aro et al., 2016; Schmidt et al., 2018; Shernoff et al., 2016).
One advantage of employing ESM in studies of learner engagement is
that ESM frees the researcher from the need to be directly involved or
physically present during any data elicitation. It also addresses a limita-
tion of observational methods that rely on data from single individuals or
single classrooms. ESM allows the tracking of many individuals’ engage-
ment levels simultaneously, over time, and across situations (Shernoff
et al., 2003; Shernoff & Schmidt, 2008). Compared to other self-report
methods that are retrospective, ESM taps into the ways in which individu-
als actually experience those activities and contexts in real time. Unlike
most traditional self-report techniques, it does not rely on a single assess-
ment moment but gathers repeated measurements across many occasions.
By doing this, ESM combines the ecological validity of naturalistic behav-
ioral observation with the nonintrusive nature of diaries and the rigor and
precision of psychometric techniques (Csikszentmihalyi, 2014: 23).
Of course, ESM studies of engagement must also grapple with practi-
cal implementation issues (Goetz et al., 2016). One major drawback is the
question of obtrusiveness, that is whether the repeated measurement pro-
cedure has an excessively disruptive influence on learners’ engagement in
the moment (Shiffman et al., 2008). Perhaps the biggest challenge is the
heavy demand that such regular responses to the data elicitation stimuli
(i.e. the questions or items) impose on respondents. This can be a factor
that discourages potential participants, lowers overall completion rates
and causes data quality to deteriorate as time passes.
Measuring Engagement in Language Learning
Research measuring language-specific engagement in instructional
settings has existed for at least two decades (e.g. Dörnyei & Kormos,
2000; Platt & Brooks, 2002). This early work relied heavily on observa-
tion as a tool for data collection. These measures included, for instance,
total counts (i.e. the quantity) of talk or interaction among language
learners (Bygate & Samuda, 2009; Dörnyei & Kormos, 2000). This type
of 'instance' measurement, like measures of on-task versus off-task behav-
ior, has, of course, been limited to capturing episodes of behavioral
engagement and is silent on the quality of that engagement. In more recent
literature (e.g. Baralt et al., 2016; Dobao, 2016; García Mayo & Azkarai,
2016; Svalberg, 2012), LREs (Kowal & Swain, 1994) have been employed
as the major unit of analysis for engagement in the language class as they
show the appropriateness and quality of effort in completing a classroom
task. Storch (2008), for instance, proposes that the dialogue and peer cor-
rection that occur during LREs demonstrate the degree to which language
learners are addressing the features of interest in the given task, and,
therefore, serve as a measure of cognitive engagement in the language
classroom. Others have focused more on the quality of involvement in
task-related behavior (e.g. Henry & Thorsen, 2018; Lambert etal., 2017).
Here we examine these more closely in relation to the multiple dimensions
of engagement.
The cognitive dimension
In language classrooms, both verbal interaction and nonverbal mark-
ers of interactional involvement (e.g. facial gestures) have been considered
appropriate indicators of cognitive engagement. For example, Philp and
Duchesne (2016) propose that private speech and exploratory talk –
expressions such as ‘I believe’, causal sequencing phrases such as ‘because’,
references to previous sentences through questioning or agreement, or
argument that includes reasoning and exemplification – are markers of
such deliberate, selective and sustained attention. In addition to such
negotiation of meaning, or 'any part of the dialogue in which students talk
about the language they are producing, question their language use, or
other- or self-correct’ (Swain, 1998: 70), others see LREs as fitting indica-
tors of cognitive engagement (e.g. Baralt et al., 2016; Svalberg, 2009).
One illustrative example of this is a study by Lambert et al. (2017)
comparing student engagement in learner-generated content as opposed
to teacher-generated content. In their study, Lambert et al. (2017)
operationalized learners’ attention and mental eort while they repeated
a sequence of tasks as being (a) invested in task content, measured by the
number of clauses which served to expand on the semantic content of the
narrative (i.e. suggestions, propositions, elaborations, reasons and opin-
ions); and (b) devoted to clarifying meaning (whether receptive or produc-
tive), measured by the number of moves connected with the negotiation of
meaning (i.e. corrective feedback, modified output, co-constructions, con-
firmation checks, clarification requests and metalinguistic exchanges).
Results demonstrated that students engaged more with all aspects of L2
use during class time when involved in learner-generated activities than
in teacher-generated activities.
The behavioral dimension
In language learning research, much like domain-general studies, indi-
cators of behavioral engagement include students’ active participation,
persistence and expenditure of effort in the instructional setting (Philp &
Duchesne, 2016). As with classroom studies outside our field, behavioral
engagement is commonly measured by time on task (Gettinger & Walter,
2012) and word counts or turn-taking instances (Bygate & Samuda, 2009;
Dörnyei & Kormos, 2000). Lambert et al. (2017) measured another
dimension of engagement in their study. This was one of the more meticu-
lous operationalizations of the behavioral dimension of language learner
engagement through actual language use as measured by (a) how much
semantic content learners produced while on task (i.e. the number of
words produced in pruned discourse¹); and (b) persistence, or how long
learners sustained the task without the need for support or direction (i.e.
the amount of time invested in performance). In this regard, these scholars
found that when students worked on tasks with learner-generated content
(as opposed to teacher-generated tasks), their behavioral engagement
increased in terms of the contributions they made to the task, the time
they spent on tasks, the degree to which task content was embellished and
discussed, and overall responsiveness when performing a task.
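For readers who want a concrete sense of what a pruned word count involves (see Note 1), the sketch below strips simple fillers and immediate repetitions from an invented learner turn and reports the pruned count alongside time on task. The filler list and the sample utterance are illustrative only; published studies rely on more careful, largely manual pruning decisions.

```python
import re

# Illustrative filler/hesitation tokens; real coding schemes are more nuanced
# and also remove false starts and longer verbatim repetitions by hand.
FILLERS = {"uh", "um", "er", "erm"}

def pruned_word_count(utterance: str) -> int:
    """Count words after removing simple fillers and immediate self-repetitions."""
    tokens = re.findall(r"[a-z']+", utterance.lower())
    pruned = []
    for tok in tokens:
        if tok in FILLERS:
            continue
        if pruned and pruned[-1] == tok:   # drop immediate repetition ("the the")
            continue
        pruned.append(tok)
    return len(pruned)

# Hypothetical learner turn plus the seconds spent on this task segment.
turn = "Um, the the man, uh, he he goes to the station and buys a a ticket"
print(pruned_word_count(turn), "pruned words;", 42, "seconds on task")
```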
The emotional dimension
In classroom settings, emotional involvement is often manifested in
learners’ personal aective reactions as they participate in target lan-
guage-related activities or tasks. Emotionally engaged learners are char-
acterized as having a ‘positive, purposeful, willing, and autonomous
disposition’ toward language and associated learning tasks (Svalberg,
2009: 247). Emotional engagement is considered to have a key impact on
other dimensions of engagement because the subjective attitudes or per-
ceptions learners carry with them in a class or through language-related
tasks are fundamental to the other dimensions of engagement (Henry &
Thorsen, 2018; Swain, 2013). On close scrutiny, however, it is not clear
how Svalberg’s (2009) description qualifies as exclusively emotional.
Additionally, when cross-referenced with definitions from mainstream
education, it is apparent that very little work has been done on affective
engagement in SLA. Commonly used methods for measuring emotional
or affective engagement in L2 include questionnaires or surveys with
items pertaining to student attitudes or feelings in classroom settings or
in particular tasks, and interviews with stimulated recall. As with the
other dimensions, self-report measures such as journals, interviews, ques-
tionnaires and experience sampling can be used to tap into affective
engagement through attitudes and feelings toward learning contexts, indi-
viduals in that context, learning tasks, and their own participation in
those settings (Baralt et al., 2016; Fredricks & McColskey, 2012). One
study in particular (Dao, 2019) explored quite explicitly how emotional
engagement during peer interaction predicted low-proficiency learners'
question development. Dao operationalized emotional engagement as
laugh episodes and post-task questionnaire ratings and comments. Given
these measures, his results showed that although there was a positive asso-
ciation between emotional engagement and L2 question development,
emotional engagement could not significantly predict L2 question devel-
opment for low-proficiency learners. Only cognitive engagement was
found to significantly predict L2 question development in this study.
The social dimension
Because much of language learning and use is relational and serves
important social functions, social engagement occupies a central place in
language learning (Philp & Duchesne, 2008). Social engagement underlies
the connections among learners in terms of the learner's affiliation with
peers in the language classroom or community, and the extent of his/her
willingness to take part in interactional episodes and learning activities
with others. This dimension of engagement is linked also to phenomena
such as reciprocity and mutuality (Storch, 2008) as manifested in learners’
willingness to listen to one another or pay attention to the teacher’s talk.
For instance, a useful metric for social engagement may be the number of
backchannels² produced in language learning tasks (Lambert et al., 2017).
Besides backchannels, turn taking with an interlocutor, and developing
topics in interaction, other prosocial expressions of affiliation such as
empathetic discourse moves and responsive laughter are also considered
to be indicators of social engagement worth investigating. With the help
of data such as stimulated recalls or videos and interviews, it is possible to
examine the extent to which such actions contribute to language-related
task completion.
What is also clear from the above is that none of the dimensions of
engagement are fully coherent when separated and should instead be
considered together. In essence, examining the interacting and overlap-
ping processes of the combined components of engagement, and their con-
nections to learning, must be the default instead of focusing on one or two
dimensions separately. One way of doing this may be to integrate quanti-
tative analyses that can deal with high-dimensional aggregated data on
engagement at a larger level of granularity or temporal window with
qualitative analyses of specific dimensions of engagement at a smaller
grain size.
The Future of Engagement Measures
In this final section, we turn to measurement issues that seem to hold
potential for advancing the measurement of engagement and the depth of
insights obtained from learning activities. More specifically, we focus on
indirect measures of engagement, but we also see value in the use of skill-
specific measures and in ways that the dynamics of engagement (e.g. how
it is sustained and how it deteriorates) could be made more prominent. We
highlight the role technology could play, the potential contribution of
implicit measures and the relevance of biological markers.
Starting with technology, there currently seems to be an exponential
increase in interest in investigating the role of technology in learning in
general and in L2 learning in particular (Al-Hoorie, 2017). This is under-
standable considering the easy access and ubiquity of technology in many
language educational settings and the potential it has in facilitating learn-
ing. A substantial amount of research has looked at the value of flipped
learning, an instructional design in which students study the material out-
side of class (a task which is made more feasible with technology) and then
come to class not to learn the material but to practice it, ask questions and
help struggling peers.
A number of platforms can be utilized by the teacher to prepare activi-
ties and tasks for learners. Examples include Coursera, Khan Academy,
EdX, Udemy and other MOOC providers. When used for research pur-
poses, such platforms usually offer the investigator a useful function,
namely, the ability to follow the progress of each individual learner. The
investigator can obtain detailed engagement-related statistics such as:
• which videos or lectures are watched and for how long;
• which readings are done and for how long;
• which exercises and tasks are attempted (un)successfully;
• which parts of the material attract more or less attention from
learners;
• which tasks seem easier or more difficult, with more (in)correct attempts
made;
• which learners seem to be particularly struggling and thus need spe-
cial attention from the teacher;
• which learners seem to be doing particularly well and thus may benefit
from more advanced material.
With the use of technology, engagement researchers can easily obtain a
large (perhaps overwhelming) amount of data about engagement of learn-
ers in different task- and skill-specific activities over a course. This is real-
time, authentic data that can be used to test hypotheses and the
effectiveness of different interventions.
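As one hedged illustration of how such platform logs might be condensed into engagement indicators, the sketch below aggregates a few invented clickstream events per learner. The event fields, learner names and flagging threshold are assumptions made for the example rather than the export format of any particular platform.

```python
from collections import defaultdict

# Hypothetical clickstream events: (learner, event_type, minutes_watched_or_score)
events = [
    ("maria", "video", 12.5), ("maria", "video", 8.0), ("maria", "exercise", 1.0),
    ("jun",   "video", 3.0),  ("jun",   "exercise", 0.0), ("jun", "exercise", 0.0),
    ("pablo", "video", 15.0), ("pablo", "exercise", 1.0), ("pablo", "exercise", 1.0),
]

def engagement_summary(log):
    """Aggregate per-learner watch time, exercise attempts and correct answers."""
    summary = defaultdict(lambda: {"watch_min": 0.0, "attempts": 0, "correct": 0})
    for learner, kind, value in log:
        if kind == "video":
            summary[learner]["watch_min"] += value
        elif kind == "exercise":
            summary[learner]["attempts"] += 1
            summary[learner]["correct"] += int(value)
    return dict(summary)

for learner, stats in engagement_summary(events).items():
    rate = stats["correct"] / stats["attempts"] if stats["attempts"] else 0.0
    # A learner with little watch time and low success might be flagged for support.
    flag = " <- may need attention" if stats["watch_min"] < 5 and rate < 0.5 else ""
    print(learner, stats, f"success={rate:.0%}{flag}")
```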
Even without these platforms, there has been some interest in the use
of social media (or Web 2.0) for L2 learning (see Al-Hoorie, 2017). One
example is the use of private Facebook groups. Here, engagement in writ-
ing, reading, viewing and listening activities can be easily stimulated and
measured through various indicators such as the number of reads, ‘likes’
and comments. Emotional responses are indicated by the use of emojis
and emoticons. These can be content-specific or time-specific and as such
reflect the situated and dynamic nature of engagement. Comments and
discussions are not limited to quantitative analysis but can
also be examined qualitatively, revealing, for example, the intensity and qual-
ity of engagement.
Admittedly, using social media, though convenient for any teacher
with internet access, may not capture insights on engagement as deep as
those from specialized platforms. Teachers may only have access to relatively
'superficial' indicators of engagement, such as a 'like' or the number of 'reads',
not to mention the multitude of potential distractions that students may encoun-
ter while trying to learn on social media. In some contexts, this may also
lead to attentional concerns related to classroom management, especially
with younger learners. Nevertheless, the use of social media in education
is booming (Manca & Ranieri, 2017).
Another indirect approach that may be useful to measuring attitudes
toward different tasks is the use of implicit measures. Implicit measures
are typically computerized experiments that focus on reaction time
(Al-Hoorie, 2016a, 2016b). Depending on the exact type of the implicit
measure, the participant may be asked to respond to stimuli appearing on
the screen as quickly as possible (e.g. to classify them to the right or left
of the screen; see Al-Hoorie, 2020). Many participants realize that they
have ‘implicit’ associations in that they cannot control how well they per-
form in certain tasks. Such implicit associations may reveal implicit atti-
tudes toward those tasks (positive or negative) that may or may not
coincide with explicit, self-reported attitudes.
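To give a flavor of how reaction-time data from such a task might be scored, the sketch below computes a simplified IAT-style D score: the mean latency difference between incongruent and congruent blocks divided by the pooled standard deviation of all trials. The latencies are invented, and published scoring algorithms include additional screening steps (error penalties, latency trimming) that are omitted here.

```python
import statistics

def iat_d_score(congruent_ms, incongruent_ms):
    """Simplified D score: latency difference divided by the pooled SD of all trials."""
    pooled_sd = statistics.stdev(list(congruent_ms) + list(incongruent_ms))
    return (statistics.mean(incongruent_ms) - statistics.mean(congruent_ms)) / pooled_sd

# Invented reaction times (milliseconds) from one participant.
congruent = [612, 587, 655, 598, 640, 621, 575, 603]
incongruent = [702, 688, 745, 690, 723, 711, 676, 734]

# Positive values indicate slower responding in the incongruent block,
# conventionally read as a stronger implicit association with the congruent pairing.
print(f"D = {iat_d_score(congruent, incongruent):.2f}")
```

Such scores would then be compared with explicit, self-reported attitudes toward the same tasks to see whether the two converge.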
Neuroscientific research has shown that implicit and explicit attitudes
correlate with different areas in the brain. While explicit attitudes corre-
late with activation in the frontal cortex – which is responsible for control
and self-regulation – implicit attitudes correlate with activation in the
amygdala (Cunningham et al., 2003, 2004; Phelps et al., 2000).
Considering that the amygdala is the area in the brain responsible for
emotions, Dasgupta et al. (2003: 241) have argued that implicit attitudes
are not cold cognition but may be 'capturing something warm and affect-
laden’. Indirect measures that can tap into the subconscious side of engage-
ment have potential to open entirely novel avenues of research that would
tie in with the parts of instructed language learning research where
implicit processes are ‘a stable topic of investigation’ (Al-Hoorie, 2017: 5).
Finally, L2 researchers should engage more seriously with the biologi-
cal sciences. While some research has been done on the neurosci-
entific underpinnings of L2 learning, far less has been done on other
biological markers. However, such investigation has the potential to reveal
interesting insights into learner engagement. For example, one critical
factor influencing engagement is how the task is perceived and the amount
of stress and anxiety it generates (Pekrun & Linnenbrink-Garcia, 2012).
Stress is associated with a host of biological reactions, activating neuro-
endocrine, catecholamine and opioid systems, as well as more familiar
markers such as heart rate and blood pressure. Research has shown that
such biological markers interact with psychological states, and the ways
they are interpreted play an important role in their influence on motiva-
tion, stress and anxiety (see Bandura, 1997).
L2 engagement researchers will certainly benefit from incorporating
such measures. Researchers will obtain a more fine-tuned, micro-level
view of how their participants react in different situations and how their
reactions fluctuate within a single situation. Merely asking the partici-
pant to provide self-report responses at regular intervals assumes that all
processes of interest are consciously reportable, while some biological
responses might be too subtle for the individual to perceive.
Furthermore, asking the participant to provide responses at regular inter-
vals can be intrusive and may interfere with the task at hand. For this
reason, some measures, such as the idiodynamic method (Hiver &
Al-Hoorie, 2020; MacIntyre, 2012), have been devised to provide retro-
spective accounts to avoid interfering with the task. However, as long as
these methods make the participant the ultimate arbiter of the inner
workings of their learning, they can be richly complemented with a bio-
logical vantage point.
Conclusion
Engagement holds substantial promise as a future direction for the
psychology of language learning and teaching. However, it is crucial that
our field employs a selection of measures to capture the different dimen-
sions of this complex construct. In this chapter, we set out to explore the
past, present and future of measuring engagement. We reviewed opera-
tional definitions of engagement and then explained various domain-
general and language-specific aspects that have been measured. We then
provided an overview of instruments that have been used to measure the
different aspects of engagement both in general learning sciences and in
L2 learning more specifically. We highlighted a caveat to this program of
measuring engagement: at times, different terms are used to refer to the
same notions, while at other times the same terms are used to refer to
distinct notions. We then ended the chapter by discussing some novel
approaches to measuring engagement (e.g. indirect measures) that might
be useful for future research. Given the array of topical areas in language
learning and teaching for which the study of engagement has untapped
potential, attention to conceptual and operational clarity is crucial to
ensure robust methods and designs are adopted. By investing in the devel-
opment and refinement of measures of engagement in the L2 classroom,
researchers can rest assured that subsequent analysis and reporting will
be on firm ground. This will go far in advancing an agenda of innovation
and rigor in language learning research.
Notes
(1) A method used in discourse analysis that redacts disfluencies such as hesitations, fill-
ers, false starts, etc. from a text as a way of reducing clutter and making the underly-
ing message easier to understand.
(2) Verbal or non-verbal expressions or responses directed toward a speaker that serve a
meta-conversational purpose, such as signifying the listener’s attention, understand-
ing or agreement.
References
Al-Hoorie, A.H. (2016a) Unconscious motivation. Part I: Implicit attitudes toward L2
speakers. Studies in Second Language Learning and Teaching 6 (3), 423–454.
Al-Hoorie, A.H. (2016b) Unconscious motivation. Part II: Implicit attitudes and L2
achievement. Studies in Second Language Learning and Teaching 6 (4), 619–649.
Al-Hoorie, A.H. (2017) Sixty years of language motivation research: Looking back and
looking forward. SAGE Open 1–11. doi:10.1177/2158244017701976
Al-Hoorie, A.H. (2020) Motivation and the unconscious. In M. Lamb, K. Csizér, A.
Henry and S. Ryan (eds) Handbook of Motivation for Language Learning. Palgrave
Macmillan.
Anderson, E. (2017) Measurement of online student engagement: Utilization of continu-
ous online student behaviors as items in a partial credit Rasch model (Unpublished
doctoral dissertation). University of Denver, Denver, CO.
Appleton, J.J., Christenson, S.L. and Furlong, M.J. (2008) Student engagement with
school: critical conceptual and methodological issues of the construct. Psychology in
the Schools 45, 369–386.
Appleton, J.J., Christenson, S.L., Kim, D. and Reschly, A.L. (2006) Measuring cognitive
and psychological engagement: Validation of the Student Engagement Instrument.
Journal of School Psychology 44, 427–445.
Bandura, A. (1997) Self-Efficacy: The Exercise of Control. New York: Freeman.
Baralt, M., Gurzynski-Weiss, L. and Kim, Y. (2016) Engagement with language: How
examining learners’ affective and social engagement explains successful learner-generated attention to form. In M. Sato and S. Ballinger (eds) Peer Interaction and Second Language Learning: Pedagogical Potential and Research Agenda (pp. 209–
240). Amsterdam: John Benjamins.
Bygate, M. and Samuda, V. (2009) Creating pressure in task pedagogy: The joint roles of
field, purpose, and engagement within the interaction approaches. In A. Mackey and
C. Polio (eds) Multiple Perspectives on Interaction: Second Language Research in
Honour of Susan M. Gass (pp. 90–116). New York: Routledge.
Csikszentmihalyi, M. (ed.) (2014) Flow and the Foundations of Positive Psychology: The Collected Works of Mihaly Csikszentmihalyi. New York: Springer.
Cunningham, W.A., Johnson, M.K., Gatenby, J.C., Gore, J.C. and Banaji, M.R. (2003)
Neural components of social evaluation. Journal of Personality and Social Psychology 85 (4), 639–649.
Cunningham, W.A., Johnson, M.K., Raye, C.L., Gatenby, J.C., Gore, J.C. and Banaji,
M.R. (2004) Separable neural components in the processing of black and white faces.
Psychological Science 15 (12), 806–813.
Dao, P. (2019) Effects of task goal orientation on learner engagement in task performance.
International Review of Applied Linguistics in Language Teaching. Advance online
access. doi:10.1515/iral-2018-0188
Dasgupta, N., Greenwald, A.G. and Banaji, M.R. (2003) The first ontological challenge to the IAT: Attitude or mere familiarity? Psychological Inquiry 14 (3–4),
238–243.
Dobao, A.F. (2016) Peer interaction and learning: A focus on the silent learner. In M. Sato
and S. Ballinger (eds) Peer Interaction and Second Language Learning: Pedagogical Potential and Research Agenda (pp. 33–62). Amsterdam: John Benjamins.
Dörnyei, Z. and Kormos, J. (2000) The role of individual and social variables in oral task performance. Language Teaching Research 4, 275–300.
Dörnyei, Z. and Taguchi, T. (2010) Questionnaires in Second Language Research:
Construction, Administration, and Processing (2nd edn). New York: Routledge.
Eccles, J. and Wang, M.-T. (2012) So what is student engagement anyway? In S.L.
Christenson, A.L. Reschly and C. Wylie (eds) Handbook of Research on Student
Engagement (pp. 133–148). New York: Springer.
Finn, J.D. (1989) Withdrawing from school. Review of Educational Research 59, 117–142.
Finn, J.D., Folger, J. and Cox, D. (1991) Measuring participation among elementary grade students. Educational and Psychological Measurement 51, 393–402.
Fredricks, J.A., Filsecker, M.K. and Lawson, M.A. (2016) Student engagement, context,
and adjustment: Addressing definitional, measurement, and methodological issues.
Learning and Instruction 40, 1–4.
Fredricks, J.A., Wang, M., Schall, J., Hofkens, T., Parr, A. and Sung, H. (2016) Using
qualitative methods to develop a measure of math and science engagement. Learning
and Instruction 43, 5–15.
Fredricks, J.A. and McColskey, W. (2012) The measurement of student engagement: A
comparative analysis of various methods and student self-report instruments. In S.L.
Christenson, A.L. Reschly and C. Wylie (eds) Handbook of Research on Student
Engagement (pp. 763–782). New York: Springer.
García Mayo, M.P. and Azkarai, A. (2016) EFL task-based interaction: Does task
modality impact on language-related episodes? In M. Sato and S. Ballinger (eds) Peer
Interaction and Second Language Learning: Pedagogical Potential and Research Agenda (pp. 241–266). Amsterdam: John Benjamins.
Garcia, T. and Pintrich, P. (1996) Assessing students’ motivation and learning strategies in the classroom context: The Motivated Strategies for Learning Questionnaire. In
M. Birenbaum and F.J. Dochy (eds) Alternatives in Assessment of Achievements,
Learning Processes, and Prior Knowledge (pp. 319–339). New York: Kluwer.
Gettinger, M. and Walter, M.J. (2012) Classroom strategies to enhance academic engaged
time. In S.L. Christenson, A.L. Reschly and C. Wylie (eds) Handbook of Research on Student Engagement (pp. 653–673). New York: Springer.
Glanville, J.L. and Wildhagen, T. (2007) The measurement of school engagement:
Assessing dimensionality and measurement invariance across race and ethnicity.
Educational and Psychological Measurement 67, 1019–1041.
Goetz, T., Bieg, M. and Hall, N. (2016) Assessing academic emotions via the experience
sampling method. In M. Zembylas and P.A. Schutz (eds) Methodological Advances in Research on Emotion and Education (pp. 244–258). New York: Springer.
Han, Y. and Hyland, F. (2015) Exploring learner engagement with written corrective feed-
back in a Chinese tertiary EFL classroom. Journal of Second Language Writing 30,
31–44.
Hattie, J. (2009) Visible Learning: A Synthesis of over 800 Meta-analyses Related to
Achievement. New York: Routledge.
Hektner, J.M., Schmidt, J.A. and Csikszentmihalyi, M. (2007) Experience Sampling
Method: Measuring the Quality of Everyday Life. Thousand Oaks, CA: SAGE.
Helme, S. and Clarke, D. (2001) Identifying cognitive engagement in the mathematics
classroom. Mathematics Education Research Journal 13 (2), 133–153.
Henry, A. and Thorsen, C. (2018) Disaffection and agentic engagement: ‘Redesigning’
activities to enable authentic self-expression. Language Teaching Research. Advance
online access. doi:10.1177/1362168818795976
Hiver, P. and Al-Hoorie, A.H. (2020) Research Methods for Complexity Theory in
Applied Linguistics. Bristol: Multilingual Matters.
Hiver, P., Al-Hoorie, A.H., Vitta, J. and Wu, J. (in preparation) A Systematic Review of
20 years of L2 Engagement Research.
Hiver, P., Obando, G., Sang, Y., Tahmouresi, S., Zhou, A. and Zhou, Y. (2019) Reframing
the L2 learning experience as narrative reconstructions of classroom learning. Studies
in Second Language Learning and Teaching 9, 85–118.
Hiver, P., Zhou, A., Tahmouresi, S., Sang, Y. and Papi, M. (2020) Why stories matter: Exploring learner engagement and metacognition through narratives of the L2 learning experience. System 91, 1–12.
Hofkens, T.L. and Ruzek, E. (2019) Measuring student engagement to inform effective
interventions in schools. In J. Fredricks, A.L. Reschly and S.L. Christenson (eds)
Handbook of Student Engagement Interventions: Working with Disengaged
Students (pp. 309–324). San Diego, CA: Academic Press.
Janosz, M. (2012) Outcomes of engagement and engagement as an outcome: Some con-
sensus, divergences, and unanswered questions. In S.L. Christenson, A.L. Reschly
and C. Wylie (eds) Handbook of Research on Student Engagement (pp. 695–703).
New York: Springer.
Järvelä, S., Järvenoja, H., Malmberg, J., Isohätälä, J. and Sobocinski, M. (2016) How do
types of interaction and phases of self-regulated learning set a stage for collaborative
engagement? Learning and Instruction 43, 39–51.
Kong, Q., Wong, N. and Lam, C. (2003) Student engagement in mathematics: Development
of instrument and validation of a construct. Mathematics Education Research
Journal 54, 4–21.
Kowal, M. and Swain, M. (1994) Using collaborative language production tasks to pro-
mote students’ language awareness. Language Awareness 3 (2), 73–93.
Kubiak, T. and Krog, K. (2012) Computerized sampling of experiences and behavior. In
M.R. Mehl and T.S. Conner (eds) Handbook of Research Methods for Studying
Daily Life (pp. 124–143). New York: Guilford Press.
Lambert, C., Philp, J. and Nakamura, S. (2017) Learner-generated content and engagement
in second language task performance. Language Teaching Research 21, 665–680.
Lawson, M.A. and Lawson, H.A. (2013) New conceptual frameworks for student engage-
ment research, policy, and practice. Review of Educational Research 83, 432–479.
Linnenbrink-Garcia, L., Rogat, T. and Koskey, K. (2011) Affect and engagement during
small group instruction. Contemporary Educational Psychology 36, 13–24.
MacIntyre, P.D. (2012) The idiodynamic method: A closer look at the dynamics of com-
munication traits. Communication Research Reports 29 (4), 361–367.
Manca, S. and Ranieri, M. (2017) Implications of social network sites for teaching and learning. Where we are and where we want to go. Education and Information Technologies 22, 605–622.
Mercer, N. and Hodgkinson, S. (eds) (2008) Exploring Talk in School. London: SAGE.
Mercer, S. (2019) Language learner engagement: Setting the scene. In X. Gao (ed.) Second Handbook of English Language Teaching (pp. 1–19). Basel: Springer.
Pekrun, R. and Linnenbrink-Garcia, L. (2012) Academic emotions and student engagement. In S.L. Christenson, A.L. Reschly and C. Wylie (eds) Handbook of Research on Student Engagement (pp. 259–282). New York: Springer.
Phelps, E.A., O’Connor, K.J., Cunningham, W.A., Funayama, E.S., Gatenby, J.C., Gore, J.C. and Banaji, M.R. (2000) Performance on indirect measures of race evaluation predicts amygdala activation. Journal of Cognitive Neuroscience 12 (5), 729–738.
Philp, J. and Duchesne, S. (2008) When the gate opens: The interaction between social and linguistic goals in child second language development. In J. Philp, R. Oliver and A. Mackey (eds) Child’s Play? Second Language Acquisition and the Young Learner (pp. 83–104). Amsterdam: John Benjamins.
Philp, J. and Duchesne, S. (2016) Exploring engagement in tasks in the language classroom. Annual Review of Applied Linguistics 36, 50–72.
Platt, E. and Brooks, F.B. (2002) Task engagement: A turning point in foreign language development. Language Learning 52, 365–400.
Reis, H.T. and Gable, S.L. (2000) Event-sampling and other methods for studying everyday experience. In H.T. Reis and C.M. Judd (eds) Handbook of Research Methods in Social and Personality Psychology (pp. 190–222). New York: Cambridge University
Press.
Reschly, A.L. and Christenson, S.L. (2012) Jingle, jangle, and conceptual haziness: Evolution and future directions of the engagement construct. In S.L. Christenson, A.L. Reschly and C. Wylie (eds) Handbook of Research on Student Engagement (pp.
3–19). New York: Springer.
Salmela-Aro, K., Moeller, J., Schneider, B., Spicer, J. and Lavonen, J. (2016) Integrating
the light and dark sides of student engagement using person-oriented and situation-
specific approaches. Learning and Instruction 43, 61–70.
Schmidt, J., Rosenberg, J. and Beymer, P. (2018) A person-in-context approach to student
engagement in science: Examining learning activities and choice. Journal of Research
in Science Teaching 55 (1), 19– 43.
Shernoff, D.J., Kelly, S., Tonks, S., Anderson, B., Cavanagh, R. … and Abdi, B. (2016)
Student engagement as a function of environmental complexity in high school class-
rooms. Learning and Instruction 43, 52–60.
Shernoff, D.J., Csikszentmihalyi, M., Schneider, B. and Shernoff, E.S. (2003) Student
engagement in high school classrooms from the perspective of flow theory. School
Psychology Quarterly 18, 158–176.
Shernoff, D.J. and Schmidt, J.A. (2008) Further evidence of the engagement-achievement paradox among U.S. high school students. Journal of Youth and Adolescence 5, 564–580.
Shiffman, S., Stone, A.A. and Hufford, M.R. (2008) Ecological momentary assessment. Annual Review of Clinical Psychology 4, 1–32.
Sinatra, G.M., Heddy, B.C. and Lombardi, D. (2015) The challenges of defining and
measuring student engagement in science. Educational Psychologist 50, 1–13.
Skinner, E.A. and Pitzer, J.R. (2012) Developmental dynamics of student engagement,
coping, and everyday resilience. In S.L. Christenson, A.L. Reschly and C. Wylie (eds) Handbook of Research on Student Engagement (pp. 21–44). New York:
Springer.
Skinner, E.A., Kindermann, T.A. and Furrer, C. (2009) A motivational perspective on
engagement and disaffection: Conceptualization and assessment of children’s behavioral and emotional participation in academic activities in the classroom. Educational and Psychological Measurement 69, 493–525.
Storch, N. (2008) Metatalk in a pair work activity: Level of engagement and implications
for language development. Language Awareness 17, 95–114.
Svalberg, A.M.L. (2009) Engagement with language: Interrogating a construct. Language
Awareness 18, 242–258.
Svalberg, A.M.L. (2012) Thinking allowed: Language awareness in language learning and teaching: A research agenda. Language Teaching 45, 376–388.
Swain, M. (1998) Focus on form through conscious reflection. In C. Doughty and J.
Williams (eds) Focus on Form in Classroom Second Language Acquisition (pp. 471–
484). Cambridge: Cambridge University Press.
Swain, M. (2013) The inseparability of cognition and emotion in second language learn-
ing. Language Teaching 46, 195–207.
Volpe, R.J., DiPerna, J.C., Hintze, J.M. and Shapiro, E.S. (2005) Observing students in
classroom settings: A review of seven coding schemes. School Psychology Review 34,
454–474.
Wang, M.-T., Fredricks, J.A., Ye, F., Hofkens, T.L. and Linn, J.S. (2016) The math and
science engagement scales: Scale development, validation, and psychometric proper-
ties. Learning and Instruction 43, 16–26.
Wang, M.-T., Fredricks, J.A., Ye, F., Hofkens, T.L. and Linn, J.S. (2019) Conceptualization and assessment of adolescents’ engagement and disengagement in school: A multidimensional school engagement scale. European Journal of Psychological Assessment 35 (4), 592–606.
Wigfield, A., Guthrie, J., Perencevich, K., Taboada, A., Klauda, S., McRae, A. and Barbosa, P. (2008) Role of reading engagement in mediating effects of reading comprehension instruction on reading outcomes. Psychology in the Schools 45,
432–445.
Appendix 5.1
(In my language class today/this week…) Items marked (R) are reverse-coded; a brief scoring sketch follows the item list.
Behavioral engagement
I stayed focused even when it was difficult to understand.
I participated in all the activities.
I kept trying my best even when it was hard.
I continued working until I completed my work.
I just pretended like I was working. (R)
I didn’t participate much in class. (R)
I did other things when I was supposed to be paying attention. (R)
I paid attention and listened carefully.
Emotional engagement
I looked forward to the next class.
I enjoyed learning new things.
I wanted to understand what I was learning.
I felt good while I was in the class.
I felt frustrated while I was in the class. (R)
I found it boring to be in the class. (R)
I didn’t want to be in the class. (R)
I felt that I didn’t care about learning. (R)
Cognitive engagement
I went through my work carefully to make sure it was done right.
I thought about different ways to solve problems in my work.
I tried to connect new learning to the things I already learned before.
I tried to understand my mistakes when I got something wrong.
I preferred to be told the answer than do the work. (R)
I didn’t think too hard while I was doing the work. (R)
I only studied the easy parts because the class was hard. (R)
I did just enough to get by. (R)
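For readers who wish to work with the items above, the following is a minimal scoring sketch. It assumes a 5-point Likert response format and hypothetical item labels (beh1–beh8, emo1–emo8, cog1–cog8); items marked (R) are reverse-coded before each subscale is averaged. The response scale and data layout are illustrative assumptions rather than prescriptions.

```python
"""
Minimal scoring sketch for the Appendix 5.1 items, assuming a 5-point
Likert response format (1 = strongly disagree ... 5 = strongly agree).
Item labels and data layout are illustrative assumptions; items flagged
True are the reverse-coded (R) items.
"""
import pandas as pd

SCALE_MAX = 5  # assumed 5-point response scale

# Item keys per subscale, in the order the items appear above;
# True marks a reverse-coded (R) item
SUBSCALES = {
    "behavioral": {"beh1": False, "beh2": False, "beh3": False, "beh4": False,
                   "beh5": True, "beh6": True, "beh7": True, "beh8": False},
    "emotional":  {"emo1": False, "emo2": False, "emo3": False, "emo4": False,
                   "emo5": True, "emo6": True, "emo7": True, "emo8": True},
    "cognitive":  {"cog1": False, "cog2": False, "cog3": False, "cog4": False,
                   "cog5": True, "cog6": True, "cog7": True, "cog8": True},
}

def score(responses: pd.DataFrame) -> pd.DataFrame:
    """Return one mean score per subscale for each respondent."""
    scored = pd.DataFrame(index=responses.index)
    for subscale, items in SUBSCALES.items():
        # Reverse-code (R) items so that higher always means more engaged
        recoded = pd.DataFrame({
            item: (SCALE_MAX + 1 - responses[item]) if reverse else responses[item]
            for item, reverse in items.items()
        })
        scored[subscale] = recoded.mean(axis=1)
    return scored

# Two made-up respondents, purely to show the expected data layout
example = pd.DataFrame(
    [[5, 4, 5, 4, 1, 2, 1, 5,  4, 5, 5, 4, 2, 1, 1, 1,  4, 4, 5, 5, 2, 2, 1, 2],
     [3, 3, 2, 3, 4, 4, 3, 2,  3, 2, 3, 3, 4, 3, 4, 3,  2, 3, 3, 2, 3, 4, 4, 3]],
    columns=[item for items in SUBSCALES.values() for item in items],
)
print(score(example))
```

Running the sketch prints one behavioral, emotional and cognitive engagement score per respondent; in an actual administration, the item labels and scale points would follow whatever format the researcher adopts.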