The Revised Two Factor Study Process Questionnaire: R-SPQ-2F

John Biggs 1, David Kember 2 & Doris Y.P. Leung 2
1 University of Hong Kong
2 Hong Kong Polytechnic University
Aim. To produce a revised two factor version of the Study Process Questionnaire (R-SPQ-
2F) suitable for use by teachers in evaluating the learning approaches of their students. The
revised instrument assesses deep and surface approaches only, using fewer items.
Method. A set of 43 items was drawn up for the initial tests. These were derived from the
original version of the SPQ, from modified SPQ items, and from new items. A process of
testing and refinement resulted in deep and surface motive and strategy sub-scales of five
items each, giving 10 items per approach score. The final version was tested using reliability procedures
and confirmatory factor analysis.
Sample. The sample for the testing and refinement process consisted of 229 students from the
health sciences faculty of a university in Hong Kong. A fresh sample of 495 undergraduate
students from a variety of departments of the same university was used for the test of the final
version.
Results. The final version of the questionnaire had acceptable Cronbach alpha values for scale
reliability. Confirmatory factor analysis indicated a good fit to the intended two factor
structure. Both deep and surface approach scales had well identified motive and strategy sub-
scales.
Conclusion. The revision process has resulted in a simple questionnaire which teachers can
use to evaluate their own teaching and the learning approaches of their students.
Introduction
The Study Process Questionnaire (SPQ) (Biggs, 1987a, 1987b), and its school-level
companion, the Learning Process Questionnaire (LPQ) (Biggs, 1987c), were developed in the
late 1970s. Since that time, the nature of the tertiary sector has changed dramatically, with
respect to such factors as the heterogeneity of the student population, the structure and
administration of institutions, the range and depth of curricula, methods of delivery and
assessment, and so on. However, as is indicated by the ever-expanding demand for such
instruments as the SPQ and the similar Approaches to Study Inventory (ASI) (Entwistle &
Ramsden, 1983) for research and development, the utility of the constructs being measured is
just as high as ever. It is therefore timely that the factor structure of the constructs might be
re-examined and the items defining the constructs reviewed and possibly updated.
The norms themselves are another matter. Originally it was thought desirable and
feasible to provide norms for different student populations, and so, as far as Australia was
concerned, norms were provided in Arts, Science and Education, for males and females,
separately for College and University sectors (Biggs, 1987b). Now those two sectors have
merged, and Arts and Science are no longer the modal faculties, in terms either of size or of
academic priority. Likewise, in Hong Kong, there now exist five more tertiary institutions than
were sampled ten years ago (Biggs, 1992). The question today is not only what norms to
provide for what sub-populations, which is a much more complex issue now than it ever was,
but whether it is in fact useful to provide norms at all. For the purposes we have in mind for
the present revision, we suggest that norms are in fact unnecessary, or can be obtained intra-
institutionally by users, as explained below.
But apart from the basic need simply to update the instruments, work and conditions
in recent years have suggested the utility of providing a shortened version dealing only with
surface and deep approaches, principally for work on teaching effectiveness and staff
development. In these days of changing teaching contexts, accountability, and concerns with
quality assurance and particularly with quality enhancement, instruments like the SPQ have an
increasingly important role to play that was not envisaged fifteen or so years ago.
The development of the SPQ
Many inventories addressing learning processes are derived top-down from cognitive
psychology, particularly information processing theories (Moreno & DiVesta, 1991; Schmeck,
Geisler-Brenstein, & Cercy 1991; Weinstein, Schulte, & Palmer, 1987), with the deliberate
intention that they address universal and “culture-fair” mechanisms (Moreno & DiVesta,
1991). Such a framework, however, seems particularly inappropriate for such a context-
dependent issue as student learning, where student strategy use is dependent upon a host of
factors, such as students’ values and motives, their perceptions of task demands, teaching and
assessment methods, classroom climate, and so on.
By studying how students perceived a particular reading task and then went about
learning it, Marton and Säljö (1976a, 1976b) came up with the powerful idea of “approach
to learning”, which became the point of departure for the emerging conceptual framework
known generically as “student approaches to learning” (SAL) theory (Biggs, 1993a; Entwistle
& Waterston, 1988). SAL theory has in fact become a meta-theory for conceptualising
teaching and learning, which has gone in two major directions: phenomenography (Marton,
1981; Prosser & Trigwell, 1998) and constructivism and systems theory (Biggs, 1999; Dart &
Boulton-Lewis, 1998). However, the notion that students’ perceptions and learning-related
activities are central to teaching and learning is common to all SAL sub-theories (Biggs,
1993a, 1999; Entwistle & Waterston, 1988).
The Study Process Questionnaire (SPQ) (Biggs, 1987a, 1987b) was developed from
an earlier 10 scale Study Behaviour Questionnaire (SBQ), conceived within an information-
processing framework (Biggs, 1976). Higher order factor analysis suggested that the 10 scales
could be interpreted in terms of three higher order factors. The most suitable interpretation of
these factors, however, was in terms of the SAL conceptual framework, not the original IP
theory, because the three factors were found to be comprised of two kinds of items, those
relating to a motive, and those relating to a congruent strategy. In this, the factor analysis
recalled Marton and Säljö’s original point that a student handled a reading task according to
his or her intentions prior to engaging the task. However, whereas those authors were
concerned with two intentions or motives, to remember significant facts and details or to try
to understand what the author was trying to say, we are here dealing with three such motives:
to keep out of trouble with minimal effort, to engage the task appropriately, and to maximise
grades. Each such motive was associated with a congruent strategy: selective memorising,
seeking for meaning, and optimal time and space management, respectively (see Table 1).
Given the differing methodologies and contexts, the similarity between the first two motives
and strategies and the Swedish work on surface and deep approaches was remarkable, and to
capture that similarity, the surface/deep terminology was adopted for the first two
dimensions. Thus, the SPQ yielded three Approach scores, Surface, Deep, and Achieving
respectively, and a component Motive and Strategy score for each Approach.
-- Table 1 goes here --
What do SPQ scores measure?
Students’ approaches to learning are conceived as forming part of the total system in which an
educational event is located, as schematised in the Presage-Process-Product (3P) model
(Figure 1) (Biggs, 1987a; 1993a, 1993b). In the 3P model, student factors, teaching context,
on-task approaches to learning, and the learning outcomes, mutually interact, forming a
dynamic system (Figure 1):
-- Figure 1 goes here --
Presage factors refer to what exists prior to engagement that affects learning. On the student
side these include such factors as prior knowledge, ability, and their preferred approaches to
learning; and on the side of the teaching context, the nature of the content being taught,
methods of teaching and assessment, the institutional climate and procedures, and so on.
These factors interact to determine the on-going approach to a particular task, which in turn
determines the outcome. However, as the reversible arrows show, each such factor affects
every other factor, so that for instance the student’s preferred approach will adjust to the
particular context and course being taught, and to the success or otherwise of the outcome.
The heart of the teaching/learning system is at the process level, where the learning-
related activity produces or does not produce the desired outcomes. As Shuell puts it:
If students are to learn desired outcomes in a reasonably effective manner, then the teacher's
fundamental task is to get students to engage in learning activities that are likely to result in
their achieving those outcomes. It is important to remember that what the student does is more
important than what the teacher does. (Shuell 1986: 429)
A generic way of describing “what the student does” is precisely in terms of their ongoing
approaches to learning. There are many possible interactions between student perceptions and
teaching demands here. A student who typically picks out likely items for assessment and rote
learns them, finds that strategy won’t work under portfolio assessment, so goes deep. Another
student, who normally interacts deeply, may decide to go surface in a module that is
overloaded with content and assessed by MCQ. Indeed the generic aim of good teaching is
precisely to encourage students to adopt a deep approach and to discourage the use of a
surface approach (Biggs, 1989). Here, the mean of the approaches of the students in a class
gives an index of the quality of the teaching in that class, so we may then refer to one outcome
of teaching as a “contextual approach to learning.”
It is therefore quite inappropriate to categorise students as “surface” or “deep”
learners on the basis of SPQ responses, as if an approach score measured a stable trait of the
individual. SPQ responses are a function of both individual characteristics and the teaching
context. Both teacher and student are jointly responsible for the outcome, the teacher for
structuring the enabling conditions, the learner for engaging them. Thus, an approach to
learning describes the nature of the relationship between student, context, and task.
Approach scores on the SPQ are thus in marked contrast both to scores on learning or
cognitive style instruments, which are designed specifically to be insensitive to context (Biggs,
1993a, in press), and to the constructs accessed by questionnaires deriving from information
processing theory (Biggs, 1993a; Moreno & DiVesta, 1991). Given that, it is astonishing to
read that “the SPQ appears … as if it were a learning styles instrument, measuring some
characteristic of the person rather than their relation with the object of learning.” (Bowden &
Marton, 1999: 60). This statement either reveals a profound misunderstanding of systems
theory, or a cavalier disregard for the published work on the SPQ (Biggs, 1987a, 1993a),
particularly that relating to its extensive use in a before-after mode to assess the impact of
innovations (e.g. Kember, Charlesworth, Davies, McKay, & Stott, 1997). Be that as it may,
the fact that it has been made at all suggests we should be careful in clarifying how the SPQ
can be used to characterise teaching contexts.
In sum, SPQ scores can be quality indicators at presage, process and product levels, as
referring to preferred, ongoing, and contextual approaches to learning (see Figure 1):
- at the presage level, they may describe how individuals differ within a given teaching
  context (preferred approach);
- at the process level, they may describe how specific tasks are handled (ongoing approach);
- at the product level, they may describe how teaching contexts differ from each other
  (contextual approach).
In the first case, the preferred approach tells us the extent to which an individual
differs from other students in a similar context. This is the case where norms are appropriate,
so that a student’s motive, strategy and/or approach scores may be compared to the scores of
others in the same cohort. That is, we are interested in variability between individuals in a
given context.
In the second case, the on-going approach would be obtained by requiring individuals
to respond to items reworded to suit a specific task, as did Tang (1991) in connection with
different assessment tasks. In this way, ongoing approach scores tell us how a particular task
was handled.
In the last case, the contextual approach is assessed by calculating class or sub-group
means, so that differences between means tell us how different classes or teaching contexts
differ. We might compare different classes, different institutions even, or before-after mean
scores after the introduction of an intervention in the same class. It is contextual approaches
on which we would like to concentrate here. Contextual approach scores can tell us when the
system is working (when motives, strategies and approaches are predominantly deep), and
when it is not working (when motives, strategies and approaches are predominantly surface).
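To make the contextual use concrete, here is a minimal sketch of a before-after comparison of class-mean Deep Approach scores (in Python with hypothetical data; Python is not used in the paper, whose analyses were run in SPSS and EQS):

```python
# Hypothetical before-after comparison of class-mean Deep Approach (DA)
# scores; each entry is one student's 10-item DA score (range 10-50).
import numpy as np
from scipy import stats

da_before = np.array([28, 31, 25, 30, 27, 33, 29, 26, 32, 30])
da_after = np.array([34, 36, 30, 35, 31, 38, 33, 32, 37, 34])

print(f"class mean DA before: {da_before.mean():.1f}")
print(f"class mean DA after:  {da_after.mean():.1f}")

# Independent-samples t-test on the two cohorts; a paired test would be
# used instead if the same students responded on both occasions.
t, p = stats.ttest_ind(da_after, da_before)
print(f"t = {t:.2f}, p = {p:.4f}")
```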
In an ideal system, all students would be expected to engage the highest level learning
activities and thus to handle the task, or to solve the problem, appropriately. This is in fact the
generic definition of a deep approach, while a student using a surface approach would use
lower order verbs in lieu of the higher order. The following illustrates this clearly:
I hate to say it, but what you have got to do is to have a list of ‘facts’; you write down ten
important points and memorise those, then you’ll do all right in the test ... If you can give a bit
of factual information -- so and so did that, and concluded that -- for two sides of writing, then
you’ll get a good mark.
A psychology undergraduate, quoted in Ramsden (1984: 144)
It is unlikely that the teacher of this student thought that an adequate understanding of
psychology could be manifested by selectively memorising. Rather, an inappropriate
assessment task allowed the students to get a good mark on the basis of memorising facts. As
it happened, this particular student wrote brilliant essays, and later graduated with First Class
Honours. The problem is therefore not that this student is irredeemably cursed with a surface
“style”, but that under current conditions of teaching and assessment, he made a strategic
decision that a surface approach would see him through this task. As indeed it did.
Teaching and assessment methods often encourage a surface approach when they are
not aligned to the aims of teaching the subject, as in the above case. The presence of a surface
approach signals that something is out of kilter in our teaching or in our assessment methods,
but that it is something we can hope to address. The approaches that prevail tell us something
about the quality of the teaching environment. Thus questionnaires like the SPQ can be useful
for evaluating teaching environments (Biggs, 1993b; Kember, Charlesworth, Davies, McKay,
& Stott, 1997), and are often more sensitive when reworded for a particular subject (Eley,
1992), or assessment tasks (Tang, 1991; Thomas & Bain, 1984).
A particularly depressing finding is that most students in most undergraduate courses
become increasingly surface and decreasingly deep in their orientation to learning (Biggs,
1987a; Gow & Kember 1990; Watkins & Hattie, 1985). There are however exceptions;
students with aspirations for graduate study do not show this pattern in their chosen area of
study (Biggs, 1987a), nor do students taught using problem based learning, who become
increasingly deep, and less surface, in their orientations (Newble & Clarke, 1986). For most
undergraduate students, however, something is happening as they progress that is increasingly
supporting the use of lower cognitive level activities, which is of course the opposite of what
is intended by a university education (Gow & Kember, 1990). One might call it the
“institutionalisation” of learning, whereby students tend to pick up the tricks that get you by,
such as “memorising the ten most important points” (see above).
The role of the achieving-related scales
In using the SPQ as a means of monitoring teaching/learning environments, the role of the
achieving-related scales is not as evident as that of the deep and surface scales. In fact, the
achieving motive and strategy had a different relationship with the deep and surface motives
and strategies from the outset (Biggs, 1978). Whereas deep and surface strategies describe the
way students engage the task itself, the achieving strategy refers to how the student organises
when and where the task will be engaged, and for how long. Higher order factor analyses
usually associate the achieving motive and strategy with the deep approach (Biggs, 1987a),
but depending on the subjects and teaching conditions, sometimes achieving-related scores
load on the surface approach (Biggs & Kirby, 1984). Indeed, Kember and Leung (1998) have
shown, using confirmatory factor analysis, that the SPQ can most conveniently be described in
terms of two factors: deep and surface, with achieving motive and strategy sub-scales aligning
themselves on both factors. The confirmatory factor analysis of LPQ data by Wong, Lin, and
Watkins (1996) could also be interpreted as consistent with this finding.
To summarise, then, there appears to be a need for a shorter two-factor version of the
SPQ, addressing deep and surface approaches only, that can be administered quickly and easily
by a regular teacher, for use in monitoring teaching contexts. Such uses might include:
1. teachers monitoring their teaching from class to class, or following some innovation in
teaching or assessment in an action research design.
2. an outcome measure of teaching in more formally structured research.
3. suggesting to staff developers where teachers or departments may need help.
4. diagnosis of students with study problems, by comparing individuals’ deep and surface
scores and comparing individuals to others in the same cohort.
5. examining the relationship of approaches to learning with other curriculum variables with a
view to fine-tuning curricula based on the insights obtained.
6. quality assurance exercises in much the same way as the Course Experience Questionnaire
is used. In this last case institutions would keep their own norms but they would be used
on a class or departmental basis, not on the basis of an individual student.
Developing the R-SPQ-2F
In developing a new version of the SPQ, the strategy was to start with a reasonably large pool
of items which would be reduced through testing to a smaller set which had the best fit to the
projected two factor model. In devising the pool of items, the original items from the deep and
surface scales of the SPQ were examined in the light of the insights below. Some were
included in their original form and others were re-worded.
Guidelines for revising items
Firstly, some items needed re-wording to update the terminology. Higher education has
undergone a major transformation since the original questionnaire was developed so it was
inevitable that some items needed adapting.
The questionnaire was also developed before the insights into approaches to learning
gained from the intensive study of the approaches of Asian students (Kember, 1996; Watkins
& Biggs, 1996). For this simple two-factor version of the SPQ the intention was not to
develop scales which fully characterised the possible combinations of understanding and
memorising. The work, though, was utilised to ensure that the deep and surface approach
items were consistent with the clearer descriptions which had emerged from this body of
work.
The other important insight was concerned with a better understanding of extrinsic
motivation, which had contributed to the original surface motive scale. Kember, Wong, &
Leung (1999) argued that there was evidence that courses which provided a good career
preparation provided very positive motivation, which was entirely compatible with intrinsic
motivation. Wording of items intended to characterise surface motivation needed to reflect a
minimising tendency rather than this career motivation. It was also clear that the original
surface motivation sub-scale consisted of sub-components measuring fear of failure and the
desire for a qualification for the sake of obtaining a well-paid job.
Reduction of items
The revision of existing items and the development of new ones ultimately resulted in 43 items
for testing. These were combined in random order into a single questionnaire. Students were
asked to respond to the questions on a 5-point Likert scale ranging from ‘always true of me’ to
‘only rarely true of me’. A sample of health science students from a university in Hong Kong
were asked to complete the questionnaire. A total of 229 usable questionnaires were returned,
with a high return rate since the questionnaires were handed out for completion in class.
Two statistical tests were used to determine which items to delete and which to retain.
The Reliability procedure of SPSS (Norusis, 1986) produces useful statistics following a test
of the reliability of items specified as forming a hypothesised scale. The procedure calculates a
Cronbach Alpha coefficient for the scale and, more importantly for our purposes, indicates the
alpha for the scale if an item were deleted. The inter-item correlation matrix also provides
useful information about the degree to which an item can form part of a coherent scale.
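The two statistics just described can be illustrated with a minimal numpy sketch (an illustration only; the paper used the SPSS Reliability procedure, and the simulated responses below are not real data):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of scale total)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def alpha_if_deleted(items: np.ndarray) -> list[float]:
    """Alpha recomputed with each item dropped in turn; an item whose
    removal raises alpha is a candidate for deletion from the scale."""
    return [cronbach_alpha(np.delete(items, j, axis=1))
            for j in range(items.shape[1])]

# Simulated responses for one 5-item scale: a common factor plus noise,
# rounded onto the 1-5 Likert range (229 students, as in the first sample).
rng = np.random.default_rng(0)
factor = rng.normal(size=(229, 1))
items = np.clip(np.round(3 + factor + rng.normal(scale=0.8, size=(229, 5))), 1, 5)

print(f"alpha = {cronbach_alpha(items):.2f}")
print(["%.2f" % a for a in alpha_if_deleted(items)])
```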
The more powerful test was through using the EQS program (Bentler, 1995) in a
confirmatory factor analysis mode. The initial test was of the original 43 items fitting to a
model with deep and surface approaches, each with strategy and motive sub-scales. The test of
the model produces a regression coefficient and an error term showing the degree of
relationship between each item and its appropriate latent variable or factor. Items which make
the most useful contribution to a scale have relatively high regression coefficients and low
error terms. Malfitting parameters in the model were identified by means of
multivariate Lagrange Multiplier (LM) and Wald Tests accomplished in EQS. The LM tests
provide information to identify those parameters (both paths and covariances) which when
added to the hypothesised model result in a significant drop in the model χ2 value. The Wald
tests help in assessing the statistical significance of the parameter estimates. Hence, the LM
tests tell us when to add new paths, and the Wald tests when to delete old paths. As items
are removed from the model, the goodness of fit of the new overall model
can also be estimated by a number of goodness of fit indicators (see discussion below).
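The chi-square difference logic behind these decisions can be sketched as follows, with hypothetical chi-square values (EQS computes the multivariate LM and Wald tests internally; this shows only the univariate intuition):

```python
# Freeing one parameter costs one degree of freedom, and the resulting
# drop in the model chi-square is itself approximately chi-square
# distributed with 1 df. The values below are hypothetical.
from scipy import stats

chi2_restricted, chi2_freed = 412.3, 396.8
drop = chi2_restricted - chi2_freed
p = stats.chi2.sf(drop, df=1)
print(f"chi-square drop = {drop:.1f}, p = {p:.4f}")
# A small p indicates the freed parameter significantly improves fit;
# conversely, a parameter whose removal barely raises chi-square (the
# Wald logic) can be deleted without significant loss of fit.
```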
An encouraging indication of the robustness and appropriateness of these procedures
was that there was broad concurrence between the two quite different approaches. The
process of trial and revision through reduction of items was repeated for two cycles. At each
stage the questionnaire was further revised by deleting items which did not contribute to a
component. The outcome of this exercise in reducing items was two factors, deep and surface,
each with ten items. Within each of these two factors it was possible to distinguish strategy
and motive sub-scales. Each of the sub-scales consisted of five items. The final version of the
questionnaire therefore has two main scales, Deep Approach (DA) and Surface Approach (SA),
with four sub-scales: Deep Motive (DM), Deep Strategy (DS), Surface Motive (SM), and
Surface Strategy (SS) (see Appendix A).
Testing the new version of the SPQ
The final version of the revised questionnaire was then tested with a sample of 495
undergraduate students from various disciplines across each year of study from one university
in Hong Kong. The dimensionalities of the four components were examined by confirmatory
factor analysis to check whether items contributed to the intended component. The Cronbach
alpha values for each component were then computed to determine the scale and sub-scale
reliabilities. The results from the final test of the questionnaire are given in detail below.
Reliability and unidimensionality of sub-scales
The unidimensionality of each of the sub-scales was separately tested by fitting a single factor
model to the corresponding five items by the EQS program (Bentler, 1995). The goodness-of-
fit of the model to the data can be assessed by many fit indexes with conventionally accepted
cut-off criteria (see, for example, Bentler, 1990; Hoyle & Panter, 1995; Marsh & Hau, 1996).
However, Hu & Bentler (1999) recently found that, in practice, the ‘rule of thumb’
conventions for cut-off criteria were inadequate in evaluating model fit, suggesting instead the
use of a two index presentation strategy, using the standardised root mean squared residual
(SRMR) supplemented with one other fit index. From their simulations, the new strategy can
better control for both Type I and Type II errors. Here, following the recommendations of
Hu and Bentler (1999), the comparative fit index (CFI) and the SRMR were
chosen for this study. A CFI value greater than 0.95, and SRMR less than .08 can be used as
an indication of a relatively good fit between the hypothesised model and the observed data.
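Both indices follow from standard definitions (Bentler, 1990; Hu & Bentler, 1999), so they can be rendered in a short numpy sketch; the values reported below were produced by EQS, not by this code:

```python
import numpy as np

def cfi(chi2_m: float, df_m: int, chi2_0: float, df_0: int) -> float:
    """Comparative fit index: 1 minus the ratio of the model's
    non-centrality to that of the baseline (independence) model."""
    d_m = max(chi2_m - df_m, 0.0)
    d_0 = max(chi2_0 - df_0, d_m)
    return 1.0 - d_m / d_0 if d_0 > 0 else 1.0

def srmr(S: np.ndarray, Sigma: np.ndarray) -> float:
    """Standardised root mean squared residual: root mean square of the
    standardised differences between the sample covariance matrix S and
    the model-implied matrix Sigma, over the unique (lower) elements."""
    sd = np.sqrt(np.diag(S))
    resid = (S - Sigma) / np.outer(sd, sd)
    lower = resid[np.tril_indices(S.shape[0])]
    return float(np.sqrt(np.mean(lower ** 2)))

# By the criteria adopted here, a model is judged to fit relatively well
# when cfi(...) > 0.95 and srmr(...) < 0.08.
```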
The results of separately testing each of the sub-scales are shown in Table 2. The single
factor models for the four sub-scales fitted the observed data well, hence we can conclude
that the items of each of the four sub-scales are unidimensional. Once the homogeneity of
the items has been established, we can use the Cronbach alpha to
determine the sub-scales’ reliability (Schmitt, 1996). Cronbach alpha values for each sub-scale
in the instrument were computed and are given in Table 2. The values all reach acceptable
levels indicating that the sub-scales can be interpreted as internally consistent.
It is expected that most routine users will compute the scores for both Deep and
Surface Approaches by summing up the corresponding 10 items. Thus, it would be useful to
provide the reliabilities of the two latent constructs for our sample. The Cronbach alpha values
are 0.73 for DA and 0.64 for SA in the sample, which are considered acceptable.
Insert Table 2 about here
Hypothesised models
Two models were tested to show different aspects of the questionnaire. The first (Model 1)
looks at the structure of the complete instrument from the items level. The model consists of
the four sub-scales formulated as latent constructs with their corresponding five items as
indicators. The relationship between the motive and strategy sub-scales is shown as a
correlation, which in structural equation modelling terms is a more general case than
combining them into a single higher order factor (Rindskopf & Rose, 1988). These two higher
order factors are then hypothesised as being negatively correlated since deep and surface
approaches are envisaged as dichotomous. The hypothesised model is shown in Figure 2,
though to avoid duplication, the diagram includes the standardised path estimates and error
terms resulting from the testing of the model.
Model 2, shown in Figure 3, concentrates upon testing the dimensionality of the whole
instrument so treats the sub-scales as indicators of two latent factors, namely Deep Approach
(DA), and Surface Approach (SA). DA has two indicators, dm and ds while SA has sm and ss
as indicators. The four indicators, dm, ds, sm and ss, were created by summing the
corresponding five items. Note that the four indicators are now observed variables, instead of
latent variables, and hence they are labelled in lower case to make them
distinguishable from the latent constructs used in Model 1. As with Model 1 the relationship
between DA and SA was hypothesised as a negative correlation.
The two models are complementary, in that they examine alternative aspects of the
same relationship. The first should show whether the individual items conform to the expected
pattern. The second more clearly tests the anticipated dimensionality.
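For readers wishing to replicate them, the two models can be written in lavaan-style model syntax. The sketch below uses the open-source Python package semopy as an assumed stand-in for EQS (the paper's analyses were run in EQS); the item-to-sub-scale key follows the published R-SPQ-2F scoring, and df is assumed to hold columns q1-q20 plus, for Model 2, the summed sub-scale scores dm, ds, sm and ss:

```python
import pandas as pd
from semopy import Model, calc_stats  # assumed stand-in for EQS

# Model 1: four item-level latent constructs; covariances among the
# exogenous latent factors are estimated by default.
MODEL_1 = """
DM =~ q1 + q5 + q9 + q13 + q17
DS =~ q2 + q6 + q10 + q14 + q18
SM =~ q3 + q7 + q11 + q15 + q19
SS =~ q4 + q8 + q12 + q16 + q20
"""

# Model 2: two latent factors with the summed sub-scales as indicators,
# hypothesised to be negatively correlated.
MODEL_2 = """
DA =~ dm + ds
SA =~ sm + ss
"""

def fit_and_report(description: str, df: pd.DataFrame) -> None:
    model = Model(description)
    model.fit(df)
    print(calc_stats(model).T)  # fit indices, CFI among them
```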
Insert Figures 2 & 3 about here
The goodness-of-fit of the two hypothesised models was tested with confirmatory factor
analysis using the EQS program (Bentler, 1995). Table 3 gives the corresponding covariance
matrix used in the analysis for Model 2 (the matrix used for Model 1 is omitted to save
space but is available on request). The two indexes, CFI and SRMR, were reported for both
Models 1 and 2, based on the robust estimation method provided by EQS for model
evaluation.
Model 1
The result with standardised paths for Model 1 is illustrated in Figure 2. For Model 1, CFI =
0.904 and SRMR = 0.058, which indicate quite a reasonable fit to the data given the
complexity of the model; however, there is a possibility of a Type II error (Hu & Bentler,
1999).
All the paths from the constructs to the items were statistically significant at the 5%
level. The standardised path coefficients range from 0.31 to 0.63 suggesting that the items are
good indicators of the four constructs. The latent constructs DM and DS are positively
correlated as expected. The correlation of 0.93 suggests that the two constructs are similar,
but the motive and strategy components of an approach have always been envisaged as
intimately related. The high correlations within both motive-strategy pairs are also
reassuring in view of the anticipated use of the questionnaire in its simplest form as just two
deep and surface approach scales.
Model 2
The results for Model 2 are presented in Figure 3 and an adequate fit to the data is suggested
by the values of the indexes, CFI = 0.992 and SRMR = 0.015. Note that two cases were
considered outliers and were deleted from the analysis as they had very large multivariate
kurtosis values relative to the other cases (Byrne, 1994). After deleting the two cases, both the
CFI and SRMR values remained nearly unchanged while the path estimates changed
substantially.
All paths from the latent constructs to the indicators were statistically significant, and
the two higher order constructs, DA and SA, were negatively related as expected.
Conclusion
This article has presented an argument for re-developing the SPQ into a simple two factor
version which teachers can use in a number of ways to evaluate the learning environment in
their own classrooms. The development process commenced with the testing of 43 items: some
taken directly from the original SPQ, some modified from SPQ items, and some newly written. The process of drawing
up this pool of items was guided by insights into approaches to learning established since the
original version was devised.
A process of testing and refinement resulted in a final version with deep and surface
approach scales. Each of these scales consists of ten items so the questionnaire is short and
simple enough for use by teachers. At the same time, the rigorous testing described in this
article shows that the final version of the questionnaire has very good psychometric properties.
We envisage that most routine users will use the questionnaire in its two
factor, deep and surface approach, form, as these are the indicators most pertinent
to its intended use by teachers in their classrooms. The two main factors, though, do have
clearly identified motive and strategy sub-components which may be of interest to some
researchers. From a theoretical viewpoint it is also reassuring to see that the development and
testing of the questionnaire confirmed the vision of an approach as consisting of congruent
motive and strategy components.
The principal motivation for the re-development of the instrument was our
commitment to teachers researching the learning environment in their own classrooms (Biggs,
1999; Kember, 2000). We believe that the most effective way of ensuring high quality teaching
and learning is for teachers to take responsibility for ensuring that assessment and other
contextual elements in the teaching and learning system are constructively aligned to promote
deep approaches to learning. We believe that this revised two factor version of the SPQ will
be an ideal tool for teachers to use in evaluating and researching their own classrooms.
The Revised-SPQ-2F is printed in full in Appendix A. Readers are invited to use it for
genuine research purposes, the one condition being that they acknowledge the source as the
present paper.
References
Bentler, P.M. (1990). Comparative fit indexes in structural models. Psychological Bulletin,
107, 238-246.
Bentler, P.M. (1995). EQS Structural Equations Program Manual. Encino, CA: Multivariate
Software.
Biggs, J.B. (1976). Dimensions of study behaviour: Another look at ATI. British Journal of
Educational Psychology, 46, 68-80.
Biggs, J.B. (1978). Individual and group differences in study processes. British Journal of
Educational Psychology, 48, 266-279.
Biggs, J. B. (1987a). Student Approaches to Learning and Studying. Camberwell, Vic.:
Australian Council for Educational Research.
Biggs, J.B. (1987b). The Study Process Questionnaire (SPQ): Manual. Hawthorn, Vic.:
Australian Council for Educational Research.
Biggs, J.B. (1987c). The Learning Process Questionnaire (LPQ): Manual. Hawthorn, Vic.:
Australian Council for Educational Research.
Biggs, J.B. (1989). Approaches to the enhancement of tertiary teaching. Higher Education
Research and Development, 8, 7-25.
Biggs, J. (1992). Why and how do Hong Kong students learn? Using the Learning and Study
Process Questionnaires. Hong Kong: Hong Kong University.
Biggs, J.B. (1993a). What do inventories of students' learning processes really measure? A
theoretical review and clarification. British Journal of Educational Psychology, 63, 1-
17.
Biggs, J. (1993b). From theory to practice: A cognitive systems approach. Higher Education
Research and Development, 12, 73-86.
Biggs, J.B. (1999). Teaching for quality learning at university. Buckingham: The Open
University Press.
Biggs, J.B. (in press). Enhancing learning: A matter of style or approach? In R.J. Sternberg
and L.F. Zhang (Eds.). Perspectives on thinking, learning, and cognitive styles.
Mahwah: Lawrence Erlbaum.
Biggs, J.B. & Kirby, J. (1984). Differentiation of learning processes within ability groups.
Educational Psychology, 4, 21-39.
Bowden, J., & Marton, F. (1999). The university of learning: Beyond quality and
competence. London: Kogan Page.
Byrne, B. M. (1994). Structural Equation Modeling with EQS and EQS/Windows: Basic
Concepts, Applications, and Programming. Thousand Oaks CA: Sage.
Dart, B. & Boulton-Lewis, G. (eds.) (1998). Teaching and learning in higher education.
Camberwell: Australian Council for Educational Research.
Eley, M. G. (1992). Differential adoption of study approaches within individual students.
Higher Education, 23, 231-254.
Entwistle, N., & Ramsden, P. (1983). Understanding student learning. London: Croom Helm.
Entwistle, N. & Waterston, S. (1988). Approaches to studying and levels of processing in
university students. British Journal of Educational Psychology, 58, 258-265.
Gow, L. & Kember, D. (1990). Does higher education promote independent learning? Higher
Education, 19, 307-322.
Hoyle, R.H. & Panter, A.T. (1995). Writing about structural equation models. In R. Hoyle
(Ed), Structural Equation Modeling. Thousand Oaks, CA: Sage.
Hu, L. & Bentler, P.M. (1999). Cutoff criteria for fit indexes in covariance structure analysis:
Conventional criteria versus new alternatives. Structural Equation Modeling, 6, 1-55.
Kember, D. (1996). The intention to both memorise and understand: Another approach to
learning? Higher Education, 31, 341-351.
Kember, D. (2000). Action learning and action research: Improving the quality of teaching
and learning. London: Kogan Page.
Kember, D., Charlesworth, M., Davies, H., McKay, J., & Stott, V. (1997). Evaluating the
effectiveness of educational innovations: Using the Study Process Questionnaire to show
that meaningful learning occurs. Studies in Educational Evaluation, 23(2), 141-157.
Kember, D. & Leung, D.Y.P. (1998). The dimensionality of approaches to learning: an
investigation with confirmatory factor analysis on the structure of the SPQ and LPQ.
British Journal of Educational Psychology, 68, 395-407.
Kember, D., Wong, A., & Leung, D. Y. P. (1999). Reconsidering the dimensions of
approaches to learning. British Journal of Educational Psychology, 69, 323-343.
Marsh, H.W. & Hau, K.T. (1996). Assessing goodness-of-fit: Is parsimony always desirable?
The Journal of Experimental Education, 64, 364-390.
Marton, F. (1981). Phenomenography - Describing conceptions of the world around us.
Instructional Science, 10, 177-200.
Marton, F., & Säljö, R. (1976a). On qualitative differences in learning -- I: Outcome and process.
British Journal of Educational Psychology, 46, 4-11.
Marton, F., & Säljö, R. (1976b). On qualitative differences in learning -- II: Outcome as a
function of the learner’s conception of the task. British Journal of Educational
Psychology, 46, 115-127.
Moreno, V., & DiVesta F. (1991). Cross-cultural comparisons of study habits. Journal of
Educational Psychology, 83, 231-239.
Newble, D., & Clarke, R.M. (1986). The approaches to learning of students in a traditional
and in a problem-based medical school. Medical Education, 20, 267-273.
Norusis, M. (1986). SPSS Guide to Data Analysis for SPSS-X with Additional Instructions
for SPSS/PC+. Chicago, IL: SPSS Inc.
Prosser, M. & Trigwell, K. (1998). Teaching for learning in higher education. Buckingham:
Open University Press.
Ramsden, P. (1984). The context of learning. In F. Marton, D. Hounsell and N. Entwistle
(Eds.), The experience of learning (pp. 144-164). Edinburgh: Scottish Universities
Press.
Rindskopf, D., & Rose, T. (1988). Some theory and applications of confirmatory second-
order factor analysis. Multivariate Behavioral Research, 23, 51-67.
Schmeck R., Geisler-Brenstein E., & Cercy, S. (1991). The Revised Inventory of Learning
Processes. Educational Psychology, 11, 343-362.
Schmitt, N. (1996). Uses and abuses of coefficient alpha. Psychological Assessment, 8, 350-
353.
Shuell, T.J. (1986). Cognitive conceptions of learning. Review of Educational Research, 56,
411-436.
Tang, C. (1991). Effects of different assessment methods on tertiary students' approaches to
studying. Unpublished Ph.D. Dissertation, University of Hong Kong.
Thomas, P. R. & Bain, J.D. (1984). Contextual dependence of learning approaches: The
effects of assessments. Human Learning, 3, 227-240.
Watkins, D. & Biggs, J.B. (eds.) (1996). The Chinese learner: Cultural, psychological and
contextual influences. Melbourne and Hong Kong: Australian Council for Educational
Research and the Comparative Education Research Centre, University of Hong Kong.
Watkins, D.A., & Hattie, J. (1985). A longitudinal study of the approach to learning of
Australian tertiary students. Human Learning, 4(2), 127-142.
Weinstein, C., Schulte, A., & Palmer, D. (1987). Learning and Study Strategies Inventory
(LASSI). Clearwater, FL: H&H Publications.
Wong, N. Y., Lin, W. Y., & Watkins, D. (1996). Cross-cultural validation of models of
approaches to learning: An application of confirmatory factor analysis. Educational
Psychology, 16, 317-327.
Table 1: The original Study Process Questionnaire: Dimensions, Motives and Strategies

            Surface                     Deep                  Achieving
Motive      fear of failure             intrinsic interest    achievement
Strategy    narrow target, rote learn   maximise meaning      effective use of space and time
Table 2: Unidimensionality and reliability check for the four sub-scales
Sub-scales CFI SRMR Alpha
Deep Motive (DM) 0.997 0.01 0.62
Deep Strategy (DS) 0.998 0.02 0.63
Surface Motive (SM) 0.988 0.02 0.72
Surface Strategy (SS) 0.998 0.02 0.57
Note: CFI = comparative fit index, SRMR = standardised root mean squared residual, Alpha =
Cronbach alpha.
Table 3: Covariance matrix used in the analysis for Model 2

        dm      ds      sm      ss
dm    8.567
ds    4.963   8.669
sm   -1.740  -1.033  12.376
ss   -1.001  -1.159   5.090   9.004
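Rescaling the Table 3 covariances to correlations makes the sign pattern easier to read; a short numpy sketch:

```python
import numpy as np

# The Table 3 covariance matrix, in the order dm, ds, sm, ss.
cov = np.array([
    [ 8.567,  4.963, -1.740, -1.001],
    [ 4.963,  8.669, -1.033, -1.159],
    [-1.740, -1.033, 12.376,  5.090],
    [-1.001, -1.159,  5.090,  9.004],
])
sd = np.sqrt(np.diag(cov))
corr = cov / np.outer(sd, sd)
print(np.round(corr, 2))
# The deep indicators correlate positively with each other (r(dm, ds) ~ 0.58)
# and negatively with the surface indicators (e.g. r(dm, sm) ~ -0.17),
# consistent with the hypothesised negative DA-SA correlation.
```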
Figure 1. The “3P” model of teaching and learning
[Diagram of three interacting stages, with reversible arrows between all components:]
PRESAGE -- Student factors: prior knowledge, ability, preferred approaches to learning;
Teaching context: objectives, assessment, climate/ethos, teaching, institutional procedures.
PROCESS -- Learning-focused activities: ongoing and contextual approaches to learning.
PRODUCT -- Learning outcomes: quantitative (facts, skills); qualitative (structure, transfer).
Figure 2: Model 1 -- the four item-level latent constructs (DM, DS, SM, SS), each with its
five items as indicators, showing standardised path estimates and error terms.
Figure 3: Model 2 -- the latent factors Deep Approach (DA) and Surface Approach (SA), with
the summed sub-scales dm, ds, sm and ss as indicators.
Appendix A: The Revised Study Process Questionnaire (R-SPQ-2F)
This questionnaire has a number of questions about your attitudes towards your studies and
your usual way of studying.
There is no right way of studying. It depends on what suits your own style and the course you
are studying. It is accordingly important that you answer each question as honestly as you can.
If you think your answer to a question would depend on the subject being studied, give the
answer that would apply to the subject(s) most important to you.
Please fill in the appropriate circle alongside the question number on the “General Purpose
Survey/Answer Sheet”. The letters alongside each number stand for the following responses:
A — this item is never or only rarely true of me
B — this item is sometimes true of me
C — this item is true of me about half the time
D — this item is frequently true of me
E — this item is always or almost always true of me
Please choose the one most appropriate response to each question. Fill the oval on the Answer
Sheet that best fits your immediate reaction. Do not spend a long time on each item: your first
reaction is probably the best one. Please answer each item.
Do not worry about projecting a good image. Your answers are CONFIDENTIAL.
Thank you for your cooperation.
1. I find that at times studying gives me a feeling of deep personal satisfaction.
2. I find that I have to do enough work on a topic so that I can form my own conclusions
before I am satisfied.
3. My aim is to pass the course while doing as little work as possible.
4. I only study seriously what’s given out in class or in the course outlines.
5. I feel that virtually any topic can be highly interesting once I get into it.
6. I find most new topics interesting and often spend extra time trying to obtain more
information about them.
7. I do not find my course very interesting so I keep my work to the minimum.
8. I learn some things by rote, going over and over them until I know them by heart even
if I do not understand them.
9. I find that studying academic topics can at times be as exciting as a good novel or
movie.
10. I test myself on important topics until I understand them completely.
11. I find I can get by in most assessments by memorising key sections rather than trying to
understand them.
12. I generally restrict my study to what is specifically set as I think it is unnecessary to do
anything extra.
13. I work hard at my studies because I find the material interesting.
14. I spend a lot of my free time finding out more about interesting topics which have been
discussed in different classes.
15. I find it is not helpful to study topics in depth. It confuses and wastes time, when all you
need is a passing acquaintance with topics.
16. I believe that lecturers shouldn’t expect students to spend significant amounts of time
studying material everyone knows won’t be examined.
17. I come to most classes with questions in mind that I want answering.
18. I make a point of looking at most of the suggested readings that go with the lectures.
19. I see no point in learning material which is not likely to be in the examination.
20. I find the best way to pass examinations is to try to remember answers to likely
questions.
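Scoring the completed questionnaire is a matter of coding A-E as 1-5 and summing. The following sketch (not part of the original instrument) implements this, with the item-to-sub-scale key following the published R-SPQ-2F scoring:

```python
# Responses A-E are coded 1-5; each sub-scale sums five items and each
# approach score sums its two congruent sub-scales (range 10-50).
LIKERT = {"A": 1, "B": 2, "C": 3, "D": 4, "E": 5}

KEY = {
    "DM": [1, 5, 9, 13, 17],   # Deep Motive
    "DS": [2, 6, 10, 14, 18],  # Deep Strategy
    "SM": [3, 7, 11, 15, 19],  # Surface Motive
    "SS": [4, 8, 12, 16, 20],  # Surface Strategy
}

def score(responses: dict[int, str]) -> dict[str, int]:
    """responses maps item number (1-20) to a letter A-E."""
    scores = {scale: sum(LIKERT[responses[i]] for i in items)
              for scale, items in KEY.items()}
    scores["DA"] = scores["DM"] + scores["DS"]  # Deep Approach
    scores["SA"] = scores["SM"] + scores["SS"]  # Surface Approach
    return scores

# A student answering "C" to every item scores 15 on each sub-scale
# and 30 on each approach scale.
print(score({i: "C" for i in range(1, 21)}))
```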