JOURNAL OF GEOSCIENCE EDUCATION
Undergraduate persistence in an open-access, online, scientific computing training is influenced by expectancy, value, and cost

Michael Hubenthal (a)* and Michael Brudzinski (b)

(a) Incorporated Research Institutions for Seismology, Education and Public Outreach Program, Washington, DC, USA; (b) Department of Geology and Environmental Earth Science, Miami University, Oxford, Ohio, USA
ABSTRACT
During 2020 a free, 3-month-long, certificate-granting, online workshop was offered to increase
undergraduates' knowledge, skills, self-efficacy, and interest in observational seismology and
scientific computing. Registrants comprised 760 upper-level undergraduates from across science
and math majors, representing 60 countries. Performance data revealed 58% of registrants
initiated the workshop by completing at least one assignment, and 30% successfully completed
the full workshop. These rates were higher than most comparable large-scale, open-access
courses. This study explores factors associated with persistence by examining registration,
performance, and pre/post survey data through the lens of expectancy-value theory (EVT).
Results indicate that EVT constructs show moderate to weak associations with initiation, including
intrinsic (p<.001), utility (p<.001), and attainment value (p=.24), and expectancy (e.g., completion
intentions [p<.001], prior computing experience [p<.001], seismology research experience [p=.01],
seismology self-efficacy [p=.84]). Two expectancy factors, academic major (p=.050) and prior
geophysics coursework (p<.001), showed weak associations with completion. Demographic factors
such as race, ethnicity, and gender were not associated with initiation or completion, but primary
language showed an association with attainment and intrinsic values. Dissuading factors, or costs,
were also captured through surveys. The few responses from non-completers limited the analysis.
However, an examination of acute and accumulated costs suggests they may influence persistence,
especially at completion. Based on these results, interventions are proposed to enhance persistence
in future workshops. These include increasing motivational messaging, tailoring course structure to
mitigate avoidable costs, and, where possible, increasing communication to set time and effort
expectations and showcase prior participants overcoming costs.

ARTICLE HISTORY
Received 22 October 2023; Revised 11 September 2024; Accepted 9 October 2024

KEYWORDS
Motivation; Expectancy Value Theory; MOOC; Scientific Computing; Seismology

CONTACT Michael Hubenthal, Michael.Hubenthal@EarthScope.org, EarthScope Consortium, Engagement Program, 1200 New York Avenue NW, Suite 454, Washington, DC 20005, USA.
*Now at EarthScope Consortium, Engagement Program.
Supplemental data for this article can be accessed online at https://doi.org/10.1080/10899995.2024.2416365.

© 2024 The Author(s). Published with license by Taylor & Francis Group, LLC. This is an Open Access article distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives License (http://creativecommons.org/licenses/by-nc-nd/4.0/), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original work is properly cited, and is not altered, transformed, or built upon in any way. The terms on which this article has been published allow the posting of the Accepted Manuscript in a repository by the author(s) or with their consent.
Introduction and motivation
The process of fulfilling undergraduate curricular require-
ments and transitioning to Science, Technology, Engineering,
or Math (STEM) graduate programs to develop research skills
and practices plays an essential role in positioning the United
States as a leading force in the world’s economy by solving
challenges facing the nation (NASEM, 2018). The knowledge
and skills of incoming graduate students can vary widely,
especially quantitative (e.g., NASEM, 2020) and computational
skills. As a result, there have been repeated calls for students
to develop computational aptitudes across STEM disciplines
to harness the data revolution (NSF, 2007, 2011, 2017). Within
the geosciences specifically, computational skills, including
data management and data analytics, were identified as being
of critical importance to geoscience employers
(Mosher & Keane, 2021). Although recent literature on
computational thinking recognizes there is still no single defi-
nition for computational thinking (e.g., Lockwood & Mooney,
2018; Tang et al., 2020), we find merit in Weintrop et al.’s
computing-based conception (2016) which emphasizes the
combination of data practices, modeling and simulation prac-
tices, computational problem-solving practices, and systems
thinking practices. However, the most effective way to develop
these skills within a discipline is not well understood, espe-
cially at the undergraduate level. Within the geosciences, there
is no clear roadmap for all undergraduates to develop this
expertise, nor is there a generally accepted threshold for what
is adequate skill development (Mosher & Keane, 2021).
Some undergraduates gain such skills through participa-
tion in research opportunities, such as the National Science
Foundation (NSF) funded Research Experiences for
Undergraduates (REU) in the US, which have been shown
to develop a range of research skills including data analysis,
interpretation, and other technical skills (e.g., Bauer &
Bennett, 2003; Junge et al., 2010; Thiry et al., 2012). Such
opportunities are limited in number and highly competi-
tive with selectivity rates for the geosciences around 10%
(Walters et al., 2016). Others may develop scientific comput-
ing skills within coursework. However, this can be challeng-
ing for students as science courses may prioritize the learning
of disciplinary content over the development of scien-
tific computing and data analysis skills within a discipline
(Weintrop et al., 2016). Even when these skills are included,
the vertical alignment and scaffolding of skills across an
entire undergraduate program, in courses taught by multiple
faculty, seems unlikely. Alternatively, some undergraduates
pursue these skills through a computer science course. While
these teach the basics of programming, this is rarely within
the framework of geoscience data analysis. The combination
of a lack of intention and structured access may contribute
to Mosher and Keane’s conclusion that computer program-
ming and data analytics skills may be difficult to get in the
regular course of study (2021). Thus, many students cur-
rently in the geoscience pipeline could benefit from supple-
mental, discipline-based training opportunities in scientific
computing, offered outside of the academy.
The role of the SAGE facility in the seismology learning
ecosystem
As in most STEM fields, institutions of higher learning are
the primary pillars of the seismology learning ecosystem.
However, research facilities such as the NSF’s Seismological
Facility for the Advancement of Geoscience (SAGE),
operated by the Incorporated Research Institutions for
Seismology (IRIS), have taken on an increasing role through
their Education and Outreach Programs. As directed by the
seismology community, the SAGE facility has provided
instruction directly to students where appropriate for the
advancement of the field. For example, IRIS has facilitated sum-
mer research internships for more than 250 undergraduates
since 1998 (Hubenthal, 2019), hosted a technical webinar
series for advanced students and faculty reaching nearly
7000 viewers annually, and has run numerous workshops
and trainings for undergraduate and graduate students
(Taber et al., 2015). Such direct-to-student instruction
empowers students to become the next generation of facility
users and extends the impact of the NSF-funded facility by
enhancing understanding of the facility and the facilitation
of the latest tools and approaches to search for, request, and
use seismic data.
A key example of this external, direct-to-student instruc-
tion has been a series of 3- to 5-day short courses offered
from 2009 to 2017 (IRIS, 2021). Alternating annually between introduc-
tory and advanced graduate student audiences, the short
courses pushed the community to explore new ways to use
very large datasets effectively and embrace the quantity and
complexity of the data resulting from an unprecedented, con-
tinental scale experiment (Aderhold et al., 2021). Most short
courses employed “traditional instruction” consisting of lec-
tures to introduce the seismic computing topic and key oper-
ations, followed by a series of step-by-step coding
instructions. However, prompted by a desire to offer something
informed by recent educational research, coauthor Brudzinski
developed an initial set of tutorial-based, active e-learning
modules (Sit & Brudzinski, 2017) to deliver basic scientific
computing training. The interactive assignments used a learn-
ing management system (LMS) to provide instructions for
performing scientific computing tasks that were regularly eval-
uated using embedded questions that also delivered real-time
feedback and guidance for incorrect responses. Simultaneously,
these modules were also piloted and then integrated as a sta-
ple of the IRIS summer research internship program.
In 2020, the COVID-19 pandemic created large disrup-
tions to most educational systems. This disruption extended
to REU programs (Sloan et al., 2020), including IRIS’s that
uses virtual tools to collaborate and build cohorts at multi-
ple sites across the country (Hubenthal & Judge, 2013). This
left a tremendous gap in the seismology educational ecosys-
tem impacting both graduate and undergraduate learners
and was identified as a critical need during IRIS’s (Hubenthal
et al., 2020) and the American Geophysical Union seismol-
ogy sections’ (Dugick et al., 2021) pandemic response.
To address this need, IRIS staff time and financial
resources were reallocated to quickly ramp-up the develop-
ment and delivery of a free Seismology Skill Building
Workshop (SSBW). The goals of the SSBW were to increase
undergraduates’ knowledge, skills, and interest in observa-
tional seismology and scientific computing, self-efficacy in
using seismic data, and competitiveness in the application
process for graduate school, summer internships, or profes-
sional jobs (for a full workshop overview see Brudzinski
et al., 2021). Despite their infrequent use in the geosciences
(Mosher & Keane, 2021), the authors employed a Massive
Open Online Course (MOOC) design.
A MOOC is a course made available over the Internet
without charge to a very large number of people. Since
anyone who decides to take a MOOC simply registers,
MOOCs have a tremendous potential for impact and
global reach. For example, the World Economic Forum
reports that a global audience of 92 million users registered
for Coursera, one of the most popular MOOC providers,
though its courses are now offered as low-cost rather than
free options (Wood, 2022). Although the SSBW is smaller in
scale, 760 registrants from 60 countries were empowered
in 2020 to learn scientific computing within a seismology
context at a flexible pace, and without financial restric-
tions associated with participation in a traditional course
(Brudzinski et al., 2021).
MOOCs have tremendous potential, but they are also
notorious for having low completion rates (e.g., Cagiltay
et al., 2020; Jordan, 2015; Ho et al., 2014; Khalil & Ebner,
2014). For example, in a study of the MITx MOOCs,
Cagiltay et al. (2020) found completion rates between 2%
and 4% depending on the course subject matter. Therefore,
understanding student motivation and how it may
evolve from registration to initiation to completion is
important. Student motivation may have been particularly
important in the case of the 2020 SSBW, as 25% of regis-
trants completed all assignments and 29% completed at least
80% of the assignments (Brudzinski et al., 2021); rates well
above those of typical MOOCs.
Theoretical framework
Previous studies have identified student motivation as one
of the main factors that can impact MOOC completion
rates (e.g., Abdullatif & Velázquez-Iturbide, 2020; Badali
et al., 2022; Chaw & Tang, 2019; Maya-Jariego et al., 2020).
However, it is still unclear exactly which types of motiva-
tional factors influence learners’ academic choices to per-
sist in MOOCs (Badali et al., 2022). For example, Badali
et al. (2022) found that academic motives play an import-
ant role in MOOC retention. Maya-Jariego et al. (2020)
reported the intensity of initial motivation was positively
related to the satisfaction and perceived quality of MOOC
experience. Luik and Lepp (2021) found a statistically sig-
nificant higher ratio of completers to non-completers in
the opportunity motivated (e.g., those who felt the course
was the only possibility or suitable opportunity to study an
interesting topic), success motivated (e.g., those who felt
the course provided knowledge they needed to be suc-
cessful), and interest motivated (e.g., those who were just
generally interested in the topic of the course) groups of
learners. Abdullatif and Velázquez-Iturbide (2020) found
that internal motivation strongly affects intention to con-
tinue with MOOCs. Chaw and Tang (2019) showed that
believing and having confidence in one’s ability to perform
well, valuing learning for its usefulness, importance, and
relevance, and learning to solve problems and develop
skills were positive motivations that promote an
increased tendency to complete MOOCs.
As illustrated above, much of this research has focused
solely on students’ positive motivational values. The expec-
tancy value theory (EVT) provides a useful lens for examin-
ing such achievement choices (e.g., persisting in a course
that is free and not offered for credit after registering)
because it allows for negative motivational values, such as
perceived cost, which has been less well studied in MOOCs
(Wei et al., 2024). However, very few studies have applied
this approach to MOOCs previously.
EVT is a multidimensional approach to examining stu-
dents’ motivations, which the theory suggests are determined
primarily by students’ expectancies for success on the task
and the extent to which they value the task (e.g., Eccles
et al., 1983; Wigfield et al., 2016; Wigfield & Eccles, 2000,
Eccles & Wigfield, 2002) (Figure 1). In turn, many research-
ers have shown that students’ expectancy-related beliefs and
task values predict their performance and course-taking in
STEM fields (e.g., Meece et al., 1990; Musu-Gillette et al.,
2015; Simpkins et al., 2006).
Expectancy refers to students’ current beliefs in their abil-
ity to be successful at a task and students’ beliefs about their
abilities to continue to be successful in the future (Eccles &
Wigfield, 2002). Expectancy is similar to the concept of self-efficacy
(e.g., Bandura, 1997; Pajares, 1996). However, expectancies
generally refer to broad abilities or domains (e.g., Math),
while self-efficacy can generally be thought of as referring to
a specific task or skill within a domain (Hulleman et al.,
2016). Expectancy is often considered together with students’
current ability to complete a task to create a broader con-
struct of competence-related beliefs (e.g., Simpkins et al.,
2006; Wigfield et al., 1997).
Values, on the other hand, describe a student’s percep-
tion of the task’s merit that might motivate learners to
engage with the tasks (Eccles & Wigfield, 2002). These
values can be described across three major categories: 1)
the intrinsic value or the enjoyment or satisfaction a
learner anticipates that they will experience from engaging
with a task (Wigfield & Eccles, 2000), 2) the utility value
or the perceived usefulness or importance of a task on a
learner’s current or future goals (Wigfield & Eccles, 2000),
and 3) the attainment value or the perceived importance
of the task to the learner's self-concept or identity (Eccles
& Wigfield, 2002).
In addition to these values, Wigfield and Eccles (2000)
also describe a collection of factors that might dissuade a
learner from engaging in tasks. These dissuading factors,
referred to as “costs”, include perceived emotional or psycho-
logical consequences, such as the possibility of failure at the
activity, the effort of doing a task, and the consideration of
what students are not able to do because of their participa-
tion in the activity (Wigfield et al., 2017). In some cases,
such as the consideration of what students are not able to
do because of their participation, these costs may be antici-
pated by students. However, in other cases, such as perceived
emotional or psychological consequences, or unexpected life
events occurring (e.g., personal or family illness), they may
also be unanticipated.
Research questions
This study seeks to understand the following context related
questions with the goal of using the results to redesign or
develop new interventions to improve future outcomes.
• What motivational factors are associated with stu-
dents’ decisions to continue past registration and ini-
tiate the 2020 workshop?
• What motivational factors are associated with stu-
dents’ completion of the 2020 workshop?
Figure 1. Diagram illustrating concepts of expectancy value theory in a gener-
alized association to the SSBW, where expectancy is a student’s perception of
ability to complete tasks, value is perception of task merit, and cost is percep-
tion of consequences for participating, some of which may be unanticipated.
Study population and setting
The SSBW did not seek to recruit participants from a geo-
graphic region, so this free workshop opportunity was adver-
tised to students and faculty through a mix of electronic
mailing lists that reach both US-based and international
audiences. Examples of such lists include those offered by
IRIS, American Geophysical Union, National Association of
Geoscience Teachers, and the IRIS social media channels.
Announcements were re-shared and forwarded, but the
extent to which this happened is not fully known.
The workshop was not offered by an academic institu-
tion, so no academic credit was offered to students for com-
pletion of the workshop. Instead, a detailed, module-by-module
performance report was offered at the conclusion of the
workshop (Brudzinski etal., 2021). Such completion reports
or certificates are common in noncredit education as they
provide information students may need when seeking credit at their own
institutions (Clark, 2005; D’Amico et al., 2020) or recogni-
tion on a resume or professional portfolio.
The 2020 SSBW ran from June 1 to August 31 with an
expected student time investment of 5-6 h/week. The work-
shop was divided into seven 2-week-long modules, each
with two synchronous hour-long webinars recorded for
asynchronous viewing. Webinars introduced seismological
and computational concepts while also emphasizing how a
seismologist might think about and approach the dataset or
methodology. Additional participant-instructor and participant-
participant interaction occurred through a regularly used
Slack workspace.
Each module consisted of 5-7 interactive assignments
constructed using a tutorial-based, active e-learning approach
(Sit & Brudzinski, 2017) (Figure 2) that introduced a spec-
trum of observational seismology concepts likely to be
encountered in graduate school, integrated with computa-
tional skills. Assignments provided participants with step-by-
step instructions and justifications for performing real-world
scientific computing tasks on a Linux virtual machine (VM).
Participants’ understanding of these tasks, their applications
to seismological concepts, and their numeracy and spatial
reasoning skills when interpreting code output were assessed
using questions embedded regularly throughout each
assignment. Feedback for incorrect responses encouraged
reflective thinking and reinforced learning. These assign-
ments were delivered through the Moodle LMS, while scien-
tific computing occurred locally via a VM with pre-installed
software that participants downloaded and installed on their
computers. The VM was a critical element that ensured a
common operating environment necessary for instructors to
anticipate the exact products and errors that might be pro-
duced by students as they worked.
As reported by Brudzinski et al. (2021), the workshop
enrolled 760 undergraduate STEM majors (including recent
graduates ∼1 year out from graduation) residing in 60
countries. Registrants were primarily Geology (n = 246),
Geophysics (n = 161), and Earth Science (n = 137) majors.
However, 30% came from other STEM fields such as
Engineering, Physics, and Computer Science. Just under half
of the pre-survey respondents (47%) described their gender
as female and 2% described their gender as nonbinary. Ages
ranged from 19 to 66 years old with 66% falling within the
“traditional student” range of 19 to 23 years of age. Further,
41% of respondents identified as a race or ethnicity that has
been traditionally underrepresented in the geosciences within
the United States. A majority (69%) were graduating within
a year of the workshop or had recently graduated. Few had
previously conducted research in seismology or geophysics
(15%). Participants’ experience with the workshop software
was generally limited (e.g., Linux/Unix (14%), Python (28%),
SAC (5%)). Over 62% of pre-survey respondents had previ-
ously taken at least one online course, and 66% of those
indicated that they would recommend an online course to
other students.
In the evaluation of the workshop, Brudzinski et al.
(2021) concluded that of the 760 registrants, 440 completed
at least one assignment, 224 completed at least 80% of the
assignments, and 191 completed all 35 assignments. They
also found that participants who completed all assignments
showed normalized gains in scientific computing skills at
>60%, interest in seismology, scientific computing, and grad-
uate school at 47%, and perceived preparedness to apply to
graduate school in seismology and to seek employment at
37%. Importantly, they noted that the workshop may have
provided more equitable access to the field of seismology.
The percentages of females (61%) and underrepresented
minorities (20%) completing the workshop were greater than
those receiving geoscience undergraduate degrees annually
in the United States (Gonzales & Keane, 2020).

Figure 2. The SSBW employed a tutorial-based, active e-learning approach where interactive assignments within the LMS provided instructions for performing scientific computing tasks that were regularly evaluated using embedded questions. The design sought to inspire critical thinking when evaluating coding outputs and when responding to seismological questions. Participants were prepped for the active e-learning with webinars and introductory reading. Modified from Brudzinski et al. (2021).
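For context, the normalized gain reported in the preceding paragraph is conventionally computed as the fraction of the possible improvement realized between pre- and post-assessment; a common formulation (an assumption here, since the excerpt does not define it) is:

$$ g = \frac{\text{post} - \text{pre}}{\text{maximum score} - \text{pre}} $$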
Research design
This project was not originally designed around EVT, as it
was developed in response to the urgent demands of the
COVID-19 pandemic. Instead, the
data was collected by conducting a contextual evaluation via
participant survey to identify and diagnose problems or bar-
riers which might inhibit attainment of course goals and
objectives (Stufflebeam & Shinkfield, 2007). This evaluative
process was intended to facilitate continuous enhancements
to the course structure in future years.
However, upon obtaining the survey data, the project nat-
urally evolved toward investigating the factors influencing
student behavior within a MOOC environment, with a focus
on attrition and retention. This shift in focus prompted the
formulation of research questions aimed at unraveling the
determinants of participant engagement. Consequently, a
correlational research design was employed to use a subset
of the data collected during the contextual evaluation to
assess participant motivation. This methodological choice
was necessitated by the challenges associated with identify-
ing and controlling influencing variables in a non-experimental
setting (Privitera, 2014). Through the implementation of the
correlational research design, this study explores potential
associations between demographic characteristics and moti-
vational factors, as interpreted through the framework of the
EVT, and the persistence of registrants within the course.
The authors of this study served as the instructors for the
SSBW in addition to researchers on this project. The lead
author also identifies as a science teacher educator with
functional coding skills and very limited seismic data analy-
sis experience, while the second author identifies as a geo-
science educator and seismologist with expert coding and
seismic data analysis skills. These backgrounds led the
authors to approach this research with a keen desire to
enable student success to the greatest extent possible.
Participation in the research was optional for students, and
the solicitation to participate was separated from the instruc-
tors. Once collected, the data was also anonymized prior to
analysis. To enhance the trustworthiness of the interpreta-
tion of data collected the authors employed three strategies.
First, they engaged in ongoing reflection and critical
self-awareness regarding their own positionality. Next, they
sought out peer debriefing by presenting preliminary results
and interpretations to colleagues at informal meetings and
conferences. To respond to this project’s research questions
regarding students’ motivations, a subset of the 2020 SSBW
evaluation data sources was used (Brudzinski et al., 2021).
These included registration form data, pre- and post-survey
data, as well as performance data captured by the LMS. All
students completed the registration form. The pre-survey
was administered to all workshop registrants and included
an item measuring participants' "Completion Plans," which
have been shown to serve as a benchmark for completion of
other MOOC-style courses (Reich, 2014). The workshop
post-survey provided information about students’ percep-
tions of and satisfaction with the workshop, challenges they
faced in completing the workshop, perceptions of changes in
skills and interest in scientific computing in seismology, as
well as future intentions related to scientific computing in
seismology. Three different versions of a post-survey were
administered to participants depending on the degree to
which the participant completed the workshop. Response
rates for each of these data sources are described in Table 1.
Items constructed to evaluate the SSBW were assessed for
face validity by sharing them with other education profes-
sionals familiar with seismology and scientific computing.
Items were also piloted with potential users. Pilots were only
completed with US-based users, as the large international
participation in the course was not anticipated. All surveys
were in English, matching the course materials, and
averaged a Flesch Reading Ease score of 51, which falls
within the "fairly hard to read" category and correlates to
a tenth- to twelfth-grade reading level (Flesch, 1981).
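For reference, the Flesch Reading Ease score is computed from average sentence length and average syllables per word; the standard formula (supplied here for context, not taken from the article) is:

$$ \mathrm{FRE} = 206.835 - 1.015\left(\frac{\text{total words}}{\text{total sentences}}\right) - 84.6\left(\frac{\text{total syllables}}{\text{total words}}\right) $$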
Due to the rapid development and rollout of the work-
shop in response to the pandemic, items on the survey were
not intentionally developed with EVT in mind.
Rather, the evaluation items were developed based on the
authors’ experience working with students in similar settings
and academic programs, and a desire to provide useful
information to support continuous improvement of the
workshop in future years. For example, participant registra-
tion forms provided information about academic majors,
anticipated graduation dates, students' prior experience with
geophysics and seismology including coursework, prior
research experiences, and self-reported experience with sci-
entific computing. Demographic information, students' rea-
sons for registering for the workshop, prior experience with
online learning, parents' highest degree, citizenship status,
self-reported English proficiency, and students' self-efficacy
related to scientific computing in seismology were collected
from items on the workshop's pre-survey. However, during
the data analysis phase after the 2020 SSBW, these data were
organized using EVT as a lens for understanding student
motivation. The following broad constructs that describe
registrants were ultimately created: demographics, expec-
tancy, and value (Table 2).

Table 1. Data sources used in the evaluation and the response rate to each. Additional details in supplemental materials.

Registration Data: Basic information (e.g., academic institution, major, perceived incoming skills, etc.); n = 760; response rate 100%.
Pre-Survey: Demographic and background information (e.g., gender, race, ethnicity, parental education, previous online course experience, English proficiency, enrollment reasons, etc.); n = 336; response rate 44.2%.
Post-Survey (Completers): Perceptions of the SSBW and the impact of the workshop on students; n = 84; response rate 77.1%.
Post-Survey (Partial): Perceptions of the SSBW and reasons they did not complete the workshop; n = 84; response rate 46.7%.
Post-Survey (Non-starters): Reasons they did not start the workshop; n = 26; response rate 8.0%.
Learning Management System: Performance data including time on assignments, number of attempts used, accuracy, completion, etc.; n = All; response rate NA.
Where applicable, multiple survey items measuring con-
cepts that are conceptually connected were summed to cre-
ate a single sum score for each factor. This approach was
employed as the use of multiple items helps to average out
errors and specificities that are inherent in single items, thus
leading to increased reliability and construct validity
(DeVellis, 2003). While statistical tests for internal consis-
tency might be expected, the repurposing of an evaluation
survey resulted in only a small number of items per multi-
dimensional construct with each measuring a related but
distinct aspect. For example, participants’ self-reported abili-
ties to use each of the five scientific computing tools (i.e.,
Linux, GMT, Python, Jupyter Notebooks, and Matlab) were
summed to create a holistic scientific computing experience
score. Similarly, scores were calculated for Utility, Attainment,
and Intrinsic value factors by assigning a + 1 for each reason
that registrants selected from a list for participating in the
workshop (see supplemental materials for full description).
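As a minimal sketch of this scoring, assuming hypothetical column names (the paper's actual survey export format is not shown), the sum scores might be computed in R, the language used for the study's statistical tests, as follows:

```r
# Minimal sketch of the sum-score construction; column names are hypothetical.
survey <- data.frame(
  # Self-reported ability with each tool (e.g., 0 = none ... 3 = high)
  linux = c(1, 3, 2), gmt = c(0, 2, 1), python = c(2, 3, 1),
  jupyter = c(1, 2, 0), matlab = c(0, 1, 2),
  # Registration reasons, coded 1 if selected (two intrinsic-value items shown)
  reason_enjoyment = c(1, 0, 1), reason_interest = c(1, 1, 0)
)

# Holistic scientific computing experience score: sum of the five tool ratings
survey$computing_score <- rowSums(survey[, c("linux", "gmt", "python",
                                             "jupyter", "matlab")])

# Intrinsic value score: +1 for each intrinsic reason selected (max = 2);
# the utility (max = 5) and attainment (max = 2) scores are built the same way
survey$intrinsic_score <- rowSums(survey[, c("reason_enjoyment",
                                             "reason_interest")])
```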
Considering the similarities of the SSBW to a MOOC,
this study was designed to identify factors that may have
motivated registrants at two key points in the workshop
completion process. The first was whether a student con-
verted from registering for the workshop, a relatively low
effort task, to initiating work on the workshop materials.
Initiation was determined by examining data from the LMS.
Registrants who logged into the LMS and completed at least
one question in an assignment were considered to have ini-
tiated the workshop. Those who either never logged in or
logged in but did not attempt any questions were considered
non-initiators.
The second was whether those that initiated the work-
shop persisted to successfully complete it. Completion was
measured for all participants who initiated the course.
Completers were defined as those who initiated the course
and successfully completed 100% of the workshop assign-
ments. Those who initiated the workshop but did not reach
this completion threshold were categorized as non-completers.
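Expressed as a small R sketch (the LMS export columns here are hypothetical), these two outcome definitions amount to:

```r
# Sketch of the two binary outcomes; LMS export columns are hypothetical.
lms <- data.frame(
  logged_in           = c(TRUE, TRUE, FALSE, TRUE),
  questions_attempted = c(12, 0, 0, 350),
  assignments_done    = c(3, 0, 0, 35)   # out of the 35 workshop assignments
)

# Initiators: logged into the LMS and completed at least one embedded question
lms$initiated <- lms$logged_in & lms$questions_attempted >= 1

# Completers: initiated and successfully finished 100% of the assignments;
# initiators below this threshold are non-completers
lms$completed <- lms$initiated & lms$assignments_done == 35
```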
A range of factors collected were compared to these two
binary outcomes (e.g., “Did not Initiate/Initiated” and “Did
not complete/Completed”). Each factor and its levels were
then tested for possible association with each of the out-
comes using a Chi-squared test of independence or Fisher's
exact test, both in R, as appropriate for the data. Factors
with a significance at, at least, the .05 level were identified
and their effect size was determined using Cramér’s V
(Cramér, 1946). Effect sizes were then interpreted depending
on the degrees of freedom as either weak, moderate, or
strong (Cohen, 1988) and reported. While this single variate
analysis is simplistic, it was deemed a reasonable starting
point since the data collection tools were not designed with
EVT in mind.
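A minimal sketch of this testing procedure in R (with illustrative data; the study's actual analysis scripts are not published with the article) might look like:

```r
# Illustrative association test for one factor against one binary outcome.
factor_level <- factor(c("Low", "Low", "Medium", "High", "High",
                         "Medium", "Low", "High", "Medium", "Low"))
initiated    <- factor(c("No", "No", "Yes", "Yes", "Yes",
                         "Yes", "No", "Yes", "No", "Yes"))
tab <- table(factor_level, initiated)

# Chi-squared test of independence; fall back to Fisher's exact test when
# expected cell counts are too small for the chi-squared approximation
test <- suppressWarnings(chisq.test(tab))
if (any(test$expected < 5)) test <- fisher.test(tab)
print(test)

# Cramer's V effect size: sqrt(chi2 / (N * (min(rows, cols) - 1))),
# interpreted as weak/moderate/strong depending on degrees of freedom
cramers_v <- function(tab) {
  chi2 <- suppressWarnings(chisq.test(tab, correct = FALSE))$statistic
  unname(sqrt(chi2 / (sum(tab) * (min(dim(tab)) - 1))))
}
cramers_v(tab)
```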
To estimate costs in the context of EVT, we used a ques-
tion asking for reasons students did not initiate or complete
the workshop. This offered a priori choices based on the
authors' interactions with students who dropped out along
with an open-response option “Other, please describe”. This
pairing allowed unanticipated themes, not present in
close-ended choices, to be identified. Data from the
open-response option was analyzed by both authors using a
thematic analysis approach (Braun & Clarke, 2006). The lead
author used a recursive process where responses were read
and organized around major themes that emerged from the
data and represented participants’ experiences without losing
the detailed nuance of the individual responses. To enhance
dependability, themes and responses were iteratively reviewed
with the second author throughout the coding process.
When disagreement occurred, the authors worked to refine
and revise the constructs that underpinned each code and
tested the efficacy of revisions by exploring the inclusion
and exclusion criteria across multiple text samples. Any
responses that were deemed ambiguous or nonsensical were
placed in a miscellaneous category.
Results
Converting from registration to initiation
Testing the demographic factors found that registrants’
underrepresented minority status, gender, and US citizenship
did not associate with starting the workshop (Figure 3).
Table 2. Organization of evaluation data based on the Expectancy Value Theory (EVT).

Demographics: US Citizenship; Primary Language; Familiarity w/ English; Age; Gender; URM Status; Parent's Degree; Anticipated Graduation Year.
Expectancy: Intention to Complete; Academic Major; GPA; Perceived Computing Ability; Would Recommend Online Course (If taken); Prior Seismology/Geophysics Research Experience; Prior Research Experience (Other); Seismology Self-Efficacy; Prior Seismology/Geophysics Course.
Value: Utility; Intrinsic; Attainment.
This was a particularly important finding given that the
SSBW was designed in part to increase access to seismology
and scientific computing training. English as one’s primary
language also did not associate with starting the workshop.
However, one’s familiarity with the English language did
have a statistically significant, weak association. Registrants
who reported being ‘Not at All Familiar’ with the English
language initiated less frequently than those who were
‘Somewhat Familiar’ or ‘Extremely Familiar’ with the English
language. Three other demographic factors also showed a
statistically significant weak association with starting the
workshop (Figure 3). Students who were one year or three
years from graduation started the workshop at a greater rate
than those who were graduating, two years from graduation,
or had already graduated. Traditional students and those
students whose parents or guardians had earned an under-
graduate degree were also more likely to begin than nontra-
ditional students or those whose parents did not earn an
undergraduate degree.
Results from the expectancy factors revealed a statistically
significant association with initiation for five of the nine fac-
tors we identified (Figure 4). Prior experience conducting
seismology research did show a statistically significant weak
association with starting the workshop. Registrants’ academic
majors showed a similar pattern of association with
geophysics and geoscience majors being slightly more likely
to not start the workshop than non-geoscience majors. Three
expectancy factors showed a statistically significant moderate
association with initiation: completion plans, scientific com-
puting experience, and whether they would recommend an
online course (for those who had taken an online course
previously). Specifically, students who planned to complete
some or all of the workshop were more likely to start the
workshop than those who were undecided at the time of
registration. Similarly, registrants reporting medium or high
scientific computing skills at the outset were also more likely
to start the workshop than those with low scientific comput-
ing skills. The remaining factors (grade point average (GPA),
prior geophysics course work, prior research experience in
fields other than seismology and geophysics, and a seismol-
ogy self-efficacy score) did not show any statistically signif-
icant association between starting and not starting the
workshop (Figure 4).
Value was estimated based on students being asked to
select the reasons why they registered for the workshop
from a list of nine items. Reasons were sorted as utility
value, intrinsic value, and attainment value (see supple-
mental materials for full description). The number of
items selected within each type were tabulated for each
student and then analyzed for association with whether
registrants initiated or did not initiate the workshop
(Figure 5). Results indicate that for all value types, selecting
a greater number of items was associated with a greater
percentage of registrants initiating the workshop.
However, these findings were only statistically significant
for utility value and intrinsic value, not for attainment
value. The effect size was strong for utility value and
moderate for intrinsic value.

Figure 3. Associations between workshop registrants' demographic factors and their initiation (light shading) or not (dark shading) in the workshop as measured by first assignment completion. The number of participants for each factor is illustrated on the y-axis. Factors with a statistically significant association to registrants' initiation at the ≤ .05 level are highlighted in bold with the strength of association φc shown.
Figure 4. Associations between factors that align with expectancy, or students' current and future beliefs in their ability to be successful at a task, and whether registrants initiated the workshop (light shading) or not (dark shading), as measured by first assignment completion. The number of participants for each factor is illustrated on the y-axis. Factors with a statistically significant association to registrants' initiation at the ≤ .05 level are highlighted in bold, with the strength of association φc shown.
Figure 5. Students were asked to select reasons they registered for the workshop from a list. These reasons were grouped by value type: utility value (max = 5), intrinsic value (max = 2), and attainment value (max = 2). Shown is the association between value types and whether registrants initiated the workshop (light shading) or not (dark shading), as measured by first assignment completion. The number of participants for each factor is illustrated on the y-axis. Value types with a statistically significant association to registrants' initiation at the ≤ .05 level are highlighted in bold with the strength of association φc shown.
Converting from initiation to completion
Demographic factors for students who had started the work-
shop were also tested for association with successful com-
pletion of the workshop. Most demographic factors, including
gender, race and ethnicity, student age (e.g., traditional vs
nontraditional), English familiarity, graduation year, and par-
ent/guardian educational level did not show any association
with completion of the workshop (Figure 6). However, the
remaining two factors, citizenship and primary language,
did show a weak association with completion of the work-
shop. Interestingly, non-US citizen participants were more
likely to complete the workshop than their US citizen peers
despite the course being offered from the U.S. Similarly, par-
ticipants whose primary language was not English were
more likely to complete the workshop than their peers
whose primary language was English. The relationship
between citizenship and primary language factors was exam-
ined using a chi-square test of independence. The relation
between these variables was significant, Χ²(1, N = 133) =
61.807, p < .001.
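This reported statistic can be reproduced in R from the completer counts later shown in Table 3 (a sketch; the 2 × 2 layout is inferred from that table):

```r
# Completers cross-classified by citizenship and primary language (Table 3)
completers <- matrix(c(12, 64,    # Non-US: English, Other
                       49,  8),   # US:     English, Other
                     nrow = 2, byrow = TRUE,
                     dimnames = list(citizenship = c("Non-US", "US"),
                                     language    = c("English", "Other")))

# chisq.test() applies Yates' continuity correction to 2x2 tables by default,
# which reproduces the reported X-squared = 61.807, df = 1, p < .001
chisq.test(completers)
```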
To further probe the interaction between Citizenship and
Primary Language, we constructed Table 3 showing the
conversion rates, as percentages, from initiation in the work-
shop to completion of the workshop. As shown below,
non-US citizens converted from initiators to completers at
the highest rates, regardless of primary language. However,
US citizens whose primary language was not English con-
verted at a higher rate than US citizens whose primary lan-
guage was English, and that rate was similar to the higher
conversion rates of non-US citizens. This indicates the pri-
mary difference was lower conversion for US participants
with English as their primary language compared to the
other participant groups.
EVT offers possible explanations for the two correlated
demographic factors, citizenship and primary language,
that demonstrated an association with completion (Figure
6; Table 3). For example, students from outside the US
may perceive the value of the workshop differently than
US participants. To test this, the value scores of US citizen
completers were compared to those of non-US citizen
completers. As shown in Table 4, non-US citizens reported higher
attainment value scores than their US citizen peers (i.e.,
28% with a score of 2 for non-US citizens versus 5% for US
citizens), while the US citizens reported lower attainment
value scores (i.e., 44% with a score of 0 for US citizens ver-
sus 28% for non-US citizens). While more subtle, the pat-
tern for intrinsic value scores was reversed, with
proportionally more US citizens reporting the highest
intrinsic value score compared to their non-US citizen
peers (i.e., 60% with a score of 2 for US citizens versus
49% for non-US citizens). The reported utility value scores
tracked closely between these two groups. Since US citi-
zens whose primary language was not English also con-
verted at a higher rate than their US citizen peers whose
primary language was English (Table 3), their value scores
were also compared (Table 5) and followed similar pat-
terns to those found between non-US citizens and US cit-
izens (Table 4). For example, US citizen completers whose
primary language was English reported lower attainment
value scores and higher intrinsic value scores than their
US citizen peers whose primary language was something
other than English. Again, the reported utility values for the
two groups were similar.

Figure 6. Associations between demographic factors of students who initiated the workshop and their successful completion of the workshop (light shading) or not (dark shading). The number of participants for each factor is illustrated on the y-axis. Factors with a statistically significant association to completion at the ≤ .05 level are highlighted in bold with the strength of association φc shown.

Table 3. Comparison of conversion rates from initiation to completion based on participants' citizenship and primary language.

                 Primary language: English          Primary language: Other
                 Completers  Initiators  Conv. %    Completers  Initiators  Conv. %
Non-US citizens      12          19        63%          64          98        65%
US citizens          49         112        43%           8          14        57%
Next, factors which contribute to students’ expectancies
were examined with respect to completion. Most factors did
not show a significant association with successfully complet-
ing the workshop including completion plans, GPA, prior
research experience, perceptions of online courses, scientific
computing experience, and seismology self-efficacy (Figure
7). However, two factors did demonstrate a weak association
with completion and are conceptually related. Participants
who were majoring in geophysics were more likely to com-
plete the workshop than their peers who were geoscience
majors or non-geoscience majors. Similarly, participants who
had previously completed a course in geophysics were more
likely to successfully complete the workshop than their peers
who had not previously completed a geophysics course.
Finally, participants’ utility value, intrinsic value, and
attainment value scores were analyzed for association with
completion of the workshop. As illustrated in Figure 8, none
of the value factors showed a statistically significant associa-
tion with completion. However, considered collectively,
utility value scores of 3-5 were associated with more
completion than noncompletion, while scores of 0-2 showed
the reverse, though this was not a statistically significant
relationship.
Figure 7. Associations between factors that align with expectancy, or students' current and future beliefs in their ability to be successful at a task, and whether students who initiated the workshop successfully completed the workshop (light shading) or not (dark shading). The number of participants for each factor is illustrated on the y-axis. Factors with a statistically significant association to completion at the ≤ .05 level are highlighted in bold, with the strength of association φc shown.
Reasons for not initiating and/or completing the
workshop
In addition to examining factors that had been identified as
possible sources of motivation for students to initiate and
complete the workshop, we also explored costs, or factors
that might have negatively impacted students’ motivation
and persistence. We start with students who did not initiate,
focusing on the question asking why they did not initiate.
Not surprisingly, the participation rate for this population
was extremely low with only six responses received from
individuals out of the 26 total who had consented to partic-
ipate and abandoned the course. The most frequently cited
reasons included “The course required more time than I was
able to dedicate”, and “Personal reasons”. Each was identified
by three of the six respondents. The next most frequently
cited reason was “I encountered too many technical difficul-
ties with the software”. This was cited by two of the six par-
ticipants. Six other reasons were identified by at least one of
the participants. These included the following reasons:
• The course did not align with the reasons I registered for the course.
• The English language of the course was challenging.
• The course required more effort than I anticipated.
• There was not enough support for my learning in the course.
• There was not enough interaction with the course instructors.
• There was not enough interaction with other course participants.
None of the respondents wrote in their own additional
reasons for not starting the course.
A similar question was asked of participants who con-
sented to participate and completed some but not all the
workshop (n = 84). This group was asked to select one or
more reasons why they did not complete the workshop from
the same list (Figure 9) or write in additional reasons.
Following a similar pattern, the two most cited reasons were
that “The course required more time than I was able to ded-
icate”, and “Personal reasons”. The former was cited by nearly
two-thirds of the respondents, while the latter was cited by
just over half of the respondents. Twelve additional reasons
were each cited by at least one participant. However, none
were cited by more than 17% of the respondents.
Unlike the previous group, 41 participants wrote in their
own reasons that they did not complete the workshop. In
most cases (n = 19), these provided additional clarification to
existing items they had also selected. For example, one
respondent who selected that the course required more time
than they were able to dedicate, also used the open-response
option to clarify that they had “Started a full-time, in-person
internship”. Other respondents, however, did use the field to
communicate reasons that were not included in the original
list. Two new reasons, each with multiple respondents, were
identified from the remaining data and indicated with an
asterisk in Figure 9. These included finding it difficult to
catch up once they had fallen behind, and technical chal-
lenges beyond the workshop, such as loss of internet access
or a laptop crash. The remaining five responses were added
to a separate “Other” category that contained one-off or
nonsensical responses. For example, a one-off response
included "I'd recommend the number of quizzes lesser in
each module", while a nonsensical response included "I
should have completed the course".

Table 4. Number and percent of completers with the different value scores, based on their citizenship.

       Attainment Value       Intrinsic Value        Utility Value
Score  Non-US     US          Non-US     US          Non-US     US
0      21 (28%)   25 (44%)    11 (14%)   6 (10%)     0 (0%)     0 (0%)
1      34 (45%)   29 (51%)    28 (37%)   17 (29%)    1 (1%)     0 (0%)
2      21 (28%)   3 (5%)      37 (49%)   35 (60%)    8 (11%)    5 (9%)
3      NA         NA          NA         NA          23 (30%)   14 (24%)
4      NA         NA          NA         NA          26 (34%)   24 (41%)
5      NA         NA          NA         NA          18 (24%)   15 (26%)

Table 5. Number and percent of US citizen completers with the different value scores, based on their primary language.

       Attainment Value       Intrinsic Value        Utility Value
Score  English    Other       English    Other       English    Other
0      22 (45%)   3 (38%)     5 (10%)    1 (12%)     0 (0%)     0 (0%)
1      25 (51%)   4 (50%)     13 (27%)   4 (50%)     0 (0%)     0 (0%)
2      2 (4%)     1 (12%)     31 (63%)   3 (38%)     3 (6%)     2 (25%)
3      NA         NA          NA         NA          14 (29%)   0 (0%)
4      NA         NA          NA         NA          21 (43%)   3 (38%)
5      NA         NA          NA         NA          11 (22%)   3 (38%)

Figure 8. Students were asked to select reasons they registered for the workshop from a list. These reasons were grouped by value type: utility value (max = 5), intrinsic value (max = 2), and attainment value (max = 2). Shown is the association between value types and whether students who initiated the workshop successfully completed it (light shading) or not (dark shading). The number of participants for each factor is illustrated on the y-axis. Value types with a statistically significant association to completion at the ≤ .05 level are highlighted in bold with the strength of association φc shown.
Discussion
As noted previously, the evaluation of this workshop was not
initially developed with the EVT in mind. Rather, survey
items were developed based on the authors’ experience
working with students in similar courses and research expe-
riences. When the data was collected and initially analyzed
for associations with important outcomes (e.g., initiation and
completion), the data showed complexity that did not imme-
diately reveal clear trends. However, organizing factors
according to demographic and EVT constructs, as presented
above, allowed interesting and useful patterns to emerge in
the workshop outcomes. The differing results for registra-
tion, initiation, and completion suggest that the contribu-
tions of expectancy, value, and cost to student motivation
may decline, accumulate, and/or evolve from the beginning
to end of the 14-week course. This aligns with Eccles and
Wigfield’s (2020) belief that the weighting of these major
components of EVT varies across time and situation.
EVT posits that students’ motivation to pursue different
achievement tasks is determined most directly by their
expectancies for success on the task and the extent to which
they value the task (Wigfield et al., 2016, for review). For
example, in the case of the SSBW it would be reasonable to
assume that expectancy and value drive motivation when
students register, as there is essentially no cost to the stu-
dent. A student may see a workshop advertisement and
decide that the workshop looks interesting and fun to them,
aligns with their future goals, could help them make prog-
ress in their career path, and is something they believe they
could successfully complete. Because the workshop is free
and has no acceptance criteria, such as prior experience,
demonstrated ability, or grades, students can register
after a few mouse clicks and sharing some minor personal
information.
As participants transition from registration to initiation,
expectancy and value appear to continue driving motivation.
As demonstrated in Figures 4 and 5, more than 50% of the
expectancy factors and two-thirds of the value factors asso-
ciated with initiation in the workshop. Expectancy factors
showed both weak and moderate associations, while the
value factors showed moderate and strong associations.
However, initiation is more complicated than registering.
For example, registrants had to read and complete the steps
necessary to successfully download a very large virtual disk
(>7GB), install specialized software to mount the virtual
disk on their personal computer, and follow additional steps
to log into the LMS to initiate the first assignment. This
workflow has several places where students might encounter
challenges. Overcoming these challenges may come at the
expense of other things that the student might do instead.
In this way, small incremental costs may be incurred, with
participants who encountered challenges accumulating more
costs than those who did not encounter challenges or
encountered fewer of them. This notion of accumulating
costs aligns with work by Rosenzweig et al. (2020) who
found that perceived costs increased across the semester for
students in a college physics course.
Simultaneously, participants’ expectancies or values may
also be impacted during this transition to initiation. For
example, those who encounter challenges and struggle to
resolve them might begin to question how successful they
can be in the workshop which would negatively impact their
expectancy. Conversely, students who encountered only
minor difficulties and successfully overcame them may see
reward in learning how to run a new operating system on
their computer for their own personal uses. In turn, such
experiences may increase their perceptions of the workshop’s
value. Indeed, numerous studies at the college level have
shown that value interventions have improved students’
interest and course performance (e.g., Canning et al., 2018;
Harackiewicz et al., 2016; Hulleman et al., 2010).
The transition from initiation to completion might
increasingly be explained through a balancing of expectancy,
value, and cost. Figure 7 above indicates that less than a
quarter of the expectancy factors showed even weak associ-
ations with completion. Of note is the lack of association
between the expectancy factor "Completion Plans" and completion, despite
using an identical survey question and scale as Reich (2014),
who did find associations with completion of MOOCs.
Similarly, none of the value factors had a statistically
significant association with completion (Figure 8). It might
be tempting to surmise that this lack of association with
completion stems from non-completers simply deciding the
workshop was not for them or that the workshop and mate-
rials were of inferior quality. However, an analysis of work-
shop participant perceptions found that even non-completers
were satisfied with the workshop (71%) and described the
workshop as high quality (92%) (Brudzinski et al., 2021).
This aligns well with work by Wigfield et al. (2017), who
proposed that when students perceive significant cost associ-
ated with a task, their overall valuing of the task decreases.
This combination in turn reduces the likelihood that stu-
dents will continue in STEM courses (Barron & Hulleman,
2015; Wigfield et al., 2017).

Figure 9. Reasons participants did not complete the SSB workshop and their frequency. Participants were allowed to choose all that applied or write in their own reasons. *Indicates reason codes derived from respondents' "write-in" responses. (N = 84 respondents.)
While attainment and intrinsic values did not associate with completion for all participants collectively (Figure 8), non-US participants and US citizen participants whose primary language was other than English both appear to have perceived the workshop as more important to their self-concept or identity, which includes career identity, than their comparable peers. Conversely, US citizens whose primary language was English appeared to perceive a greater likelihood of enjoyment or satisfaction from engaging with the workshop than the rest of their peers. This may mean that both attainment and intrinsic value are positive motivators for initiation (Figure 5), but that intrinsic value may not be a strong enough motivator to positively influence completion of a 14-week-long workshop. Unfortunately, the underlying reasons for these differences in reported values are not discernible from the data available and are something to explore in future work.
Finally, EVT suggests that an accumulation of costs by participants may further explain the observed completion/non-completion results. Wei et al. (2024) found cost was a moderator in the relationship between attitudes, learning engagement, and perceived learning outcomes in MOOCs. Perez et al. (2019) also found a negative association between costs and learning outcomes, but in an on-campus rather than MOOC context. However, the weight and role of perceived cost in influencing academic outcomes are thought to vary across individuals and contexts (Eccles & Wigfield, 2020; Perez et al., 2019).
Perez et al., 2019). Based on the different associations from
initiation to completion, costs may also vary over time and
in some cases become cumulative. For example, some costs identified in Figure 9, such as medical issues (e.g., the influence of COVID) or technical difficulties (e.g., hard drive failures), appear to be acute and severe enough to immediately lead to course abandonment. Other costs, such as a perceived lack
of time or personal reasons, which likely include things that
students did not do because of their initial participation in
the SSBW, may accumulate over the duration of the work-
shop. This accumulation may outpace any positive changes
to expectancy and value beliefs, and thus could also lead to
workshop abandonment. It is important to note that these
costs are not likely to be evenly experienced by all partici-
pants for a range of reasons. For example, factors such as
socioeconomic status (e.g., access to stable high-speed inter-
net), access to health care, and access to technical support may insulate some participants from incurring costs. The demographic
results do not point to specific student groups having worse
outcomes than others. However, this could be related to the
sensitivity and focus of the data collection tools employed.
For example, despite the international audience, language or
cultural difficulties were not written in as reasons that stu-
dents did not persist, though it is possible that the
English-only survey was self-selecting.
The results suggest that completion of this free, open-access summer workshop for STEM majors may be driven by a balancing of accumulated costs against expectancy and value. For example, a student who completed the workshop despite starting with low expectancy and value beliefs may have faced low costs, or those beliefs may have grown over the course of the summer enough to keep costs from overcoming them. On the opposite end of the spectrum, a student who did not complete the workshop despite high initial expectancy and value beliefs may have accumulated high costs or experienced a decrease in their expectancy and value beliefs (e.g., deciding the workshop is too demanding), allowing costs to exceed the perceived benefits. Thinking about the relationship of expectancy and value with cost in this way may be particularly important for similar online courses, because costs are inherently difficult to measure: those experiencing significant costs may also be least likely to participate in evaluations.
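One way to make this balancing explicit is with a heuristic sketch; the functional form below, including the split between acute and accumulating costs, is our illustrative assumption rather than a fitted or measured model:

\[
P(t) \;\propto\; E(t)\,V(t) \;-\; \left( C_{\mathrm{acute}} + \int_{0}^{t} c(\tau)\,d\tau \right),
\]

where $P(t)$ is the propensity to persist at time $t$, $E(t)$ and $V(t)$ are a participant's expectancy and value beliefs, $C_{\mathrm{acute}}$ captures one-time shocks (e.g., a hard drive failure), and $c(\tau)$ is the rate at which small incremental costs (e.g., forgone activities) accrue. In this reading, a participant abandons the workshop once the accumulated cost term outpaces $E(t)\,V(t)$, which is consistent with both the acute and gradual abandonment patterns described above.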
In addition to factors that can be categorized through the
EVT lens, four demographic factors also showed a weak
association with initiation and an additional two demo-
graphic factors showed a weak association with completion.
However, from the data collected as part of this study, it is
difficult to determine why each of these demographic factors
showed associations. It is possible that some of these factors
may be proxies for costs, but there is not enough informa-
tion available to adequately evaluate this. For example, nontraditional students may have initiated at a lower rate than their traditional peers (Figure 3) because they may have more competing demands on their time (e.g., family or work commitments). Other demographic factors may
be associated with practical constraints. For example, stu-
dents who reported that they were not at all familiar with
the English language were less likely to initiate the workshop
(Figure 3). Such students may have been able to register for
the workshop successfully but may have found reading tech-
nical instructions more difficult when it came time to initi-
ate the assignments. However, we do not find that this trend
extends to completion, as participants whose primary lan-
guage was English completed at a rate that was lower than
students whose primary language was something other than
English. We do find evidence (Tables 4 and 5) that students
from outside the US may perceive the value of the MOOC
workshop differently than US participants.
It is also notable that factors such as gender, race, and ethnicity did not have any association with either initiation or completion of the course. This suggests equal access to the workshop and an absence of biases or other factors that might have prevented some groups from succeeding once in the course. This is consistent with the earlier finding that the percentages of women and URM students from the United States completing the SSBW are greater than those receiving geoscience degrees annually (Brudzinski et al., 2021).
Limitations
Like all studies, this work has several important limitations that must be considered. First, while the overall response rates to the survey were strong, they were not consistent across all populations. For example, students who registered for the workshop but did not actually initiate it had a very low response rate, so their perspectives may be underrepresented here. Additionally, the surveys captured only those students who consented to participate and were motivated enough to complete them. Therefore, the views of some groups of students may not be fully represented due to the self-selection bias inherent in surveys.
Next, the evaluation instruments were not designed from the outset to explicitly examine student motivation through the lens of EVT. As a result, the internal reliability of some of the items may be compromised; items included in the analysis may not measure the entirety of the construct category to which they were assigned (see supplemental material). In some cases, this may explain why we see only weak associations between factors and student outcomes. Of particular concern in terms of assessment is the construct of cost. Cost is a very broad construct and may be under-defined by the sample, as those who encountered the largest costs may be least likely to engage with evaluation. Additionally, some costs may be more important than others; in this study, the evaluation item measuring costs focused on capturing the range of costs students encountered rather than their relative importance to one another.
Next, the analysis of these data employed a simple univariate approach that does not capture complex interactions between groups of factors. Additionally, the analysis commonly binned responses together. While useful for an exploratory investigation, binning blunts the analysis by collapsing responses into broad categories, so some nuances and relationships within the data may have been lost in the process.
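To make this approach concrete, the sketch below shows the style of univariate test described above: a chi-square test of independence between one binned factor and a binary outcome, with Cramér's V (Cramér, 1946) as an effect size interpreted against conventional weak/moderate/strong thresholds (Cohen, 1988). The contingency table and variable names are hypothetical illustrations, not the study's actual data.

import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical contingency table: rows are binned levels of a single
# factor (e.g., low/medium/high intrinsic value); columns are the
# binary outcome (did not initiate, initiated).
table = np.array([
    [60, 40],   # low
    [45, 70],   # medium
    [30, 95],   # high
])

# Chi-square test of independence for this one factor vs. the outcome.
chi2, p, dof, expected = chi2_contingency(table)

# Cramer's V as an effect size; k is the smaller table dimension minus 1.
n = table.sum()
k = min(table.shape) - 1
cramers_v = np.sqrt(chi2 / (n * k))

print(f"chi2 = {chi2:.2f}, p = {p:.4f}, Cramer's V = {cramers_v:.2f}")

A multivariate alternative (e.g., logistic regression with interaction terms) would address the limitation noted above, but it requires larger samples within each combination of factor levels.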
Finally, an important consideration is that the workshop
ran during the depths of the pandemic when many other
opportunities for students were canceled. Therefore, it is not clear how representative the registrant population, their experiences, and their values are compared to a typical year. For example, one concern was
that more students may have been driven to the workshop
because of the loss of other opportunities. However, enroll-
ments in the SSBW have been similar in 2021 and 2022, sug-
gesting that student interest in the workshop was not specific
to the pandemic. Nevertheless, what students valued and the
costs they encountered may have been impacted by the stress
and external time demands induced by the pandemic.
Implications
It is anticipated that this workshop will continue through at
least 2024, and this initial evaluation offers important
insights both for future implementations of the Seismology Skill Building Workshop and for other STEM disciplines that are considering open-access, online workshops to teach scientific skills. This work establishes EVT as an
important lens to both examine and re-design elements of
the workshop with the goal of maximizing participant moti-
vation and persistence. It also guides the further develop-
ment of assessments to measure the workshop’s impact on
learners.
With regard to the workshop design, these results, along with recent efforts by others to test EVT-based interventions (Rosenzweig et al., 2020; Rosenzweig et al., 2022), suggest that student motivation to persist in the workshop can be enhanced. This would likely occur through a combination of strategies designed to nourish students' expectancy and value beliefs while also mitigating real or perceived costs whenever possible. For example, future iterations of the workshop could incorporate resources that highlight how past participants have benefited from successfully completing the workshop. This might include a detailed tutorial describing resume-building strategies to feature the workshop and the skills gained, ranging from computational skills (e.g., using Python packages such as ObsPy; a brief illustration follows below) to more personal ones (e.g., the self-motivation and discipline required to complete an ∼70-h workshop that was not for credit). Additionally, interventions such as motivational messaging could be designed to target the needs of various subgroups, such as the value differences highlighted in Tables 4 and 5 above. Such efforts have previously been shown to enhance participants' utility values (e.g., Gaspard et al., 2015; Hulleman et al., 2010) as well as competency-based beliefs (Brisson et al., 2017; Canning & Harackiewicz, 2015; Rosenzweig et al., 2022), which in turn have contributed to enhanced STEM student performance (e.g., Canning et al., 2018; Harackiewicz et al., 2016; Rosenzweig et al., 2022).
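As a concrete illustration of the computational skills such a tutorial might showcase, a few lines of ObsPy of the kind participants learn to write are sketched below; the station and time window are arbitrary choices for illustration, not an actual workshop assignment.

from obspy import UTCDateTime
from obspy.clients.fdsn import Client

# Request an hour of broadband vertical-component data from the IRIS
# FDSN web service for station ANMO (Albuquerque, New Mexico).
client = Client("IRIS")
start = UTCDateTime("2020-07-22T06:00:00")
stream = client.get_waveforms(network="IU", station="ANMO",
                              location="00", channel="BHZ",
                              starttime=start, endtime=start + 3600)

# Basic processing: remove a linear trend, bandpass filter, and plot.
stream.detrend("linear")
stream.filter("bandpass", freqmin=0.1, freqmax=1.0)
stream.plot()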
Mitigating real costs, by identifying pinch points in the workshop where students encounter unnecessary or avoidable costs, and reducing students' perceived costs both offer promise. For example, a major hurdle many students faced at the outset of the workshop was the need for a fast, stable, and consistent internet connection to download the large (>7 GB) virtual machine, along with the technical knowledge to install software that occasionally required manipulating the computer's BIOS. The virtual machine was required to create the common computing environment necessary to support the workshop's pedagogy. Cloud-based computing environments may eliminate these costs, as they would not require installations and should limit the need for such a robust internet connection. Additionally, since all computing would occur remotely, a cloud environment may also increase accessibility for students with Chromebooks and tablets; students with these devices were excluded in 2020 due to the need for local computational power. To address these issues, a cloud-based solution was implemented in the 2022 SSBW.
Increased communication about costs may also help reduce students' perceptions of the workshop's psychological costs. For example, experience with the 2020 workshop should provide facilitators with significantly more accurate estimates of the time required to complete the workshop. This could help set more accurate student expectations and enable students to employ better time management strategies. Additionally, including examples of other students overcoming costs could help decrease participants' negative perceptions of costs. Such a benefit was recently shown when students in a college-level physics course read and evaluated examples of how peers experienced and overcame the costs of the course (Rosenzweig et al., 2020); that intervention also increased students' self-perceptions of competence, indicating raised expectancy as well.
Another important implication for future workshops is to consider strategies for improving the evaluation of the workshop. After going through this process, we would change the completion threshold from the 100% used in this study to 80%. Such a change would more closely align "success" in this workshop with traditional courses, where successful students attend most, but not all, classes and few complete every assignment. It would also align more closely with other MOOCs, which typically report completion rates as the percentage of registrants who simply passed the course (Jordan, 2020). The threshold also influenced our evaluation, as participants' completion was used to determine which end-of-workshop survey students received. Lowering the threshold would have increased the number of students receiving the more robust survey provided to completers, including those who may have overcome more costs.
Finally, it will be important to align the overall evaluation with EVT from the outset. This should include revising and expanding survey items to ensure that each construct is adequately sampled and has robust content validity. Leveraging existing assessments (e.g., Dietrich et al., 2017; Kosovich et al., 2015), either in their entirety or with modification as applicable, will greatly help in this regard. This will be especially important for the value constructs or costs that, as described above, may have been undersampled in the current work. In addition, employing different approaches to data collection, such as reducing survey length and exploring incentives for responses, may increase the number of workshop non-completers represented in the evaluation dataset (Abdelazeem et al., 2023).
Acknowledgements
We benefited from discussions with and reviews by S. Fasola and G.
Goldhagen, as well as comments from two anonymous reviewers, a
research editor, and an associate editor. The data were collected under IRB protocol #03825e.
Disclosure statement
No potential conflict of interest was reported by the author(s).
Funding
This manuscript was supported by NSF Grants 2121503 and 2121342.
ORCID
Michael Hubenthal http://orcid.org/0000-0002-2790-7054
Michael Brudzinski http://orcid.org/0000-0003-1869-0700
References
Abdelazeem, B., Hamdallah, A., Rizk, M. A., Abbas, K. S., El-Shahat, N. A., Manasrah, N., Mostafa, M. R., & Eltobgy, M. (2023). Does usage of monetary incentive impact the involvement in surveys? A systematic review and meta-analysis of 46 randomized controlled trials. PloS One, 18(1), e0279128. https://doi.org/10.1371/journal.pone.0279128

Abdullatif, H., & Velázquez-Iturbide, J. Á. (2020). Relationship between motivations, personality traits and intention to continue using MOOCs. Education and Information Technologies, 25(5), 4417–4435. https://doi.org/10.1007/s10639-020-10161-z

Aderhold, K., Busby, R., Dorr, P., Frassetto, A., Sweet, J., & Woodward, R. (2021). Reflections on the accomplishments of EarthScope's USArray. EarthScope. https://www.earthscope-program-2003-2018.org/articles/Reflections_on_USArray.html

Badali, M., Hatami, J., Banihashem, S. K., Rahimi, E., Noroozi, O., & Eslami, Z. (2022). The role of motivation in MOOCs' retention rates: A systematic literature review. Research and Practice in Technology Enhanced Learning, 17(1), 1–20. https://doi.org/10.1186/s41039-022-00181-3

Badali, M., Hatami, J., Farrokhnia, M., & Noroozi, O. (2022). The effects of using Merrill's first principles of instruction on learning and satisfaction in MOOC. Innovations in Education and Teaching International, 59(2), 216–225. https://doi.org/10.1080/14703297.2020.1813187

Bandura, A. (1997). Self-efficacy: The exercise of control. Freeman.

Barron, K. E., & Hulleman, C. S. (2015). Expectancy-value-cost model of motivation. In J. S. Eccles & K. Salmela-Aro (Eds.), International encyclopedia of social and behavioral sciences: Motivational psychology (2nd ed., pp. 503–509). Elsevier.

Bauer, K. W., & Bennett, J. S. (2003). Alumni perceptions used to assess undergraduate research experience. The Journal of Higher Education, 74(2), 210–230. https://doi.org/10.1080/00221546.2003.11777197

Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101. https://doi.org/10.1191/1478088706qp063oa

Brisson, B. M., Dicke, A.-L., Gaspard, H., Häfner, I., Flunger, B., Nagengast, B., & Trautwein, U. (2017). Short intervention, sustained effects: Promoting students' math competence beliefs, effort, and achievement. American Educational Research Journal, 54(6), 1048–1078. https://doi.org/10.3102/000283121771

Brudzinski, M., Hubenthal, M., Fasola, S., & Schnorr, E. (2021). Learning in a crisis: Online skill building workshop addresses immediate pandemic needs and offers possibilities for future trainings. Seismological Research Letters, 92(5), 3215–3230. https://doi.org/10.1785/0220200472

Cagiltay, N. E., Cagiltay, K., & Celik, B. (2020). An analysis of course characteristics, learner characteristics, and certification rates in MITx MOOCs. International Review of Research in Open and Distributed Learning, 21(3), 121–139.

Canning, E. A., & Harackiewicz, J. M. (2015). Teach it, don't preach it: The differential effects of directly-communicated and self-generated utility value information. Motivation Science, 1(1), 47–71. https://doi.org/10.1037/mot0000015

Canning, E. A., Harackiewicz, J. M., Priniski, S. J., Hecht, C. A., Tibbetts, Y., & Hyde, J. S. (2018). Improving performance and retention in introductory biology with a utility-value intervention. Journal of Educational Psychology, 110(6), 834–849. https://doi.org/10.1037/edu0000244

Chaw, L. Y., & Tang, C. M. (2019). Driving high inclination to complete massive open online courses (MOOCs): Motivation and engagement factors for learners. Electronic Journal of e-Learning, 17(2), 118–130. https://doi.org/10.34190/JEL.17.2.05

Clark, T. (2005). The business profession: A mandatory, noncredit, co-curricular career preparation program for undergraduate business majors. Business Communication Quarterly, 68(3), 271–289. https://doi.org/10.1177/1080569905279026

Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). L. Erlbaum Associates.

Cramér, H. (1946). Mathematical methods of statistics. Princeton University Press.

D'Amico, M. M., Morgan, G. B., Thornton, Z. M., & Bassis, V. (2020). Noncredit education enrollment and outcomes: Exploring the "Black Box" of noncredit community college education. Career and Technical Education Research, 45(2), 17–38. https://doi.org/10.5328/cter45.2.17

DeVellis, R. F. (2003). Scale development: Theory and applications. Sage Publications, Inc.

Dietrich, J., Viljaranta, J., Moeller, J., & Kracke, B. (2017). Situational expectancies and task values: Associations with students' effort. Learning and Instruction, 47, 53–64. https://doi.org/10.1016/j.learninstruc.2016.10.009

Dugick, F. K. D., van der Lee, S., Prieto, G. A., Dybing, S. N., Toney, L., & Cole, H. M. (2021). ROSES: Remote online sessions for emerging seismologists. Seismological Research Letters, 92(4), 2657–2667. https://doi.org/10.1785/0220200421

Eccles, J. S., Adler, T. F., Futterman, R., Goff, S. B., Kaczala, C. M., Meece, J. L., & Midgley, C. (1983). Expectancies, values, and academic behaviors. In J. T. Spence (Ed.), Achievement and achievement motivation. W.H. Freeman.

Eccles, J. S., & Wigfield, A. (2002). Motivational beliefs, values, and goals. Annual Review of Psychology, 53(1), 109–132. https://doi.org/10.1146/annurev.psych.53.100901.135153

Eccles, J. S., & Wigfield, A. (2020). From expectancy-value theory to situated expectancy-value theory: A developmental, social cognitive, and sociocultural perspective on motivation. Contemporary Educational Psychology, 61, 101859. https://doi.org/10.1016/j.cedpsych.2020.101859

Flesch, R. (1981). How to write in plain English. Harper Collins.

Gaspard, H., Dicke, A. L., Flunger, B., Brisson, B. M., Häfner, I., Nagengast, B., & Trautwein, U. (2015). Fostering adolescents' value beliefs for mathematics with a relevance intervention in the classroom. Developmental Psychology, 51(9), 1226–1240. https://doi.org/10.1037/dev0000028

Gonzales, L., & Keane, C. (2020). Diversity in the geosciences. American Geological Institute (AGI): Geoscience Currents, Data Brief 2020-023.

Harackiewicz, J. M., Canning, E. A., Tibbetts, Y., Priniski, S. J., & Hyde, J. S. (2016). Closing achievement gaps with a utility-value intervention: Disentangling race and social class. Journal of Personality and Social Psychology, 111(5), 745–765. https://doi.org/10.1037/pspp0000075

Ho, A. D., Reich, J., Nesterko, S., Seaton, D. T., Mullaney, T., Waldo, J., & Chuang, I. (2014). HarvardX and MITx: The first year of open online courses (HarvardX and MITx Working Paper No. 1). Available at SSRN: https://ssrn.com/abstract=2381263

Hubenthal, M. (2019). Investigating the academic pathways and careers of alumni of the IRIS Undergraduate Internship Program (1998–2018). Incorporated Research Institutions for Seismology. https://www.iris.edu/hq/files/programs/education_and_outreach/Evaluation/Investigating_career_pathways_v1.pdf

Hubenthal, M., Bohon, W., & Taber, J. (2020). A pandemic pivot in Earth science outreach and education. Eos, 101. https://doi.org/10.1029/2020EO152146

Hubenthal, M., & Judge, J. (2013). Taking research experiences for undergraduates online. Eos, Transactions American Geophysical Union, 94(17), 157–158. https://doi.org/10.1002/2013EO170001

Hulleman, C., Barron, K., Kosovich, J., & Lazowski, R. (2016). Current theories, constructs, and interventions within an expectancy-value framework. In A. Lipnevich, F. Preckel, & R. Roberts (Eds.), Psychosocial skills and school systems in the twenty-first century: Theory, research, and applications. Springer.

Hulleman, C. S., Godes, O., Hendricks, B. L., & Harackiewicz, J. M. (2010). Enhancing interest and performance with a utility value intervention. Journal of Educational Psychology, 102(4), 880–895. https://doi.org/10.1037/a0019506

IRIS. (2021). EarthScope USArray introductory and advanced short courses 2009–2017. Incorporated Research Institutions for Seismology. https://www.iris.edu/hq/files/short_courses/USArray_Synopsis_v2.pdf

Jordan, K. (2015). Massive open online course completion rates revisited: Assessment, length and attrition. The International Review of Research in Open and Distributed Learning, 16(3), 341–358. https://doi.org/10.19173/irrodl.v16i3.2112

Jordan, K. (2020). MOOC completion rates: The data. Retrieved February 2023, from http://www.katyjordan.com/MOOCproject.html

Junge, B., Quiñones, C., Kakietek, J., Teodorescu, D., & Marsteller, P. (2010). Promoting undergraduate interest, preparedness, and professional pursuit in the sciences: An outcomes evaluation of the SURE program at Emory University. CBE—Life Sciences Education, 9(2), 119–132. https://doi.org/10.1187/cbe.09-08-0057

Khalil, H., & Ebner, M. (2014). MOOCs completion rates and possible methods to improve retention—A literature review. Proceedings of the World Conference on Educational Multimedia, Hypermedia and Telecommunications, 2014, 1236–1244.

Kosovich, J., Hulleman, C., Barron, K., & Getty, S. (2015). A practical measure of student motivation: Establishing validity evidence for the expectancy-value-cost scale in middle school. The Journal of Early Adolescence, 35(5–6), 790–816. https://doi.org/10.1177/0272431614556890

Lockwood, J., & Mooney, A. (2018). Computational thinking in secondary education: Where does it fit? A systematic literary review. International Journal of Computer Science Education in Schools, 2(1), 41–60. https://doi.org/10.21585/ijcses.v2i1.26

Luik, P., & Lepp, M. (2021). Are highly motivated learners more likely to complete a computer programming MOOC? The International Review of Research in Open and Distributed Learning, 22(1), 41–58. https://doi.org/10.19173/irrodl.v22i1.4978

Maya-Jariego, I., Holgado, D., González-Tinoco, E., Castaño-Muñoz, J., & Punie, Y. (2020). Typology of motivation and learning intentions of users in MOOCs: The MOOCKNOWLEDGE study. Educational Technology Research and Development, 68(1), 203–224. https://doi.org/10.1007/s11423-019-09682-3

Meece, J. L., Wigfield, A., & Eccles, J. S. (1990). Predictors of math anxiety and its influence on young adolescents' course enrollment intentions and performance in mathematics. Journal of Educational Psychology, 82(1), 60–70. https://doi.org/10.1037/0022-0663.82.1.60

Mosher, S., & Keane, C. (Eds.). (2021). Vision and change in the geosciences: The future of undergraduate geoscience education. American Geosciences Institute.

Musu-Gillette, L. E., Wigfield, A., Harring, J., & Eccles, J. S. (2015). Trajectories of change in students' self-concepts of ability and values in math and college major choice. Educational Research and Evaluation, 21(4), 343–370. https://doi.org/10.1080/13803611.2015.1057161

NASEM. (2018). Graduate STEM education for the 21st century. The National Academies Press. https://doi.org/10.17226/25038

NASEM. (2020). Enhancing quantitative capacity of geoscience programs, organized by the committee on solid earth geophysics. National Academies of Sciences, Engineering, and Medicine. https://www.nationalacademies.org/event/04-09-2020/cosg-spring-2020

NSF. (2007). NSF Cyberinfrastructure Council provides its cyberinfrastructure vision for the 21st century. National Science Foundation. http://www.nsf.gov/pubs/2007/nsf0728/nsf0728.pdf

NSF. (2011). Advisory committee for cyberinfrastructure task force on cyberlearning and workforce development report. National Science Foundation. https://www.nsf.gov/cise/oac/taskforces/TaskForceReport_Learning.pdf

NSF. (2017). Harnessing the data revolution. National Science Foundation. https://www.nsf.gov/news/special_reports/big_ideas/harnessing.jsp

Pajares, F. (1996). Self-efficacy beliefs in academic settings. Review of Educational Research, 66(4), 543–578. https://doi.org/10.3102/00346543066004543

Perez, T., Dai, T., Kaplan, A., Cromley, J. G., Brooks, W. D., White, A. C., Mara, K. R., & Balsai, M. J. (2019). Interrelations among expectancies, task values, and perceived costs in undergraduate biology achievement. Learning and Individual Differences, 72, 26–38. https://doi.org/10.1016/j.lindif.2019.04.001

Privitera, G. J. (2014). Research methods for the behavioral sciences. Sage Publications, Inc.

Reich, J. (2014). MOOC completion and retention in the context of student intent. EDUCAUSE Review Online. Retrieved from https://er.educause.edu/articles/2014/12/mooc-completion-and-retention-in-the-context-of-student-intent

Rosenzweig, E. Q., Wigfield, A., & Hulleman, C. S. (2020). More useful or not so bad? Examining the effects of utility value and cost reduction interventions in college physics. Journal of Educational Psychology, 112(1), 166–182. https://doi.org/10.1037/edu0000370

Rosenzweig, E. Q., Wigfield, A., & Eccles, J. S. (2022). Beyond utility value interventions: The why, when, and how for next steps in expectancy-value intervention research. Educational Psychologist, 57(1), 11–30. https://doi.org/10.1080/00461520.2021.1984242

Simpkins, S. D., Davis-Kean, P. E., & Eccles, J. S. (2006). Math and science motivation: A longitudinal examination of the links between choices and beliefs. Developmental Psychology, 42(1), 70–83. https://doi.org/10.1037/0012-1649.42.1.70

Sit, S. M., & Brudzinski, M. R. (2017). Creation and assessment of an active e-learning introductory geology course. Journal of Science Education and Technology, 26(6), 629–645. https://doi.org/10.1007/s10956-017-9703-3

Sloan, V., Haacker, R., Batchelor, R., & Garza, C. (2020). How COVID-19 is affecting undergraduate research experiences. Eos, 101. https://doi.org/10.1029/2020EO145667

Stufflebeam, D. L., & Shinkfield, A. J. (2007). Evaluation theory, models, and applications. John Wiley & Sons.

Taber, J., Hubenthal, M., Bravo, T., Dorr, P., Johnson, J., McQuillian, P., Sumy, D. F., & Welti, R. (2015). Seismology education and public-outreach resources for a spectrum of audiences, as provided by the IRIS Consortium. The Leading Edge, 34(10), 1178–1184. https://doi.org/10.1190/tle34101178.1

Tang, X., Yin, Y., Lin, Q., Hadad, R., & Zhai, X. (2020). Assessing computational thinking: A systematic review of empirical studies. Computers & Education, 148, 103798. https://doi.org/10.1016/j.compedu.2019.103798

Thiry, H., Weston, T. J., Laursen, S. L., & Hunter, A. B. (2012). The benefits of multi-year research experiences: Differences in novice and experienced students' reported gains from undergraduate research. CBE—Life Sciences Education, 11, 1–13.

Walters, C., Patino, L., Rom, E., & Adams, A. (2016). Participant trends in the geosciences research experiences for undergraduates program. Abstract (ED51B-0790) presented at 2022 AGU Fall Meeting, 12–16 Dec., San Francisco, CA.

Wei, X., Saab, N., & Admiraal, W. (2024). What rationale would work? Unfolding the role of learners' attitudes and motivation in predicting learning engagement and perceived learning outcomes in MOOCs. International Journal of Educational Technology in Higher Education, 21(1), 5. https://doi.org/10.1186/s41239-023-00433-2

Weintrop, D., Beheshti, E., Horn, M., Orton, K., Jona, K., Trouille, L., & Wilensky, U. (2016). Defining computational thinking for mathematics and science classrooms. Journal of Science Education and Technology, 25(1), 127–147. https://doi.org/10.1007/s10956-015-9581-5

Wigfield, A., & Eccles, J. S. (2000). Expectancy-value theory of achievement motivation. Contemporary Educational Psychology, 25(1), 68–81. https://doi.org/10.1006/ceps.1999.1015

Wigfield, A., Eccles, J. S., Yoon, K. S., Harold, R. D., Arbreton, A., Freedman-Doan, C., & Blumenfeld, P. C. (1997). Change in children's competence beliefs and subjective task values across the elementary school years: A 3-year study. Journal of Educational Psychology, 89(3), 451–469. https://doi.org/10.1037/0022-0663.89.3.451

Wigfield, A., Rosenzweig, E. Q., & Eccles, J. (2017). Competence values. In A. Elliot, C. Dweck, & D. Yeager (Eds.), Handbook of competence and motivation: Theory and application (2nd ed., pp. 116–134). Guilford Press.

Wigfield, A., Tonks, S. M., & Klauda, S. L. (2016). Expectancy-value theory. In K. R. Wentzel & D. B. Miele (Eds.), Handbook of motivation at school (2nd ed., pp. 55–74). Routledge.

Wood, J. (2022). These 3 charts show the global growth in online learning. World Economic Forum. https://www.weforum.org/agenda/2022/01/online-learning-courses-reskill-skills-gap/