Virtually the Same? Evaluating the Effectiveness of Remote Undergraduate Research Experiences

Abstract

In-person undergraduate research experiences (UREs) promote students’ integration into careers in life science research. In 2020, the COVID-19 pandemic prompted institutions hosting summer URE programs to offer them remotely, raising questions about whether undergraduates who participate in remote research can experience scientific integration and whether they might perceive doing research less favorably (i.e., not beneficial or too costly). To address these questions, we examined indicators of scientific integration and perceptions of the benefits and costs of doing research among students who participated in remote life science URE programs in Summer 2020. We found that students experienced gains in scientific self-efficacy pre- to post-URE, similar to results reported for in-person UREs. We also found that students experienced gains in scientific identity, graduate and career intentions, and perceptions of the benefits of doing research only if they started their remote UREs at lower levels on these variables. Collectively, students did not change in their perceptions of the costs of doing research despite the challenges of working remotely. Yet students who started with low cost perceptions increased in these perceptions. These findings indicate that remote UREs can support students’ self-efficacy development, but may otherwise be limited in their potential to promote scientific integration.
Riley A. Hess,1 Olivia A. Erickson,2 Rebecca B. Cole,1 Jared M. Isaacs,2
Silvia Alvarez-Clare,3 Jonathan Arnold,4 Allison Augustus-Wallace,5
Joseph C. Ayoob,6 Alan Berkowitz,7 Janet Branchaw,8 Kevin R. Burgio,7
Charles H. Cannon,3 Ruben Michael Ceballos,9 C. Sarah Cohen,10 Hilary Coller,11
Jane Disney,12 Van A. Doze,13 Margaret J. Eggers,14 Edwin L. Ferguson,15
Jerey J. Gray,16 Jean T. Greenberg,15 Alexander Homann,17
Danielle Jensen-Ryan,18 Robert M. Kao,19 Alex C. Keene,20 Johanna E. Kowalko,21
Steven A. Lopez,22 Camille Mathis,23 Mona Minkara,24 Courtney J. Murren,25
Mary Jo Ondrechen,22 Patricia Ordoñez,26 Anne Osano,27 Elizabeth
Padilla-Crespo,28 Soubantika Palchoudhury,29 Hong Qin,30 Juan Ramírez-Lugo,31
Jennifer Reithel,32 Colin A. Shaw,33 Amber Smith,34 Rosemary J. Smith,32,35
Fern Tsien,36 and Erin L. Dolan2,*
1Department of Psychology, 2Department of Biochemistry and Molecular Biology, and
4Department of Genetics, University of Georgia, Athens, GA 30602; 3Center for Tree Science,
Morton Arboretum, Lisle, IL 60532; 5Department of Medicine and Office of Diversity and
Community Engagement and 36Department of Genetics, Louisiana State University Health
Sciences Center, New Orleans, LA 70112; 6Department of Computational and Systems Biology,
University of Pittsburgh School of Medicine, Pittsburgh, PA 15260; 7Education Department, Cary
Institute for Ecosystem Studies, Millbrook, NY 12545; 8WISCIENCE and Department of Kinesiology
and 34Wisconsin Institute for Science Education and Community Engagement, University of
Wisconsin–Madison, Madison, WI 53706; 9Department of Biological Sciences, University of
Arkansas, Fayetteville, AR 72701; 10Department of Biology, Estuary and Ocean Science Center,
San Francisco State University, San Francisco, CA 94132; 11Department of Molecular, Cell and
Developmental Biology and 17Department of Microbiology, Immunology, and Molecular Genetics
and Institute for Quantitative and Computational Biosciences, University of California Los Angeles,
Los Angeles, CA 90095; 12Community Environmental Health Laboratory, Mt. Desert Island
Biological Laboratory, Salisbury Cove, ME 04672; 13Department of Biomedical Sciences, University
of North Dakota, Grand Forks, ND 58202; 14Department of Microbiology and Cell Biology and;
33Undergraduate Scholars Program and Department of Earth Sciences, Montana State University,
Bozeman, MT 59717; 15Department of Molecular Genetics and Cell Biology, University of Chicago,
Chicago, IL 60637; 16Department of Chemical & Biomolecular Engineering, Johns Hopkins
University, Baltimore, MD 21218; 18Department of Math and Sciences, Laramie County Communi-
ty College, Cheyenne, WY 82007; 19Science Department, College of Arts and Sciences, Heritage
University, Toppenish, WA 98948; 20Department of Biology, Texas A&M University, College Station,
TX 77840; 21Department of Biological Sciences, Lehigh University, Bethlehem, PA 18015;
22Department of Chemistry & Chemical Biology and 24Department of Bioengineering, Northeast-
ern University, Boston, MA 02115; 23American Psychiatric Association, Washington, DC 20024;
25Department of Biology, College of Charleston, Charleston, SC 29424; 26Department of
Computer Science and 31Department of Biology, University of Puerto Rico–Río Piedras, San Juan,
PR 00925; 27Department of Natural Sciences, Bowie State University, Bowie, MD 20715;
28Department of Science and Technology, Inter American University of Puerto Rico–Aguadilla,
Aguadilla, PR 00605; 29Chemical and Materials Engineering, University of Dayton, Dayton, OH
45469; 30Department of Computer Science and Engineering and Department of Biology,
Geology, and Environmental Science, University of Tennessee at Chattanooga, Chattanooga,
TN 37403; 32Rocky Mountain Biological Laboratory, PO Box 519, Crested Butte, CO 81224;
35Department of Biological Sciences, Idaho State University, Pocatello, ID 83209
Kyle Frantz, Monitoring Editor
Submitted Jan 4, 2022; Revised Feb 27, 2023;
Accepted Mar 17, 2023
DOI:10.1187/cbe.22-01-0001
*Address correspondence to: Erin L. Dolan
(eldolan@uga.edu).
© 2023 R. A. Hess etal. CBE—Life Sciences
Education © 2023 The American Society for Cell
Biology. This article is distributed by The
American Society for Cell Biology under license
from the author(s). It is available to the public
under an Attribution–Noncommercial–Share
Alike 4.0 Unported Creative Commons License
(http://creativecommons.org/licenses/
by-nc-sa/4.0).
“ASCB®” and “The American Society for Cell
Biology®” are registered trademarks of The
American Society for Cell Biology.
INTRODUCTION
Undergraduate research experiences (UREs) are critical for
shaping students’ decisions regarding whether to pursue gradu-
ate education and research careers in the life sciences (Laursen
et al., 2010; Lopatto and Tobias, 2010; Gentile et al., 2017).
Although UREs vary widely in duration and structure, they
share some common characteristics (Lopatto, 2003; Gentile
et al., 2017). Typically, undergraduate researchers join faculty
members’ research groups to collaborate in or carry out some
aspect of their research. Undergraduates are guided in their
research by a more experienced researcher, such as a graduate
student, postdoctoral associate, or faculty member, who is typi-
cally called their “research mentor” (Thiry and Laursen, 2011;
Aikens et al., 2016; Joshi et al., 2019). During UREs, students
are expected to engage in the practices of the discipline, includ-
ing collecting and analyzing data, interpreting results, trouble-
shooting and problem solving, collaborating with other
researchers, and communicating findings both orally and in
writing (Gentile et al., 2017). Often, undergraduate researchers
assume increasing ownership of their research over time, taking
on greater responsibility and autonomy in their work as they
gain experience and expertise (Hanauer et al., 2012).
In 2020, the COVID-19 pandemic caused massive disrup-
tions of research, slowing or stopping research altogether at col-
leges and universities across the country (Korbel and Stegle,
2020; Redden, 2020). Summer URE programming was not
spared these effects. In 2019, there were 125 National Science
Foundation (NSF)-funded URE Sites in the biological sciences;
in Summer 2020, 80% of Sites were cancelled (S. O’Connor,
NSF program manager for BIO REU Sites, personal communica-
tion). Remarkably, about 20% of the Sites opted to proceed
with their Summer 2020 programs. The programs that opted to
proceed were modified to operate on an entirely remote basis.
Research projects had to be modified, or changed entirely, to
accommodate a remote format (Erickson et al., 2022). These
modifications typically included a shift from experimental, lab-
oratory, and field-based research and techniques to research
questions or problems that could be addressed using computa-
tional and analytical approaches. Additionally, program leaders
and research mentors were tasked with adapting their typical
program timelines, meeting schedules, communication plat-
forms, and curricula (e.g., seminars, workshops) to an online
format.
This unprecedented and massive shift raises the question of
whether undergraduates who participate in remote research
programs realize the same outcomes as undergraduates who
have participated in in-person URE programs. This question is
important to address for several reasons. First, graduate pro-
grams and employers can benefit from knowing about the expe-
riences and outcomes of applicants whose research experience
occurred remotely during Summer 2020. Second, if remote
URE programs are beneficial to students, they have the poten-
tial to expand access to research experiences, especially for stu-
dents who would otherwise be excluded from in-person UREs
due to geographic constraints. Third, remote URE programs
may reduce some of the cost associated with in-person pro-
gramming (e.g., housing), allowing reallocation of these funds
to pay additional undergraduate researchers. Finally, remote
UREs may allow both students and their mentors greater flexi-
bility in balancing work–life demands, including eliminating
the hassle of relocating for a temporary summer research posi-
tion. The present study aims to provide insight about whether
remote UREs benefit students and thus should be considered an
option for URE programming in the future.
THEORETICAL FRAMEWORK
For the most part, UREs have been designed to allow students
to explore research as a path for further education and careers
(Seymour et al., 2004; Hunter et al., 2007; Laursen et al., 2010;
Lopatto and Tobias, 2010; Thiry et al., 2011; Gentile et al.,
2017). Multiple theories related to career development and
decision making have been used to explore and explain the out-
comes students realize from participating in research. For exam-
ple, Estrada, Hernandez, and colleagues carried out a series of
studies framed by the tripartite integration model of social
influence (TIMSI), arguing that three social factors influence
students’ integration into the scientific community (Estrada
et al., 2011; Hernandez et al., 2018). Specifically, students’ sci-
entific self-efficacy, scientific identity, and perceptions of the
alignment between their personal values and the values of the
scientific community (i.e., values alignment) predict whether
students engage in research experiences and continue in a sci-
ence research–related path (Estrada et al., 2011, 2018). Fur-
thermore, students’ engagement in research increases their sci-
entific self-efficacy, which in turn positively influences their
scientific identity (Adedokun et al., 2013; Robnett et al., 2015;
Frantz et al., 2017). Thus, from an empirical perspective,
research experiences can stimulate a recursive process through
which students develop their research skills, feel more capable
of performing research, identify and share values with the
research community, and choose to continue in research (Her-
nandez et al., 2020). Theoretically, the TIMSI illustrates how
research experiences embed students in the social environment
of a research group, thereby promoting their integration into
the scientific community (Hernandez et al., 2020).
It is unclear whether remote research affords the same social
environment for students to carry out research as does an
in-person experience. For example, the types of research activi-
ties that can be done at a distance are more limited, which may
limit students’ development of research skills and, in turn, their
scientific self-efficacy. The extent to which research mentors can
provide in-the-moment guidance to help students overcome
challenges is also likely to be limited, because they are not
working side by side. This may affect the extent to which stu-
dents are successful in their research tasks, which could stymie
their scientific self-efficacy development. Furthermore, students
may feel less engaged in the social environment of their research
group, because their interactions are more time and space lim-
ited. This may in turn limit their feelings of being part of the
research community, thereby limiting their scientific identity
development. Thus, it is reasonable to question whether remote
UREs would foster the same level of scientific integration as
in-person UREs.
Prior research has also used expectancy-value-cost theory
(EVT; Eccles and Wigfield, 2002; Barron and Hulleman, 2015)
as a framework for examining students’ value of UREs as a pre-
dictor of their motivation to continue in research (Ceyhan and
Tillotson, 2020). EVT posits that individuals’ expectations
about the degree to which they will be successful in a task (i.e.,
their self-efficacy) and their perceptions of the value and costs
associated with a task or pursuit influence their motivation to
engage in the task or pursuit in the future (Eccles and Wigfield,
2002; Barron and Hulleman, 2015). From this theoretical per-
spective, one would expect undergraduates to decide whether
to pursue graduate education or research careers based on
whether they perceived they were sufficiently competent and
whether doing research would provide sufficient value over
costs. Value can take the form of being personally interesting
(intrinsic value), being useful (utility value), and providing
prestige or respect (attainment value; Eccles and Wigfield,
2002). Cost can be experienced in terms of effort spent, emo-
tional or psychological tolls, or missed opportunities (Ceyhan
and Tillotson, 2020).
Work from Ceyhan and Tillotson (2020) indicates that
undergraduates express intrinsic and utility value as well as
opportunity costs of in-person research. However, students may
experience remote research differently, ascribing different val-
ues and costs to research and differing in their motivation to
continue research in the future. For example, students carrying
out research remotely may not be responsible for the hands-on
collection of their data, which may limit their interest in the
work (i.e., less intrinsic value). In contrast, students may per-
ceive greater utility value, because they learn computational
skills that are useful in a variety of career paths and in high
demand among employers. In addition, students may perceive
less opportunity cost of doing remote research because of its
inherent flexibility (e.g., no need to physically relocate, options
to schedule research tasks around other personal demands).
In summary, prior research using TIMSI and EVT shows that
UREs influence students’ scientific self-efficacy, scientific iden-
tity, and perceptions of the value and costs of doing research,
which can in turn influence their intentions to pursue a gradu-
ate degree and/or a research career as well as their actual pur-
suit of these paths. Here, we used these frameworks to study
the influence of remote UREs on student outcomes. Specifically,
we sought to address the following research questions:
1. To what extent do undergraduates who engage in remote
research programs experience scientific integration in terms
of gains in their scientific self-efficacy, scientific identity, val-
ues alignment, and intentions to pursue graduate education
and science research-related careers?
2. To what extent do undergraduates who engage in remote
research programs shift their perceptions of the values and
costs of doing research?
Due to COVID-19, it was not possible to include a compari-
son group of in-person undergraduate researchers. Thus, we
report our results here and interpret them with respect to pub-
lished results of in-person UREs, which include students in URE
Sites and other URE formats (e.g., Robnett et al., 2015; Frantz
et al., 2017; Ceyhan and Tillotson, 2020; Hernandez et al.,
2020).
METHODS
Here we describe the results of a single-arm, comparative study.
We collected data using established survey measures of the con-
structs of interest, which we administered before and after stu-
dents participated in a remote research program. We evaluated
the measurement models and then we addressed our research
questions by fitting a series of latent growth models within a
structural equation model framework. The results reported here
are part of a larger study of remote UREs that was reviewed and
determined to be exempt by the University of Georgia Institu-
tional Review Board (STUDY00005841, MOD00008085).
Context and Participants
We contacted the 25 institutions that planned to host remote
research programs during Summer 2020 (S. O’Connor, personal
communication) to invite them to collaborate in this study. A
total of 23 programs hosted by 24 research institutions in 18
states and one U.S. territory agreed to participate by distribut-
ing study information to their Summer 2020 cohorts of under-
graduate researchers. The sample included five non–degree
granting research institutes as well as three master’s universi-
ties, one doctoral university, two high research activity universi-
ties, and 11 very high research activity universities according to
the Carnegie Classification of Institutions of Higher Education.
Three universities were classified as Hispanic-serving institu-
tions. At the time of enrollment, undergraduate researchers did
not yet know that their summer programs would take place
remotely. One institution did not have the capacity to host its
complete program remotely, so they partnered with another
institution to host a joint program. Additionally, one of the 24
institutions offered two distinct programs funded from different
sources. We treated these as a single program, because the par-
ticipating students, their research projects, and the program
activities were quite similar (Erickson et al., 2022). In total, 307
students received the recruitment email and study information.
This number includes students (n = 27) who participated pri-
marily in person and were later excluded from the analysis. A
total of 227 remote students in 22 programs (average group
size = 12) completed both the pre and post surveys. The aver-
age program duration was 9 weeks; detailed duration data
can be found in Table 1. Of the 227 students who responded to
both the pre and post surveys, 153 identified as women, 69
identified as men, and 4 identified as non-binary. There were 45
students who indicated they were transfer students and 54 who
indicated that they were first generation college students (i.e.,
no parent or guardian completed a bachelor’s degree). Program
details are described elsewhere (Erickson et al., 2022).
The programs in this study were funded by the NSF or the
U.S. Department of Agriculture. The NSF supports UREs
through two funding mechanisms: Research Experience for
Undergraduate (REU) Sites, which host cohorts of students
each year, or REU Supplements, which typically support one or
two undergraduate researchers associated with a funded
research project (National Science Foundation, n.d.). Here, we
focus on URE Sites, which typically offer some combination of
networking with faculty and professional development to com-
plement the mentored research experience (National Science
Foundation, n.d.). In the past, URE participants have typically
been junior- or senior-level undergraduate students who have
committed to a science, technology, engineering, and mathe-
matics (STEM) major, but programs are increasingly involving
students at earlier points in their undergraduate careers in
order to attract students to a STEM career who were not already
on this path (National Science Foundation, n.d.).
Data Collection
We surveyed students twice using the secure survey service
Qualtrics: 1) at the beginning of the program (pre survey or
time 1) and 2) after all program activities had been com-
pleted (post survey or time 2). Students participating in pro-
grams that offered pre-program workshops were asked to
complete the initial survey before engaging in these work-
shops. Students were sent emails with the final survey within
a week of finishing their URE programs with up to two
reminders. Monetary incentives were not offered. Only
students who completed both surveys were included in the
sample (Table 2). The survey measures are described briefly
here and included in their entirety in the Supplemental
Material.
Scientific Self-Ecacy. Scientific self-efficacy is the extent to
which students are confident in their ability to carry out various
science research practices, such as developing a hypothesis to
test. We used a nine-item scientific self-efficacy measure that
was a combination of seven published items (Chemers et al.,
2011; Estrada et al., 2011) and two items (“Use computational
skills” and “Troubleshoot an investigation or experiment”) that
we authored based on input from the directors of the URE pro-
grams in this study. These items were intended to more fully
capture the forms of scientific self-efficacy students could
develop by engaging in remote research. Response options
ranged from 1 (“not confident”) to 6 (“extremely confident”).
Scientific Identity. Scientific identity is the extent to which stu-
dents see themselves as scientists and as members of the scien-
tific community. We used a seven-item scientific identity measure
composed of published items (Chemers et al., 2011; Estrada
et al., 2011). An example item is “I have a strong sense of
belonging to the community of scientists.” Response options
ranged from 1 (“strongly disagree”) to 6 (“strongly agree”).
Values Alignment. Values alignment is the extent to which stu-
dents see their personal values as aligning with values of the
scientific community. We used a published four-item values
alignment measure (Estrada et al., 2011), the structure of which
was based upon the Portrait Value Questionnaire (Schwartz
et al., 2001). Response options ranged from 1 (“not like me”) to
6 (“extremely like me”). An example item is “A person who
thinks it is valuable to conduct research that builds the world’s
scientific knowledge.”
Intrinsic Value. Intrinsic value refers to how much students
find research personally interesting and enjoyable. We adapted
a published six-item intrinsic value measure (Gaspard et al.,
2015b). Response options ranged from 1 (“strongly disagree”)
to 6 (“strongly agree”). An example item is “Research is fun to
me.”
Personal Importance. Personal importance (also known as
attainment value) refers to the importance that students place
on doing well in research, including how relevant doing well in
research is for their identity. We adapted a three-item personal
importance measure (Gaspard et al., 2015b). Response options
ranged from 1 (“strongly disagree”) to 6 (“strongly agree”). An
example item is “Research is very important to me personally.”
Utility Value. Although EVT conceptualizes utility value as a
single construct, work from Gaspard and others has shown
that students perceive different forms of utility from their edu-
cational experiences, such as utility for their future careers or
for helping their communities (Thoman et al., 2014; Gaspard
et al., 2015a, b). Thus, we chose to measure three forms of
utility value (i.e., job, life, and social utility) by adapting exist-
ing scales (Gaspard et al., 2015b). Job utility refers to students’
TABLE 1. Duration of URE programs: Remote URE programs in this study varied in duration, with most being about 10 weeks long

Duration in weeks    Number of programs
 5                    1
 8                    3
 9                    4a
10                   12
11                    2
aOne program had staggered end dates, with most students engaging in research for 9 weeks.
perceptions of how useful the ability to do research would be
in the context of a workplace. We adapted three job utility
items, such as “The skills I develop in research will help me be
successful in my career.” Life utility refers to students’ percep-
tions of how useful the ability to do research would be for
their everyday lives. We adapted three life utility items, such
as “Research comes in handy in everyday life.” Social utility
refers to students’ perceptions of how useful the ability to do
research would be for their communities. We adapted three
social utility items, such as “Being well versed in research will
prepare me to help my community.” For all utility items, the
response options ranged from 1 (“strongly disagree”) to 6
(“strongly agree”).
Cost. Cost is the extent to which students perceive research as
requiring them to make sacrifices. We adapted the three-item
cost scale (Gaspard et al., 2015b). Response options ranged
from 1 (“strongly disagree”) to 6 (“strongly agree”). An example
item is “I have to give up a lot to do well in research.”
Graduate and Career Intentions. Graduate and career inten-
tions refer to the extent to which students intend to pursue a grad-
uate degree or science- or research-related career. The career-re-
lated item was used from Estrada et al. (2011), and the graduate
degree–related item was similarly worded, with “career”
replaced with “graduate degree.” Response options ranged
from 1 (“I DEFINITELY WILL NOT pursue a graduate degree
in science/a science research–related career”) to 5 (“I
DEFINITELY WILL pursue a graduate degree in science/a
science research–related career”).
Previous Research Experience. To better characterize the
study sample and explore possible differential effects of remote
research experiences for students with different levels of
research experience, we asked students how much research
experience they had before they participated in the study.
Response options included: none, one semester or summer, two
semesters or summers, three semesters or summers, and more
than three semesters or summers.
Missing Data
Data were evaluated for missingness. Most variables in the pre
and post surveys were not missing any observations. Out of all
22,635 data points, only 65 were missing. Only participants
with post observations were included in the data set. To check
for attrition biases, we compared pre-survey item means of
participants who did or did not complete the post survey (n =
37) using Welch’s two-sample t tests. We observed a significant
difference only for one values alignment item, which asks par-
ticipants to rate the extent to which they agree that “I can do
better in the world based on my ability to do research.” Stu-
dents who completed the post survey had higher pre-survey
scores (M = 5.22, SD = 0.92) than those who did not (M = 4.80,
SD = 0.76). This difference may be a false positive, given that
we ran 25 t tests to test for item-level differences related to
missingness. Based on the very limited number of missing val-
ues and the absence of meaningful group differences in miss-
ingness, we assume that data were missing at random and thus
not likely to impact our results.
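For readers who want to see what this kind of attrition check looks like in practice, a minimal R sketch is shown below. The data are simulated stand-ins, and the variable names are illustrative; they are not the study's data.

```r
# Illustrative attrition check using Welch's two-sample t test
# (R's default, var.equal = FALSE). Simulated stand-in data only.
set.seed(3)
n_total <- 264                                      # e.g., completers plus non-completers
pre_item <- rnorm(n_total, mean = 5.1, sd = 0.9)    # one pre-survey item score
completed_post <- rbinom(n_total, 1, 227 / 264) == 1  # did the student finish the post survey?

# Compare pre-survey means of completers vs. non-completers
t.test(pre_item ~ completed_post)
```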
Data Analysis
Following the Anderson and Gerbing (1988) two-step
approach, we first tested confirmatory measurement models
for all measures before fitting our structural models. To attain
optimum model fit for our measurement model, we followed
an iterative process of model specification using confirmatory
factor analysis (CFA) with robust maximum likelihood estima-
tion. We also evaluated the internal consistency and invariance
of the measures. Then, we used latent growth modeling within
a structural equation model framework to address our research
questions. All analyses were conducted in R v. 4.0.1 and RStu-
dio using the R package lavaan (Rosseel, 2012; Bates et al.,
2014). We provide an overview of our analyses in the following
sections and include details in the Supplemental Material.
Assessment of Measurement Model Fit. We used several fit
indices to assess how adequately our CFA models reproduced
their variance–covariance matrices. We provide a detailed
description of our approaches and the resulting model fit statis-
tics in the Supplemental Material, with a brief summary here.
First, we assessed measurement model fit by conducting a chi-
square test (χ2) for each model (Kline, 2015). Then we assessed
goodness of fit using equivalence testing (Yuan et al., 2016; Mar-
coulides and Yuan, 2017; Peugh and Feldon, 2020). We supple-
mented evaluation of our measurement models by interpreting
factor loadings to estimate the extent to which each survey item
reflects its respective latent variable and coefficient omega (Ω)
values as a measure of internal consistency, or the degree of item
correlation within the factor (Dunn et al., 2014). Ultimately, we
balanced evidence from fit indices, factor loadings, and omega
values to determine our final measurement models. Finally, we
evaluated each measure for invariance over time points.
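As a hedged illustration of this measurement-model step (not the study's actual models or items), the following R/lavaan sketch fits a CFA with robust maximum likelihood estimation, extracts a chi-square test and robust fit indices, inspects factor loadings, and computes coefficient omega via semTools. It uses lavaan's built-in HolzingerSwineford1939 data as a stand-in; the study's own items are listed in its Supplemental Material.

```r
library(lavaan)
library(semTools)  # reliability() reports coefficient omega

# Stand-in CFA on lavaan's built-in HolzingerSwineford1939 data
cfa_model <- '
  visual  =~ x1 + x2 + x3
  textual =~ x4 + x5 + x6
  speed   =~ x7 + x8 + x9
'

fit <- cfa(cfa_model, data = HolzingerSwineford1939, estimator = "MLR")

# Chi-square test and robust fit indices
fitMeasures(fit, c("chisq.scaled", "df", "pvalue.scaled",
                   "cfi.robust", "rmsea.robust", "srmr"))

standardizedSolution(fit)  # factor loadings
reliability(fit)           # coefficient omega per factor
```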
TABLE 2. Demographics of study participantsa

                                     Previous research experience
Race/ethnicity                       None   1 Term   2 Terms   3 Terms   >3 Terms   Total
African American or Black              7       6        7         2         9         31
Central and East Asian                 6       5        8         7         4         30
Latinx                                10      13       16        11        10         60
Middle Eastern                         1       1                                       2
Native American or Native Hawaiian     2       2        2         1                    7
South Asian                            3       1        4                              8
White                                 18      30       34        13        21        116
aNote that students were able to indicate multiple races or ethnicities, so race/ethnicity counts do not sum to the total sample size.
Substantive Analyses. We calculated intraclass correlations
(ICC) using the R package psychometric v. 2.3 (Fletcher, 2010).
Specifically, we calculated ICC1, which estimates the influence
of the group on scores (Bliese, 2000). We fit our data in a struc-
tural equation model framework using latent growth models
(LGMs) with robust maximum likelihood estimation using the
lavaan R package. The models for our one-item measures of
graduate intentions and career intentions would not converge,
so these two outcomes were analyzed using a latent growth
model with both items included in the same equation (i.e., one
slope and one intercept).
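A minimal sketch of the ICC1 computation described above is given here, using simulated scores grouped by program; the data and names are illustrative, not the study's.

```r
library(psychometric)  # provides ICC1.lme()

# Simulated stand-in: one score per student, grouped by URE program
set.seed(1)
sim <- data.frame(
  program = factor(rep(1:22, each = 10)),
  score   = rnorm(220, mean = 4, sd = 1)
)

# ICC1 estimates the proportion of score variance attributable to
# program membership (via a random-intercept model)
ICC1.lme(score, program, data = sim)
```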
We fit five LGMs altogether for 11 total variables. We fit four
LGMs related to the TIMSI, one for each of four variables (i.e.,
scientific self-efficacy, scientific identity, values alignment, and
graduate school and career intentions). We fit one LGM to esti-
mate changes in the seven benefit and cost variables. For each
model, we report seven parameters: 1) where students are at
the start of the remote URE (i.e., intercept of the fixed effect,
κ1); 2) any observed growth pre- to post-URE (i.e., slope of the
fixed effect, κ2), 3) any influence of prior research experience
on students’ starting values (prior research intercept; β1) and 4)
growth (prior research slope; β2); 5) any influence of students’
program on their starting values (program intercept; β1) and 6)
growth (program slope; β2); and 7) the correlation of the ran-
dom intercept and slope (Φ21). We interpret a positive correla-
tion as indicating that students starting at a higher value (e.g.,
greater incoming self-efficacy) grew more pre- to post-URE,
whereas a negative correlation indicates that students starting
at a higher value grew less from pre- to post-URE. This coeffi-
cient helps determine whether students with higher or lower
scores at the start of the URE changed the most from pre- to
post-URE. All reported scores are unstandardized. Means and
standard deviations for each measure at both time points are
reported in Table 3.
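To make the model structure concrete, here is a hedged R/lavaan sketch of one such latent growth model with two waves and the two covariates. The variable names and simulated data are illustrative, and with only two observed composites per construct the residual variances are fixed at zero for identification; the study instead modeled each wave with multi-item latent variables inside the structural equation model.

```r
library(lavaan)

# Simulated stand-in data: pre/post composite scores plus covariates
set.seed(2)
n <- 227
dat <- data.frame(
  prior_experience = sample(0:4, n, replace = TRUE),
  program          = sample(1:22, n, replace = TRUE)
)
dat$selfeff_pre  <- 3.6 + 0.2 * dat$prior_experience + rnorm(n, 0, 0.8)
dat$selfeff_post <- dat$selfeff_pre + 0.6 + rnorm(n, 0, 0.5)

lgm_model <- '
  # Latent intercept (starting level) and slope (pre- to post-URE growth)
  i =~ 1*selfeff_pre + 1*selfeff_post
  s =~ 0*selfeff_pre + 1*selfeff_post

  # Fix residual variances to zero so the two-wave model is identified
  selfeff_pre  ~~ 0*selfeff_pre
  selfeff_post ~~ 0*selfeff_post

  # Covariates predicting starting level and growth
  i ~ prior_experience + program
  s ~ prior_experience + program

  # Covariance of starting level and growth (reported as Phi_21)
  i ~~ s
'

fit_lgm <- growth(lgm_model, data = dat, estimator = "MLR")
summary(fit_lgm, standardized = TRUE)
```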
Because we conducted 77 statistical tests altogether (seven
parameters for 11 variables), we used the Benjamini-Hochberg
procedure for controlling the false discovery rate (Benjamini
and Hochberg, 1995). This procedure calculates a critical value
for each p value using the formula (i/m)*Q, where i is the rank
of the p value from lowest to highest, m is the total number of
tests run, and Q is our chosen false discovery rate. With a total
of 77 tests and a false discovery rate of 5%, we determined that
all tests with a value of p < 0.021 would be considered
significant.
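A small worked example of the Benjamini-Hochberg cutoff calculation described above follows; the p values here are made up for illustration.

```r
# Benjamini-Hochberg: compare each ranked p value with (i/m) * Q
p_values <- c(0.0002, 0.004, 0.018, 0.021, 0.030, 0.250, 0.620)  # illustrative
m <- length(p_values)   # in the study, m = 77 tests
Q <- 0.05               # chosen false discovery rate

ranked   <- sort(p_values)
critical <- (seq_len(m) / m) * Q

# Largest rank i with p_(i) <= (i/m)*Q; all p values up to that rank are significant
k <- max(c(0, which(ranked <= critical)))
ranked[seq_len(k)]

# Equivalent built-in adjustment
p.adjust(p_values, method = "BH")
```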
RESULTS
Here we report the significant results of our LGM analyses. Given
that students were grouped by program, we first calculated ICCs
with program as the grouping variable to estimate the similarity
in scores between students of the same program. Across both
time points, intraclass correlations were small, with the highest
being ICC = 0.08 for scientific self-efficacy at time 2. These
results suggest that students in the same programs did not score
more similarly to one another than to students in other programs.
Indicators of Scientific Integration
In alignment with the TIMSI, students who participated in
remote UREs grew in their scientific self-efficacy. Collectively,
students did not grow in their scientific identity, values align-
ment, or graduate and career intentions pre- to post-URE. How-
ever, many students began their remote UREs with up to three
terms of prior research experience and started their remote
UREs at high levels on these variables. When we analyzed
growth related to students’ pre-URE levels, we found that stu-
dents with lower starting scientific identity, values alignment,
and graduate and career intentions grew, while those with
higher starting levels did not. We report the specific results for
each outcome in the following sections and in Tables 3 and 4.
Students Grew in Their Scientific Self-Ecacy Regardless of
Their Starting Levels. Students began their UREs reporting
moderate levels of scientific self-efficacy (M = 3.65, SD = 0.091,
κ1 = 2.91). On average, students increased in their scientific
self-efficacy by a value of 0.87 on a 1 to 6 scale from pre- to
post-URE (κ2 = 0.87, SE = 0.25, p = 0.0002). In addition, stu-
dents who started their UREs at a lower level of scientific
self-efficacy experienced greater growth than those with higher
starting values (Φ21 = −0.44, SE = 0.07, p < 0.0001). Students’
prior research experience significantly predicted their scientific
self-efficacy at the start of their UREs (β1 = 0.23, SE = 0.04, p <
0.0001), but did not significantly predict their self-efficacy
growth from pre- to post-URE. Students did not differ in their
starting self-efficacy (p = 0.733) or their self-efficacy growth
based on their programs (p = 0.859).
TABLE 3. Descriptive statistics for outcome variables

                              N (Pre)   N (Post)   Mean ± SD (Pre)   Mean ± SD (Post)
Scientific self-efficacy        259       221        3.65 ± 0.91       4.27 ± 0.88
Scientific identity             257       226        4.64 ± 0.92       4.92 ± 0.97
Values alignment                262       225        5.29 ± 0.68       5.34 ± 0.77
Graduate school intentions      262       227        4.36 ± 0.78       4.38 ± 0.81
Career intentions               262       227        4.21 ± 0.84       4.31 ± 0.79
Enjoyment                       257       227        5.17 ± 0.88       5.19 ± 0.99
Intrinsic value                 261       227        5.43 ± 0.69       5.35 ± 0.97
Personal importance             262       226        5.31 ± 0.71       5.28 ± 0.85
Job utility                     262       227        5.54 ± 0.67       5.49 ± 0.77
Life utility                    260       226        5.12 ± 0.78       5.04 ± 0.96
Social utility                  262       227        5.28 ± 0.72       5.18 ± 0.93
Cost                            257       226        3.47 ± 1.28       3.44 ± 1.49
In analyzing the scientific self-efficacy data, we observed that
the mean score for item 2 (“Use computational skills [software,
algorithms, and/or quantitative technologies]”) was lower than
for the other items in the scale: M = 3.08 pre-URE (vs. M =
3.42–4.10 for other items) and M = 4.00 post-URE (vs. M = 3.85–
4.74 for other items). This suggests that, even though students
are experiencing scientific self-efficacy growth, students per-
ceived themselves to be less capable in their computational skills.
Students with Lower Starting Levels Grew in Their Scientific
Identity. As a group, students began their UREs reporting a
higher level of scientific identity than scientific self-efficacy (M
= 4.64, SD = 0.92, κ1 = 3.87), and those with more prior research
experience began their UREs reporting greater scientific iden-
tity (β1 = 0.15, SE = 0.04, p < 0.001). As a group, students did
not grow significantly in their scientific identity from pre- to
post-URE (p = 0.074). Rather, students with lower starting lev-
els experienced more growth in their scientific identity than
those with higher starting levels (Φ21 = −0.21, SE = 0.05, p <
0.0001). Students differed slightly in their starting levels of sci-
entific identity based on their programs (β1 = 0.02, SE = 0.01,
p = 0.018), but they did not differ in their identity growth based
on their programs (p = 0.957).
Students with Lower Starting Levels Grew in Their Values
Alignment. Students began their UREs reporting high levels of
values alignment (M = 5.29, SD = 0.68, κ1 = 5.14). Collectively,
students did not change in their values alignment from pre- to
post-URE (p = 0.261). Yet students with lower starting levels of
values alignment grew more in their values alignment compared
with those who started with higher levels (Φ21 = −0.14, SE =
0.04, p < 0.0001). Students with more prior research experience
reported slightly higher levels of values alignment at the start of
their UREs (β1 = 0.08, SE = 0.03, p = 0.009), although prior
experience alone did not predict changes in their values align-
ment (p = 0.697). Finally, students did not differ in their starting
levels of values alignment or changes in their values alignment
based on their programs (p = 0.759 and p = 0.166, respectively).
Students with Lower Starting Levels Increased Their
Intentions to Pursue Graduate School and Research
Careers. Students began their UREs already intending to
TABLE 4. Students in remote UREs dier in their scientific integration based on their starting levels
Outcome ParameteraβSE z p
Scientific self-efficacy Starting level 2.91 0.25 11.64 0.000
Growth 0.87 0.29 3.03 0.002
Starting level by program 0.00 0.01 0.34 0.733
Growth by program 0.00 0.01 0.18 0.859
Starting level based on prior experience 0.23 0.04 5.37 0.000
Growth based on prior experience 0.09 0.05 1.75 0.081
Growth based on starting level −0.44 0.07 −6.12 0.000
Scientific identity Starting level 3.87 0.24 16.03 0.000
Growth 0.43 0.24 1.79 0.074
Starting level by program 0.02 0.01 2.37 0.018
Growth by program 0.00 0.01 0.05 0.957
Starting level based on prior experience 0.15 0.04 3.58 0.000
Growth based on prior experience 0.08 0.04 1.81 0.070
Growth based on starting level −0.21 0.06 −3.62 0.000
Values alignment Starting level 5.14 0.21 24.71 0.000
Growth 0.26 0.24 1.13 0.261
Starting level by program 0.00 0.01 0.31 0.759
Growth by program 0.01 0.01 1.38 0.166
Starting level based on prior experience 0.08 0.03 2.61 0.009
Growth based on prior experience 0.01 0.03 0.39 0.697
Growth based on starting level −0.14 0.04 −3.51 0.000
Graduate school and career intentions Starting level 3.98 0.19 21.35 0.000
Growth 0.02 0.17 0.14 0.886
Starting level by program 0.00 0.01 0.47 0.635
Growth by program 0.00 0.01 0.05 0.957
Starting level based on prior experience 0.08 0.03 2.10 0.017
Growth based on prior experience 0.07 0.03 0.24 0.815
Growth based on starting level −0.09 0.03 −2.75 0.006
aWe interpret the intercept fixed effect (κ1) as the level at which students started their UREs (starting level); the slope fixed effect (κ2) as students’ growth from pre- to
post-URE (growth); intercept of program and prior research experience variables (β1) as starting level by program and starting level based on prior experience, respec-
tively; the slope of program and prior research experience variables (β2) as students’ growth by program and growth based on prior experience, respectively; and the
correlation of the random intercept and slope (Φ21) as an indicator of whether students experienced different growth based on starting level on a variable. A positive
correlation indicates that students starting at a higher level grew more pre- to post-URE, and a negative correlation indicates that students starting at a higher value grew
less. Significant results are bolded.
attend graduate school (M = 4.36, SD = 0.79) and pursue a
research career (M = 4.21, SD = 0.84), and their intentions as a
group did not change pre- to post-URE (p = 0.886). Again, stu-
dents with lower starting intentions experienced more growth
in their intentions compared with those who started with higher
levels (Φ21 = −0.09, SE = 0.03, p = 0.006). Students with more
prior research experience reported slightly higher intentions at
the start of their UREs (β1 = 0.08, SE = 0.03, p = 0.017),
although prior experience alone did not predict changes in their
intentions (p = 0.815). Finally, students did not differ in their
starting levels of intentions or changes in their intentions based
on their programs (p = 0.635 and p = 0.957, respectively).
Perceptions of Benefits and Costs
Collectively, students who participated in remote UREs did not
change their perceptions of the benefits and costs of doing
research from pre- to post-URE. Yet students with lower starting
perceptions of the benefits and costs of doing research grew
more in their perceptions of both. We report the specific results
for each outcome below and in Tables 3 and 5.
Students with Lower Starting Levels Grew in Their Enjoyment
of Research, Personal Importance of Research, and Utility
Values of Research. On average, students began their UREs at
a very high level of enjoyment (M = 5.17, SD = 0.88, κ1 = 4.49)
and did not change in their enjoyment pre- to post-URE (p =
0.381). Students with more prior research experience started at
a slightly higher level of enjoyment of research (β1 = 0.18, SE =
0.04, p < 0.0001), while students with lower starting levels of
enjoyment grew more in their enjoyment of research (Φ21 =
−0.18, SE = 0.05, p = 0.001).
Collectively, students also began their UREs perceiving a
high level of the personal importance of doing research (M =
5.31, SD = 0.71, κ1 = 4.85), and this did not change pre- to post-
URE (p = 0.748). However, students with lower starting levels
experienced more growth in their personal importance of
research than those with higher starting levels (Φ21 = −0.13, SE
= 0.05, p = 0.010). Students did not differ in their starting levels
or growth of personal importance of research based on their
programs (p = 0.039 and p = 0.397, respectively) or their prior
research experience (p = 0.115 and p = 0.821, respectively).
Similarly, students as a group started their UREs with very
positive perceptions of the job, life, and social utility aspects of
research (job utility: M = 5.54, SD = 0.67, κ1 = 5.18; life utility:
M = 5.12, SD = 0.78, κ1 = 4.90; social utility: M = 5.28, SD =
0.72, κ1 = 5.06), and this did not change pre- to post-URE
(p values > 0.70). However, students with lower starting levels
experienced more growth in their utility perceptions than those
with higher starting levels (job utility: Φ21 = −0.16, SE = 0.06,
p = 0.004; life utility: Φ21 = −0.09, SE = 0.04, p = 0.019; social
utility: Φ21 = −0.13, SE = 0.05, p = 0.015). Students differed
very slightly in their starting perceptions of job utility based on
their programs (β1 = 0.02, SE = 0.01, p = 0.006), but did not
differ in their starting levels or growth of their utility percep-
tions based on their prior research experience (p values > 0.20).
Students Did Not Change in Their Intrinsic Value of Research,
Regardless of Their Starting Levels. Students began their
UREs perceiving a high level of intrinsic value of doing research
(M = 5.43, SD = 0.69, κ1 = 4.82), and this did not change pre- to
post-URE (p = 0.939). Students differed slightly in their starting
levels of intrinsic value of research based on their programs (β1
= 0.02, SE = 0.01, p = 0.014) and their prior research experi-
ences (β1 = 0.09, SE = 0.03, p = 0.006). Contrary to other out-
comes, students did not differ in their growth in intrinsic value
based on their starting levels (p = 0.240).
Students with Lower Starting Levels Grew in Their
Perceptions of the Costs of Research. On average, students
began their UREs reporting a moderate level of perceived costs
(M = 3.47, SD = 1.28, κ1 = 2.48), and this did not change pre- to
post-URE. Yet students with lower starting cost perceptions
experienced more growth in their cost perceptions than those
with higher starting levels (Φ21 = −0.32, SE = 0.11, p = 0.003).
Students differed very slightly in their starting cost perceptions
based on their programs (β1 = 0.05, SE = 0.02, p = 0.001), but
not on their prior research experiences (p = 0.925). In addition,
students’ cost perceptions did not change in ways that related to
their programs or prior research experiences (p = 0.042 and
p = 0.641, respectively).
DISCUSSION
In this study, we first sought to determine whether undergradu-
ates who engage in remote research programs experienced
research-related social influence in terms of gains in their self-ef-
ficacy, scientific identity, and values alignment (research ques-
tion 1). We found that students in remote UREs experienced
some level of integration into the scientific community despite
the remote circumstances (Estrada et al., 2011; Adedokun et al.,
2013; Robnett et al., 2015; Frantz et al., 2017). Specifically, stu-
dents who completed remote UREs experienced significant
gains in their scientific self-efficacy, and these gains were due to
their research experiences and not to their particular URE pro-
grams. Even students who had prior research experience grew
in their scientific self-efficacy. This result might be attributable
to additional research experience building students’ confidence
in their research skills, regardless of how much research they
have done before. Alternatively, students’ self-efficacy growth
may be due to the fact that remote research requires different
skill sets than in-person projects (e.g., using particular software,
writing code; Erickson et al., 2022). Indeed, students started
their UREs reporting less confidence in their computational
skills than in their other research-related skills. It is unclear
whether students’ initial uncertainty about their computational
skills is specific to remote research or unique to the last-minute
shift away from bench or field research. As a reminder,
most of the students in this study were accepted into their pro-
grams before decisions were made to offer programs remotely.
Regardless, students perceived that they developed their com-
putational skills even though they were researching remotely.
The self-efficacy growth experienced by students in this study
resembled the growth observed in a number of longitudinal
studies of in-person UREs. For instance, Robnett and colleagues
(2015) studied students who completed in-person UREs at col-
leges and universities across the country. The positive effects
they observed took place over a period of four semesters of
in-person research, while the positive effects we observed
occurred in a much shorter period—an average of about 9
weeks—in entirely remote research. This result may be due to
the intensity of the summer experience (35–40 hours per
week) versus the less intense, more protracted nature of aca-
demic year UREs. Frantz and colleagues have observed similar
self-efficacy growth among students in a 10-week summer pro-
gram, providing additional evidence that shorter, intensive
experiences are similarly effective in building students’ confi-
dence in their ability to be successful in science research com-
pared with longer, less intense programs (Frantz et al., 2017).
Estrada and colleagues (2018) also studied the effects of UREs
on the self-efficacy of a cohort of underrepresented minority stu-
dents in their junior and senior years. Similar to our results, their
TABLE 5. Students in remote UREs dier in their perceptions of the benefits and costs of doing research based on their initial perceptions
Outcome ParameteraβSE z p
Enjoyment Starting level 4.49 0.27 16.79 0.000
Growth 0.20 0.23 0.88 0.381
Starting level by program 0.01 0.01 0.91 0.362
Growth by program 0.01 0.01 0.66 0.508
Starting level based on prior experience 0.18 0.04 4.10 0.000
Growth based on prior experience 0.03 0.04 0.84 0.403
Growth based on starting level −0.18 0.05 −3.36 0.001
Intrinsic value Starting level 4.82 0.20 24.59 0.000
Growth 0.02 0.24 0.08 0.939
Starting level by program 0.02 0.01 2.46 0.014
Growth by program 0.01 0.01 0.63 0.526
Starting level based on prior experience 0.09 0.03 2.75 0.006
Growth based on prior experience 0.00 0.04 0.11 0.911
Growth based on starting level 0.06 0.05 1.17 0.240
Personal importance Starting level 4.85 0.21 22.77 0.000
Growth 0.08 0.25 0.32 0.748
Starting level by program 0.02 0.01 2.06 0.039
Growth by program 0.01 0.01 0.85 0.397
Starting level based on prior experience 0.06 0.04 1.58 0.115
Growth based on prior experience 0.01 0.04 0.23 0.821
Growth based on starting level −0.13 0.05 −2.56 0.010
Job utility Starting level 5.18 0.18 28.31 0.000
Growth 0.05 0.22 0.24 0.809
Starting level by program 0.02 0.01 2.75 0.006
Growth by program 0.01 0.01 1.11 0.266
Starting level based on prior experience 0.02 0.03 0.80 0.425
Growth based on prior experience 0.05 0.04 1.41 0.159
Growth based on starting level −0.16 0.06 −2.92 0.004
Life utility Starting level 4.90 0.19 25.51 0.000
Growth 0.06 0.21 0.28 0.783
Starting level by program 0.02 0.01 2.27 0.023
Growth by program 0.01 0.01 1.82 0.068
Starting level based on prior experience 0.01 0.03 0.23 0.815
Growth based on prior experience 0.06 0.04 1.58 0.114
Growth based on starting level −0.09 0.04 −2.34 0.019
Social utility Starting level 5.06 0.22 23.43 0.000
Growth 0.06 0.26 0.22 0.824
Starting level by program 0.01 0.01 1.27 0.204
Growth by program 0.01 0.01 1.07 0.283
Starting level based on prior experience 0.02 0.04 0.51 0.614
Growth based on prior experience 0.05 0.04 1.27 0.203
Growth based on starting level −0.13 0.05 −2.44 0.015
Cost Starting level 2.48 0.38 6.55 0.000
Growth 0.51 0.36 1.43 0.152
Starting level by program 0.05 0.02 3.23 0.001
Growth by program 0.02 0.01 2.03 0.042
Starting level based on prior experience 0.01 0.07 0.09 0.925
Growth based on prior experience 0.03 0.07 0.47 0.641
Growth based on starting level −0.32 0.11 −3.02 0.003
aWe interpret the intercept fixed effect (κ1) as the level at which students start their UREs (starting level); the slope fixed effect (κ2) as students’ growth from pre- to
post-URE (growth); intercept of program and prior research experience variables (β1) as starting level by program and starting level based on prior experience, respec-
tively; the slope of program and prior research experience variables (β2) as students’ growth by program and growth based on prior experience, respectively; and the
correlation of the random intercept and slope (Φ21) as an indicator of whether students experienced different growth based on starting level on a variable. A positive
correlation indicates that students starting at a higher level grew more pre- to post-URE, and a negative correlation indicates that students starting at a higher value grew
less. Significant results are bolded.
findings indicated that in-person UREs had a small but signifi-
cant positive effect on students’ self-efficacy.
Students in our study only experienced changes in their sci-
entific identity, values alignment, or intentions to pursue grad-
uate education or research careers if they started their remote
UREs with lower levels of these indicators. In addition, these
students made relatively larger gains in self-efficacy, lesser
gains in scientific identity, and even more modest gains in val-
ues alignment and graduate and career intentions. This pattern
of effect sizes resembles those observed in studies of in-person
UREs (Robnett et al., 2015; Frantz et al., 2017; Hernandez
et al., 2020), indicating that students are experiencing remote
UREs similarly to in-person UREs but perhaps to a lesser extent.
Indeed, our results differ from those observed for in-person
UREs, several of which have documented a positive influence
of UREs on students’ scientific identity regardless of their start-
ing point. Thus, remote UREs appear to be productive environ-
ments for advancing students’ scientific integration, but pri-
marily for students who do not already perceive themselves as
integrated into the scientific community. Notably, stu-
dents’ starting levels on indicators of integration were more
predictive of growth than their prior research experiences. This
result is consistent with observations that UREs can vary widely
in implementation (Gentile et al., 2017) and students’ experi-
ences differ, even within the same program (Cooper et al.,
2019; Limeri et al., 2019; Erickson et al., 2022). Thus, pro-
grams and researchers should be cautious about assuming that
students who report engaging in research for a similar number
of terms have comparable experiences or realize similar
outcomes.
In keeping with the EVT of motivation (Barron and
Hulleman, 2015), we also sought to explore the extent to which
undergraduates in remote research programs shifted their per-
ceptions of the benefits and costs of doing research (research
question 2). Students in this study already perceived high ben-
efits and low costs of research when they started their remote
research, and their perceptions did not change. It is encourag-
ing that the challenges of remote research did not, on average,
dissuade students from the benefits of doing research and did
not magnify their cost perceptions. In fact, students who started
their remote UREs with lower levels increased slightly in their
enjoyment, personal importance, and utility values of doing
research. Thus, to some extent, remote UREs are useful for stu-
dents to weigh the benefits of doing research if they do not
already perceive high benefits. We were unable to find any
quantitative studies of undergraduate researchers’ perceptions
of the benefits and costs of doing research with which to com-
pare our results. Qualitative research from Ceyhan and Tillotson
(2020) indicates that undergraduates express intrinsic value,
which includes both interest and enjoyment, utility value, and
opportunity costs of in-person research. Our findings are consis-
tent with these results and offer additional insight that two fac-
ets of intrinsic value, namely students’ enjoyment of and inter-
est in research, can be empirically distinguished. Our results are
consistent with the notion that students with high interest in
research self-select into summer research programs and do not
on average change in their interest, but can experience changes
in other research-related values.
Notably, students who started their remote UREs with lower
cost perceptions also increased in these perceptions. That students perceived greater benefits and costs of doing research seems counterintuitive, yet this effect was also observed by Ceyhan
and Tillotson (2020) in their study of in-person UREs. Students
may be developing a deeper or more sophisticated understand-
ing of what research is and what doing research entails, which
enables them to recognize more and different benefits as well as
more costs. Our measurement model assessment results (in the
Supplemental Material) support this idea, because the factor
loadings for benefits and costs items increase pre- to post-URE
and the measures show configural invariance but not factorial
invariance. In other words, students appear to be perceiving the
items differently after they complete their UREs than before.
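For readers unfamiliar with this distinction, the sketch below illustrates a configural-versus-factorial (metric) comparison using the lavaan package listed in the references (Rosseel, 2012). It is a generic, hypothetical example rather than the study's instrument or code: the three benefit items (b1–b3), the data frame survey_wide, and the single-factor structure are placeholders.

    library(lavaan)

    # Configural model: the same factor structure at pre and post, loadings free.
    # Residuals of the same item are allowed to correlate across time points.
    configural <- '
      benefit_pre  =~ b1_pre  + b2_pre  + b3_pre
      benefit_post =~ b1_post + b2_post + b3_post
      b1_pre ~~ b1_post
      b2_pre ~~ b2_post
      b3_pre ~~ b3_post
    '

    # Factorial (metric) model: non-marker loadings constrained equal across
    # time via shared labels; the first loading is fixed to 1 at both times.
    metric <- '
      benefit_pre  =~ b1_pre  + L2*b2_pre  + L3*b3_pre
      benefit_post =~ b1_post + L2*b2_post + L3*b3_post
      b1_pre ~~ b1_post
      b2_pre ~~ b2_post
      b3_pre ~~ b3_post
    '

    fit_configural <- cfa(configural, data = survey_wide)
    fit_metric     <- cfa(metric,     data = survey_wide)

    # A significant loss of fit when loadings are constrained indicates that
    # factorial invariance does not hold (i.e., loadings change pre to post).
    anova(fit_configural, fit_metric)

In this framing, configural invariance corresponds to the unconstrained model fitting acceptably, whereas a failure of factorial invariance corresponds to the constrained model fitting significantly worse.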
LIMITATIONS
There are several limitations of this study that should be consid-
ered in interpreting the results. The main limitation is that we
designed the study as a single-arm, pre/post comparison; no comparison group of students completing UREs in person was included because of the circumstances caused by COVID-19. It
may be that students who opted to participate in a remote URE
were particularly primed for success or that mentors and URE
program directors put forth additional effort to ensure a posi-
tive experience. It also may be that students were grateful to
have any meaningful experience in the midst of the pandemic
lockdown and thus responded more favorably than would oth-
erwise be the case. Future research should directly compare
remote versus in-person UREs, ideally using random assign-
ment to one or the other format with students who are willing
to do either. Our results provide at least some evidence of the
benefits of remote research, which mitigates the ethical con-
cerns associated with such a study.
Another limitation is that we did not collect program-level
data that would allow us to connect student outcomes to pro-
gram features or activities. Future research should explore how
to systematically characterize URE elements in a way that allows
such connections to be explored and tested. Although some
efforts have been made to characterize what undergraduates do
during research (Robnett et al., 2015), these efforts do not cap-
ture program-level elements that are likely to influence student
experiences and outcomes from UREs (Erickson et al., 2022).
It may be that growth by students who started their remote
UREs at higher levels of the constructs we examined was limited
by the measures we used. We examined means and standard
deviations, which indicated room for growth, and tested for and
ruled out regression to the mean as an alternative explanation for
limited growth by these students (see Supplemental Material).
However, we cannot rule out limitations of the measurements.
For instance, our measure of scientific identity demonstrated
configural invariance and high internal reliability, but the mea-
surement model fit was only fair and did not show factorial
invariance (i.e., factor loadings increased from pre- to post-
URE). These results suggest that students may be changing their
thinking and perceptions about research as they engage in
research. Undergraduates may be shifting from thinking of
themselves as a “science person” to a “science research person”
as they gain more research experience. Current measures likely
capture the former but not the latter. Moving forward, research-
ers should explore the utility of existing measures for discrimi-
nating among undergraduate students with more or less
research experience and develop additional measures as needed.
Finally, there were limitations related to our sample, which
was composed entirely of biology students. Therefore, our
results may be unique to the discipline. Biology research may
be more or less amenable to remote research compared with
other STEM disciplines. Moreover, as the full extent of the
COVID-19 pandemic unfolded, students and mentors who
chose to move forward with remote research may possess dif-
ferent personality traits or differing levels of our variables of
interest (i.e., scientific identity, scientific self-efficacy) from
those who opted out of remote research. Research topics them-
selves likely changed during the transition to accommodate the
remote research arrangement, so researchers who chose to
move forward with remote research may have conducted a dif-
ferent type of research than they originally envisioned. Finally,
data were collected during a time of social unrest in the United
States during Summer 2020. Awareness of social unrest and
systemic racism may have affected the well-being of partici-
pants, which may have influenced their experiences in the
remote URE program.
CONCLUSION
In summary, our work suggests that remote UREs can have a
positive effect on student outcomes, especially their scientific
self-efficacy, which has been shown to influence students’ deci-
sions to continue in science research–related career paths
(Estrada et al., 2011; Hernandez et al., 2018). Thus, programs
may wish to offer remote URE programming even though
in-person research has resumed. Perhaps the greatest advantage
of remote research programs is that they open doors for stu-
dents who may not have the opportunity to participate in an
in-person research program (Erickson et al., 2022). Remote
UREs can allow for more flexible scheduling and enable research
participation without the additional costs and logistics of travel
and lodging. Thus, remote programs may be a viable method of
expanding access to UREs, especially among students who may
find it difficult to travel.
Although remote UREs have many advantages, their appro-
priateness should be evaluated on a case-by-case basis and
should be considered alongside the advantages and disadvan-
tages of in-person UREs. Our results indicate that remote UREs
do not benefit all students equally. Rather, the benefits appear
to be larger for students who have more to gain because they
report lower levels of scientific integration and perceive fewer
benefits associated with doing research. Furthermore, certain
types of research (e.g., computational biology) may be more
amenable to remote work (Alford et al., 2017). Particular
research mentors and undergraduates may be better able to
navigate the unstructured nature of remote work. Certain
remote research environments may be more or less accessible
for different individuals, such as those who can sit and work on
a computer for extended periods of time (Reinholz and Ridgway,
2021). Certain personal situations may make remote research
more difficult, such as whether individuals have access to robust
Internet connections and quiet workspaces (Erickson et al.,
2022). Finally, because students are not able to complete bench-
work at home, remote UREs may aid in the development of a
different skill set than in-person UREs. Thus, students may ben-
efit from completing both types of UREs throughout their
undergraduate degree programs in order to develop a wider
variety of skills.
It is important to note that students in this study were all
conducting the entire research experience remotely. In the future,
URE programs may wish to consider hybrid designs in which
some students are in person and others are remote, or in which
all students participate partly in person and partly remotely. Stu-
dents may experience a hybrid program quite differently than a
remote program, which could influence their outcomes. We are
not aware of any existing research to support the efficacy of a
hybrid URE program. If such a program exists, we encourage
researchers to investigate differential outcomes for in-person
and remote students who are within the same URE program.
ACKNOWLEDGMENTS
We thank all of the students, faculty, and other research mentors
for their willingness to proceed with remote REU programming
and for sharing their experiences so that others could learn. We
also thank the Social Psychology of Research Experiences and
Education group members for feedback on drafts of this article.
This material is based upon work supported by the NSF under
grant no. DBI-2030530. Any opinions, findings, conclusions, or
recommendations expressed in this material are those of the
authors and do not necessarily reflect the views of the funding organization. The authors dedicate this work to all of the
undergraduates seeking to do research and the individuals who
provide these opportunities despite challenging circumstances.
REFERENCES
Adedokun, O. A., Bessenbacher, A. B., Parker, L. C., Kirkham, L. L., & Burgess,
W. D. (2013). Research skills and STEM undergraduate research students’
aspirations for research careers: Mediating effects of research self-efficacy. Journal of Research in Science Teaching, 50(8), 940–951. https://
doi.org/10.1002/tea.21102
Aikens, M. L., Sadselia, S., Watkins, K., Evans, M., Eby, L. T., & Dolan, E. L.
(2016). A social capital perspective on the mentoring of undergraduate
life science researchers: An empirical study of undergraduate–postgrad-
uate–faculty triads. CBE—Life Sciences Education, 15(2), ar16. https://
doi.org/10.1187/cbe.15-10-0208
Alford, R. F., Leaver-Fay, A., Gonzales, L., Dolan, E. L., & Gray, J. J. (2017). A
cyber-linked undergraduate research experience in computational bio-
molecular structure prediction and design. PLoS Computational Biology,
13(12), e1005837.https://doi.org/10.1371/journal.pcbi.1005837
Anderson, J. C., & Gerbing, D. W. (1988). Structural equation modeling in
practice: A review and recommended two-step approach. Psychological
Bulletin, 103(3), 411–423.https://doi.org/10.1037/0033-2909.103.3.411
Barron, K. E., & Hulleman, C. S. (2015). Expectancy-value-cost model of mo-
tivation. Psychology, 84, 261–271.
Bates, D., Maechler, M., Bolker, B., & Walker, S. (2014). lme4: Linear Mixed-Effects Models Using Eigen and S4 (R Package Version 1.1-4).
Retrieved September 12, 2021, from https://cran.r-project.org/web/
packages/lme4/index.html
Benjamini, Y., & Hochberg, Y. (1995). Controlling the false discovery rate: A
practical and powerful approach to multiple testing. Journal of the Roy-
al Statistical Society. Series B (Methodological), 57, 289–300.
Bliese, P. D. (2000). Within-group agreement, non-independence, and reli-
ability: Implications for data aggregation and analysis. In Klein, K. J., &
Kozlowski, S. W. J. (Eds.), Multilevel theory, research, and methods in or-
ganizations: Foundations, extensions, and new directions (pp. 349–381).
Hoboken, NJ: Jossey-Bass/Wiley.
Ceyhan, G. D., & Tillotson, J. W. (2020). Early year undergraduate research-
ers’ reflections on the values and perceived costs of their research expe-
rience. International Journal of STEM Education, 7(1), 1–19.
Chemers, M. M., Zurbriggen, E. L., Syed, M., Goza, B. K., & Bearman, S. (2011).
The role of efficacy and identity in science career commitment among
underrepresented minority students. Journal of Social Issues, 67(3),
469–491.https://doi.org/10.1111/j.1540-4560.2011.01710.x
Cooper, K. M., Gin, L. E., Akeeh, B., Clark, C. E., Hunter, J. S., Roderick, T. B.,
... & Pfeier, L. D. (2019). Factors that predict life sciences student per-
sistence in undergraduate research experiences. PLoS ONE, 14(8)
Dunn, T. J., Baguley, T., & Brunsden, V. (2014). From alpha to omega: A prac-
tical solution to the pervasive problem of internal consistency estima-
tion. British Journal of Psychology, 105(3), 399–412.
Eccles, J. S., & Wigfield, A. (2002). Motivational beliefs, values, and goals.
Annual Review of Psychology, 53(1), 109–132. https://doi.org/10.1146/
annurev.psych.53.100901.135153
Erickson, O. A., Cole, R. B., Isaacs, J. M., Alvarez-Clare, S., Arnold, J.,
Augustus-Wallace, A., ... & Dolan, E. L. (2022). “How do we do this at a
distance?!” A descriptive study of remote undergraduate research pro-
grams during COVID-19. CBE—Life Sciences Education, 21(1), ar1.https://
doi.org/10.1187/cbe.21-05-0125
Estrada, M., Hernandez, P. R., Schultz, P. W., & Herrera, J. (2018). A longitudi-
nal study of how quality mentorship and research experience integrate
underrepresented minorities into STEM careers. CBE—Life Sciences
Education, 17(1), ar9.https://doi.org/10.1187/cbe.17-04-0066
Estrada, M., Woodcock, A., Hernandez, P. R., & Schultz, W. P. (2011). Toward a
model of social influence that explains minority student integration into the scientific community. Journal of Educational Psychology, 103(1),
206–222.https://doi.org/10.1037/a0020743
Fletcher, T. D. (2010). psychometric: Applied Psychometric Theory (R Pack-
age Version 2.3). Retrieved September 28, 2022, from https://cran.r
-project.org/web/packages/psychometric/index.html
Frantz, K. J., Demetrikopoulos, M. K., Britner, S. L., Carruth, L. L., Williams, B. A.,
Pecore, J. L., ... & Goode, C. T. (2017). A comparison of internal dispositions
and career trajectories after collaborative versus apprenticed research ex-
periences for undergraduates. CBE—Life Sciences Education, 16(1), ar1.
Gaspard, H., Dicke, A.-L., Flunger, B., Brisson, B. M., Häfner, I., Nagengast, B.,
& Trautwein, U. (2015a). Fostering adolescents’ value beliefs for mathe-
matics with a relevance intervention in the classroom. Developmental
Psychology, 51(9), 1226.
Gaspard, H., Dicke, A.-L., Flunger, B., Schreier, B., Häfner, I., Trautwein, U., &
Nagengast, B. (2015b). More value through greater differentiation: Gender differences in value beliefs about math. Journal of Educational
Psychology, 107(3), 663.
Gentile, J., Brenner, K., & Stephens, A. (2017). Undergraduate research expe-
riences for STEM students: Successes, challenges, and opportunities. Washington, DC: National Academies Press. Retrieved May 17, 2017, from www.nap.edu/
catalog/24622/undergraduate-research-experiences-for-stem
-students-successes-challenges-and-opportunities
Hanauer, D. I., Frederick, J., Fotinakes, B., & Strobel, S. A. (2012). Linguistic
analysis of project ownership for undergraduate research experiences.
CBE—Life Sciences Education, 11(4), 378–385.https://doi.org/10.1187/
cbe.12-04-0043
Hernandez, P. R., Agocha, V. B., Carney, L. M., Estrada, M., Lee, S. Y., Loomis,
D., ... & Park, C. L. (2020). Testing models of reciprocal relations between
social inuence and integration in STEM across the college years. PLoS
ONE, 15(9), e0238250.https://doi.org/10.1371/journal.pone.0238250
Hernandez, P. R., Woodcock, A., Estrada, M., & Schultz, P. W. (2018). Under-
graduate research experiences broaden diversity in the scientific work-
force. BioScience, 68(3), 204–211.https://doi.org/10.1093/biosci/bix163
Hunter, A.-B., Laursen, S. L., & Seymour, E. (2007). Becoming a scientist: The
role of undergraduate research in students’ cognitive, personal, and pro-
fessional development. Science Education, 91(1), 36–74. https://doi
.org/10.1002/sce.20173
Joshi, M., Aikens, M. L., & Dolan, E. L. (2019). Direct ties to a faculty mentor
related to positive outcomes for undergraduate researchers. BioScience,
69(5), 389–397.https://doi.org/10.1093/biosci/biz039
Kline, R. B. (2015). Principles and practice of structural equation modeling.
New York, NY: Guilford.
Korbel, J. O., & Stegle, O. (2020). Effects of the COVID-19 pandemic on life
scientists. Genome Biology, 21(1), 113. https://doi.org/10.1186/s13059
-020-02031-1
Laursen, S., Hunter, A.-B., Seymour, E., Thiry, H., & Melton, G. (2010). Under-
graduate research in the sciences: Engaging students in real science. San
Francisco, CA: Wiley.
Limeri, L. B., Asif, M. Z., Bridges, B. H. T., Esparza, D., Tuma, T. T., Sanders, D.,
... & Dolan, E. L. (2019). “Where’s My Mentor?!” Characterizing negative
mentoring experiences in undergraduate life science research. CBE—Life
Sciences Education, 18(4), ar61.https://doi.org/10.1187/cbe.19-02-0036
Lopatto, D. (2003). The essential features of undergraduate research. Coun-
cil on Undergraduate Research Quarterly, 24, 139–142.
Lopatto, D., & Tobias, S. (2010). Science in solution: The impact of undergrad-
uate research on student learning. Washington, DC: Council on Under-
graduate Research.
Marcoulides, K. M., & Yuan, K.-H. (2017). New ways to evaluate goodness of
fit: A note on using equivalence testing to assess structural equation
models. Structural Equation Modeling: A Multidisciplinary Journal, 24(1),
148–153.
National Science Foundation. (n.d.). Research experiences for undergradu-
ates (REU). Retrieved September 12, 2021, from https://beta.nsf.gov/
funding/opportunities/research-experiences-undergraduates-reu
Peugh, J., & Feldon, D. F. (2020). “How well does your structural equation
model t your data?”: Is Marcoulides and Yuan's equivalence test the an-
swer? CBE—Life Sciences Education, 19(3), es5.
Redden, E. (2020). “Nonessential” research has halted on many campuses.
Inside Higher Ed. Retrieved September 12, 2021, from www.insidehighered
.com/news/2020/03/30/nonessential-research-has-halted-many
-campuses
Reinholz, D. L., & Ridgway, S. W. (2021). Access needs: Centering students
and disrupting ableist norms in STEM. CBE—Life Sciences Education,
20(3), es8.https://doi.org/10.1187/cbe.21-01-0017
Robnett, R. D., Chemers, M. M., & Zurbriggen, E. L. (2015). Longitudinal asso-
ciations among undergraduates’ research experience, self-ecacy, and
identity. Journal of Research in Science Teaching, 52(6), 847–867.https://
doi.org/10.1002/tea.21221
Rosseel, Y. (2012). lavaan: An R package for structural equation modeling.
Journal of Statistical Software, 48(2), 1–36.
Schwartz, S. H., Melech, G., Lehmann, A., Burgess, S., Harris, M., & Owens, V.
(2001). Extending the cross-cultural validity of the theory of basic
human values with a different method of measurement. Journal of
Cross-Cultural Psychology, 32(5), 519–542. https://doi.org/10.1177/
0022022101032005001
Seymour, E., Hunter, A.-B., Laursen, S. L., & DeAntoni, T. (2004). Establishing
the benefits of research experiences for undergraduates in the sciences: First findings from a three-year study. Science Education, 88(4), 493–
534.https://doi.org/10.1002/sce.10131
Thiry, H., & Laursen, S. L. (2011). The role of student-advisor interactions in
apprenticing undergraduate researchers into a scientific community of
practice. Journal of Science Education and Technology, 20(6), 771–
784.https://doi.org/10.1007/s10956-010-9271-2
Thiry, H., Laursen, S. L., & Hunter, A.-B. (2011). What experiences help students become scientists? A comparative study of research and other sources of personal and professional gains for STEM undergraduates. Journal of
Higher Education, 82(4), 357–388.https://doi.org/10.1353/jhe.2011.0023
Thoman, D. B., Brown, E. R., Mason, A. Z., Harmsen, A. G., & Smith, J. L.
(2014). The role of altruistic values in motivating underrepresented
minority students for biomedicine. BioScience, biu199. https://doi.
org/10.1093/biosci/biu199
Yuan, K.-H., Chan, W., Marcoulides, G. A., & Bentler, P. M. (2016). Assessing
structural equation models by equivalence testing with adjusted t in-
dexes. Structural Equation Modeling, 23(3), 319–330.
The common approach to the multiplicity problem calls for controlling the familywise error rate (FWER). This approach, though, has faults, and we point out a few. A different approach to problems of multiple significance testing is presented. It calls for controlling the expected proportion of falsely rejected hypotheses — the false discovery rate. This error rate is equivalent to the FWER when all hypotheses are true but is smaller otherwise. Therefore, in problems where the control of the false discovery rate rather than that of the FWER is desired, there is potential for a gain in power. A simple sequential Bonferronitype procedure is proved to control the false discovery rate for independent test statistics, and a simulation study shows that the gain in power is substantial. The use of the new procedure and the appropriateness of the criterion are illustrated with examples.