Virtual Lab Implementation Model Predicts STEM Future Plans: Insights
from Contemporary Science Courses in Higher Education
Rachel L. Schechter
Learning Experience Design Research
United States
rachel@lxdresearch.com
Paul A. Chase
Tufts University
United States
paul@lxdresearch.com
Apoorva Shivaram
Northwestern University
United States
ashivaram@u.northwestern.edu
Abstract: Using a mixed-method approach, researchers examined college instructors' use of a
virtual lab simulation library, Labster, students' perceptions of the simulations on their learning
and future STEM plans, as well as data from the library to explore patterns of lab use and quiz
scores. A detailed look at Contemporary Biology I and Biology II quiz scores found that virtual
lab use was more regular during the second year of implementation, and students' quiz scores
significantly improved, with fewer quiz attempts. Outcomes of the student survey revealed that
when virtual lab assignments occurred before the in-person lab, students were 4 times more likely
to “Plan to work in a STEM field” and over 5 times more likely to “Plan to take more STEM
courses” when compared to students who used the virtual labs at other times. In distinguishing students who were confident of a future in STEM from those who were unsure, a sense of belonging to the science community at the university was a distinguishing factor. These findings have important implications for implementing virtual lab simulations to increase student performance, decrease DFW rates, and support continued STEM course enrollment.
Keywords: virtual learning, STEM, higher education, technology, enrollment, Labster, simulated labs
Introduction
Since returning from pandemic-era remote instruction, college instructors nationwide have been challenged by a lack of
student engagement. Leaders of programs in science, technology, engineering, and mathematics (STEM) fields have
experimented with innovative technologies for many years to increase student engagement and interest (e.g., Chen, Bastedo, & Howard, 2018). Integrating virtual science lab tools could be critical in leveraging technology to
enhance student engagement and improve learning outcomes in STEM disciplines (e.g., Van der Kleij, Feskens, &
Eggen, 2015). This paper explores student and instructor perceptions of Labster, a virtual science lab tool, and its
impact on student engagement levels and enrollment outcomes.
Understanding the factors influencing student engagement and learning outcomes is crucial for improving
pedagogical practices and creating effective learning environments. While student engagement refers to the
enthusiasm, interest, and involvement students exhibit during their learning experiences, their perceived engagement
may or may not correlate with their long-term STEM involvement goals. For example, leading student engagement
researcher Vincent Tinto believes engagement is an overall indicator of student well-being (Tinto, 2022). On the
other hand, the productive struggle students go through when digging deeply into science concepts and the relationships among scientific ideas (instead of simply finding an answer) may be frustrating, even though that learning process may lead to higher grades (Townsend, Slavit, & McDuffie, 2018).
Given this knowledge, how can instructors help students engage with their STEM coursework in ways that promote the learning process? Based on student feedback, higher levels of support are perceived in contexts where instructors use active learning techniques and offer collaborative learning opportunities in their classroom instruction (Umbach & Wawrzynski, 2005). Furthermore, active learning plays a significant role in
STEM courses, where it has been found to significantly reduce the performance gap between students from well-
resourced and under-resourced educational backgrounds (Ballen et al., 2017). By examining instructors’ use of and
student perceptions of using virtual science lab tools, this study aims to contribute to the growing body of research
on technology-enhanced learning and provide valuable insights for educators and administrators seeking to optimize
students’ class engagement and longer-term science learning outcomes in STEM higher education.
Research-Based Design
While Labster designers were inspired by inquiry-based learning, which was shown to be very effective in
science classrooms (Aktamis et al., 2016), the development team recognizes that not all learners will have equal
experience and comfort with a virtual environment. Labster’s simulations allow multiple entry points to help orient
students to the space and tools. Learners start simulations with explicit lessons, explanations, and demonstrations or
learn through an interactive inquiry-based learning approach. Responding to research on motivation (Deci, Koestner, & Ryan, 2001), Labster designers set all clocks in the lab at 8:00 and removed any sense of deadlines, which can lower a
student's intrinsic motivation. By incorporating choice at the start and throughout inquiry-based learning, students
learn independently and at their own pace, increasing their autonomy and independence (Vries & May, 2019).
A later addition to Labster was Dr. One, a virtual lab assistant, developed to show student users how to
complete a task before trying it themselves and remind students about lab safety. Research suggests that student
learning with virtual helpers or “pedagogical agents” positively affects learning outcomes (Schroeder & Craig,
2021), perhaps by reducing the cognitive load on the learner. This is particularly true when the pedagogical agent
allows students to experience worked examples in combination with their own problem-solving (McLaren & Isotani,
2011). With this research finding in mind, Labster’s theory of change details how access to simulated lab
experiences in a safe, private, expert-led, and untimed environment increases students’ time, comfort, and
opportunity to explore and learn scientific concepts.
Product Description
Virtual labs are interactive science simulations enabling students to engage in laboratory activities from
their devices. Labster’s virtual laboratory and science learning platform provides faculty, administrators, and
students with a 24/7 science campus in the cloud. Labster’s simulations are interactive learning environments that
place the student in a central, decision-making role where they are challenged to apply their skills to solve problems
within the context of a story. All the action takes place inside a 3D model of a real biology, chemistry, or physics
laboratory, but with a unique twist: the laws of time and space can be deliberately broken to demonstrate the
principles that make real science work.
As they progress through each Labster simulation, students practice lab techniques, visualize scientific
concepts, and test their newly acquired knowledge. The simulations include an element of gamification that requires
students’ active participation via a pedagogical agent in the form of a virtual character that motivates, supports, and
encourages them with questions, feedback, hints, and explanations.
Labster’s catalog includes over 300 virtual lab simulations in biology, chemistry, and physics. Each
simulation is a guided activity with embedded quizzes to empower students with feedback so they can assess their
own understanding. For instructors and administrators, there is a performance dashboard to track student progress and identify areas where students need additional coaching and reinforcement, as well as supplemental resources to help scaffold learning, including lab manuals, lab report templates, explainer videos, theory pages, and graphics.
Virtual lab simulations complement lectures, readings, and hands-on lab activities. Students can directly
relate what they learn in simulations to their course content. Learning with virtual lab simulations is game-like and
could support student motivation. Combined with pedagogically sound design, virtual lab simulations seek to create
the conditions under which students become more active participants in their learning.
Study Methods
A large public university in Texas partnered with LXD Research to conduct the study. The university’s Instructional Design team brought in Labster during the pandemic, and eight instructors continued using the program to
supplement and complement in-person lab instruction. To better understand the use of Labster, researchers analyzed
product data from Labster, conducted two student surveys, and interviewed instructors.
Instructor Interviews
The leaders of the university’s instructional design team sent an invitation to the instructors using Labster
to complete a brief survey with questions about their usage and implementation of Labster in their classrooms. In
this survey, instructors could opt in to a short interview with a research team member. Four instructors
opted in to be interviewed for 60 minutes. All four interviews took place on Zoom and were subsequently
transcribed and analyzed for common themes and suggestions. Courses taught by the interviewees included: General
Chemistry 1, General Chemistry 2, Organic Chemistry 1, Contemporary Biology (i.e., ‘science for non-biology
majors’), Biosciences 1, Exercise Physiology, and Exercise Nutrition (graduate course).
Student Surveys
Two brief surveys were administered to undergraduate students at a large university in Texas. The leaders
of the university’s instructional design team sent students invitations to complete the surveys. Each survey includes a 10-item Science Engagement Scale, which will be validated in the larger research project. One survey
was administered as a pre-course (start of semester) survey, while the other was administered as a post-course (end
of semester) survey. Both surveys included questions about students’ opinions and perceptions of using Labster in
their science courses. This report focuses on the post-course survey. A total of 188 undergraduate students
completed the post-survey. (The relatively high response rate was likely associated with a $10 Amazon gift card offered as an incentive
for completing the survey.) Twelve students did not use Labster but still answered a few questions in the survey.
User Data Description
Labster data reflected 40 unique courses utilizing the tool across two academic years, each comprising two semesters and one summer (Summer 2021 through Spring 2023). Information about each student’s attempt at each simulation, including
their quiz scores, was analyzed for patterns. Students were given a new identification number each time they
enrolled in a class; therefore, student experiences could not be followed over time. Instead, cohorts of students could
be compared (e.g., students who took Biology I in Fall 2021 vs. Fall 2022) to determine whether usage differed from
year 1 to subsequent years (i.e., Year 2 or Year 3). Due to the course scope and sequence for science majors, it
could be inferred that nearly all students in advanced classes took the prerequisite the semester before (e.g., students
who took Biology II in Spring 2023 were in Biology I in Fall 2022).
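To make the cohort-level approach concrete, the sketch below shows one way such a comparison might be set up. It is a minimal illustration only: the column names and values are assumptions for demonstration, not Labster’s actual export schema.

```python
import pandas as pd

# Hypothetical extract of the simulation data; column names and values are
# illustrative assumptions, not the platform's actual schema.
attempts = pd.DataFrame({
    "course":     ["Biology I", "Biology I", "Biology I", "Biology I"],
    "semester":   ["Fall 2021", "Fall 2021", "Fall 2022", "Fall 2022"],
    "student_id": ["a1", "a2", "b1", "b2"],  # IDs reset at each enrollment
    "quiz_score": [72, 80, 88, 91],
    "attempts":   [3, 2, 1, 2],
})

# Because IDs cannot be linked across semesters, analysis proceeds at the
# cohort level: all Fall 2021 records are compared with all Fall 2022 records.
cohorts = attempts.groupby(["course", "semester"])[["quiz_score", "attempts"]].mean()
print(cohorts)
```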
Results
Instructor Interview Themes
The four instructors from three different departments provided insights into how science courses were
structured. For example, introductory or Level I courses may be half-filled with science majors, while advanced
courses or courses on targeted topics such as Kinesiology may be nearly all science majors (General Chemistry I is a
prerequisite for General Chemistry II, etc.). All of the instructors mentioned that the curriculum for their courses was pre-established according to a standard textbook in the field. The lab component was typically three hours a week for the semester, and Labster was typically incorporated as a review of, or supplement to, the in-person lab. For courses that did not have a lab component (e.g., Exercise Physiology and Exercise Nutrition), Labster was incorporated as a supplementary tool for learning assessment techniques.
Instructors incorporated Labster at varying rates and times, ranging from 2 to 12 virtual labs per course. All four instructors assigned Labster virtual labs to be done individually, while in-person labs combined
individual and group work. Half of the instructors said it was their first year using Labster.
Two instructors had a very good understanding of how to use Labster to support learning and found it very beneficial. They agreed that it helps both science majors and non-science majors grasp an overview of a concept for an experiment or technique, even if they’ve never performed it. One explained,
“I loved it. The experience, the students really enjoyed it. They didn't get a 100% the first time through, so
they had to struggle a little bit. Some of them didn't get a 100% the fourth or fifth time through. So it was
clearly something that wasn't something where you just sat down, clicked a couple buttons and got through
it. They were actually challenged, they actually had to think. And the students enjoyed it too, when I asked.
And I think it was a great experience. I'll keep using [Labster] and what I'm considering doing is extending
it out to the lecture course and giving some homeworks with those as well. I just haven't gotten to that point
yet.”
The other two instructors were still exploring the tool and needed more time to figure out how to incorporate it effectively.
While individual instructors mostly reported positive experiences using Labster with students, three out of
the four did not know that seven other instructors also used Labster. Because instructors did not share ideas or
experiences with each other, a range of implementation models was identified. Instructors assigned Labster after lecture content, before the lab to preview the experiments or concepts, or after the lab to review what was covered.
Instructors also mentioned using them during the lecture, between challenging in-person labs, or as something extra
over spring break.
In sum, all four instructors agreed that Labster was an engaging and hands-on tool that provided students
with the ability to practice simulations and experiments that stimulated their learning.
User Data Analysis & Results
Initial exploratory analysis of the Labster simulation data revealed a change in implementation patterns from Year 1 to Year 2 (Fall) or Year 3 (Spring). The graphs below show the percentage of all lab simulation completions by week across each semester in each year. From week to week, Year 1 shows inconsistent use of Labster throughout the semester, with multiple weeks of greatly reduced use. Subsequent years show less use during the first and last weeks of class but more consistent use throughout the weeks in between.
Figure 1. Weekly Usage Rates by Course and Year of Use*
*Note: The university started using the program in Spring 2021 (with Biology II). In the graph above,
Year 2 of Biology I is immediately followed by Year 3 of Biology II.
To quantify consistency of use, we calculated the standard deviation of the weekly share of attempts (i.e., the volatility of use) in Year 1 and compared it with the same metric in Year 2. The Year 1 standard deviation was 9.0%, compared to 5.0% in Year 2, indicating less consistent use during Year 1. The difference was significant, t(70) = 14.1, p < .001.
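For readers who wish to reproduce this style of analysis, the sketch below shows one way such a volatility comparison could be computed. It is a minimal illustration under stated assumptions: the weekly percentages are hypothetical, and the paper does not specify the software or the exact test variant used.

```python
import numpy as np
from scipy import stats

# Hypothetical weekly shares (%) of all simulation completions across a
# semester; the study's actual data are not reproduced here.
year1 = np.array([1.0, 15.0, 0.5, 12.0, 0.0, 16.5, 1.5, 13.0, 0.5, 11.0, 2.0, 9.0])
year2 = np.array([4.5, 8.0, 7.5, 9.0, 7.0, 8.5, 7.0, 9.5, 6.5, 8.0, 7.5, 9.0])

# Volatility = standard deviation of the weekly shares
# (the study reports 9.0% in Year 1 vs. 5.0% in Year 2).
print(f"Year 1 SD: {year1.std(ddof=1):.1f}%,  Year 2 SD: {year2.std(ddof=1):.1f}%")

# One common way to test a difference in variability is to compare the
# absolute deviations from each year's mean (a Levene/Brown-Forsythe-style
# approach); the paper reports t(70) = 14.1, p < .001 for its own data.
t, p = stats.ttest_ind(np.abs(year1 - year1.mean()),
                       np.abs(year2 - year2.mean()),
                       equal_var=False)
print(f"t = {t:.1f}, p = {p:.3g}")
```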
After identifying this implementation change, student behavior and performance were reviewed. Contemporary Biology was chosen for this deeper investigation because it is a course for non-majors and a prerequisite for many science majors. Student experiences in Contemporary Biology I could impact their choices about joining or remaining in the STEM pipeline (as evidenced by their enrollment in Biology II). Examining changes in student scores from Year 1 to Year 2 could help clarify this relationship and provide insights on how to support the STEM pipeline with virtual labs. Because the Labster simulations and the syllabi were the same across the two years, improved student performance would provide evidence that the implementation patterns were not just different but more effective. Results showed that student scores in Year 2 were significantly higher than in
Year 1 (t(2622) = 21.4, p < .001), and students earned those higher scores in fewer attempts (t(2622) = 17.4, p
< .001), as shown in Table 1 below.
Table 1. Bio 1 Courses: Comparing Year 1 to Year 2
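As a companion to Table 1, the following sketch shows how an independent-samples comparison of two cohorts' quiz scores might be run. The score distributions and cohort sizes are hypothetical, chosen only so the degrees of freedom match the reported t(2622); the study's actual data are not reproduced here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical quiz scores (%) for two cohorts of lab attempts; sample sizes
# are chosen so that df = n1 + n2 - 2 = 2622, matching the reported statistic.
scores_y1 = np.clip(rng.normal(loc=78, scale=12, size=1300), 0, 100)
scores_y2 = np.clip(rng.normal(loc=85, scale=10, size=1324), 0, 100)

# Independent-samples t-test comparing the Year 2 cohort with Year 1.
t, p = stats.ttest_ind(scores_y2, scores_y1)
df = len(scores_y1) + len(scores_y2) - 2
print(f"t({df}) = {t:.1f}, p = {p:.3g}")
```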
Conducting the same analysis for Biology II showed different results. Biology II is the next course in the scope and sequence, taken by students who choose to continue their journey in the STEM pipeline. As indicated by the
sample size difference, Labster was not used by as many Biology II instructors in Year 1 (and it wasn’t used as
often) compared to Year 3. The Year 3 students, however, used Labster in Biology I the previous fall. Table 2 shows
that, in terms of performance, students had significantly higher scores in Year 3 compared to Year 1 (t(881) = 13.2,
p < .001), even though they had the same number of attempts (i.e., they performed better the first time they tried a
lab). The combination of instructor experience and student experience yielded even stronger outcomes. It is also relevant to note that enrollment retention from Biology I to Biology II was quite high at 66%.
Table 2. Bio 2 Courses: Comparing Year 1 to Year 3
Student Survey Data Results
Connecting the student survey with the patterns above, we can now see how the Year 2 implementation
model impacted students' perceptions of learning and future STEM course and career plans. Instructors could assign Labster use in various ways, and students could access the labs at any time. While more than half of the students surveyed used Labster before doing a lab in person (57%), other students used Labster after the lab or to study before an exam (14%), or in some other way (17%). A fourth group of students used Labster at multiple points throughout a course, in some combination of before and after the lab, before the exam, and in other ways (13%).
Analyses revealed a positive predictive relationship between the implementation model and outcome
measures covered in the survey. For example, students who used Labster before the lab were significantly more
likely to indicate they would take additional STEM classes than those who used other models (t(184) = 4.3, p
< .001). Likewise, students who used Labster before the lab were significantly more likely to indicate they plan to
seek a job in a STEM field (t(184) = 4.1, p < .001). For further details, please see Tables 3 and 4.
Table 3. Planning to work in a STEM Field: Comparing Not Before Lab to Before Lab
Table 4. Taking more STEM courses: Comparing Not Before Lab to Before Lab
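A figure such as the "4 times more likely" reported in the abstract is commonly derived as an odds ratio from a 2x2 table of responses. The sketch below illustrates that computation with hypothetical counts; the actual cell counts behind Tables 3 and 4 are not reproduced here, and the odds-ratio framing is an assumption about how such a figure could be obtained.

```python
import numpy as np
from scipy import stats

# Hypothetical 2x2 table of survey responses (counts are illustrative only):
# rows    = used Labster before the in-person lab vs. other models
# columns = plans to work in a STEM field: "Yes" vs. "Maybe/No"
table = np.array([[80, 27],    # before lab:   yes, maybe/no
                  [33, 46]])   # other models: yes, maybe/no

odds_ratio, p = stats.fisher_exact(table)
print(f"odds ratio = {odds_ratio:.1f}, p = {p:.3g}")
# With these illustrative counts the odds ratio is about 4, i.e., students in
# the "before lab" group have roughly 4 times the odds of answering "Yes".
```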
On the other hand, there was no significant difference among implementation models in students’ perceived impact of Labster on their decision to take future courses or in how much they felt they learned during the course.
Nearly all students (96%) said they learned the same amount or more by having Labster as part of their coursework, and one-quarter of students credited Labster with helping them learn a lot more. By conducting this study with a mixed-method approach, we uncovered nuances that would have been difficult to see from the survey alone. While students’ voices are important for making decisions about education tools and how to implement them, a survey alone is not enough to fully understand how a tool may relate to students’ future STEM involvement.
Sense of Effort vs. Sense of Belonging
Understanding what distinguishes a student who says, “Yes, I plan on taking a future STEM course” or “Yes, I plan to work in a STEM field” from one who says, “Maybe…” can help inform program design changes. This survey allowed for such an exploration. The ratings for students who responded “Yes” were similar to
those who responded “Maybe” on the items related to their effort and depth of thinking during science classes.
When it came to their sense of belonging with the science community at their school, however, the “Maybe”
students rated themselves as feeling less included than the “Yes” students. These differences between “Maybe” and
“Yes” respondents were statistically significant for students who planned to take more STEM courses (t(171) = 3.5,
p < .001) and students who planned to work in STEM fields (t(153) = 2.8, p < .01). This is an area where colleges
could target programs and interventions to increase a sense of belonging in the community using the tools they
already have. A subsequent paper will present additional analyses of the STEM Engagement Scale and the
relationship between plans for a future in STEM and other aspects of science engagement.
From the instructors’ perspective, only one of the four interviewed was aware that another faculty member from the same department was using Labster in their courses. This instructor was able to obtain troubleshooting and quick setup assistance from that colleague. The remaining instructors were unaware of the seven other faculty members using Labster in their courses. This is an opportunity for
the instructional design teams at colleges and universities to establish a community of Labster users who can explore
new ways to incorporate the tool in their classrooms and share resources.
Discussion and Limitations
Previewing information and understanding how a lab will unfold can positively impact student outcomes. Knowing what to expect during the lab enhances students’ confidence, which is related to self-competence, one of the three pillars of intrinsic motivation. This study shows that simulated virtual labs can be an effective tool for previewing this information, with students attributing greater learning outcomes to the labs.
“The simulation is very valuable because it allows them to get kind of a hands-on
experience. Doing the experiment before they get into the lab and try for the first
time. That's the main, I guess, that's the main strength of Labster as to why I use it in
the lab course,” an instructor explained.
In cases where students struggle with an in-person lab, they may be asked to repeat it, progressing through the lab more quickly the second time. By previewing the lab beforehand, in-person experiences may go more smoothly the first time, resulting in improved outcomes and eliminating the need to repeat them. The virtual lab
allows for infinite attempts, and evidence from this study shows that students need fewer attempts to get the same
high scores (or even higher scores) over time. Not only does this preview keep the lab’s content fresh in their minds,
but survey results also suggest that students attribute their success to the in-person lab experience. These
improvements to the overall course experience were likely associated with students’ increased likelihood of
planning to take more STEM classes or work in a STEM field.
Receiving immediate feedback during the virtual lab can also increase the student’s awareness of their own
learning. With real-time grading and the ability to repeat the labs for higher grades on their own schedule,
students can see their progress and reinforce their understanding, which is especially helpful for students who need
to be exposed to information more than once. Doing it at their own pace could also allow them to engage in inquiry
by finding and using additional resources, such as vocabulary, without the time constraint of being in a classroom.
As a result, there may be an increase in scientific literacy. It is worth noting that although not all students complete
every task assigned, all students actively participate in the labs. As stated by an instructor, “Not all of my
students do the required quizzes that I post. Right? But all of my students did the computer
labs. So that to me, says they're at least responding positively to that.” This aligns with the
concept of the teacher as the ‘guide on the side’, facilitating learning, while allowing students to process information
according to their individual styles, making learning more relevant.
Longitudinal research has shown that redesigning a biology course to more tightly align lectures and labs, leveraging technology, and making learning more relevant for students can decrease DFW rates and increase grades (Ueckert, Adams, & Lock, 2011). This study’s limitations include the lack of a comparison group of similar students who did not have access to Labster. Instructors also used Labster in various ways, so future research could
experiment with a more prescriptive approach to provide more control in the study. Additionally, matching students’ survey responses with their virtual lab scores and with enrollment data and grades from the university would make it possible to follow students across all aspects of learning and outcomes. A quasi-experimental study using Labster would
build greater understanding of its impact on higher education engagement and support of future STEM involvement.
Finally, there is an opportunity for colleges to leverage tools such as Labster to build community among instructors and among students. Research shows that first-generation college students are sensitive to even daily
opportunities to experience a sense of belonging, resulting in higher engagement (Gillen-O’Neel, 2021). In this study, instructors were largely unaware of other Labster users, even though at least seven other STEM instructors were using the tool at their university. When examining the difference between a student considering a future in STEM and one who was less sure, feeling part of the science community at their school was a distinguishing factor. During the period when this study was conducted, Labster created an online community for students and instructors, providing a new tool universities can leverage to build a sense of belonging among students and instructors. This
online community could also be a component to include in future studies.
Conclusion
Incorporating and implementing Labster’s virtual lab simulation in college STEM courses was an enriching
and positive experience for virtually all students and instructors. Among students, the use of virtual lab simulations
prior to the in-person lab sessions strongly predicted their plans to enroll in future STEM coursework and pursue STEM
careers. While most students credited the labs with increasing the amount they learned, many students seemed
unaware of how much using the simulations before the lab influenced their learning during the lab sessions.
Instructors using Labster indicated that the virtual simulations were easy to incorporate into their courses and they
were able to achieve their instructional goals. However, there is potential to establish a stronger science community: among students, to enhance their connection to that community, and among instructors, to strengthen their experience with Labster. In sum, incorporating virtual lab-based learning using products like Labster appears to have expanded access to learning, contributed to the reduction of DFW rates, and fostered student STEM
retention. Moving forward, future research can offer insights to strengthen such tools, thereby creating a more
inclusive and effective learning environment for students all over the world.
References
Aktamis, H., Higde, E., & Özden, B. (2016). Effects of the inquiry-based learning method
on students’ achievement, science process skills and attitudes towards science: A meta-analysis science.
Journal of Turkish Science Education, 13(4), 248-261.
Ballen, C. J., Wieman, C., Salehi, S., Searle, J. B., & Zamudio, K. R. (2017). Enhancing
diversity in undergraduate science: Self-efficacy drives performance gains with active learning. CBE—Life
Sciences Education, 16(4), Article 56.
Chen, B., Bastedo, K., & Howard, W. (2018). Exploring design elements for online STEM
courses: Active learning, engagement & assessment design. Online Learning, 22(2), 59–75.
https://doi.org/10.24059/olj.v22i2.1369
Deci, E. L., Koestner, R., & Ryan, R. M. (2001). Extrinsic Rewards and Intrinsic Motivation in
Education: Reconsidered Once Again. Review of Educational Research, 71(1), 1–27.
https://doi.org/10.3102/00346543071001001
Gillen-O’Neel, C. (2021). Sense of belonging and student engagement: A daily study of first-and
continuing-generation college students. Research in Higher Education, 62(1), 45-71.
Klimmt, C., & Vorderer, P. (2003). Media psychology “is not yet there”: Introducing theories on
media entertainment to the presence debate. Presence: Teleoperators and Virtual Environments, 12(4),
346-359. https://doi.org/10.1162/105474603322391596
McLaren, B. M., & Isotani, S. (2011, June). When is it best to learn with all worked examples?
In International Conference on Artificial Intelligence in Education (pp. 222-229). Springer, Berlin,
Heidelberg.
Meyer, G. F., Wong, L. T., Timson, E., Perfect, P., & White, M. D. (2012). Objective Fidelity
Evaluation in Multisensory Virtual Environments: Auditory Cue Fidelity in Flight Simulation. PLoS ONE,
7(9). https://doi.org/10.1371/journal.pone.0044381
Schroeder, N. L., & Craig, S. D. (2021). Learning with virtual humans: Introduction to the
special issue. Journal of Research on Technology in Education, 53(1), 1-7.
Tinto, V. (2022). Exploring the character of student persistence in higher education: The impact
of perception, motivation, and engagement. In Handbook of Research on Student Engagement (pp. 357-379).
Cham: Springer.
Townsend, C., Slavit, D., & McDuffie, A. R. (2018). Supporting All Learners in Productive
Struggle. Mathematics Teaching in the Middle School, 23(4), 216-224.
Ueckert, C., Adams, A., & Lock, J. (2011). Redesigning a large-enrollment introductory biology
course. CBE Life Sciences Education, 10(2), 164–174. https://doi.org/10.1187/cbe.10-10-0129
Umbach, P. D. & Wawrzynski, M. R. (2005). Faculty Do Matter: The Role of College Faculty in
Student Learning and Engagement. Research in Higher Education, 46(2), 153–184. Retrieved from
https://files.eric.ed.gov/fulltext/ED491002.pdf
Van der Kleij, F., Feskens, R. C., & Eggen, T. (2015). Effects of Feedback in a Computer-Based
Learning Environment on Students' Learning Outcomes: A Meta-Analysis. Review of Educational
Research, 85(4), 475-511. https://doi.org/10.3102/0034654314564881
Vries, L. & May, M. (2019). Virtual laboratory simulation in the education of laboratory
technicians–motivation and study intensity. Biochemistry and Molecular Biology Education, 47(3), 257-
262. https://doi.org/10.1002/bmb.21221
Witmer, B. G., & Singer, M. J. (1998). Measuring presence in virtual environments: A presence
questionnaire. Presence: Teleoperators and Virtual Environments, 7(3), 225-240.
http://dx.doi.org/10.1162/105474698565686.