Re-shaping Post-COVID-19 Teaching and Learning: A Blueprint of
Virtual-Physical Blended Classrooms in the Metaverse Era
Yuyang Wang*
Computational Media & Arts Thrust
Hong Kong University of Science and Technology
Lik-Hang Lee
Augmented Reality and Media Lab
Korea Advanced Institute of Science and Technology
Tristan Braud
Division of Integrative Systems and Design
Hong Kong University of Science and Technology
Pan Hui§
Computational Media & Arts Thrust
Hong Kong University of Science and Technology
ABSTRACT
During the COVID-19 pandemic, most countries have experienced
some form of remote education through video conferencing software
platforms. However, these software platforms present significant
limitations that reduce immersion and fail to replicate the classroom
experience. The currently emerging Metaverse addresses many of
such limitations by offering blended physical-digital environments.
This paper aims to assess how the Metaverse can support and im-
prove e-learning. We first survey the latest applications of blended
environments in education and highlight the primary challenges and
opportunities. Accordingly, we derive our proposal for a virtual-
physical blended classroom configuration that brings students and
teachers into a shared educational Metaverse. We focus on the sys-
tem architecture of the Metaverse classroom to achieve real-time syn-
chronization of a large number of participants and activities across
physical (mixed reality classrooms) and virtual (remote virtual real-
ity platform) learning spaces. Our proposal attempts to transform
the traditional physical classroom into virtual-physical cyberspace
as a new social network of learners and educators connected at an
unprecedented scale.
Index Terms: Applied computing—Education—Distance learning—E-learning; Human computer interaction—Interaction paradigms—AR/VR.
1 INTRODUCTION
Under the complex and changing global political and economic situation and the global pandemic, individual activities and production methods face increasing challenges. For example, lectures have moved online via Zoom or MS Teams to maintain social distancing. However, current online educational content is mainly based on flat 2D displays, which lack the immersion and engagement of traditional teaching in a physical classroom. Students struggle to stay focused during remote lectures. In this situation, the Metaverse appears as a meaningful solution, integrating state-of-the-art technologies such as VR/AR, artificial intelligence and cloud computing [28] to make educational activities more attractive.
*e-mail: yuyangwang@ust.hk
†e-mail: likhang.lee@kaist.ac.kr
‡e-mail: braudt@ust.hk
§e-mail: panhui@ust.hk

Motivations: The next generation of education, i.e., teaching and learning, should include diversified yet customized learning content and context, increased student interaction and creativity, and boosted motivation and engagement [2]. Unfortunately, existing teaching
facilities and tools fail to offer these features. This work exam-
ines existing teaching and learning modalities and proposes a novel
teaching system to promote learning experience and performance
in the Metaverse era. In this context, we first present a mini-survey
on digital education platforms, including computer-mediated learn-
ing and teaching via online conferencing tools and VR/AR-based
teaching. We highlight the pros and cons of these teaching modal-
ities. For instance, Zoom enables synchronous teaching but lacks
student engagement [9]. On the other hand, AR/VR provides more
engaging teaching activities thanks to 3D visualization technology
in semi-immersive or fully immersive environments. However, it
requires a high-quality and robust network for synchronizing graphi-
cal data [52]. Therefore, we put forward a virtual-physical blended
classroom, with the ambition of migrating physical classrooms and
remote educational activities to the unified yet immersive cyberspace,
known as the Metaverse. Our proposed Metaverse classroom con-
tains two physical classrooms at two university campuses and one
digital classroom hosted in the edge-cloud synchronized computing
devices. The digital classroom enables users (i.e., learners and edu-
cators) to appear at a physical lecture from their home campus, in
which the digital twins of such users, as avatars, can interact with
users from other campuses.
Review methodology: First, we search Google Scholar for the terms “VR education” and “AR education”, together with additional keywords including “application” and “challenge”, and find that related publications are spread across several journals and conferences. Second, we choose papers mainly from well-established publishers: ACM, IEEE, Springer, Elsevier and Taylor & Francis. Third, we favour highly cited papers from the fields of education and VR/AR. Finally, if no proper references can be found in the former steps, we expand our search in Google Scholar with other keywords such as “VR/AR applications” and filter by relevance to the educational field. This mini-survey illustrates
the paradigm shift in teaching and learning in the post-COVID-19
period. The classroom moves from computer-mediated learning or
VR/AR-based applications to a fully virtual-physical blended one.
Contributions: This paper’s contributions are twofold. First, we review the latest developments in remote learning and interactive media such as AR/VR/MR. Second, we propose the Metaverse classroom, comprising two physical classrooms and one cloud-based virtual classroom, together with the system architecture for implementing it. We accordingly pinpoint the grand challenges of establishing the proposed Metaverse classroom.
2 RECENT EDUCATION LANDSCAPE
The recent development of AR/VR provides evidence that immersive elements are no longer a myth in learning contexts [15]. We highlight some important works related to computer-mediated education and relevant studies of AR/VR, as presented in Figure 1. Unfortunately, learners from disadvantaged families still find it hard to overcome the challenges imposed by COVID-19 because they lack stable network connections, powerful computers and digital skills [37].

arXiv:2203.09228v1 [cs.HC] 17 Mar 2022
Computer-mediated Education.
Since the outbreak of COVID-
19, the lack of high-quality learning materials and efficient teaching
tools has posed a threat to traditional classroom-based education.
Learning remotely through a video conferencing system has pro-
vided an opportunity to maintain lecturing and communicating activ-
ities. However, the HCI community also identified some challenges
arising from remote teaching [9]. For instance, students may find it challenging to pay continuous attention because of multitasking and unexpected interruptions. They tend to experience decreased learning efficacy, since live-streamed learning takes longer, and reduced engagement and collaboration, because the learning environment changes from a community-based public area supported by various university facilities to a private space with limited resources. Researchers have proposed several solutions to mitigate these deficiencies for both teachers and students during live-streamed courses, e.g., deploying voice-based conversational agents to enable natural interaction [49], assigning learning tasks to learners to improve motivational beliefs [51], and boosting explanation quality and presentation skills [39].
Durham University developed a multi-touch and multi-user desk
to boost children’s mathematical skills in a collaborative and inter-
active way [21]. Pupils can work together to solve problems and
answer questions through inventive solutions. Compared to
traditional paper-based activities, the multi-touch table can improve
mathematical flexibility and fluency and promote higher levels of
active student engagement.
VR in Education.
VR integrates five essential components: 3D
perspective, dynamic rendering, closed-loop interaction, inside-out
perspective and enhanced sensory feedback [47]. Thanks to rising
computing power and easy access to affordable head-mounted displays (HMDs), VR has proved helpful in various
applications such as surgical operations, psychotherapy and STEM
education [16].
The lack of opportunities for practical, hands-on manipulation of objects in the real world is linked to the fact that children are usually poor at intuitive physics [7]. Therefore, Brelsford [7] designed a physics simulator to give students better intuition about physical processes inside an immersive environment. In this simulator, one group of students was given a pendulum of controllable length and three balls of varying mass. These students could manipulate gravity, mass, location of the balls, air drag, friction, period of the pendulum, force, motion, and starting force of the pendulum to carry out experiments for an hour. A control group of students was instead given a lecture on the same material in the physical classroom. An exam was assigned four weeks after the test. The results indicated that those who had learned through the virtual laboratory showed better retention than those in the lecture-based learning group.
Similarly, Yalow and Snow [50] reported that offering learners
instructional material in a spatial-graphical format, instead of verbal words, improves their immediate comprehension of the material.
Therefore, many studies have investigated the benefits of learning
in virtual environments. Rega and Fink [38] performed a pioneering study that designed a novel approach to pandemic preparedness and
response based on immersive simulation. The authors developed
a semester-long course in simulated environments for students in
Master of Public Health (MPH). Students are grouped to represent
different country health departments during the whole semester,
and they can gain incident command training and acquire audio
lectures and learning materials related to the imminent pandemic.
Although the training took place in virtual environments, students considered this paradigm groundbreaking, fascinating and educational. Such work provides another avenue to combat the current COVID-19 global pandemic [33].
VR can be promising for education and training, but challenges still exist [44]. First, VR is generally regarded as a game for entertainment and relaxation. Learners pay more attention to winning the game than to acquiring knowledge and improving critical thinking skills, far from the teaching objective. Second, VR is not perfect from a psychological point of view. For example, older lecturers may prefer the classical teaching approach to the digital one, compared to younger students. Prolonged immersion in locomotion-dominated VR applications causes discomfort and sickness symptoms (e.g., nausea, dizziness and disorientation) that vary among users, which weakens users’ willingness to use VR devices [45]. Third, educational institutions bear the high cost of designing and creating VR resources for different teaching objectives, and library departments lack interoperability standards and optimal practices for adopting such VR content, leading to difficulties in sharing resources among institutions and to pointless duplicated work [12]. In addition, VR has made significant contributions to other educational cases, e.g., controlling a real chemistry lab by interacting with a virtual one [34] and teaching physical geography through spherical video-based VR immersion [23].
AR in Education.
AR is a 3D technology that boosts the user’s perception of the physical world by adding a contextual layer of information [5], and it has become a favoured topic in educational research over the last decade [22]. AR is popular because it does not require expensive hardware or complicated equipment such as HMDs. AR can be achieved with low-cost handheld devices such as mobile phones and tablets, making AR-based educational settings available to most learners. Among learner types, K-12 students (primary and secondary students) are the most targeted, because the outstanding visualization features of AR suit students whose learning performance at this age relies mainly on seeing, hearing or other ways of sensing [3].
There are many examples of the successful applications of AR
in education. Gardeli and Vosinakis [17] designed the ARQuest application, a collaborative mobile AR game for improving the problem-solving and computational skills of primary school students through a gamified activity, which demonstrates the application of AR systems in classroom-based interventions. ARQuest supports
multi-user interaction, and students can collaborate with high en-
gagement and motivation to solve challenging problems, despite the
small screen size. This work can be regarded as a 3D version of the
above-mentioned 2D Star trek classroom [21]. The AR application
can benefit learners in sports education and training to learn sports
skills, provide additional hints and feedback, stimulate practice, and
introduce new rules for creating new sports [40].
However, broad application of AR in education still faces some challenges. The most frequently cited challenge is usability, because most AR educational settings are difficult for students to use [3]. For example, students might find it challenging to use an AR application and perform interactive activities without well-designed interfaces, reducing effectiveness [35]. Previous work also revealed that learners with AR devices required significantly longer training time than those without AR equipment, which might be due to the novelty of the AR technology and users’ unfamiliarity with the learning approach [19]. Another
issue is that the AR learning environment may increase learners’
cognitive workload because of the excessive materials and compli-
cated tasks [10]. Some technical issues can also weaken learners’
expectations of AR technology. For example, location-based AR
applications rely heavily on the GPS signal to determine the po-
sition and orientation, while low sensitivity in trigger recognition
frequently appears as an issue [10].
Figure 1: Examples of different digital teaching and learning approaches through computer-aided tools and VR/AR technologies. (a) Star
trek room to boost children’s mathematical abilities on one desktop [1]; (b) Teaching via online conferencing tools [9]; (c) Providing
additional information during sport training [40]; (d) Collaborative learning via ARQuest [17]; (e) Digital content representation of teaching
material [43]; (f) Controlling the real lab inside the VR [34]; (g) Teaching physical geography with video-based VR [23]; (h) Construction of a
virtual campus at CUHK SZ [14].
3 A VISION OF METAVERSE CLASSROOMS
Traditional teaching activities depend on verbal and nonverbal interactions between teachers and students to formulate and enrich cultural norms, behaviors, practices, and beliefs, but learning on a virtual platform interrupts this process, as online meetings change individuals’ communication behaviors [20]. Therefore, we aim to overcome the defects of current virtual education and propose the Metaverse classroom, which can boost student engagement, improve learning efficiency, and create supportive connection tools during and after class meetings.
Greenan [20] argued that social presence and self-disclosure are two crucial aspects of virtual education. Garrison et al. [18] described social presence as socio-emotional support and interaction, and as an individual’s ability to project themselves socially and emotionally, with their entire personality, through a communication tool; it is affected by conversations, activities, collaboration, familiarity and motivation among participants. On the other hand, during virtual education, self-disclosure can lower feelings of unreliability and ambiguity in communication and promote intimacy and positive relationships [20], which reduces the mental distance between individuals and increases trust between teachers and students [42]. Thus, with the increasing popularity of AR and
VR education applications (section 2), establishing the Metaverse
classroom can promote the aforementioned benefits at scale.
3.1 Towards the Metaverse classroom
Due to the global pandemic, access to classrooms and laboratories is restricted for teachers and students. Online conferencing tools enable continuity of education, making lectures and tutorials available virtually to anyone. Coincidentally, the pandemic demonstrated the feasibility of long-duration e-learning and hence user acceptance of e-learning at scale. However, engagement is an essential component of education, and passive learning via video conferencing fails to reinforce engagement among class participants. One way to tackle this challenge, highlighted during the COVID-19 pandemic, is to introduce interactive and real-time elements for educators and learners. The advanced development of VR/AR/MR is characterized by immersive visual stimuli and real-time tracking. Moreover, the related hardware and multimedia technologies can support the Metaverse as an affordable learning platform for learners and educators across the globe. With this unprecedented opportunity, and as VR/AR/MR devices mature, we can employ the Metaverse as a new social platform for both learners and educators, serving as an efficient tool for teaching and learning.
We propose an educational platform that supports the teaching
and learning experience with the Metaverse, with the following
features: (i) learning assessment in the Metaverse for the courses,
(ii) interaction with presentations in the Metaverse, and (iii) teach-
ing experience with augmentation that benefits from visualizing
knowledge and from 3D virtual entities. The platform aims to bring more engaging and interactive tutorials and mixed learning. Remarkably, the virtual-physical blended classroom in the Metaverse situates educators and learners in intuitive, context-aware environments. Thus, the Metaverse
classroom provides effective communication media among partic-
ipants from various campuses. At the same time, the changeover
in such a virtual-physical environment brings new values to the
class participants in terms of learning effectiveness, sense of (virtual) presence, interactive learning experiences, etc. Although such a
Metaverse classroom can connect to numerous possible usage scenar-
ios, we specifically highlight several class participants’ interactions,
as follows.
Gamified Learning and Task-based Modules.
Gamification facilitates active class dynamics, for example by designing digital “breakouts” for teams of students or letting students collaborate to create their own.
Learner Collaborations.
Students can be challenged to work in teams to solve a riddle or puzzle, with surreal scenes, i.e., augmented and virtual realities, brought into the classroom. Additionally, 360-degree video scenes can be incorporated.
Learner-driven Activities.
Empowering students to create
“choose your own adventure"-style stories or presentations to share
their contents (learning outcomes, opinions, a speech, etc.) with the
Metaverse community.
Saving Instructors’ Time.
With unlimited possibilities for creat-
ing experiences and new features being added continuously, instruc-
tors would become more willing to spend their time on designing
new curricula. The saved time can also enable the instructors to
devote energy to their students, e.g., creating diversified and even
personalized activities for the sake of improving their students’ skills
and knowledge.
Access to Limited/restricted Equipment.
Remarkably, many
students cannot access labs due to COVID-19. As people can
access the Metaverse classroom anytime and anywhere, the virtual-
physical presence of class participants could achieve ubiquitous
learning yet real-time access to the lab resource (e.g., a virtual lab
as the digital twin) as well as other limited/restricted resources (e.g.,
testing Uranium in the Metaverse).
Figure 2: The MR Metaverse classroom allows students, instructors, and speakers located physically in HKUST GZ campus, HKUST CWB campus, or online to interact within the same activity (lecture, tutorial, seminar). Through MR, the virtual classroom enables users to attend a physical lecture on their home campus, while users from other campuses and online users are represented through avatars.

We consider that the Metaverse classroom consists of both physical and virtual classrooms. As shown in Figure 2, learners and instructors are situated in classrooms spread over multiple campuses, while the virtual classroom bridges the physical class participants
together. The physical participants of one location can meet the
virtual participants and the physical participants of other locations.
We outline the above concept with a unit case in the following paragraphs. The participants are situated in two physical classrooms and one virtual classroom spread over two campuses (Hong Kong Clear Water Bay, CWB, and Guangzhou, GZ). These three classrooms are synchronized so that the intervention of a participant in
any of these classrooms will be visible to the attendants in the other
two classrooms through his or her avatar representation (i.e., the
digital twins of class participants). It is important to note that the
number of physical classrooms is expandable, i.e., more than two
physical classrooms can access the Metaverse classroom.
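To make the synchronization requirement concrete, the sketch below models the minimal per-participant state and the fan-out of one update to the other two classrooms. It is an illustrative Python sketch only; the class names, fields, and classroom labels are hypothetical and not part of an implemented system.

```python
from dataclasses import dataclass, field, asdict
import json
import time

@dataclass
class ParticipantState:
    """Minimal state needed to replicate a participant as an avatar."""
    user_id: str
    classroom: str   # "CWB", "GZ", or "VR" (hypothetical labels)
    pose: tuple      # (x, y, z, yaw) in a shared world frame
    expression: str  # e.g. "neutral", "speaking"
    timestamp: float = field(default_factory=time.time)

def broadcast_update(state: ParticipantState, classrooms: list) -> list:
    """Serialize a state update and queue one copy for every *other* classroom."""
    payload = json.dumps(asdict(state))
    return [(c, payload) for c in classrooms if c != state.classroom]

# A student in the GZ classroom starts speaking; the CWB classroom and the
# VR classroom each receive one copy of the update.
update = ParticipantState("student-42", "GZ", (1.0, 0.0, 2.5, 90.0), "speaking")
queued = broadcast_update(update, ["CWB", "GZ", "VR"])
assert [c for c, _ in queued] == ["CWB", "VR"]
```

In practice the payload would be streamed over a low-latency transport rather than returned as a list, but the fan-out pattern, with every update replicated to all other classrooms, is the same.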
Physical Metaverse classroom in HKUST GZ and CWB campuses: These two classrooms are linked together through a mixed
reality (MR) Metaverse platform that enables classes to be shared
between the two campuses. Each classroom is equipped with a
set of sensors to track the participants and replicate their presence
through digital avatars on another campus’ classroom(s). These dig-
ital avatars are displayed in MR through projectors or headsets. As
such, it is possible to seamlessly conduct a wide range of activities,
ranging from talks and lectures to group projects involving students
and staff from both campuses.
Digital Metaverse classroom online in Virtual Reality: A significant portion of class attendants can access the learning context remotely, i.e., without any physical presence, similar to today’s remote learning via Zoom. Nonetheless, the
remote participants can virtually meet the physical participants from
HKUST GZ and CWB campuses. These users are typically either
HKUST students who cannot attend the physical lecture due to un-
expected circumstances (sickness, travel restriction, etc.) or learners
outside of HKUST who audit the course. These users can connect
to a third, fully virtual classroom through a virtual reality (VR)
headset or their computers. The HKUST students’ presence can
be represented through digital avatars displayed in both MR class-
rooms, while guest avatars enable outside users to participate in
the class (e.g., guest speakers). For instance, students from KAIST,
represented by avatars, can meet HKUST students in the Metaverse
Digital Classroom, as well as other remote class participants from
MIT and Cambridge (the lower half in Figure 2).
3.2 Architecture of the Metaverse classroom
Figure 3: General system architecture to handle synchronization
between two MR Metaverse classrooms and a VR Metaverse class-
room.
Following the unit case in the previous paragraphs, Figure 3 depicts
the system architecture for replicating physical participants situated
in the physical classroom (Classrooms 1 and 2), who can interact
with remote participants in virtual cyberspace. The participants in
the physical classroom 1 wear MR headsets that can track their
locations and other features, such as facial expressions. Meanwhile,
the physical classroom is equipped with sensors that can estimate
the exact pose of the participants. The data from the headsets and
the classroom sensors are transmitted through WiFi (headset) or
wired network (sensors) to the edge server that aggregates the data
to estimate the pose and facial expression of the participants.
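As a concrete illustration of this aggregation step, the sketch below fuses the headset’s position estimate with the room sensors’ estimate through a confidence-weighted average. The weights and numbers are hypothetical; a production system would more likely run a Kalman filter and also fuse orientation.

```python
import numpy as np

def fuse_position(headset_pos, room_pos, w_headset=0.6, w_room=0.4):
    """Confidence-weighted fusion of two position estimates for one participant."""
    headset_pos = np.asarray(headset_pos, dtype=float)
    room_pos = np.asarray(room_pos, dtype=float)
    return (w_headset * headset_pos + w_room * room_pos) / (w_headset + w_room)

# The headset and the room sensors disagree by a few centimetres; the fused
# estimate is pulled toward the (here) more trusted headset reading.
fused = fuse_position([1.00, 1.60, 2.00], [1.05, 1.58, 2.02])
```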
The server then generates the avatars and their interaction traces
accordingly, and packages them via the real-time transmission link
to both the edge server of Classroom 2 and the cloud server of the
VR classroom. The edge server in Classroom 2 identifies the vacant
seats to display virtual avatars in the mixed reality classroom. Upon
the reception of the digital information, it corrects the pose to match
the new position of the avatar and generates the scene to display to
the users through the lens of their MR headsets. Similarly, the cloud
server arranges the avatars of all users within an entirely virtual VR
classroom and transmits the results back to the remote users. Both
classrooms can rely on their own independent WiFi infrastructure
to accelerate the data transmission between the headsets and the
edge servers and minimize the end-to-end latency for replicating the
participants’ poses and expressions as digital avatars on the other
Metaverse classrooms, whether MR or VR.
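The vacant-seat step can be sketched as follows: the edge server picks the vacant seat closest to the avatar’s reported floor position and returns the translation needed to re-anchor the avatar there. Seat ids and coordinates are illustrative only.

```python
def assign_seat(vacant_seats, avatar_pos):
    """Pick the vacant seat nearest to the avatar's reported (x, z) floor
    position and compute the offset that re-anchors the avatar at that seat.

    vacant_seats: dict mapping seat id -> (x, z) position in the classroom
    avatar_pos:   (x, z) position reported by the remote classroom
    """
    best = min(vacant_seats,
               key=lambda s: (vacant_seats[s][0] - avatar_pos[0]) ** 2
                           + (vacant_seats[s][1] - avatar_pos[1]) ** 2)
    sx, sz = vacant_seats[best]
    return best, (sx - avatar_pos[0], sz - avatar_pos[1])

# Two seats are free; the avatar reported at (1.5, 2.5) is re-anchored at B1.
seat, offset = assign_seat({"A3": (0.0, 1.0), "B1": (2.0, 3.0)}, (1.5, 2.5))
assert seat == "B1" and offset == (0.5, 0.5)
```

The returned offset is exactly the pose correction described above: it is applied to the incoming avatar data before the scene is rendered in the local MR headsets.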
4 CONCLUDING NOTES AND CHALLENGES
The pandemic has catalyzed the emergence of immersive environments
that potentially impact how we work, play, and even learn and
teach [48]. We foresee that the Metaverse classroom will bring
more learner-centric, collaborative, and innovative elements to the
future classroom. Our work serves as a groundwork for the digital
transformation of learning and teaching, from a solely digital medium to a virtual-physical blended one. Despite the Metaverse classroom
being a promising learning platform for students and teachers, extra
efforts are required to meticulously investigate system performance
and user-centric evaluation and collect valuable feedback from actual
trials, i.e., teaching activities. Therefore, we call for joint research
efforts to actualize the Metaverse classroom, and in particular, we
have to address several challenging aspects including user-centric,
system and networking issues.
User Interactivity and Perception.
Learners and educators are the key actors in the Metaverse classroom. Maintaining the bandwidth of user interactivity, across both outputs and inputs, is crucial to user retention. Nonetheless, first, user inputs on mobile MR and VR headsets are far from satisfactory, resulting in low throughput rates in general [32], which can hinder users’ expression, i.e., converting one’s intentions into resultant outcomes in virtual-physical blended environments. Additionally, the current input methods of headsets are primarily speech recognition and simple hand gestures [30]. As a result, users can only deal with enriched educational contents/contexts at a degraded quality.
On the other hand, the output channels, known as feedback cues, are limited, although headsets can sufficiently provide visuals of digital entities. Such displays’ limited Field-of-View (FOV) potentially deteriorates communication efficacy among users in the Metaverse classroom [29]. Specifically, a partial view of body gestures, which rely heavily on constant visual attention, caused by the limited FOV, can lead to distorted communication outcomes. Thus, multimodal feedback cues (e.g., haptics) become necessary to maintain
the granularity of user communication. Additionally, haptic feed-
back is essential to delivering high levels of presence and realism, but
current networking constraints create delayed feedback and damage
user experiences [6]. More importantly, presence and realism can
influence the sense of ‘being together’ in the Metaverse classroom.
Navigation and Cybersickness.
Navigation is the most funda-
mental interaction and is the primary task when users move inside
the 3D virtual environment. For example, users need to move from
one position to another in the virtual classroom to communicate with
their learning peers or adapt their viewpoint to explore unknown
environments. In physical environments, humans can navigate easily
and gain consistent multisensory feedback from the visual, proprioceptive and vestibular systems. In contrast, in immersive virtual environments, they cannot walk naturally due to the limitations of Metaverse devices and small workspaces. For example, users may visually perceive moving objects while their body remains in position. They are constrained to a limited physical space, making it impossible to match actual and virtual walking, and thus leading to unnatural sensory feedback. According to sensory conflict theory [36], mismatched visual and vestibular information leads users to experience cybersickness, with symptoms such as fatigue, headache, nausea, disorientation, etc. [24].
As there exists a significant correlation between the sickness level
and user discomfort [41], failure in mitigating cybersickness would
threaten the success and acceptance of the Metaverse classroom.
Several technical settings are responsible for the occurrence of cybersickness, such as latency, FOV, low frame rates, and inappropriate adjustment of navigation parameters [8]. However, susceptibility to cybersickness differs between individuals. Solutions to ease the severity of cybersickness require more attention to individual differences, including gender, gaming experience, age, ethnic origin, etc. [46], which a Metaverse classroom should take into account. The Metaverse system can acquire users’ physiological signals, predict their sickness states on an individual basis [25], and adapt the learning content and duration accordingly.
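A minimal form of such an adaptation loop, assuming a single physiological signal (heart rate) and an illustrative threshold, is sketched below; predictors in the literature use richer signals and learned per-user models [25].

```python
def adapt_session(hr_baseline, hr_now, minutes_elapsed,
                  max_minutes=30, hr_rise_threshold=0.15):
    """Toy per-user rule: recommend a break and lower-intensity content when
    heart rate rises more than 15% over the user's resting baseline, or when
    the session exceeds a time budget. All thresholds are illustrative.
    """
    hr_rise = (hr_now - hr_baseline) / hr_baseline
    if hr_rise > hr_rise_threshold or minutes_elapsed >= max_minutes:
        return "pause_and_reduce_locomotion"
    return "continue"

assert adapt_session(70, 84, 10) == "pause_and_reduce_locomotion"  # 20% rise
assert adapt_session(70, 72, 10) == "continue"
```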
Systems and Networking.
Developing such a classroom raises
significant challenges related to the synchronization of a large num-
ber of entities within a single digital space. The MR environment
requires transmitting large amounts of data to finely synchronize the
users’ actions. Although these data account for less traffic than live
video streaming, users’ actions need to be synchronized in real-time
to enable seamless interaction. As such, latency is a primary chal-
lenge. In highly interactive applications, users start to notice latency
above 100 ms. Moreover, even latency below 100 ms still affects user performance, despite being less noticeable [11]. Latency will therefore be a
primary concern in such a classroom. Another major challenge lies
in sharing the real-time course with thousands of remote users scattered
worldwide. Although some works strive to address massively multi-user
systems [13], they do not address the stringent latency constraints that
arise when users located far away, or on poorly interconnected networks
(whether due to peering agreements or firewalls along the way),
experience round-trip latencies on the order of hundreds of milliseconds.
Most gaming platforms solve this issue by setting up regional servers.
However, the universality of education complicates this task, as a course
taught in a certain geographical location on the Metaverse should be
accessible to anybody with an Internet connection. Another major concern
is the fine rendering of
the digital avatars of physical participants. Owing to the pervasive
sensing capabilities of the physical MR classroom, it will be possible
to rebuild highly accurate representations of the physical participants
as sophisticated avatars. However, these avatars may be too complex
to render with WebGL and lightweight VR headsets. As such, it
may be necessary to leverage remote servers (cloud and edge) to pre-
render some elements of the digital scene. One solution would be to
render a low-quality version of the models on-device and merge the
rendered frame with high-quality frames rendered in the cloud [27].
Finally, many courses may rely on video transmission, whether of
the instructor, digital artefacts (e.g., slides), or physical objects in
the classroom (e.g., whiteboard, physical items used in a lab). These
video frames need to be transmitted in real-time to match both the
avatars’ actions and the related audio transmission. High video quality
(high resolution with few artifacts) is also necessary so that the
presented information remains legible. Maximizing video
quality while minimizing latency to an imperceptible level has been
a significant research challenge in the cloud gaming community, and
solutions leveraging joint source coding and forward error correction
at the application level are presenting promising results [4].
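The split-rendering idea above implies a client-side policy: stream high-quality cloud-rendered frames only when the measured round trip leaves room inside the interaction budget, and otherwise fall back to low-quality on-device models. A minimal sketch follows; the 100 ms budget comes from the latency discussion above, while the per-stage timing constants are illustrative assumptions, not measurements.

```python
# Client-side policy sketch for split rendering. The interaction budget
# reflects the ~100 ms threshold at which users notice latency; the
# other constants are assumed, illustrative costs.
INTERACTION_BUDGET_MS = 100.0   # latency users start to notice
CLOUD_ENCODE_DECODE_MS = 15.0   # assumed codec overhead for streamed frames

def choose_renderer(rtt_ms: float) -> str:
    """Return 'cloud' when high-quality remote rendering of complex
    avatars still fits the interaction budget, else 'local' to fall
    back to low-quality on-device models."""
    cloud_total_ms = rtt_ms + CLOUD_ENCODE_DECODE_MS
    return "cloud" if cloud_total_ms <= INTERACTION_BUDGET_MS else "local"
```

A real client would re-evaluate this decision continuously as network conditions change, and could blend the two modes per object rather than per frame, as in the frame-merging approach of [27].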
Content Democratization and Privacy. The Metaverse encourages every
participant to contribute content to the virtual-physical blended
cyberspace [31]. Class participants in the proposed classroom, whether
learners or educators, are thus expected to contribute learning content
in various educational contexts. NFTs and well-designed economic models
are key to sustaining user contributions, as contributors expect credits
and rewards. Finally, as newly created content will persist in the
classroom cyberspace, we have to consider the appropriateness of content
overlays from a privacy-preserving perspective [26]. Improper
augmentation of content in the Metaverse can pose privacy threats and
even risks of copyright infringement.
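Crediting contributions presupposes a provenance record for each learner-created asset. The sketch below shows the minimal data involved, with content hashes as stable identifiers that an NFT mint could later reference; the class and its method names are hypothetical and do not correspond to any real NFT API.

```python
import hashlib
import time

class ContributionLedger:
    """Toy provenance ledger: records who contributed which asset,
    keyed by a SHA-256 content hash. Illustrative only."""

    def __init__(self):
        self.records = []

    def register(self, author: str, content: bytes) -> str:
        """Record a contribution and return its content hash, which a
        downstream reward mechanism (e.g., an NFT mint) could reference."""
        digest = hashlib.sha256(content).hexdigest()
        self.records.append({"author": author, "hash": digest,
                             "timestamp": time.time()})
        return digest

    def credits(self, author: str) -> int:
        """One credit per registered contribution by this author."""
        return sum(1 for r in self.records if r["author"] == author)
```

Any real deployment would also need the privacy and copyright checks discussed above before a contribution becomes visible to other participants.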
REFERENCES
[1] Star trek classroom: the next generation of school desks. Durham University News, Nov 2012. https://www.dur.ac.uk/news/newsitem/?itemno=15991.
[2] The expanding role of immersive media in education. Proc. of the 14th IADIS Inter. Conf. e-Learning 2020, EL 2020 - Part of the 14th Multi Conf. on MCCSIS 2020, pages 191–194, 2020.
[3] M. Akçayır and G. Akçayır. Advantages and challenges associated with augmented reality for education: A systematic review of the literature. Educational Research Review, 20:1–11, 2017.
[4] A. Alhilal et al. Nebula: Reliable low-latency video transmission for mobile cloud gaming. arXiv:2201.07738, 2022.
[5] R. T. Azuma. A survey of augmented reality. Presence: Teleoperators & Virtual Environments, 6(4):355–385, 1997.
[6] C. Bermejo et al. Exploring button designs for mid-air interaction in virtual reality: A hexa-metric evaluation of key representations and multi-modal cues. Proc. of the ACM on Human-Computer Interaction, 5:1–26, 2021.
[7] J. W. Brelsford. Physics education in a virtual environment. Proc. of the Human Factors and Ergonomics Society Annual Meeting, 37(18):1286–1290, Oct 1993.
[8] J.-R. Chardonnet, M. A. Mirzaei, and F. Merienne. Influence of navigation parameters on cybersickness in virtual reality. Virtual Reality, 25(3):565–574, 2021.
[9] Z. Chen et al. Learning from home: A mixed-methods analysis of live streaming based remote education experience in Chinese colleges during the COVID-19 pandemic. In Proc. of the 2021 CHI Conf. on Human Factors in Comp. Sys., CHI '21, NY, USA, 2021. ACM.
[10] K.-H. Cheng and C.-C. Tsai. Affordances of augmented reality in science learning: Suggestions for future research. J. of Science Education and Technology, 22(4):449–462, 2013.
[11] M. Claypool and K. Claypool. Latency and player actions in online games. Commun. ACM, 2006.
[12] M. Cook et al. Challenges and strategies for educational virtual reality: Results of an expert-led forum on 3D/VR technologies across academic institutions. Information Technology and Libraries, 38(4):25–48, 2019.
[13] J. Donkervliet, A. Trivedi, and A. Iosup. Towards supporting millions of users in modifiable virtual environments by redesigning Minecraft-like games as serverless systems. In 12th USENIX Workshop on Hot Topics in Cloud Computing (HotCloud 20), 2020.
[14] H. Duan, J. Li, S. Fan, Z. Lin, X. Wu, and W. Cai. Metaverse for social good: A university campus prototype. In Proc. of the 29th ACM Inter. Conf. on Multimedia, pages 153–161, New York, NY, USA, Oct 2021. ACM.
[15] N. Elmqaddem. Augmented reality and virtual reality in education. Myth or reality? Inter. J. of Emerging Technologies in Learning (iJET), 14(03):234, Feb 2019.
[16] C. P. Fabris et al. Virtual reality in higher education. Inter. J. of Innovation in Science and Mathematics Education, 27(8):69–80, 2019.
[17] A. Gardeli and S. Vosinakis. ARQuest: A tangible augmented reality approach to developing computational thinking skills. In 2019 11th Inter. Conf. on Virtual Worlds and Games for Serious Applications (VS-Games), 2019.
[18] D. R. Garrison, T. Anderson, and W. Archer. Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2-3):87–105, 1999.
[19] N. Gavish et al. Evaluating virtual reality and augmented reality training for industrial maintenance and assembly tasks. Interactive Learning Environments, 23(6):778–798, 2015.
[20] K. A. Greenan. The influence of virtual education on classroom culture. Frontiers in Communication, 6:10–13, Mar 2021.
[21] S. Higgins et al. Multi-touch tables and collaborative learning. British J. of Educational Technology, 43(6):1041–1054, 2012.
[22] M.-B. Ibáñez and C. Delgado-Kloos. Augmented reality for STEM learning: A systematic review. Computers & Education, 123:109–123, Aug 2018.
[23] M. S. Y. Jong, C. C. Tsai, H. Xie, and F. Kwan-Kit Wong. Integrating interactive learner-immersed video-based virtual reality into learning and teaching of physical geography. British J. of Educational Technology, 2020.
[24] R. S. Kennedy et al. Simulator sickness questionnaire: An enhanced method for quantifying simulator sickness. The Inter. J. of Aviation Psychology, 3(3):203–220, 1993.
[25] J. Kim et al. A deep cybersickness predictor based on brain signal analysis for virtual reality contents. In 2019 IEEE/CVF ICCV, pages 10579–10588, Seoul, Korea, Oct 2019. IEEE.
[26] A. Kumar et al. Theophany: Multimodal speech augmentation in instantaneous privacy channels. Proc. of the 29th ACM Inter. Conf. on Multimedia, 2021.
[27] K. Lee et al. Outatime: Using speculation to enable low-latency continuous interaction for mobile cloud gaming. In Proc. of the 13th Annual Inter. Conf. on Mobile Systems, Applications, and Services, 2015.
[28] L.-H. Lee, T. Braud, et al. All one needs to know about metaverse: A complete survey on technological singularity, virtual ecosystem, and research agenda. arXiv:2110.05352, 2021.
[29] L.-H. Lee et al. From seen to unseen: Designing keyboard-less interfaces for text entry on the constrained screen real estate of augmented reality headsets. Pervasive Mob. Comput., 64:101148, 2020.
[30] L.-H. Lee et al. UbiPoint: Towards non-intrusive mid-air interaction for hardware constrained smart glasses. Proc. of the 11th ACM Multimedia Sys. Conf., 2020.
[31] L.-H. Lee et al. When creators meet the metaverse: A survey on computational arts. arXiv:2111.13486, 2021.
[32] L.-H. Lee et al. Towards augmented reality driven human-city interaction: Current research on mobile headsets and future challenges. ACM Computing Surveys (CSUR), 54:1–38, 2022.
[33] M. Lieux et al. Online conferencing software in radiology: Recent trends and utility. Clinical Imaging, 76:116–122, 2021.
[34] Y. Lu, Y. Xu, and X. Zhu. Designing and implementing VR2E2C, a virtual reality remote education for experimental chemistry system. Journal of Chemical Education, 98(8):2720–2725, Aug 2021.
[35] J. A. Munoz-Cristobal et al. Supporting teacher orchestration in ubiquitous learning environments: A study in primary education. IEEE Trans. on Learning Technologies, 8(1):83–97, 2014.
[36] C. M. Oman. Motion sickness: A synthesis and evaluation of the sensory conflict theory. Canadian J. of Physiology and Pharmacology, 68(2):294–303, 1990.
[37] E. M. Onyema et al. Impact of coronavirus pandemic on education. J. of Education and Practice, 11(13):108–121, 2020.
[38] P. P. Rega and B. N. Fink. Immersive simulation education: A novel approach to pandemic preparedness and response. Public Health Nursing, 31(2):167–174, 2014.
[39] A. Shoufan. What motivates university students to like or dislike an educational online video? A sentimental framework. Computers & Education, 134:132–144, 2019.
[40] P. Soltani and A. H. Morice. Augmented reality tools for sports education and training. Computers & Education, 155:103923, 2020.
[41] A. Somrak et al. Estimating VR sickness and user experience using different HMD technologies: An evaluation study. Future Generation Comp. Sys., 94:302–316, 2019.
[42] H. Song, J. Kim, and N. Park. I know my professor: Teacher self-disclosure in online education and a mediating role of social presence. Inter. J. of Human–Computer Interaction, 35(6):448–455, 2019.
[43] Y. M. Tang, K. M. Au, H. C. Lau, G. T. Ho, and C. H. Wu. Evaluating the effectiveness of learning design with mixed reality (MR) in higher education. Virtual Reality, 24(4):797–807, 2020.
[44] D. Velev and P. Zlateva. Virtual reality challenges in education and training. Inter. J. of Learning, 3(1):33–37, 2017.
[45] Y. Wang et al. Development of a speed protector to optimize user experience in 3D virtual environments. Inter. J. of Human-Computer Studies, 147:102578, Dec 2021.
[46] Y. Wang et al. Using fuzzy logic to involve individual differences for predicting cybersickness during VR navigation. In 2021 IEEE VR, pages 373–381, Lisbon, Portugal, Mar 2021. IEEE.
[47] C. D. Wickens. Virtual reality and education. IEEE Inter. Conf. on Systems, Man and Cybernetics, 1992.
[48] N. R. Wijesooriya et al. COVID-19 and telehealth, education, and research adaptations. Paediatric Respiratory Reviews, 35:38–42, 2020.
[49] R. Winkler et al. Sara, the lecturer: Improving learning in online education with a scaffolding-based conversational agent, pages 1–14. ACM, NY, USA, 2020.
[50] E. Yalow and R. E. Snow. Individual differences in learning from verbal and figural materials. Technical report, Stanford Univ Calif School of Education, 1980.
[51] S. Zhang and Q. Liu. Investigating the relationships among teachers' motivational beliefs, motivational regulation, and their learning engagement in online professional learning communities. Computers & Education, 134:145–155, 2019.
[52] Y. Zhang et al. Virtual reality applications for the built environment: Research trends and opportunities. Automation in Construction, 118:103311, 2020.