Re-shaping Post-COVID-19 Teaching and Learning: A Blueprint of Virtual-Physical Blended Classrooms in the Metaverse Era



During the COVID-19 pandemic, most countries have experienced some form of remote education through video conferencing software platforms. However, these software platforms present significant limitations that reduce immersion and fail to replicate the classroom experience. The currently emerging Metaverse addresses many of such limitations by offering blended physical-digital environments. This paper aims to assess how the Metaverse can support and improve e-learning. We first survey the latest applications of blended environments in education and highlight the primary challenges and opportunities. Accordingly, we derive our proposal for a virtual-physical blended classroom configuration that brings students and teachers into a shared educational Metaverse. We focus on the system architecture of the Metaverse classroom to achieve real-time synchronization of a large number of participants and activities across physical (mixed reality classrooms) and virtual (remote virtual reality platform) learning spaces. Our proposal attempts to transform the traditional physical classroom into virtual-physical cyberspace as a new social network of learners and educators connected at an unprecedented scale.
Yuyang Wang*
Computational Media & Arts Thrust
Hong Kong University of Science and Technology
Lik-Hang Lee
Augmented Reality and Media Lab
Korea Advanced Institute of Science and Technology
Tristan Braud
Division of Integrative Systems and Design
Hong Kong University of Science and Technology
Pan Hui§
Computational Media & Arts Thrust
Hong Kong University of Science and Technology
Index Terms: Applied computing—Education—Distance learning—E-learning; Human computer interaction—Interaction
Under the complex and changing global political and economic situation and the ongoing pandemic, individual activities and modes of production face increasing challenges. For example, lectures have moved online via Zoom or MS Teams to maintain social distancing. However, current online educational content is mainly based on flat 2D displays, which lack the immersion and engagement of traditional teaching in a physical classroom. Students find it hard to stay focused during remote lectures. In this situation, the Metaverse appears as a meaningful solution: by integrating state-of-the-art technologies such as VR/AR, artificial intelligence and cloud computing [28], it can make educational activities more attractive.
The next generation of education, i.e., teaching and learning, should include diversified yet customized learning content and context, increased student interaction and creativity, and boosted
motivation and engagement [2]. Unfortunately, existing teaching
facilities and tools fail to offer these features. This work exam-
ines existing teaching and learning modalities and proposes a novel
teaching system to promote learning experience and performance
in the Metaverse era. In this context, we first present a mini-survey
on digital education platforms, including computer-mediated learn-
ing and teaching via online conferencing tools and VR/AR-based
teaching. We highlight the pros and cons of these teaching modal-
ities. For instance, Zoom enables synchronous teaching but lacks student engagement [9]. On the other hand, AR/VR provides more
engaging teaching activities thanks to 3D visualization technology
in semi-immersive or fully immersive environments. However, it
requires a high-quality and robust network for synchronizing graphi-
cal data [52]. Therefore, we put forward a virtual-physical blended
classroom, with the ambition of migrating physical classrooms and
remote educational activities to the unified yet immersive cyberspace,
known as the Metaverse. Our proposed Metaverse classroom con-
tains two physical classrooms at two university campuses and one
digital classroom hosted in the edge-cloud synchronized computing
devices. The digital classroom enables users (i.e., learners and edu-
cators) to appear at a physical lecture from their home campus, in
which the digital twins of such users, as avatars, can interact with
users from other campuses.
Review methodology: First, we search Google Scholar for the terms “VR education” and “AR education”, plus additional keywords including “application” and “challenge”, and find that related publications are spread among several journals and conferences. Second, we choose papers mainly from established publishers: ACM, IEEE, Springer, Elsevier and Taylor & Francis. Third, we favour papers with top citations from the fields of education and VR/AR. Finally, if no proper references can be found in the former steps, we expand our search in Google Scholar with other keywords such as “VR/AR applications” and filter by investigating
the relevance to the educational field. This mini-survey illustrates
the paradigm shift in teaching and learning in the post-COVID-19
period. The classroom moves from computer-mediated learning or
VR/AR-based applications to a fully virtual-physical blended one.
This paper’s contributions are twofold, as follows. First, we
review the latest developments in remote learning and interactive media such as AR/VR/MR. Second, we propose the Metaverse classroom, containing two physical classrooms and one cloud-based virtual classroom, together with an architecture for implementing the system. We accordingly pinpoint the grand challenges of establishing the proposed Metaverse classroom.
arXiv:2203.09228v1 [cs.HC] 17 Mar 2022

The recent development of AR/VR provides evidence that immersive elements are no longer a myth in learning contexts [15]. We highlight some important works related to computer-mediated education and relevant studies of AR/VR, as presented in Figure 1. Unfortunately, learners from disadvantaged families still find it hard to overcome the challenges imposed by COVID-19 because they lack stable network connections, powerful computers and digital skills [37].
Computer-mediated Education.
Since the outbreak of COVID-
19, the lack of high-quality learning materials and efficient teaching
tools has posed a threat to traditional classroom-based education.
Learning remotely through a video conferencing system has pro-
vided an opportunity to maintain lecturing and communicating activ-
ities. However, the HCI community also identified some challenges
arising from remote teaching [9]. For instance, students may find
it challenging to pay continuous attention because of multitasking
and unexpected interruptions. They tend to experience decreased learning efficacy, as live-streamed learning takes longer, and reduced engagement/collaboration, because the learning environment changes from a community-based public area supported by various university facilities to a private space with limited resources. Researchers have proposed several solutions to mitigate these deficiencies for both teachers and students during live-streamed courses, e.g., building voice-based conversational agents to enable natural interaction [49], assigning learning tasks to learners to improve motivational beliefs [51], and boosting explanation quality and presentation skills [39].
Durham University developed a multi-touch and multi-user desk
to boost children’s mathematical skills in a collaborative and inter-
active way [21]. Pupils can work together to solve problems and
answer questions through some inventive solutions. Compared to
traditional paper-based activities, the multi-touch table can improve
mathematical flexibility and fluency and promote higher levels of
active student engagement.
VR in Education.
VR integrates five essential components: 3D
perspective, dynamic rendering, closed-loop interaction, inside-out
perspective and enhanced sensory feedback [47]. Thanks to rising
computing power and the easy access to many affordable head-
mounted displays (HMDs), VR has proved to be helpful in various
applications such as surgical operations, psychotherapy and STEM
education [16].
The lack of opportunities for practical and hands-on manipula-
tion of objects in the real world is linked to the fact that children
are usually poor at intuitive physics [7]. Therefore, Brelsford [7]
designed a physics simulator to give students a better intuition about
the physical process inside an immersive environment. One group
of students was given, in this simulator, a pendulum of controllable length and three balls of varied mass. These students could manipulate gravity, mass, location of the balls, air drag, friction, period of the pendulum, force, motion, and starting force of the pendulum to carry out experiments for an hour. A control group of students was instead given a lecture on the same material in the physical classroom. An exam was assigned four weeks later. The results indicated that those who had learned through the virtual laboratory showed better retention than those from the lecture-based learning group.
Similarly, Yalow and Snow [50] reported that offering learners
instructional material in spatial-graphical format, instead of verbal
words, will improve their immediate comprehension of the material.
Therefore, many studies have investigated the benefits of learning
in virtual environments. Rega and Fink [38] performed a pioneering study that designed a novel approach to pandemic preparedness and response based on immersive simulation. The authors developed a semester-long course in simulated environments for Master of Public Health (MPH) students. Students were grouped to represent different country health departments throughout the semester, where they gained incident command training and received audio lectures and learning materials related to the imminent pandemic. Although the learning took place entirely in virtual environments, students considered this training paradigm groundbreaking, fascinating and educational. Such work provides another solution to combat the current COVID-19 global pandemic [33].
VR can be promising for education and training, but challenges
still exist [44]. First, VR is often regarded as a game for entertainment and relaxation. Learners pay more attention to winning the game than to acquiring knowledge and improving critical thinking skills, which diverges from the teaching objective. Second, VR is not perfect from a psychological point of view. For example, older lecturers may prefer the classical teaching approach to the digital one, in contrast to younger students. Long-lasting immersion in locomotion-dominated VR applications can cause discomfort and sickness symptoms (e.g., nausea, dizziness and disorientation) that vary among users, which weakens users' willingness to use VR devices [45]. Third, educational institutions bear the high cost of designing and creating VR resources for different teaching objectives, and libraries lack interoperability standards and best practices for adopting VR content, making it difficult to share resources among institutions and leading to pointless duplicated work [12]. Beyond these challenges, VR has made significant contributions to other educational cases, e.g., controlling a real chemistry lab by interacting with a virtual lab [34] and teaching physical geography through spherical video-based VR immersion [23].
AR in Education.
AR is a 3D technology that boosts the user's perception of the physical world by adding a contextual layer of information [5], and it has become a favoured topic in educational research in the last decade [22]. AR technology is popular because it does not require expensive hardware or complicated equipment such as HMDs. AR can be achieved with low-cost handheld devices such as mobile phones and tablets, making AR-based educational settings available to most learners. Among learner types, K-12 students (primary and secondary students) are the most common targets, because the outstanding visualization features of AR suit students whose learning performance at this age relies mainly on seeing, hearing and other sensory channels [3].
There are many examples of the successful applications of AR
in education. Gardeli and Vosinakis [17] designed ARQuest, a collaborative mobile AR game for improving the problem-solving and computational skills of primary school students through a gamified activity, which demonstrates the application of AR systems in classroom-based interventions. ARQuest supports multi-user interaction, and students can collaborate with high engagement and motivation to solve challenging problems despite the small screen size. This work can be regarded as a 3D version of the above-mentioned 2D Star Trek classroom [21]. AR applications
can benefit learners in sports education and training to learn sports
skills, provide additional hints and feedback, stimulate practice, and
introduce new rules for creating new sports [40].
However, broad application of AR in education still faces some challenges. The most cited challenge is usability, because most AR educational settings are difficult for students to use [3]. For example, without well-designed interfaces, students might find it challenging to use an AR application and perform interactive activities, reducing effectiveness [35]. Previous work revealed that learners with AR devices required significantly longer training time than those without AR equipment, possibly due to the novelty of the AR technology and users' unfamiliarity with the learning approach [19]. Another
issue is that the AR learning environment may increase learners’
cognitive workload because of the excessive materials and compli-
cated tasks [10]. Some technical issues can also weaken learners’
expectations of AR technology. For example, location-based AR
applications rely heavily on the GPS signal to determine the po-
sition and orientation, while low sensitivity in trigger recognition
frequently appears as an issue [10].
Figure 1: Examples of different digital teaching and learning approaches through computer-aided tools and VR/AR technologies. (a) Star Trek room to boost children's mathematical abilities on one desktop [1]; (b) Teaching via online conferencing tools [9]; (c) Providing
additional information during sport training [40]; (d) Collaborative learning via ARQuest [17]; (e) Digital content representation of teaching
material [43]; (f) Controlling the real lab inside the VR [34]; (g) Teaching physical geography with video-based VR [23]; (h) Construction of a
virtual campus at CUHK SZ [14].
Traditional teaching activities depend on verbal and nonverbal interactions between teachers and students to formulate and enrich cultural norms, behaviors, practices, and beliefs, but learning on a virtual platform disrupts this process because online meetings change individuals' communication behaviors [20]. Therefore, we expect to
overcome the defects of current virtual education and propose the
Metaverse classroom, which can boost student engagement, improve
learning efficiency, and create supportive connection tools during
and after class meetings.
Greenan [20] argued that social presence and self-disclosure are two crucial aspects of virtual education. Garrison et al. [18] explained social presence as socio-emotional support and interaction, and the individual's ability to project themselves socially and emotionally, with their entire personality, through a communication tool; it is influenced by conversations, activities, collaboration, familiarity and motivation among participants. On the other hand,
during virtual education, self-disclosure could lower the feeling
of unreliability and ambiguity during communication and promote
intimacy and positive relationships [20], which reduce the mental
distance between individuals and increase trust between teachers
and students [42]. Thus, with the increasing popularity of AR and
VR education applications (section 2), establishing the Metaverse
classroom can promote the aforementioned benefits at scale.
3.1 Towards the Metaverse classroom
Due to the global pandemic, access to classrooms and laboratories is restricted for teachers and students. Online conferencing tools enable continuity of education, making lectures and tutorials virtually available to anyone. Incidentally, the pandemic demonstrated the feasibility of long-term e-learning and hence user acceptance of e-learning at scale. However, engagement is an essential component of education, and passive learning via video conferencing fails to reinforce engagement among class participants. One way to tackle this challenge, highlighted during the COVID-19 pandemic, is to introduce interactive and real-time elements for educators and learners. Recent developments in VR/AR/MR are characterized by immersive visual stimuli and real-time tracking, and the related hardware and multimedia technologies can support the Metaverse as an affordable learning platform for learners and educators around the globe. With such an unprecedented opportunity, and as VR/AR/MR devices mature, we can employ the Metaverse as a new social platform and an efficient tool for teaching and learning.
We propose an educational platform that supports the teaching
and learning experience with the Metaverse, with the following
features: (i) learning assessment in the Metaverse for the courses,
(ii) interaction with presentations in the Metaverse, and (iii) teach-
ing experience with augmentation that benefits from visualizing
knowledge and from 3D virtual entities. The platform aims to bring more engaging and interactive tutorials and mixed learning. Remarkably, the virtual-physical blended classroom in the Metaverse situates educators and learners in intuitive, context-aware environments. Thus, the Metaverse classroom provides effective communication media among participants from various campuses. At the same time, the changeover to such a virtual-physical environment brings new value to class participants in terms of learning effectiveness, sense of (virtual) presence, interactive learning experience, etc. Although such a Metaverse classroom connects to numerous possible usage scenarios, we specifically highlight several class participants' interactions, as follows.
Gamified Learning and Task-based Modules.
Gamification clearly facilitates active class dynamics, e.g., by designing digital “breakouts” for teams of students or letting students collaborate to create their own.
Learner Collaborations.
Students can be challenged to work in teams to solve a riddle or puzzle, or to bring surreal scenes, i.e., augmented and virtual realities, into the classroom, for instance by incorporating a 360-degree video scene.
Learner-driven Activities.
Empowering students to create “choose your own adventure”-style stories or presentations to share their content (learning outcomes, opinions, a speech, etc.) with the Metaverse community.
Saving Instructors’ Time.
With unlimited possibilities for creating experiences and new features being added continuously, instructors would become more willing to spend their time designing new curricula. The saved time also lets instructors devote more energy to their students, e.g., creating diversified and even personalized activities to improve their students’ skills and knowledge.
Access to Limited/restricted Equipment.
Many students cannot access labs due to COVID-19. As people can access the Metaverse classroom anytime and anywhere, the virtual-physical presence of class participants enables ubiquitous learning and real-time access to lab resources (e.g., a virtual lab as a digital twin) as well as other limited or restricted resources (e.g., testing uranium in the Metaverse).
Figure 2: The MR Metaverse classroom allows students, instructors, and speakers located physically on the HKUST GZ campus, the HKUST CWB campus, or online to interact within the same activity (lecture, tutorial, seminar). Through MR, the virtual classroom enables users to attend a physical lecture on their home campus, while users from the other campus and online users are represented through avatars.

We consider that the Metaverse classroom consists of both physical and virtual classrooms. As shown in Figure 2, learners and instructors are situated in classrooms spread over multiple campuses, while the virtual classroom bridges the physical class participants together. The physical participants at one location can meet the virtual participants and the physical participants at other locations.
In the following paragraphs, we outline the above concept with a unit case in which the participants are situated in two physical classrooms and one virtual classroom spread over two campuses (Hong Kong Clear Water Bay, CWB, and Guangzhou, GZ). These three classrooms are synchronized so that the intervention of a participant in any of them is visible to the attendants in the other two classrooms through his or her avatar representation (i.e., the digital twin of the class participant). It is important to note that the number of physical classrooms is expandable, i.e., more than two physical classrooms can access the Metaverse classroom.
Physical Metaverse classrooms on the HKUST GZ and CWB campuses: These two classrooms are linked together through a mixed reality (MR) Metaverse platform that enables classes to be shared
between the two campuses. Each classroom is equipped with a
set of sensors to track the participants and replicate their presence
through digital avatars on another campus’ classroom(s). These dig-
ital avatars are displayed in MR through projectors or headsets. As
such, it is possible to seamlessly conduct a wide range of activities,
ranging from talks and lectures to group projects involving students
and staff from both campuses.
Digital Metaverse classroom online in Virtual Reality: A significant portion of class attendants can access the learning context remotely, i.e., without any physical presence, similar to today's remote learning via Zoom. Nonetheless, the remote participants can virtually meet the physical participants from the HKUST GZ and CWB campuses. These users are typically either HKUST students who cannot attend the physical lecture due to unexpected circumstances (sickness, travel restrictions, etc.) or learners outside of HKUST who audit the course. These users can connect
to a third, fully virtual classroom through a virtual reality (VR)
headset or their computers. The HKUST students’ presence can
be represented through digital avatars displayed in both MR class-
rooms, while guest avatars enable outside users to participate in
the class (e.g., guest speakers). For instance, students from KAIST,
represented by avatars, can meet HKUST students in the Metaverse
Digital Classroom, as well as other remote class participants from
MIT and Cambridge (the lower half in Figure 2).
3.2 Architecture of the Metaverse classroom
Figure 3: General system architecture to handle synchronization between two MR Metaverse classrooms and a VR Metaverse classroom.

Following the unit case in the previous paragraphs, Figure 3 depicts
the system architecture for replicating physical participants situated
in the physical classroom (Classrooms 1 and 2), who can interact
with remote participants in virtual cyberspace. The participants in
the physical classroom 1 wear MR headsets that can track their
locations and other features, such as facial expressions. Meanwhile,
the physical classroom is equipped with sensors that can estimate
the exact pose of the participants. The data from the headsets and
the classroom sensors are transmitted through WiFi (headset) or
wired network (sensors) to the edge server that aggregates the data
to estimate the pose and facial expression of the participants.
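The aggregation step above can be sketched as a confidence-weighted fusion of the headset's self-tracked pose and the pose estimated by the classroom sensors. This is a minimal stand-in for a full sensor-fusion pipeline (e.g., a Kalman filter); the function name, weights, and example values are illustrative, not part of the proposed system:

```python
def fuse_pose(headset_pose, sensor_pose, w_headset, w_sensor):
    """Confidence-weighted fusion of two pose estimates.

    headset_pose / sensor_pose: sequences of coordinates (e.g. [x, y, z]).
    w_headset / w_sensor: positive confidence weights; higher weight
    means the corresponding estimate is trusted more.
    """
    total = w_headset + w_sensor
    return [(w_headset * h + w_sensor * s) / total
            for h, s in zip(headset_pose, sensor_pose)]

# Example: the headset drifts slightly; the room sensors are trusted more.
fused = fuse_pose([1.0, 1.7, 2.0], [1.1, 1.7, 2.1],
                  w_headset=0.4, w_sensor=0.8)
```

In practice the edge server would run such a fusion per tracked joint and per frame, keeping the headset's low-latency estimate between sensor updates.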
The server then generates the avatars and their interaction traces accordingly, and transmits them via a real-time link to both the edge server of Classroom 2 and the cloud server of the VR classroom. The edge server in Classroom 2 identifies vacant
seats to display virtual avatars in the mixed reality classroom. Upon
the reception of the digital information, it corrects the pose to match
the new position of the avatar and generates the scene to display to
the users through the lens of their MR headsets. Similarly, the cloud
server arranges the avatars of all users within an entirely virtual VR
classroom and transmits the results back to the remote users. Both
classrooms can rely on their own independent WiFi infrastructure
to accelerate the data transmission between the headsets and the
edge servers and minimize the end-to-end latency for replicating the
participants’ poses and expressions as digital avatars on the other
Metaverse classrooms, whether MR or VR.
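The per-participant exchange between edge servers described above can be sketched as follows: the sending edge server serializes a small state packet, and the receiving edge server re-anchors the avatar at a vacant seat of its own classroom. All field names, the JSON encoding, and the first-free seat policy are illustrative assumptions, not the paper's protocol:

```python
import json
import time

def make_avatar_packet(user_id, pose, expression):
    """Serialize one participant's state for transmission to the other
    classroom's edge server and to the cloud VR server."""
    return json.dumps({
        "user": user_id,
        "pose": pose,               # [x, y, z, yaw] in the sender's local frame
        "expression": expression,   # e.g. a facial blendshape label
        "t": time.time(),           # send timestamp, for latency tracking
    }).encode()

def place_avatar(packet, vacant_seats):
    """On reception, re-anchor the avatar at a vacant seat so its position
    matches the receiving classroom's layout, keeping its orientation."""
    state = json.loads(packet)
    sx, sy, sz = vacant_seats.pop(0)        # naive first-free seat assignment
    x, y, z, yaw = state["pose"]
    state["pose"] = [sx, sy, sz, yaw]
    return state

pkt = make_avatar_packet("alice", [0.0, 1.2, 0.0, 90.0], "smile")
placed = place_avatar(pkt, vacant_seats=[(3.0, 1.2, 4.0)])
```

A production system would stream binary deltas rather than full JSON packets, but the re-anchoring step (correcting the pose to the new seat) is the same.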
The pandemic catalyzes the emergence of immersive environments
that potentially impact how we work, play, and even learn and
teach [48]. We foresee that the Metaverse classroom will bring
more learner-centric, collaborative, and innovative elements to the
future classroom. Our work serves as groundwork for the digital transformation of learning and teaching, from a purely digital medium to a virtual-physical blended one. Despite the Metaverse classroom
being a promising learning platform for students and teachers, extra
efforts are required to meticulously investigate system performance
and user-centric evaluation and collect valuable feedback from actual
trials, i.e., teaching activities. Therefore, we call for joint research
efforts to actualize the Metaverse classroom, and in particular, we
have to address several challenging aspects including user-centric,
system and networking issues.
User Interactivity and Perception.
Learners and educators are
the key actors in the Metaverse classroom. Maintaining the band-
width of user interactivity, regardless of output and inputs, is crucial
to user retention. Nonetheless, first, user input on mobile MR and VR headsets is far from satisfactory, resulting in low throughput rates in general [32], which can hinder users' expression,
i.e., converting one’s intentions into resultant outcomes in virtual-
physical blended environments. Additionally, current input methods
of headsets are primarily speech recognition and simple hand ges-
tures [30]. As a result, users can only deal with enriched educational
contents/contexts at a degraded quality.
On the other hand, the output channels, known as feedback cues, are limited, although headsets can sufficiently provide visuals of digital entities. Such displays' limited Field-of-View (FOV)
potentially deteriorates communication efficacy among users in the
Metaverse classroom [29]. Specifically, partial view of body ges-
tures, heavily relying on constant visual attention, due to limited
FOV, can lead to distorted communication outcomes. Thus, multi-
modal feedback cues (e.g., haptics) become necessary to maintain
the granularity of user communication. Additionally, haptic feed-
back is essential to delivering high levels of presence and realism, but
current networking constraints create delayed feedback and damage
user experiences [6]. More importantly, presence and realism can
influence the sense of ‘being together’ in the Metaverse classroom.
Navigation and Cybersickness.
Navigation is the most funda-
mental interaction and is the primary task when users move inside
the 3D virtual environment. For example, users need to move from
one position to another in the virtual classroom to communicate with
their learning peers or adapt their viewpoint to explore unknown
environments. In physical environments, humans can navigate easily and gain consistent multisensory feedback from the visual, proprioceptive and vestibular systems. In contrast, in immersive virtual environments, they cannot walk naturally due to the limitations of
Metaverse devices and small workspace. For example, users would
visually perceive moving objects while their body is still in position.
They are constrained to a limited physical space, making it impossible to match actual and virtual walking, and thus leading to unnatural sensory feedback. According to the sensory conflict theory [36], mismatched visual and vestibular information leads users to experience cybersickness, with symptoms such as fatigue, headache, nausea and disorientation [24].
As there exists a significant correlation between the sickness level
and user discomfort [41], failure in mitigating cybersickness would
threaten the success and acceptance of the Metaverse classroom.
Several technical settings are responsible for the occurrence of cy-
bersickness, such as latency, FOV, low frame rates, inappropriate
adjustment of navigation parameters [8]. However, susceptibility to cybersickness differs across individuals. Solutions to ease the severity of cybersickness must therefore pay more attention to individual differences, including gender, gaming experience, age, ethnic origin, etc. [46]. The Metaverse system can acquire users' physiological signals, predict their sickness states on an individual basis [25], and adapt the learning content and duration accordingly.
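The adaptation loop just described can be sketched as a simple policy that maps a predicted per-user sickness score to an action. The thresholds, score scale, and action names below are illustrative assumptions; a real system would calibrate them per user from physiological signals:

```python
def adapt_session(sickness_score, minutes_elapsed,
                  comfort_threshold=0.4, break_threshold=0.7):
    """Map a predicted cybersickness score in [0, 1] to a session action.

    Thresholds are illustrative; a deployed classroom would calibrate
    them individually (gender, age, gaming experience, etc.).
    """
    if sickness_score >= break_threshold:
        return "pause-session"       # enforce a rest break immediately
    if sickness_score >= comfort_threshold:
        return "reduce-locomotion"   # e.g. switch to teleport-style movement
    if minutes_elapsed > 45:
        return "suggest-break"       # long immersion even if comfortable
    return "continue"

action = adapt_session(0.55, 30)     # → "reduce-locomotion"
```

The key design choice is that the policy degrades the experience gracefully (less locomotion) before interrupting it (a forced break).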
Systems and Networking.
Developing such a classroom raises
significant challenges related to the synchronization of a large num-
ber of entities within a single digital space. The MR environment
requires transmitting large amounts of data to finely synchronize the
users’ actions. Although these data account for less traffic than live
video streaming, users’ actions need to be synchronized in real-time
to enable seamless interaction. As such, latency is a primary challenge. In highly interactive applications, users start to notice latency above 100 ms; even below 100 ms, latency still degrades user performance, although less noticeably [11]. Latency will therefore be a primary concern in such a classroom. Another major challenge lies
in sharing the real-time course with thousands of remote users scat-
tered worldwide. Although some works strive to address massively multi-user systems [13], they do not address the stringent latency requirements that arise when users are located far away or on a poorly interconnected network (whether due to peering agreements or firewalls along the way) and thus experience round-trip latencies on the order of hundreds of milliseconds. Most gaming platforms solve this issue by
setting up regional servers. However, the universality of education
complicates this task, as a course taught in a given geographical location on the Metaverse should be accessible to anybody with an Internet connection. Another major concern is the fine rendering of
the digital avatars of physical participants. Owing to the pervasive
sensing capabilities of the physical MR classroom, it will be possible
to rebuild highly accurate representations of the physical participants
as sophisticated avatars. However, these avatars may be too complex
to render with WebGL and lightweight VR headsets. As such, it
may be necessary to leverage remote servers (cloud and edge) to pre-
render some elements of the digital scene. One solution would be to
render a low-quality version of the models on-device and merge the
rendered frame with high-quality frames rendered in the cloud [27].
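One way to sketch such a split-rendering policy is to partition scene objects between the device and the cloud under an on-device complexity budget. The triangle counts and budget below are illustrative assumptions, not measured values:

```python
# Sketch of a split-rendering assignment: keep cheap objects on the
# lightweight headset and offload complex avatar meshes to a remote
# (cloud/edge) renderer, in the spirit of [27]. Numbers are illustrative.

def assign_renderers(objects: dict[str, int],
                     device_budget: int = 100_000) -> dict[str, str]:
    """Greedily keep the cheapest objects (by triangle count) on-device
    until the budget is exhausted; offload the rest to the cloud."""
    assignment = {}
    used = 0
    for name, tris in sorted(objects.items(), key=lambda kv: kv[1]):
        if used + tris <= device_budget:
            assignment[name] = "device"
            used += tris
        else:
            assignment[name] = "cloud"
    return assignment
```

A real system would also weigh per-object motion and visibility, since offloaded objects incur the extra round-trip latency discussed above.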
Finally, many courses may rely on video transmission, whether of
the instructor, digital artefacts (e.g., slides), or physical objects in
the classroom (e.g., whiteboard, physical items used in a lab). These
video frames need to be transmitted in real-time to match both the
avatars’ actions and the related audio transmission. A high video quality (high resolution with few artifacts) is also necessary so that the presented information remains legible. Maximizing video
quality while minimizing latency to an imperceptible level has been
a significant research challenge in the cloud gaming community, and
solutions leveraging joint source coding and forward error correction
at the application level are presenting promising results [4].
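The forward-error-correction side of such schemes can be illustrated by sizing the parity overhead of a video block from an estimated packet-loss rate. This is a sketch in the spirit of [4], not its actual algorithm; the safety margin is an assumption:

```python
# Illustrative FEC sizing for real-time video: choose the number of
# parity packets per block so that the expected packet losses (with a
# safety margin) are covered, avoiding retransmission delays. The
# margin value is an assumption, not a tuned parameter.

import math

def parity_packets(source_packets: int, loss_rate: float,
                   margin: float = 1.5) -> int:
    """Parity count covering the expected losses, with headroom."""
    expected_losses = source_packets * loss_rate
    return math.ceil(expected_losses * margin)
```

Joint source coding goes further by trading video bitrate against this redundancy within a fixed bandwidth budget, which is where most of the quality/latency gains come from.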
Content Democratization and Privacy.
The Metaverse encourages every participant to contribute content in the virtual-physical blended cyberspace [31]. In the proposed classroom, participants, whether learners or educators, are expected to contribute learning content in various educational contexts. NFTs and well-designed economic models are key to sustaining user contributions that expect credits and rewards. Finally, as newly created content will remain in the classroom cyberspace, we have to consider the appropriateness of content overlays from a privacy-preserving perspective [26]. Improper augmentation of content in the Metaverse can pose privacy threats and even risks of copyright infringement.
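At its simplest, crediting contributions requires an append-only record of who added what. The sketch below assumes a flat one-credit-per-contribution rule as a placeholder for the richer NFT-backed economic models envisioned above:

```python
# Minimal contribution ledger crediting learners and educators for
# content added to the classroom cyberspace. The one-credit rule and
# class interface are illustrative placeholders, not a proposed design.

from collections import defaultdict

class ContributionLedger:
    def __init__(self):
        self.records = []                # append-only contribution history
        self.credits = defaultdict(int)  # credits per contributor

    def contribute(self, contributor: str, content_id: str) -> None:
        """Record a contribution and credit its author."""
        self.records.append((contributor, content_id))
        self.credits[contributor] += 1

    def balance(self, contributor: str) -> int:
        return self.credits[contributor]
```

An on-chain variant would additionally make the history tamper-evident, which matters once credits carry real rewards.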
References
[1] Star trek classroom: The next generation of school desks. Durham University News, Nov. 2012.
[2] The expanding role of immersive media in education. In Proc. of the 14th IADIS Inter. Conf. e-Learning 2020 (EL 2020), part of the 14th Multi Conf. on MCCSIS 2020, pages 191–194, 2020.
[3] M. Akçayır and G. Akçayır. Advantages and challenges associated with augmented reality for education: A systematic review of the literature. Educational Research Review, 20:1–11, 2017.
[4] A. Alhilal et al. Nebula: Reliable low-latency video transmission for mobile cloud gaming. arXiv:2201.07738, 2022.
[5] R. T. Azuma. A survey of augmented reality. Presence: Teleoperators & Virtual Environments, 6(4):355–385, 1997.
[6] C. Bermejo et al. Exploring button designs for mid-air interaction in virtual reality: A hexa-metric evaluation of key representations and multi-modal cues. Proc. of the ACM on Human-Computer Interaction, 5:1–26, 2021.
[7] J. W. Brelsford. Physics education in a virtual environment. Proc. of the Human Factors and Ergonomics Society Annual Meeting, 37(18):1286–1290, Oct. 1993.
[8] J.-R. Chardonnet, M. A. Mirzaei, and F. Merienne. Influence of navigation parameters on cybersickness in virtual reality. Virtual Reality, 25(3):565–574, 2021.
[9] Z. Chen et al. Learning from home: A mixed-methods analysis of live streaming based remote education experience in Chinese colleges during the COVID-19 pandemic. In Proc. of the 2021 CHI Conf. on Human Factors in Comp. Sys., CHI '21, NY, USA, 2021. ACM.
[10] K.-H. Cheng and C.-C. Tsai. Affordances of augmented reality in science learning: Suggestions for future research. J. of Science Education and Technology, 22(4):449–462, 2013.
[11] M. Claypool and K. Claypool. Latency and player actions in online games. Commun. ACM, 2006.
[12] M. Cook et al. Challenges and strategies for educational virtual reality: Results of an expert-led forum on 3D/VR technologies across academic institutions. Information Technology and Libraries, 38(4):25–48, 2019.
[13] J. Donkervliet, A. Trivedi, and A. Iosup. Towards supporting millions of users in modifiable virtual environments by redesigning Minecraft-like games as serverless systems. In 12th USENIX Workshop on Hot Topics in Cloud Computing (HotCloud 20), 2020.
[14] H. Duan, J. Li, S. Fan, Z. Lin, X. Wu, and W. Cai. Metaverse for social good: A university campus prototype. In Proc. of the 29th ACM Inter. Conf. on Multimedia, pages 153–161, NY, USA, Oct. 2021. ACM.
[15] N. Elmqaddem. Augmented reality and virtual reality in education. Myth or reality? International J. of Emerging Technologies in Learning (iJET), 14(03):234, Feb. 2019.
[16] C. P. Fabris et al. Virtual reality in higher education. Inter. J. of Innovation in Science and Mathematics Education, 27(8):69–80, 2019.
[17] A. Gardeli and S. Vosinakis. ARQuest: A tangible augmented reality approach to developing computational thinking skills. In 2019 11th Inter. Conf. on Virtual Worlds and Games for Serious Applications (VS-Games 2019), 2019.
[18] D. R. Garrison, T. Anderson, and W. Archer. Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2-3):87–105, 1999.
[19] N. Gavish et al. Evaluating virtual reality and augmented reality training for industrial maintenance and assembly tasks. Interactive Learning Environments, 23(6):778–798, 2015.
[20] K. A. Greenan. The influence of virtual education on classroom culture. Frontiers in Communication, 6:10–13, Mar. 2021.
[21] S. Higgins et al. Multi-touch tables and collaborative learning. British J. of Educational Technology, 43(6):1041–1054, 2012.
[22] M.-B. Ibáñez and C. Delgado-Kloos. Augmented reality for STEM learning: A systematic review. Computers & Education, 123:109–123, Aug. 2018.
[23] M. S. Y. Jong, C. C. Tsai, H. Xie, and F. Kwan-Kit Wong. Integrating interactive learner-immersed video-based virtual reality into learning and teaching of physical geography. British J. of Educational Technology, 2020.
[24] R. S. Kennedy et al. Simulator sickness questionnaire: An enhanced method for quantifying simulator sickness. The International J. of Aviation Psychology, 3(3):203–220, 1993.
[25] J. Kim et al. A deep cybersickness predictor based on brain signal analysis for virtual reality contents. In 2019 IEEE/CVF ICCV, pages 10579–10588, Seoul, Korea, Oct. 2019. IEEE.
[26] A. Kumar et al. Theophany: Multimodal speech augmentation in instantaneous privacy channels. In Proc. of the 29th ACM Inter. Conf. on Multimedia, 2021.
[27] K. Lee et al. Outatime: Using speculation to enable low-latency continuous interaction for mobile cloud gaming. In Proc. of the 13th Annual Inter. Conf. on Mobile Systems, Applications, and Services, 2015.
[28] L.-H. Lee, T. Braud, et al. All one needs to know about metaverse: A complete survey on technological singularity, virtual ecosystem, and research agenda. arXiv:2110.05352, 2021.
[29] L.-H. Lee et al. From seen to unseen: Designing keyboard-less interfaces for text entry on the constrained screen real estate of augmented reality headsets. Pervasive Mob. Comput., 64:101148, 2020.
[30] L.-H. Lee et al. UbiPoint: Towards non-intrusive mid-air interaction for hardware constrained smart glasses. In Proc. of the 11th ACM Multimedia Sys. Conf., 2020.
[31] L.-H. Lee et al. When creators meet the metaverse: A survey on computational arts. arXiv:2111.13486, 2021.
[32] L.-H. Lee et al. Towards augmented reality driven human-city interaction: Current research on mobile headsets and future challenges. ACM Computing Surveys (CSUR), 54:1–38, 2022.
[33] M. Lieux et al. Online conferencing software in radiology: Recent trends and utility. Clinical Imaging, 76:116–122, 2021.
[34] Y. Lu, Y. Xu, and X. Zhu. Designing and implementing VR2E2C, a virtual reality remote education for experimental chemistry system. Journal of Chemical Education, 98(8):2720–2725, Aug. 2021.
[35] J. A. Munoz-Cristobal et al. Supporting teacher orchestration in ubiquitous learning environments: A study in primary education. IEEE Trans. on Learning Technologies, 8(1):83–97, 2014.
[36] C. M. Oman. Motion sickness: A synthesis and evaluation of the sensory conflict theory. Canadian J. of Physiology and Pharmacology, 68(2):294–303, 1990.
[37] E. M. Onyema et al. Impact of coronavirus pandemic on education. J. of Education and Practice, 11(13):108–121, 2020.
[38] P. P. Rega and B. N. Fink. Immersive simulation education: A novel approach to pandemic preparedness and response. Public Health Nursing, 31(2):167–174, 2014.
[39] A. Shoufan. What motivates university students to like or dislike an educational online video? A sentimental framework. Computers & Education, 134:132–144, 2019.
[40] P. Soltani and A. H. Morice. Augmented reality tools for sports education and training. Computers & Education, 155:103923, 2020.
[41] A. Somrak et al. Estimating VR sickness and user experience using different HMD technologies: An evaluation study. Future Generation Comp. Sys., 94:302–316, 2019.
[42] H. Song, J. Kim, and N. Park. I know my professor: Teacher self-disclosure in online education and a mediating role of social presence. Inter. J. of Human–Computer Interaction, 35(6):448–455, 2019.
[43] Y. M. Tang, K. M. Au, H. C. Lau, G. T. Ho, and C. H. Wu. Evaluating the effectiveness of learning design with mixed reality (MR) in higher education. Virtual Reality, 24(4):797–807, 2020.
[44] D. Velev and P. Zlateva. Virtual reality challenges in education and training. International J. of Learning, 3(1):33–37, 2017.
[45] Y. Wang et al. Development of a speed protector to optimize user experience in 3D virtual environments. Inter. J. of Human-Computer Studies, 147:102578, Dec. 2021.
[46] Y. Wang et al. Using fuzzy logic to involve individual differences for predicting cybersickness during VR navigation. In 2021 IEEE VR, pages 373–381, Lisbon, Portugal, Mar. 2021. IEEE.
[47] C. D. Wickens. Virtual reality and education. In IEEE Inter. Conf. on Systems, Man and Cybernetics, 1992.
[48] N. R. Wijesooriya et al. COVID-19 and telehealth, education, and research adaptations. Paediatric Respiratory Reviews, 35:38–42, 2020.
[49] R. Winkler et al. Sara, the lecturer: Improving learning in online education with a scaffolding-based conversational agent, pages 1–14. ACM, NY, USA, 2020.
[50] E. Yalow and R. E. Snow. Individual differences in learning from verbal and figural materials. Technical report, Stanford Univ. Calif. School of Education, 1980.
[51] S. Zhang and Q. Liu. Investigating the relationships among teachers' motivational beliefs, motivational regulation, and their learning engagement in online professional learning communities. Computers & Education, 134:145–155, 2019.
[52] Y. Zhang et al. Virtual reality applications for the built environment: Research trends and opportunities. Automation in Construction, 118:103311, 2020.