Security and Privacy in the Metaverse: The Threat of the Digital Human
Lauren Buck
Trinity College Dublin
Dublin, Ireland
lauren.e.buck.12@gmail.com
Rachel McDonnell
Trinity College Dublin
Dublin, Ireland
ramcdonn@tcd.ie
ABSTRACT
Each year, researchers and technologists are bringing the vision of the Metaverse, which is predicted to be the future of the internet, closer to becoming a reality. People will spend most of their time in this space interacting face-to-face, so to speak, with highly customizable digital avatars that seamlessly convey precise non-verbal cues from the physical movements of the users themselves. This is an exciting prospect; however, there are many privacy and security concerns that arise from this new form of interaction. Precision motion tracking is required to drive high-fidelity animation, and this affords a mass of data that has never been available before. This data provides a wealth of physical and psychological information that can reveal medical conditions, mental disorders, personality, emotion, personal identity, and more. In this paper, we discuss some implications of the availability of this data, with a focus on the psychological manipulation and coercion capabilities made available by it.
CCS CONCEPTS
• Security and privacy → Social aspects of security and privacy; • Human-centered computing → Collaborative and social computing theory, concepts and paradigms.
KEYWORDS
virtual reality, personal data collection and use, shared virtual environments, avatars, agents, biometric data, machine learning, artificial intelligence
ACM Reference Format:
Lauren Buck and Rachel McDonnell. 2022. Security and Privacy in the Metaverse: The Threat of the Digital Human. In Proceedings of CHI Conference on Human Factors in Computing Systems (CHI EA ’22, Proceedings of the 1st Workshop on Novel Challenges of Safety, Security and Privacy in Extended Reality). ACM, New York, NY, USA, 4 pages.
1 INTRODUCTION
We live in an age where personal data has been considered by some to be more valuable than oil [2]. Tech companies deliberately design applications that are addictive to their users and source user data to create personalized ad experiences in order to generate revenue. Meta reported US$33bn in revenue in the 4th quarter of 2021 alone [1], and ByteDance (TikTok) is currently valued at around US$400bn as it reigns as the top-grossing social media app of 2021 [6]. As individuals and governments grapple with the impact of personal data collection and how to protect individual users from big tech, small pockets of online scammers vie for the same data. Romance scams occur frequently online, where individuals are coerced into fake relationships that often result in manipulation and theft, and infamous ‘rug pull’ scenarios involving cryptocurrencies raked in over US$2.8bn in the last year alone [16]. Protecting the individual has never been more important as new technologies evolve and create new hosts of problems to resolve. In this paper, we consider some issues that are likely to become common and complex as a result of the widespread adoption of virtual reality (VR) technology.
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the owner/author(s).
CHI EA ’22, Proceedings of the 1st Workshop on Novel Challenges of Safety, Security and Privacy in Extended Reality, April 29 - May 5, 2022, New Orleans, LA, USA
© 2022 Copyright held by the owner/author(s).
VR is a unique technological medium that is set apart from other media by one defining attribute: 3D interaction. VR environments emulate real-world perceptions and allow users to move through space and interact with objects and people as they would naturally. This capability is driven by positional tracking, which calculates the precise position of a head-mounted display, controllers, and other trackers attached to the body within Euclidean space. This tracking captures the motions of users and allows for the embodiment of bodily self-representations (i.e., avatars), which are an essential component of the VR experience. These capabilities alone introduce a myriad of privacy and security issues that require attention, especially considering the growing body of knowledge about data extraction and psychological influence occurring due to the use of VR in experimental settings alone.
The existing catalog of literature that revolves around privacy and security in VR mainly focuses on motion tracking (gait analysis, eye tracking, general bodily motions), and there is increasing awareness of the fact that this data provides personally identifiable information. A striking study carried out by Mark Miller and colleagues reports that five minutes of motion tracking data gathered only from a head-mounted display during a typical viewing task is enough to identify a user with 95% accuracy [25]. This type of data has the potential to reveal information about users that reflects mental state and medical status. Buck and Bodenheimer note that tracking a user’s personal space representation can reveal social preferences and disorders like social anxiety [5]. Different types of body motions detected can predict levels of creativity [38] and learning [37], along with medical conditions such as autism, ADHD, and PTSD [25]. Diane Hosfelt, among others, has warned that the misuse and abuse of this data can produce life-altering consequences [15, 28].
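The re-identification risk demonstrated by studies like Miller et al.’s can be illustrated with a deliberately simplified sketch of our own (not their method): summary statistics of a head-motion stream act as a behavioral fingerprint that is matched against profiles built from earlier sessions. All function names and parameters below are hypothetical, and the synthetic "users" merely stand in for real tracking data.

```python
import numpy as np

rng = np.random.default_rng(0)

def motion_features(stream):
    """Summarize a (T, 3) head-rotation stream into a feature vector:
    per-axis mean, per-axis std, and mean frame-to-frame movement."""
    deltas = np.abs(np.diff(stream, axis=0)).mean(axis=0)
    return np.concatenate([stream.mean(axis=0), stream.std(axis=0), deltas])

def simulate_user(bias, jitter, n_frames=1000):
    """Hypothetical user: an idiosyncratic resting pose (bias) and motion
    magnitude (jitter) leak into every session they record."""
    return bias + jitter * rng.standard_normal((n_frames, 3))

users = [(rng.uniform(-1, 1, 3), rng.uniform(0.1, 0.5)) for _ in range(5)]

# Enrollment: one session per user becomes that user's motion profile.
profiles = np.array([motion_features(simulate_user(b, j)) for b, j in users])

# Identification: match a fresh, unlabeled session to the nearest profile.
def identify(stream):
    distances = np.linalg.norm(profiles - motion_features(stream), axis=1)
    return int(np.argmin(distances))

hits = sum(identify(simulate_user(*users[i])) == i for i in range(5))
print(f"{hits}/5 fresh sessions re-identified")
```

Even this nearest-profile toy re-identifies synthetic users reliably; real attacks use far richer features and classifiers, which is what makes the 95% figure plausible.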
However, the safety and privacy issues of VR that we as a community are cognizant of still remain a gray area. In this work, we hope to bring to light one pocket of issues directly related to the psychological manipulation of users by digital humans based on biometric data collection. To our knowledge, there has been little publication on this issue.
2 THE DANGERS OF THE DIGITAL HUMAN
It is no secret that VR experiences can be psychologically compelling. Mel Slater and his colleagues have published an abundance of research that can attest to this. Men report feelings of empathy toward women after experiencing sexual harassment in a woman’s body [27], the observation of a virtual body-double of oneself interacting with a crowd can reduce self-persecutory thoughts [13], and putatively stressful simulations can produce both physiological and psychological responses [22]. Why these simulations can be so impactful stems from the ability of VR users to embody virtual self-representations. The embodiment of a virtual character is a commonplace phenomenon in VR [20], and increases the perceived plausibility of the simulation [32]. These graphical self-representations, or self-avatars, do not have to match the physicality of their users in order to elicit the sensation of embodiment [12], and virtual characters are not bound by physical constraints.
The mutability of avatar appearance matters because humans are naturally prone to making judgments based on physical appearance. It has been shown, even in early 2D games with elementary graphics, that the way users interact with one another has a lot to do with looks [10, 30]. It has been made clear that attractiveness dictates how VR users perceive themselves and others and the behaviors they choose to carry out. Avatars have already been posited as potential salespeople [17, 26], whose appearance can be persuasive enough to influence decision-making [9, 19], and when embodied can embolden users to engage in risk-taking behaviors [23]. VR gives us the flexibility to be digital chameleons, and we can connect the dots to understand that this ability in the hands of those with malicious intent will drive us toward a dystopian vision of the future we are all becoming increasingly familiar with.
This is where biometric data comes into play: the appearance of the computer- or human-driven agents and avatars users are interacting with can be adapted to user preferences based on data that the user is unaware they have shared. Depending on the technology a system is using, eye tracking data can provide pupil dilation and gaze fixation data in response to visual stimuli [36], motion data can provide proxemic behavior and body language cues [5, 8], and physiological data like EEG and skin conductance can provide levels of emotional activation [33]. Additionally, facial tracking can provide a window into emotional response [3]. These non-verbal cues can be fed into clever machine learning and artificial intelligence algorithms to create personalized, idealized interaction partners.
Besides outward appearance, both voice and motion can be manipulated to be appealing to users. Software can manipulate vocal tone, pitch, and amplitude, which can allow users to change their voice from male to female and vice versa. Attractive voices that are smooth in texture and similar in pitch and timbre can be created easily via auditory morphing [4]. Vocal cloning software can mimic the sound of a particular person’s voice. Motion data, as discussed in the introduction, can give way to an expanse of physical and psychological information. Vocal expression in virtual characters has already been shown to impact social influence and attraction [34], and motion data is rich with cues that can be categorized into levels of attractiveness, which influence interaction behaviors [40].
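To illustrate how malleable a voice signal is, the toy sketch below (our own construction, unrelated to any particular product or to the morphing technique of [4]) shifts the pitch of a waveform by naive resampling. Production voice changers use phase vocoders so that duration is preserved; here a 110 Hz tone merely stands in for a voice, and the naive method also shortens the clip.

```python
import numpy as np

SR = 16_000  # sample rate in Hz

def pitch_shift(signal, factor):
    """Toy pitch shift: read the waveform `factor` times faster.
    factor > 1 raises the pitch (and, in this naive form, shortens the clip)."""
    positions = np.arange(0, len(signal) - 1, factor)
    return np.interp(positions, np.arange(len(signal)), signal)

def dominant_freq(signal):
    """Frequency bin with the largest magnitude in the spectrum."""
    spectrum = np.abs(np.fft.rfft(signal))
    return np.fft.rfftfreq(len(signal), d=1 / SR)[np.argmax(spectrum)]

t = np.arange(SR) / SR                # one second of audio
voice = np.sin(2 * np.pi * 110 * t)   # 110 Hz tone standing in for a voice
shifted = pitch_shift(voice, 1.5)     # raise the pitch by a factor of 1.5

print(dominant_freq(voice), dominant_freq(shifted))
```

A few lines of signal processing move the dominant frequency from 110 Hz to roughly 165 Hz; the same ease of manipulation applies to the tone, pitch, and amplitude changes described above.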
These interaction partners may not only be designed to be attractive to users, but may understand users deeply on both a physical and psychological level. The personality of an agent could be adapted to be most likeable to the personality type the user is exhibiting [21]. A more advanced iteration of artificial intelligence, or a human driving an avatar, could perhaps detect and empathize with medical and mental conditions to create a sense of closeness and trust with a user. There are many applications of this psychological information that can take place with both benevolent and malicious intent.
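One way such personality adaptation could be sketched (a hypothetical similarity-attraction heuristic of our own, with made-up persona names and trait scores, not the system of [21]) is to serve each user the pre-authored agent persona whose trait vector lies closest to the personality inferred from the user’s behavior:

```python
import numpy as np

# Trait order: openness, conscientiousness, extraversion,
# agreeableness, neuroticism (Big Five scores in [0, 1]).
personas = {
    "cheerful_guide": np.array([0.7, 0.6, 0.9, 0.8, 0.2]),
    "calm_advisor":   np.array([0.5, 0.9, 0.3, 0.7, 0.1]),
    "edgy_companion": np.array([0.9, 0.3, 0.7, 0.3, 0.6]),
}

def pick_persona(user_profile):
    """Similarity-attraction heuristic: serve the persona closest (in
    Euclidean distance) to the user's inferred trait vector."""
    return min(personas, key=lambda name: np.linalg.norm(personas[name] - user_profile))

# A user inferred (e.g., from interaction logs) to be introverted and diligent.
user = np.array([0.5, 0.8, 0.2, 0.7, 0.2])
print(pick_persona(user))  # → calm_advisor
```

The same matching step, driven by covertly harvested biometric cues rather than declared preferences, is exactly what makes the benevolent and malicious uses above hard to tell apart.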
Herein lies our danger. How far are we willing to take these technological capabilities? Generative adversarial networks (GANs) have already been leveraged to create human likenesses from scratch [18], and deepfakes generate video and audio to create scenes that have never happened in real life. Adapting agent and avatar appearance, personality, and interaction in order to sell products to users is not a far reach: Amazon populates recommended items based on shopping habits, and social media sites generate personalized ads, all by using machine learning techniques. Nor is it far-fetched to think avatars may be used for political coercion, as some social media sites are notorious for serving politically polarizing content to users and manipulating emotions to increase engagement. Radical groups have been known to recruit impressionable young people through online tactics. Online gambling sites take advantage of those suffering from gambling addictions, VR is already considered to promote high-risk gambling behaviors [29], and virtual influencers are already materializing [35]. Could not a strategically placed agent persuade a user to engage in high-risk behavior? Mental and physical traits extracted from biometric data could be exploited to coerce users into situations and behaviors they would otherwise refuse to engage in.
Particular attention needs to be paid to the potential for users to be manipulated not just by businesses and institutions, but by other individuals. It would be quite easy for a number of existing internet scams to spill into the Metaverse, and for their impact to be even more psychologically devastating because of the immersive aspect of VR. Extortion and bullying could put users in more personal, compromising situations, and particular concern surrounds how children, who are especially impressionable, will be protected. Online predators will be handed a whole new toolkit of coercive measures with the availability of more natural interaction. Finally, cyberattacks will expose sensitive biometric data that could be sold on the dark web, which would be devastating to one’s personal privacy.
Fortunately, there are many things that can be done to combat the misuse of biometric data before it begins, and we are all called to make positive contributions in this space. Power should be given to the user. People should be made aware of and educated on the implications of biometric data coupled with VR, and should be given the option to opt out of this type of data collection. Developers can implement cybersecurity protocols and can also choose to introduce noise to this type of data in order to generalize it and prevent it from revealing personally identifiable information. Finally, legislators can introduce laws that prevent businesses and individuals from collecting and using this data with malicious intent.
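One concrete form such noising could take (a minimal sketch of our own, with purely illustrative parameters, in the spirit of differential privacy rather than a prescribed standard) is to perturb tracking samples with Laplace noise before they leave the device:

```python
import numpy as np

rng = np.random.default_rng(42)

def noised(samples, scale=0.05):
    """Add Laplace noise to tracking samples before release; `scale`
    trades privacy against animation fidelity and is illustrative only."""
    return samples + rng.laplace(loc=0.0, scale=scale, size=samples.shape)

# Ten seconds of hypothetical head positions at 90 Hz: shape (T, 3), meters.
positions = np.cumsum(rng.normal(0, 0.001, size=(900, 3)), axis=0) + [0.0, 1.7, 0.0]

released = noised(positions)

# Gross motion useful for driving an avatar survives; the exact
# trajectory, which carries the identifying fingerprint, does not.
drift = np.abs(released.mean(axis=0) - positions.mean(axis=0))
print(drift.max())
```

Choosing the noise scale is the hard part in practice: too little leaves users identifiable, too much degrades the embodiment that makes VR compelling in the first place.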
Academics are called to make an impact through ongoing research to help understand and mitigate known and unknown psychological problems that will arise in the Metaverse. Potential research avenues include continuing to understand how users can be manipulated in advertisement scenarios [14, 24], what physical properties of agents and avatars are likely to have psychological influence over users [39], how risky behaviors translate from real to virtual scenarios [7], and the overall psychological impact of digital interaction in the Metaverse that will translate into the daily lives of users (think of how augmented images affect self-esteem [11, 31]). There are many positive impacts that interaction with digital humans can have, and it is up to us to bring about an ethical iteration of the Metaverse.
3 CONCLUSIONS
Widespread adoption of the Metaverse comes with many unique threats to user privacy and security, some of which we have broached in this work with regard to digital humans. Biometric data reveals a host of personally identifiable information, which can in turn be used to manipulate users on a psychological level through the creation of avatars that are adaptable to user preferences. With respect to security and privacy issues, the VR community is in the midst of the Collingridge dilemma: it is faced with the responsibility of understanding the potential risks that the Metaverse poses to the individual and mitigating those problems before it is too late. In the grand scheme of things, the digital human is something amazing and fearsome, and an aspect of VR that is not to be considered lightly.
ACKNOWLEDGMENTS
This research was funded by Science Foundation Ireland under
the ADAPT Centre for Digital Content Technology (Grant No.
13/RC/2106_P2) and RADICal (Grant No. 19/FFP/6409).
REFERENCES
[1] 2021. Facebook Reports Third Quarter 2021 Results. https://investor.fb.com/investor-news/press-release-details/2022/Meta-Reports-Fourth-Quarter-and-Full-Year-2021-Results/default.aspx. Accessed: 2022-02-21.
[2] Anonymous. 2017. The world’s most valuable resource is no longer oil, but data. The Economist (2017). https://www.economist.com/leaders/2017/05/06/the-worlds-most-valuable-resource-is-no-longer-oil-but-data Accessed: 2022-02-01.
[3] Jeremy N Bailenson, Emmanuel D Pontikakis, Iris B Mauss, James J Gross, Maria E Jabon, Cendri AC Hutcherson, Clifford Nass, and Oliver John. 2008. Real-time classification of evoked emotions using facial feature tracking and physiological responses. International Journal of Human-Computer Studies 66, 5 (2008), 303–317.
[4] Laetitia Bruckert, Patricia Bestelmeyer, Marianne Latinus, Julien Rouger, Ian Charest, Guillaume A Rousselet, Hideki Kawahara, and Pascal Belin. 2010. Vocal attractiveness increases by averaging. Current Biology 20, 2 (2010), 116–120.
[5] Lauren E Buck and Bobby Bodenheimer. 2021. Privacy and Personal Space: Addressing Interactions and Interaction Data as a Privacy Concern. In 2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW). IEEE, 399–400.
[6] David Curry. 2022. Top Grossing Apps (2022). Business of Apps (2022). https://www.businessofapps.com/data/top-grossing-apps/ Accessed: 2022-02-01.
[7] Carla de Juan-Ripoll, José L Soler-Domínguez, Jaime Guixeres, Manuel Contero, Noemi Álvarez Gutiérrez, and Mariano Alcañiz. 2018. Virtual reality as a new approach for risk taking assessment. Frontiers in Psychology (2018), 2532.
[8] Julius Fast. 1970. Body Language. Vol. 82348. Simon and Schuster.
[9] Ylva Ferstl, Elena Kokkinara, and Rachel McDonnell. 2017. Facial features of non-player creatures can influence moral decisions in video games. ACM Transactions on Applied Perception (TAP) 15, 1 (2017), 1–12.
[10] Ylva Ferstl, Michael McKay, and Rachel McDonnell. 2021. Facial feature manipulation for trait portrayal in realistic and cartoon-rendered characters. ACM Transactions on Applied Perception (TAP) 18, 4 (2021), 1–8.
[11] Rebecca Fribourg, Etienne Peillard, and Rachel McDonnell. 2021. Mirror, Mirror on My Phone: Investigating Dimensions of Self-Face Perception Induced by Augmented Reality Filters. In 2021 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). IEEE, 470–478.
[12] Mar Gonzalez-Franco and Tabitha C Peck. 2018. Avatar embodiment. Towards a standardized questionnaire. Frontiers in Robotics and AI 5 (2018), 74.
[13] Geoffrey Gorisse, Gizem Senel, Domna Banakou, Alejandro Beacco, Ramon Oliva, Daniel Freeman, and Mel Slater. 2021. Self-observation of a virtual body-double engaged in social interaction reduces persecutory thoughts. Scientific Reports 11, 1 (2021), 1–13.
[14] Brittan Heller and Avi Bar-Zeev. 2021. The Problems with Immersive Advertising: In AR/VR, Nobody Knows You Are an Ad. Journal of Online Trust and Safety 1, 1 (2021).
[15] Diane Hosfelt. 2019. Making ethical decisions for the immersive web. arXiv preprint arXiv:1905.06995 (2019).
[16] Stephanie Hughes. 2022. Rug-pull scams raked in over US$2.8 billion in crypto in 2021, report finds. Financial Post (2022). https://financialpost.com/fp-finance/cryptocurrency/rug-pull-scams-raked-in-over-us2-8-billion-in-crypto-in-2021-report-finds Accessed: 2022-02-01.
[17] Seung-A Annie Jin and Justin Bolebruch. 2009. Avatar-based advertising in Second Life: The role of presence and attractiveness of virtual spokespersons. Journal of Interactive Advertising 10, 1 (2009), 51–60.
[18] Tero Karras, Samuli Laine, and Timo Aila. 2019. A style-based generator architecture for generative adversarial networks. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 4401–4410.
[19] Rabia Fatima Khan and Alistair Sutcliffe. 2014. Attractive agents are more persuasive. International Journal of Human-Computer Interaction 30, 2 (2014), 142–150.
[20] Konstantina Kilteni, Raphaela Groten, and Mel Slater. 2012. The sense of embodiment in virtual reality. Presence: Teleoperators and Virtual Environments 21, 4 (2012), 373–387.
[21] Tze Wei Liew and Su-Mae Tan. 2016. Virtual agents with personality: Adaptation of learner-agent personality in a virtual learning environment. In 2016 Eleventh International Conference on Digital Information Management (ICDIM). IEEE, 157–162.
[22] Marieke AG Martens, Angus Antley, Daniel Freeman, Mel Slater, Paul J Harrison, and Elizabeth M Tunbridge. 2019. It feels real: physiological responses to a stressful virtual reality environment and its impact on working memory. Journal of Psychopharmacology 33, 10 (2019), 1264–1273.
[23] Paul R Messinger, Xin Ge, Eleni Stroulia, Kelly Lyons, Kristen Smirnov, and Michael Bone. 2008. On the relationship between my avatar and myself. Journal for Virtual Worlds Research 1, 2 (2008).
[24] Abraham Hani Mhaidli and Florian Schaub. 2021. Identifying manipulative advertising techniques in XR through scenario construction. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems. 1–18.
[25] Mark Roman Miller, Fernanda Herrera, Hanseul Jun, James A Landay, and Jeremy N Bailenson. 2020. Personal identifiability of user tracking data during observation of 360-degree VR video. Scientific Reports 10, 1 (2020), 1–10.
[26] Ian Mull, Jamie Wyss, Eunjung Moon, and Seung-Eun Lee. 2015. An exploratory study of using 3D avatars as online salespeople: The effect of avatar type on credibility, homophily, attractiveness and intention to interact. Journal of Fashion Marketing and Management (2015).
[27] Solène Neyret, Xavi Navarro, Alejandro Beacco, Ramon Oliva, Pierre Bourdin, Jose Valenzuela, Itxaso Barberia, and Mel Slater. 2020. An embodied perspective as a victim of sexual harassment in virtual reality reduces action conformity in a later Milgram obedience scenario. Scientific Reports 10, 1 (2020), 1–18.
[28] Mónika Nogel, Gábor Kovács, and György Wersényi. 2021. The Regulation of Digital Reality in Nutshell. In 12th IEEE International Conference on Cognitive Infocommunications (CogInfoCom). 1–7.
[29] Sebastian Oberdörfer, David Schraudt, and Marc Erich Latoschik. 2022. Embodied Gambling: Investigating the Influence of Level of Embodiment, Avatar Appearance, and Virtual Environment Design on an Online VR Slot Machine. Frontiers in Virtual Reality (2022), 8.
[30] Connor P Principe and Judith H Langlois. 2013. Children and adults use attractiveness as a social cue in real people and avatars. Journal of Experimental Child Psychology 115, 3 (2013), 590–597.
[31] Susruthi Rajanala, Mayra BC Maymone, and Neelam A Vashi. 2018. Selfies—living in the era of filtered photographs. JAMA Facial Plastic Surgery 20, 6 (2018), 443–444.
[32] Mel Slater. 2009. Place illusion and plausibility can lead to realistic behaviour in immersive virtual environments. Philosophical Transactions of the Royal Society B: Biological Sciences 364, 1535 (2009), 3549–3557.
[33] Feng Tian, Minlei Hua, Wenrui Zhang, Yingjie Li, and Xiaoli Yang. 2021. Emotional arousal in 2D versus 3D virtual reality environments. PLoS ONE 16, 9 (2021), e0256211.
[34] Ilaria Torre, Emma Carrigan, Katarina Domijan, Rachel McDonnell, and Naomi Harte. 2021. The Effect of Audio-Visual Smiles on Social Influence in a Cooperative Human–Agent Interaction Task. ACM Transactions on Computer-Human Interaction (TOCHI) 28, 6 (2021), 1–38.
[35] Christopher Travers. 2022. Virtual Humans. https://www.virtualhumans.org/
[36] Joseph Tao-yi Wang. 2011. Pupil dilation and eye tracking. A Handbook of Process Tracing Methods for Decision Research: A Critical Review and User’s Guide (2011), 185–204.
[37] Andrea Stevenson Won, Jeremy N Bailenson, and Joris H Janssen. 2014. Automatic detection of nonverbal behavior predicts learning in dyadic interactions. IEEE Transactions on Affective Computing 5, 2 (2014), 112–125.
[38] Andrea Stevenson Won, Jeremy N Bailenson, Suzanne C Stathatos, and Wenqing Dai. 2014. Automatically detected nonverbal behavior predicts creativity in collaborating dyads. Journal of Nonverbal Behavior 38, 3 (2014), 389–408.
[39] Nick Yee and Jeremy Bailenson. 2007. The Proteus effect: The effect of transformed self-representation on behavior. Human Communication Research 33, 3 (2007), 271–290.
[40] Katja Zibrek, Benjamin Niay, Anne-Hélène Olivier, Ludovic Hoyet, Julien Pettré, and Rachel McDonnell. 2020. The effect of gender and attractiveness of motion on proximity in virtual reality. ACM Transactions on Applied Perception (TAP) 17, 4 (2020), 1–15.
... MoE networks are not far from this fact. Therefore, HitB entities and AI-based things, in the MoE networks, should be trained and programmed in such a way that they are able to satisfy themselves security (Chow et al., 2023;Buck and McDonnell, 2022;Choi et al., 2022). These issues need cybersecurity experts and safety managers who are domine on MoE and can design/develop secure AI-based virtual things. ...
... Providing security and privacy features in each community is a primary requirement. In MoE networks, where no CA or manager exists, achieving security is challenging (Buck and McDonnell, 2022;Choi et al., 2022). It is expected that MoE developers optionally provide high-requested security properties in addition to securing MoE networks against famous security attacks (e.g., replay attack, DDoS attack, impersonation attack, man-in-the-middle attack, etc.). ...
Article
The emergence of Metaverse, a digital environment built on Web 3.0 technologies, has garnered significant attention from various industries and fields, including science and technology. This technology offers the potential to create virtual versions of physical objects, resulting in a digital twin (DT) of the physical world. This paper aims to explore the Metaverse-based Internet of everything (IoE), which we have termed the Metaverse of everything (MoE), as a three-dimensional (3D) virtual communication and relationship platform. The MoE includes Web 3.0 components, generated data, virtual individuals called Metaversians, and all virtual things in Metaverse, forming a comprehensive network of interconnected entities. Our research suggests that Metaverse will become increasingly popular, with the virtual world incorporating more aspects of the physical world. We have conducted a SWOT (Strengths - Weaknesses - Opportunities - Threats) analysis of the MoE concept and have identified several opportunities and challenges. Finally, we propose potential future directions for MoE that academic researchers, industry owners, and IT leaders can pursue. In particular, we suggest the implementation of a new service called Metaverse-as-a-creator (MaaC) based on an artificial intelligence (AI)-enabled Metaverse as a significant achievement in future Internet-based communication.
... Moreover, digital cloning is also possible for the avatar of the Metaverse. This can break the privacy of a user manipulating the avatars [40], [41]. ...
Conference Paper
The Metaverse is the future of the Internet-connected world of social networks. It uses emerging technologies such as extended reality (XR), virtual reality (VR), augmented reality (AR), and meta-quest headsets. As the popularity of the Metaverse grows, cyber-criminals are increasingly targeting it and its users. Unfortunately, security and privacy are some of the most challenging areas of the Metaverse, and yet, very little focus has been given to making it trustworthy. To achieve this goal, it is important to perform a thorough and systematic security analysis that would allow the adoption of proper mitigation strategies. A threat model-based analysis can help in such an analysis. This paper presents a threat model-based systematic security analysis of the Metaverse. We use the widely accepted STRIDE model to identify the threats of the Metaverse along with corresponding mitigation strategies. Our threat model can help the users, researchers, and developers build a trustworthy Metaverse.
... In terms of security and privacy aspects, it is important for regulation on personal data acquisition to be updated accordingly. Additionally, applications should implement robust security measures to protect personal data, as well as be transparent about how the data is being used [287]. Finally, ethical and social implications must be carefully investigated through a set of rules and regulations to avatar design implications and behaviors in the Metaverse. ...
Preprint
Full-text available
The Metaverse offers a second world beyond reality, where boundaries are non-existent, and possibilities are endless through engagement and immersive experiences using the virtual reality (VR) technology. Many disciplines can benefit from the advancement of the Metaverse when accurately developed, including the fields of technology, gaming, education, art, and culture. Nevertheless, developing the Metaverse environment to its full potential is an ambiguous task that needs proper guidance and directions. Existing surveys on the Metaverse focus only on a specific aspect and discipline of the Metaverse and lack a holistic view of the entire process. To this end, a more holistic, multi-disciplinary, in-depth, and academic and industry-oriented review is required to provide a thorough study of the Metaverse development pipeline. To address these issues, we present in this survey a novel multi-layered pipeline ecosystem composed of (1) the Metaverse computing, networking, communications and hardware infrastructure, (2) environment digitization, and (3) user interactions. For every layer, we discuss the components that detail the steps of its development. Also, for each of these components, we examine the impact of a set of enabling technologies and empowering domains (e.g., Artificial Intelligence, Security & Privacy, Blockchain, Business, Ethics, and Social) on its advancement. In addition, we explain the importance of these technologies to support decentralization, interoperability, user experiences, interactions, and monetization. Our presented study highlights the existing challenges for each component, followed by research directions and potential solutions. To the best of our knowledge, this survey is the most comprehensive and allows users, scholars, and entrepreneurs to get an in-depth understanding of the Metaverse ecosystem to find their opportunities and potentials for contribution.
Chapter
With the burgeoning growth of the metaverse and online virtual environments, new security challenges have been introduced that require careful exploration and mitigation. An increasing proportion of human interactions and transactions now take place in these digital spaces, making it essential to protect users and ensure the safety and integrity of virtual worlds. This chapter explores three dimensions of this issue. First, through a study of the types of crimes that occur in these environments, to gain a holistic understanding of the cybercrime technoscape. Second, the authors use a two-pronged approach to increase the safety of the metaverse by targeting both potential perpetrators and victims. This is achievable by identifying indicators that may be used to detect potential perpetrators or victims. Thirdly and finally, strategies and techniques to make these online communities safer are suggested.
Chapter
The metaverse is poised to offer an immersive virtual environment for people to meet, socialize, collaborate, play, and conduct business transactions. Although the metaverse has not yet reached its full maturity, it has already been misused to launch illegal activities that can jeopardize the safety and well-being of individuals, and cause harm to organizations. Despite its increased importance, “metaverse forensics” remains an unexplored research topic. This chapter starts by reviewing the state-of-the-art research related to metaverse cybersecurity threats and underlying ethical, privacy, and legal issues. It then presents the results of a forensic investigation analysis performed on the VRChat and AltspaceVR metaverse platforms. The authors present new insights into metaverse forensics in terms of accessing digital containers and retrieving useful information for forensic investigators. Additionally, they highlight the primary obstacles encountered in metaverse digital forensic investigations and put forth recommendations for future research directions.
Chapter
This concluding chapter describes the challenges that metaverse technology poses for businesses as well as users. Ethical considerations are also crucial to discuss, especially given the difficulties the metaverse presents around cyberbullying, privacy, fairness, and safety. The business implications for both consumers and enterprises outline the opportunity landscape for creating immersive end-to-end journeys. Finally, the chapter highlights what can be done today to prepare for the future.
Article
The metaverse is expected to emerge as a new paradigm for the next-generation Internet, providing fully immersive and personalised experiences for socializing, working, and playing in self-sustaining and hyper-spatio-temporal virtual world(s). Advancements in technologies like augmented reality, virtual reality, extended reality (XR), artificial intelligence (AI), and 5G/6G communication will be the key enablers behind the realization of AI-XR metaverse applications. While AI itself has many potential applications in the aforementioned technologies (e.g., avatar generation, network optimization, etc.), ensuring the security of AI in critical applications like AI-XR metaverse applications is profoundly crucial to avoid undesirable actions that could undermine users’ privacy and safety, consequently putting their lives in danger. To this end, we analyze the security, privacy, and trustworthiness aspects associated with the use of various AI techniques in AI-XR metaverse applications. Specifically, we discuss numerous such challenges and present a taxonomy of potential solutions that could be leveraged to develop secure, private, robust, and trustworthy AI-XR applications. To highlight the real implications of AI-associated adversarial threats, we designed a metaverse-specific case study and analyzed it through an adversarial lens. Finally, we elaborate upon various open issues that require further research attention from the community.
Article
Slot machines are among the games most played by people suffering from gambling disorder. New technologies like immersive Virtual Reality (VR) offer more possibilities to exploit erroneous beliefs in the context of gambling. Recent research indicates a higher risk potential when playing a slot machine in VR than on desktop. To continue this investigation, we evaluate the effects of providing different degrees of embodiment, i.e., minimal and full embodiment. The avatars used for the full embodiment further differ in their appearance, i.e., they elicit a high or a low socio-economic status. The design of the virtual environment (VE) can also influence overall gambling behavior. Thus, we embed the slot machine in two different VEs that differ in their emotional design: a colorful underwater playground environment and a virtual counterpart of our lab. These design considerations resulted in four versions of the same VR slot machine: 1) full embodiment with high socio-economic status, 2) full embodiment with low socio-economic status, 3) minimal embodiment in the playground VE, and 4) minimal embodiment in the laboratory VE. Both full embodiment versions also used the playground VE. We determine the risk potential by logging gambling frequency and stake size, and by measuring harm-inducing factors, i.e., dissociation, urge to gamble, dark flow, and illusion of control, using questionnaires. Following a between-groups experimental design, 82 participants each played one of the four versions for 20 game rounds. We recruited our sample from the students enrolled at the University of Würzburg. Our safety protocol ensured that only participants without any recent gambling activity took part in the experiment. In this comparative user study, we found no effect of embodiment or VE design on gambling frequency, stake sizes, or risk potential. However, our results provide further support for the hypothesis that the higher visual angle on gambling stimuli, and hence the increased emotional response, is the true cause of the higher risk potential.
Article
The proportion of the population who experience persecutory thoughts is 10–15%. These individuals then engage in safety-seeking behaviours, typically avoiding social interactions, which prevents disconfirmatory experiences, and hence paranoia persists. Here we show that persecutory thoughts can be reduced if, prior to engaging in social interaction in VR, participants first see their virtual body-double doing so. Thirty non-clinical participants were recruited to take part in a study where they were embodied in a virtual body that closely resembled themselves and asked to interact with members of a crowd. In the Random condition (n = 15) they observed their body-double wandering around but not engaging with the crowd. In the Targeted condition the body-double correctly interacted with members of the crowd. The Green Paranoid Thoughts Scale was measured 1 week before and 1 week after the exposure and decreased only for those in the Targeted condition. The results suggest that observing the body-double correctly carrying out a social interaction task in VR may lead to anxiety-reducing mental rehearsal for interaction, thus overcoming safety behaviours. The results also extend knowledge of the effects of vicarious agency, suggesting that identification with the actions of the body-double can influence subsequent psychological state.
Article
The Problems with Immersive Advertising: In AR/VR, Nobody Knows You Are an Ad
Conference Paper
Digital reality refers to the wide spectrum of technologies and affordances, including Augmented Reality, Virtual Reality, and Mixed Reality, that simulate reality in various ways. The current level of digital technology, and its future development toward improving user involvement, entertainment, and accessibility based on digital reality, raises not only technological questions but also regulatory, policy, and liability issues. The ever-growing market of services using public networks will offer new possibilities and dangers for users and businesses, and create space for criminal activity. Regulators try to follow and adjust laws according to these challenges. This paper briefly analyses the current level and status of regulations at the Hungarian and EU level, directing the attention of developers, system engineers, and software designers to questions of responsibility. Based on a literature review, this paper discusses issues that are currently the focus of regulation in Europe in this regard.
Article
Previous studies have suggested that virtual reality (VR) can elicit emotions in different visual modes using 2D or 3D headsets. However, the effects of these two visual modes on emotional arousal have not been comprehensively investigated, and the underlying neural mechanisms are not yet clear. This paper presents a cognitive psychological experiment conducted to analyze how the two visual modes impact emotional arousal. Forty volunteers were recruited and randomly assigned to two groups. They were asked to watch a series of positive, neutral, and negative short VR videos in 2D and 3D. Multichannel electroencephalograms (EEG) and skin conductance responses (SCR) were recorded simultaneously during their participation. The results indicated that emotional stimulation was more intense in the 3D environment due to the improved perception of the environment; greater emotional arousal was generated; and higher beta (21–30 Hz) EEG power was identified in 3D than in 2D. We also found that both hemispheres were involved in stereo vision processing and that brain lateralization existed in this processing.
Conference Paper
The privacy and security of personal data have been at the forefront of public concern for some time now, and are typically understood in the context of data collected from online interaction (social media, transactions, search engine queries, etc.). The advent of immersive technologies expands data collection beyond what can typically be extracted via online interaction, particularly in terms of the availability of biometric data (eye tracking and gait analysis). However, it has not yet been widely recognized that interactions themselves, and the data they generate, will soon be a privacy concern. We mediate interactions in everyday life through the maintenance of personal space, allowing certain individuals and objects into it. We do the same in virtual reality. Our personal space allows us to preserve our feeling of safety, and the way we mediate it reveals our biases and preferences. In this work, we examine the implications of interaction and the availability of such personal data, which will bring a host of ethical and privacy concerns.
Article
Emotional expressivity is essential for human interactions, informing both perception and decision-making. Here, we examine whether creating an audio-visual emotional channel mismatch influences decision-making in a cooperative task with a virtual character. We created a virtual character that was either congruent in its emotional expression (smiling in the face and voice) or incongruent (smiling in only one channel). People (N = 98) evaluated the character in terms of valence and arousal in an online study; then, visitors in a museum played the “lunar survival task” with the character over three experiments (N = 597, 78, 101, respectively). Exploratory results suggest that multi-modal expressions are perceived, and reacted upon, differently than unimodal expressions, supporting previous theories of audio-visual integration.
Article
Previous perceptual studies on human faces have shown that specific facial features have consistent effects on perceived personality and appeal, but it remains unclear if and how findings relate to perception of virtual characters. For example, wider human faces have been found to appear more aggressive and dominant, whereas studies on virtual characters have shown opposite trends but have suffered from significant eeriness of exaggerated features. In this study, we use highly realistic virtual faces obtained from 3D scanning, as well as cartoon-rendered counterparts retaining facial proportions. We assess the effects of facial width and eye size on perceptions of appeal, trustworthiness, aggressiveness, dominance, and eeriness. Our manipulations did not affect eeriness, and we find the same perceptual trends previously reported for human faces.