Security and Privacy in the Metaverse: The Threat of the Digital Human



Each year, researchers and technologists are bringing the vision of the Metaverse, which is predicted to be the future of the internet, closer to becoming a reality. People will spend most of their time in this space interacting face-to-face, so to speak, with highly customizable digital avatars that seamlessly convey precise non-verbal cues from the physical movements of the users themselves. This is an exciting prospect; however, there are many privacy and security concerns that arise from this new form of interaction. Precision motion tracking is required to drive high-fidelity animation, and this affords a mass of data that has never been available before. This data provides a wealth of physical and psychological information that can reveal medical conditions, mental disorders, personality, emotion, personal identity, and more. In this paper, we discuss some implications of the availability of this data, with a focus on the psychological manipulation and coercion capabilities made available by it.
Lauren Buck
Trinity College Dublin
Dublin, Ireland
Rachel McDonnell
Trinity College Dublin
Dublin, Ireland
CCS Concepts: • Security and privacy → Social aspects of security and privacy; • Human-centered computing → Collaborative and social computing theory, concepts and paradigms.

Keywords: virtual reality, personal data collection and use, shared virtual environments, avatars, agents, biometric data, machine learning, artificial intelligence
ACM Reference Format:
Lauren Buck and Rachel McDonnell. 2022. Security and Privacy in the Meta-
verse: The Threat of the Digital Human. In Proceedings of CHI Conference
on Human Factors in Computing Systems (CHI EA ’22, Proceedings of the 1st
Workshop on Novel Challenges of Safety, Security and Privacy in Extended
Reality). ACM, New York, NY, USA, 4 pages.
We live in an age where personal data has been considered by some as more valuable than oil [ ]. Tech companies deliberately design applications to be addictive and mine user data to create personalized ad experiences that generate revenue. Meta reported US$33bn in revenue in the fourth quarter of 2021 alone [ ], and ByteDance (TikTok) is currently valued at around US$400bn as it reigns as the top-grossing social media app of 2021 [ ]. As individuals and governments grapple with the impact of personal data collection and with how to protect individual users from big tech, small pockets of online scammers vie for the same data. Romance scams, in which individuals are lured into fake relationships that often end in manipulation and theft, occur frequently online, and infamous 'rug pull' schemes involving cryptocurrencies have raked in over US$2.8bn in the last year alone [ ]. Protecting the individual has never been more important, as new technologies evolve and create new hosts of problems to resolve. In this paper, we consider some issues that are likely to become common and complex as virtual reality (VR) technology is widely adopted.

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the owner/author(s).
CHI EA '22, Proceedings of the 1st Workshop on Novel Challenges of Safety, Security and Privacy in Extended Reality, April 29 - May 5, 2022, New Orleans, LA, USA
© 2022 Copyright held by the owner/author(s).
VR is a unique technological medium set apart from other media by one defining attribute: 3D interaction. VR environments emulate real-world perception and allow users to move through space and interact with objects and people as they would naturally. This capability is driven by positional tracking, which calculates the precise position of a head-mounted display, controllers, and other trackers attached to the body within Euclidean space. This tracking captures the motions of users and enables the embodiment of bodily self-representations (i.e., avatars), which are an essential component of the VR experience. These capabilities alone introduce a myriad of privacy and security issues that require attention, especially considering the growing body of knowledge about data extraction and psychological influence arising from the use of VR in experimental settings alone.
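To make the scale of this data stream concrete, the following sketch simulates the kind of record a consumer headset produces. The 90 Hz rate is a typical figure, and the random poses and derived features are purely illustrative stand-ins, not any vendor's actual format:

```python
import numpy as np

# Illustrative only: a consumer HMD reports a 6-DoF pose
# (x, y, z position plus yaw, pitch, roll) at roughly 90 Hz, so even a
# short session yields a dense per-user motion record.
rng = np.random.default_rng(0)
rate_hz = 90
seconds = 60
samples = rate_hz * seconds                    # 5,400 poses in one minute
poses = rng.normal(size=(samples, 6))          # stand-in for real tracking data

# Simple derived features of the kind motion-analysis work builds on:
velocity = np.diff(poses[:, :3], axis=0) * rate_hz       # positional velocity
features = np.concatenate([
    poses.mean(axis=0),                                  # average pose
    poses.std(axis=0),                                   # movement variability
    np.linalg.norm(velocity, axis=1).mean(keepdims=True) # mean speed
])
print(samples, features.shape)                 # prints: 5400 (13,)
```

Even this crude 13-number summary already encodes user-specific movement style; production systems log far richer signals.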
The existing literature on privacy and security in VR focuses mainly on motion tracking (gait analysis, eye tracking, general bodily motion), and there is increasing awareness that this data provides personally identifiable information. A striking study by Mark Miller and colleagues reports that five minutes of motion tracking data gathered from a head-mounted display alone during a typical viewing task is enough to identify a user with 95% accuracy [ ]. This type of data has the potential to reveal information about users that reflects mental state and medical status. Buck and Bodenheimer note that tracking a user's personal space representation can reveal social preferences and disorders such as social anxiety [ ]. Different types of detected body motion can predict levels of creativity [ ] and learning [ ], along with medical conditions such as autism, ADHD, and PTSD [ ]. Diane Hosfelt, among others, has warned that the misuse and abuse of this data can produce life-altering consequences [15, 28].
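A toy sketch conveys why identification from motion data is plausible. This is not Miller and colleagues' actual pipeline; it simply assumes each user has a characteristic motion-feature distribution (synthetic here) and matches a new session against enrolled per-user centroids:

```python
import numpy as np

# Toy illustration of motion-based re-identification: each user has a
# characteristic motion "style" (synthetic profiles below); a new,
# unlabeled session is matched to the nearest enrolled centroid.
rng = np.random.default_rng(1)
n_users, dim = 5, 13
user_profiles = rng.normal(scale=5.0, size=(n_users, dim))  # per-user style

def session_features(user_id, rng):
    """Simulate one session: the user's style plus session noise."""
    return user_profiles[user_id] + rng.normal(scale=0.5, size=dim)

# Enrollment: average features over a few sessions per user.
centroids = np.stack([
    np.mean([session_features(u, rng) for _ in range(5)], axis=0)
    for u in range(n_users)
])

def identify(features):
    """Return the enrolled user whose centroid is closest."""
    return int(np.argmin(np.linalg.norm(centroids - features, axis=1)))

probe = session_features(3, rng)   # unlabeled new session from user 3
print(identify(probe))             # prints: 3 (the session is re-identified)
```

The point is that no explicit identifier is needed: the motion statistics themselves act as the identifier once a profile exists.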
However, the safety and privacy issues of VR that we as a community are cognizant of still remain a gray area. In this work, we hope to bring to light one pocket of issues directly related to the psychological manipulation of users by digital humans on the basis of collected biometric data. To our knowledge, little has been published on this issue.
It is no secret that VR experiences can be psychologically compelling. Mel Slater and his colleagues have published an abundance of research attesting to this: men report feelings of empathy toward women after experiencing sexual harassment in a woman's body [ ], observing a virtual body-double of oneself interacting with a crowd can reduce self-persecutory thoughts [ ], and putatively stressful simulations can produce both physiological and psychological responses [ ]. The impact of these simulations stems from the ability of VR users to embody virtual self-representations. The embodiment of a virtual character is a commonplace phenomenon in VR [ ] and increases the perceived plausibility of the simulation [ ]. These graphical self-representations, or self-avatars, do not have to match the physicality of their users in order to elicit the sensation of embodiment [ ], and virtual characters are not bound by physical constraints.
The mutability of avatar appearance matters because humans are naturally prone to making judgments based on physical appearance. Even in early 2D games with elementary graphics, the way users interact with one another has been shown to depend heavily on looks [ ]. It is clear that attractiveness dictates how VR users perceive themselves and others and which behaviors they choose to carry out. Avatars have already been posited as potential salespeople [ ] whose appearance can be persuasive enough to influence decision-making [ ], and, when embodied, can embolden users to engage in risk-taking behaviors [ ]. VR gives us the flexibility to be digital chameleons, and we can connect the dots to see that this ability, in the hands of those with malicious intent, will drive us toward a dystopian vision of the future with which we are all becoming increasingly familiar.
This is where biometric data comes into play: the appearance of the computer- or human-driven agents and avatars users interact with can be adapted to user preferences based on data that users are unaware they have shared. Depending on the technology a system uses, eye tracking can provide pupil dilation and gaze fixation data in response to visual stimuli [ ], motion data can provide proxemic behavior and body language cues [ ], and physiological data such as EEG and skin conductance can indicate levels of emotional activation [ ]. Additionally, facial tracking can provide a window into emotional response [ ]. These nonverbal cues can be fed into machine learning and artificial intelligence algorithms to create personalized, idealized interaction partners.
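The adaptation loop described above can be caricatured in a few lines. Every threshold, category, and parameter name below is invented for illustration; a real system would use learned models rather than hand-coded rules:

```python
# Hypothetical sketch of the adaptation loop: sensed nonverbal cues are
# reduced to an inferred state, which then selects agent presentation
# parameters. All thresholds and labels here are invented.

def infer_state(pupil_dilation_mm, interpersonal_distance_m, scl_microsiemens):
    """Map raw cues to a crude engagement/arousal estimate."""
    engaged = pupil_dilation_mm > 4.0 and interpersonal_distance_m < 1.2
    aroused = scl_microsiemens > 8.0
    return {"engaged": engaged, "aroused": aroused}

def personalize_agent(state):
    """Choose agent presentation from the inferred state (toy policy)."""
    if state["engaged"] and state["aroused"]:
        return {"tone": "warm", "pace": "matched", "appearance": "similar-to-user"}
    if state["engaged"]:
        return {"tone": "warm", "pace": "neutral", "appearance": "default"}
    return {"tone": "neutral", "pace": "slower", "appearance": "default"}

state = infer_state(pupil_dilation_mm=4.6, interpersonal_distance_m=0.9,
                    scl_microsiemens=9.1)
print(personalize_agent(state))
# prints: {'tone': 'warm', 'pace': 'matched', 'appearance': 'similar-to-user'}
```

The concern is precisely that the user never consented to the cues on the left-hand side of this loop being collected, let alone used to steer the agent on the right.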
Besides outward appearance, both voice and motion can be manipulated to appeal to users. Software can manipulate vocal tone, pitch, and amplitude, allowing users to change their voice from male to female and vice versa. Attractive voices that are smooth in texture and similar in pitch and timbre can be created easily via auditory morphing [ ], and vocal cloning software can mimic the sound of a particular person's voice. Motion data, as discussed in the introduction, can give way to an expanse of physical and psychological information. Vocal expression in virtual characters has already been shown to impact social influence and attraction [ ], and motion data is rich with cues that can be categorized into levels of attractiveness, which influence interaction behaviors [40].
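A minimal sketch conveys the flavor of "averaging" two voices. Real morphing tools operate on much richer representations than this; here we simply interpolate the magnitude spectra of two time-aligned signals (synthetic tones stand in for recorded voices):

```python
import numpy as np

# Illustrative spectral blend, loosely in the spirit of auditory
# morphing: interpolate magnitude spectra of two aligned signals.
# Sine tones stand in for actual voice recordings.
sr = 16000
t = np.arange(sr) / sr
voice_a = np.sin(2 * np.pi * 120 * t)   # stand-in for a lower-pitched voice
voice_b = np.sin(2 * np.pi * 210 * t)   # stand-in for a higher-pitched voice

def morph(a, b, weight=0.5):
    """Blend magnitude spectra; keep the phase of signal a."""
    A, B = np.fft.rfft(a), np.fft.rfft(b)
    mag = (1 - weight) * np.abs(A) + weight * np.abs(B)
    return np.fft.irfft(mag * np.exp(1j * np.angle(A)), n=len(a))

blended = morph(voice_a, voice_b, weight=0.5)
print(blended.shape)   # prints: (16000,)
```

Sweeping `weight` from 0 to 1 moves the output continuously between the two source spectra, which is what makes "tuning" a voice toward a preferred target so cheap.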
These interaction partners may not only be designed to be attractive to users, but may also understand users deeply on both a physical and a psychological level. The personality of an agent could be adapted to be most likeable to the personality type the user is exhibiting [ ]. A more advanced iteration of artificial intelligence, or a human driving an avatar, could perhaps detect and empathize with medical and mental conditions to create a sense of closeness and trust with a user. There are many applications of this psychological information, both benevolent and malicious.
Herein lies our danger. How far are we willing to take these technological capabilities? Generative adversarial networks (GANs) have already been leveraged to create human likenesses from scratch [ ], and deepfakes generate video and audio depicting scenes that never happened in real life. Adapting agent and avatar appearance, personality, and interaction in order to sell products to users is not a far reach: Amazon populates recommended items based on shopping habits, and social media sites use machine learning to generate personalized ads. Nor is it far-fetched to think avatars may be used for political duress, as some social media sites are notorious for serving politically polarizing content to users and manipulating emotions to increase engagement, and radical groups have been known to recruit impressionable young people through online tactics. Online gambling sites take advantage of those suffering from gambling addiction, and VR is already considered to promote high-risk gambling behaviors [ ]. Additionally, virtual influencers are already materializing [ ]. Could not a strategically placed agent persuade a user to engage in high-risk behavior? Mental and physical traits extracted from biometric data could be exploited to coerce users into situations and behaviors they would otherwise refuse to engage in.
Particular attention needs to be paid to the potential for users to be manipulated not just by businesses and institutions, but by other individuals. It would be quite easy for a number of existing internet scams to spill into the Metaverse, and for their impact to be even more psychologically devastating because of the immersive nature of VR. Extortion and bullying could put users in more personal, compromising situations, and there is particular concern over how children, who are especially impressionable, will be protected. Online predators will be handed a whole new toolkit of coercive measures with the availability of more natural interaction. Finally, cyberattacks will expose sensitive biometric data that could be sold on the dark web, which would be devastating to personal privacy.
Fortunately, there are many things that can be done to combat the misuse of biometric data before it begins, and we are all called to make positive contributions in this space. Power should be given to the user: people should be made aware of and educated on the implications of biometric data coupled with VR, and should be given the option to opt out of this type of data collection. Developers can implement cybersecurity protocols and can also choose to introduce noise into this type of data in order to generalize it and prevent it from revealing personally identifiable information. Finally, legislators can introduce laws that prevent businesses and individuals from collecting and using this data with malicious intent.
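The noise-injection mitigation can be sketched simply. This example perturbs a motion-feature vector with Laplace noise before release, in the spirit of differential privacy; the noise scale here is arbitrary, whereas a real deployment would calibrate it to the features' sensitivity and a chosen privacy budget:

```python
import numpy as np

# Sketch of the "introduce noise" mitigation: perturb per-session motion
# features with Laplace noise before sharing them. The scale below is
# arbitrary and for illustration only.
rng = np.random.default_rng(2)

def privatize(features, scale=1.0):
    """Return a copy of the features with i.i.d. Laplace noise added."""
    return features + rng.laplace(loc=0.0, scale=scale, size=features.shape)

session = rng.normal(size=13)          # raw motion-feature vector
released = privatize(session, scale=1.0)
print(released.shape)                  # prints: (13,)
```

The trade-off is the usual one: larger noise scales weaken re-identification but also degrade the utility of the data for legitimate animation and analytics.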
Academics are called to make an impact through ongoing research that helps understand and mitigate the known and unknown psychological problems that will arise in the Metaverse. Potential research avenues include continuing to understand how users can be manipulated in advertisement scenarios [ ], which physical properties of agents and avatars are likely to exert psychological influence over users [ ], how risky behaviors translate from real to virtual scenarios [ ], and the overall psychological impact of digital interaction in the Metaverse on the daily lives of users (consider how augmented images affect self-esteem [ ]). There are many positive impacts that interaction with digital humans can have, and it is up to us to bring about an ethical iteration of the Metaverse.
Widespread adoption of the Metaverse comes with many unique threats to user privacy and security, some of which we have broached in this work with regard to digital humans. Biometric data reveals a host of personally identifiable information, which can in turn be used to manipulate users on a psychological level through the creation of avatars adapted to user preferences. With respect to security and privacy issues, the VR community is in the midst of the Collingridge dilemma: it is faced with the responsibility of understanding the potential risks that the Metaverse poses to the individual and mitigating those problems before it is too late. In the grand scheme of things, the digital human is something amazing and fearsome, and an aspect of VR not to be taken lightly.
This research was funded by Science Foundation Ireland under
the ADAPT Centre for Digital Content Technology (Grant No.
13/RC/2106_P2) and RADICal (Grant No. 19/FFP/6409).
References
[1] 2021. Facebook Reports Third Quarter 2021 Results. investor-news/press-release-details/2022/Meta-Reports-Fourth-Quarter-and-Full-Year-2021-Results/default.aspx. Accessed: 2022-02-21.
[2] Anonymous. 2017. The world's most valuable resource is no longer oil, but data. The Economist (2017). worlds-most-valuable-resource-is-no-longer-oil-but-data. Accessed: 2022-02-01.
[3] Jeremy N Bailenson, Emmanuel D Pontikakis, Iris B Mauss, James J Gross, Maria E Jabon, Cendri AC Hutcherson, Clifford Nass, and Oliver John. 2008. Real-time classification of evoked emotions using facial feature tracking and physiological responses. International Journal of Human-Computer Studies 66, 5 (2008), 303–317.
[4] Laetitia Bruckert, Patricia Bestelmeyer, Marianne Latinus, Julien Rouger, Ian Charest, Guillaume A Rousselet, Hideki Kawahara, and Pascal Belin. 2010. Vocal attractiveness increases by averaging. Current Biology 20, 2 (2010), 116–120.
[5] Lauren E Buck and Bobby Bodenheimer. 2021. Privacy and Personal Space: Addressing Interactions and Interaction Data as a Privacy Concern. In 2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW). IEEE, 399–400.
[6] David Curry. 2022. Top Grossing Apps (2022). Business of Apps (2022). Accessed: 2022-02-01.
[7] Carla de Juan-Ripoll, José L Soler-Domínguez, Jaime Guixeres, Manuel Contero, Noemi Álvarez Gutiérrez, and Mariano Alcañiz. 2018. Virtual reality as a new approach for risk taking assessment. Frontiers in Psychology (2018), 2532.
[8] Julius Fast. 1970. Body Language. Vol. 82348. Simon and Schuster.
[9] Ylva Ferstl, Elena Kokkinara, and Rachel McDonnell. 2017. Facial features of non-player creatures can influence moral decisions in video games. ACM Transactions on Applied Perception (TAP) 15, 1 (2017), 1–12.
[10] Ylva Ferstl, Michael McKay, and Rachel McDonnell. 2021. Facial feature manipulation for trait portrayal in realistic and cartoon-rendered characters. ACM Transactions on Applied Perception (TAP) 18, 4 (2021), 1–8.
[11] Rebecca Fribourg, Etienne Peillard, and Rachel McDonnell. 2021. Mirror, Mirror on My Phone: Investigating Dimensions of Self-Face Perception Induced by Augmented Reality Filters. In 2021 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). IEEE, 470–478.
[12] Mar Gonzalez-Franco and Tabitha C Peck. 2018. Avatar embodiment. Towards a standardized questionnaire. Frontiers in Robotics and AI 5 (2018), 74.
[13] Geoffrey Gorisse, Gizem Senel, Domna Banakou, Alejandro Beacco, Ramon Oliva, Daniel Freeman, and Mel Slater. 2021. Self-observation of a virtual body-double engaged in social interaction reduces persecutory thoughts. Scientific Reports 11, 1 (2021), 1–13.
[14] Brittan Heller and Avi Bar-Zeev. 2021. The Problems with Immersive Advertising: In AR/VR, Nobody Knows You Are an Ad. Journal of Online Trust and Safety 1, 1 (2021).
[15] Diane Hosfelt. 2019. Making ethical decisions for the immersive web. arXiv preprint arXiv:1905.06995 (2019).
[16] Stephanie Hughes. 2022. Rug-pull scams raked in over US$2.8 billion in crypto in 2021, report finds. Financial Post (2022). https:// finance/cryptocurrency/rug-pull-scams-raked-in-over-us2-8-billion-in-crypto-in-2021-report-finds. Accessed: 2022-02-01.
[17] Seung-A Annie Jin and Justin Bolebruch. 2009. Avatar-based advertising in Second Life: The role of presence and attractiveness of virtual spokespersons. Journal of Interactive Advertising 10, 1 (2009), 51–60.
[18] Tero Karras, Samuli Laine, and Timo Aila. 2019. A style-based generator architecture for generative adversarial networks. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 4401–4410.
[19] Rabia Fatima Khan and Alistair Sutcliffe. 2014. Attractive agents are more persuasive. International Journal of Human-Computer Interaction 30, 2 (2014).
[20] Konstantina Kilteni, Raphaela Groten, and Mel Slater. 2012. The sense of embodiment in virtual reality. Presence: Teleoperators and Virtual Environments 21, 4 (2012), 373–387.
[21] Tze Wei Liew and Su-Mae Tan. 2016. Virtual agents with personality: Adaptation of learner-agent personality in a virtual learning environment. In 2016 Eleventh International Conference on Digital Information Management (ICDIM). IEEE, 157–.
[22] Marieke AG Martens, Angus Antley, Daniel Freeman, Mel Slater, Paul J Harrison, and Elizabeth M Tunbridge. 2019. It feels real: physiological responses to a stressful virtual reality environment and its impact on working memory. Journal of Psychopharmacology 33, 10 (2019), 1264–1273.
[23] Paul R Messinger, Xin Ge, Eleni Stroulia, Kelly Lyons, Kristen Smirnov, and Michael Bone. 2008. On the relationship between my avatar and myself. Journal For Virtual Worlds Research 1, 2 (2008).
[24] Abraham Hani Mhaidli and Florian Schaub. 2021. Identifying manipulative advertising techniques in XR through scenario construction. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems. 1–18.
[25] Mark Roman Miller, Fernanda Herrera, Hanseul Jun, James A Landay, and Jeremy N Bailenson. 2020. Personal identifiability of user tracking data during observation of 360-degree VR video. Scientific Reports 10, 1 (2020), 1–10.
[26] Ian Mull, Jamie Wyss, Eunjung Moon, and Seung-Eun Lee. 2015. An exploratory study of using 3D avatars as online salespeople: The effect of avatar type on credibility, homophily, attractiveness and intention to interact. Journal of Fashion Marketing and Management (2015).
[27] Solène Neyret, Xavi Navarro, Alejandro Beacco, Ramon Oliva, Pierre Bourdin, Jose Valenzuela, Itxaso Barberia, and Mel Slater. 2020. An embodied perspective as a victim of sexual harassment in virtual reality reduces action conformity in a later Milgram obedience scenario. Scientific Reports 10, 1 (2020), 1–18.
[28] Mónika Nogel, Gábor Kovács, and György Wersényi. 2021. The Regulation of Digital Reality in Nutshell. In 12th IEEE International Conference on Cognitive Infocommunications (CogInfoCom). 1–7.
[29] Sebastian Oberdörfer, David Schraudt, and Marc Erich Latoschik. 2022. Embodied Gambling—Investigating the Influence of Level of Embodiment, Avatar Appearance, and Virtual Environment Design on an Online VR Slot Machine. Frontiers in Virtual Reality (2022), 8.
[30] Connor P Principe and Judith H Langlois. 2013. Children and adults use attractiveness as a social cue in real people and avatars. Journal of Experimental Child Psychology 115, 3 (2013), 590–597.
[31] Susruthi Rajanala, Mayra BC Maymone, and Neelam A Vashi. 2018. Selfies—living in the era of filtered photographs. JAMA Facial Plastic Surgery 20, 6 (2018), 443–.
[32] Mel Slater. 2009. Place illusion and plausibility can lead to realistic behaviour in immersive virtual environments. Philosophical Transactions of the Royal Society B: Biological Sciences 364, 1535 (2009), 3549–3557.
[33] Feng Tian, Minlei Hua, Wenrui Zhang, Yingjie Li, and Xiaoli Yang. 2021. Emotional arousal in 2D versus 3D virtual reality environments. PLoS ONE 16, 9 (2021).
[34] Ilaria Torre, Emma Carrigan, Katarina Domijan, Rachel McDonnell, and Naomi Harte. 2021. The Effect of Audio-Visual Smiles on Social Influence in a Cooperative Human–Agent Interaction Task. ACM Transactions on Computer-Human Interaction (TOCHI) 28, 6 (2021), 1–38.
[35] Christopher Travers. 2022. Virtual Humans.
[36] Joseph Tao-yi Wang. 2011. Pupil dilation and eye tracking. A Handbook of Process Tracing Methods for Decision Research: A Critical Review and User's Guide (2011).
[37] Andrea Stevenson Won, Jeremy N Bailenson, and Joris H Janssen. 2014. Automatic detection of nonverbal behavior predicts learning in dyadic interactions. IEEE Transactions on Affective Computing 5, 2 (2014), 112–125.
[38] Andrea Stevenson Won, Jeremy N Bailenson, Suzanne C Stathatos, and Wenqing Dai. 2014. Automatically detected nonverbal behavior predicts creativity in collaborating dyads. Journal of Nonverbal Behavior 38, 3 (2014), 389–408.
[39] Nick Yee and Jeremy Bailenson. 2007. The Proteus effect: The effect of transformed self-representation on behavior. Human Communication Research 33, 3 (2007).
[40] Katja Zibrek, Benjamin Niay, Anne-Hélène Olivier, Ludovic Hoyet, Julien Pettré, and Rachel McDonnell. 2020. The effect of gender and attractiveness of motion on proximity in virtual reality. ACM Transactions on Applied Perception (TAP) 17, 4 (2020), 1–15.
... As a result, their collection within the Metaverse further exacerbates existing privacy concerns as well as increases the cyberrelated risks for both organisations and users. For example, biometric data, e.g., eye tracking, pupil dilation, motion and positioning, electroencephalogram (EGG), skin conductance and facial expression, will all be collected by the various sensors used by users to interact with the Metaverse [12]. ...
... Secondly, since users interact with Metaverse-based applications through a series of sensors (as depicted in Fig. 1), attackers can potentially target users more directly to acquire this data. As a result, the cyber-related risk associated with a user's interaction with the Metaverse is increased significantly compared to the current context [12]. ...
The emergence of the Metaverse promises to provide several benefits to society, e.g., fully immersive and interactive environments in healthcare, the workplace, and everyday life. However, it introduces a number of privacy and security concerns, including identity, network, and economy-related threats, for both providers and users of Metaverse-based applications, which remain unaddressed. In this paper, we highlight the potential role of Cyber Threat Intelligence (CTI) sharing within the Metaverse context as a promising opportunity to address such privacy and security issues. CTI refers to the relevant, timely and actionable information about the latest threats and attacks. In particular, we examine how organisations could proactively manage the cyber risks associated with their digital infrastructure by participating in CTI sharing within a Metaverse context. We also propose that user-based CTI sharing should be considered within the Metaverse given the potential prevalence of a number of user-specific threats, e.g., identity-related threats. Finally, we emphasise that the integration of user-based sharing creates additional threat sharing flows between users as well as between organisations and users that are lacking from existing CTI sharing approaches, e.g., user-to-user, organisation-to-user, and user-to-organisation.
... Some countermeasures were also suggested. The security concerns arising out of the communication framework for the metaverse and how they can be addressed in future 6G networks are highlighted in [35], with several inbuilt security features suggested at the protocol level for metaverse-specific applications. The human aspects of privacy issues in the metaverse are discussed in [36], with a focus on the psychological manipulation and coercion capabilities afforded by a huge amount of personal data available on the metaverse. ...
Full-text available
Nazir, S.; Shafiq, M.; Shabaz, M. Metaverse Security: Issues, Challenges and a Viable ZTA Model. Electronics 2023, 12, 391. https:// Abstract: The metaverse is touted as an exciting new technology amalgamation facilitating next-level immersive experiences for users. However, initial experiences indicate that a host of privacy, security and control issues will need to be effectively resolved for its vision to be realized. This paper highlights the security issues that will need to be resolved in the metaverse and the underlying enabling technologies/platforms. It also discussed the broader challenges confronting the developers, the service providers and other stakeholders in the metaverse ecosystem which if left unaddressed may hamper its broad adoption and appeal. Finally, some ideas on building a viable Zero-Trust Architecture (ZTA) model for the metaverse are presented.
... The biometric data plays a significant role in terms of social interactions inside the metaverse. To elaborate, human-driven agents and avatars, with whom the users interact, are being developed based on users' personal data [51]. Such sensitive data can be fed into AI/ML algorithms to create personalized "interaction partners" and influence social interaction behaviors of the metaverse users. ...
In this article, the authors provide a comprehensive overview on three core pillars of metaverse-as-a-service (MaaS) platforms; privacy and security, edge computing, and blockchain technology. The article starts by investigating security aspects for the wireless access to the metaverse. Then it goes through the privacy and security issues inside the metaverse from data-centric, learning-centric, and human-centric points-of-view. The authors address private and secure mechanisms for privatizing sensitive data attributes and securing machine learning algorithms running in a distributed manner within the metaverse platforms. Novel visions and less-investigated methods are reviewed to help mobile network operators and metaverse service providers facilitate the realization of secure and private MaaS through different layers of the metaverse, ranging from the access layer to the social interactions among clients. Later in the article, it has been explained how the paradigm of edge computing can strengthen different aspects of the metaverse. Along with that, the challenges of using edge computing in the metaverse have been comprehensively investigated. Additionally, the paper has comprehensively investigated and analyzed 10 main challenges of MaaS platforms and thoroughly discussed how blockchain technology provides solutions for these constraints. At the final, future vision and directions, such as content-centric security and zero-trust metaverse, some blockchain's unsolved challenges are also discussed to bring further insights for the network designers in the metaverse era.
Full-text available
Slot machines are one of the most played games by players suffering from gambling disorder. New technologies like immersive Virtual Reality (VR) offer more possibilities to exploit erroneous beliefs in the context of gambling. Recent research indicates a higher risk potential when playing a slot machine in VR than on desktop. To continue this investigation, we evaluate the effects of providing different degrees of embodiment, i.e., minimal and full embodiment. The avatars used for the full embodiment further differ in their appearance, i.e., they elicit a high or a low socio-economic status. The virtual environment (VE) design can cause a potential influence on the overall gambling behavior. Thus, we also embed the slot machine in two different VEs that differ in their emotional design: a colorful underwater playground environment and a virtual counterpart of our lab. These design considerations resulted in four different versions of the same VR slot machine: 1) full embodiment with high socio-economic status, 2) full embodiment with low socio-economic status, 3) minimal embodiment playground VE, and 4) minimal embodiment laboratory VE. Both full embodiment versions also used the playground VE. We determine the risk potential by logging gambling frequency as well as stake size, and measuring harm-inducing factors, i.e., dissociation, urge to gamble, dark flow, and illusion of control, using questionnaires. Following a between groups experimental design, 82 participants played for 20 game rounds one of the four versions. We recruited our sample from the students enrolled at the University of Würzburg. Our safety protocol ensured that only participants without any recent gambling activity took part in the experiment. In this comparative user study, we found no effect of the embodiment nor VE design on neither the gambling frequency, stake sizes, nor risk potential. 
However, our results provide further support for the hypothesis of the higher visual angle on gambling stimuli and hence the increased emotional response being the true cause for the higher risk potential.
The proportion of the population who experience persecutory thoughts is 10–15%. People then engage in safety-seeking behaviours, typically avoiding social interactions, which prevents disconfirmatory experiences and hence paranoia persists. Here we show that persecutory thoughts can be reduced if, prior to engaging in social interaction in VR, participants first see their virtual body-double doing so. Thirty non-clinical participants were recruited to take part in a study in which they were embodied in a virtual body that closely resembled themselves and asked to interact with members of a crowd. In the Random condition (n = 15) they observed their body-double wandering around but not engaging with the crowd. In the Targeted condition the body-double correctly interacted with members of the crowd. The Green Paranoid Thoughts Scale was measured 1 week before and 1 week after the exposure and decreased only for those in the Targeted condition. The results suggest that observing the body-double correctly carrying out a social interaction task in VR may lead to anxiety-reducing mental rehearsal for interaction, thus overcoming safety behaviours. The results also extend knowledge of the effects of vicarious agency, suggesting that identification with the actions of the body-double can influence subsequent psychological state.
The Problems with Immersive Advertising: In AR/VR, Nobody Knows You Are an Ad
Digital reality refers to the wide spectrum of technologies and affordances, including Augmented Reality, Virtual Reality, and Mixed Reality, that simulate reality in various ways. The current level of digital technology, and future developments aimed at improving user involvement, entertainment, and accessibility, raise not only technological questions but also regulatory, policy, and liability issues. The ever-growing market of services using public networks will offer new possibilities and dangers for users and businesses, and will create room for criminal activity. Regulators try to keep pace and adjust laws to these challenges. This paper briefly analyses the current level and status of regulation at the Hungarian and EU levels, directing the attention of developers, system engineers, and software designers to questions of responsibility. Based on a literature review, the paper discusses issues that are currently at the focus of European regulation in this regard.
Previous studies have suggested that virtual reality (VR) can elicit emotions in different visual modes using 2D or 3D headsets. However, the effects of these two visual modes on emotional arousal have not been comprehensively investigated, and the underlying neural mechanisms are not yet clear. This paper presents a cognitive psychological experiment conducted to analyze how these two visual modes affect emotional arousal. Forty volunteers were recruited and randomly assigned to two groups. They were asked to watch a series of positive, neutral, and negative short VR videos in 2D and 3D. Multichannel electroencephalograms (EEG) and skin conductance responses (SCR) were recorded simultaneously during their participation. The results indicated that emotional stimulation was more intense in the 3D environment due to the improved perception of the environment; greater emotional arousal was generated; and higher beta (21–30 Hz) EEG power was identified in 3D than in 2D. We also found that both hemispheres were involved in stereo vision processing and that brain lateralization existed in this processing.
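The beta-band (21–30 Hz) power comparison above rests on estimating spectral power within a frequency band. As a minimal, dependency-free sketch of that idea (the function name `band_power` and the naive O(n²) DFT are our own; real EEG pipelines use FFT-based spectral estimators such as Welch's method):

```python
import math

def band_power(signal, fs, f_lo, f_hi):
    """Estimate signal power in the band [f_lo, f_hi] Hz via a naive DFT.

    signal: list of samples; fs: sampling rate in Hz.
    Sums the squared magnitude of DFT bins whose frequency falls in the band,
    normalized by n^2 so a unit-amplitude sinusoid contributes ~0.25.
    """
    n = len(signal)
    power = 0.0
    for k in range(n // 2 + 1):
        freq = k * fs / n
        if f_lo <= freq <= f_hi:
            re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = -sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            power += (re * re + im * im) / (n * n)
    return power

# A pure 25 Hz sinusoid sampled at 250 Hz for 1 s falls inside the
# 21-30 Hz beta band but outside the 8-12 Hz alpha band.
fs = 250
sig = [math.sin(2 * math.pi * 25 * t / fs) for t in range(fs)]
beta = band_power(sig, fs, 21, 30)
alpha = band_power(sig, fs, 8, 12)
```

In a study like the one described, such band powers would be computed per channel and per condition (2D vs. 3D) and then compared statistically.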
Emotional expressivity is essential for human interactions, informing both perception and decision-making. Here, we examine whether creating an audio-visual emotional channel mismatch influences decision-making in a cooperative task with a virtual character. We created a virtual character that was either congruent in its emotional expression (smiling in the face and voice) or incongruent (smiling in only one channel). People (N = 98) evaluated the character in terms of valence and arousal in an online study; then, visitors in a museum played the “lunar survival task” with the character over three experiments (N = 597, 78, 101, respectively). Exploratory results suggest that multi-modal expressions are perceived, and reacted upon, differently than unimodal expressions, supporting previous theories of audio-visual integration.
Previous perceptual studies on human faces have shown that specific facial features have consistent effects on perceived personality and appeal, but it remains unclear if and how these findings relate to the perception of virtual characters. For example, wider human faces have been found to appear more aggressive and dominant, whereas studies on virtual characters have shown opposite trends but have suffered from significant eeriness of exaggerated features. In this study, we use highly realistic virtual faces obtained from 3D scanning, as well as cartoon-rendered counterparts retaining facial proportions. We assess the effects of facial width and eye size on perceptions of appeal, trustworthiness, aggressiveness, dominance, and eeriness. Our manipulations did not affect eeriness, and we find the same perceptual trends previously reported for human faces.
The privacy and security of personal data have been at the forefront of public concern for some time now, typically understood in the context of data collected from online interaction (social media, transactions, search engine queries, etc.). The advent of immersive technologies expands data collection beyond what can typically be extracted via online interaction, particularly in terms of the availability of biometric data (e.g., eye tracking and gait analysis). However, little attention has yet been paid to the fact that interactions themselves, and the data they generate, will soon be a privacy concern. We mediate interactions in everyday life through the maintenance of personal space, allowing certain individuals and objects into it. We do the same in virtual reality. Our personal space allows us to preserve our feeling of safety, and the way we mediate it reveals our biases and preferences. In this work, we examine the implications of interaction and the availability of personal interaction data, which will bring a host of ethical and privacy concerns.