Figure 1. Setup for Experiment 1 to repeat the classical RHI.


Source publication
Conference Paper
Full-text available
The rubber hand illusion (RHI) is a body ownership illusion whereby congruently stroking a fake rubber hand and a subject's hidden hand, while the subject observes the rubber hand, produces the illusion of feeling the touch on the rubber hand and of experiencing the rubber hand as part of their own body. The parameters of the RHI have not been fully defined...

Context in source publication

Context 1
... participants were seated at a table with their left arm resting on the table in a pronated position (palm down) and their forearm and index finger positioned over a marker. A standing screen was positioned beside the participant's left arm to hide it from their view, and a life-size rubber model of a left hand and arm was placed over another marker in front of the participant (Figure 1). A measuring tape was attached to the side of the table facing the participant to record proprioceptive drift. ...
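The excerpt does not state how proprioceptive drift was quantified from the measuring tape. A minimal sketch of the common convention (drift is the post-stimulation judgement of the hidden hand's position minus the pre-stimulation judgement, signed so that positive values indicate a shift toward the rubber hand) is given below; the function name and the tape readings are hypothetical and not taken from the study.

```python
import math

def proprioceptive_drift(pre_cm: float, post_cm: float, rubber_hand_cm: float) -> float:
    """Signed drift in cm along the measuring tape.

    Positive values mean the judged position of the hidden hand moved toward
    the rubber hand between the pre- and post-stimulation judgements.
    """
    # Direction (+1.0 or -1.0) on the one-dimensional tape from the
    # pre-stimulation judgement toward the rubber hand's position.
    direction = math.copysign(1.0, rubber_hand_cm - pre_cm)
    return direction * (post_cm - pre_cm)

# Hypothetical tape readings in cm: rubber hand placed at 20 cm, judged
# index-finger position at 42 cm before stroking and 38.5 cm afterwards.
print(proprioceptive_drift(pre_cm=42.0, post_cm=38.5, rubber_hand_cm=20.0))  # 3.5 cm toward the rubber hand
```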

Similar publications

Article
Full-text available
Previous research showed that a simple target interception task reveals differences between younger adults (YA) and older adults (OA) on a large screen under laboratory conditions. Participants intercept downward-moving objects while a horizontally moving background creates an illusion of the object moving in the opposite direction of the background...

Citations

... The rubber hand illusion (Botvinick and Cohen, 1998) is a phenomenon in which people feel that a rubber hand is their own when they observe it being stroked in the same way as their hidden hand. Recently, Aldhous et al. (2017) replicated the rubber hand illusion in a VR setting. In a setup with a mismatch between the positions of the hidden real hand and the visible virtual hand, participants were asked to close their eyes and indicate the spot where they thought their hand was located. ...
... Our preliminary data demonstrated that this adaptation phase was crucial in obtaining the relevant stimulus-response compatibility effects. It is known that a virtual hand can be embodied via synchronous visuo-tactile feedback (Aldhous et al., 2017). Here, we demonstrated that synchronous visual feedback, even in the absence of haptic input, is enough to create such embodiment effects. ...
Article
Full-text available
In immersive Virtual Reality (VR), your brain can trick you into believing that your virtual hands are your real hands. Manipulating the representation of the body, namely the avatar, is a potentially powerful tool for the design of innovative interactive systems in VR. In this study, we investigated interactive behavior in VR by using the methods of experimental psychology. Objects with handles are known to potentiate the afforded action: participants tend to respond faster when the handle is on the same side as the responding hand in bimanual speeded response tasks. In the first experiment, we successfully replicated this affordance effect in a VR setting. In the second experiment, we showed that the affordance effect was influenced by the avatar, which was manipulated using two different hand types: (1) hand models with full finger tracking that are able to grasp objects, and (2) capsule-shaped, fingerless hand models that are not able to grasp objects. We found that less than 5 minutes of adaptation to an avatar significantly altered affordance perception. Counterintuitively, action planning was significantly shorter with the hand model that is not able to grasp; possibly, fewer action possibilities provided an advantage in processing time. The presence of a handle sped up the initiation of the hand movement but slowed down action completion because of ongoing action planning. The results were examined from a multidisciplinary perspective, and the design implications for VR applications were discussed.
... This conceptualization of embodiment differs from others in the HCI domain, which consider embodiment as the users' sense of their own body (e.g. Longo, Schüür, Kammers, Tsakiris, & Haggard, 2008), particularly regarding their capacity to control, to own and to feel self-located with their virtual counterpart in a digital environment (Aldhous, Hetherington, & Turner, 2017; Liepelt, Dolk, & Hommel, 2017; Nimcharoen, Zollmann, Collins, & Regenbrecht, 2018). ...
Article
Full-text available
Open Access: https://doi.org/10.1016/j.jbusres.2020.09.036
We live in a multisensory world. Our experiences are constructed by the stimulation of all our senses. Nevertheless, digital interactions are mainly based on audiovisual elements, while other sensory stimuli have been less explored. Virtual reality (VR) is a sensory-enabling technology that facilitates the integration of sensory inputs to enhance multisensory digital experiences. This study analyzes how the addition of ambient scent to a VR experience affects digital pre-experiences in a service context (tourism). Results from a laboratory experiment confirmed that embodied VR devices, together with pleasant and congruent ambient scents, enhance sensory stimulation, which directly (and indirectly, through ease of imagination) influences affective and behavioral reactions. These enriched multisensory experiences strengthen the link between the affective and conative images of destinations. We make recommendations for researchers and service providers with ambitions to deliver ambient scents, especially those congruent with displayed content, to enhance the sensorialization of digital VR experiences.
• Pleasant and congruent scents in virtual reality experiences enhance sensory states.
• These enriched multisensory experiences promote affective and behavioral reactions.
• Ease of imagination mediates the effect of sensory stimulation on destination images.
• The relationship between affective and conative images is stronger in multisensory experiences.
• Congruent integration of sensory stimuli is essential for digital experiences.
... These artistic orchestrations, mediated by technology, are designed to be playful and immersive. Emerging research includes remote, prosthetic and multi-modal interfaces for touch practices between humans, virtual agents and robots (Van Erp and Toet 2015; Huisman 2017). Research on the experience of affective touch using vibro-tactile technologies showed that telematic, haptic experiences of slow (1-10 cm/s), gentle stroking of the body, such as caressing, are associated by participants with experiences of affection (Huisman et al. 2016). These works can be related to the facial illusion experiment, in which acts of touching a participant's face are mirrored in real time in acts of touching other people's faces, visible on a monitor in front of the participant. ...
Chapter
Full-text available
Can shared experience and dialogue on social touch be orchestrated in playful smart public spaces? In smart city public spaces, in which physical and virtual realities are currently merging, new forms of social connections, interfaces and experiences are being explored. Within art practice, such new connections include new forms of affective social communication, with additional social and sensorial connections to enable and enhance empathic, intimate experience in playful smart public space. This chapter explores a novel design for shared intimate experience of playful social touch in three orchestrations of ‘Saving Face’, staged in different cultural and geographical environments of smart city (semi-)public spaces in Beijing, Utrecht and Dessau-Berlin. These orchestrations are purposefully designed to create a radically unfamiliar sensory synthesis that disrupts the perception of ‘who sees and who is being seen, who touches and who is being touched’. Participants playfully ‘touch themselves and feel being touched, to connect with others on a screen’. All three orchestrations show that shared experience and dialogue on social touch can be mediated by playful smart city technologies in public spaces, but they rely on the design of mediated, intimate and exposed forms of ‘self-touch for social touch’, ambivalent relations, exposure of dialogue and hosting.
Chapter
Full-text available
This chapter outlines a specific framework for the creation of critical playable cities. The framework combines three concepts that are seen as complementary to each other: DIY urbanism, critical design and urban gamification. Cities are complex systems. Various actors often explicitly or implicitly harmonize or collide to shape the landscape of a city and its future. In the past decades, there has been an increased interest in activating citizens as vital actors in shaping urban life. This has taken place through various practical works and research around paradigms such as Playable Cities, DIY Urbanism and Gamification. Urban gamification—that is, using play and playfulness to alter our perception of and interactions with city spaces—is specifically emerging as one of the main strategies to activate citizens. Urban gamification alone, however, risks being disconnected from the urban fabric and its communities. In this chapter we argue that combining it with the grassroots approach of DIY urbanism and the thought-provoking techniques of critical design creates a unique, multi-dimensional approach to designing urban experiences. This chapter, then, aims to explore how play can be used by citizens as a means for critical reflection and practical re-appropriation of public urban spaces.