Abstract
To control bodily movements the human brain relies on a somatosensory representation referred to as the body schema [1]. The almost century-old hypothesis that tool-use induces plastic changes resulting in the tool being incorporated in the body schema is nowadays widely accepted. Whether this somatosensory representation is truly modified remains unknown, however, as tool-use has never been shown to affect arm motor behaviour. Here we report that using a mechanical grabber that physically extends the arm does alter the kinematics of subsequent free-hand grasping movements. Remarkably, tool-use after-effects generalise to pointing movements, despite the absence of specific tool-training. Furthermore, this effect is driven by an increase of the represented length of the arm: after tool-use, subjects localised touches delivered on the elbow and middle fingertip of their arm as if they were farther apart. These findings indicate that tool-use alters the body schema, and also show that what is modified is the somatosensory representation of intrinsic properties of the body morphology.
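The arm-length measure described in this abstract (touches on the elbow and fingertip localised as farther apart after tool-use) can be illustrated with a minimal sketch. Everything below is a hypothetical illustration of the logic, not the study's actual data or analysis code:

```python
import numpy as np

def represented_arm_length(elbow_xy, fingertip_xy):
    """Distance between the two localized touch positions (cm).

    Hypothetical helper: represented arm length is taken as the Euclidean
    distance between where a participant points to a touch on the elbow
    and a touch on the middle fingertip.
    """
    return float(np.linalg.norm(np.asarray(fingertip_xy) - np.asarray(elbow_xy)))

# Synthetic pre/post tool-use localizations on a 2D response surface (cm).
pre = represented_arm_length((0.0, 0.0), (40.0, 0.0))
post = represented_arm_length((-1.0, 0.0), (41.5, 0.0))
lengthening = post - pre  # positive value = touches felt farther apart
```

On these made-up numbers the represented arm "lengthens" by 2.5 cm; in the study the sign of this difference, not its absolute magnitude, is what indexes the change in the body schema.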
... After using a tool for a few minutes, healthy adults start performing free-hand movements differently, with longer latencies and reduced amplitudes for the acceleration, velocity and deceleration profiles (e.g. Cardinali et al., 2009; for review, Martel et al., 2016). This kinematic signature of what has been called tool-incorporation, also typically observed in long-armed vs. short-armed participants (Cardinali et al., 2012; Martel et al., 2019), is indicative of a longer arm estimate after tool use. ...
... Participants between 123 and 146 cm in height used a 32 cm long tool (DCD group: 4/17; TD group: 5/17). The remaining participants were taller than 147 cm and used a 40 cm long tool, the original length also used for adults (Baccarini et al., 2014; Cardinali et al., 2009, 2011; Martel et al., 2019, 2021), but 100 g lighter to prevent fatigue. Participants' height was collected before the experiment; the lengths of their arm and forearm were measured afterwards. ...
... We recorded the spatial localization of the hand and of the tool using infrared light emitting diodes (IREDs) with an Optotrak 3020 (Northern Digital Inc; sampling rate: 200 Hz; 3D resolution: 0.01 mm at 2.25 m distance). Following previous studies using the same paradigm (Baccarini et al., 2014; Cardinali et al., 2009, 2011, 2012; Martel et al., 2019, 2021), we assessed the grasping component of the hand/tool movements by placing IREDs on the thumb and index finger nails of participants' dominant hand, as well as on the two "fingers" of the tool. The reaching component was evaluated using IREDs located on the dominant wrist (styloid process of the radius) and on the distal part of the tool shaft. ...
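As a rough illustration of how the grasping component is derived from the thumb and index markers, here is a hedged sketch on synthetic data; the function name and the simulated trial are assumptions, not the study's actual pipeline:

```python
import numpy as np

fs = 200  # Optotrak sampling rate, Hz

def grip_aperture(thumb, index):
    """thumb, index: (n_samples, 3) marker positions in mm.

    Returns the frame-by-frame thumb-index distance (mm); its maximum
    (maximum grip aperture, MGA) is a standard grasp-component index.
    """
    return np.linalg.norm(np.asarray(index) - np.asarray(thumb), axis=1)

# Synthetic 1 s trial: the aperture opens and closes around the object.
t = np.arange(fs) / fs
thumb = np.zeros((fs, 3))
index = np.stack([np.zeros(fs), 60 * np.sin(np.pi * t), np.zeros(fs)], axis=1)

ap = grip_aperture(thumb, index)
mga = ap.max()                  # maximum grip aperture (mm)
mga_latency = ap.argmax() / fs  # time of MGA within the trial (s)
```

The same distance computation applied to the two markers on the tool's "fingers" would yield the tool's grip aperture.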
Developmental Coordination Disorder (DCD) is a pathological condition characterized by impaired motor skills. Current theories advance that a deficit of the internal models is mainly responsible for DCD children's altered behavior. Yet, accurate movement execution requires not only correct movement planning, but also integration of sensory feedback into the body representation for action (Body Schema) to update the state of the body. Here we advance and test the hypothesis that the plasticity of this body representation is altered in DCD. To probe Body Schema (BS) plasticity, we submitted seventeen DCD children to a well-established tool-use paradigm, requiring them to reach for an object with their hand before and after tool use, and compared their movement kinematics to those of a control group of Typically Developing (TD) peers. We also asked both groups to provide explicit estimates of their arm length to probe the plasticity of their Body Image (BI). Results revealed that DCD children explicitly judged their arm shorter after tool use, showing changes in their BI comparable to those of their TD peers. Unlike their TD peers, though, DCD children did not update their implicit BS estimate: kinematics showed that tool use affected their peak amplitudes, but not their latencies. Remarkably, the kinematics of tool use showed that motor control of the tool was comparable between groups, both improving with practice, confirming that motor learning abilities are preserved in DCD. This study thus brings evidence in favor of an alternative theoretical account of the DCD etiology. Our findings point to a deficit in the plasticity of the body representation used to plan and execute movements. Though not mutually exclusive, this widens the theoretical perspective under which DCD should be considered: DCD may not be limited to a problem affecting the internal models and their motor functions, but may also concern the represented state of the effector they have to use.
... To account for changes in arm motor control induced by tool use, we used a well-established paradigm 5,8,11 : free-hand reach-to-grasp movements are performed before and after reach-to-grasp tool movements. The kinematics of free-hand movements are then compared (pre-post) to assess the consequences of tool use on free-hand movements. ...
... The target object was a wooden parallelepiped (10 × 2.5 × 5 cm, weighing 96 g) placed on the table at a distance of 35 cm along the sagittal axis, in line with participants' right shoulder (or left shoulder for left-handed participants). Importantly, as in previous studies using the same paradigm [4][5][6][7][8], the target object was always located inside the arm reaching space, thus preventing the potential confounding effects of tool use in different sectors of space (reachable vs. non-reachable space). Once a comfortable position was reached, the chair was fixed to keep the distance from the table constant, but could rotate to adapt to the different tasks. ...
... This was to ensure that the manipulated tool would fit the participants' body size. As a result, the big tool was 40 cm long, similar to the one used in previous studies [4][5][6]8, except that it was 100 g lighter to prevent fatigue in the younger adolescents; it was used for the young adults and the adolescents who were taller than 147 cm (66 participants; range from 9.2 to 21.5 years old). The middle one, for children between 123 and 146 cm in height, was 32 cm long (23 participants; range from 7.6 to 11.9 years old). ...
Human evolution is distinctly characterized by an exquisite mastery of tools, allowing humans to shape their environment in more elaborate ways than other species. This ability is present ever since infancy and most theories indicate that children become proficient with tool use very early. In adults, tool use has been shown to plastically modify metric aspects of the arm representation, as indexed by changes in movement kinematics. To date, whether and when the plastic capability of updating the body representation develops during childhood remains unknown. This question is particularly important since body representation plasticity could be impacted by the fact that the human body takes years to achieve a stable metric configuration. Here we assessed the kinematics of 90 young participants (8–21 years old) required to reach for an object before and after tool use, as a function of their pubertal development. Results revealed that tool incorporation, as indexed by the typical adult kinematic pattern, develops very slowly and displays a U-shaped developmental trajectory. From early to mid puberty, the changes in kinematics following tool use seem to reflect a shortened arm representation, opposite to what was previously reported in adults. This pattern starts reversing after mid puberty, which is characterized by the lack of any kinematic change following tool use. The typical adult-like pattern emerges only at late puberty, when body size is stable. These findings reveal the complex dynamics of tool incorporation across development, possibly indexing the transition from a vision-based to a proprioception-based body representation plasticity.
... Converging evidence has suggested that human tool-use is facilitated by a highly plastic body representation (Maravita & Iriki, 2004), a process termed tool embodiment (Cardinali et al., 2009b). By this view, tool-use produces a tool-specific perceptual transformation of the body representation, wherein the body itself is extended to fit the tool. ...
... While an altered body representation's role in tool-use had been indirectly supported by much of the work on peripersonal space, Cardinali et al. (2009b) aimed to provide the first direct evidence of a change in the action-oriented body schema and are often cited as the primary support for tool embodiment effects. In their seminal paradigm, seated participants executed repetitive reaches toward a rectangular prism, lifting and replacing it before returning to a starting position. ...
... These kinematic measurements exist as the sole online evidence for changes in the body schema, which is suggested to be a primarily unconscious representation (de Vignemont, 2010). Their validity for this purpose has been substantiated by the observation that individuals with longer arms display baseline kinematic characteristics that are more consistent with the trends observed after tool-use than participants with shorter arms (Cardinali et al., 2009b;Martel et al., 2019). ...
The predominant view on human tool-use suggests that an action-oriented body representation, the body schema, is altered to fit the tool being wielded, a phenomenon termed tool embodiment. While observations of perceptual change after tool-use purport to support this hypothesis, several issues undermine their validity in this context, discussed at length in this critical review. The primary measures used as indicators of tool embodiment each face unique challenges to their construct validity. Further, the perceptual changes taken as indicating extension of the body representation only appear to account for a fraction of the tool’s size in any given experiment, and do not demonstrate the covariance with tool length that the embodiment hypothesis would predict. The expression of tool embodiment also appears limited to a narrow range of tool-use tasks, as deviations from a simple reaching paradigm can attenuate or eliminate embodiment effects altogether. The shortcomings identified here generate important avenues for future research. Until the source of the kinematic and perceptual effects that have substantiated tool embodiment is disambiguated, the hypothesis that the body representation changes to fit tools during tool-use should not be favored over other possibilities such as the formation of separable internal tool models, which seem to offer a more complete account of human tool-use behaviors. Indeed, studies of motor learning have observed analogous perceptual changes as aftereffects to adaptation despite the absence of handheld tool-use, offering a compelling alternative explanation, though more work is needed to confirm this possibility.
... This can be ascertained by analysing the kinematic variables, i.e. the physical characteristics of the grasping movements performed after using a tool (e.g., wrist velocity, peak velocity, grip aperture, etc.). Cardinali et al. (2009) showed that using a mechanical grabber for 10-15 min alters the kinematics of the subsequent grasping movements performed with the free hand. In particular, after using the mechanical arm, participants showed an altered wrist velocity profile, detectable in longer latencies (velocity latency and deceleration latency) and in reduced peaks of the reaching movement parameters (acceleration peak, velocity peak, deceleration peak), as well as an overall longer movement time. ...
... These effects, which are consistent with an alteration of the real arm's representation, did not arise when a weight corresponding to that of the tool was applied to participants' arms, thereby ruling out a mere fatigue effect from handling the tool. The same altered kinematics were also found for other, untrained movements such as pointing (Cardinali et al., 2009). ...
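The velocity peaks and latencies discussed above can be sketched from sampled wrist positions. The following is a minimal illustration on a synthetic bell-shaped reach; the function name, the minimum-jerk-like trajectory, and all numbers are assumptions, not the published analysis:

```python
import numpy as np

fs = 200  # assumed sampling rate, Hz

def peak_and_latency(pos, fs):
    """pos: (n, 3) wrist positions (mm) -> (peak speed in mm/s, latency in s).

    Speed is obtained by numerically differentiating the marker position;
    the latency is the time at which the speed peaks.
    """
    vel = np.gradient(pos, 1.0 / fs, axis=0)   # mm/s per axis
    speed = np.linalg.norm(vel, axis=1)
    return float(speed.max()), float(speed.argmax() / fs)

# Synthetic 1 s, 350 mm reach along one axis with a bell-shaped speed profile.
t = np.arange(fs) / fs
x = 350 * (t - np.sin(2 * np.pi * t) / (2 * np.pi))
pos = np.stack([x, np.zeros(fs), np.zeros(fs)], axis=1)

peak, latency = peak_and_latency(pos, fs)
```

In the pre/post comparison described in the text, a later `latency` together with a lower `peak` after tool use is the kinematic signature taken to indicate a lengthened arm estimate.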
This review aims to explore what I call the "Embodiment Cost Hypothesis" (ECH), according to which, when humans "embody" a part of the world other than their bodies, a measurable cost is detectable on their real bodies. The review analyzes experimental evidence in favor of the ECH by examining studies from different research fields, including studies of action observation (2), tool-use (3), rubber hand illusion (4), and full-body illusions (5). In light of this literature, this review argues that embodiment effects can profitably be seen as phenomena associated with both benefits (resulting from the embodiment of external objects/bodies) and costs (resulting from the disembodiment at various levels of the subject's own body). Implications are discussed in relation to the ongoing debate on the embodied cognition (EC) approach.
... Our participants used the robotic finger in a cued finger habituation task in two conditions: one in which they had control of the robotic finger (the controllable condition) and the other in which they could not control the robotic finger (the random condition). Following the habituation (see Fig. 1C for the sequence of experiments), we evaluated whether and how the use of the finger in the two conditions affected participants' perception of their innate hand, using cognitive questionnaires (to measure changes in perceived levels of ownership and agency towards the robotic finger, as well as the body image of their hand) and behavioral measures collected in two test experiments that measured possible changes in their "body schema" 31,32 and their "body image" 33 . The 3D-printed robotic sixth finger was actuated by a servo motor in its chassis. ...
... In the reaching task, participants were instructed to reach with their index finger to a target line while avoiding an obstacle that impeded the shortest path from the starting location of the index finger to the target location (see Fig. 3A). The body schema is said to be the internal representation of the body in the brain that is used for motor control 31,34 , and hence we expected an increase in avoidance if the hand representation in the body schema was larger after the habituation task with the robotic finger. ...
... In addition to a questionnaire to judge perceived changes in ownership, agency, and body image, we used two test experiments to evaluate possible changes in body representation after the use of the robotic finger. The first reaching test evaluated changes in the body schema, the representation of our body in internal coordinates that are used for motor control 31,32,34 . Specifically, we evaluated changes in the representation of the width of the hand in the body schema. ...
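The obstacle-avoidance readout described in this reaching test could, for instance, be quantified as the minimum clearance between the fingertip path and the obstacle, with a wider represented hand predicted to increase it. The sketch below is an assumed illustration with synthetic data, not the study's analysis code:

```python
import numpy as np

def min_clearance(traj_xy, obstacle_xy):
    """traj_xy: (n, 2) fingertip path; obstacle_xy: obstacle position.

    Returns the minimum Euclidean distance between the path and the
    obstacle, a simple index of how widely the obstacle was avoided.
    """
    d = np.linalg.norm(np.asarray(traj_xy) - np.asarray(obstacle_xy), axis=1)
    return float(d.min())

# Synthetic straight path passing 5 cm to the side of an obstacle at (10, 0).
path = np.stack([np.linspace(0, 20, 201), np.full(201, 5.0)], axis=1)
clearance = min_clearance(path, (10.0, 0.0))
```

Comparing `clearance` before and after habituation (per participant, per condition) would give the kind of pre/post avoidance difference the passage refers to.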
Can our brain perceive a sense of ownership towards an independent supernumerary limb; one that can be moved independently of any other limb and provides its own independent movement feedback? Following the rubber-hand illusion experiment, a plethora of studies have shown that the human representation of “self” is very plastic. But previous studies have almost exclusively investigated ownership towards “substitute” artificial limbs, which are controlled by the movements of a real limb and/or limbs from which non-visual sensory feedback is provided on an existing limb. Here, to investigate whether the human brain can own an independent artificial limb, we first developed a novel independent robotic “sixth finger.” We allowed participants to train using the finger and examined whether it induced changes in the body representation using behavioral as well as cognitive measures. Our results suggest that unlike a substitute artificial limb (like in the rubber hand experiment), it is more difficult for humans to perceive a sense of ownership towards an independent limb. However, ownership does seem possible, as we observed clear tendencies of changes in the body representation that correlated with the cognitive reports of the sense of ownership. Our results provide the first evidence to show that an independent supernumerary limb can be embodied by humans.
... It is probable that elite badminton players can control their personal racket as if it were their own hand. A previous study demonstrated that repeated use of a mechanical grabber that physically extends the arm alters subsequent arm movements as if the tool were incorporated into the body schema (Cardinali et al., 2009). Similar findings were also obtained using a robot manipulandum (Ganesh et al., 2014). ...
... The perception of arm length has been modulated using a long stick (Canzoneri et al., 2013; Sposito et al., 2012). In addition, repeated use of a mechanical grabber that physically extends the arm (Cardinali et al., 2009) and of a robot manipulandum (Ganesh et al., 2014) caused motor behaviors as if these tools were being incorporated into the body schema. Such tool embodiment may be brought about by the repeated use of the tools. ...
... Furthermore, this tool embodiment remains rigid while players hold the racket. In the past, tool embodiment has been estimated from the perceived arm length (Canzoneri et al., 2013; Sposito et al., 2012), the perceived peripersonal space (Biggio et al., 2017; Làdavas & Serino, 2008), and motor behaviors or percepts suggesting that the tool and hand were assimilated (Cardinali et al., 2009; Ganesh et al., 2014; Miyazaki et al., 2010; Yamamoto & Kitazawa, 2001). In addition to these studies, the findings of this study suggest that the RHI paradigm can be used as a novel index of the tool embodiment associated with prolonged racket use in athletes in racket sports. ...
Badminton players have a plastic modification of their arm representation in the brain due to the prolonged use of their racket. However, it is not known whether their arm representation can be altered through short-term visuotactile integration. The neural representation of the body is easily altered when multiple sensory signals are integrated in the brain. One of the most popular experimental paradigms for investigating this phenomenon is the “rubber hand illusion.” This study was designed to investigate the effect of prolonged use of a racket on the modulation of arm representation during the rubber hand illusion in badminton players. When badminton players hold the racket, their badminton experience in years is negatively correlated with the magnitude of the rubber hand illusion. This finding suggests that tool embodiment obtained by the prolonged use of the badminton racket is less likely to be disturbed when holding the racket.
... Several studies have found that using a tool modulates body representations for action (Martel et al., 2019; Cardinali et al., 2009, 2012; Gentilucci, Roy, & Stefanini, 2004) and somatosensory perception (Miller, Longo, & Saygin, 2014; Canzoneri et al., 2013; Sposito, Bolognini, Vallar, & Maravita, 2012; Cardinali et al., 2009). Our results support and help qualify the notion that tools are incorporated into the body representation (Cardinali et al., 2009): not only were low-frequency modulations for touch applied on a handheld object similar to those for touch applied on the skin (Nierula et al., 2013; Cheyne et al., 2003; Salenius et al., 1997), but they were also localized in brain regions previously implicated in localizing touch on the body (Miller et al., 2019; Azañón et al., 2010; Avillac et al., 2005; Lloyd et al., 2003). ...
... Furthermore, fronto-parietal alpha connectivity has been reported to emerge only when touch is applied to the self-body but not when applied to an object (Pisoni, Romero Lauro, Vergallito, Maddaluno, & Bolognini, 2018). ...
The sense of touch is not restricted to the body but can also extend to external objects. When we use a handheld tool to contact an object, we feel the touch on the tool and not in the hand holding the tool. The ability to perceive touch on a tool actually extends along its entire surface, allowing the user to accurately localize where it is touched similarly as they would on their body. Although the neural mechanisms underlying the ability to localize touch on the body have been largely investigated, those allowing to localize touch on a tool are still unknown. We aimed to fill this gap by recording the electroencephalography signal of participants while they localized tactile stimuli on a handheld rod. We focused on oscillatory activity in the alpha (7–14 Hz) and beta (15–30 Hz) ranges, as they have been previously linked to distinct spatial codes used to localize touch on the body. Beta activity reflects the mapping of touch in skin-based coordinates, whereas alpha activity reflects the mapping of touch in external space. We found that alpha activity was solely modulated by the location of tactile stimuli applied on a handheld rod. Source reconstruction suggested that this alpha power modulation was localized in a network of fronto-parietal regions previously implicated in higher-order tactile and spatial processing. These findings are the first to implicate alpha oscillations in tool-extended sensing and suggest an important role for processing touch in external space when localizing touch on a tool.
... Just as a body other than one's own could be integrated into the representation of one's own body, an object could likewise be integrated into one's body representation, allowing the possible actions it affords to be anticipated. This suggests that the tool an individual uses would become an extension of their body and be integrated into their representation of it. Tool use, whether real or imagined, would influence visual perception (Cardinali et al., 2009; Longo & Serino, 2012; Witt, 2011b). A dynamic representation, derived from the proprioceptive, tactile and kinaesthetic information involved in performing or anticipating an action, would constitute the individual's body schema (Cardinali et al., 2009). The influence of the body on the perception of space could be envisaged in two different ways. It could be mediated by the use of the body schema as a sensorimotor metric for guiding actions (Cardinali et al., 2009; Guardia et al., 2013; Longo & Serino, 2012; van der Hoort et al., 2011). Other authors suggest that the body's influence could be mediated by the constraints of the actions the individual plans (Molto et al., 2020; Morgado & Palluel-Germain, 2015; Witt & Proffitt, 2008). ...
This thesis proposes new methodologies for indirectly and quantitatively measuring consumers' affective components. In a first series of studies, we observed that socio-affective variables influence the perception of space. In particular, variables such as self-esteem and social anxiety moderate how individuals perceive an opening. Our results suggest that this type of task could eventually be used to assess the socio-affective effect of a worn product. In a second series of studies, we analysed mouse movements while consumers performed a dichotomous categorisation task. This method appears to allow certain characteristics of a brand's identity to be identified and ranked. These results suggest that this method could eventually be used to predict purchasing behaviour. In conclusion, this work proposes new indirect measures, based on sensorimotor variables, for the study of the consumer.
... Experimental evidence has shown that the body schema [120] and the peripersonal space [102] (for reviews see [76,97]) can be temporarily remapped during active or passive interaction [121] with a tool. This phenomenon was originally observed in monkeys [1] (for a review see [7]), and was later described in brain damaged [8,122] and healthy humans [2,5,120,123-125]. ...
... It has been suggested that modifications of the PPS underlying these effects depend on Hebbian plasticity [126-128], i.e., connectivity transformations driven by statistical associations of multisensory inputs from the environment. ...
... Plastic reorganisations of the PPS and the body schema after tool use are supported by evidence showing a stronger crossmodal congruency effect (CCE) when a visual stimulus is presented near the used tool [8,122], or specifically next to the used part of the tool [10], while a tactile stimulus is delivered on the participants' hand. Moreover, Cardinali and colleagues [120] showed that tool use not only changes the multisensory integration effects near the object ("extending" the PPS), but also affects motor indexes related to the body schema (i.e., action execution) and to the somatosensory body representation (i.e., an increase of the represented length of the arm), once again supporting the close relation between the PPS and sensorimotor body representations. Interestingly, in a recent study, Miller and colleagues found that the somatosensory cortex responds to stimuli located beyond the physical body, showing that when a hand-held tool was touched, vibrotactile stimuli triggered activity in the participants' primary and secondary somatosensory cortices [129]. ...
This perspective review focuses on the proposal that predictive multisensory integration occurring in one’s peripersonal space (PPS) supports individuals’ ability to efficiently interact with others, and that integrating sensorimotor signals from the interacting partners leads to the emergence of a shared representation of the PPS. To support this proposal, we first introduce the features of body and PPS representations that are relevant for interpersonal motor interactions. Then, we highlight the role of action planning and execution on the dynamic expansion of the PPS. We continue by presenting evidence of PPS modulations after tool use and review studies suggesting that PPS expansions may be accounted for by Bayesian sensory filtering through predictive coding. In the central section, we describe how this conceptual framework can be used to explain the mechanisms through which the PPS may be modulated by the actions of our interaction partner, in order to facilitate interpersonal coordination. Last, we discuss how this proposal may support recent evidence concerning PPS rigidity in Autism Spectrum Disorder (ASD) and its possible relationship with ASD individuals’ difficulties during interpersonal coordination. Future studies will need to clarify the mechanisms and neural underpinning of these dynamic, interpersonal modulations of the PPS.
... Such a change of sensorimotor context occurs when, for instance, we use a tool that extends the motor capabilities of our body. Past studies indeed revealed that manipulating a tool changes how we represent near and far visual space (Berti and Frassinetti, 2000; Maravita and Iriki, 2004; Cardinali et al., 2012; Canzoneri et al., 2013; Bourgeois et al., 2014), due to the alteration of the representation of the arm in the body schema resulting from tool-use (Cardinali et al., 2009). Indeed, using a long-handled tool with the hand produces an extension of the representation of the arm's length, as if the tool had suddenly become a body segment (Grüsser, 1983). ...
... Evidence for an integration of the tool into the somatosensory cortical representation of the arm came from the observation that tool-use modifies the way the brain represents the metric characteristics of the body segment controlling the tool. For example, it was shown that the use of a tool that elongates the physical length of the arm induced kinematic changes affecting subsequent object-oriented motor actions (Cardinali et al., 2009). In a sense, kinematic parameters of the reaching movement towards a visual object changed as if the object was perceived as being at a closer location after tool-use than before. ...
... Also, tool-use modified the perceived morphology of the arm. When neurotypical individuals were asked to localize simultaneous tactile stimuli on their arm after having used a tool, they localized these tactile stimuli as being more distant from each other than before tool-use (Cardinali et al., 2009;Sposito et al., 2012). These findings indicate that tool-use alters the body schema, and more specifically modifies the somatosensory representation of intrinsic properties of body morphology (Cardinali et al., 2009). ...
The peripersonal space is an adaptive and flexible interface between the body and the environment that fulfills a dual-motor function: preparing the body for voluntary object-oriented actions to interact with incentive stimuli and preparing the body for defensive responses when facing potentially harmful stimuli. In this position article, we provide arguments for the sensorimotor rooting of the peripersonal space representation and highlight the variables that contribute to its flexible and adaptive characteristics. We also demonstrate that peripersonal space represents a mediation zone between the body and the environment contributing to not only the control of goal-directed actions but also the organization of social life. The whole of the data presented and discussed led us to the proposal of a new theoretical framework linking the peripersonal action space and the interpersonal social space and we highlight how this theoretical framework can account for social behaviors in populations with socio-emotional deficits.
... After using a tool for a few minutes, healthy adults start performing free-hand movements differently, with longer latencies and reduced amplitudes for the acceleration, velocity and deceleration profiles (e.g. Cardinali et al., 2009; for review, Martel et al., 2016). This kinematic signature of what has been called tool-incorporation, typically observed also in long-armed vs. short-armed participants (Cardinali et al., 2012; Martel et al., 2019), is indicative of a longer arm estimate after tool use. ...
... Participants between 123 and 146 cm in height used a 32 cm long tool (DCD group: 4/17; TD group: 5/17). The remaining participants were taller than 147 cm and used a 40 cm long tool, the original length also used for adults (Baccarini et al., 2014; Cardinali et al., 2009, 2011; Martel et al., 2019, 2021), but 100 g lighter to prevent fatigue. Participants' height was collected before the experiment; the lengths of their arm and forearm were measured afterwards. ...
... We recorded the spatial localization of the hand and of the tool using infrared light emitting diodes (IREDs) with an Optotrak 3020 (Northern Digital Inc; sampling rate: 200 Hz; 3D resolution: 0.01 mm at 2.25 m distance). Following previous studies using the same paradigm (Baccarini et al., 2014; Cardinali et al., 2009, 2011, 2012; Martel et al., 2019, 2021), we assessed the grasping component of the hand/tool movements by placing IREDs on the thumb and index finger nails of participants' dominant hand, as well as on the two "fingers" of the tool. The reaching component was evaluated via IREDs located at the dominant wrist (styloid process of the radius and distal part of the tool shaft). ...
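From a setup like the one described above, the standard reach-and-grasp parameters reported in these studies (maximum grip aperture, peak wrist velocity and its latency) can be derived from the marker trajectories. Below is a minimal, illustrative Python sketch of such an analysis; the function name and the pipeline are assumptions for clarity, not the authors' actual analysis code.

```python
import numpy as np

FS = 200  # sampling rate in Hz, matching the Optotrak recordings

def kinematic_landmarks(thumb, index, wrist):
    """Extract illustrative reach-and-grasp parameters from 3D marker
    trajectories (each an n_samples x 3 array of positions in mm).
    This is a sketch of a typical pipeline, not the authors' code."""
    dt = 1.0 / FS
    # Grasping component: grip aperture = thumb-index distance per frame
    aperture = np.linalg.norm(thumb - index, axis=1)
    # Reaching component: tangential wrist speed via finite differences
    velocity = np.linalg.norm(np.gradient(wrist, dt, axis=0), axis=1)
    acceleration = np.gradient(velocity, dt)
    i_vel = int(np.argmax(velocity))
    return {
        "max_aperture_mm": float(aperture.max()),
        "peak_velocity_mm_s": float(velocity[i_vel]),
        "peak_velocity_latency_ms": 1000.0 * i_vel * dt,
        "peak_acceleration_mm_s2": float(acceleration.max()),
    }
```

On measures like these, the tool-use after-effect reported by Cardinali et al. (2009) would show up as longer peak latencies and reduced peak amplitudes in post-tool-use free-hand grasps.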
Developmental Coordination Disorder (DCD) is a pathological condition characterized by impaired motor skills. Current theories advance that a deficit of the internal models is mainly responsible for DCD children’s altered behavior. Yet, accurate movement execution requires not only correct movement planning, but also integration of sensory feedback into body representation for action (Body Schema) to update the state of the body. Here we advance and test the hypothesis that the plasticity of this body representation is altered in DCD. To probe Body Schema (BS) plasticity, we submitted a well-established tool-use paradigm to seventeen DCD children, required to reach for an object with their hand before and after tool use, and compared their movement kinematics to that of a control group of Typically Developing (TD) peers. We also asked both groups to provide explicit estimates of their arm length to probe plasticity of their Body Image (BI). Results revealed that DCD children explicitly judged their arm shorter after tool use, showing changes in their BI comparable to their TD peers. Unlike them, though, DCD did not update their implicit BS estimate: kinematics showed that tool use affected their peak amplitudes, but not their latencies. Remarkably, the kinematics of tool use showed that the motor control of the tool was comparable between groups, both improving with practice, confirming that motor learning abilities are preserved in DCD. This study thus brings evidence in favor of an alternative theoretical account of the DCD etiology. Our findings point to a deficit in the plasticity of the body representation used to plan and execute movements. Though not mutually exclusive, this widens the theoretical perspective under which DCD should be considered: DCD may not be limited to a problem affecting the internal models and their motor functions, but may concern the state of the effector they have to use.
... RHI (Rubber Hand Illusion) [11]; VHI (Virtual Hand Illusion) [12] ...
It is difficult for humans to manipulate their virtual body in an immersive virtual system when its size differs from that of the real body. The authors assumed that this problem is caused by a misfit of the body schema and have been working on body schema calibration, a method to update the body schema to fit the virtual body. In previous research, however, the basic characteristics of body schema updating were investigated only over a narrow range in the extension direction. In this paper, we therefore examine body schema updating over a wider range, in both the extension and contraction directions, in order to clarify the characteristics of updating in the contraction direction as well as the limits of the update. The results showed symmetric characteristics of body schema updating in both directions, while the update effect deteriorated for extreme changes.
... Embodied selfhood is a good candidate for a generalized core in providing parsimonious modeling of correlated activity between heterogeneous sensations, whether interoceptive, proprioceptive, or exteroceptive [75]. I suggest ESMs provide such powerful explanations for experience that they form a necessary scaffolding for all other aspects of mind, with different aspects of selfhood being understood as kinds of extended embodiment [194-197], ranging from material possessions [198] to social roles, and other more abstract senses of self and ownership [111]. From this view, psychological development would be reframed in terms of preserving and adapting various core patterns (in neo-Piagetian terms, assimilation and accommodation), so allowing minds to bootstrap themselves towards increasingly rarefied states of complexity. ...
... Notably, parietal cortex provides sources of both high-level body representations and spatial awareness, with damage not only resulting in anosognosia and alien limb syndromes, but also hemi-spatial neglect [218]. There is also a counter-intuitive finding in which the spatial extent of neglect symptoms is extended by providing a reach-extending tool for the hand corresponding to the affected side of space [195,247]. Speculatively, affordance-based bindings via ESMs may potentially provide a partial explanation for this surprising phenomenon, in that neglect symptoms could result from coupling with ESMs whose coherent (synchronous and inferential) dynamics have been compromised. Resonant coupling between percepts and ESMs may also help explain how external objects, potentially including other agents [248], may become incorporated into body maps [249], with synchronous motions helping to establish expansion/binding. ...
Drawing from both enactivist and cognitivist perspectives on mind, I propose that explaining teleological phenomena may require reappraising both “Cartesian theaters” and mental homunculi in terms of embodied self-models (ESMs), understood as body maps with agentic properties, functioning as predictive-memory systems and cybernetic controllers. Quasi-homuncular ESMs are suggested to constitute a major organizing principle for neural architectures due to their initial and ongoing significance for solutions to inference problems in cognitive (and affective) development. Embodied experiences provide foundational lessons in learning curriculums in which agents explore increasingly challenging problem spaces, so answering an unresolved question in Bayesian cognitive science: what are biologically plausible mechanisms for equipping learners with sufficiently powerful inductive biases to adequately constrain inference spaces? Drawing on models from neurophysiology, psychology, and developmental robotics, I describe how embodiment provides fundamental sources of empirical priors (as reliably learnable posterior expectations). If ESMs play this kind of foundational role in cognitive development, then bidirectional linkages will be found between all sensory modalities and frontal-parietal control hierarchies, so infusing all senses with somatic-motoric properties, thereby structuring all perception by relevant affordances, so solving frame problems for embodied agents. Drawing upon the Free Energy Principle and Active Inference framework, I describe a particular mechanism for intentional action selection via consciously imagined (and explicitly represented) goal realization, where contrasts between desired and present states influence ongoing policy selection via predictive coding mechanisms and backward-chained imaginings (as self-realizing predictions). 
This embodied developmental legacy suggests a mechanism by which imaginings can be intentionally shaped by (internalized) partially-expressed motor acts, so providing means of agentic control for attention, working memory, imagination, and behavior. I further describe the nature(s) of mental causation and self-control, and also provide an account of readiness potentials in Libet paradigms wherein conscious intentions shape causal streams leading to enaction. Finally, I provide neurophenomenological handlings of prototypical qualia including pleasure, pain, and desire in terms of self-annihilating free energy gradients via quasi-synesthetic interoceptive active inference. In brief, this manuscript is intended to illustrate how radically embodied minds may create foundations for intelligence (as capacity for learning and inference), consciousness (as somatically-grounded self-world modeling), and will (as deployment of predictive models for enacting valued goals).
... A higher awareness of one's inner body sensations would decrease the plasticity of the BR and make it more difficult to feel ownership for artificial body parts that do not pertain to the physical configuration of the actual body. Further evidence of body schema plasticity can be found in studies with tool use paradigms (Maravita and Iriki, 2004; Cardinali et al., 2009, 2012). Indeed, when actions are performed with tools, the morphology and functionality of specific body parts are modified through a quick and efficient updating of the body schema that allows the maintenance of action accuracy (Cardinali et al., 2009, 2016). ...
... The association between IS and BR in aging could also be due to a reorganization of functional brain networks (Steffener et al., 2012; Bagarinao et al., 2019), and in particular to decreased within-network and increased between-network connectivity (Tomasi and Volkow, 2012; Betzel et al., 2014) in the visuospatial, sensorimotor and salience networks (Bagarinao et al., 2019) associated with BR and interoceptive processing (Takeuchi et al., 2016; Chong et al., 2017). ...
Interoceptive information plays a pivotal role in building body representations (BR), but the association between interoception and the different types of BR in healthy individuals has never been systematically investigated. Thus, this study aimed to explore the association between BR and interoceptive sensibility (IS) throughout adulthood. One hundred thirty-seven healthy participants (50 aged from 18 to 40 years old; 50 aged from 41 to 60 years old; and 37 over 60 years old) were given a self-report tool for assessing IS (the Self-Awareness Questionnaire; SAQ), and a specific battery including tasks evaluating three different BR (i.e., the body schema, using the Hand Laterality Task; the body structural representation, using the Frontal Body Evocation task, FBE; and body semantics, using the Object-Body Part Association Task) as well as control tasks (i.e., tasks with non-body stimuli). The older age group (aged over 60 years old) showed lower performances on the tasks probing the body schema and body structural representation than younger groups (aged 18 to 40 and 41 to 60 years old). More interestingly, worse performances on a task assessing the body schema were significantly associated with higher IS with older age, suggesting that higher awareness of one’s inner body sensations would decrease the plasticity of this BR. These findings are interpreted according to the neuropsychological model of BR development and the effects of aging on the brain.
... 1). Several studies have found that manipulating the internal representation of body part size (e.g., through illusions) modifies tactile spatial perception (30), including where touch is localized in space (53). Given a fixed number of units, a change in represented body part size would lead to a corresponding change in the widths of the tuning curves. ...
... We have recently demonstrated that humans can accurately localize where a tool has been touched (75) and that mechanisms in somatosensory cortex for localizing touch on an arm may be reused to localize touch on a tool (25). Furthermore, tool use leads to lasting changes in somatosensory perception (53,76) that are likely driven by plasticity in somatosensory cortex (77). Thus, given the high degree of flexibility in the somatosensory system, we predict that the computation of trilateration is also used to localize touch on tools. ...
Perhaps the most recognizable sensory map in all of neuroscience is the somatosensory homunculus. Although it seems straightforward, this simple representation belies the complex link between an activation in a somatotopic map and the associated touch location on the body. Any isolated activation is spatially ambiguous without a neural decoder that can read its position within the entire map, but how this is computed by neural networks is unknown. We propose that the somatosensory system implements multilateration, a common computation used by surveying and global positioning systems to localize objects. Specifically, to decode touch location on the body, multilateration estimates the relative distance between the afferent input and the boundaries of a body part (e.g., the joints of a limb). We show that a simple feedforward neural network, which captures several fundamental receptive field properties of cortical somatosensory neurons, can implement a Bayes-optimal multilateral computation. Simulations demonstrated that this decoder produced a pattern of localization variability between two boundaries that was unique to multilateration. Finally, we identify this computational signature of multilateration in actual psychophysical experiments, suggesting that it is a candidate computational mechanism underlying tactile localization.
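The multilateration account sketched in the abstract above can be illustrated numerically: each boundary (e.g., wrist and elbow) yields a noisy distance estimate whose noise grows with distance, and the two estimates are fused by inverse-variance weighting, the Bayes-optimal rule for combining independent Gaussian cues. The simulation below is a toy sketch with illustrative parameter values, not the paper's actual neural network model.

```python
import numpy as np

def trilaterate(x_true, limb_len=1.0, noise_per_unit=0.05,
                n_trials=10000, rng=None):
    """Toy trilateration: estimate a touch location on a limb of length
    `limb_len` from two noisy distance estimates, one per boundary.
    Noise SD grows linearly with distance; returns the SD of the fused
    location estimate across simulated trials."""
    rng = np.random.default_rng(rng)
    d1, d2 = x_true, limb_len - x_true       # true distances to each boundary
    s1 = noise_per_unit * d1 + 1e-6          # distance-dependent noise (SD)
    s2 = noise_per_unit * d2 + 1e-6
    e1 = rng.normal(d1, s1, n_trials)            # estimate from boundary 1
    e2 = limb_len - rng.normal(d2, s2, n_trials) # from boundary 2, same axis
    w1 = (1 / s1**2) / (1 / s1**2 + 1 / s2**2)   # inverse-variance weights
    fused = w1 * e1 + (1 - w1) * e2
    return float(fused.std())
```

Because the noisier, more distant estimate dominates mid-limb while one very precise estimate is available near either boundary, localization variability peaks between the two boundaries: the inverted-U signature that the paper identifies as the computational fingerprint of multilateration.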
... The body schema is normally assumed to be a common body representation supporting all types of motor actions, and numerous studies have typically used a single motor task to explore aspects of the body schema (17, 19, 37-40). However, we often take multiple simultaneous actions with different effectors, such as the eye and hand, in our daily lives. ...
... The present study reveals that there are multiple representations of the body schema for the same body part. The body schema has been assumed to be a common body representation used to control all types of motor actions for more than a century (17, 19, 37-40). Although many studies have suggested that the body schema is based on the process of multisensory ...
Purposeful motor actions depend on the brain’s representation of the body, called the body schema, and disorders of the body schema have been reported to show motor deficits. The body schema has been assumed for almost a century to be a common body representation supporting all types of motor actions, and previous studies have considered only a single motor action. Although we often execute multiple motor actions, how the body schema operates during such actions is unknown. To address this issue, I developed a technique to measure the body schema during multiple motor actions. Participants made simultaneous eye and reach movements to the same location of 10 landmarks on their hand. By analyzing the internal configuration of the locations of these points for each of the eye and reach movements, I produced maps of the mental representation of hand shape. Despite these two movements being simultaneously directed to the same bodily location, the resulting hand map (i.e., a part of the body schema) was much more distorted for reach movements than for eye movements. Furthermore, the weighting of visual and proprioceptive bodily cues to build up this part of the body schema differed for each effector. These results demonstrate that the body schema is organized as multiple effector-specific body representations. I propose that the choice of effector toward one’s body can determine which body representation in the brain is observed and that this visualization approach may offer a new way to understand patients’ body schema.
... Previous studies show that even ten minutes of tool-use can deeply modify the representations of both the body and the space around it [7-15]. For example, when using a long grabber tool to retrieve objects, the arm representation is updated to reflect the functional elongation of the effector. ...
A tool can function as a body part yet not feel like one: Putting down a fork after dinner does not feel like losing a hand. However, studies show fake body-parts are embodied and experienced as parts of oneself. Typically, embodiment illusions have only been reported when the fake body-part visually resembles the real one. Here we reveal that participants can experience an illusion that a mechanical grabber, which looks scarcely like a hand, is part of their body. We found changes in three signatures of embodiment: the real hand’s perceived location, the feeling that the grabber belonged to the body, and autonomic responses to visible threats to the grabber. These findings show that artificial objects can become embodied even though they bear little visual resemblance to the hand.
... Moreover, many studies have demonstrated that body representations are also affected by the way the body acts in space, such as when using tools (e.g., rake or pliers) to reach out of reach objects, i.e., in far space (Maravita and Iriki, 2004;Martel et al., 2016;Cléry and Ben Hamed, 2018). Studies have demonstrated that tool-use re-shapes body representations by extending the estimated length of the tool-holding limb or by altering the limb kinematics after tool-use (Cardinali et al., 2009a;Sposito et al., 2012;Canzoneri et al., 2013b). Other studies have shown modifications of PPS representations after tool-use. ...
The perception of our body is mediated by cortical representations of the body and the space around it. Body representations are constantly updated by the integration of interoceptive and exteroceptive signals during body-environment interactions. Research in the last years has distinguished multiple body representations, with specific functions, whose number and characteristics are still debated. This article provides a synthetic, critical overview of (1) the main taxonomies and (2) methods to study body representations; (3) key examples of plasticity in body representations, due to experimental manipulations, development, aging and pathologies; (4) current knowledge about the neural correlates and (5) some crucial open issues.
... Studies in non-human primates, patients and healthy participants have demonstrated that short or long experiences with tools (Bassolino, Serino, Ubaldi, & Làdavas, 2010; Biggio, Bisio, Avanzino, Ruggeri, & Bove, 2017; Maravita & Iriki, 2004; Serino, Bassolino, Farnè, & Làdavas, 2007) affect PPS representation, for instance by increasing multisensory interactions between stimuli on the body and in the far space (see for a review Maravita & Iriki, 2004). Similarly, plasticity of BR after tool-use has been reported both in terms of kinematic changes and modifications of the perceived limb dimensions (Canzoneri et al., 2013; Cardinali et al., 2009, 2011; Garbarini et al., 2015; Romano, Uberti, Caggiano, Cocchini, & Maravita, 2018; Sposito, Bolognini, Vallar, & Maravita, 2012). Moreover, BR and PPS are also modifiable by reduced use of the upper limb, as during immobilization in healthy participants (Toussaint, Wamain, Bidet-Ildei, & Coello, 2018). ...
To efficiently interact with the external world, the brain needs to represent the size of the involved body parts - body representations (BR) - and the space around the body in which the interactions with the environment take place - peripersonal space representation (PPS). BR and PPS are both highly flexible, being updated by the continuous flow of sensorimotor signals between the brain and the body, as observed for example after tool-use or immobilization. The progressive decline of sensorimotor abilities typically described in ageing could thus influence BR and PPS representations in the older adults. To explore this hypothesis, we compared BR and PPS in healthy young and older participants. By focusing on the upper limb, we adapted tasks previously used to evaluate BR and PPS plasticity, i.e., the body-landmarks localization task and audio-tactile interaction task, together with a new task targeting explicit BR (avatar adjustment task, AAT). Results show significantly higher distortions in the older rather than young participants in the perceived metric characteristic of the upper limbs. We found significant modifications in the implicit BR of the global shape (length and width) of both upper limbs, together with an underestimation in the arm length. Similar effects were also observed in the AAT task. Finally, both young and older adults showed equivalent multisensory facilitation in the space close to the hand, suggesting an intact PPS representation. Together, these findings demonstrated significant alterations of implicit and explicit BR in the older participants, probably associated with a less efficient contribution of bodily information typically subjected to age-related decline, whereas the comparable PPS representation in both groups could be supported by preserved multisensory abilities in older participants. 
These results provide novel empirical insight on how multiple representations of the body in space, subserving actions and perception, are shaped by the normal course of life.
... Studies in humans have reinforced these findings (for a review, see Maravita & Iriki, 2004) and, recently, have provided compelling evidence that a sense of agency might be important for inducing changes in the body schema (D'Angelo, di Pellegrino, Seriani, Gallina, & Frassinetti, 2018). Tool-derived body schema plasticity is now a widely accepted phenomenon in the literature (Cardinali et al., 2009). It thus appears that van Lawick-Goodall's (1971) early description of a tool as an extension of the body is quite literally accurate from a neurocognitive perspective, at least in primates. ...
Tool use research has suffered from a lack of consistent theoretical frameworks. There is a plethora of tool use definitions and the most widespread ones are so inclusive that the behaviors that fall under them arguably do not have much in common. The situation is aggravated by the prevalence of anecdotes, which have played an undue role in the literature. In order to provide a more rigorous foundation for research and to advance our understanding of the interrelation between tool use and cognition, we suggest the adoption of Fragaszy and Mangalam's (2018) tooling framework, which is characterized by the creation of a body‐plus‐object system that manages a mechanical interface between tool and surface. Tooling is limited to a narrower suite of behaviors than tool use, which might facilitate its neurocognitive investigation. Indeed, evidence in the literature indicates that tooling has distinct neurocognitive underpinnings not shared by other activities typically classified as tool use, at least in primates. In order to understand the extent of tooling incidences in previous research, we systematically surveyed the comprehensive tool use catalog by Shumaker et al. (2011). We identified 201 tool use submodes, of which only 81 could be classified as tooling, and the majority of the tool use examples across species were poorly supported by evidence. Furthermore, tooling appears to be phylogenetically less widespread than tool use, with the greatest variability found in the primate order. However, in order to confirm these findings and to understand the evolution and neurocognitive mechanisms of tooling, more systematic research will be required in the future, particularly with currently underrepresented taxa.
... A team of French biologists ran a test with a group of participants: immediately after manipulating a grabber, they perceived tactile stimuli at the elbows and fingertips as much farther from the trunk than before the manipulation, i.e., than when they were holding nothing. Even after putting the grabber down, the participants briefly behaved as if they had a kind of extended arm (cf. Cardinali, 2009). ...
... The learning processes required to understand and practise the use of a tool that, for instance, augments the range of activity of the human body, e.g. a rake, lead to a restructuring of the very processes of control and interaction of the hands, both while using the tool and during free-hand movements (Cardinali et al., 2009). Such restructuring of neurological processes is not limited to the senses and muscles directly acting, however; it involves the whole body of the individual. For instance, the rake example not only involves the hands and the restructuring of the neural patterns controlling them, but also requires the involvement of vestibular and proprioceptive pathways, as well as those involving sight. ...
The role that the human sensory system plays in the daily interaction of individuals with the external world has far more articulated ramifications than expected from a component of human nature that appears to be so linear and straightforward. Starting from the conditions of sensory impairment as a vehicle for the analysis of the field, the multidisciplinary study of the perceptual process highlights how sight plays a dominant role compared to the other senses, and how this dominance fits into a context of dichotomies inherent in the today’s social structure, with negative impacts on the personal, psychological and social context of a part of humanity. The technical-scientific developments of the last decades, and in particular the innovations aimed at the enhancement of the human being through technology, have proved not only to be an instrument of emancipation from these hierarchical dictates, but also a space of opportunity for a reconsideration of design and research constraints, constraints generated by a bounded consideration of the sensory spectrum. The objective of the thesis in question consists in the development of an intervention in the field of visual impairments capable of suggesting an alternative to these principles through the intersection between human and technological elements in the restructuring of the perceptual approach and the analysis of the potential of the discipline of design in the exploration of alternative intervention methods. This goal will be achieved through the development of a support device for individuals with visual impairments. This device will allow the creation of a sensorial space alternative to the vision-centric standard, through the synesthetic interaction between an artificial vision system and natural systems of visual and haptic perception. 
In the course of the research path, the design process will be positioned in a role complementary to its usual connotation, which sees it as a tool for developing solutions and products dedicated to the specific area of intervention. It will also play a role as a creative force for new exploratory spaces where the design itself can have a functional, social and political value; spaces created in the intersection of science, speculation and interaction between the natural components and the artificial ones of organisms.
... In favour of this argument, there is neurological evidence for the inclusion of external tools into the body schema, spread across the entire nervous system, rather than solely in regions of the brain (e.g. Cardinali et al., 2009;Maravita & Iriki, 2004). From a radically embodied perspective, a major resource here is the human's manual hand, i.e., handedness (Drain, 2014). ...
The over-reaching purpose of this chapter is to address the sense-making and embodied nature of the artful mind. In doing so, I will reformulate Merlin Donald’s (2006) governing cognitive principles for art, advocating the enactive, emergent, situated, and distributed aspects of art and aesthetic experience as sense-making practices that are compatible with a radical view of embodied cognitive science. I therefore introduce, disentangle and present a more radical view of the human and artful mind, i.e. the idea that mind emerges in the interaction of an agent with a material and social environment as a result of sensorimotor activity. In short, sense-making practices. I then address some implications to the artful mind related to the topics presented here. In doing so, I reformulate Donald’s seven governing cognitive principles of the artful mind - from a rather traditional cognitivist view - to a more radically embodied perspective. The chapter ends with some concluding remarks.
... Tool Use: Increasingly, evidence is emerging that tool-use may affect body image [95] and body schema [26]. This point is further discussed by Seinfeld et al. [142] through the concept of User Representations. ...
We can create Virtual Reality (VR) interactions that have no equivalent in the real world by remapping spacetime or altering users' body representation, such as stretching the user's virtual arm for manipulation of distant objects or scaling up the user's avatar to enable rapid locomotion. Prior research has leveraged such approaches, what we call beyond-real techniques, to make interactions in VR more practical, efficient, ergonomic, and accessible. We present a survey categorizing prior movement-based VR interaction literature as reality-based, illusory, or beyond-real interactions. We survey relevant conferences (CHI, IEEE VR, VRST, UIST, and DIS) while focusing on selection, manipulation, locomotion, and navigation in VR. For beyond-real interactions, we describe the transformations that have been used by prior works to create novel remappings. We discuss open research questions through the lens of the human sensorimotor control system and highlight challenges that need to be addressed for effective utilization of beyond-real interactions in future VR applications, including plausibility, control, long-term adaptation, and individual differences.
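One classic beyond-real remapping of the kind surveyed above, stretching the user's virtual arm for distant manipulation, is often implemented go-go style: the virtual hand tracks the real hand 1:1 near the body and its displacement is amplified nonlinearly beyond a threshold. The sketch below illustrates this mapping; the threshold and gain values are illustrative assumptions, not taken from any specific system.

```python
def remap_hand_distance(real_dist, threshold=0.4, gain=8.0):
    """Go-go-style arm stretch: within `threshold` metres of the torso
    the mapping is 1:1 (reality-based); beyond it, the excess distance
    is amplified quadratically (beyond-real), keeping the mapping
    continuous at the threshold. Parameter values are illustrative."""
    if real_dist <= threshold:
        return real_dist              # reality-based zone: 1:1
    excess = real_dist - threshold
    return real_dist + gain * excess ** 2  # beyond-real zone: amplified
```

The quadratic term keeps near-body interaction undistorted while letting modest real-arm extensions reach far into the scene, which is one reason such mappings can remain plausible to users despite the visual-proprioceptive mismatch.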
... This finding is in agreement with the result shown in this work and strongly suggests that any sport requiring frequent use of the same tool could lead to an integration of that specific equipment, depending on the level of familiarity with it. It has been suggested that tool-use not only modifies our perception of the world around us (Baccarini et al. 2014), allowing us to better anticipate action possibilities with the tool (Bourgeois et al. 2014), but also modifies the spatial metric of our own body (Cardinali, Frassinetti et al. 2009; Sposito et al. 2012; Canzoneri, Marzolla et al. 2013; Baccarini et al. 2014). ...
Long-term experience with a tool stably enlarges peripersonal space (PPS). Gained experience with a tool also modulates internal models of action. The aim of this work was to understand whether familiarity with a tool influences both PPS and motor representation. Toward this goal, we used a multisensory integration paradigm to test, in 13 expert fencers, the embodiment in their PPS of a personal (pE) or a common (cE) épée. We then evaluated the primary motor cortex excitability of proximal (ECR) and distal (APB) muscles during a motor imagery (MI) task of an athletic gesture while the athletes handled these tools. Results showed that pE enlarges subjects’ PPS, while cE does not. Moreover, during MI, handling the tools increased cortical excitability of the ECR muscle. Notably, APB cortical excitability during MI increased only with pE, as a function of its embodiment in PPS. These findings indicate that familiarity with a tool specifically enlarges PPS and modulates the cortical motor representation of those muscles involved in the haptic contact with it.
... After the training, participants moved the perceived midpoint distally (as if the arm were longer), thus demonstrating that the tool became part of the arm representation. In a different context, other studies show that artificial elongation of the arm through tool use is reflected in kinematic changes: the kinematic pattern observed after tool-use differed from pre-training parameters and resembled that of long-armed people (e.g., Cardinali et al. 2009). Overall, these data demonstrate that body representation readily accepts external objects as parts of one's body if they are critical to performing the task the subject wants to accomplish. ...
Years ago, it was demonstrated (e.g., Rizzolatti et al. in Handbook of neuropsychology, Elsevier Science, Amsterdam, 2000) that the brain does not encode the space around us in a homogeneous way, but through neural circuits that map space relative to the distance that objects of interest have from the body. In monkeys, relatively discrete neural systems, characterized by neurons with specific neurophysiological responses, seem to be dedicated either to representing the space that can be reached by the hand (near/peripersonal space) or to the distant space (far/extrapersonal space). It was also shown that the encoding of spaces has dynamic aspects, because they can be remapped by the use of tools that trigger different actions (e.g., Iriki et al. 1998). In this latter case, the effect of the tool depends on the modulation of personal space, that is, the space of our body. In this paper, I will review and discuss selected research which demonstrated that also in humans: (1) spaces are encoded in a dynamic way; (2) this encoding can be modulated by the use of tools that the system comes to consider as parts of one's own body; (3) body representations are not fixed, but fragile and subject to change, to the point that we can incorporate not only the tools necessary for action but even limbs belonging to other people. What the embodiment of tools and of alien limbs tells us about body representations is then briefly discussed.
... Although it is clear that prostheses and devices modify the representations of the human body and peri-personal space [25,26], more profound reflection is necessary when it comes to the concept of identity. In the present study, we understand identity as a composite and changing psychological element, one that is closely associated with the device or prosthesis (terms used as synonyms in this paper in a reductionist perspective, foreseeing subsequent developments of research on the differentiation of the types of prostheses/devices and of subjective reactions in the various specialist fields). ...
Many subjects with somatic pathologies or traumas in their recent anamnesis tend to experience symptoms and changes to their daily life parameters even after technically successful treatment. Hence, this study aims to validate an investigation tool inspired by the prosthetic–bionic paradigm—namely, the PBP-Q—which allows for the evaluation of changes relating to identity, psychosociality, and psychopathology associated with the use of a prosthesis or device. We gathered 118 participants (68 females and 50 males) aged between 27 and 94 years (Mean = 58.42 ± 15.17). We performed both exploratory (EFA) and confirmatory (CFA) factor analyses on this sample. Moreover, we calculated the internal consistency for the PBP-Q scales and the total score for the questionnaire’s final 26-item, 5-factor version. The five scales are: psychological well-being; interpersonal relationships; professional relationships; autonomy and safety; and addictions, compulsions, and obsessions. The internal consistency is good for both the total score and the subscales. In conclusion, the PBP-Q has satisfactory psychometric properties overall, especially considering the measure’s complexity. It provides a quick and effective way to evaluate the changes that might arise from the use of a prosthesis or device and, consequently, has implications for clinical practice.
... A vast literature shows that tool use induces plastic changes, leading to perceived body transformations (e.g. [8,10,18]). Against this background, it is not hard to believe that just as professional baseball players experience the bat as an extension of their own arm, Formula1 and MotoGP pilots experience their vehicles as extensions of their own bodies. ...
In Level 3 automated vehicles, drivers must take back control when prompted by a Take Over Request (TOR). However, there is currently no consensus on the safest way to achieve this. Research has shown that participants interact faster with an avatar when this “glows” in synchrony with participant physiology (heartbeat). We hypothesized that a similar form of synchronization might allow drivers to react faster to a TOR. Using a driving simulator, we studied driver responses to a TOR when permanently visible ambient lighting was synchronized with participants’ breathing. Experimental participants responded to the TOR faster than controls. There were no significant effects on self-reported trust or physiological arousal, and none of the participants reported that they were aware of the manipulation. These findings suggest that new ways of keeping the driver unconsciously “connected” to the vehicle could facilitate faster, and potentially safer, transfers of control.
... The mislocalisation between the real and the fake hand is called proprioceptive drift. Another example of a change in body schema is the extension of the body schema by the use of tools to reach distant objects [33][34][35] . The tool becomes part of the body and the environment which can be reached expands. ...
The body schema is a much-discussed aspect of body awareness. Although there is still no single definition, there is widespread consensus that the body schema is responsible for movement and interaction with the environment, and that it usually remains outside of active consciousness. There are only a few investigations of influences on the body schema, and none has examined feelings of satiety or hunger. Thirty-two healthy women were investigated twice, once satiated and once hungry. To measure the body schema, we used a door-like aperture and compared the critical aperture-to-shoulder ratio (cA/S). A cover story ensured that the unconscious body schema was being measured. We found a significantly higher cA/S for satiety than for hunger, indicating that when satiated, participants rotated their shoulders for a relatively larger door than when hungry, unconsciously estimating their body size to be larger. We showed that even a moderately rated feeling of hunger or satiety leads to an adjustment in body-scaled action and consequently an adaptation of the body schema. This suggests that, in addition to visual-spatial and proprioceptive representations, somatic information can also be relevant for the body schema.
... In comparative psychology, it is also widely recognized that humans and animals tend to view tools as mere extensions of their own body/embodiment [3]. ...
Humans and many animals exhibit a robust capability to manipulate diverse objects, often directly with their bodies and sometimes indirectly with tools. Such flexibility is likely enabled by the fundamental consistency in the underlying physics of object manipulation, such as contacts and force closures. Inspired by viewing tools as extensions of our bodies, we present Tool-As-Embodiment (TAE), a parameterization for tool-based manipulation policies that treats hand-object and tool-object interactions in the same representation space. The result is a single policy that can be applied recursively on robots: end effectors manipulate objects, and objects serve as tools, i.e. new end-effectors, to manipulate other objects. By sharing experiences across different embodiments for grasping or pushing, our policy exhibits higher performance than if separate policies were trained. Our framework can consolidate all experiences from different resolutions of tool-enabled embodiments into a single generic policy for each manipulation skill. Videos at https://sites.google.com/view/recursivemanipulation
... Near/far spatial coding influences time perception too: stimuli presented in far space are perceived as shorter in duration than stimuli presented in near space (Anelli et al., 2015). The distinction between near and far space depends on the extent to which an action can be performed (Rizzolatti et al., 1997; Berti and Frassinetti, 2000; Cardinali et al., 2009). Thus, near space is defined as the reachable space within the arm's reaching distance, and far space as the unreachable space beyond it (Maravita and Iriki, 2004; Bartolo et al., 2014; De Vignemont and Iannetti, 2015). ...
In this study, we explored the relationship between time and space according to two different spatial codings, namely, the left/right extension and the reachability of a stimulus along a near/far dimension. Four experiments were carried out in which healthy participants performed temporal and spatial bisection tasks in near/far space, before and after short or long tool-use training. Stimuli were prebisected horizontal lines of different temporal durations in which the midpoint was manipulated according to the Müller-Lyer illusion. The perceptual illusory effects emerged in spatial but not temporal judgments. We revealed that temporal and spatial representations dynamically change according to an individual's action potentialities: temporal duration was perceived as shorter, and the perceived line midpoint was shifted to the left, in far space compared to near space. Crucially, this dissociation disappeared following long but not short tool-use training. Finally, we observed age-related differences in spatial attention, which may be crucial in building the memory temporal standard used to categorize durations.
... Instead, this end-effector becomes the active part of the tool, that is, the part of the tool that is used to act upon another object (e.g., the head of the hammer). Thus, there is an attentional shift from the natural effector to the active part of the tool, while it is still the hand that needs to be controlled (for evidence for this distalization mechanism, see Cardinali et al., 2009; Iriki et al., 1996; Maravita et al., 2001; Maravita & Iriki, 2004; Miller et al., 2018; Osiurak et al., 2017; for an alternative explanation, see Holmes, 2012). This distalization mechanism also implies that the user needs to control the degrees of freedom of the body-tool system differently from the body-only system, a phenomenon called tooling (Fragaszy & Mangalam, 2018). ...
The ubiquity of tool use in human life has generated multiple lines of scientific and philosophical investigation to understand the development and expression of humans’ engagement with tools and its relation to other dimensions of human experience. However, existing literature on tool use faces several epistemological challenges in which the same set of questions generate many different answers. At least four critical questions can be identified, which are intimately intertwined—(1) What constitutes tool use? (2) What psychological processes underlie tool use in humans and nonhuman animals? (3) Which of these psychological processes are exclusive to tool use? (4) Which psychological processes involved in tool use are exclusive to Homo sapiens? To help advance a multidisciplinary scientific understanding of tool use, six author groups representing different academic disciplines (e.g., anthropology, psychology, neuroscience) and different theoretical perspectives respond to each of these questions, and then point to the direction of future work on tool use. We find that while there are marked differences among the responses of the respective author groups to each question, there is a surprising degree of agreement about many essential concepts and questions. We believe that this interdisciplinary and intertheoretical discussion will foster a more comprehensive understanding of tool use than any one of these perspectives (or any one of these author groups) would (or could) on their own.
... Several studies have highlighted important similarities between tool-based and body-based tactile spatial processing (3)(4)(5), including on a neural level (6). Tool use also modulates somatosensory perception and action processes (7). While these findings suggest functional similarities between tools and limbs, direct evidence that body-based computational mechanisms are repurposed to sense and act with tools is lacking. ...
It is often claimed that tools are embodied by the user, but whether the brain actually repurposes its body-based computations to perform similar tasks with tools is not known. A fundamental body-based computation used by the somatosensory system is trilateration. Here, the location of touch on a limb is computed by integrating estimates of the distance between sensory input and its boundaries (e.g., elbow and wrist of the forearm). As evidence of this computational mechanism, tactile localization on a limb is most precise near its boundaries and lowest in the middle. If the brain repurposes trilateration to localize touch on a tool, we should observe this computational signature in behavior. In a large sample of participants, we indeed found that localizing touch on a tool produced the signature of trilateration, with highest precision close to the base and tip of the tool. A computational model of trilateration provided a good fit to the observed localization behavior. Importantly, model selection demonstrated that trilateration better explained each participant's behavior than an alternative model of localization. These results have important implications for how trilateration may be implemented by somatosensory neural populations. In sum, the present study suggests that tools are indeed embodied at a computational level, repurposing a fundamental spatial computation.
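The trilateration mechanism described above has a simple statistical reading: the location of a touch is obtained by fusing two distance estimates, one from each boundary, whose noise grows with distance; inverse-variance fusion then predicts the lowest precision mid-limb and the highest near the boundaries. The sketch below is a minimal illustration of that prediction, not the authors' actual computational model; the Weber-like noise constant and all parameter values are illustrative assumptions.

```python
import numpy as np

def localization_sd(touch, limb_len=1.0, weber=0.1, n=10000, seed=0):
    """Simulated trilateration: fuse two noisy distance-to-boundary
    estimates by inverse-variance weighting; return the spread (SD)
    of the fused location estimate."""
    rng = np.random.default_rng(seed)
    d1, d2 = touch, limb_len - touch              # true distances to the two boundaries
    s1 = weber * d1 + 1e-6                        # noise grows with distance (Weber-like)
    s2 = weber * d2 + 1e-6
    est1 = rng.normal(d1, s1, n)                  # location measured from boundary 1
    est2 = limb_len - rng.normal(d2, s2, n)       # location measured from boundary 2
    w1, w2 = 1.0 / s1**2, 1.0 / s2**2
    fused = (w1 * est1 + w2 * est2) / (w1 + w2)   # maximum-likelihood fusion
    return fused.std()

# Signature of trilateration: noise peaks mid-limb, shrinks near the ends.
mid_sd = localization_sd(0.5)
edge_sd = localization_sd(0.1)
```

Under these assumptions, `mid_sd` comes out several times larger than `edge_sd`, reproducing the inverted-U precision profile that the study reports for both limbs and tools.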
... A large part of the premotor-parietal network for action control in humans is devoted to the representation of one's own body schema, a hierarchical high-level construct indicating a non-conscious process, continuously updated during movements, through which the individual registers their posture (or body-part position) in relation to the peripersonal space 55 . The body schema is intrinsically highly flexible and adaptable: for example, the level of dexterity in tool use is reached once the tool has been incorporated, or embodied, into the body schema 56 , and repeated tool use (such as a mechanical grabber extending the space around the arm) may carry after-effects on subsequent free-hand grasping and pointing movements, likely altering the individual's body schema 57 . Emerging behavioural data indicate that a supernumerary finger, which is conceptually similar to a common tool, can actually be embodied into the user's body schema 6 , even when presented as an avatar in virtual reality scenarios 58 . ...
It is likely that when using an artificially augmented hand with six fingers, the natural five plus a robotic one, the corticospinal motor synergies controlling grasping actions might be different. However, no direct neurophysiological evidence for this reasonable assumption is available yet. We used transcranial magnetic stimulation of the primary motor cortex to directly address this issue during motor imagery of object-grasping actions performed with or without the Soft Sixth Finger (SSF). The SSF is a wearable robotic additional thumb patented for helping patients with hand paresis and the inherent loss of thumb-opposition abilities. To this aim, we capitalized on the solid notion that the neural circuits and mechanisms underlying motor imagery overlap with those of physiological voluntary actions. After a few minutes of training, healthy humans wearing the SSF rapidly reshaped the pattern of corticospinal outputs towards forearm and hand muscles governing imagined grasping actions of different objects, suggesting the possibility that the extra finger might rapidly be encoded into the user’s body schema, which is an integral part of the frontal-parietal grasping network. Such neural signatures might explain how the human motor system is open to very quickly welcoming emerging augmentative bioartificial corticospinal grasping strategies. Such an ability might represent the functional substrate of a final common pathway the brain might count on for new interactions with the surrounding objects within the peripersonal space. The findings provide a neurophysiological framework for implementing augmentative robotic tools in humans and for the exploitation of the SSF in conceptually new rehabilitation settings.
This thesis is situated within recent developments of embodiment theory applied to social cognition, which argue that the plasticity of self-representation induced by multisensory interaction can extend to our relationship with others and modify how we feel towards them. Using a procedure inspired by neuroscience to "teleport" human subjects into robots, we induced embodiment of the subject through sensorimotor interaction. We showed that the nature of the manipulation, synchronous subject-robot head movements or simultaneous stroking of both faces, modulates the plasticity of bodily self-representation, assessed through changes in illusory feelings of ownership, including face ownership, self-location, and agency. Voluntary movement induces a distinctive profile via a strong sense of agency, whereas tactile induction produces a more distributed illusory perception. The same modulation is also observed in a face-to-face human-robot interaction setting. This embodiment can increase the acceptability of, and social closeness towards, the robot, but embodiment alone is not sufficient: the effect depends on the nature of the sensory manipulation. Only motor manipulations reinforce the subject's sympathy and affinity towards the robot, both being positively correlated with the sense of agency. These results suggest dissociated mechanisms underlying embodiment in terms of agency. Intentional resonance during simultaneous human-robot movements could thus be responsible for increased feelings of social and emotional closeness towards the robot. Remarkably, the sensation of embodiment and the associated induced social feelings are independent of whether or not the robot has a humanoid appearance.
This is a review of research on bodily illusions: multimodal phenomena arising in the somatosensory domain when special conditions of perception are modelled experimentally. It describes experimental models for inducing these illusions and analyses the underlying mechanisms of body perception. Local illusions are described (the "rubber hand illusion", the "changing face illusion", etc.), as well as illusions involving the body as a whole (e.g., the "body swap illusion" and out-of-body experiences). The experiments are grouped by the principal changes they induce in the body schema: ownership of an artificial object as a part of one's own body; distortion of particular bodily characteristics; and addition (duplication) of an element of the body schema. The phenomena obtained in these experiments are compared with various neurological and psychopathological symptoms of disturbed body perception.
In the doll-hand illusion (PHI), synchronous touching of the participant's unseen hand and a visible doll's hand induces an illusory feeling of body ownership. This paradigm makes it possible to study how the brain resolves conflicting multisensory information during perceptual inference. Previous studies suggest that the conflict between visual and proprioceptive information preceding the PHI is resolved by attenuating somatosensory input. To test whether reducing the excitability of the primary somatosensory cortex can strengthen the PHI, cathodal transcranial direct current stimulation (c-tDCS) was applied. The PHI was examined in thirty healthy participants without stimulation (baseline) and during tDCS. Each participant received cathodal, anodal, and sham stimulation on three separate days, one week apart. The PHI paradigm was administered at six distances (from 17.5 to 67.5 cm) between the participant's own hand and the doll's hand. The occurrence of the PHI was evaluated using a questionnaire (illusion score, IS) and the deviation of the felt hand position from the real position (relative drift, RD). Cathodal stimulation was associated with a significant increase in IS compared with anodal stimulation, whereas RD values were comparable across stimulation conditions. The lack of a significant difference between verum and sham stimulation was attributed to the small effect size in a comparatively small sample. Nevertheless, the results of this study show an enhanced perception of the PHI, independent of demographic factors, when cathodal tDCS was applied over the contralateral primary somatosensory cortex.
This supports our hypothesis that attenuating somatosensory precision paves the way for easier integration of a foreign body part into one's own body schema.
Recent studies have hypothesized that the stereotypical representation of the body may reflect some functional aspects of routine actions that are performed in specific peripersonal domains. For example, the lower and upper limbs tend to ‘act’ in different peripersonal spaces and perform different functions. The present study aims to directly investigate the relationship between body representation and the spatial context where actions are performed. By means of a modified version of the body image task, we investigated body representation before and after a sorting-task training in two groups of participants who were asked to carry out the same task/actions in two different spaces: on a table or on the floor, while sitting on a chair. Findings showed that a significant recalibration of the perceived upper arms’ length occurred when participants were asked to perform the motor task on the floor. These results suggest that the modulation of the body representation reflects an increase in action capabilities driven by the contribution of motor training and, importantly, by the location in which the action occurs. Furthermore, the modulation was not limited to the body part actively involved in the action (the arms); it extended to other upper body parts (the torso) in order to maintain, we propose, a functionally coherent representation of the upper body.
Research in cognitive neuroscience indicates that we process the space surrounding our body in a specific way, both for protecting our body from immediate danger and for interacting with the environment. This research has direct implications for philosophical issues as diverse as self-location, sensorimotor theories of perception, and affective perception. This chapter briefly describes the overall directions that some of these discussions might take. But, beforehand, it is important to fully grasp what the notion of peripersonal space involves. One of the most difficult questions that the field has had to face these past 30 years is to define peripersonal space. Although it bears some relations to the social notion of personal space, to the sensorimotor notion of reaching space and to the spatial notion of egocentric space, there is something unique about peripersonal space and the special way we represent it. One of the main challenges is thus to offer a satisfactory definition of peripersonal space that is specific enough to account for its peculiar spatial, multisensory, plastic, and motor properties. Emphasis can be put on perception or on action, but also on impact prediction or defence preparation. However, each new definition brings with it new methods to experimentally investigate peripersonal space. There is then the risk of losing the unity of the notion of peripersonal space within this multiplicity of conceptions and methods. This chapter offers an overview of the key notions in the field, the way they have been operationalized, and the questions they leave open.
The neuroscientific approach to peripersonal space (PPS) stems directly from electrophysiological studies assessing the response properties of multisensory neurons in behaving non-human primates. This multisensory context fostered frameworks which i) stress the PPS role in actions (including defensive reactions) and affordances, which are optimally performed through multiple sensory convergence; and ii) largely make use of tasks that are multisensory in nature. Concurrently, however, studies on spatial attention reported proximity-related advantages in purely unisensory tasks. These advantages appear to share some key PPS features. Activations in brain areas reported to be multisensory, indeed, can also be found using unimodal (visual) paradigms. Overall, these findings point to the possibility that closer objects may benefit from being processed as events occurring in PPS. The dominant multisensory view of PPS should therefore be expanded accordingly, as perceptual advantages in PPS may be broader than previously thought.
Body representations are known to be dynamically modulated or extended through tool use. Here, we review findings that demonstrate the importance of a user's tool experience or expertise for successful tool embodiment. Examining expert tool users, such as individuals who use tools in professional sports, people who use chopsticks at every meal, or spinal injury patients who use a wheelchair daily, offers new insights into the role of expertise in tool embodiment: Not only does tool embodiment differ between novices and experts, but experts may experience enhanced changes to their body representation when interacting with their own, personal tool. The findings reviewed herein reveal the importance of assessing tool skill in future studies of tool embodiment.
A substantial body of prior work has documented the relationship between extensive smartphone use and individuals' psychological features, yet little is known about how smartphones influence people's body representation. Building on previous studies of tool embodiment, this study adapted the hand mental rotation paradigm to determine whether smartphone use changes the body schema. We compared smartphone users' behavioral performance and electrophysiological activity when presented with different stimuli. We found that people responded faster and more accurately to smartphone-in-hand stimuli than to the other two stimulus types. ERP results showed that the N200 amplitude elicited by gesture-of-holding-a-phone stimuli was smaller than for smartphone-in-hand stimuli, which in turn was smaller than for non-smartphone-in-hand stimuli. For the P300 component, gesture-of-holding-a-phone stimuli evoked a larger P300 amplitude than non-smartphone-in-hand stimuli, with no significant difference in P300 amplitude between gesture-of-holding-a-phone and smartphone-in-hand stimuli. These results provide new evidence that the smartphone is embodied into the body schema. This not only expands the scope of tool-embodiment research but also provides insight into the impact of smartphones on individuals' bodily self.
Understanding the embodiment of robotic devices and using this knowledge to improve human-robot interaction touches a variety of open research questions. A considerable body of research outlines the complexity and plasticity of human bodily experience. When examining robotic devices, which can be seen as “intelligent” tools, this challenge is getting even tougher due to the interaction of the involved agents. Beyond this, recent studies point out how the investigation of psychological fundamentals can benefit from using robotic devices for human-in-the-loop evaluation in return. This chapter conveys and discusses fundamental concepts and terminology of human body experience, aiming at accessibility for all concerned disciplines. Concepts of different body representations and the presence in virtual environments are presented along with fundamentals of human motor control and haptic perception. Based on those, perspective potentials in human-robot interaction are outlined with respect to control, sensing, feedback, and assessment. To support the design and application of human-in-the-loop approaches in fundamental research and engineering, the related literature is analyzed to determine and assess crucial design requirements.
Human–swarm interaction is a frontier in the realms of swarm robotics and human-factors engineering. However, no holistic theory has been explicitly formulated that can inform how humans and robot swarms should interact through an interface while considering real-world demands, the relative capabilities of the components, as well as the desired joint-system behaviours. In this article, we apply a holistic perspective that we refer to as joint human–swarm loops, that is, a cybernetic system made of human, swarm and interface. We argue that a solution for human–swarm interaction should make the joint human–swarm loop an intelligent system that balances between centralized and decentralized control. The swarm-amplified human is suggested as a possible design that combines perspectives from swarm robotics, human-factors engineering and theoretical neuroscience to produce such a joint human–swarm loop. Essentially, it states that the robot swarm should be integrated into the human’s low-level nervous system function. This requires modelling both the robot swarm and the biological nervous system as self-organizing systems. We discuss multiple design implications that follow from the swarm-amplified human, including a computational experiment that shows how the robot swarm itself can be a self-organizing interface based on minimal computational logic.
Trenton Merricks has objected to dualist conceptions of the Incarnation in a similar way to Jaegwon Kim’s pairing problem. On the original pairing problem, so argues Kim, we lack a pairing relationship between bodies and souls such that body A is causally paired with soul A and not soul B. Merricks, on the other hand, argues that whatever relations dualists propose that do pair bodies and souls together (e.g. causal relations) are relations that God the Son has with all bodies whatsoever via his divine attributes (e.g. God the Son could cause motion in any and all bodies via his omnipotence). So if we count these relations as sufficient for embodiment, then dualism implies that God the Son is embodied in all bodies whatsoever. I shall argue that while the original pairing problem might be easily answerable, the Christological pairing problem is not and that dualists must shift some of their focus from the defense of the soul’s existence to explicating the nature of the mind-body relationship.
Can our brain perceive a sense of ownership towards an independent supernumerary limb; one that can be moved independently of any other limb and provides its own independent movement feedback? Following the rubber-hand illusion experiment, a plethora of studies have shown that the human representation of 'self' is very plastic. But previous studies have almost exclusively investigated ownership towards 'substitute' artificial limbs, which are controlled by the movements of a real limb and/or limbs from which non-visual sensory feedback is provided on an existing limb. Here, to investigate whether the human brain can own an independent artificial limb, we first developed a novel independent robotic 'sixth finger.' We allowed participants to train using the finger and examined whether it induced changes in the body representation using behavioral as well as cognitive measures. Our results suggest that unlike a substituted artificial limb (like in the rubber hand experiment), it is more difficult for humans to perceive a sense of ownership towards an independent limb. However, ownership does seem possible, as we observed clear tendencies of changes in the body representation that correlated with the cognitive reports of the sense of ownership. Our results provide the first evidence to show that an independent supernumerary limb can be embodied by the human brain.
Motivated by a set of converging empirical findings and theoretical suggestions pertaining to the construct of ownership, we survey literature from multiple disciplines and present an extensive theoretical account linking the inception of a foundational naïve theory of ownership to principles governing the sense of (body) ownership. The first part of the account examines the emergence of the non-conceptual sense of ownership in terms of the minimal self and the body schema—a dynamic mental model of the body that functions as an instrument of directed action. A remarkable feature of the body schema is that it expands to incorporate objects that are objectively controlled by the person. Moreover, this embodiment of extracorporeal objects is accompanied by the phenomenological feeling of ownership towards the embodied objects. In fact, we argue that the sense of agency and ownership are inextricably linked, and that predictable control over an object can engender the sense of ownership. This relation between objective agency and the sense of ownership is moderated by gestalt-like principles. In the second part, we posit that these early emerging principles and experiences lead to the formation of a naïve theory of ownership rooted in notions of agential involvement.
Far (extrapersonal) and near (peripersonal) spaces are behaviorally defined as the space outside the hand-reaching distance and the space within the hand-reaching distance. Animal and human studies have confirmed this distinction, showing that space is not homogeneously represented in the brain. In this paper we demonstrate that the coding of space as "far" and "near" is not only determined by the hand-reaching distance, but it is also dependent on how the brain represents the extension of the body space. We will show that when the cerebral representation of body space is extended to include objects or tools used by the subject, space previously mapped as far can be remapped as near. Patient P.P., after a right hemisphere stroke, showed a dissociation between near and far spaces in the manifestation of neglect. Indeed, in a line bisection task, neglect was apparent in near space, but not in far space when bisection in the far space was performed with a projection lightpen. However, when in the far space bisection was performed with a stick, used by the patient to reach the line, neglect appeared and was as severe as neglect in the near space. An artificial extension of the patient's body (the stick) caused a remapping of far space as near space.
What happens in our brain when we use a tool to reach for a distant object? Recent neurophysiological, psychological and neuropsychological research suggests that this extended motor capability is followed by changes in specific neural networks that hold an updated map of body shape and posture (the putative "Body Schema" of classical neurology). These changes are compatible with the notion of the inclusion of tools in the "Body Schema", as if our own effector (e.g. the hand) were elongated to the tip of the tool. In this review we present empirical support for this intriguing idea from both single-neuron recordings in the monkey brain and behavioural performance of normal and brain-damaged humans. These relatively simple neural and behavioural aspects of tool-use shed light on more complex evolutionary and cognitive aspects of body representation and multisensory space coding for action.
Recent research demonstrates neurologic and behavioral differences in people's responses to the space that is within and beyond reach. The present studies demonstrated a perceptual difference as well. Reachability was manipulated by having participants reach with and without a tool. Across two conditions, in which participants either held a tool or not, targets were presented at the same distances. Perceived distances to targets within reach while holding the tool were compressed compared with targets that were beyond reach without it. These results suggest that reachability serves as a metric for perception. The third experiment found that reachability only influenced perceived distance when the perceiver intended to reach. These experiments suggest that we perceive the environment in terms of our intentions and abilities to act within it.
This book reviews cognitive neuroscience studies of the representation of actions. The fundamental question addressed concerns the nature and role of different representations in the planning and execution of movements. Adopting a cognitive neuroscience approach to this question generates a new perspective and some challenging hypotheses.
The book explores in detail the contribution of the brain structures, particularly the cerebral cortex, to the various aspects of movement preparation and execution. In so doing, the author discusses a wide range of evidence, including the study of anatomical connections between areas, the recording of single-neuron activity in animals, and brain stimulation and imaging studies in human subjects. This neuroscience evidence is related to both behavioral experiments in normal subjects and clinical observations in brain-lesioned patients, resulting in provocative hypotheses about the cognitive structure of the central representations and processes that subserve actions.
Body schema disturbances were studied in a 62-year-old woman with Alzheimer's disease. She was severely impaired in verbal and nonverbal tasks requiring her to localize body parts (on her own body, the examiner's body or a doll's body) even though she correctly named the same parts when pointed at by the examiner. Pointing responses were misdirected mainly to parts contiguous with the target area and, to a lesser extent, to functionally equivalent body parts. We also found that the patient was able to define body part names functionally but not spatially. In another series of tasks, and in contrast to the above results, performances were normal when small objects, attached to the patient's body, served as pointing targets. Furthermore, on subsequent testing she pointed correctly at the remembered position of these objects. The fact that the same point in 'body space' is localized correctly when it corresponds to an external object and erroneously when it corresponds to a body part contradicts the idea of the body schema as a unitary function. Learning the position of objects on the body surface requires access to some form of body-reference system on which this information can be mapped. We argue that such a system can be available in autotopagnosia and is independent from the visuospatial representations of the body structure that are postulated to be damaged or inaccessible in this syndrome. An integrated account of the present results and of those reported by other authors suggests that multiple levels of representation (e.g., sensorimotor, visuospatial, semantic) are involved in the organization of body knowledge.
A tool is an extension of the hand in both a physical and a perceptual sense. The presence of body schemata has been postulated as the basis of the perceptual assimilation of tool and hand. We trained macaque monkeys to retrieve distant objects using a rake, and neuronal activity was recorded in the caudal postcentral gyrus where the somatosensory and visual signals converge. There we found a large number of bimodal neurones which appeared to code the schema of the hand. During tool use, their visual receptive fields were altered to include the entire length of the rake or to cover the expanded accessible space. These findings may represent neural correlates of the modified schema of the hand in which the tool was incorporated.
Evidence suggests homologies in parietofrontal circuits involved in object prehension among humans and monkeys. Likewise, tool use is known to induce functional reorganization of their visuotactile limb representations. Yet, humans are the only species for whom tool use is a defining and universal characteristic. Why? Comparative studies of chimpanzee tool use indicate that critical differences are likely to be found in mechanisms involved in causal reasoning rather than those implementing sensorimotor transformations. Available evidence implicates higher-level perceptual areas in these processes.
The aim of the present study was to determine whether kinematic parameters of the grasping motor act are controlled independently of the biomechanical features of the grasping effector. With this purpose in mind, we compared grasping movements performed naturally or using a tool. The tool consisted of two mechanical fingers whose opening and closing phases required squeezing (flexion of the biological fingers) and releasing (extension of the biological fingers) of a handle, respectively. The forces required for opening and closing the mechanical fingers were, respectively, greater and smaller than those used to grasp the objects naturally. In a control experiment the participants grasped with their thumb and index finger the same objects grasped with the tool. The kinematics of the mechanical and biological fingers as well as those of the arm in the two experiments were compared with each other. Grasping an object with the tool showed some kinematic characteristics strikingly similar to those of the natural grasp, whereas others were different. Like the natural grasp, the tool grasp consisted of a finger opening and closing phase. The scaling of both peak velocity of aperture and maximal aperture of the mechanical fingers as a function of object size was the same as that of the biological fingers. In contrast, the tool grasp differed from the natural one for the temporal aspects of the movement. Finally, the initial reach (i.e. the acceleration phase) was poorly influenced by the tool use whereas the final reach (i.e. the deceleration phase) was lengthened and more sensitive to object size. We discuss the results of the present study as being in favour of the hypothesis that some features of the grasp motor representation are coded in cortical areas independently of the used effector. In addition, they suggest a partial independence between the reach and the grasp components.
Recent findings from neurophysiology, neuropsychology and psychology have shown that peri-personal space is represented through an integrated multisensory processing. In humans, the interaction between peri-personal space representation and action execution can be revealed through the use of tools that, by extending the reachable space, modify the strength of visual-tactile extinction. We have previously shown that the peri-hand space whereby vision and touch are integrated can be expanded, and contracted, depending upon tool-use. Here, we show that these dynamic changes critically depend upon active tool-use, as they are not found after an equally long, but passive exposure to an elongated (hand+tool) body configuration. We also show that the extent of the peri-hand space elongation, as assessed at fixed far location (60 cm from the hand), varies according to the tool length such that a 30 cm long tool produced less elongation than a 60 cm long tool. This reveals for the first time that the distal border of elongated area is not sharply limited to the tool length, but extends beyond its physical size to include a peri-tool space whereby the strength of visual-tactile integration seems to fade. Remarkably, a similar amount of peri-hand space elongation was found when the effects of using a 30 cm long tool were compared with those produced by using a tool that was physically 60 cm long, but operationally 30 cm long. By dissociating with this 'hybrid' tool, the amount of space that is globally added to the hand (60 cm) from the one that is actually reachable (30 cm), we provide here the first evidence that the extent of peri-hand space elongation after tool use is tightly related to the functionally effective length of the tool, and not merely to its absolute length.