Article

Expanding the primate body schema in sensorimotor cortex by virtual touches of an avatar.

School of Engineering, Institute of Microengineering, Ecole Polytechnique Fédérale de Lausanne, 1015 Lausanne, Switzerland.
Proceedings of the National Academy of Sciences (Impact Factor: 9.81). 08/2013; DOI: 10.1073/pnas.1308459110
Source: PubMed

ABSTRACT The brain representation of the body, called the body schema, is susceptible to plasticity. For instance, subjects experiencing a rubber hand illusion develop a sense of ownership of a mannequin hand when they view it being touched while tactile stimuli are simultaneously applied to their own hand. Here, the cortical basis of such an embodiment was investigated through concurrent recordings from primary somatosensory (i.e., S1) and motor (i.e., M1) cortical neuronal ensembles while two monkeys observed an avatar arm being touched by a virtual ball. Following a period when virtual touches occurred synchronously with physical brushes of the monkeys' arms, neurons in S1 and M1 started to respond to virtual touches applied alone. Responses to virtual touch occurred 50 to 70 ms later than to physical touch, consistent with the involvement of polysynaptic pathways linking the visual cortex to S1 and M1. We propose that S1 and M1 contribute to the rubber hand illusion and that, by taking advantage of plasticity in these areas, patients may assimilate neuroprosthetic limbs as parts of their body schema.
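
The 50-70 ms latency difference reported above is the kind of quantity extracted from peri-stimulus time histograms (PSTHs). As a minimal, hypothetical sketch in Python (the abstract does not describe the actual analysis pipeline, so all names and parameters below are assumptions), response onset can be taken as the first post-stimulus bin whose firing rate exceeds a baseline-derived threshold:

import numpy as np

def response_latency(spike_times_ms, event_times_ms, bin_ms=5.0,
                     window_ms=200.0, baseline_ms=100.0, n_sd=3.0):
    # Build a peri-stimulus time histogram around the touch events.
    edges = np.arange(-baseline_ms, window_ms + bin_ms, bin_ms)
    counts = np.zeros(edges.size - 1)
    for t in event_times_ms:
        counts += np.histogram(spike_times_ms - t, bins=edges)[0]
    rate = counts / (len(event_times_ms) * bin_ms / 1000.0)  # spikes/s
    # Threshold: pre-stimulus mean plus n_sd standard deviations.
    baseline = rate[edges[:-1] < 0]
    threshold = baseline.mean() + n_sd * baseline.std()
    above = np.nonzero((edges[:-1] >= 0) & (rate > threshold))[0]
    return edges[above[0]] if above.size else None  # onset latency in ms

# Hypothetical usage: compare onsets for physical vs. virtual touch events;
# the paper reports virtual-touch responses lagging by roughly 50-70 ms.
# delta = (response_latency(unit_spikes, virtual_onsets)
#          - response_latency(unit_spikes, physical_onsets))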

  • ABSTRACT: Bodily illusions have been used to study bodily self-consciousness and to disentangle its components, among them the sense of ownership and self-location. Congruent multimodal correlations between the real body and a fake humanoid body can trigger the illusion that the fake body is one's own and/or disrupt the unity between the perceived self-location and the position of the physical body. However, the extent to which changes in self-location entail changes in ownership is still a matter of debate. Here we address this problem with the support of immersive virtual reality. Congruent visuotactile stimulation was delivered to healthy participants to trigger full-body illusions from different visual perspectives, each resulting in a different degree of overlap between the real and virtual body. Changes in ownership and self-location were measured with novel self-posture assessment tasks and with an adapted version of the cross-modal congruency task. We found that, despite their strong coupling, self-location and ownership can be selectively altered: self-location was affected when participants took a third-person perspective over the virtual body, whereas ownership of the virtual body was experienced only in the conditions with total or partial overlap. Thus, when the virtual body was seen in far extrapersonal space, changes in self-location were not coupled with changes in ownership; when a partial spatial overlap was present, ownership was typically experienced together with an enhanced change in perceived self-location. We discuss these results in the context of current knowledge of the multisensory integration mechanisms contributing to self-body perception. We argue that changes in perceived self-location are associated with the dynamic representation of peripersonal space encoded by visuotactile neurons, whereas visuoproprioceptive neuronal populations are a driving trigger of full-body ownership illusions.
    Frontiers in Human Neuroscience 09/2014; 8(September):1-19. · 2.90 Impact Factor
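
    As an aside on the dependent measure above: the cross-modal congruency task yields a cross-modal congruency effect (CCE), conventionally computed as the reaction-time cost of incongruent relative to congruent visual distractors, with a larger CCE commonly read as stronger coupling between the seen and felt body. A minimal sketch in Python, assuming per-trial reaction-time arrays whose names and format are hypothetical:

    import numpy as np

    def crossmodal_congruency_effect(rt_congruent_ms, rt_incongruent_ms):
        # CCE = mean RT on incongruent trials minus mean RT on congruent trials,
        # both in milliseconds; a larger value indicates a stronger effect.
        return np.mean(rt_incongruent_ms) - np.mean(rt_congruent_ms)

    # Hypothetical usage with per-condition trial data:
    # cce_overlap = crossmodal_congruency_effect(rt_con_overlap, rt_inc_overlap)
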
  • ABSTRACT: The experience of ourselves as an embodied agent with a first-person perspective is referred to as the 'bodily self'. We present a selective overview of relevant clinical and experimental studies.
    Current Opinion in Neurology 10/2014; · 5.73 Impact Factor
  • ABSTRACT: Maintaining equilibrium is basically a sensorimotor integration task. The central nervous system (CNS) continually and selectively weights and rapidly integrates sensory inputs from multiple sources and coordinates multiple outputs. The weighting process is based on the availability and accuracy of afferent signals at a given instant, on the time period required to process each input, and possibly on the plasticity of the relevant pathways. The likelihood that sensory inflow changes while balancing under static or dynamic conditions is high, because subjects can pass from a dark to a well-lit environment or from tactile-guided stabilization to loss of haptic inflow. This review article presents recent data on the temporal events accompanying sensory transitions, on which basic information is fragmentary. The processing time from sensory shift to reaching a new steady state includes the time to (a) subtract or integrate sensory inputs; (b) move from an allocentric to an egocentric reference, or vice versa; and (c) adjust the calibration of motor activity in time and amplitude to the new sensory set. We present examples of the integration of posture-stabilizing information, and of the respective sensorimotor time intervals, while allowing or occluding vision or adding or subtracting tactile information. These intervals are short, on the order of 1-2 s, across postural conditions, modalities, and deliberate or passive shifts. They are slightly longer for haptic than for visual shifts, slightly shorter on withdrawal than on addition of a stabilizing input, and shorter for deliberate than for unexpected shifts. The delays are shortest (for haptic shifts) in blind subjects. Since automatic balance stabilization may be vulnerable to sensory-integration delays and to interference from concurrent cognitive tasks in patients with sensorimotor problems, insight into the processing time for balance control represents a critical step in the design of new balance- and locomotion-training devices.
    Frontiers in Systems Neuroscience 10/2014; 8:190.
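
    The 1-2 s intervals summarized above suggest a simple operationalization: detect when a sway-variability trace settles to its new steady state after the sensory shift. A minimal sketch, assuming a center-of-pressure trace and estimator details that are illustrative rather than taken from the review:

    import numpy as np

    def transition_time_s(cop, fs_hz, shift_idx, win_s=0.5, tol=0.2):
        # Sliding-window sway variability after the sensory shift at shift_idx.
        win = int(win_s * fs_hz)
        post = cop[shift_idx:]
        sway = np.array([post[i:i + win].std() for i in range(post.size - win)])
        steady = sway[-win:].mean()  # late value taken as the new steady state
        within = np.abs(sway - steady) <= tol * steady
        # First window from which variability stays within tolerance of steady state.
        for i in range(within.size):
            if within[i:].all():
                return i / fs_hz  # seconds from shift to new steady state
        return None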