The brain representation of the body, called the body schema, is susceptible to plasticity. For instance, subjects experiencing a rubber hand illusion develop a sense of ownership of a mannequin hand when they view it being touched while tactile stimuli are simultaneously applied to their own hand. Here, the cortical basis of such an embodiment was investigated through concurrent recordings from primary somatosensory (i.e., S1) and motor (i.e., M1) cortical neuronal ensembles while two monkeys observed an avatar arm being touched by a virtual ball. Following a period when virtual touches occurred synchronously with physical brushes of the monkeys' arms, neurons in S1 and M1 started to respond to virtual touches applied alone. Responses to virtual touch occurred 50 to 70 ms later than to physical touch, consistent with the involvement of polysynaptic pathways linking the visual cortex to S1 and M1. We propose that S1 and M1 contribute to the rubber hand illusion and that, by taking advantage of plasticity in these areas, patients may assimilate neuroprosthetic limbs as parts of their body schema.
"Regardless of the weight assigned to vision and proprioception by the brain, the interaction between the two sensory inputs may not be a simple algebraic sum, not least because of the different time-periods the two inputs need to access the brain, as shown by the different latencies of their primary components in cortical evoked potentials (Schieppati and Ducati, 1984; Bodis-Wollner, 1992; Shokur et al., 2013) or to reach consciousness (Barnett-Cowan and Harris, 2009). Further, the ultimate functional effect of either input, or of their interaction over time, depends on the particular balance or movement constraints at hand."
ABSTRACT: Maintaining equilibrium is basically a sensorimotor integration task. The central nervous system (CNS) continually and selectively weights, and rapidly integrates, sensory inputs from multiple sources and coordinates multiple outputs. The weighting process is based on the availability and accuracy of the afferent signals at a given instant, on the time required to process each input, and possibly on the plasticity of the relevant pathways. The likelihood that sensory inflow changes while balancing under static or dynamic conditions is high, because subjects can pass from a dark to a well-lit environment, or from tactile-guided stabilization to a loss of haptic inflow. This review presents recent data on the temporal events accompanying sensory transitions, on which basic information is fragmentary. The processing time from a sensory shift to a new steady state includes the time to (a) subtract or integrate sensory inputs; (b) move from an allocentric to an egocentric reference, or vice versa; and (c) recalibrate motor activity in time and amplitude to the new sensory set. We present examples of the integration of posture-stabilizing information, and of the corresponding sensorimotor time-intervals, while allowing or occluding vision or adding or subtracting tactile information. These intervals are short, on the order of 1-2 s across postural conditions, modalities, and deliberate or passive shifts. They are slightly longer for haptic than for visual shifts, and slightly shorter on withdrawal than on addition of a stabilizing input, and for deliberate than for unexpected shifts. The delays are shortest (for haptic shifts) in blind subjects.
Since automatic balance stabilization may be vulnerable to sensory-integration delays and to interference from concurrent cognitive tasks in patients with sensorimotor problems, insight into the processing time for balance control is a critical step in the design of new balance- and locomotion-training devices.
Frontiers in Systems Neuroscience 10/2014; 8:190. DOI:10.3389/fnsys.2014.00190
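The reliability-based weighting described in the abstract above is commonly formalized as inverse-variance (maximum-likelihood) cue combination, in which each sensory estimate is weighted by its precision. The sketch below is purely illustrative; the numbers and the function name are made up for this example and are not taken from the study.

```python
# Illustrative sketch of reliability-weighted (inverse-variance) cue
# combination, a standard model of how the CNS might weight concurrent
# sensory inputs. All values are hypothetical.

def fuse(estimates, variances):
    """Combine noisy sensory estimates, weighting each by its reliability
    (inverse variance). Returns the fused estimate and its variance."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    fused = sum(w * x for w, x in zip(weights, estimates)) / total
    return fused, 1.0 / total

# Example: vision reports the hand at 10.0 cm (precise, variance 1.0);
# proprioception reports 14.0 cm (noisy, variance 4.0).
pos, var = fuse([10.0, 14.0], [1.0, 4.0])
print(round(pos, 2), round(var, 2))  # → 10.8 0.8
```

Note that the fused estimate lands closer to the more reliable cue, and its variance is lower than either input's, which is why such weighting is advantageous for the balancing tasks the review discusses.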
"Recently, intracortical recordings in primates have revealed that S1 and M1 are involved in the plastic processes responsible for the embodiment of a virtual hand. The time delay of those responses was compatible with an indirect activation of the primary sensorimotor areas by visual cortices, probably through the frontoparietal cortical circuitry (Shokur et al., 2013)."
ABSTRACT: Today, the anthropomorphism of tools and the development of neural interfaces require reconsidering the concept of human-tool interaction in the framework of human augmentation. This review analyzes the plastic processes the brain undergoes when it comes into contact with augmenting artificial sensors and effectors and, conversely, the changes that the use of external augmenting devices produces in the brain. Hitherto, few studies have investigated the neural correlates of augmentation, but clues can be borrowed from logically related paradigms: sensorimotor training, cognitive enhancement, cross-modal plasticity, sensorimotor functional substitution, and the use and embodiment of tools. Augmentation modifies the function and structure of a number of areas; e.g., primary sensory cortices reshape their receptive fields to become sensitive to novel inputs, and motor areas adapt the firing rates of their neuroprosthesis representation to refine kinematics. As with normal motor outputs, the learning process recruits motor and premotor cortices, and the acquisition of proficiency decreases attentional recruitment, focuses activity on sensorimotor areas, and increases the basal-ganglia drive on the cortex. Augmentation relies deeply on the frontoparietal network. In particular, the premotor cortex is involved in learning to control an external effector and holds the motor representation of the tool, while the intraparietal sulcus extracts its visual features. In these areas, multisensory integration neurons enlarge their receptive fields to embody supernumerary limbs. Operating an anthropomorphic neuroprosthesis requires the mirror system to understand the meaning of the action, the cerebellum to form its internal model, and the insula for its interoception.
In conclusion, anthropomorphic sensorized devices can provide the critical sensory afferences needed to advance the exploitation of tools through their embodiment, reshaping the body representation and the sense of self.
Frontiers in Systems Neuroscience 06/2014; 8(109). DOI:10.3389/fnsys.2014.00109
"Rhode's and our studies indicate that the proprioceptively sensed position of one's own hand can be adjusted by visual input of a hand, and that this effect is enhanced when tactile stimulation is delivered synchronously with the visual input and attenuated when it is delivered asynchronously. Similarly, responses of monkey sensorimotor (M1 and S1) neurons to visual-only stimulation of a virtual hand that had repeatedly received synchronous visuo-tactile stimulation were reported recently (Shokur et al., 2013). This process can be unconscious and is not a sufficient, though perhaps a prerequisite, condition for the subjective feeling of ownership of the hand."
ABSTRACT: In the rubber-hand illusion (RHI), the subject feels the visually presented tactile stimulation of an artificial (rubber) hand as his or her own tactile sensation; the illusion is induced by stimulating the rubber and real hands synchronously. Our previous study showed that the RHI was greatly reduced as the visual-feedback delay of the tactile stimulation of the hand became longer. In the present study, we investigated the relationship between the attenuation of the RHI and the detection of the delay in two experiments: (1) an RHI experiment and (2) a visuotactile asynchrony-detection experiment, in which subjects underwent tactile stimulation of their hand and judged whether the visual feedback was consistent with the touch sensation. In line with our previous study, the RHI was significantly reduced as the delay lengthened. Interestingly, proprioceptive drift declined linearly as the delay increased, whereas the delay-detection rate was better fitted by a non-linear (logistic) function. The illusion score showed an intermediate pattern. We suggest that proprioceptive drift is relevant to the processing of the body schema, whereas delay detection and the subjective feeling of the RHI are more related to body-image processing.
Neuroscience Research 05/2014; 85. DOI:10.1016/j.neures.2014.04.009
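The contrast reported in the abstract above, a linear decline of proprioceptive drift versus a logistic (sigmoid) delay-detection rate, can be made concrete with simple model curves. The sketch below is illustrative only; the function names and every parameter value (initial drift, slope, steepness, threshold) are hypothetical and not taken from the study.

```python
# Illustrative model curves for the two qualitatively different patterns
# reported in the abstract. All parameter values are made up.
import math

def drift(delay_ms, drift0=25.0, slope=0.04):
    """Proprioceptive drift (mm): declines linearly with visual delay."""
    return max(0.0, drift0 - slope * delay_ms)

def detection_rate(delay_ms, k=0.02, threshold_ms=250.0):
    """Delay-detection probability: logistic (sigmoid) in the delay,
    with 50% detection at threshold_ms."""
    return 1.0 / (1.0 + math.exp(-k * (delay_ms - threshold_ms)))

# Equal delay steps give equal drift decrements (linear) but unequal
# changes in detection rate (non-linear).
for d in (0, 150, 300, 600):
    print(d, round(drift(d), 1), round(detection_rate(d), 2))
```

The key qualitative point survives any choice of parameters: the linear curve changes by the same amount per unit delay, while the logistic curve is flat at short delays, steepest near threshold, and saturates at long delays.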