Humans routinely use both of their hands to gather information about the shape and texture of objects. Yet how the brain combines haptic information from the two hands to achieve a unified percept remains unclear. This study systematically measured the haptic precision of humans exploring a virtual curved object contour with one or both hands to determine whether the brain integrates haptic information from the two hemispheres. Bayesian perception theory predicts that redundant information from both hands should improve haptic estimates. Thus, exploring an object with two hands should yield haptic precision that is superior to unimanual exploration. A bimanual robotic manipulandum passively moved the hands of 20 blindfolded, right-handed adult participants along virtual curved contours. In a forced-choice task, subjects indicated which of two stimuli of different curvature was more "curved". Contours were explored uni- or bimanually at two orientations (toward or away from the body midline). Respective psychophysical discrimination thresholds were computed. First, subjects showed a tendency for one hand to be more sensitive than the other, with most subjects exhibiting a left-hand bias. Second, bimanual thresholds were mostly within the range of the corresponding unimanual thresholds and were not predicted by a maximum-likelihood estimation (MLE) model. Third, bimanual curvature perception tended to be biased toward the motorically dominant hand, not toward the haptically more sensitive left hand. Two-handed exploration did not necessarily improve haptic sensitivity. We found no evidence that haptic information from both hands is integrated via an MLE mechanism. Rather, the results are indicative of a process of "sensory selection", in which information from the dominant right hand is used, even though the left, nondominant hand may yield more precise haptic estimates.
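The MLE benchmark against which the bimanual thresholds were compared can be sketched numerically: under optimal integration, the variances of the two unimanual estimates combine by inverse-variance weighting, so the predicted bimanual threshold is always at or below the better unimanual one. A minimal Python illustration (the threshold values below are hypothetical, not data from the study):

```python
import math

def mle_bimanual_threshold(sigma_left: float, sigma_right: float) -> float:
    """Predicted bimanual threshold under maximum-likelihood integration:
    1/sigma_b^2 = 1/sigma_L^2 + 1/sigma_R^2, so the combined estimate is
    never less precise than the better unimanual estimate."""
    return math.sqrt((sigma_left ** 2 * sigma_right ** 2)
                     / (sigma_left ** 2 + sigma_right ** 2))

def mle_weights(sigma_left: float, sigma_right: float):
    """Reliability (inverse-variance) weights assigned to each hand."""
    w_left = sigma_right ** 2 / (sigma_left ** 2 + sigma_right ** 2)
    return w_left, 1.0 - w_left

# Hypothetical unimanual discrimination thresholds (arbitrary curvature units):
print(mle_bimanual_threshold(0.8, 1.2))  # falls below both unimanual thresholds
print(mle_weights(0.8, 1.2))             # more weight on the more precise hand
```

The study's finding that empirical bimanual thresholds fell within, rather than below, the unimanual range is what rules out this prediction.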
"Studies of information processing by the nervous system have found Bayesian models to often be consistent with empirical data, for a rather broad set of behaviours that includes infant cognition, language, face perception, rhythm perception, haptics, and multi-signal integration. It is thus an important current theory for sensory-motor neuroscience and motor control."
ABSTRACT: Information about the position of an object that is held in both hands, such as a golf club or a tennis racquet, is transmitted to the human central nervous system from peripheral sensors in both left and right arms. How does the brain combine these two sources of information? Using a robot to move participants' passive limbs, we performed psychophysical estimates of proprioceptive function for each limb independently, and again when subjects grasped the robot handle with both arms. We compared empirical estimates of bimanual proprioception to several models from the sensory integration literature: some that propose a combination of signals from the left and right arms (such as a Bayesian maximum-likelihood estimate), and some that propose using unimanual signals alone. Our results are consistent with the hypothesis that the nervous system both has knowledge of, and uses, the limb with the best proprioceptive acuity for bimanual proprioception. Surprisingly, a Bayesian model that postulates optimal combination of sensory signals could not predict empirically observed bimanual acuity. These findings suggest that while the central nervous system seems to have information about the relative sensory acuity of each limb, it uses this information in a rather rudimentary fashion, essentially ignoring information from the less reliable limb.
Journal of Neurophysiology 12/2013; 111(6). DOI:10.1152/jn.00537.2013
"In addition, no significant effect was observed in the Handedness × Hand × Task type three-way interaction, indicating that the non-dominant arm/hemisphere superiority held irrespective of whether the task was performed unimanually or bimanually. This finding contradicts the prediction that sensory selection or sensory gating would be biased toward the dominant side when individuals are required to perform bimanual pinch movement discrimination tasks concurrently, and that the hemisphere advantage observed in unimanual tasks would not extend to different bimanual tasks. Rather, the results here support the Goble et al. hypothesis of functional differences between the preferred and non-preferred limbs during bilateral tasks."
ABSTRACT: It has been proposed that asymmetry between the upper limbs in the utilization of proprioceptive feedback arises from functional differences in the roles of the preferred and non-preferred hands during bimanual tasks. The present study investigated unimanual and bimanual proprioceptive performance in right- and left-handed young adults with an active finger pinch movement discrimination task. With visual information removed, participants were required to make absolute judgments about the extent of pinch movements made to physical stops, either by one hand, or by both hands concurrently, with the sequence of presented movement extents varied randomly. Discrimination accuracy scores were derived from participants' responses using non-parametric signal detection analysis. Consistent with previous findings, a non-dominant hand/hemisphere superiority effect was observed, where the non-dominant hands of right- and left-handed individuals performed overall significantly better than their dominant hands. For all participants, bimanual movement discrimination scores were significantly lower than scores obtained in the unimanual task. However, the magnitude of the performance reduction, from the unimanual to the bimanual task, was significantly greater for left-handed individuals. The effect whereby bimanual proprioception was disproportionately affected in left-handed individuals could be due to enhanced neural communication between hemispheres in left-handed individuals leading to less distinctive separation of information obtained from the two hands in the cerebral cortex.
ABSTRACT: We often handle an object with both of our hands. The information from the two hands has to be combined to arrive at a single percept of the object. Research on multi-sensory perception has shown that redundant information between the senses is integrated such that the combined percept is more precise than either of the two individual inputs. However, while bimanual information can be redundant, it is not necessarily so, because the two hands usually touch different parts of the same object. To investigate whether there is a difference in precision between unimanual and bimanual information, we asked participants to discriminate stiffness unimanually as well as bimanually. Our results clearly show that bimanual perception of stiffness was more precise than unimanual perception. The precision of the bimanual percept was in agreement with the precision predicted from combining the two unimanual inputs in a statistically optimal fashion.
Proceedings of the 2012 international conference on Haptics: perception, devices, mobility, and communication - Volume Part II; 06/2012