Figure 2 - uploaded by Herbert Heuer
Behavioral task of a 3-stroke movement and analysis. (a) The experimental setup. (b) The judgment task of the hand and cursor conditions. SP, T1, and T2 refer to the starting position, the first target, and the second target, respectively. The visual feedback of the 2nd stroke is rotated and displayed simultaneously with the hand movement. After each movement, participants make an explicit judgment of the hand or cursor direction at the end of the 2nd stroke. (c) Illustration of the relationship between the actual hand position (black circle) at the end of the 2nd stroke and the felt hand position (dotted outline circle), which is estimated from the shift of the hand position at the end of the 3rd stroke (solid outline circle) relative to the first target (T1, grey circle). Angle a' is calculated as the angular deviation of the implicit judgment of hand direction. doi:10.1371/journal.pone.0068471.g002

Source publication
Article
Full-text available
Understanding the interactions of visual and proprioceptive information in tool use is important, as it is the basis for learning the tool's kinematic transformation and thus for skilled performance. This study investigated how the CNS combines seen cursor positions and felt hand positions under a visuo-motor rotation paradigm. Young and older adult...

Contexts in source publication

Context 1
... experimental setup (Fig. 2a) was quite similar to the one used in our previous study [18]. Participants were seated at a table, on which a digitizer tablet (133 Hz sampling rate) and a monitor were placed. The monitor was covered by a large black circular screen with a semi-circular window (32 cm in diameter) in its center. The participants held a stylus with ...
Context 2
... performed three-stroke arm movements as described previously [18]. The first target (T1, 1.4 cm in diameter) was located in the center of the semi-circular window (Fig. 2b). The start position (SP, 1.2 cm in diameter) was located 3 cm below T1. A second target (T2, 1 cm in diameter) was presented at pseudo-random locations, ranging from −60° to +60° relative to the central location, on an invisible circle with a radius of 15 cm around T1. The participants made three-stroke movements from the SP to T1 (1 ...
Context 3
... cursor was displayed concurrently with the hand movements only during the 1st and the 2nd strokes. Only during the 2nd stroke were the motions of the cursor rotated relative to the directions of the hand movements. Participants had to adjust their movements so that the cursor on the monitor would move toward the remembered T2 location (Fig. 2b, 3rd panel). The remembered T2 was used instead of a visible T2, so that the participants focused on the visual feedback cursor rather than on a visual target during the 2nd stroke. There were twelve different rotation angles (clockwise [CW] direction: −30°, −25°, −20°, −15°, −10°, −5°; counter-clockwise [CCW] direction: 5°, ...
Context 4
... participants made the 3rd stroke without visual feedback. One second after completing the 3rd stroke, the participants were asked to judge either the hand or cursor direction at the end of the 2nd stroke (Fig. 2b, 4th panels). For the judgment of cursor direction (explicit cursor judgment, 4th panel, top), a short white line (width: 0.15 cm; length: 2 cm) was displayed. It marked the peripheral end of a radial line from T1 to the circumference of the invisible ring of 15 cm radius centered at T1. The radial line, and thus its visible peripheral ...
Context 5
... measure of the sensed direction of the hand. The indirect (implicit) measure of the sensed direction of the hand was computed as described previously [18]. The following three steps were taken. First, the angle (a in Fig. 2c) between the line connecting T1 with the end of the 2nd stroke (Line A) and the line connecting the end of the 2nd stroke with the end of the 3rd stroke (Line B) was measured. Second, Line B was shifted in parallel to the axes of the coordinate system until its end (i.e., the end of the 3rd stroke, white open circle in Fig. 2c) ...
Context 6
... angle (a in Fig. 2c) between the line connecting T1 with the end of the 2nd stroke (Line A) and the line connecting the end of the 2nd stroke with the end of the 3rd stroke (Line B) was measured. Second, Line B was shifted in parallel to the axes of the coordinate system until its end (i.e., the end of the 3rd stroke, white open circle in Fig. 2c) was in T1. Third, the location of the other end of the shifted line (Line B') served as an estimate of the felt hand position at the end of the 2nd stroke (dotted white circle in Fig. 2c). We used the angle a' between Line A and Line B' as an estimate of the rotation of the felt hand position relative to the actual hand ...
Context 7
... measured. Second, Line B was shifted in parallel to the axes of the coordinate system until its end (i.e., the end of the 3rd stroke, white open circle in Fig. 2c) was in T1. Third, the location of the other end of the shifted line (Line B') served as an estimate of the felt hand position at the end of the 2nd stroke (dotted white circle in Fig. 2c). We used the angle a' between Line A and Line B' as an estimate of the rotation of the felt hand position relative to the actual hand position, that is, as the implicit angular deviation. When Line B' (i.e., the 3rd stroke) was rotated in the CCW or CW direction relative to Line A (i.e., the 2nd stroke), the implicit angular deviation (a') ...
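The three steps described in these contexts reduce to a short vector computation: after shifting Line B so that the 3rd-stroke endpoint coincides with T1, the vector from T1 to the estimated felt hand position equals (end of 2nd stroke − end of 3rd stroke), and a' is the signed angle between this vector and Line A. A minimal sketch in Python; the function and variable names are illustrative, not taken from the original study's analysis code:

```python
import math

def implicit_angular_deviation(t1, end2, end3):
    """Estimate the implicit angular deviation a' in degrees (CCW positive).

    t1, end2, end3: (x, y) positions of the first target, the end of the
    2nd stroke, and the end of the 3rd stroke.
    """
    # Line A: vector from T1 to the end of the 2nd stroke.
    ax, ay = end2[0] - t1[0], end2[1] - t1[1]
    # Shifting Line B so its 3rd-stroke endpoint lies in T1 places its
    # other end at the felt hand position; the vector from T1 to that
    # felt position is simply end2 - end3 (Line B').
    bx, by = end2[0] - end3[0], end2[1] - end3[1]
    # Signed angle between Line A and Line B'.
    angle = math.degrees(math.atan2(by, bx) - math.atan2(ay, ax))
    # Wrap into (-180, 180].
    return (angle + 180.0) % 360.0 - 180.0
```

If the 3rd stroke lands exactly on T1, the felt and actual hand positions coincide and a' is zero; a 3rd-stroke endpoint shifted to one side of T1 yields a correspondingly signed deviation.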

Citations

... The resulting multisensory coding would obey the "reliability rule" (Colonius and Diederich, 2020): the weights in averaging the unisensory estimates would be proportional to their relative reliabilities. Several studies showed that the perceived positions of the hand (conveyed by proprioception) and the object to reach (conveyed by vision) are biased toward each other (e.g., Rand and Heuer, 2013; Debats et al., 2017). This perceptual attraction has been shown to be modulated by the reliabilities of both the hand position and the visual object position (Debats et al., 2017). ...
Article
Full-text available
A continuous task was used to determine how the reliability of on-line visual feedback during acquisition impacts motor learning. Participants performed a right-hand pointing task of a repeated sequence with a visual cursor that was either reliable, moderately unreliable, or largely unreliable. Delayed retention tests were administered 24 h later, as well as intermanual transfer tests (performed with the left hand). A visuospatial transfer test was performed with the same targets' sequence (same visuospatial configuration), while a motor transfer test was performed with the visual mirror of the targets' sequence (same motor patterns). Results showed that pointing was slower and long-term learning was disrupted in the largely unreliable visual cursor condition, compared with the reliable and moderately unreliable conditions. Also, analysis of the transfer tests revealed the classic pattern of better performance on visuospatial than on motor transfer for the reliable condition. However, we show here for the first time that this difference disappears when the cursor is moderately or largely unreliable. Interestingly, these results indicate a difference in the type of sequence coding depending on the reliability of the on-line visual feedback. This recourse to mixed coding opens up interesting perspectives, as mixed coding is known to promote better learning of motor sequences.
... These regularities hold for diverse interactions with the environment including virtual object manipulations [10][11][12] . For example, in a cursor control task, participants control a cursor displayed in the fronto-parallel plane by arm movements performed on a horizontal plane 13,14 . When the direction of the cursor movement is rotated relative to the direction of the hand movement, the felt hand direction is biased towards the currently seen cursor direction and, vice versa, the seen cursor direction is biased towards the currently felt hand direction (though to a lesser extent). ...
... This is, however, very unlikely, as we previously consistently observed a perceptual attraction of the felt finger distance by visual target size in virtual grasping under diverse conditions, including those where a purely visual effect could be ruled out (e.g. 34; see also [10][11][12][13][14] for related results). Nevertheless, with Exp. 3 we aimed to demonstrate that the size of the visual target affects the judgments of finger distance, as in Exp. 1 and 2, also when a purely visual effect can be ruled out. ...
Article
Full-text available
Changes in body perception often arise when observers are confronted with related yet discrepant multisensory signals. Some of these effects are interpreted as outcomes of sensory integration of various signals, whereas related biases are ascribed to learning-dependent recalibration of coding individual signals. The present study explored whether the same sensorimotor experience entails changes in body perception that are indicative of multisensory integration and those that indicate recalibration. Participants enclosed visual objects by a pair of visual cursors controlled by finger movements. Then either they judged their perceived finger posture (indicating multisensory integration) or they produced a certain finger posture (indicating recalibration). An experimental variation of the size of the visual object resulted in systematic and opposite biases of the perceived and produced finger distances. This pattern of results is consistent with the assumption that multisensory integration and recalibration had a common origin in the task we used.
... We investigated the combination of visuo-proprioceptive position information in a cursor-control task: participants made out-and-back movements in a semi-circular workspace with visual feedback provided on a monitor, and subsequently judged the most outward position of their unseen hand ('movement endpoint'). This task has served as a versatile paradigm to study both integration (e.g., 6,7 ) and recalibration (e.g., [8][9][10] ). Multisensory binding in this task is facilitated by the experienced coherence of hand and cursor trajectories, even when these are presented in different spatial planes 11 . ...
... For the former we expected a bias towards the synchronous visual stimulus in bimodal trials, reflecting integration, and in unimodal trials following repeated bimodal exposure, reflecting sensory recalibration. For the latter, we tentatively hypothesized that the position estimates of both visual stimuli are biased toward the movement endpoint, as in conditions with individual visual stimuli these are usually judged towards the position of the hand (e.g., 6,7,9 ). ...
Article
Full-text available
To organize the plethora of sensory signals from our environment into a coherent percept, our brain relies on the processes of multisensory integration and sensory recalibration. We here asked how visuo-proprioceptive integration and recalibration are shaped by the presence of more than one visual stimulus, hence paving the way to study multisensory perception under more naturalistic settings with multiple signals per sensory modality. We used a cursor-control task in which proprioceptive information on the endpoint of a reaching movement was complemented by two visual stimuli providing additional information on the movement endpoint. The visual stimuli were briefly shown, one synchronously with the hand reaching the movement endpoint, the other delayed. In Experiment 1, the judgments of hand movement endpoint revealed integration and recalibration biases oriented towards the position of the synchronous stimulus and away from the delayed one. In Experiment 2 we contrasted two alternative accounts: that only the temporally more proximal visual stimulus enters integration similar to a winner-takes-all process, or that the influences of both stimuli superpose. The proprioceptive biases revealed that integration—and likely also recalibration—are shaped by the superposed contributions of multiple stimuli rather than by only the most powerful individual one.
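The contrast drawn in this abstract between the two accounts can be written as two bias predictions. A hedged sketch in Python; the weights and names are free illustrative parameters, not fitted values from the study:

```python
def bias_superposition(p, v_sync, v_delayed, w_sync, w_delayed):
    """Superposition account: both visual stimuli contribute and their
    contributions add. A positive weight pulls the proprioceptive
    estimate p toward that stimulus; a negative weight (plausible for
    the delayed stimulus, per Experiment 1) pushes it away."""
    return w_sync * (v_sync - p) + w_delayed * (v_delayed - p)

def bias_winner_takes_all(p, v_sync, w_sync):
    """Winner-takes-all account: only the temporally more proximal
    (synchronous) stimulus biases the estimate."""
    return w_sync * (v_sync - p)
```

With w_delayed = 0 the superposition prediction reduces to the winner-takes-all prediction; the proprioceptive biases reported in Experiment 2 favored a nonzero contribution of the delayed stimulus, i.e., superposition.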
... To address this question, we investigated hand movements in a cursor-control task, which has previously served as a versatile paradigm to study multisensory integration (e.g., Rand and Heuer 2013;Debats et al. 2017a, b;Debats and Heuer 2018a) and recalibration (e.g., Synofzik et al. 2008;Henriques and Cressman 2012;Rand and Heuer 2019). We asked how perceptual estimates of a proprioceptive target (hand position) are biased by the presence of two (rather than one) visual stimuli presented during the movement. ...
Preprint
Full-text available
To organize the plethora of sensory signals from our environment into a coherent percept, our brain relies on the processes of multisensory integration and sensory recalibration. We here asked how visuo-proprioceptive integration and recalibration are shaped by the presence of more than one potentially relevant visual stimulus, hence paving the way to studying multisensory perception under more naturalistic settings with multiple signals per sensory modality. By manipulating the spatio-temporal correspondence between the hand position and two visual stimuli during a cursor-control task, we contrasted two alternative accounts: that only the temporally more proximal signal enters integration and recalibration similar to a winner-takes-all process, or that the influences of both visual signals superpose. Our results show that integration - and likely also recalibration - are shaped by the superposed contributions of multiple stimuli rather than by only individual ones.
... As outlined above, the neglect of the less relevant action effects in a given block, and the modulation of this neglect through the (in)compatibility of resident and remote effects, was independent of the judgment modality, i.e., whether visual or proprioceptive estimates had to be made. Additionally, though, we found no significant differences between the absolute estimation errors of the two judgment modalities in the experimental blocks, which is surprising given the common finding that people are usually much better at judging visual than proprioceptive positions [22,[27][28][29][30], though we did find a significant effect of this factor in the exploratory single-trial analyses. However, this might be due to the significant judgment modality × compatibility interaction, which indicated not only a compatibility effect for the proprioceptive judgments, but also that these were less precise than visual judgments in the incompatible condition while being more precise than visual judgments in the compatible condition. ...
Article
Full-text available
Movements of a tool typically diverge from the movements of the hand manipulating that tool, such as when operating a pivotal lever where tool and hand move in opposite directions. Previous studies suggest that humans are often unaware of the position or movements of their effective body part (mostly the hand) in such situations. It has been suggested that this might be due to a “haptic neglect” of bodily sensations to decrease the interference of representations of body and tool movements. However, in principle this interference could also be decreased by neglecting sensations regarding the tool and focusing instead on body movements. While in most tool use situations the tool-related action effects are task-relevant and thus suppression of body-related rather than tool-related sensations is more beneficial for successful goal achievement, we manipulated this task-relevance in a controlled experiment. The results showed that visual, tool-related effect representations can be suppressed just as proprioceptive, body-related ones in situations where effect representations interfere, given that task-relevance of body-related effects is increased relative to tool-related ones.
... The study concluded that age affects distinct explicit and implicit neural representations of hand direction, similar to the notion of distinct visual systems. 32 These results imply that there is a modifiable component to sensibility. If musicians can "learn" to be more sensitive by using certain areas of the fingers, children develop disparate sensation and older adults can "lose" sensation due to inactivity, the differences in sensibility between the 2 sides may be related to level of functional use. ...
Article
Sensorimotor testing is used to measure outcomes in surgery, to document results of treatment and rehabilitation, and to compare results between surgeons, therapists, and institutions. When performing sensorimotor testing, failure to address dominant-side differences may bias the evaluation of outcomes. This study evaluated the effect of hand dominance on outcomes testing performed on patients following surgery for distal radius fractures (DRF). We hypothesized that an injured dominant hand would perform differently than an injured non-dominant hand. This is a retrospective study of patients following DRF treated surgically and evaluated in therapy. The patients were evaluated at fixed intervals: initially, at 6 weeks, and at 3 months post-surgery. Testing included grip strength, monofilaments, static and moving 2-point discrimination, Moberg testing, and stereognosis. The 60 patients included 46 (76.6%) females. Age averaged 62.1 (standard deviation: 16.9) years, and 54 were right-handed (90%). There were differences between dominant and non-dominant injured hands in 2 of 9 tests of sensibility for each time period: little finger monofilament and Moberg testing initially, and moving 2-point discrimination in the little finger and monofilament testing of the thumb at 3 months. Both groups improved between the initial and 3-month evaluations without differences in the amount of improvement. Despite some significant differences in the applied tests between dominant and non-dominant injured hands, our results do not support correcting for hand dominance when using the described examinations to evaluate outcomes following DRF surgery.
... Furthermore, it was repeatedly observed in cursor-control tasks that judgements of the hand position were systematically biased toward a slightly deviating cursor position, and vice versa for the judgements of the cursor position (e.g. [2][3][4][5][6][7][8]). In the current study, we address the time window during which this link between actions and their distant visual consequences is established. ...
... Here, we explore the characteristics of the perceptual link between actions and their visual effects by means of a basic cursor-control paradigm of the same type as in previous studies (e.g. [3,5,8,9]). In this paradigm, participants perform out-and-back reaching movements with their occluded right hand on a horizontal digitizer tablet. ...
Article
Full-text available
Successful computer use requires the operator to link the movement of the cursor to that of his or her hand. Previous studies suggest that the brain establishes this perceptual link through multisensory integration, whereby the causality evidence that drives the integration is provided by the correlated hand and cursor movement trajectories. Here, we explored the temporal window during which this causality evidence is effective. We used a basic cursor-control task, in which participants performed out-and-back reaching movements with their hand on a digitizer tablet. A corresponding cursor movement could be shown on a monitor, yet slightly rotated by an angle that varied from trial to trial. Upon completion of the backward movement, participants judged the endpoint of the outward hand or cursor movement. The mutually biased judgements that typically result reflect the integration of the proprioceptive information on hand endpoint with the visual information on cursor endpoint. We here manipulated the time period during which the cursor was visible, thereby selectively providing causality evidence either before or after sensory information regarding the to-be-judged movement endpoint was available. Specifically, the cursor was visible either during the outward or backward hand movement (conditions Out and Back, respectively). Our data revealed reduced integration in the condition Back compared with the condition Out, suggesting that causality evidence available before the to-be-judged movement endpoint is more powerful than later evidence in determining how strongly the brain integrates the endpoint information. This finding further suggests that sensory integration is not delayed until a judgement is requested.
... Against this background it is not surprising that mutual perceptual biases are reported with a spatial separation of body-related and visual action effects when a conflict between them is introduced. For example, the direction of a hand movement performed on a horizontal plane attracts the perceived direction of a visual cursor displayed in the fronto-parallel plane and, vice versa, the movement direction of the cursor attracts the perceived direction of the hand movement when the visual movement direction is misaligned with the actual movement direction 4,5. This and similar findings suggest that body-related and visual action effects are integrated despite a clear spatial separation of their origins, in a similar fashion as in direct object interactions [6][7][8][9]. ...
Article
Full-text available
Statistically optimal integration of multimodal signals is known to take place in direct interactions with environmental objects. In the present study we tested whether the same mechanism is responsible for perceptual biases observed in a task, in which participants enclose visual objects by manually controlled visual cursors. We manipulated the relative reliability of visual object information and measured the impact of body-related information on object perception as well as the perceptual variability. The results were qualitatively consistent with statistically optimal sensory integration. However, quantitatively, the observed bias and variability measures systematically differed from the model predictions. This outcome indicates a compensatory mechanism similar to the reliability-based weighting of multisensory signals which could underlie action’s effects in visual perception reported in diverse context conditions.
... Some observations from tool use already suggest that sensory integration can occur in spite of a spatial separation of multimodal signals under certain conditions. When participants in a cursor-control task are asked to perform arm movements on a horizontal plane while misaligned visual feedback (i.e., a cursor) of the movement direction is displayed in the fronto-parallel plane, the judgment of the felt hand direction shifts toward the seen cursor direction and, vice versa, the judged cursor direction shifts toward the felt hand direction (Debats, Ernst, & Heuer, 2017a, 2017b; Rand & Heuer, 2013). Moreover, Takahashi and colleagues reported evidence for visual-haptic integration when the hand was offset with respect to the object being grasped, however only if a tool, such as pliers or tongs, was used for grasping (Takahashi, Diedrichsen, & Watt, 2009; Takahashi & Watt, 2014). ...
... However, the impact of the finger distance on rectangle judgments was substantially less than the impact of the object size on finger judgments. This observation is in line with previous reports (e.g., Kirsch et al., 2017;Rand & Heuer, 2013) and indicates that visual signals received more weight than the proprioceptive information under the present task conditions. The visual input appears thus to have been generally more reliable in spite of the implemented degradation of the visual stimulus, its short presentation, and proprioceptive information being available during perceptual judgments. ...
Article
This study examined the role of visual reliability and action relevance in mutual visual-proprioceptive attraction in a virtual grasping task. Participants initially enclosed either the width or the height of a visual rectangular object with two cursors controlled by the movements of the index finger and thumb. Then, either the height or the width of this object or the distance between the fingers was judged. The judgments of object’s size were attracted by the felt finger distance, and, vice versa, the judged finger distance was attracted by the size of the grasped object. The impact of the proprioceptive information on object judgments increased, whereas the impact of visual object information on finger judgments decreased when the reliability of the visual stimulus was reduced. Moreover, the proprioceptive bias decreased for the action-relevant stimulus dimension as compared with the action-irrelevant stimulus dimension. These results indicate sensory integration of spatially separated sensory signals in the absence of any direct spatial or kinematic relation between them. We therefore suggest that the basic principles of sensory integration apply to the broad research field on perceptual-motor interactions as well as to many virtual interactions with external objects.
... This is the case in cursor-control tasks (controlling a cursor on a monitor by moving a hand-held device), where proprioception refers to the position of the hand in the horizontal plane and vision to the position of the cursor on a roughly vertical monitor screen. In this type of task, estimates of discrepant positions of cursor and hand at the end of a movement, which result, for example, from a rotation of the cursor motion relative to the hand movement, are biased toward each other (Ladwig et al., 2012, 2013; Rand and Heuer, 2013; Kirsch et al., 2016, 2017; Debats et al., 2017a,b; Debats and Heuer, 2018a,b,c). According to Debats et al. (2017b), sensory coupling in a cursor-control task can be modeled as a weighted average of the unisensory estimates and obeys the "reliability rule" (Colonius and Diederich, 2017), as is typical for multisensory integration, where the weights in averaging the unisensory estimates are proportional to their relative reliabilities (e.g., Ernst and Bülthoff, 2004; Cheng et al., 2007). ...
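The reliability rule invoked here is inverse-variance weighting. A minimal sketch in Python, assuming one-dimensional position estimates; the names are illustrative, not from any of the cited studies' code:

```python
def reliability_weighted_estimate(hand_pos, cursor_pos, var_hand, var_cursor):
    """Combine a felt hand position and a seen cursor position by the
    'reliability rule': each unisensory estimate is weighted in
    proportion to its reliability, i.e., the inverse of its variance."""
    r_hand, r_cursor = 1.0 / var_hand, 1.0 / var_cursor
    w_hand = r_hand / (r_hand + r_cursor)
    w_cursor = 1.0 - w_hand
    return w_hand * hand_pos + w_cursor * cursor_pos
```

With equal variances the combined estimate falls midway between hand and cursor; as the proprioceptive variance grows, the estimate is drawn toward the cursor, which is the asymmetry of sensory coupling the rule predicts.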
... We designate the resulting measures of the biases as explicit because participants are aware of reporting perceived positions. As in previous studies (Rand and Heuer, 2013; see also Mikula et al., 2018), we also use an additional implicit measure for the assessment of the bias of the sensed hand position. We designate this measure as implicit because the bias is inferred from the movements performed by the participants without them being aware of judging a position. ...
... The reason for including the implicit measure of the bias of the sensed hand position is that in the past, we have identified some variables that affect the explicit and the implicit measure differently, e.g., aging (Rand and Heuer, 2013), a longer duration of the hand at the endpoint of the outward movement, which improves proprioceptive information (Rand and Heuer, 2016), a preceding adaptation to a visuomotor rotation (Rand and Heuer, 2017), and the proportion of trials in which the position of the hand or the cursor is judged (Rand and Heuer, 2018). These differences suggest that there might be two distinct representations of hand positions (e.g., Dijkerman and de Haan, 2007;De Vignemont, 2010), one serving perception (explicit) and the other one serving action or motor control (implicit). ...
Article
Full-text available
The brain generally integrates a multitude of sensory signals to form a unified percept. Even in cursor-control tasks, such as reaching while looking at rotated visual feedback on a monitor, visual information on cursor position and proprioceptive information on hand position are partially integrated (sensory coupling), resulting in mutual biases of the perceived positions of cursor and hand. Previous studies showed that the strength of sensory coupling (the sum of the mutual biases) depends on the experience of kinematic correlations between hand movements and cursor motions, whereas the asymmetry of sensory coupling (the difference between the biases) depends on the relative reliabilities (inverse of variability) of the hand-position and cursor-position estimates (reliability rule). Furthermore, the precision of movement control and of the perception of hand position is known to differ between hands (left, right) and workspaces (ipsilateral, contralateral), and so does the experience of kinematic correlations from daily-life activities. Thus, in the present study, we tested whether the strength and asymmetry of sensory coupling for the endpoints of reaches in a cursor-control task differ between the right and left hand and between ipsilateral and contralateral hemispace. No differences were found in the strength of sensory coupling between hands or between hemispaces. However, the asymmetry of sensory coupling was smaller in ipsilateral than in contralateral hemispace: in ipsilateral hemispace, the bias of the perceived hand position was reduced, accompanied by a smaller variability of the estimates. The variability of position estimates of the dominant right hand was also smaller than that of the non-dominant left hand, but this difference was not accompanied by a difference in the asymmetry of sensory coupling, a violation of the reliability rule, probably due to a stronger influence of visual information on right-hand movements.
According to these results, the long-term effects of the experienced kinematic correlation between hand movements and cursor motions on the strength of sensory coupling are generic and not specific to hemispaces or hands, whereas the effects of relative reliabilities on the asymmetry of sensory coupling are specific to hemispaces but not to hands.