Evaluation of response methods for the localization of nearby objects

Massachusetts Institute of Technology, Cambridge, Massachusetts, United States
Perception & Psychophysics, 2000, 62(1), 48–65. DOI: 10.3758/BF03212060


Four response methods for indicating the perceived locations of nearby objects were evaluated: the direct-location (DL) method,
where a response pointer is moved directly to the perceived location of the target; the large-head (LH) and small-head (SH)
methods, where the pointer is moved to the target location relative to a full-scale or half-scale manikin head; and the verbal
report (VR) method, where the spherical coordinates of the target location are indicated verbally. Measurements with a visual
target indicated that the DL method was relatively unbiased and considerably more accurate than the other methods, which were
all roughly equivalent. Correcting for bias improved accuracy in the LH, SH, and VR responses, but not to the level of the
uncorrected DL responses. Replacing the visual target with an acoustic stimulus approximately doubled the errors with the
DL response but indicated similar performance in the front and rear hemispheres. The results suggest that DL is the most appropriate
response method for close-range localization experiments.
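The abstract does not spell out how localization accuracy was quantified, so the following is only an illustrative sketch: it assumes targets and responses are expressed in egocentric spherical coordinates (azimuth, elevation, distance), converts them to Cartesian vectors, and reports the great-circle angular error together with a separate distance error. All names and numbers here are hypothetical examples, not values from the study.

```python
import numpy as np

def sph_to_cart(azimuth_deg, elevation_deg, distance_m):
    """Egocentric spherical coordinates -> Cartesian (x forward, y left, z up)."""
    az, el = np.radians(azimuth_deg), np.radians(elevation_deg)
    return distance_m * np.array([np.cos(el) * np.cos(az),
                                  np.cos(el) * np.sin(az),
                                  np.sin(el)])

def angular_error_deg(target_xyz, response_xyz):
    """Great-circle angle between the target and response directions, in degrees."""
    t = target_xyz / np.linalg.norm(target_xyz)
    r = response_xyz / np.linalg.norm(response_xyz)
    return np.degrees(np.arccos(np.clip(np.dot(t, r), -1.0, 1.0)))

# Hypothetical trial: target at 30 deg azimuth, 0 deg elevation, 0.5 m;
# response at 38 deg azimuth, 5 deg elevation, 0.6 m.
target = sph_to_cart(30, 0, 0.5)
response = sph_to_cart(38, 5, 0.6)
print(f"angular error: {angular_error_deg(target, response):.1f} deg")
print(f"distance error: {abs(0.6 - 0.5):.2f} m")
```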

  • "In this procedure, the electromagnetic position sensor at the end of the source wand was used to measure three reference locations on the surface of the subject's bite bar-immobilized head: the opening of the left ear canal, the opening of the right ear canal, and the tip of the nose. These positions were used to define an egocentric spherical coordinate system, with its origin at the midpoint of the left and right ears, its "horizontal plane" defined by the locations of the left and right ears and the nose, and its median plane perpendicular to the interaural axis and passing as close as possible to the tip of the nose (Brungart et al., 2000). Within each session, all of the subject's responses were measured in this egocentrically defined coordinate system." (One possible construction of this head-centred coordinate frame is sketched after this entry.)
    ABSTRACT: Evidence indicates that both visual and auditory input may be represented in multiple frames of reference at different processing stages in the nervous system. Most models, however, have assumed that unimodal auditory input is first encoded in a head-centered reference frame. The present work tested this conjecture by measuring the subjective auditory egocenter in six blindfolded listeners who were asked to match the perceived azimuths of sounds that were alternately played between a surrounding arc of far-field speakers and a hand-held point source located three different distances from the head. If unimodal auditory representation is head centered, then "isoazimuth" lines fitted to the matching estimates across distance should intersect near the midpoint of the interaural axis. For frontomedially arranged speakers, isoazimuth lines instead converged in front of the interaural axis for all listeners, often at a point between the two eyes. As far-field sources moved outside the visual field, however, the auditory egocenter location implied by the intersection of the isoazimuth lines retreated toward or even behind the interaural axis. Physiological and behavioral evidence is used to explain this change from an eye-centered to a head-centered auditory egocenter as a function of source laterality.
    The Journal of Neuroscience: The Official Journal of the Society for Neuroscience, 10/2004, 24(35), 7640–7647. DOI: 10.1523/JNEUROSCI.0737-04.2004
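The excerpt quoted in the entry above describes deriving a head-centred coordinate system from three tracked landmarks (left ear canal, right ear canal, nose tip). The sketch below shows one straightforward construction consistent with that description: origin at the interaural midpoint, up-axis normal to the plane through the ears and nose, forward axis toward the nose. It is an assumed illustration, not the authors' exact procedure; in particular, "passing as close as possible to the tip of the nose" suggests a fitting step this sketch does not perform.

```python
import numpy as np

def head_frame(left_ear, right_ear, nose_tip):
    """Build a head-centred frame from three tracked landmarks.

    Returns (origin, R), where the rows of R are the forward (x), left (y),
    and up (z) unit axes; a tracker point p maps into head coordinates as
    R @ (p - origin).
    """
    left_ear, right_ear, nose_tip = map(np.asarray, (left_ear, right_ear, nose_tip))
    origin = 0.5 * (left_ear + right_ear)        # midpoint of the interaural axis
    y_axis = left_ear - right_ear                # interaural axis, pointing left
    y_axis = y_axis / np.linalg.norm(y_axis)
    to_nose = nose_tip - origin                  # roughly forward, lies in the ear/nose plane
    z_axis = np.cross(to_nose, y_axis)           # normal of the plane through ears and nose
    z_axis = z_axis / np.linalg.norm(z_axis)
    x_axis = np.cross(y_axis, z_axis)            # forward axis, within the ear/nose plane
    return origin, np.vstack([x_axis, y_axis, z_axis])

def to_spherical(p_head):
    """Head-frame Cartesian point -> (azimuth deg, elevation deg, distance m)."""
    x, y, z = p_head
    distance = float(np.linalg.norm(p_head))
    azimuth = np.degrees(np.arctan2(y, x))       # positive toward the left ear
    elevation = np.degrees(np.arcsin(z / distance))
    return azimuth, elevation, distance

# Hypothetical tracker readings in metres: left ear, right ear, nose tip,
# followed by one response-pointer position expressed in head coordinates.
origin, R = head_frame([0.00, 0.09, 0.00], [0.00, -0.09, 0.00], [0.10, 0.00, -0.02])
pointer_head = R @ (np.array([0.30, 0.20, 0.10]) - origin)
print(to_spherical(pointer_head))
```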
  • ABSTRACT: We distinguish two representations of visual space: a cognitive representation drives perception, and a sensorimotor representation controls visually guided behavior. Spatial values in the two representations are separated with the Roelofs effect: a target within an off-center frame appears biased in a location opposite the direction of the frame. The effect appears for a verbal measure (cognitive) but not for a jab at the target (sensorimotor). A 2-s response delay induces a Roelofs effect in the motor measure, showing the limit of motor memory. Motor error is not correlated with reaction time. Subjects could strike one of two identical targets, a process involving choice, without intrusion of a Roelofs effect, showing that the sensorimotor system can use its own coordinates even when a cognitive choice initiates the motor processing.
    Vision Research, 02/2000, 40(25), 3539–3552. DOI: 10.1016/S0042-6989(00)00193-0
  • ABSTRACT: Although virtual audio displays are capable of realistically simulating relatively distant sound sources, they are not yet able to accurately reproduce the spatial auditory cues that occur when sound sources are located near the listener's head. Researchers have long recognized that the binaural difference cues that dominate auditory localization are independent of distance beyond 1 m but change systematically with distance when the source approaches within 1 m of the listener's head. Recent research has shown that listeners are able to use these binaural cues to determine the distances of nearby sound sources. However, technical challenges in the collection and processing of near-field head-related transfer functions (HRTFs) have thus far prevented the construction of a fully functional near-field audio display. This paper summarizes the current state of research in the localization of nearby sound sources and outlines the technical challenges involved in the creation of a near-field virtual audio display. The potential applications of near-field displays in immersive virtual environments and multimodal interfaces are also discussed.
    Presence: Teleoperators and Virtual Environments, 02/2002, 11(1), 93–106. DOI: 10.1162/105474602317343686
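As a purely geometric illustration of why binaural cues become distance-dependent inside roughly 1 m, the sketch below models the two ears as point receivers about 18 cm apart and computes the interaural differences that follow from path length alone: the level difference produced by spherical (1/r) spreading grows sharply as a lateral source approaches the head, while the interaural time difference stays nearly constant. This ignores head shadowing and diffraction, which contribute much of the real high-frequency ILD, so the numbers are illustrative only.

```python
import numpy as np

# The ears are modelled as two point receivers ~18 cm apart; head shadowing and
# diffraction are ignored, so only the spherical-spreading (1/r) part of the ILD
# and the purely geometric part of the ITD are captured.
EAR_LEFT = np.array([0.0, 0.09, 0.0])    # metres; +y points toward the left ear
EAR_RIGHT = np.array([0.0, -0.09, 0.0])
SPEED_OF_SOUND = 343.0                   # m/s

def geometric_cues(source_xyz):
    """Return (ILD in dB, ITD in microseconds) for a point source at source_xyz."""
    d_left = np.linalg.norm(np.asarray(source_xyz) - EAR_LEFT)
    d_right = np.linalg.norm(np.asarray(source_xyz) - EAR_RIGHT)
    ild_db = 20.0 * np.log10(d_right / d_left)            # positive: louder at the nearer left ear
    itd_us = 1e6 * (d_right - d_left) / SPEED_OF_SOUND    # positive: right-ear arrival lags
    return ild_db, itd_us

# Source held at 45 degrees azimuth on the left while its distance shrinks from 2 m to 0.25 m.
for r in (2.0, 1.0, 0.5, 0.25):
    src = r * np.array([np.cos(np.radians(45.0)), np.sin(np.radians(45.0)), 0.0])
    ild, itd = geometric_cues(src)
    print(f"r = {r:4.2f} m   ILD ~ {ild:5.2f} dB   ITD ~ {itd:6.1f} us")
```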

