About
34 Publications · 2,877 Reads · 336 Citations
Additional affiliations
October 2005 - July 2012
September 1999 - April 2005
Publications (34)
In this chapter, the history of our understanding of the central auditory system is summarized, starting with the early ideas of the Greeks and Romans and proceeding through the shaping of ideas to the 1960s. The history of studies of the central auditory system involves a combination of theoretical conceptualizations, anatomical studies, physiolog...
The limited spectral and temporal resolution of cochlear implants (CIs) negatively affects speech recognition of CI users in background noise and leads to increased listening effort. Studies have suggested that increased listening effort may result in mental fatigue, stress, withdrawal from social communication, and degraded quality of life. After...
In the real world, environmental objects are often both seen and heard. Visual stimuli can influence the accuracy, variability, and timing of the listener’s responses to auditory targets. One well-known example of this visual influence is the Ventriloquist Illusion, or visual capture of the perceived sound source location....
Most of what is known about sound source localization in reverberant environments rests upon knowledge of how the auditory system processes a pair of brief stimuli presented over headphones that simulate a direct sound and a single reflection. In everyday environments, however, sound sources often emit relatively continuou...
One interesting aspect of sound-source localization in reverberant environments is that different stimuli (e.g., speech versus noise) can elicit different spatial sensations at the same delays between the direct sound and its reflections. For example, a given delay between a single simulated direct sound and a delayed copy...
Yost and Brown (2013, JASA 133) investigated the ability of listeners to localize two simultaneously presented, independent noises presented over loudspeakers from different locations. These experiments demonstrated that SAM noises that were out of phase at two spatially-separated loudspeakers led to better localization performance than when the SA...
Yost and Brown [JASA 133 (2013)] investigated the ability of listeners to localize two simultaneously presented, independent noises presented over loudspeakers from different locations. These experiments demonstrated that SAM noises that were out of phase at two spatially separated loudspeakers led to better localization p...
The chapter briefly reviews the current information about the various auditory-spatial cues used for localizing sound sources in a three-dimensional space. Then the chapter reviews the possible head-position cues that might be used for sound source localization. Finally, an explanation of the integration of auditory-spatial and head-position cues i...
In the spring of 2007, after 30 years of Dr. Yost's leadership at the Parmly Hearing Institute at Loyola University Chicago, Dr. Sid Bacon invited him to join the Department of Speech and Hearing Science (SHS) at Arizona State University. Dr. Yost served as the department chair between 2007 and 2013. Dr. Yost brought a cle...
Estimating the elevation of sound sources is a challenging task for the auditory system due to the lack of interaural cues. In the 1930s, Wallach developed the first comprehensive theory on auditory elevation perception by means of auditory motion parallax during head movements. Years later, Blauert found that stationary l...
To perceptually situate a sound source in the context of its surrounding environment, a listener must integrate two spatial estimates, (1), the location, relative to the listener’s head, of the auditory event associated with the sound-source and, (2), the location of the listener’s head relative to the environment. This chapter introduces the gener...
Objectives: We investigated the ability of single-sided deaf listeners implanted with a cochlear implant (SSD-CI) to (1) determine the front-back and left-right location of sound sources presented from loudspeakers surrounding the listener and (2) use small head rotations to further improve their localization performance. The resulting behavioral...
Gaze shifts, the directing of the eyes to an approaching predator, preferred food source, or potential mate, have universal biological significance for the survival of a species. Our knowledge of gaze behavior is based primarily on visually triggered responses, whereas head orientation triggered by auditory stimuli remains poorly characterized. Com...
This study investigated the effects of unilateral hearing loss (UHL), of either conductive or sensorineural origin, on stereo sound localization and related visual bias in listeners with normal hearing, short-term (acute) UHL, and chronic UHL. Time-delay-based stereophony was used to isolate interaural-time-difference cues for sound source localiza...
Listeners discriminated changes in the spatial configuration of two-to-eight consonant-vowel (CV) stimuli spoken by different talkers, all simultaneously presented from different loudspeakers in various azimuthal spatial configurations. The number of CVs, spatial configuration of the sound sources, and similarity of the talkers speaking the CVs wer...
Auditory spatial perception relies on more than one spatial cue. This study investigated the effects of cue congruence on auditory localization and the extent of visual bias between two binaural cues-interaural time differences (ITDs) and interaural level differences (ILDs). Interactions between these binaural cues were manipulated by stereophonic...
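The two binaural cues discussed above can be made concrete with a toy calculation. The sketch below is illustrative only, not the method of any study listed here: it uses the classic Woodworth spherical-head formula to approximate the interaural time difference (ITD) produced by a frontal-plane source at a given azimuth, with an assumed head radius and speed of sound.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, approximate value in air at room temperature

def woodworth_itd(azimuth_deg, head_radius=0.0875):
    """Frontal-plane ITD (seconds) from the Woodworth spherical-head model.

    azimuth_deg: source angle from straight ahead, positive to the right
                 (formula is intended for |azimuth| <= 90 degrees).
    head_radius: assumed head radius in metres (~8.75 cm for an average adult).
    """
    theta = math.radians(azimuth_deg)
    return (head_radius / SPEED_OF_SOUND) * (theta + math.sin(theta))

# A source 45 degrees to the right leads at the right ear by roughly 0.38 ms.
itd = woodworth_itd(45.0)
```

Interaural level differences (ILDs) have no comparably simple closed form, since head shadowing is strongly frequency dependent, which is one reason the two cues can be placed in conflict experimentally, as in the study above.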
The human visual and auditory systems do not encode an entirely overlapped space when static head and body position are maintained. While visual capture of sound source location in the frontal field is known to be immediate and direct, visual influence in the rear auditory space behind the subject remains under-studied. In this study we investigate...
Recently, we presented the results of a binaural masked detection experiment in which a noise target was temporally embedded within a lead/lag noise masker pair [J. Acoust. Soc. Am. 141, 3639]. The results show that the inter-stimulus interval (ISI) between the masker and its reflection changed the detection threshold significantly. For low ISIs of...
Horizontal sound localization in free field requires integration of interaural time (ITD) and level (ILD) differences to make accurate spatial judgments. Recently, we showed that listeners demonstrated great variability in localizing a stereo sound source (Montagne and Zhou, JASA 2016). We hypothesized that this variability might arise from conf...
Our recent study (Montagne and Zhou, JASA 2016) showed that binaural localization cues—interaural time (ITD) and level (ILD) differences—were more variably distributed when stimuli were presented stereophonically instead of from single speakers. We hypothesized that variability in listeners’ responses is directly related to variability in the binau...
When a leading stimulus is followed shortly thereafter by another similar stimulus coming from a different direction, listeners often report hearing a single auditory event at or near the location of the leading stimulus. This is called the precedence effect (PE). We measured masked detection thresholds for a noise target in the presence of a maske...
Multisensory interactions involve coordination and sometimes competition between multiple senses. Vision usually dominates audition in spatial judgments when light and sound stimuli are presented from two different physical locations. This study investigated the influence of vision on the perceived location of a phantom sound source placed in a ste...
Ambiguity in binaural timing and level information often causes front-back confusions in sound localization. This experiment investigated the extent to which front-back confusions are modulated by concurrent visual stimuli. 15-ms duration noise stimuli were presented over two loudspeakers positioned at ±45° in front or behind a listener with a dela...
This study investigated visual bias in localizing a “phantom” sound source generated by a stereo pair of speakers. The lateral position of the fused auditory image was controlled by varying the time delay or intensity ratio of two 15-ms noise bursts presented from two hidden loudspeakers positioned at ±45 degrees in azimuth. Visual stimuli were...
When auditory neurons are stimulated with a pair of sounds, the preceding sound can inhibit the neural responses to the succeeding sound. This phenomenon, referred to as 'forward suppression', has been linked to perceptual forward masking. Previous studies investigating forward suppression typically measured the interaction between masker and probe...
Sound localization in both humans and monkeys is tolerant to changes in sound levels. The underlying neural mechanism, however, is not well understood. This study reports the level dependence of individual neurons' spatial receptive fields (SRFs) in the primary auditory cortex (A1) and the adjacent caudal field in awake marmoset monkeys. We found t...
Slow envelope fluctuations in the range of 2-20 Hz provide important segmental cues for processing communication sounds. For a successful segmentation, a neural processor must capture envelope features associated with the rise and fall of signal energy, a process that is often challenged by the interference of background noise. This study investiga...
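The slow envelope cue described above can be illustrated with a minimal sketch: half-wave rectification followed by a first-order low-pass filter, a toy version of envelope extraction rather than the model used in the study. The sampling rate, cutoff, and modulation parameters are assumed values for the example.

```python
import math

def envelope(signal, fs, cutoff_hz=20.0):
    """Slow amplitude envelope via half-wave rectification followed by a
    first-order low-pass filter with the given cutoff (toy illustration)."""
    alpha = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / fs)
    env, state = [], 0.0
    for x in signal:
        rectified = max(0.0, x)            # half-wave rectification
        state += alpha * (rectified - state)  # one-pole low-pass smoothing
        env.append(state)
    return env

# One second of a 500 Hz carrier with 4 Hz amplitude modulation, fs = 8 kHz:
fs = 8000
t = [n / fs for n in range(fs)]
sig = [(1 + math.cos(2 * math.pi * 4 * ti)) * math.sin(2 * math.pi * 500 * ti)
       for ti in t]
env = envelope(sig, fs)
```

After the filter settles, `env` rises and falls at the 4 Hz modulation rate while the 500 Hz carrier is strongly attenuated, capturing the rise and fall of signal energy that segmentation depends on.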
The lateral superior olive (LSO) is the first nucleus in the ascending auditory pathway that encodes acoustic level information from both ears, the interaural level difference (ILD). This sensitivity is believed to result from the relative strengths of ipsilateral excitation and contralateral inhibition. The study reported here simulated sound-evok...
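The excitation-inhibition interaction sketched in this abstract can be caricatured in a few lines. The following is a hypothetical rate model, not the simulation reported in the paper: the LSO output is taken as ipsilateral excitatory drive minus scaled contralateral inhibitory drive, half-wave rectified, with each drive a sigmoidal function of sound level. All parameter values (threshold, slope, maximum rate) are arbitrary assumptions for illustration.

```python
import math

def rate_vs_level(level_db, threshold=50.0, slope=0.1):
    """Toy sigmoidal rate-level function, normalized drive in [0, 1]."""
    return 1.0 / (1.0 + math.exp(-slope * (level_db - threshold)))

def lso_rate(ipsi_db, contra_db, max_rate=150.0, inhibition_gain=1.0):
    """Toy LSO output: ipsilateral excitation minus scaled contralateral
    inhibition, half-wave rectified (spikes/s)."""
    drive = rate_vs_level(ipsi_db) - inhibition_gain * rate_vs_level(contra_db)
    return max_rate * max(0.0, drive)

# Sweep ILD at a fixed average binaural level of 50 dB SPL; positive ILD
# favors the ipsilateral (excitatory) ear and raises the output rate.
rates = []
for ild in (-20, -10, 0, 10, 20):
    ipsi, contra = 50 + ild / 2, 50 - ild / 2
    rates.append(lso_rate(ipsi, contra))
```

The model's rate grows monotonically with ILD on the excitatory side and is silenced when the contralateral ear is louder, the basic signature of ILD sensitivity that the study investigates with biophysically detailed simulations.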
A simple, biophysically specified cell model is used to predict responses of binaurally sensitive neurons to patterns of input spikes that represent stimulation by acoustic and electric waveforms. Specifically, the effects of changes in parameters of input spike trains on model responses to interaural time difference (ITD) were studied for low-freq...
This talk describes modeling efforts to understand brainstem neural responses to electrical cochlear stimulation. Our approach is to combine brainstem models developed for acoustic stimulation with descriptions of auditory-nerve (AN) responses to electric stimulation. Predictions for the behavior of neurons at several levels of the brainstem are co...
Although the interaural time delay (ITD) sensitivity of neurons in the medial superior olive has been and continues to be a focus of discussion and modeling, there has been little attention to the effects of amplitude modulation on the ITD sensitivity of these neurons. There are increasing amounts of data related to the effect of amplitude modulati...
Cochlear implants are becoming more available to deaf and hard of hearing people, and are increasingly fitted in bilateral configurations. There are also increasing numbers of psychophysical experiments with implanted subjects as well as increasing numbers of physiological experiments with electrical stimulation. From the point of view of binaural...
This study reports simulations of recent physiological results from the gerbil medial superior olive (MSO) that reveal that blocking glycinergic inhibition can shift the tuning for the interaural time difference (ITD) of the cell (Brand et al., 2002). Our simulations indicate that the model proposed in the study by Brand et al. (2002) requires prec...