FIG 3 - uploaded by Yi Zhou
(Color online) Effects of visual stimulation on auditory localization. (A) Timing 0 and Timing + conditions. (B) Level 0 and Level − conditions. Each panel shows the group average of the mean responses of ten listeners to stereo and single-source stimuli. The results show that visual stimulation shifts the perceived sound source position toward the LED light stimulation (left, middle, and right). Single-speaker control responses are shown as circles and stereophonic responses as lines. (C) Boxplot of the percentile distribution of visual bias among the ten listeners for each stereo condition. Pair-wise comparisons were evaluated by the Wilcoxon signed-rank test, α = 0.05. (D) Correlation between the change in SD and the change in visual bias in individual listeners' responses for timing-based (filled circles) and level-based (open circles) stereophony.

Source publication
Article
Full-text available
Auditory spatial perception relies on more than one spatial cue. This study investigated the effects of cue congruence on auditory localization and the extent of visual bias between two binaural cues: interaural time differences (ITDs) and interaural level differences (ILDs). Interactions between these binaural cues were manipulated by stereophonic...
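Stereophonic presentation trades interchannel timing against interchannel level to move a phantom image between two loudspeakers. A minimal sketch of generating timing-based and level-based stereo pairs, assuming a simple delay-and-attenuate model; the function and parameter names are illustrative, not the study's actual stimulus code:

```python
import numpy as np

def stereo_pair(signal, fs, delta_t=0.0, delta_db=0.0):
    """Return a 2 x N array: an unchanged left channel and a right
    channel delayed by delta_t seconds and attenuated by delta_db dB.
    Hypothetical stimulus sketch, not the study's actual code."""
    shift = int(round(delta_t * fs))
    assert shift >= 0, "this sketch only delays/attenuates the right channel"
    # Delay by prepending zeros; truncate to keep the original length.
    right = np.concatenate([np.zeros(shift), signal])[: len(signal)]
    right = right * 10 ** (-delta_db / 20.0)
    return np.stack([signal, right])

fs = 44100
noise = np.random.default_rng(0).standard_normal(fs // 10)
timing_pair = stereo_pair(noise, fs, delta_t=0.0005)  # interchannel delay of 0.5 ms
level_pair = stereo_pair(noise, fs, delta_db=6.0)     # right channel 6 dB softer
```

Over headphones or closely spaced loudspeakers, the first pair carries only a timing cue and the second only a level cue, which is the kind of cue dissociation the study exploits.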

Similar publications

Article
Full-text available
Fly Ormia ochracea ears have been well-studied and mimicked to achieve subwavelength directional sensing, but their efficacy in sound source localization in three dimensions, utilizing sound from the X-, Y-, and Z-axes, has been less explored. This paper focuses on a mm-sized array of three Ormia ochracea ear-inspired piezoelectric MEMS directional...
Article
Full-text available
A bio-inspired acoustic sensor for sound source localization is presented, mimicking the internally coupled ears found in many terrestrial vertebrates and insects. It consists of two aluminum diaphragms coupled by a U-shaped cavity and detected by a low-coherence fiber optic interferometer system. A large-scale prototype with a center-to-center sep...
Thesis
Full-text available
Bilateral cochlear implants (BiCIs) result in several benefits, including improvements in speech understanding in noise and sound source localization. However, the amount of benefit received by each patient varies considerably. One reason for this variability is difference between the two ears’ hearing function, i.e., interaural asymmetry. Thus far...
Article
Full-text available
Humans can adapt to changes in the acoustic properties of the head and exploit the resulting novel spectral cues for sound source localization. However, the adaptation rate varies across studies and is not associated with the aftereffects commonly found after adaptation in other sensory domains. To investigate the adaptation rate and measure poten...
Article
Full-text available
Visual and audio events simultaneously occur and both attract attention. However, most existing saliency prediction works ignore the influence of audio and only consider vision modality. In this paper, we propose a multi-task learning method for audio–visual saliency prediction and sound source localization on multi-face video by leveraging visual,...

Citations

... So the closed tube will resonate at a wavelength of 9.2 cm, i.e. a frequency of 3700 Hz (Figure [7]). Overall, both graphs show SNR values that increase as the separation angle between the target and the masker grows. ...
Article
In a conversation at a cocktail party, a person can focus their hearing on a single conversation partner even when the background music is fairly loud and several other people are talking. This phenomenon is known as the cocktail party effect. An early study showed that binaural hearing makes an important contribution to this effect. Research on sound separation based on sound localization has been carried out previously; in that work, the mixed sounds were separated using only one binaural cue, the interaural time difference. In this study, sounds from two sources are separated from binaural input captured by two microphone sensors using both binaural cues, the interaural time difference (ITD) and the interaural level difference (ILD), with a binary mask. To estimate the ITD, a cross-correlation method is used in which the ITD is represented as the time delay of the peak shift in each time-frequency unit. The binary mask is estimated from the pattern of ITD and ILD changes relative to the relative strength of the target signal, computed statistically using probability density estimation. The separation of the mixed sound sources shows good performance, with an SNR of 3 dB.
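The cue-estimation steps described above can be sketched as follows. The cited work applies cross-correlation per time-frequency unit; for brevity this sketch estimates a single broadband ITD and ILD, and all signal names are illustrative assumptions:

```python
import numpy as np

def estimate_itd(left, right, fs):
    """Estimate the interaural time difference via cross-correlation.
    The peak lag is the delay of the right channel relative to the
    left (positive: the sound reaches the left ear first)."""
    corr = np.correlate(right, left, mode="full")
    lags = np.arange(-len(left) + 1, len(right))
    return lags[np.argmax(corr)] / fs

def estimate_ild(left, right, eps=1e-12):
    """Interaural level difference in dB (left relative to right)."""
    return 10.0 * np.log10((np.sum(left**2) + eps) / (np.sum(right**2) + eps))

# Illustrative test signal: a 500 Hz tone, delayed and attenuated
# at the right ear (hypothetical values, not the study's stimuli).
fs = 44100
t = np.arange(0, 0.05, 1 / fs)
delay = int(0.0005 * fs)              # 0.5 ms interaural delay
left = np.sin(2 * np.pi * 500 * t)
right = 0.5 * np.roll(left, delay)    # delayed copy at half amplitude

itd = estimate_itd(left, right, fs)   # ~0.0005 s
ild = estimate_ild(left, right)       # ~6 dB
```

A binary mask would then keep each time-frequency unit whose (ITD, ILD) pair is statistically more consistent with the target source than with the interferer, as the abstract describes via probability density estimation.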
Article
Objective: Localisation of low- and high-frequency sounds in single-sided deaf cochlear implant users was investigated using noise stimuli designed to mitigate monaural localisation cues. Design: Within-subject design. Sound source localisation was tested in the horizontal plane using an array of seven loudspeakers spanning azimuthal angles from −90° to +90°. Stimuli were broadband noise and high- and low-frequency noise. Study sample: Twelve adult subjects with single-sided deafness participated in the study. All had normal hearing in the healthy ear and were supplied with a cochlear implant (CI) in the deaf ear. Results: With broadband noise, the mean angular localisation error was 39° in the aided condition, compared to a median angular error of 83.6° when the speech processor was not worn. For high-frequency noise, the median angular error was 30°, and for low-frequency noise it was 46° in the CI-aided condition. Conclusions: Single-sided deaf CI users show the best sound localisation for high-frequency sounds. This supports the view that interaural level differences are dominant for sound localisation in these listeners. Nonetheless, a limited ability to localise low-frequency sounds was observed, which may rest on supportive perception of interaural time differences.