Emmanuel Biau

University of Liverpool | UoL · Department of Psychology

PhD

About

20 Publications
2,645 Reads
205 Citations
Introduction
My research focuses on how the brain processes the corresponding visual and auditory information that composes an episode, and binds it together to form new multimodal memories. To address this topic, I take advantage of the natural synchronization between a speaker’s lip movements and voice modulations at certain dominant rhythms in audiovisual speech. I combine audiovisual clip presentation with MEG and iEEG recordings to address the effect of audiovisual synchrony on endogenous theta oscillations in sensory cortical areas, and to test whether such coordinated activity is also reflected deeper in the memory-binding sites (i.e., the hippocampus).
Additional affiliations
January 2010 - October 2011
University Pompeu Fabra
Position
  • Research Assistant
Education
November 2010 - November 2015
University Pompeu Fabra
Field of study
  • Neuroscience
September 2008 - July 2009
Sorbonne Université
Field of study
  • Biology

Publications (20)
Article
Full-text available
During public addresses, speakers accompany their discourse with spontaneous hand gestures (beats) that are tightly synchronized with the prosodic contour of the discourse. It has been proposed that speech and beat gestures originate from a common underlying linguistic process whereby both speech prosody and beats serve to emphasize relevant inform...
Article
During multisensory speech perception, slow delta oscillations (~1–3 Hz) in the listener's brain synchronize with the speech signal, likely engaging in speech signal decomposition. Notable fluctuations in the speech amplitude envelope, reflecting speaker prosody, temporally align with articulatory and body gestures, and both provide complementary...
Article
Audiovisual speech perception relies, among other things, on our expertise to map a speaker's lip movements with speech sounds. This multimodal matching is facilitated by salient syllable features that align lip movements and acoustic envelope signals in the 4–8 Hz theta band. Although non-exclusive, the predominance of theta rhythms in speech proc...
Preprint
Full-text available
During multimodal speech perception, slow delta oscillations (~1–3 Hz) in the listener’s brain synchronize with the speech signal, likely reflecting signal decomposition in the service of comprehension. In particular, fluctuations imposed onto the speech amplitude envelope by a speaker’s prosody seem to temporally align with articulatory and body ges...
Preprint
Audiovisual speech perception relies on our expertise to map a speaker's lip movements with speech sounds. This multimodal matching is facilitated by salient syllable features that align lip movements and acoustic envelope signals in the 4–8 Hz theta band. The predominance of theta rhythms in speech processing has been firmly established by studi...
Article
Full-text available
Background: Researchers rely on the specified capabilities of their hardware and software even though, in reality, these capabilities are often not achieved. Considering that the number of experiments examining neural oscillations has increased steadily, easy-to-implement tools for testing the capabilities of hardware and software are necessary. Ne...
Book
Multisensory Interactions in the Real World, by Salvador Soto-Faraco (Cambridge Core, Biological Psychology)
Article
Full-text available
The informative value of time and temporal structure often remains neglected in cognitive assessments. However, in addition to information about stimulus identity, we can exploit temporal ordering principles, such as regularity, periodicity, or grouping, to generate predictions about the timing of future events. Such predictions may improve cognitive perfor...
Article
Full-text available
How the brain decomposes and integrates information in multimodal speech perception is linked to oscillatory dynamics. However, how speech takes advantage of redundancy between different sensory modalities, and how this translates into specific oscillatory patterns remains unclear. We address the role of lower beta activity (~20 Hz), generally asso...
Article
Full-text available
We tested the prosodic hypothesis that the temporal alignment of a speaker's beat gestures in a sentence influences syntactic parsing by driving the listener's attention. Participants chose between two possible interpretations of relative-clause (RC) ambiguous sentences, while their electroencephalogram (EEG) was recorded. We manipulated the alignm...
Article
Full-text available
During natural speech perception, listeners rely on a wide range of cues to support comprehension, from semantic context to prosodic information. There is a general consensus that prosody plays a role in syntactic parsing, but most studies focusing on ambiguous relative clauses (RC) show that prosodic cues, alone, are insufficient to reverse the pr...
Article
Full-text available
During social interactions, speakers often produce spontaneous gestures to accompany their speech. These coordinated body movements convey communicative intentions and modulate how listeners perceive the message in a subtle but important way. In the present perspective, we focus on the role that congruent non-verbal information from beat...
Presentation
Full-text available
Searching high and low: a prosodic account for relative clause attachment preferences in Spanish
Poster
Full-text available
Taking prosody by the hand: the effect of pauses and gestures on sentence disambiguation
