Jeroen J Stekelenburg
Tilburg University | UVT · Department of Cognitive Neuropsychology

PhD

About

48 Publications
7,372 Reads
2,045 Citations

Publications (48)
Article
Full-text available
The amplitude of the auditory N1 component of the event‐related potential (ERP) is typically suppressed when a sound is accompanied by visual anticipatory information that reliably predicts the timing and identity of the sound. While this visually induced suppression of the auditory N1 is considered an early electrophysiological marker of fulfilled...
Article
Full-text available
Bruns et al. (2020) present new findings suggesting that the ventriloquism after‐effect (VAE: an enduring shift of the perceived location of a sound toward a previously seen visual stimulus) and multisensory enhancement (ME: an improvement in the precision of sound localization) may dissociate depending on the rate at which exposure stimuli are...
Article
Full-text available
Lay abstract: Many autistic individuals experience difficulties in processing sensory information (e.g. increased sensitivity to sound). Here we show that these difficulties may be related to an inability to process unexpected sensory stimulation. In this study, 29 older adolescents and young adults with autism and 29 age-matched individuals with...
Article
Different inputs from a multisensory object or event are often integrated into a coherent and unitary percept, despite differences in sensory formats, neural pathways, and processing times of the involved modalities. Presumably, multisensory integration occurs if the cross-modal inputs are presented within a certain window of temporal integration w...
Article
Full-text available
Speech perception is influenced by vision through a process of audiovisual integration. This is demonstrated by the McGurk illusion where visual speech (for example /ga/) dubbed with incongruent auditory speech (such as /ba/) leads to a modified auditory percept (/da/). Recent studies have indicated that perception of the incongruent speech stimuli...
Article
Full-text available
Recent studies suggest that sub-clinical levels of autistic symptoms may be related to reduced processing of artificial audiovisual stimuli. It is unclear whether these findings extend to more natural stimuli such as audiovisual speech. The current study examined the relationship between autistic traits measured by the Autism Spectrum Quotient and...
Article
Full-text available
The amplitude of the auditory N1 component of the event‐related potential (ERP) is typically attenuated for self‐initiated sounds, compared to sounds with identical acoustic and temporal features that are triggered externally. This effect has been ascribed to internal forward models predicting the sensory consequences of one's own motor actions. Th...
Article
Numerous studies have demonstrated that the vision of lip movements can alter the perception of auditory speech syllables (McGurk effect). While there is ample evidence for integration of text and auditory speech, there are only a few studies on the orthographic equivalent of the McGurk effect. Here, we examined whether written text, like visual sp...
Article
A rare omission of a sound that is predictable by anticipatory visual information induces an early negative omission response (oN1) in the EEG during the period of silence where the sound was expected. It was previously suggested that the oN1 was primarily driven by the identity of the anticipated sound. Here, we examined the role of temporal predi...
Article
The eye region conveys important emotional information that we spontaneously attend to. Socially submissive individuals avoid others' gaze, which is regarded as avoidance of others' emotional facial expressions. But this interpretation ignores the fact that there are other sources of emotional information besides the face. Here we investigate whether...
Article
Full-text available
The Colavita effect refers to the phenomenon that when confronted with an audiovisual stimulus, observers report more often to have perceived the visual than the auditory component. The Colavita effect depends on low-level stimulus factors such as spatial and temporal proximity between the unimodal signals. Here, we examined whether the Colavita ef...
Article
Visual perception can be changed by co-occurring input from other sensory modalities. Here, we explored how self-generated finger movements (left-right or up-down key presses) affect visual motion perception. In Experiment 1, motion perception of a blinking bar was shifted in the direction of co-occurring hand motor movements, indicative of motor-i...
Article
Using a new psychophysical method, we compared whether flashes and averted eye gazes of a cartoon face induce a ventriloquist illusion (an illusory shift of the apparent location of a sound by a visual distracter). With standard psychophysical procedures that measure a direct ventriloquist effect and a ventriloquist aftereffect, we found in human sub...
Article
Full-text available
We receive emotional signals from different sources, including the face, the whole body, and the natural scene. Previous research has shown the importance of context provided by the whole body and the scene on the recognition of facial expressions. This study measured physiological responses to face-body-scene combinations. Participants freely view...
Conference Paper
Full-text available
Audiovisual speech integration is reflected in the electrophysiological N1/P2 complex. In this study, we analyzed recordings of electroencephalographic brain activity from 28 subjects who were presented with combinations of auditory, visual, and audiovisual stimuli, using single trial analysis based on an independent component analysis procedure. W...
Article
Full-text available
Background: In many natural audiovisual events (e.g., the sight of a face articulating the syllable /ba/), the visual signal precedes the sound and thus allows observers to predict the onset and the content of the sound. In healthy adults, the N1 component of the event-related brain potential (ERP), reflecting neural activity associated with basic...
Article
The N1 component of the event-related brain potential (ERP), reflecting neural activity associated with basic sound processing, can be suppressed if a sound is accompanied by a video that predicts sound onset. In this study we examined whether visual predictive information induces auditory de-activation in patients with schizophrenia. The electroen...
Article
Full-text available
Traditional emotion theories stress the importance of the face in the expression of emotions but bodily expressions are becoming increasingly important as well. In these experiments we tested the hypothesis that similar physiological responses can be evoked by observing emotional face and body signals and that the reaction to angry signals is ampli...
Article
Lip-read speech is integrated with heard speech at various neural levels. Here, we investigated the extent to which lip-read induced modulations of the auditory N1 and P2 (measured with EEG) are indicative of speech-specific audiovisual integration, and we explored to what extent the ERPs were modulated by phonetic audiovisual congruency. In order...
Article
Correctly processing rapid sequences of sounds is essential for developmental milestones, such as language acquisition. We investigated the sensitivity of two-month-old infants to violations of a temporal regularity, by recording event-related brain potentials (ERP) in an auditory oddball paradigm from 36 waking and 40 sleeping infants. Standard to...
Article
Full-text available
In many natural audiovisual events (e.g., a clap of the two hands), the visual signal precedes the sound and thus allows observers to predict when, where, and which sound will occur. Previous studies have reported that there are distinct neural correlates of temporal (when) versus phonetic/semantic (which) content on audiovisual integration. Here w...
Article
The relative timing of a motor-sensory event can be recalibrated after exposure to delayed visual feedback. Here we examined the neural consequences of lag adaptation using event-related potentials (ERPs). Participants tapped their finger on a pad, which triggered a flash either after a short delay (0 ms/50 ms) or a long delay (100 ms/150 ms). Foll...
Article
Research on the development of the auditory system has revealed that temporal processes, such as rhythm detection, may develop very early in human life. The mismatch negativity (MMN) event-related brain potential (ERP), a cortical response elicited by contextually deviant or novel sound events, has been used to study these questions. The present st...
Article
Full-text available
Perception of intersensory temporal order is particularly difficult for (continuous) audiovisual speech, as perceivers may find it difficult to notice substantial timing differences between speech sounds and lip movements. Here we tested whether this occurs because audiovisual speech is strongly paired ("unity assumption"). Participants made tempor...
Article
Full-text available
The neural activity of speech sound processing (the N1 component of the auditory ERP) can be suppressed if a speech sound is accompanied by concordant lip movements. Here we demonstrate that this audiovisual interaction is neither speech specific nor linked to humanlike actions but can be observed with artificial stimuli if their timing is made pre...
Article
Full-text available
Visual motion can affect the perceived direction of auditory motion (i.e., audiovisual motion capture). It is debated, though, whether this effect occurs at perceptual or decisional stages. Here, we examined the neural consequences of audiovisual motion capture using the mismatch negativity (MMN), an event-related brain potential reflecting pre-att...
Article
Full-text available
A question that has emerged over recent years is whether audiovisual (AV) speech perception is a special case of multi-sensory perception. Electrophysiological (ERP) studies have found that auditory neural activity (N1 component of the ERP) induced by speech is suppressed and speeded up when a speech sound is accompanied by concordant lip movements...
Article
Full-text available
The authors examined how principles of auditory grouping relate to intersensory pairing. Two sounds that normally enhance sensitivity on a visual temporal order judgement task (i.e. temporal ventriloquism) were embedded in a sequence of flanker sounds which either had the same or different frequency (Exp. 1), rhythm (Exp. 2), or location (Exp. 3)....
Article
Lexical information can bias categorization of an ambiguous phoneme and subsequently evoke a shift in the phonetic boundary. Here, we explored the extent to which this phenomenon is perceptual in nature. Listeners were asked to ignore auditory stimuli presented in a typical oddball sequence in which the standard was an ambiguous sound halfway betwe...
Article
Observing facial expressions automatically prompts imitation, as can be seen with facial electromyography. To investigate whether this reaction is driven by automatic mimicry or by recognition of the emotion displayed, we recorded electromyographic responses to presentations of facial expressions, face-voice combinations and bodily expressions, which...
Article
Full-text available
Sexual arousal can be viewed as an emotional state generating sex-specific autonomic and general somatic motor system responses that prepare for sexual action. In the present study, modulation of spinal tendinous (T) reflexes by sexual films of varying intensity was investigated. T reflexes were expected to increase as a function of increased film in...
Article
Temporal ventriloquism refers to the phenomenon that a sound presented in close temporal proximity of a visual stimulus attracts its perceived temporal occurrence. Here, we investigate the time-course of the neuronal processes underlying temporal ventriloquism, using event-related brain potentials. To measure shifts in perceived temporal visual occ...
Article
Some elementary aspects of faces can be processed before cortical maturation or after lesion of primary visual cortex. Recent findings suggesting a role of an evolutionary ancient visual system in face processing have exploited the relative advantage of the temporal hemifield (nasal hemiretina). Here, we investigated whether under some circumstance...
Article
The present study investigated the neural correlates of perceiving human bodies. Focussing on the N170 as an index of structural encoding, we recorded event-related potentials (ERPs) to images of bodies and faces (either neutral or expressing fear) and objects, while subjects viewed the stimuli presented either upright or inverted. The N170 was enh...
Article
The ventriloquist illusion arises when sounds are mislocated towards a synchronous but spatially discrepant visual event. Here, we investigated the ventriloquist illusion at a neurophysiological level. The question was whether an illusory shift in sound location was reflected in the auditory mismatch negativity (MMN). An 'oddball' paradigm was used...
Article
Full-text available
We have earlier found that voluntary attention to weak auditory stimuli induces inhibition of respiration, heart rate, and electromyographic (EMG) activity of masticatory and lower facial muscles and that these responses lower the auditory threshold for low-frequency sounds. In the current study, we examined whether this inhibitory response pattern...
Article
Full-text available
We investigated whether previously observed inhibition of pericranial electromyographic (EMG) activity, respiration, and heart rate during sensory intake processes improves auditory sensitivity. Participants had to detect weak auditory stimuli. We found that EMG activity in masticatory and lower facial muscles, respiration, and heart rate were more...

Projects (2)
Project
This project has three primary aims: (1) to examine the impact of stimulus predictability (e.g. timing, identity) on electrophysiological markers of predictive coding in multisensory integration in individuals with typical development, (2) to examine the extent to which subclinical levels of autistic symptoms in the general population are related to alterations in multisensory integration, and (3) to examine whether the neural correlates of predictive coding in multisensory integration are altered in autism spectrum disorder.
Project
It is well known that the perception of synchrony between one's own action (e.g. a finger tap) and its sensory feedback (e.g. a flash or click) can be shifted after brief exposure to a feedback delay (temporal recalibration, TR; Stetson et al. 2006; Heron et al. 2009; Sugano et al. 2010). This project investigates: (1) whether sensorimotor TR is modality-specific or amodal, (2) whether it is specific to the body part involved, and (3) the neural correlates of TR using EEG.

References:
Heron J, Hanson JVM, Whitaker D (2009) Effect before cause: supramodal recalibration of sensorimotor timing. PLoS ONE 4(11):e7681. doi:10.1371/journal.pone.0007681
Stetson C, Cui X, Montague PR, Eagleman DM (2006) Motor-sensory recalibration leads to an illusory reversal of action and sensation. Neuron 51(5):651–659
Sugano Y, Keetels M, Vroomen J (2010) Adaptation to motor-visual and motor-auditory temporal lags transfer across modalities. Exp Brain Res 201(3):393–399