Olivier Collignon
  • PhD
  • Professor (Associate) at Catholic University of Louvain

About

  • 232 Publications
  • 53,012 Reads
  • 6,899 Citations
Current institution
Catholic University of Louvain
Current position
  • Professor (Associate)

Publications

Publications (232)
Article
Resting-state functional connectivity (RSFC) studies have provided strong evidence that visual deprivation influences the brain's functional architecture. In particular, reduced RSFC coupling between occipital (visual) and temporal (auditory) regions has been reliably observed in early blind individuals (EB) at rest. In contrast, task-dependent ac...
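Group comparisons of this kind typically reduce to correlating ROI-averaged BOLD time series and contrasting the coupling values between groups. A minimal sketch in Python, assuming pre-extracted occipital and temporal time series; all names, data shapes and group sizes below are illustrative rather than taken from the study.

import numpy as np

def roi_coupling(occipital_ts, temporal_ts):
    # Pearson correlation between two ROI-averaged BOLD time series
    return np.corrcoef(occipital_ts, temporal_ts)[0, 1]

# Illustrative stand-in data: 200 volumes per ROI, 17 participants per group
rng = np.random.default_rng(0)
blind_r = [roi_coupling(rng.standard_normal(200), rng.standard_normal(200)) for _ in range(17)]
sighted_r = [roi_coupling(rng.standard_normal(200), rng.standard_normal(200)) for _ in range(17)]

# Fisher z-transform the correlations before comparing group means
blind_z, sighted_z = np.arctanh(blind_r), np.arctanh(sighted_r)
print(blind_z.mean(), sighted_z.mean())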
Article
Full-text available
Is a short and transient period of visual deprivation early in life sufficient to induce lifelong changes in how we attend to, and integrate, simple visual and auditory information [1, 2]? This question is of crucial importance given the recent demonstration in both animals and humans that a period of blindness early in life permanently affects the...
Article
Several studies suggest that serial order in working memory (WM) is grounded in space. For a list of ordered items held in WM, items at the beginning of the list are associated with the left side of space and items at the end of the list with the right side. This suggests that maintaining items in verbal working memory is performed in strong analog...
Article
Full-text available
Animal and human studies have demonstrated that transient visual deprivation early in life, even for a very short period, permanently alters the response properties of neurons in the visual cortex and leads to corresponding behavioral visual deficits [1-7]. While it is acknowledged that early-onset and longstanding blindness leads the occipital cor...
Article
Full-text available
Visual deprivation leads to massive reorganization in both the structure and function of the occipital cortex, raising crucial challenges for sight-restoration. We tracked the behavioral, structural and neurofunctional changes occurring in an early and severely visually impaired patient before, 1.5 and 7 months after sight restoration using magneti...
Preprint
Full-text available
Moving events on the skin can be perceived through vision and touch. How does the brain create a unified multisensory representation of motion directions initially acquired in different coordinate systems? We show that the middle occipito-temporal region (hMT+/V5), along with a fronto-parietal network, encodes visual and tactile directions using a...
Article
High-level perception results from interactions between hierarchical brain systems responsive to gradually increasing feature complexities. During reading, the initial evaluation of simple visual features in the early visual cortex (EVC) is followed by orthographic and lexical computations in the ventral occipitotemporal cortex (vOTC). While simila...
Preprint
Full-text available
The cortex is organized along macroscale structural and functional gradients that extend from unimodal to transmodal association areas and from somatosensory to visual regions. It has not been tested whether this organization represents an intrinsic neuro-architecture immune to sensory experience or depends on sensory input. Here, we conducted conn...
Article
Full-text available
The magnitude dimensions of visual stimuli, such as their numerosity, duration, and size, are intrinsically linked, leading to mutual interactions across them. However, it remains debated whether such interactions, or “magnitude integration” effects, arise from perceptual processes that are independent from the task performed, or whether they arise...
Article
Full-text available
The animal brain is endowed with an innate sense of number that allows it to intuitively perceive the approximate quantity of items in a scene, or “numerosity.” This ability is not limited to items distributed in space, but extends to events unfolding in time and to the average numerosity of dynamic scenes. How the brain computes and represents the average...
Preprint
Full-text available
How does experience shape the development of visual brain regions? We demonstrate that a transient period of visual deprivation early in life in humans leads to permanent alteration in the function of the early visual cortex (EVC), while leaving the categorical coding of downstream ventral occipito-temporal cortex (VOTC) mostly unaffected. We used...
Article
Full-text available
Philosophers and experimentalists have long debated whether bodily representation of emotion is grounded in our sensory experience. Indeed, we are used to observing emotional reactions expressed through the bodies of others, yet it is still unknown whether this observation influences how we experience affective states in our own bodies. To delve into...
Preprint
Full-text available
How does sensory experience influence the brain connectome? Early acquired blindness is known to trigger a large-scale alteration in brain connectivity. Previous studies have traditionally assessed stationary functional connectivity across all timepoints in scanning sessions. In this study we compared the dynamic nature of functional connectivity i...
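The contrast between stationary and dynamic connectivity is usually computed by correlating ROI time series within sliding windows rather than across the whole run. A minimal sketch under that assumption; window length, step and data below are illustrative.

import numpy as np

def sliding_window_fc(ts_a, ts_b, win=40, step=5):
    # Correlation between two ROI time series within successive windows
    n = len(ts_a)
    return np.array([np.corrcoef(ts_a[s:s + win], ts_b[s:s + win])[0, 1]
                     for s in range(0, n - win + 1, step)])

rng = np.random.default_rng(1)
ts_occ, ts_temp = rng.standard_normal(300), rng.standard_normal(300)  # stand-in ROI series

dynamic_r = sliding_window_fc(ts_occ, ts_temp)
print("stationary r:", np.corrcoef(ts_occ, ts_temp)[0, 1])
print("dynamic r mean / sd:", dynamic_r.mean(), dynamic_r.std())  # sd indexes temporal variability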
Preprint
Full-text available
Speech is a multisensory signal that can be extracted from the voice and the lips. Previous studies suggested that occipital and temporal regions encode both auditory and visual speech features but their precise location and nature remain unclear. We characterized brain activity using fMRI (in male and female) to functionally and individually defin...
Article
Recent studies have shown that during the typical resting-state, echo planar imaging (EPI) time series obtained from the eye orbit area correlate with brain regions associated with oculomotor control and lower-level visual cortex. Here, we asked whether congenitally blind (CB) individuals show similar patterns, suggesting a hard-wired constraint on connectivi...
Preprint
Full-text available
Learning to read assigns linguistic value to an abstract visual code. Whether regions of the reading network tune to visual properties common to most scripts or code for more abstracted units of language remains debated. Here we investigate this question using visual Braille, a script developed for touch that does not share the typical explicit sha...
Preprint
Full-text available
The magnitude dimensions of visual stimuli, such as their numerosity, duration, and size, are intrinsically linked, leading to mutual interactions across them. Namely, the perception of one magnitude is influenced by the others, so that a large stimulus is perceived to last longer in time compared to a smaller one, and vice versa. The nature of suc...
Preprint
Full-text available
The human brain is endowed with an intuitive sense of number that allows it to perceive the approximate quantity of items in a scene, or “numerosity.” This ability is not limited to items distributed in space, but extends to events unfolding in time and to the average numerosity of dynamic scenes. How the brain computes and represents the average numerosity...
Article
Full-text available
The ability to reliably discriminate vocal expressions of emotion is crucial to engage in successful social interactions. This process is arguably more crucial for blind individuals, since they cannot extract social information from faces and bodies, and therefore chiefly rely on voices to infer the emotional state of their interlocutors. Blind hav...
Article
Voices are the most relevant social sounds for humans and therefore have crucial adaptive value in development. Neuroimaging studies in adults have demonstrated the existence of regions in the superior temporal sulcus that respond preferentially to voices. Yet, whether voices represent a functionally specific category in the young infant’s mind is...
Preprint
Full-text available
The ability to reliably discriminate vocal expressions of emotion is crucial to engage in successful social interactions. This process is arguably more crucial for blind individuals, since they cannot extract social information from faces and bodies, and therefore chiefly rely on voices to infer the emotional state of their interlocutors. Blind hav...
Article
Full-text available
Seamlessly extracting emotional information from voices is crucial for efficient interpersonal communication. However, it remains unclear how the brain categorizes vocal expressions of emotion beyond the processing of their acoustic features. In our study, we developed a new approach combining electroencephalographic recordings (EEG) in humans with...
Article
Full-text available
We can sense an object’s shape by vision or touch. Previous studies suggested that the inferolateral occipitotemporal cortex (ILOTC) implements supramodal shape representations as it responds more to seeing or touching objects than shapeless textures. However, such activation in the anterior portion of the ventral visual pathway could be due to the...
Article
Dual Coding Theories (DCT) suggest that meaning is represented in the brain by a double code: a language-derived code in the Anterior Temporal Lobe (ATL) and a sensory-derived code in perceptual and motor regions. Concrete concepts should activate both codes, while abstract ones rely solely on the linguistic code. To test these hypotheses, the pres...
Preprint
Full-text available
Dual coding theories of knowledge suggest that meaning is represented in the brain by a double code, which comprises language-derived representations in the Anterior Temporal Lobe and sensory-derived representations in perceptual and motor regions. This approach predicts that concrete semantic features should activate both codes, whereas abstract f...
Preprint
Full-text available
In the past three decades, multiple studies revealed that congenital blindness is associated with functional and structural reorganization in early visual areas and in their interaction with other neural systems. Among the most reproducible findings is the weaker connectivity between the visual and sensorimotor cortices, which in sighted individuals pla...
Preprint
Successfully engaging in social communication requires efficient processing of subtle socio-communicative cues. Voices convey a wealth of social information, such as gender, identity and the emotional state of the speaker. We tested whether our brain can systematically and automatically differentiate and track a periodic stream of emotional uttera...
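Tracking of a periodic stream of this kind is typically quantified in the frequency domain, as the signal-to-noise ratio of the EEG spectrum at the stimulation frequency relative to neighbouring bins. A minimal sketch on a simulated single-channel signal; sampling rate, tagging frequency and bin counts are illustrative.

import numpy as np

def snr_at_frequency(signal, sfreq, target_hz, n_neighbours=10, skip=1):
    # Amplitude at the target frequency divided by the mean of surrounding bins
    spectrum = np.abs(np.fft.rfft(signal)) / len(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sfreq)
    idx = int(np.argmin(np.abs(freqs - target_hz)))
    neighbours = np.r_[idx - skip - n_neighbours:idx - skip,
                       idx + skip + 1:idx + skip + 1 + n_neighbours]
    return spectrum[idx] / spectrum[neighbours].mean()

rng = np.random.default_rng(2)
sfreq, secs, tag = 512, 60, 1.2                  # illustrative sampling rate (Hz) and tagging rate
t = np.arange(sfreq * secs) / sfreq
eeg = np.sin(2 * np.pi * tag * t) + rng.standard_normal(t.size)
print(snr_at_frequency(eeg, sfreq, tag))         # values well above 1 indicate tracking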
Article
Alpha oscillatory activity is thought to contribute to visual expectancy through the engagement of task-relevant occipital regions. In early blindness, occipital alpha oscillations are systematically reduced, suggesting that occipital alpha depends on visual experience. However, it remains possible that alpha activity could serve expectancy in non-...
Preprint
Full-text available
We can sense an object's shape by vision or touch. Previous studies suggested that the inferolateral occipitotemporal cortex (ILOTC) implements supramodal shape representations as it responds more to seeing or touching objects than shapeless textures. However, such activation in the anterior portion of the ventral visual pathway could be due to the...
Preprint
Seamlessly extracting emotional information from voices is crucial for efficient interpersonal communication. However, it remains unclear how the brain categorizes vocal expressions of emotion beyond the processing of their acoustic features. In our study, we developed a new approach combining electroencephalographic recordings (EEG) in humans with...
Article
Full-text available
The ventral occipito-temporal cortex (VOTC) reliably encodes auditory categories in people born blind using a representational structure partially similar to the one found in vision (Mattioni et al., 2020). Here, using a combination of uni- and multivoxel analyses applied to fMRI data, we extend our previous findings, comprehensively investigating h...
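The multivoxel part of such an analysis commonly amounts to cross-validated classification of trial-wise voxel patterns. A minimal sketch with scikit-learn; the numbers of trials, voxels and categories are illustrative placeholders, not the study's design.

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(3)
n_trials, n_voxels, n_categories = 160, 500, 8   # illustrative dimensions
X = rng.standard_normal((n_trials, n_voxels))    # trial-by-voxel activity patterns
y = np.repeat(np.arange(n_categories), n_trials // n_categories)

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_score(clf, X, y, cv=5)        # 5-fold cross-validated decoding
print("accuracy:", scores.mean(), "chance:", 1 / n_categories)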
Article
It is well documented that early sensory loss typically alters brain morphology in the areas associated with the lost sense. However, much less is known about the impact of early sensory loss on the remainder of the sensory regions. Therefore, we investigated whether congenitally blind (CB) individuals show brain alterations in the olfactory system...
Preprint
Alpha oscillatory activity is thought to contribute to the cueing of visual attention through the engagement of task-relevant occipital regions. In early blindness, occipital alpha oscillations are systematically reduced, suggesting that occipital alpha depends on visual experience. However, it is still unknown if alpha activity could serve attenti...
Article
hMT+/V5 is a region in the middle occipito-temporal cortex that responds preferentially to visual motion in sighted people. In case of early visual deprivation, hMT+/V5 enhances its response to moving sounds. Whether hMT+/V5 contains information about motion directions and whether the functional enhancement observed in the blind is motion specific,...
Article
Synesthesia represents an atypical merging of percepts, in which a given sensory experience (e.g., words, letters, music) triggers sensations in a different perceptual domain (e.g., color). According to recent estimates, the vast majority of the reported cases of synesthesia involve a visual experience. Purely non-visual synesthesia is extremely ra...
Chapter
In congenitally deaf people, temporal regions typically believed to be primarily auditory enhance their response to nonauditory information. The neural mechanisms and functional principles underlying this phenomenon, as well as its impact on auditory recovery after sensory restoration, yet remain debated. In this chapter, we demonstrate that the cr...
Article
Full-text available
Patients treated for bilateral congenital cataracts provide a unique model to test the role of early visual input in shaping the development of the human cortex. Previous studies showed that brief early visual deprivation triggers long-lasting changes in the human visual cortex. However, it remains unknown if such changes interact with the developm...
Article
Full-text available
Abstract words are typically more difficult to identify than concrete words in lexical-decision, word-naming, and recall tasks. This behavioral advantage, known as the concreteness effect, is often considered as evidence for embodied semantics, which emphasizes the role of sensorimotor experience in the comprehension of word meaning. In this view, online se...
Preprint
Full-text available
hMT+/V5 is a region in the middle occipito-temporal cortex that responds preferentially to visual motion in sighted people. In case of early visual deprivation, hMT+/V5 enhances its response to moving sounds. Whether hMT+/V5 contains information about motion directions and whether the functional enhancement observed in the blind is motion specific,...
Article
Full-text available
Our brain constructs reality through narrative and argumentative thought. Some hypotheses argue that these two modes of cognitive functioning are irreducible, reflecting distinct mental operations underlain by separate neural bases; others ascribe both to a unitary neural system dedicated to long-timescale information. We addressed this question by...
Article
Full-text available
Voices are arguably among the most relevant sounds in humans' everyday life, and several studies have suggested the existence of voice-selective regions in the human brain. Despite two decades of research, defining the human brain regions supporting voice recognition remains challenging. Moreover, whether neural selectivity to voices is merely driv...
Article
Studies involving congenitally blind adults show that visual experience is not a mandatory prerequisite for the emergence of efficient numerical abilities. It remains however unknown whether blind adults developed lifelong strategies to compensate for the absence of foundations vision would provide in infancy. We therefore assessed basic numerical...
Preprint
Full-text available
Voices are arguably among the most relevant sounds in humans’ everyday life, and several studies have suggested the existence of voice-selective regions in the human brain. Despite two decades of research, defining the human brain regions supporting voice recognition remains challenging. Moreover, whether neural selectivity to voices is merely driv...
Article
Full-text available
In humans, the occipital middle-temporal region (hMT+/V5) specializes in the processing of visual motion, while the planum te...
Article
Full-text available
In early deaf individuals, the auditory deprived temporal brain regions become engaged in visual processing. In our study we tested further the hypothesis that intrinsic functional specialization guides the expression of cross-modal responses in the deprived auditory cortex. We used functional MRI to characterize the brain response to horizontal, r...
Preprint
Studies involving congenitally blind adults demonstrated that visual experience is not a mandatory prerequisite for the emergence of efficient numerical abilities. It remains however unknown whether blind adults developed lifelong strategies to compensate for the absence of foundations vision would provide in infancy. We therefore assessed basic nu...
Article
Full-text available
Humans, and several non-human species, possess the ability to make approximate but reliable estimates of the number of objects around them. Like other perceptual features, numerosity perception is susceptible to adaptation: exposure to a high number of items causes underestimation of the numerosity of a subsequent set of items, and vice versa. Sev...
Preprint
Full-text available
Visual deprivation triggers enhanced dependence on auditory representation. It was suggested that (auditory) temporal regions sharpen their response to sounds in visually deprived people. In contrast with this view, we show that the coding of sound categories is enhanced in the occipital but, importantly, reduced in the temporal cortex of early and...
Preprint
Full-text available
In early deaf individuals, the auditory deprived temporal brain regions become engaged in visual processing. In our study we tested further the hypothesis that intrinsic functional specialization guides the expression of cross-modal responses in the deprived auditory cortex. We used functional MRI to characterize the brain response to horizontal, r...
Article
Although often considered a non-dominant sense for spatial perception, chemosensory perception can be used to localize the source of an event and potentially help us navigate through our environment. Would blind people who lack the dominant spatial sense, vision, develop enhanced spatial chemosensation or, alternatively, suffer from the lack of vis...
Article
Full-text available
Vision is thought to support the development of spatial abilities in the other senses. If this is true, how does spatial hearing develop in people lacking visual experience? We comprehensively addressed this question by investigating auditory-localization abilities in 17 congenitally blind and 17 sighted individuals using a psychophysical minimum-a...
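Minimum-audible-angle thresholds are typically obtained by fitting a psychometric function to the proportion of correct left/right judgements across angular separations and reading off the angle at a criterion performance level. A minimal sketch with scipy on made-up proportions; every number below is illustrative.

import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(angle, threshold, slope):
    # Probability of a correct left/right judgement at a given angular separation
    return 0.5 + 0.5 * norm.cdf(angle, loc=threshold, scale=slope)

angles = np.array([1.0, 2.0, 4.0, 8.0, 16.0, 32.0])        # separations in degrees
p_correct = np.array([0.52, 0.55, 0.70, 0.88, 0.97, 1.0])  # simulated proportions correct

(threshold, slope), _ = curve_fit(psychometric, angles, p_correct, p0=[5.0, 2.0])
print("minimum audible angle (75% correct):", round(threshold, 2), "deg")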
Article
Full-text available
Is vision a necessary building block for the foundations of mathematical cognition? A straightforward model to test the causal role visual experience plays in the development of numerical abilities is to study people born without sight. In this review we will demonstrate that congenitally blind people can develop numerical abilities that equal or e...
Preprint
Full-text available
In humans, the occipital middle-temporal region (hMT+/V5) specializes in the processing of visual motion, while the Planum Temporale (hPT) specializes in auditory motion processing. It has been hypothesized that these regions might communicate directly to achieve fast and optimal exchange of multisensory motion information. In this study, we invest...
Preprint
Full-text available
How is conceptual knowledge organized and retrieved by the brain? Recent evidence points to the anterior temporal lobe (ATL) as a crucial semantic hub integrating both abstract and concrete conceptual features according to a dorsal-to-medial gradient. It is however unclear when this conceptual gradient emerges and how semantic information reaches t...
Article
Full-text available
If conceptual retrieval is partially based on the simulation of sensorimotor experience, people with a different sensorimotor experience, such as congenitally blind people, should retrieve concepts in a different way. However, studies investigating the neural basis of several conceptual domains (e.g., actions, objects, places) have shown a very lim...
Article
Full-text available
Data analysis workflows in many scientific domains have become increasingly complex and flexible. Here we assess the effect of this flexibility on the results of functional magnetic resonance imaging by asking 70 independent teams to analyse the same dataset, testing the same 9 ex-ante hypotheses¹. The flexibility of analytical approaches is exempl...
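The headline measure in many-analysts designs like this is simply how far the teams' binary conclusions agree for each hypothesis. A minimal sketch of that computation, using a random stand-in for the real table of team decisions.

import numpy as np

# rows = analysis teams, columns = hypotheses; 1 = "significant effect reported"
rng = np.random.default_rng(6)
decisions = rng.integers(0, 2, size=(70, 9))     # illustrative stand-in, not the actual reports

rate = decisions.mean(axis=0)                    # fraction of teams reporting an effect
agreement = np.maximum(rate, 1 - rate)           # 1.0 = unanimous, 0.5 = maximal disagreement
print(np.round(agreement, 2))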
Article
Severe alcohol use disorders (SAUD) are associated with a large variety of affective disturbances, among which is a well-established decoding deficit for facial and vocal emotional expressions. This deficit has recently been found to be increased in cross-modal settings, namely when inputs from different sensory modalities have to be combined. Compare...
Article
Background: Severe alcohol use disorder (SAUD) is associated with impaired discrimination of emotional expressions. This deficit appears increased in crossmodal settings, when simultaneous inputs from different sensory modalities are presented. However, so far, studies exploring emotional crossmodal processing in SAUD relied on static faces and unm...
Article
The human occipito-temporal region hMT⁺/V5 is well known for processing visual motion direction. Here, we demonstrate that hMT⁺/V5 also represents the direction of auditory motion in a format partially aligned with the one used to code visual motion. We show that auditory and visual motion directions can be reliably decoded in individually localize...
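Whether the auditory and visual codes share a format is commonly probed with cross-modal decoding: a classifier trained on patterns from one modality is tested on the other. A minimal sketch under that assumption; pattern sizes, trial counts and labels are illustrative.

import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(4)
directions = np.repeat([0, 1, 2, 3], 20)                  # e.g. left, right, up, down
X_visual = rng.standard_normal((directions.size, 300))    # ROI patterns from visual runs
X_auditory = rng.standard_normal((directions.size, 300))  # ROI patterns from auditory runs

clf = SVC(kernel="linear").fit(X_visual, directions)      # train on vision...
print("cross-modal accuracy:", clf.score(X_auditory, directions), "chance: 0.25")  # ...test on audition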
Article
Full-text available
Humans share with other animals a number sense, a system allowing a rapid and approximate estimate of the number of items in a scene. Recently it has been shown that numerosity is shared between action and perception as the number of repetitions of self-produced actions affects the perceived numerosity of subsequent visual stimuli presented around...
Preprint
Humans share with other animals a number sense, a system allowing a rapid and approximate estimate of the number of items in a scene. Recently it has been shown that numerosity is shared between action and perception as the number of repetitions of self-produced actions affects the perceived numerosity of subsequent visual stimuli presented around...
Article
Full-text available
Is vision necessary for the development of the categorical organization of the Ventral Occipito-Temporal Cortex (VOTC)? We used fMRI to characterize VOTC responses to eight categories presented acoustically in sighted and early blind individuals, and visually in a separate sighted group. We observed that VOTC reliably encodes sound categories in si...
Article
Unequivocally demonstrating the presence of multisensory signals at the earliest stages of cortical processing remains challenging in humans. In our study, we relied on the unique spatio-temporal resolution provided by intracranial stereotactic electroencephalographic (SEEG) recordings in patients with drug-resistant epilepsy to characterize the si...
Article
Full-text available
Although impairment in sensory integration is suggested in the autism spectrum (AS), empirical evidence remains equivocal. We assessed the integration of low-level visual and tactile information within and across modalities in AS and typically developing (TD) individuals. TD individuals demonstrated increased redundancy gain for cross-modal relativ...
Preprint
Full-text available
Data analysis workflows in many scientific domains have become increasingly complex and flexible. To assess the impact of this flexibility on functional magnetic resonance imaging (fMRI) results, the same dataset was independently analyzed by 70 teams, testing nine ex-ante hypotheses. The flexibility of analytic approaches is exemplified by the fac...
Article
Recent studies have suggested that multisensory redundancy may improve cognitive learning. According to this view, information simultaneously available across two or more modalities is highly salient and, therefore, may be learned and remembered better than the same information presented to only one modality. In the current study, we wanted to eval...
Article
Full-text available
Individuals with autism are reported to integrate information from visual and auditory channels in an idiosyncratic way. Multisensory integration (MSI) of simple, non-social stimuli (i.e., flashes and beeps) was evaluated in adolescents and adults with (n = 20) and without autism (n = 19) using a reaction time (RT) paradigm with audio, visual, and...
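Redundancy gains in RT paradigms of this kind are conventionally evaluated against Miller's race-model inequality, which bounds the cumulative RT distribution for redundant audiovisual targets by the sum of the two unisensory ones. A minimal sketch with simulated RTs standing in for real data.

import numpy as np

def ecdf(rts, t):
    # Empirical cumulative RT distribution evaluated at times t
    return np.searchsorted(np.sort(rts), t, side="right") / len(rts)

rng = np.random.default_rng(5)
rt_a = rng.normal(420, 60, 200)    # auditory-only RTs (ms), illustrative
rt_v = rng.normal(450, 60, 200)    # visual-only RTs
rt_av = rng.normal(380, 55, 200)   # redundant audiovisual RTs

t = np.linspace(200, 700, 101)
violation = ecdf(rt_av, t) - np.minimum(ecdf(rt_a, t) + ecdf(rt_v, t), 1.0)
print("max race-model violation:", violation.max())  # > 0 suggests integration beyond a race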
Preprint
Full-text available
The Ventral Occipito-Temporal Cortex (VOTC) shows reliable category selective response to visual information. Do the development, topography and information content of this categorical organization depend on visual input or even visual experience? To further address this question, we used fMRI to characterize the brain responses to eight categories...
Conference Paper
Full-text available
Deafness is usually accompanied by functional brain alterations that may be thought of as an alteration to connectome scaffolding [1,2,3]. The general goal of this study was to investigate brain structural network organization in early and profoundly deaf subjects (ED). The specific goal was to apply the structural white matter connectome formalism to ev...
Preprint
Vision is thought to scaffold the development of spatial abilities in the other senses. How does spatial hearing therefore develop in people lacking visual experience? We comprehensively addressed this question by investigating auditory localization abilities in congenitally blind and sighted control people using a psychophysical minimum audible an...
Conference Paper
Full-text available
Atypical sensory perception is now recognized as one of the key characteristics of autism (APA, 2013), with research suggesting that disrupted multi-sensory integration (MSI) may underlie the sensory behaviours seen in this population (Feldman et al., 2018). Further, the integration of social information (such as faces and facial expressions of emo...
Article
Humans seamlessly extract and integrate the emotional content delivered by the face and the voice of others. It is however poorly understood how perceptual decisions unfold in time when people discriminate the expression of emotions transmitted using dynamic facial and vocal signals, as in natural social context. In this study, we relied on a gat...
Article
Non-arbitrary sound-shape correspondences (SSC), such as the "bouba-kiki" effect, have been consistently observed across languages and together with other sound-symbolic phenomena challenge the classic linguistic dictum of the arbitrariness of the sign. Yet, it is unclear what makes a sound "round" or "spiky" to the human mind. Here we tested the h...
Preprint
Full-text available
Unequivocally demonstrating the presence of multisensory signals at the earliest stages of cortical processing remains challenging in humans. In our study, we relied on the unique spatio-temporal resolution provided by intracranial stereotactic electroencephalographic (SEEG) recordings in patients with drug-resistant epilepsy to characterize the si...
Article
Full-text available
Recent studies proposed that the use of internal and external coordinate systems for perception and action may be more flexible in congenitally blind when compared to sighted individuals. To investigate this hypothesis further, we asked congenitally blind and sighted people to perform, with the hands uncrossed and crossed over the body midline, a t...
Article
Full-text available
The ability to compute the location and direction of sounds is a crucial perceptual skill to efficiently interact with dynamic environments. How the human brain implements spatial hearing is, however, poorly understood. In our study, we used fMRI to characterize the brain activity of male and female humans listening to sounds moving left, right, up...
Preprint
Full-text available
The ability to compute the location and direction of sounds is a crucial perceptual skill to efficiently interact with dynamic environments. How the human brain implements spatial hearing is however poorly understood. In our study, we used fMRI to characterize the brain activity of male and female humans listening to left, right, up and down moving...
Article
Full-text available
The discovery of intrinsically photosensitive retinal ganglion cells (ipRGCs) marked a major shift in our understanding of how light information is processed by the mammalian brain. These ipRGCs influence multiple functions not directly related to image formation such as circadian resetting and entrainment, pupil constriction, enhancement...
Article
Full-text available
The brain has separate specialized computational units to process faces and voices located in occipital and temporal cortices. However, humans seamlessly integrate signals from the faces and voices of others for optimal social interaction. How are emotional expressions, when delivered by different sensory modalities (faces and voices), integrated i...
Preprint
Humans seamlessly extract and integrate the emotional content delivered by the face and the voice of others. It is however poorly understood how perceptual decisions unfold in time when people discriminate the expression of emotions transmitted using dynamic facial and vocal signals, as in natural social context. In this study, we relied on a gatin...
Preprint
Full-text available
The brain has separate specialized computational units to process faces and voices located in occipital and temporal cortices. However, humans seamlessly integrate signals from the faces and voices of others for optimal social interaction. How are emotional expressions, when delivered by different sensory modalities (faces and voices), integrated i...
Preprint
Full-text available
We investigated the experiential bases of knowledge by asking whether people that perceive the world in a different way also show a different neurobiology of concepts. We characterized the brain activity of early-blind and sighted individuals during a conceptual retrieval task in which participants rated the perceptual similarity between color and...
Preprint
Recent studies suggested that multisensory training schemes could boost the development of abstract concepts. In the present study, we wanted to evaluate whether training arithmetic with a multisensory intervention could induce larger learning improvements than a visual intervention alone. Moreover, as a left-to-right oriented mental number line wa...
