ABSTRACT: Though a clear interaction between finger and number representations has been demonstrated, what drives the development of this intertwining remains unclear. Here we tested early blind, late blind and sighted control participants in two counting tasks, each performed under three different conditions: a resting condition, a condition requiring hand movements and a condition requiring foot movements. In the resting condition, every sighted and late blind participant spontaneously used their fingers, while the majority of early blind participants did not. Sighted controls and late blind participants were, moreover, selectively disrupted by the interfering hand condition, while the early blind participants who did not use the finger-counting strategy remained unaffected by the interference conditions. These results therefore demonstrate that visual experience plays an important role in implementing the sensori-motor habits that drive the development of finger–number interactions.
ABSTRACT: Objectives: Speech recognition varies considerably following cochlear implantation for reasons that are still poorly understood. Considering the role of frequency discrimination in normal speech recognition, the aim of this study was to investigate the association between deficits in auditory frequency discrimination and speech recognition in cochlear implant users. Methods: Frequency discrimination thresholds and speech recognition were assessed in a group of 20 cochlear implant users and 16 normally hearing controls. Results: Based on their results on the speech recognition task, the cochlear implant users were categorized as either proficient (n = 10) or non-proficient users (n = 10). The non-proficient cochlear implant users had poorer auditory frequency discrimination than the normally hearing participants and the proficient cochlear implant users (both P < 0.05). No significant difference was found between the proficient cochlear implant users and the normally hearing group (P > 0.05). Furthermore, a bivariate correlation analysis revealed a relationship between speech recognition and frequency discrimination. Conclusions: The present findings suggest an association between auditory frequency discrimination and speech recognition proficiency in cochlear implant users. Although no causal link can be drawn from these data, possible reasons for this association are discussed.
ABSTRACT: Little is known about the relation of alpha rhythms and object recognition. Alpha has generally been proposed to be associated with attention and memory and to be particularly important for mediating long-distance communication between neuronal populations. However, how these apply to object recognition is still unclear. This study aimed at describing the spatio-temporal dynamics of alpha rhythms while recognizing fragmented images of objects presented for the first time and presented again 24 hr later. Intracranial electroencephalography was performed in six epileptic patients undergoing presurgical evaluation. Time-frequency analysis revealed a strong alpha activity, mainly of the evoked type, propagating from posterior cerebral areas to anterior regions, which was similar whether the objects were recognized or not. Phase coherence analysis, however, showed clear phase synchronization specific to the moment of recognition. Twenty-four hr later, frontal regions displayed stronger alpha activity and more distributed phase synchronization than when images were presented for the first time. In conclusion, alpha amplitude seems to be related to a nonspecific mechanism. Phase coherence analysis suggests a communicational role of alpha activity in object recognition, which may be important for the comparison between bottom-up representations and memory templates.
Journal of Cognitive Neuroscience 01/2014; · 4.49 Impact Factor
ABSTRACT: The global level of hierarchical stimuli (Navon's stimuli) is typically processed more quickly and accurately than the local level; furthermore, differential hemispheric dominance has been described for local (left hemisphere, LH) and global (right hemisphere, RH) processing. However, neuroimaging and behavioral data indicate that stimulus category (letter or object) could modulate the hemispheric asymmetry for local level processing. Besides, when targets are unpredictably displayed at the global or local level, the participant has to switch between levels, and the magnitude of the switch cost increases with the number of repeated-level trials preceding the switch. The hemispheric asymmetries associated with level switching are an unresolved issue. LH areas may be involved in carrying over the target-level information in case of level repetition. These areas may also largely participate in the processing of level-changed trials. Here we hypothesized that RH areas underlie the inhibitory mechanism performed on the irrelevant level, as one of the components of the level-switching process. In an experiment using a within-subject design, hierarchical stimuli were briefly presented either to the right or to the left visual field. Thirty-two adults were instructed to identify the target at the global or local level. We assessed a possible RH dominance for non-target level inhibition by varying the attentional demands through the manipulation of level repetitions (two or four repeated-level trials before the switch). The behavioral data confirmed a LH specialization only for the local level processing of letter-based stimuli, and a detrimental effect of increased level repetitions before a switch. Further, the data provide evidence for a RH advantage in inhibiting the non-target level. Taken together, the data support the notion that multiple mechanisms underlie level-switch effects.
Frontiers in Psychology 01/2014; 5:252. · 2.80 Impact Factor
ABSTRACT: Emotional visual perception deficits constitute a major problem in alcohol-dependence. Indeed, the ability to assess the affective content of external cues is a key adaptive function, as it allows on the one hand the processing of potentially threatening or advantageous stimuli, and on the other hand the establishment of appropriate social interactions (by enabling rapid decoding of the affective state of others from their facial expressions). While such deficits have classically been considered to reflect a genuine emotion decoding impairment in alcohol-dependence, converging evidence suggests that underlying visual deficits might play a role in emotional alterations. This hypothesis appears particularly relevant given that data from healthy populations indicate that a coarse but fast analysis of visual inputs would allow emotional processing to arise from early stages of perception. After reviewing those findings and the associated models, the present paper underlines data showing that rapid interactions between emotion and vision could be impaired in alcohol-dependence and provides new research avenues that may ultimately offer a better understanding of the roots of emotional deficits in this pathological state.
Frontiers in Human Neuroscience 01/2014; 8. · 2.91 Impact Factor
ABSTRACT: Plasticity in the human and animal brain is the rule, the base for development, and the way to deal effectively with the environment for making the most efficient use of all the senses. When the brain is deprived of one sensory modality, plasticity becomes compensatory: the exception that invalidates the general loss hypothesis by giving the opportunity for effective change. Sensory deprivation comes with massive alterations in brain structure and function, behavioral outcomes, and neural interactions. Blind individuals perform as well as the sighted and, furthermore, show superior abilities in auditory, tactile and olfactory processing. This behavioral enhancement is accompanied by changes in occipital cortex function, where visual areas at different levels become responsive to non-visual information. The intact senses are in general used more efficiently in the blind but are also used more exclusively. New findings are disentangling these two aspects of compensatory plasticity. What is due to visual deprivation and what is dependent on the extended use of spared modalities? The latter seems to contribute highly to compensatory changes in the congenitally blind. Short-term deprivation through the use of blindfolds shows that cortical excitability of the visual cortex is likely to show rapid modulatory changes after a few minutes of light deprivation, and changes are therefore possible in adulthood. However, reorganization remains more pronounced in the congenitally blind. Cortico-cortical pathways between visual areas and the areas of preserved sensory modalities are inhibited in the presence of vision, but are unmasked after loss of vision or blindfolding as a mechanism likely to drive cross-modal information to the deafferented visual cortex. The development of specialized higher-order visual pathways independently of early sensory experience is likely to preserve their function and switch to the intact modalities. Plasticity in the blind is also accompanied by neurochemical and morphological changes; both intrinsic connectivity and functional coupling at rest are altered but are likewise dependent on different sensory experience and training.
Frontiers in Human Neuroscience 01/2014; 8:340. · 2.91 Impact Factor
ABSTRACT: Past research has shown that odor perception can be affected by how we label odors. The aim of this study was to expand on previous work by investigating the impact of labels on edibility, pleasantness, and intensity ratings as well as on reaction times when detecting labeled odors. We tested 50 subjects. Five odorants were administered, each with a positive and a negative label. Participants had to detect odors as fast as possible and then rate their edibility, pleasantness, and intensity. Because of a lack of fit, only 4 of the initial 5 odorants were analyzed. All odorants presented with positive labels were rated as being more edible than when they were presented with negative labels. Notably, the effect was also seen for the 2 nonfood odorants, suggesting an unbiased effect. All odorants presented with positive labels were rated as being more pleasant than when they were presented with negative labels. Labels also modulated intensity ratings and reaction times for some odors. In summary, odor labels affect pleasantness ratings and edibility perception. Although labels appear to also influence intensity ratings and reaction times, this seems to be a more complex relationship that could be modulated by additional factors such as odor valence, label fit, and possibly the edibility attributed to an odor or a label.
ABSTRACT: To verify whether a mismatch negativity (MMN) paradigm based on speech syllables can differentiate between good and poorer cochlear implant (CI) users on a speech recognition task.
Twenty adults with a CI and 11 normal hearing adults participated in the study. Based on a speech recognition test, ten CI users were classified as good performers and ten as poor performers. We measured the MMN with /da/ as the standard stimulus and /ba/ and /ga/ as the deviants. Separate analyses were conducted on the amplitude and latency of the MMN.
An MMN was evoked by both deviant stimuli in all normal hearing participants and in well performing CI users, with similar amplitudes for both groups. However, the amplitude of the MMN was significantly reduced for the poorer CI users compared to the normal hearing group and the good CI users. The MMN latency was longer for both groups of CI users. A bivariate correlation showed a significant positive correlation between the speech recognition score and the amplitude of the MMN.
The MMN can distinguish between CI users who have good versus poor speech recognition as assessed with conventional tasks.
Our findings suggest that the MMN can be used to assess speech recognition proficiency in CI users who cannot be tested with regular speech recognition tasks, such as infants and other non-verbal populations.
Clinical neurophysiology: official journal of the International Federation of Clinical Neurophysiology 10/2013; · 3.12 Impact Factor
ABSTRACT: In the absence of visual input, the question arises as to how complex spatial abilities develop and how the brain adapts to the absence of this modality. As such, the aim of the current study was to investigate the relationship between visual status and an important brain structure with a well-established role in spatial cognition and navigation, the caudate nucleus. We conducted a volumetric analysis of the caudate nucleus in congenitally and late blind individuals, as well as in matched sighted control subjects.
No differences in the volume of the structure were found either between congenitally blind (CB) and matched sighted controls or between late blind (LB) and matched sighted controls. Moreover, contrary to what was expected, no significant correlation was found between caudate volume and performance in a spatial navigation task. Finally, consistent with previously published reports, the volume of the caudate nucleus was found to be negatively correlated with age in the sighted; however such correlations were not significant in the blind groups.
Although there were no group differences, the latter finding suggests that visual deprivation may still have an effect on the developmental changes that occur in the caudate nucleus.
ABSTRACT: Light regulates multiple non-image-forming (or nonvisual) circadian, neuroendocrine, and neurobehavioral functions, via outputs from intrinsically photosensitive retinal ganglion cells (ipRGCs). Exposure to light directly enhances alertness and performance, so light is an important regulator of wakefulness and cognition. The roles of rods, cones, and ipRGCs in the impact of light on cognitive brain functions remain unclear, however. A small percentage of blind individuals retain non-image-forming photoreception and offer a unique opportunity to investigate light impacts in the absence of conscious vision, presumably through ipRGCs. Here, we show that three such patients were able to make non-random choices about the presence of light despite their complete lack of sight. Furthermore, 2 sec of blue light modified EEG activity when administered simultaneously with auditory stimulation. fMRI further showed that, during an auditory working memory task, less than a minute of blue light triggered the recruitment of supplemental prefrontal and thalamic brain regions involved in alertness and cognition regulation as well as key areas of the default mode network. These results, which should be considered a proof of concept, show that non-image-forming photoreception triggers some awareness of light and can have a more rapid impact on human cognition than previously understood, if brain processing is actively engaged. Furthermore, light stimulates higher cognitive brain activity, independently of vision, and engages supplemental brain areas to perform an ongoing cognitive process. To our knowledge, our results constitute the first indication that ipRGC signaling may rapidly affect fundamental cerebral organization, such that it could potentially participate in the regulation of numerous aspects of human brain function.
Journal of Cognitive Neuroscience 07/2013; · 4.49 Impact Factor
ABSTRACT: Contrasting the impact of congenital versus late-onset acquired blindness provides a unique model to probe how experience at different developmental periods shapes the functional organization of the occipital cortex. We used functional magnetic resonance imaging to characterize brain activations of congenitally blind, late-onset blind and two groups of sighted control individuals while they processed either the pitch or the spatial attributes of sounds. Whereas both blind groups recruited occipital regions for sound processing, activity in bilateral cuneus was only apparent in the congenitally blind, highlighting the existence of region-specific critical periods for crossmodal plasticity. Most importantly, the preferential activation of the right dorsal stream (middle occipital gyrus and cuneus) for the spatial processing of sounds was only observed in the congenitally blind. This demonstrates that vision has to be lost during an early sensitive period in order to transfer its functional specialization for space processing toward a non-visual modality. We then used a combination of dynamic causal modelling with Bayesian model selection to demonstrate that auditory-driven activity in primary visual cortex is better explained by direct connections with primary auditory cortex in the congenitally blind whereas it relies more on feedback inputs from parietal regions in the late-onset blind group. Taken together, these results demonstrate the crucial role of the developmental period of visual deprivation in (re)shaping the functional architecture and the connectivity of the occipital cortex. Such findings are clinically important now that a growing number of medical interventions may restore vision after a period of visual deprivation.
ABSTRACT: Anxiety can either impair or enhance performance depending on the context. Increased sensitivity to threat seems to be an important feature of sensory processing in anxiety, since anxious individuals tend to be more attentive to threatening visual stimuli. Evidence of anxiety effects in olfaction is rare, though alterations of olfactory performance in psychiatric patients and some effects of trait and state anxiety on olfactory performance have been reported. Our main objective was thus to investigate whether olfactory processing speed varies as a function of trait anxiety levels. We additionally investigated a possible preferential bias for unpleasant odors in highly anxious participants. Thirty-eight healthy adults participated in a simple odor detection task, where response times (RTs) and anxiety levels were measured. We compared RTs to a pleasant and an unpleasant food odor between high- and low-trait anxiety participants. We found that high-trait anxiety participants detected both odors faster than low-trait anxiety participants, independently of odor pleasantness. Moreover, trait anxiety levels significantly correlated with reaction times to both odors, indicating that trait anxiety but not odor pleasantness influences olfactory detection speed. These findings provide new insights into olfactory processing in healthy adults, showing how various levels of trait anxiety affect the olfactory modality.
ABSTRACT: Plasticity resulting from early sensory deprivation has been investigated in both animals and humans. After sensory deprivation, brain areas that are normally associated with the lost sense are recruited to carry out functions in the remaining intact modalities. Previous studies have reported that it is almost exclusively the visual dorsal pathway which is affected by auditory deprivation. The purpose of the current study was to further investigate the possible reorganization of visual ventral stream functions in these individuals in both the auditory and the visual cortices. Fifteen pre-lingually, profoundly deaf subjects were compared with a group of sixteen hearing subjects. We used fMRI to explore the areas underlying the processing of two similar visual motion stimuli that were nevertheless designed to evoke different types of processing: 1) a global motion stimulus (GMS), which preferentially activates regions of the dorsal visual stream, and 2) a form-from-motion (FFM) stimulus, which is known to recruit regions from both visual streams. No significant differences between deaf and hearing individuals were found in target visual and auditory areas when the motion and form components of the stimuli were isolated (contrasted with a static visual image). However, increases in activation were found in the deaf group in the superior temporal gyrus (BA 22 and 42) and in an area located at the junction of the parieto-occipital sulcus and the calcarine fissure (encompassing parts of the cuneus, precuneus and the lingual gyrus) for the GMS and FFM conditions as well as for the static image, relative to a baseline condition absent of any visual stimulation. These results suggest that the observed cross-modal recruitment of auditory areas in deaf individuals does not appear to be specialized for motion processing, but rather is present for both motion and static visual stimuli.
ABSTRACT: The ability to recognize and integrate emotions from another person's facial and vocal expressions is a fundamental cognitive skill involved in the effective regulation of social interactions. Deficits in such abilities have been suggested as a possible source for certain atypical social behaviours manifested by persons with autistic spectrum disorders (ASD). In the present study, we assessed the recognition and integration of emotional expressions in individuals with ASD using a validated set of ecological stimuli comprised of dynamic visual and auditory (non-verbal) vocal clips. Participants with ASD and typically developing controls (TD) were asked to discriminate between clips depicting expressions of disgust and fear presented either visually, auditorily or bimodally (audio-visual). The group of participants with ASD was less efficient at discriminating emotional expressions across all conditions (unimodal and bimodal). Moreover, they required a higher signal-to-noise ratio for the discrimination of visual or auditory presentations of disgust versus fear expressions. These results suggest an altered sensitivity to emotion expressions in this population that is not modality-specific. In addition, the group of participants with ASD benefited from exposure to bimodal information to a lesser extent than did the TD group, indicative of a decreased multisensory gain in this population. These results are the first to compellingly demonstrate joint difficulties for both the perception and the integration of multisensory emotion expressions in persons with ASD.
ABSTRACT: Although significant progress has been made over the last decades, the chemical senses remain less well explored than vision or audition. One method to assess participants' ability to identify or localize odors consists of applying dichotomous stimuli (e.g. left- and right-sided stimulation). In this study we aimed to explore localization and identification mechanisms by investigating whether response times and response accuracy were correlated, with the aim of establishing the pertinence of response times as an additional measure for the assessment of olfactory function (1). We further examined an advantage of the right nostril which has been reported in several publications (2). We delivered two mixed olfactory/trigeminal odors (benzaldehyde and eucalyptol) to one nostril at a time in a pseudorandomized order to 23 normosmic participants; the other nostril received an odor-free air puff. In half of the trials we asked the participants to identify the stimulated nostril; in the other half, they indicated which odor they had received. We recorded response accuracy and response times. Participants reached higher accuracy for odor identification than for localization, driven by benzaldehyde. For the stimulus eucalyptol exclusively, we observed that participants were faster to respond after stimulation of the right nostril than of the left nostril in the localization task. Finally, response times were correlated with response accuracy for the identification task, but not for localization. Our findings suggest that odor identification is easier than odor localization. In addition, we find further support for an advantage of the right nostril over the left nostril. Moreover, the measurement of reaction times may supplement other techniques for the assessment of odor identification.
ABSTRACT: Induced gamma-band response (iGBR) has been linked to coherent perception of images and is thought to represent the synchronisation of neuronal populations mediating binding of elements composing the image and the comparisons with memory for proper recognition. This study uses fragmented images with intracranial electroencephalography to investigate the precise spatio-temporal dynamics of iGBR elicited by the recognition of objects presented for the first time and 24 hours later. Results show an increased iGBR at recognition in regions involved in bottom-up processes such as the cuneus and the lateral occipital complex. Top-down facilitation involved the lingual gyrus, the precuneus and the superior parietal lobule when images were presented for the first time. Twenty-four hours later, top-down facilitation was mediated by frontal areas involved in retrieval from episodic memory. This study showed that the classically reported iGBR is related to object recognition and that top-down processes vary according to task demand.
ABSTRACT: The relative reliability of separate sensory estimates influences the way they are merged into a unified percept. We investigated how eccentricity-related changes in the reliability of auditory and visual stimuli influence their integration across the entire frontal space. First, we surprisingly found that despite a strong decrease in auditory and visual unisensory localization abilities in the periphery, the redundancy gain resulting from the congruent presentation of audio-visual targets was not affected by stimulus eccentricity. This result therefore contrasts with the common prediction that a reduction in sensory reliability necessarily induces an enhanced integrative gain. Second, we demonstrate that the visual capture of sounds observed with spatially incongruent audio-visual targets (ventriloquist effect) steadily decreases with eccentricity, paralleling a lowering of the relative reliability of unimodal visual over unimodal auditory stimuli in the periphery. Moreover, at all eccentricities, the ventriloquist effect positively correlated with a weighted combination of the spatial resolution obtained in unisensory conditions. These findings support and extend the view that the localization of audio-visual stimuli relies on an optimal combination of auditory and visual information according to their respective spatial reliability. Altogether, these results show that the external spatial coordinates of multisensory events relative to an observer's body (e.g., eye or head position) influence how this information is merged, and therefore determine the perceptual outcome.
Journal of Vision 01/2013; 13(12). · 2.48 Impact Factor
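The "optimal combination of auditory and visual information according to their respective spatial reliability" described in the abstract above is conventionally formalized as maximum-likelihood cue integration, in which each cue is weighted by its inverse variance. The sketch below is a hypothetical illustration of that standard model; the function name and example numbers are illustrative assumptions, not values or code from the study.

```python
import numpy as np

def fuse(x_v, sigma_v, x_a, sigma_a):
    """Reliability-weighted fusion of a visual and an auditory
    location estimate (standard maximum-likelihood model).

    Each cue is weighted by its reliability, i.e. inverse variance."""
    w_v = (1 / sigma_v**2) / (1 / sigma_v**2 + 1 / sigma_a**2)
    w_a = 1 - w_v
    x_av = w_v * x_v + w_a * x_a                      # fused location estimate
    sigma_av = np.sqrt((sigma_v**2 * sigma_a**2) /
                       (sigma_v**2 + sigma_a**2))     # fused uncertainty
    return x_av, sigma_av

# Central field (hypothetical numbers): vision is far more reliable,
# so the fused percept sits near the visual location (strong capture).
print(fuse(x_v=0.0, sigma_v=1.0, x_a=5.0, sigma_a=4.0))

# Periphery (hypothetical numbers): visual reliability drops, the
# auditory cue gains weight, and visual capture weakens.
print(fuse(x_v=0.0, sigma_v=3.0, x_a=5.0, sigma_a=4.0))
```

Under this model, the fused estimate is always at least as reliable as the better single cue, which is consistent with the redundancy gain the abstract reports, while the declining visual weight in the periphery mirrors the weakening ventriloquist effect.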