Article

Congenitally blind humans use different stimulus selection strategies in hearing: An ERP study of spatial and temporal attention


Abstract

Superior temporal processing skills, in both the auditory and the tactile system, have been reported in blind as compared to sighted humans. The present experiment tested whether blind people, as a possible consequence, prefer temporal rather than spatial stimulus selection strategies. Eight congenitally blind adults were tested in a selective attention experiment that simultaneously manipulated spatial and temporal attention. Participants had to attend to an auditory offset stimulus demarcating the end of a short (600 ms) or long (1200 ms) interval. They had to detect slightly less intense offset markers at the attended point in time, presented in the left (half of the trials) or right (other half of the trials) hemifield. Event-related potentials (ERPs) were recorded throughout the experiment. Whereas the N1 was significantly enhanced by both spatial and temporal attention in the sighted (see Lange, Kramer, & Röder, 2006), only a temporal attention effect was found in the blind. Moreover, in both groups a second, longer-lasting negativity was observed for offset markers presented at the attended as compared to the unattended spatial location. This negativity was modulated by temporal attention only in the blind. These results are consistent with the assumption of a higher priority of time for stimulus selection in the absence of vision from birth.
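The N1 attention effect described above is typically quantified by averaging single-trial EEG epochs into an ERP and comparing the mean amplitude in the N1 window (around 100 ms post-stimulus) between attended and unattended conditions. The sketch below illustrates this logic on synthetic data; it is not the authors' analysis pipeline, and all names, the sampling rate, and the window bounds are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch (not the authors' pipeline): quantifying an N1
# attention effect from epoched EEG data. Shapes and parameters assumed.
rng = np.random.default_rng(0)
sfreq = 500                                # sampling rate in Hz (assumed)
times = np.arange(-0.1, 0.4, 1 / sfreq)    # epoch from -100 to 400 ms

def simulate_epochs(n_trials, n1_gain):
    """Synthetic single-trial epochs with an N1-like negativity at ~100 ms."""
    n1 = -n1_gain * np.exp(-((times - 0.1) ** 2) / (2 * 0.02 ** 2))
    noise = rng.normal(0.0, 1.0, size=(n_trials, times.size))
    return n1[None, :] + noise

# Attended stimuli carry a larger (more negative) N1 than unattended ones.
attended = simulate_epochs(100, n1_gain=5.0)
unattended = simulate_epochs(100, n1_gain=3.0)

# Average across trials to obtain the ERP, then take the mean amplitude
# in an assumed N1 window (80-120 ms post-stimulus).
win = (times >= 0.08) & (times <= 0.12)
n1_att = attended.mean(axis=0)[win].mean()
n1_una = unattended.mean(axis=0)[win].mean()

attention_effect = n1_att - n1_una   # negative difference = enhanced N1
print(f"N1 attended:   {n1_att:.2f} uV")
print(f"N1 unattended: {n1_una:.2f} uV")
print(f"Attention effect (attended - unattended): {attention_effect:.2f} uV")
```

In this framing, the paper's finding corresponds to the attended-minus-unattended difference being reliably negative for temporal attention in the blind, but absent for spatial attention in the N1 range.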


... The study by Coull and Nobre (1998) ... (Correa et al., 2006; Coull & Nobre, 1998; Griffin et al., 2002; Miniussi et al., 1999; Lange et al., 2003; Röder et al., 2007; Sanders & Astheimer, 2008), similar to what has also been reported for spatial attention (e.g., Giard, Perrin, Pernier, & Peronnet, 1988; Hillyard et al., 1973; Näätänen, Gaillard, & Mäntysalo, 1978; Woldorff & Hillyard, 1991; see also Näätänen & Picton, 1987). By contrast, no modulation of sensory processes is observed when attention has to be divided between the cued and the uncued spatial position (Eimer, 1994, 1996; Schröger & Eimer, 1997) ... (Hillyard et al., 1973; Näätänen, 1990) ... (Coull & Nobre, 1998; Nobre, 2001) ...
... (Coull & Nobre, 1998; Griffin et al., 2002, Experiment 1; Miniussi et al., 1999), the attention effect in reaction times was restricted to the short interval, so that attention apparently was reoriented in the long-invalid condition (compare Coull & Nobre, 1998; Correa et al., 2006; Griffin et al., 2002, Experiment 2; Miniussi et al., 1999). The absence of an N1 attention effect thus contradicts the findings of other studies on auditory temporal attention, which consistently reported an N1 modulation (Lange et al., 2003; Röder et al., 2007; Sanders & Astheimer, 2008) ... (Walter, Winter, Cooper, McCallum, & Aldridge, 1964). It has been shown that the amplitude of the CNV changes when the amount of attention required to perform a task varies (Tecce, 1972; Walter et al., 1964). ...
... The question remains unanswered as to why previous studies on temporal orienting of attention in the auditory modality have so far consistently reported a modulation of the N1 component (e.g., Lange et al., 2003; Röder et al., 2007; Sanders & Astheimer, 2008; for a review, see Lange & Röder, 2010) ... (Lange et al., 2003; Sanders & Astheimer), whereas in studies that reported only late attention effects, attention had to be reoriented on every trial, each time toward the time point announced by the cue (e.g., Griffin et al., 2002, Experiment 2; Miniussi et al., 1999). ...
... Interestingly, a growing number of behavioral, electrophysiological, and brain imaging studies have shown improved auditory localization skills in blind individuals (Collignon et al., 2006; Doucet et al., 2005; Muchnik et al., 1991; Röder et al., 2007, 1999). In some of those studies, blind and sighted participants attended to a sound source in space and detected rare target stimuli while ignoring more frequent auditory standards and other (task-irrelevant) rare deviant stimuli presented at the same or other loudspeakers (e.g., Röder et al., 1999a). ...
... In a previous study, Röder et al. (2007) asked 8 congenitally blind individuals and 12 sighted controls to attend either to the left or right loudspeaker at which auditory stimuli were presented and to concentrate either on a long or short time interval which separated the two auditory stimuli (S1 and S2) from each other. The authors examined the length of the auditory refractory period across the two groups. ...
... Refractory periods are defined as time periods during which the cell is not able to generate further action potentials. Interestingly, congenitally blind individuals showed a more pronounced ERP negativity for the second auditory stimulus (S2), suggesting shorter auditory refractory periods in the blind compared to the sighted controls (Röder et al., 2007). This implies that blind participants had an advantage in the processing of auditory stimuli. ...
Article
The question of whether spatial selective attention is necessary in order to process vocal affective prosody has been controversially discussed in sighted individuals: whereas some studies argue that attention is required in order to process emotions, other studies conclude that vocal prosody can be processed even outside the focus of spatial selective attention. Here, we asked whether spatial selective attention is necessary for the processing of affective prosodies after visual deprivation from birth. For this purpose, pseudowords were presented at the left or right loudspeaker and spoken in happy, neutral, fearful, or threatening prosodies. Congenitally blind individuals (N = 8) and sighted controls (N = 13) had to attend to one of the loudspeakers and detect rare pseudowords presented at the attended loudspeaker during EEG recording. Emotional prosody of the syllables was task-irrelevant. Blind individuals outperformed sighted controls by being more efficient in detecting deviant syllables at the attended loudspeaker. Higher auditory N1 amplitudes were observed in blind individuals compared to sighted controls. Additionally, sighted controls showed enhanced attention-related ERP amplitudes in response to fearful and threatening voices during the time range of the N1. By contrast, blind individuals revealed enhanced ERP amplitudes at attended relative to unattended locations irrespective of affective valence in all time windows (110–350 ms). These effects were mainly observed at posterior electrodes. The results provide evidence for "emotion-general" auditory spatial selective attention effects in congenital blindness and provide further indirect support for the idea of a reorganization of the voice processing brain system following visual deprivation from birth.
... By contrast, visually deprived individuals initially tend to filter stimuli on the basis of their modality, and only at later stages of information processing are they able to suppress the processing of inputs from the task-irrelevant modality at the attended location. A later study additionally showed that, when stimuli must be selected according to more than one dimension, congenitally blind individuals tend to filter them first on the basis of nonspatial features, such as their expected time point of occurrence, rather than their position in space, as sighted individuals do (Röder, Krämer, & Lange, 2007). At later stages, blind individuals adopt a parallel selection based on both spatial and temporal stimulus dimensions (Röder, Krämer, & Lange, 2007). In both studies, however, the selection strategies adopted by the blind participants resulted in performance that was both faster and more accurate (e.g., Röder et al., 1996). ...
... The idea that the improved performance observed in the spared sensory modalities of visually deprived individuals compared to the sighted can, at least partially, be subserved by top-down modulation has also been explored (e.g., Röder, Krämer, & Lange, 2007). For instance, two electroencephalography (EEG) studies have used tactile and auditory oddball tasks in order to compare the responses elicited by stimuli within these spared sensory modalities in blind participants with those of sighted participants (e.g., Röder et al., 1996). ...
Article
We highlight the results of those studies that have investigated the plastic reorganization processes that occur within the human brain as a consequence of visual deprivation, as well as how these processes give rise to behaviorally observable changes in the perceptual processing of auditory and tactile information. We review the evidence showing that visual deprivation affects the establishment of the spatial coordinate systems involved in the processing of auditory and tactile inputs within the peripersonal space around an individual. In blind individuals, the absence of a conjoint activation of external coordinate systems across modalities co-occurs with a higher capacity to direct auditory and tactile attentional resources to different spatial locations and to ignore irrelevant distractors. Both processes could thus contribute to the reduced spatial multisensory binding that has been observed in those who are blind. The interplay between auditory and tactile information in visually deprived individuals is modulated by attentional factors. Blind individuals typically outperform sighted people in those tasks where the target is presented in one sensory modality (and the other modality acts as a distractor). By contrast, they are less efficient in tasks explicitly requiring the combination of information across sensory modalities. The review highlights how these behavioral effects are subserved by extensive plastic changes at the neural level, with brain areas traditionally involved in visual functioning switching and being recruited for the processing of stimuli within the intact residual senses. We also discuss the roles played by other intervening factors with regard to compensatory mechanisms, such as previous visual experience, age at onset of blindness, and learning effects.
... temporal order processing (6, 8)) and higher-level cognitive tasks (e.g., auditory (9) and verbal memory (10, 11), temporal attention (12), musical meter perception (13, 14), temporal order verbal working memory (15), and the perception of ultra-fast speech (16–19)), when compared to normally sighted controls. ...
... One such study, investigating resting state MEG, found increased connectivity in the delta and gamma ranges within visual cortex in the CB (62). Interestingly, despite the reduction in visual alpha power, the alpha connectivity between visual cortex and other cortical areas was preserved (note, however, that the alpha band in this study was defined as a broader frequency band including the traditional alpha and beta bands, 8–20 Hz). In a sound categorization task, the auditory and visual areas were more strongly connected in the blind, as measured by correlations of gamma band power (64), providing support for the notion of the visual cortex being incorporated into the intact sensory systems carrying out non-visual tasks. ...
... A large number of studies have connected temporal processing to the entrainment of auditory cortex oscillations (50, 85–87). Thus, more generally, our findings of frequency increases of spectral power peaks might be related to increased temporal processing abilities, as often reported for congenitally blind individuals (6–8, 12, 15, 18, 19, 88). In line with these assumptions, on the other side of the plasticity spectrum, age-related decline in processing fast speech has been related to a slowing of theta oscillations (89), additionally supporting the association of spectral dynamics within auditory cortex with temporal (speech) processing. ...
Preprint
Full-text available
The human brain exhibits rhythms that are characteristic for anatomical areas and presumably involved in diverse perceptual and cognitive processes. Visual deprivation results in behavioral adaptation and cortical reorganization, particularly affecting sensory cortices. Whether these plasticity-related changes are accompanied by altered spectral properties of neural signals and whether certain brain areas are particularly targeted by these changes is unknown. With a recently introduced approach, we analyzed MEG resting state data of a group of congenitally blind and matched sighted individuals. First, using clustering procedures (k-means and Gaussian Mixture Models), we identified brain region-specific spectral clusters. Second, a classifier was employed to test the specificity of the spectral profiles within and the differences between groups. We replicated the previously reported finding of area-specific spectral profiles, indicated by high classification performance in the sighted. Additionally, we found high classification performance in the blind, suggesting that after deprivation-related restructuring, area-specific spectral profiles can be consistently identified. Crucially, in the cross-group classification (sighted vs. blind), several sensory (visual and auditory) and right frontal brain areas were classified significantly worse compared to the control condition. Overall, the spectral profiles of those brain areas showed increased neuronal power in higher frequency bands, possibly reflecting acceleration of the regionally prevalent brain rhythms in the blind compared to the sighted. We provide evidence that visual deprivation-related plasticity selectively alters the spectral profiles of right frontal and sensory brain areas, possibly reflecting increased temporal processing capabilities (auditory, frontal cortices) and changes in the visual inhibitory-excitatory circuits in the blind.
... However, less is known about how temporal expectancies are formed in blind adults and whether compensatory mechanisms related to the loss of sight result in increased sensitivity to auditory temporal patterns (see Kupers & Ptito, 2014; Pascual-Leone, Cohen, Brasil-Neto, Valls-Sole, & Hallett, 1994; Pascual-Leone, 1996; Pascual-Leone & Hamilton, 2001; Ptito & Kupers, 2005; Wan, Wood, Reutens, & Wilson, 2010). Previous research on time perception and blindness has focused on explicit processes in temporal discrimination tasks (e.g., Lerens, Araneda, Renier, & De Volder, 2014; Röder, Krämer, & Lange, 2007; Röder, Rösler, & Spence, 2004; Stevens & Weaver, 2005; Van der Lubbe, Van Mierlo, & Postma, 2010). As blind individuals are more accurate than sighted individuals in these temporal discrimination tasks, and temporal expectancies are necessary to accurately discriminate timing, it is possible that the blind also form temporal expectancies more readily than the sighted. ...
... In fact, the enhancement of auditory perception in blindness has already been demonstrated with regard to auditory spatial detection (Collignon, Voss, Lassonde, & Lepore, 2009; Röder et al., 1999b), auditory attention (Fieger et al., 2006; Röder et al., 1999a), and temporal discrimination tasks (e.g., Lerens et al., 2014; Van der Lubbe et al., 2010). Studies on auditory rhythms have shown improved temporal processing in early blind individuals compared to sighted individuals (Lerens et al., 2014; Röder et al., 2004, 2007; Stevens & Weaver, 2005; Van der Lubbe et al., 2010). Van der Lubbe et al. (2010) found that duration discrimination was more accurate for early blind compared to sighted individuals for both auditory and tactile modalities. ...
... The fact that the blind group only showed a learning advantage over the sighted group for nonmetrical rhythms could be explained by the evidence for conscious control over the learned rhythms demonstrated in the generation task by the blind group. Previous experiments reporting enhanced auditory processing in the blind compared to the sighted have used tasks that require explicit attention to differences in duration or between rhythms, or to deviations from isochrony (Lerens et al., 2014; Röder et al., 1999a, 2004, 2007; Stevens & Weaver, 2005; Van der Lubbe et al., 2010). ...

Fig. 4. Mean response time increase in test blocks relative to preceding and following blocks for sighted (blue) and blind (red) groups in metrical (solid bars) and nonmetrical (empty bars) conditions. In the metrical condition, test A refers to a strongly metrical (SM) test rhythm and test B refers to a weakly metrical (WM) test rhythm.
Article
Full-text available
Forming temporal expectancies plays a crucial role in our survival as it allows us to identify the occurrence of temporal deviants that might signal potential dangers. The dynamic attending theory suggests that temporal expectancies are formed more readily for rhythms that imply a beat (i.e., metrical rhythms) compared to those that do not (i.e., nonmetrical rhythms). Moreover, metrical frameworks can be used to detect temporal deviants. Although several studies have demonstrated that congenital or early blindness correlates with modality-specific neural changes that reflect compensatory mechanisms, few have examined whether blind individuals show a learning advantage for auditory rhythms and whether learning can occur unintentionally and without awareness, that is, implicitly. We compared blind to sighted controls in their ability to implicitly learn metrical and nonmetrical auditory rhythms. We reasoned that the loss of sight in blindness might lead to improved sensitivity to rhythms and predicted that the blind learn rhythms more readily than the sighted. We further hypothesized that metrical rhythms are learned more readily than nonmetrical rhythms. Results partially confirmed our predictions; the blind group learned nonmetrical rhythms more readily than the sighted group but the blind group learned metrical rhythms less readily than the sighted group. Only the sighted group learned metrical rhythms more readily than nonmetrical rhythms. The blind group demonstrated awareness of the nonmetrical rhythms while learning was implicit for all other conditions. Findings suggest that improved deviant-sensitivity might have provided the blind group a learning advantage for nonmetrical rhythms. Future research could explore the plastic changes that affect deviance-detection and stimulus-specific adaptation in blindness.
... This would provide support for the hypothesis that hearing (the remaining sense) is indeed enhanced in functions in which it is already specialized (temporal domain; see Figure 1(b)). An investigation of both the spatial and temporal processing abilities of blind individuals showed that they indeed rely more on temporal than on spatial aspects when selecting a stimulus [77]. Enhanced N1 ERP activation during temporal, but not spatial, stimulus selection in blind individuals supports this notion [77] (Figures 1(a) and 1(b)). ...
Article
Full-text available
One of the most significant effects of neural plasticity manifests in the case of sensory deprivation, when cortical areas that were originally specialized for the functions of the deprived sense take over the processing of another modality. Vision and audition represent two important senses needed to navigate through space and time. Therefore, the current systematic review discusses the cross-modal behavioral and neural consequences of deafness and blindness by focusing on spatial and temporal processing abilities, respectively. In addition, movement processing is evaluated as it combines both spatial and temporal information. We examine whether the sense that is not primarily affected changes in its own properties or in the properties of the deprived modality (i.e., temporal processing as the main specialization of audition and spatial processing as the main specialization of vision). References to the metamodal organization, supramodal functioning, and the revised neural recycling theory are made to address global brain organization and plasticity principles. Generally, according to the reviewed studies, behavioral performance is enhanced in those aspects for which both the deprived and the overtaking senses provide adequate processing resources. Furthermore, the behavioral enhancements observed in the overtaking sense (i.e., vision in the case of deafness and audition in the case of blindness) are clearly limited by the processing resources of the overtaking modality. Thus, the brain regions that were previously recruited during the behavioral performance of the deprived sense now support a similar behavioral performance for the overtaking sense. This finding suggests a more input-unspecific and processing principle-based organization of the brain. Finally, we highlight the importance of controlling for and stating factors that might impact neural plasticity and the need for further research into visual temporal processing in deaf subjects.
... These negative ERP components reflect modality-specific sensory processing within the auditory cortex (26) and, as expected (e.g., 27), are enhanced in blind compared to sighted individuals. With regard to the posterior scalp regions (see Fig. 3), the earlier components are still more pronounced in blind than in sighted individuals, in line with previous literature reporting a posterior shift in the scalp topography of the auditory ERP responses following blindness (18, 28, 29). ...
... As expected for the processing of central sounds (e.g., 18, 27), there are no evident differences between the two groups at central scalp sites. However, the amplitude of the N1 elicited by central tones is higher than the amplitude of the same component elicited by unpredictable task-irrelevant noise bursts. ...
Article
Full-text available
Previous research has shown that peripheral, task-irrelevant sounds elicit activity in contralateral visual cortex of sighted people, as revealed by a sustained positive deflection in the event-related potential (ERP) over the occipital scalp contralateral to the sound’s location. This Auditory-evoked Contralateral Occipital Positivity (ACOP) appears between 200–450 ms after sound onset, and is present even when the task is entirely auditory and no visual stimuli are presented at all. Here, we investigate whether this cross-modal activation of contralateral visual cortex is influenced by visual experience. To this end, ERPs were recorded in 12 sighted and 12 blind subjects during a unimodal auditory task. Participants listened to a stream of sounds and pressed a button every time they heard a central target tone, while ignoring the peripheral noise bursts. It was found that task-irrelevant noise bursts elicited a larger ACOP in blind compared to sighted participants, indicating for the first time that peripheral sounds can enhance neural activity in visual cortex in a spatially lateralized manner even in visually deprived individuals. Overall, these results suggest that the cross-modal activation of contralateral visual cortex triggered by peripheral sounds does not require any visual input to develop, and is rather enhanced by visual deprivation.
... (Focker et al., 2012; Focker et al., 2015), referring to EEG responses after 100 ms such as the N1 and P2, blind individuals show shorter latencies (Elbert et al., 2002; Manjunath et al., 1998; Naveen et al., 1997; Roder et al., 1996), larger N1 amplitudes (Roder et al., 2007; Roder et al., 1996; Topalidis et al., 2020), and shorter refractory periods (Roder et al. ...). The TVA (temporal voice areas) and the FFA (fusiform face area) are linked both structurally and functionally (Blank et al., 2011; Blank et al., 2015; Blank et al., 2014; Schall et al., 2013; von Kriegstein et al., 2005). Depending on the task that evokes its activation, the FFA can be divided into two functional regions: the FFA evoked by familiar-voice recognition tasks in auditory settings, i.e., the cFFA (cross-modal FFA), and the FFA evoked by visual face-selective processing tasks, i.e., the vFFA (visual FFA) (Blank et al., 2011; von Kriegstein et al., 2008) ... the connectivity pattern between the TVA and the cFFA. After (unimodal) voice familiarization prior to the experiment, prosopagnosic patients recognized voices markedly worse than controls, yet their FFA represented a cross-modal response to familiar voice information, and the strength of the functional connectivity between the FFA (i.e., cFFA) and the STS did not differ from that of controls (von Kriegstein et al., 2006). This indicates that, without any visual face cues (in either the pre-experiment learning phase or the formal experiment), the cFFA can respond to familiar voice information independently of face representation ability and can build functional connections with the STS to support voice recognition. On the other hand, some studies have emphasized the involvement of the vFFA in familiar-voice recognition and the connectivity pattern between the vFFA and the TVA. These studies used voices personally familiar to the listener (with known faces) (von Kriegstein et al., 2005) or voice materials learned jointly with faces (von Kriegstein et al., 2008; Schall et al., 2013), defined the FFA (i.e., vFFA) from an independent visual face localizer task as the region of interest, and found that it is involved in voice recognition (von Kriegstein et al., 2008; Schall et al., 2013) and functionally connected with the STS (von Kriegstein et al., 2005). Using magnetoencephalography (MEG), Schall et al. (2013) found that, compared with an "occupation (non-face picture)-voice" control condition, prior face-voice learning elicited right FFA activation around 100 ms of voice recognition processing and modulated the auditory sensory component M200 at 200 ms (Matsuzaki et al., 2012), but Schall et al. (... , 2009) ... and during this processing built functional connectivity with the left STS (Dormal et al., 2018). In both early and late blind individuals, the right aFG shows a matching effect for familiar voice identity (Holig et al., 2014a, 2014b). Taken together, these results suggest that the FG region involved in voice processing in the blind is mainly the right aFG. The aFG is not only a region for the selective processing of visual faces (Jonas et al., 2016) but also a key node for the processing of familiar face identity (Jonas et al., 2015). Jonas et al. (2016) used intracranial electroencephalography ... (Holig et al., 2014a, 2014b) and voice-selective processing (Gougoux et al., 2009; Dormal et al., 2018) ... (Kupers et al., 2006; Ptito et al., 2008; Qin & Yu, 2013). fMRI and positron emission tomography (PET) studies have demonstrated that stimulation in one sensory channel can suppress the responses of the corresponding brain regions to other sensory channels (Haxby et al., 1994; Johnson & Zatorre, 2005; Kawashima et al., 1995; Laurienti et al., 2002). Owing to the lack of visual input, the visual cortex of blind individuals can only recruit and enhance existing auditory or tactile inputs to process (non-visual) information, a phenomenon termed the "unmasking effect"; for example, V1 in the blind occipital lobe strengthens its structural and functional connectivity with A1 to perform auditory tasks (Klinge et al. ... Benetti et al., 2018), indicating that sensory-deprived groups do not show a stronger "TVA-FFA" connectivity pathway, and that, compared with "TVA-FFA", the "V2/3-TVA" pathway in early deaf individuals ...
Article
The human voice, as an important part of one's auditory environment, contains a large amount of paralinguistic information that helps identify individuals. Especially for blind individuals, the lack of visual face experience makes voice information the main source for perceiving another person's individual characteristics. By combining research on voice-selective and voice-identity processing in both sighted and blind individuals, the present study attempts to reveal the general human voice processing mode and the specific voice processing mechanism of blind persons. It also shows that the fusiform gyrus, related to visual face processing, is involved in voice tasks in both groups. The auditory-visual integration model and the cross-modal reorganization hypothesis based on the "unmasking effect" provide respective explanatory mechanisms for the different neural bases of voice processing between the two groups. In the future, researchers should further investigate the voice processing strategies of blind individuals and the role of the left superior temporal sulcus/gyrus in voice processing.
... Studies on auditory and tactile modalities support this sensory advantage attributed to the blind. Several studies have indeed shown superior performance of the blind in tactile discrimination [1, 5, 13, 15, 35, 39, 42], sound localization, and pitch discrimination [7, 9, 12, 17, 18, 20, 26, 29–31, 40, 41]. ...
Article
It is generally acknowledged that people blind from birth develop supra-normal sensory abilities in order to compensate for their visual deficit. While extensive research has been done on the somatosensory and auditory modalities of the blind, information about their sense of smell remains scant. The goal of this study was therefore to compare odor perception and odor awareness in a group of 11 congenitally blind and 14 sighted control subjects. We measured odor detection threshold, odor discrimination, and odor identification using the Sniffin' Sticks test. Participants also filled in the Odor Awareness Scale (OAS) to assess consciousness of olfactory sensations. Our data showed that blind subjects had a lower odor detection threshold compared to the sighted. However, no group differences were found for odor discrimination and odor identification. Interestingly, the OAS revealed that blind participants scored higher for odor awareness. The largest group differences were found for items of the OAS that measure responses to body odors and fragrances. We conclude that blind subjects rely more on their sense of smell than the sighted in order to assess their environment and to recognize places and other people.
... First, the contingent negative variation (CNV; Walter, Winter, Cooper, McCallum, & Aldridge, 1964), which is known to reflect event anticipation and/or motor preparation (for a review, see Brunia, 2004), is larger prior to an attended time point (e.g., Lange et al., 2003; Sanders & Astheimer, 2008). Moreover, it has been shown that temporal attention modulates both early and late stages of auditory processing, as indicated by enhancements in the time ranges of the N1 and the P3 of the ERPs to temporally attended sounds (Lange, 2012; Lange et al., 2003; Röder, Krämer, & Lange, 2007; Sanders & Astheimer, 2008). It remains undetermined so far which mechanisms underlie these effects (but see Lange, 2012, for an investigation of the P3 effect of temporal orienting). ...
Article
It has been repeatedly shown that the auditory N1 is enhanced for sounds presented at an attended time point. The present study investigated the underlying mechanisms using a temporal cuing paradigm. In each trial, an auditory cue indicated at which time point a second sound could be relevant for response selection. Crucially, in addition to temporal attention, two physical sound features with known effects on the sensory N1 were manipulated: location and intensity. Positive evidence for conjoint effects of attention and location or attention and intensity would corroborate the notion that the sensory N1 was modulated by temporal attention, thus supporting a gain mechanism. However, the N1 effect of temporal attention was not similarly lateralized as the sensory N1, and, moreover, it was independent of sound intensity. Thus, the present results do not provide compelling evidence that temporal attention involves an increase in sensory gain.
... been demonstrated when temporal orienting was combined with spatial orienting (e.g., Mathewson, Fabiani, Gratton, Beck, & Lleras, 2010; Rohenkohl, Cravo, Wyart, & Nobre, 2012; see also Cravo, Rohenkohl, Wyart, & Nobre, 2013), the effect on RTs is robust and has been replicated in both vision and audition, as well as across various paradigms (e.g., Coull & Nobre, 1998; Lange, 2009; Sanabria et al., 2011; see also Nobre, 2001; Sanabria & Correa, 2013). At a physiological level, temporal orienting was repeatedly found to enhance late, probably decision- or response-related, brain potentials (e.g., Lange & Röder, 2006; Lange et al., 2003; Röder, Krämer, & Lange, 2007; see also Griffin, Miniussi, & Nobre, 2002; Miniussi, Wilding, Coull, & Nobre, 1999). Moreover, at least for the auditory modality, early brain potentials were also shown to be affected by temporal orienting (enhancements, e.g., Lange et al., 2003; Rimmele, Jolsvai, & Sussman, 2011; Sanders & Astheimer, 2008; see also Lange, 2009, for a reduction of the N1 by temporal orienting). ...
Article
Temporal orienting-that is, selective attention to instants in time-has been shown to modulate performance in terms of faster responses in a variety of paradigms. Electrophysiological recordings have shown that temporal orienting modulates neural processing at early, probably perceptual, and late, probably decision- or response-related, stages. Recently, it was shown that the effect of temporal orienting on early auditory brain potentials is independent of the effect of the physical sound feature intensity. This indicates that temporal orienting might not affect stimulus processing by increasing the sensory gain of attended stimuli. In the present study, we investigated whether the independence of temporal-orienting and sound-intensity effects could be replicated behaviorally. Sequences were presented that were either rhythmic, most likely creating temporal expectations, or arrhythmic, presumably not creating such expectations. As hypothesized, the main effects of temporal expectation and sound intensity on reaction times were independent (Experiment 1). The exact pattern of results was replicated with a slightly altered paradigm (Experiment 2) and with a different kind of task (Experiment 3). In sum, these results corroborate the notion that the effect of temporal orienting might not rely on the same processes as the effect of sound intensity does.
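The independence (additivity) of the two factors described above can be sketched with made-up cell means for a 2 × 2 design (temporal expectation: rhythmic vs. arrhythmic; intensity: loud vs. soft; all values hypothetical): with purely additive main effects, the interaction contrast is zero.

```python
# Hypothetical mean RTs (ms) per cell of the 2x2 design.
rt = {
    ("rhythmic", "loud"): 410.0,
    ("rhythmic", "soft"): 450.0,
    ("arrhythmic", "loud"): 440.0,
    ("arrhythmic", "soft"): 480.0,
}

def main_effects_and_interaction(rt):
    """Return the two main-effect contrasts and the interaction contrast.

    If the factors are independent (additive), the interaction is ~0.
    """
    expectation_effect = (
        (rt[("arrhythmic", "loud")] + rt[("arrhythmic", "soft")]) / 2
        - (rt[("rhythmic", "loud")] + rt[("rhythmic", "soft")]) / 2
    )
    intensity_effect = (
        (rt[("rhythmic", "soft")] + rt[("arrhythmic", "soft")]) / 2
        - (rt[("rhythmic", "loud")] + rt[("arrhythmic", "loud")]) / 2
    )
    interaction = (
        rt[("rhythmic", "soft")] - rt[("rhythmic", "loud")]
    ) - (rt[("arrhythmic", "soft")] - rt[("arrhythmic", "loud")])
    return expectation_effect, intensity_effect, interaction

exp_eff, int_eff, interaction = main_effects_and_interaction(rt)
print(exp_eff, int_eff, interaction)  # 30.0 40.0 0.0
```

In a real analysis, the same contrasts would be tested over single-subject means with a repeated-measures ANOVA; the zero interaction here simply illustrates the additive pattern reported in the abstract.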
... Most of the studies using this approach in the domain of temporal orienting have employed auditory stimuli and found evidence that temporal attention operates early in the processing chain, as evidenced by an enhancement of the auditory N1 around 100 ms post-stimulus (see Figure 2A; Lange et al., 2003; Röder et al., 2007; Sanders and Astheimer, 2008; Lange, 2012a,b; see also Chait et al., 2010 for related data; but see Griffin et al., 2002, Experiment 2 for visual data suggesting later effects). Hence, the data obtained with filter paradigms consistently show that temporal orienting modulates early auditory processing as reflected in the amplitude enhancement of the auditory N1. ...
Article
Full-text available
The temporal orienting of attention refers to the process of focusing (neural) resources on a particular time point in order to boost the processing of and the responding to sensory events. Temporal attention is manipulated by varying the task-relevance of events at different time points or by inducing expectations that an event occurs at a particular time point. Notably, the electrophysiological correlates of these manipulations at early processing stages are not identical: Auditory studies operationalizing temporal attention through task-relevance consistently found enhancements of early, sensory processing, as shown in the N1 component of the auditory event-related potential (ERP). By contrast, previous work on temporal orienting based on expectations showed mixed results: early, sensory processing was either enhanced or attenuated or not affected at all. In the present work, I will review existing findings on temporal orienting with a special focus on the auditory modality and present a working model to reconcile the previously heterogeneous results. Specifically, I will suggest that when expectations are used to manipulate attention, this will lead both to an orienting of attention and to the generation of precise predictions about the upcoming event. Attention and prediction are assumed to have opposite effects on early auditory processing, with temporal attention increasing and temporal predictions decreasing the associated ERP correlate, the auditory N1. The heterogeneous findings of studies manipulating temporal orienting by inducing expectations may thus be the consequence of differences in the relative contribution of attention and prediction processes. The model’s predictions will be discussed in the context of a functional interpretation of the auditory N1 as an attention call signal, as presented in a recent model on auditory processing.
... In CB subjects, electrophysiological and neuroimaging studies have revealed that top-down attention modulation was strengthened when they performed tactile/auditory attention-demanding tasks [75,111,112,212-215]. Additionally, the occipital cortical areas were activated in CB subjects by attention tasks through nonvisual modalities [60,96,194,216,217]. ...
Article
Full-text available
The visual cortex has traditionally been considered a stimulus-driven, unimodal system with a hierarchical organization. However, recent animal and human studies have shown that the visual cortex responds to non-visual stimuli, especially in individuals with congenital visual deprivation, indicating the supramodal nature of functional representation in the visual cortex. To understand the neural substrates of the cross-modal processing of non-visual signals in the visual cortex, we first show the supramodal nature of the visual cortex. We then review how non-visual signals reach the visual cortex. Moreover, we discuss whether these non-visual pathways are reshaped by early visual deprivation. Finally, the open question about the nature (stimulus-driven or top-down) of non-visual signals is also discussed.
... Enhanced auditory processing in blind subjects appears to be associated with improved encoding of timing aspects of the acoustic signals. For example, congenitally blind individuals seem to preferentially pay attention to temporal as compared to spatial cues (Röder et al., 2007), and they outperform sighted subjects with respect to temporal resolution capabilities in psychoacoustic backward masking experiments (Stevens and Weaver, 2005). Furthermore, early as well as late blind subjects can acquire the ability to comprehend time-compressed speech at syllable rates up to ca. 20 syllables/s (normal range: ca. ...
Article
Full-text available
In blind people, the visual channel cannot assist face-to-face communication via lipreading or visual prosody. Nevertheless, the visual system may enhance the evaluation of auditory information due to its cross-links to (1) the auditory system, (2) supramodal representations, and (3) frontal action-related areas. Apart from feedback or top-down support of, for example, the processing of spatial or phonological representations, experimental data have shown that the visual system can impact auditory perception at more basic computational stages such as temporal signal resolution. For example, blind as compared to sighted subjects are more resistant to backward masking, and this ability appears to be associated with activity in visual cortex. Regarding the comprehension of continuous speech, blind subjects can learn to use accelerated text-to-speech systems for "reading" texts at ultra-fast speaking rates (>16 syllables/s), exceeding by far the normal range of 6 syllables/s. A functional magnetic resonance imaging study has shown that this ability, among other brain regions, significantly covaries with BOLD responses in bilateral pulvinar, right visual cortex, and left supplementary motor area. Furthermore, magnetoencephalographic measurements revealed a particular component in right occipital cortex phase-locked to the syllable onsets of accelerated speech. In sighted people, the "bottleneck" for understanding time-compressed speech seems related to higher demands for buffering phonological material and is, presumably, linked to frontal brain structures. On the other hand, the neurophysiological correlates of functions overcoming this bottleneck seem to depend upon early visual cortex activity. The present Hypothesis and Theory paper outlines a model that aims at binding these data together, based on early cross-modal pathways that are already known from various audiovisual experiments on cross-modal adjustments during space, time, and object recognition.
... In accord with the findings in dark-reared cats (Wallace et al., 2004), these data suggest that crossmodal interactions based on spatial congruency might be compromised by congenital blindness and the resulting visual deprivation. These results in blind humans have been related to the use of different default spatial representations by congenitally blind adults (Collignon, Charbonneau, Lassonde, & Lepore, 2009; Röder, Krämer, & Lange, 2007). That is, congenitally blind individuals do not seem to code all sensory events always in external coordinates as sighted individuals do, suggesting that visual input may play a crucial role in setting up the spatial representations necessary for (automatic) multisensory interactions and crossmodal facilitation (Collignon, Voss, et al., 2009). ...
... CB but not LB participants experienced more auditory dream impressions and rated these as being more distinct than those in the SC group. These findings can be reconciled with reports of increased performance of CB participants in auditory tasks such as pitch discrimination and sound localization capacities in waking cognition [52-63]. Moreover, brain imaging studies have shown that CB participants activated their occipital cortex during auditory tasks and that this activation correlated with improved auditory performance [64,65]. ...
... Because usually the auditory system (and not the visual one) is the most precise sense in the temporal domain (Berger, Martelli, & Pelli, 2003; Burr, Banks, & Morrone, 2009; Holcombe, 2009; Shams, Kamitani, & Shimojo, 2000), subjects with congenital visual impairment should rely mainly on their spared audition and may therefore develop better abilities in temporal perception. Interestingly, improved temporal processing was reported in EB individuals in comparison with sighted controls (Röder, Krämer, & Lange, 2007; Röder, Rösler, & Spence, 2004; Stevens and Weaver, 2005; Van der Lubbe, Van Mierlo, & Postma, 2010). In particular, Van der Lubbe and colleagues (2010) reported superior auditory and vibrotactile duration discrimination abilities in EB individuals in comparison with sighted individuals. ...
Article
Full-text available
Although early blind (EB) individuals are thought to have a better musical sense than sighted subjects, no study has investigated the musical rhythm and beat processing abilities in EB individuals. Using an adaptive 'up and down' procedure, we measured the beat asynchrony detection threshold and the duration discrimination threshold, in the auditory and vibrotactile modalities in both EB and sighted control (SC) subjects matched for age, gender, and musical experience. We observed that EB subjects were better than SC in the beat asynchrony detection task; that is, they showed lower thresholds than SC, both in the auditory and in the vibrotactile modalities. In addition, EB subjects had a lower threshold than SC for duration discrimination in the vibrotactile modality only. These improved beat asynchrony detection abilities may contribute to the known excellent musical abilities often observed in many blind subjects.
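The adaptive "up and down" procedure mentioned above can be sketched under the common 1-up/2-down rule (which converges near the 70.7%-correct point of the psychometric function); the simulated observer, step size, and all other parameters here are hypothetical, not those of the study.

```python
import random

def staircase(true_threshold, start=50.0, step=4.0, n_trials=200, seed=0):
    """Minimal 1-up/2-down staircase run against a simulated observer.

    The observer responds correctly whenever the stimulus level exceeds
    its true threshold plus Gaussian noise; the threshold estimate is the
    mean stimulus level at the last few staircase reversals.
    """
    rng = random.Random(seed)
    level = start
    correct_streak = 0
    reversals = []
    last_direction = 0
    for _ in range(n_trials):
        correct = level + rng.gauss(0.0, 2.0) > true_threshold
        if correct:
            correct_streak += 1
            if correct_streak < 2:
                continue                      # wait for two correct in a row
            correct_streak = 0
            direction = -1                    # two correct -> step down
            level -= step
        else:
            correct_streak = 0
            direction = +1                    # one error -> step up
            level += step
        if last_direction and direction != last_direction:
            reversals.append(level)           # track direction reversals
        last_direction = direction
    return sum(reversals[-8:]) / len(reversals[-8:])

print(staircase(true_threshold=30.0))
```

With enough trials the estimate settles close to the simulated observer's threshold; real studies typically average a fixed number of reversals and report that value as the detection or discrimination threshold.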
... Some studies support this view, showing that blind individuals rely more on route navigation, which is based on kinematic strategies relative to the experienced movement by using an egocentric reference (Bigelow, 1992). They encode stimuli using allocentric rather than egocentric coordinates (Röder et al., 2007, 2004; Pasqualotto and Proulx, 2012). Impaired abilities in blind individuals associated with inferential navigation (Seemungal et al., 2007) and lower sensitivity to changes in perspective structure when walking without vision (Rieser et al., 1986) have also been reported. ...
Article
Full-text available
Different sensory systems interact to generate a representation of space and to navigate. Vision plays a critical role in the development of spatial representations. During navigation, vision is integrated with auditory and mobility cues. In blind individuals, visual experience is not available and navigation therefore lacks this important sensory signal. In blind individuals, compensatory mechanisms can be adopted to improve spatial and navigation skills. On the other hand, the limitations of these compensatory mechanisms are not completely clear. Both enhanced and impaired reliance on auditory cues in blind individuals have been reported. Here, we develop a new paradigm to test both auditory perception and navigation skills in blind and sighted individuals and to investigate the effect that visual experience has on the ability to reproduce simple and complex paths. During the navigation task, early blind, late blind and sighted individuals were required first to listen to an audio shape and then to recognize and reproduce it by walking. After each audio shape was presented, a static sound was played and the participants were asked to reach it. Movements were recorded with a motion tracking system. Our results show three main impairments specific to early blind individuals. The first is the tendency to compress the shapes reproduced during navigation. The second is the difficulty to recognize complex audio stimuli, and finally, the third is the difficulty in reproducing the desired shape: early blind participants occasionally reported perceiving a square but they actually reproduced a circle during the navigation task. We discuss these results in terms of compromised spatial reference frames due to lack of visual input during the early period of development.
... Importantly, in all these studies, as in the present one, amplitude reduction was observed for temporally predictable compared to unpredictable auditory stimuli. By contrast, an enhancement of N1 amplitude was observed in EEG studies investigating temporal auditory attention, mostly by manipulating the time interval between two stimuli (Lange et al., 2003; Röder et al., 2007; Lange, 2013). For example, Lange et al. (2003) presented participants with a sequence of two sounds separated by a shorter or longer temporal interval. ...
Article
Full-text available
An impressive number of theoretical proposals and neurobiological studies argue that perceptual processing is not strictly feedforward but rather operates through an interplay between bottom-up sensory and top-down predictive mechanisms. The present EEG study aimed to further determine how prior knowledge on auditory syllables may impact speech perception. Prior knowledge was manipulated by presenting the participants with visual information indicative of the syllable onset (when), its phonetic content (what) and/or its articulatory features (how). While when and what predictions consisted of unnatural visual cues (i.e., a visual timeline and a visuo-orthographic cue), how prediction consisted of the visual movements of a speaker. During auditory speech perception, when and what predictions both attenuated the amplitude of N1/P2 auditory evoked potentials. Regarding how prediction, not only an amplitude decrease but also a latency facilitation of N1/P2 auditory evoked potentials were observed during audiovisual compared to unimodal speech perception. However, when and what predictability effects were then reduced or abolished, with only what prediction reducing P2 amplitude but increasing latency. Altogether, these results demonstrate the influence of when, what and how visually induced predictions at an early stage on cortical auditory speech processing. Crucially, they indicate a preponderant predictive role of the speaker’s articulatory gestures during audiovisual speech perception, likely driven by attentional load and focus.
... Since we focused our analyses on a very early time window (50−90 ms), it is plausible that higher-level and multisensory areas, as in the case of the parietal cortex, also engage during the auditory spatial task but show up later in neural processing. Similarly, differences in auditory cortex between sighted and blind participants could involve later auditory-evoked ERP components, perhaps revealing higher amplitude following blindness (this would be in line with previous research such as Röder et al., 1999 and Röder et al., 2007). The fact that in early blind individuals the occipital laterality within the early time window was absent could indicate that early visual experience mediates the development of the contralateral early occipital response. ...
Article
Visual modality dominates spatial perception and, in lack of vision, space representation might be altered. Here we review our work showing that blind individuals have a strong deficit when performing spatial bisection tasks (Gori et al., 2014). We also describe the neural correlates associated with this deficit, as blind individuals do not show the same ERP response mimicking the visual C1 reported in sighted people during spatial bisection (Campus et al., 2019). Interestingly, the deficit is not always evident in late blind individuals, and it is dependent on blindness duration. We report that the deficit disappears when one presents coherent temporal and spatial cues to blind people. This suggests that they may use time information to infer spatial maps (Gori et al., 2018). Finally, we propose a model to explain why blind individuals are impaired in this task, speculating that a lack of vision drives the construction of a multi-sensory cortical network that codes space based on temporal, rather than spatial, coordinates.
... Investigating changes in brain rhythms in congenital blindness can provide crucial insights into the neuronal mechanisms of behavioral changes found in this population. Congenitally blind (CB) individuals perform better on various auditory, tactile, and higher cognitive tasks compared with sighted controls (Bull et al. 1983; Lessard et al. 1998; Roeder et al. 2001; Amedi et al. 2003; Gougoux et al. 2005; Foecker et al. 2012), including tasks where certain aspects of temporal processing are essential, such as temporal order processing tasks, tasks involving musical meter, or ultra-fast speech processing (Hoetting et al. 2004; Roeder et al. 2004; Stevens and Weaver 2005; Moos and Trouvain 2007; Roeder et al. 2007; Trouvain 2007; Hertrich et al. 2009; Dietrich et al. 2013; Lerens et al. 2014; Carrara-Augustenborg and Schultz 2019; Zhang et al. 2019). Improved behavioral performance in auditory or tactile tasks in CB individuals is accompanied by cross-modal plasticity. ...
Article
Full-text available
Congenital blindness has been shown to result in behavioral adaptation and neuronal reorganization, but the underlying neuronal mechanisms are largely unknown. Brain rhythms are characteristic for anatomically defined brain regions and provide a putative mechanistic link to cognitive processes. In a novel approach, using magnetoencephalography resting state data of congenitally blind and sighted humans, deprivation-related changes in spectral profiles were mapped to the cortex using clustering and classification procedures. Altered spectral profiles in visual areas suggest changes in visual alpha-gamma band inhibitory-excitatory circuits. Remarkably, spectral profiles were also altered in auditory and right frontal areas showing increased power in theta-to-beta frequency bands in blind compared with sighted individuals, possibly related to adaptive auditory and higher cognitive processing. Moreover, occipital alpha correlated with microstructural white matter properties extending bilaterally across posterior parts of the brain. We provide evidence that visual deprivation selectively modulates spectral profiles, possibly reflecting structural and functional adaptation.
... These data would suggest a lack of implicit crossmodal spatial attention effects between audition and touch when the stimulation occurs from spatially discrepant locations (see also Kujala, Lehtokoski, Alho, Kekoni, & Näätänen, 1997; Collignon et al., 2006). Furthermore, while sighted participants preferentially select sensory input based on space, congenitally blind individuals preferentially select sensory input first according to its expected time point of occurrence (Röder, Krämer, & Lange, 2007). The more proficient selective attention/inhibition for auditory and tactile information in congenital blindness, adding to the overall reduced crossmodal interaction induced by the absence of a conjoint activation of external coordinate systems, could thus have contributed to the differences observed in the patterns of performance of CB and CBC. ...
Article
In the ventriloquism effect, the presentation of spatially discrepant visual information biases the localization of simultaneously presented sounds. Recently, an analogous spatial influence of touch on audition has been observed. By manipulating hand posture, it has been demonstrated that this audiotactile ventriloquist effect predominantly operates in an external frame of reference. In the present study, we examined the contribution of developmental vision to audiotactile interactions as indicated by the ventriloquism effect. Congenitally blind, late blind and sighted adults were asked to report the perceived location of sounds presented from a left, a central or a right location. Auditory stimuli were either delivered alone or concurrently with touches at the left or the right hand. The hands were located to the right and to the left of the lateral speakers and participants either adopted an uncrossed or a crossed hand posture. While sighted controls and late blind participants similarly mislocalized auditory stimuli toward the concurrent tactile stimuli in bimodal trials, the congenitally blind showed a reduced ventriloquism effect. All groups showed a reduced audiotactile ventriloquism effect in the crossed hand condition. However, the magnitude of the reduction was significantly larger in the group of congenitally blind than in the group of sighted controls. These results suggest reduced audio-tactile interactions in spatial processing following a lack of visual input from birth.
Article
Full-text available
The journal Restorative Neurology and Neuroscience (RNN) now celebrates its 20th anniversary. Since 1989, RNN has published scientific findings in the emerging fields of brain plasticity, repair and rehabilitation via original scientific publications and review papers in basic research (animal experiments, in vitro studies) and clinical science. During the last decade, RNN has made steady progress in reference value and scientific impact, reaching an ISI impact factor of 1.978 (2008), and has published a total of 717 papers. The journal's success can be explained by different factors: (1) neuroplasticity, regeneration, recovery and rehabilitation have developed into mainstream subjects with a worldwide increase in the number of publications and their citation rate, (2) RNN has published numerous special issues which summarize the work of leading experts in specialized sub-fields, (3) a dedicated, highly qualified editorial board, and (4) the quality of papers submitted to RNN has increased over time. RNN has now become a visible and leading source of original scientific information in the space of brain plasticity, rehabilitation and repair.
Article
Temporal-cuing studies show faster responding to stimuli at an attended versus an unattended time point. Whether the mechanisms involved in this temporal orienting of attention are located early or late in the processing stream has not been answered unequivocally. To address this question, we measured event-related potentials in two versions of an auditory temporal cuing task: Stimuli at the uncued time point either required a response (Experiment 1) or did not (Experiment 2). In both tasks, attention was oriented to the cued time point, but attention could be selectively focused on the cued time point only in Experiment 2. In both experiments, temporal orienting was associated with a late positivity in the time range of the P3. An early enhancement in the time range of the auditory N1 was observed only in Experiment 2. Thus, temporal attention improves auditory processing at early sensory levels only when it can be focused selectively.
Article
It has been shown recently (Lange, 2009) that the N1 of the auditory event related potential (ERP) is attenuated when the eliciting stimulus predictably follows a regular vs. an irregular sequence. This may be a sign of temporal orienting induced by the regular sequence. Alternatively, the attenuated N1 may have been due to sensory predictability of target timing. The present study investigated whether presenting a regular sequence still attenuates target N1 when target timing is unpredictable. A regular (vs. irregular) tone sequence was presented prior to a target tone, which appeared unpredictably at one of three different time points after the sequence. For the regular sequence, targets either continued regularity (on-time targets) or were early or late with respect to this regular time point. ERPs to on-time targets were compared as a function of sequence regularity. Consistent with the assumption that N1 attenuation reflects sensory predictability of target timing, an attenuated N1 was not observed in the present study, where target timing was uncertain.
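The regular versus irregular sequences described above can be sketched by generating tone onset times with or without jitter on the inter-onset interval; the interval and jitter values here are illustrative only, not those used in the study.

```python
import random

def sequence_onsets(n_tones=6, ioi=0.6, jitter=0.2, regular=True, seed=1):
    """Onset times (s) for a regular (isochronous) or irregular tone sequence.

    In the irregular case, each inter-onset interval is jittered uniformly
    around the nominal IOI, so the timing of each tone is unpredictable.
    """
    rng = random.Random(seed)
    onsets, t = [], 0.0
    for _ in range(n_tones):
        onsets.append(round(t, 3))
        delta = 0.0 if regular else rng.uniform(-jitter, jitter)
        t += ioi + delta
    return onsets

print(sequence_onsets(regular=True))   # [0.0, 0.6, 1.2, 1.8, 2.4, 3.0]
```

A target tone can then be scheduled to continue the regular sequence ("on-time") or to occur earlier or later, which is the manipulation the abstract describes for separating sequence regularity from target-timing predictability.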
Article
When a brief flash is quickly presented aligned with a moving target, the flash typically appears to lag behind the moving stimulus. This effect is widely known in the literature as a flash-lag illusion (FLI). The flash-lag is an example of a motion-induced position shift. Since auditory deprivation leads to both enhanced visual skills and impaired temporal abilities, both crucial for the perception of the flash-lag, here we hypothesized that lack of audition could influence the FLI. 13 early deaf and 18 hearing individuals were tested in a visual flash-lag illusion (FLI) paradigm to investigate this hypothesis. As expected, results demonstrated a reduction of the flash-lag effect following early deafness, both in the central and peripheral visual fields. Moreover, only for deaf individuals, there is a positive correlation between the flash-lag effect in the peripheral and central visual field, suggesting that the mechanisms underlying the effect in the center of the visual field expand to the periphery following deafness. Overall, these findings reveal that lack of audition early in life profoundly impacts early visual processing underlying the flash-lag effect.
Article
The rubber hand illusion (RHI) occurs when a person misattributes a fake hand as his or her own hand. Previously, the RHI has been examined with both the rubber hand and the participant's real hand uncrossed with regard to the participant's midline. The present study examined the strength of the illusion when the real hand, the fake hand or both hands are placed across the body midline. The illusion was induced by stroking the rubber hand and the real hand simultaneously. Asynchronous brushing served as a comparison condition since the RHI is not seen under these circumstances. Participants indicated where they felt their real hand was located by marking a sheet of paper under the table on which their unseen hand was placed. A significant RHI was observed with both hands uncrossed. In contrast, no RHI was present when either hand was crossed over the midline. Additionally, a shift in hand judgment towards the midline was observed when participants crossed their real hand. These results indicate the importance of the midline in understanding representations of the body.
Article
Temporal orienting enhances early (N1) and late (P3) stages of auditory processing. However, the functional significance of these effects has not been settled yet. The present study tested a motor inhibition account of the temporal orienting P3 effect to non-target stimuli. A temporal cuing paradigm was used in which the level of motor preparation (high vs. low) was varied: If motor preparation is higher, more inhibition is necessary to withhold a response when a non-target is presented at the attended time point. Consequently, if the enhanced P3 to temporally attended non-targets reflected increased motor inhibition, higher motor preparation should further enhance the P3. Overall, temporal orienting enhanced both the N1 and the P3, thus replicating earlier findings. Moreover, the temporal orienting P3 effect was larger when motor preparation was higher. Inconsistent with the motor-inhibition account, however, the P3 to temporally attended non-targets did not differ as a function of motor preparation.
Article
Full-text available
We investigated whether a blind person uses his visual system to understand moderately fast speech (8 syll/s) and ultra-fast speech (16 syll/s), based on texts spoken by a male speaker and produced by a speech synthesizer, respectively. Whereas the blind subject had no problems understanding ultra-fast speech, six sighted control subjects were not able to understand it. Functional magnetic resonance imaging (fMRI) of brain activity showed that moderately fast speech activated posterior and anterior 'language zones' in all subjects. Regarding ultra-fast tokens, the controls showed exclusive activation of supratemporal regions, whereas the blind participant exhibited enhanced left inferior frontal and temporoparietal responses as well as significant hemodynamic activation of the left fusiform gyrus (FG) and right primary visual cortex. Since the left FG is known to be involved in phonological processing, this structure presumably provides the functional link between the central-auditory and central-visual systems.
Chapter
This chapter examines the possible role of temporal orienting of attention in modulating early perceptual processing. It discusses the results of four studies that provide evidence that explicit temporal orienting enhances early, perceptual stages of auditory processing. This chapter shows that early tactile processing stages are also enhanced by temporal attention and suggests that future related studies should focus on investigating the preconditions for enhancement of perceptual processing steps by temporal orienting.
Article
Processing auditory scenes requires the automatic detection of unexpected acoustic irregularities, which allows the attentional focus to be reoriented for further in-depth analysis. Even if cochlear implants (CIs) may partly restore hearing capabilities in patients suffering from profound peripheral deafness, CI users complain about difficulties in identifying novel and unexpected acoustic events. To assess whether this impairment is attributable to preattentive auditory deficits, impaired automatic orienting to novel events and/or deficits in attentional processing, we recorded event-related potentials (ERPs) in an auditory novelty oddball paradigm. ERPs were obtained in 7 postlingually deafened CI patients and their age-matched controls in passive and active listening conditions. Subjects had to press a button for infrequent novel sounds but not for frequent standard sounds in the active condition. In the active condition, patients and controls did not differ with regard to hit rates and reaction times. ERPs to novel stimuli in the active condition were characterized by enhanced N2b and P3b components that did not differ between groups. By contrast, the P3a component to novel sounds in the passive condition, an index of automatic orienting of attention, was greatly attenuated in the CI users. CI users are impaired in the preattentive registration of novel auditory events, while attentive processing of a designated auditory stream appears intact.
Article
Full-text available
This article reports on a study of the performance by congenitally blind, adventitiously blind, and sighted persons on three types of tasks: manipulatory, simple locomotion, and complex locomotion. The three groups of subjects tended to perform equivalently, and the results offer little evidence of a set of spatial processes that rely on past visual experience and are applicable to a broad variety of tasks.
Article
Full-text available
A study is reported of the effect of early visual experience on the development of auditory space perception. The spatial hearing of thirty-five children with visual disabilities (twenty-two with congenital total blindness) was compared with that of eighteen sighted children and seventeen sighted adults. The tests provided a comprehensive assessment of spatial-hearing ability, including psychophysical estimates of spatial resolution in the horizontal, vertical, and distance dimensions, as well as measures of reaching and walking to the locations of sound sources. The spatial hearing of the children with visual disabilities was comparable to or somewhat better than that of the sighted children and adults. This pattern held even when the group with visual disabilities was restricted to those children with congenital total blindness; in fact, some of those children had exceptionally good spatial hearing. These findings imply that the developmental calibration of human spatial hearing is not dependent on a history of visual experience. It seems likely that this calibration arises from the experience of changes in sound-localization cues arising from self-motion, such as turning the head or walking. As a practical matter, orientation and mobility instructors may reasonably assume that individuals with visual disabilities can use their hearing effectively in day-to-day travel situations.
Article
Full-text available
Despite reports of improved auditory discrimination capabilities in blind humans and visually deprived animals, there is no general agreement as to the nature or pervasiveness of such compensatory sensory enhancements. Neuroimaging studies have pointed out differences in cerebral organization between blind and sighted humans, but the relationship between these altered cortical activation patterns and auditory sensory acuity remains unclear. Here we compare behavioural and electrophysiological indices of spatial tuning within central and peripheral auditory space in congenitally blind and normally sighted but blindfolded adults to test the hypothesis (raised by earlier studies of the effects of auditory deprivation on visual processing) that the effects of visual deprivation might be more pronounced for processing peripheral sounds. We find that blind participants displayed localization abilities that were superior to those of sighted controls, but only when attending to sounds in peripheral auditory space. Electrophysiological recordings obtained at the same time revealed sharper tuning of early spatial attention mechanisms in the blind subjects. Differences in the scalp distribution of brain electrical activity between the two groups suggest a compensatory reorganization of brain areas in the blind that may contribute to the improved spatial resolution for peripheral sound sources.
Article
Full-text available
When determining the position of sounds, people perceive their subjective auditory median plane of the head (SAMP) displaced towards the side to which the head is turned. The present study investigated the influence of visual input on this effect. A group of congenitally blind and two groups of sighted adults, one with eyes open, one with eyes closed, adjusted the location of a tone presented through headphones in such a way that they perceived it to originate at their SAMP, both with the head straight ahead and turned 60 degrees to the right. While both sighted groups showed a reliable SAMP displacement in the 60 degrees condition, the congenitally blind did not. It is speculated that the blind have learned to integrate proprioceptive information more precisely contributing to superior auditory localization skills.
Article
Full-text available
A part of the core area of the auditory cortex was examined in nine blind and 10 sighted individuals by magnetic source imaging and was found to be enlarged by a factor of 1.8 in the blind compared with the sighted humans. Moreover, the latency of the N1m component of the auditory-evoked magnetic response was significantly decreased in the blind. The development of use-dependent cortical reorganization may be a consequence of the absence of visual input in combination with enhanced auditory activity generated by the long-term concentration by blind individuals on nonvisual cues to interact appropriately with the environment. It is consistent with and well suited to mediate the demonstrated increased ability of the blind to accurately localize acoustic sources in peripheral auditory fields and to decode speech.
Article
Full-text available
Musicians' brains constitute an interesting model for neuroplasticity. Imaging studies demonstrated that sensorimotor cortical representations are altered in musicians, which was assumed to arise from the development of skilled performance. However, the perceptual consequences of the cortical changes remain elusive. Here we ask whether cortical reorganization induced by professional musical skill training is paralleled by the evolution of other, unrelated perceptual abilities. We therefore psychophysically studied tactile spatial acuity, as an indirect marker of cortical changes, in professional pianists and non-musician control subjects using a simultaneous two-point discrimination paradigm. We show that long-lasting piano practising resulted in lower spatial discrimination thresholds in comparison to non-musicians. In musicians, individual discrimination thresholds were linearly correlated with the daily training duration, indicating a direct link between tactile acuity and the degree of piano practising. To investigate whether the superior acuity in pianists is subject to further improvement, we used a Hebbian stimulation protocol of tactile coactivation known to improve spatial tactile acuity. Three hours of coactivation further reduced their discrimination thresholds. The coactivation-induced gain in pianists was significantly larger in comparison to control subjects and correlated with the years of heavy daily practising (>3 h/day), but not with the total years including casual playing. Our results suggest that despite already high-level performance in pianists, Hebbian learning was more effective in musicians than in controls. This implies stronger capacities for plastic reorganization and points to enhanced learning abilities implicating a form of meta-plasticity in professional pianists.
Article
Full-text available
Do blind people develop superior abilities in auditory perception to compensate for their lack of vision? They are known to be better than sighted people at orientating themselves by sound, but it is not clear whether this enhanced awareness extends to other auditory domains, such as listening to music or to voices. Here we show that blind people are better than sighted controls at judging the direction of pitch change between sounds, even when the speed of change is ten times faster than that perceived by the controls--but only if they became blind at an early age. The younger the onset of blindness, the better is the performance, which is in line with cerebral plasticity being optimal during the early years.
Article
The capability of the central nervous system (CNS) to adapt its functional and structural organization to current requirements is known as neural plasticity. Such changes can be examined at different organizational levels of the CNS; changes at the molecular, synaptic, neural, system, and behavioral level are mutually dependent (Shaw & McEachern, 2001). Plastic changes are triggered by learning, e.g., perceptual and motor training, and by injuries, e.g., a deafferentation of part or all of the afferents of a sensory system. Moreover, the capacity to change is a characteristic feature of the CNS throughout life, although there are qualitative and quantitative differences between developmental and adult plasticity. This chapter reports major findings on training- and lesion-induced plasticity. Results from animal and human research in the somatosensory, auditory, visual and motor system are reviewed, and the possible mechanisms underlying brain plasticity are discussed. Moreover, possible differences between developmental and adult plasticity are considered.
Chapter
This chapter reviews results from event-related potential (ERP) studies that investigated crossmodal links in endogenous spatial attention between vision, audition, and touch. It discusses the effects of such links in endogenous attention on the processing of visual, auditory, and tactile stimuli at currently attended or unattended locations, with respect to the question of whether these links primarily affect modality-specific perceptual or post-perceptual processes. It then addresses the question of whether behavioural and electrophysiological effects of crossmodal links in spatial attention merely result from optional processing strategies, or reflect genuine constraints on the processing of sensory information within and across modalities. It discusses the spatial coordinate frames involved in crossmodal links in spatial attention. The final section reviews ERP studies investigating attentional control processes.
Article
Two fundamental cognitive functions, selective attention and processing of time, have been simultaneously explored in recent studies of temporal orienting of attention. A temporal-orienting procedure may consist of a temporal analogue of Posner's paradigm, such that symbolic cues indicate the most probable moment for target arrival. Behavioral measures suggest that performance is improved for events appearing at expected vs. unexpected moments. However, there is no agreement on the locus of stimulus processing at which temporal attention operates. Thus, it remains unclear whether early perceptual or just late motor processes can be modulated. This article reviews current ERP research on temporal orienting, with an emphasis on factors that might determine the modulation of temporal orienting at early stages of processing. We conclude that: First, late components (N2 and P300) are consistently modulated by temporal orienting, which suggests attentional preparation of decision and/or motor processes. Second, early components (e.g., N1) seem to be modulated only when the task is highly demanding in perceptual processing. Hence, we conducted an ERP experiment which aimed to observe a modulation of early visual processing by using a perceptually demanding task, such as letter discrimination. The results show, for the first time, that targets appearing at attended moments elicited a larger P1 component than unattended targets. Moreover, temporal attention modulated the amplitude and latency of N2 and P300 components. This suggests that temporal orienting of attention not only modulates late motor processing, but also early visual processing when perceptually demanding tasks are used.
Article
It has been suggested that when the variance assumptions of a repeated measures ANOVA are not met, the df of the mean square ratio should be adjusted by the sample estimate of the Box correction factor, ε. This procedure works well when ε is low, but the estimate is seriously biased when this is not the case. An alternate estimate is proposed which is shown by Monte Carlo methods to be less biased for moderately large ε.
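The ε correction discussed above can be sketched numerically. The following is a minimal illustration (not the paper's own improved estimator) of the standard Greenhouse-Geisser ε̂ computed from the double-centered condition covariance matrix; the function name and the subjects-by-conditions data layout are assumptions for the example:

```python
import numpy as np

def gg_epsilon(data):
    """Greenhouse-Geisser epsilon for a (subjects x conditions) array.

    epsilon = 1 when sphericity holds; the lower bound is 1/(k-1).
    """
    k = data.shape[1]
    s = np.cov(data, rowvar=False)  # condition covariance matrix
    # double-center the covariance matrix
    sc = s - s.mean(axis=0) - s.mean(axis=1, keepdims=True) + s.mean()
    return np.trace(sc) ** 2 / ((k - 1) * np.sum(sc ** 2))
```

The F test's degrees of freedom, (k-1) and (k-1)(n-1), are then multiplied by ε̂ before the p-value is looked up, which protects the test when the sphericity assumption fails.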
Article
While several studies have reported a deviation from the normal time course of language acquisition in blind children others have provided evidence for a more efficient processing of the language input in blind than sighted adults. The present study used a semantic and morpho-syntactic priming paradigm to address the question at which processing stage the advantage of blind adults may arise. Congenitally blind adults and sighted controls matched for age, gender and education, first heard an adjective followed by a noun or a pseudo-word. The adjective was or was not semantically associated with the target and it was either correctly or incorrectly inflected for gender with respect to the following noun. Participants decided whether or not the target noun was a legal German word. Nouns primed semantically and morpho-syntactically had shorter lexical decision times than those primed only semantically or only morpho-syntactically and decision times for the latter two conditions were shorter than in a condition without a semantically or morpho-syntactically congruent context. This response pattern did not differ between groups. However, blind participants had shorter reaction times than sighted for pseudo-words, and overall decision times for words tended to be shorter in the blind as well. It is concluded that the faster speech comprehension skills of blind adults most likely originate from a more efficient perceptual analysis rather than from a more extended use of semantic or morpho-syntactic context information.
Article
Introduction: questions and terms 1. Modality and cognition in developmental theories and evidence 2. The modalities as convergent sources of spatial information 3. Neuropsychological evidence on convergence 4. Shape coding by vision and touch 5. Spatial coding: studies in small-scale space 6. Information and understanding large-scale space 7. Non-verbal representation: images, drawings, maps, and memory 8. Some practical implications 9. A theory of spatial understanding and development
Article
Recent behavioural and event-related potential (ERP) studies reported cross-modal links in spatial attention between vision, audition and touch. Such links could reflect differences in hemispheric-activation levels associated with spatial attention to one side, or more abstract spatial reference-frames mediating selectivity across modalities. To distinguish these hypotheses, ERPs were recorded to lateral tactile stimuli, plus visual (experiment 1) or auditory stimuli (experiment 2), while participants attended to the left or right hand to detect infrequent tactile targets, and ignored other modalities. In separate blocks, hands were either in a crossed or uncrossed posture. With uncrossed hands, visual stimuli on the tactually attended side elicited enhanced N1 and P2 components at occipital sites, and an enhanced negativity at midline electrodes, reflecting cross-modal links in spatial attention from touch to vision. Auditory stimuli at tactually attended locations elicited an enhanced negativity overlapping with the N1 component, reflecting cross-modal links from touch to audition. An analogous pattern of results arose for crossed hands, with tactile attention enhancing auditory or visual responses on the side where the attended hand now lay (i.e. in the opposite visual or auditory hemifield to that enhanced by attending the same hand when uncrossed). This suggests that cross-modal attentional links are not determined by hemispheric projections, but by common external locations. Unexpectedly, somatosensory ERPs were strongly affected by hand posture in both experiments, with attentional effects delayed and smaller for crossed hands. This may reflect the combined influence of anatomical and external spatial codes within the tactile modality, while cross-modal links depend only on the latter codes.
Article
Developmental data were gathered on the relative importance of vision, audition, and proprioception in determining spatial direction in a conflict situation. Age trends did not support the hypothesis that information from different modalities becomes better differentiated with age. In a follow-up study, blind children of different ages were tested under auditory-proprioceptive conflict conditions. No age changes were found. The possibility of a visual involvement in auditory and proprioceptive localization is discussed.
In a selective attention experiment, event-related potentials (ERPs) were recorded from 12 subjects who listened to sequences of high and low frequency tone pips, presented binaurally in random order at a rapid rate. Their task was to attend selectively to either the high or low frequency tones and to press a button upon detection of occasional longer-duration tones at the appropriate frequency. The effect of attention was manifest in difference wave forms obtained by subtracting ERPs to the shorter tones when they were unattended from the ERPs to the same stimuli when they were attended. The difference waves consisted of a broad, multiphasic negative component (Nd) with its later phase (300–400 msec) being more frontally distributed on the scalp than either its early phase (100–300 msec) or the N1 component that was evoked in the reading condition. The amplitude and latency of onset of Nd varied systematically as a function of the frequency separation between the tones.
Article
We studied the organization of the somatosensory cortex in proficient Braille readers, recording somatosensory evoked potentials (SEPs) in 10 subjects and using transcranial magnetic stimulation (TMS) in five subjects, and compared the results with those of 15 control subjects. Somatosensory evoked potentials were elicited by a focal electrical stimulus to the tip of the index finger and recorded from a contralateral 4×4 grid of scalp electrodes centred around C3' or C4'. Transcranial magnetic stimulation, with an 8-shaped coil centred over the same scalp positions, was delivered simultaneously with, and at different intervals after, the finger stimulus. The results of the right index (reading) finger in Braille readers were compared with those of their left index (non-reading) finger and of the right and left index fingers of the control subjects. The scalp areas from which we recorded N20 and P22 components of the SEP with an amplitude of at least 70% of the maximal amplitude recorded in each trial were significantly larger in SEPs evoked from the reading fingers. Detection of the stimulus applied to the reading finger was blocked by TMS delivered over a larger contralateral scalp area and during a longer time window after the stimulus. These experiments suggest that reading Braille is associated with expansion of the sensorimotor cortical representation of the reading finger.
Analysis of variance (ANOVA) interactions involving electrode location are often used to assess the statistical significance of differences between event-related potential (ERP) scalp distributions for different experimental conditions, subject groups, or ERP components. However, there is a fundamental incompatibility between the additive model upon which ANOVAs are based and the multiplicative effect on ERP voltages produced by differences in source strength. Using potential distributions generated by dipole sources in spherical volume conductor models, we demonstrate that highly significant interactions involving electrode location can be obtained between scalp distributions with identical shapes generated by the same source. Therefore, such interactions cannot be used as unambiguous indications of shape differences between distributions and hence of differences in source configuration. This ambiguity can be circumvented by scaling the data to eliminate overall amplitude differences between experimental conditions before an ANOVA is performed. Such analyses retain sensitivity to genuine differences in distributional shape, but do not confuse amplitude and shape differences.
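The scaling step this abstract recommends can be sketched. A common variant (the function name and conditions-by-electrodes layout are assumptions for the example) normalizes each condition's topography to unit vector length before the ANOVA, so that only the shape of the scalp distribution remains:

```python
import numpy as np

def vector_scale(topos):
    """Scale each row (one condition's scalp topography across
    electrodes) to unit vector length, removing overall source-strength
    differences while preserving distributional shape."""
    norms = np.linalg.norm(topos, axis=1, keepdims=True)
    return topos / norms
```

Two topographies generated by the same source at different strengths become identical after scaling, so a condition x electrode interaction computed on the scaled data reflects a genuine difference in distributional shape rather than in amplitude.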
Article
Based on the findings of the preceding paper, it is known that auditory and visual signals have been translated into common coordinates at the level of the superior colliculus (SC) and share a motor circuit involved in the generation of saccadic eye movements. It is not known, however, whether the translation of sensory signals into motor coordinates occurs prior to or within the SC. Nor is it known in what coordinates auditory signals observed in the SC are encoded. The present experiment tested two alternative hypotheses concerning the frame of reference of auditory signals found in the deeper layers of the SC. The hypothesis that auditory signals are encoded in head coordinates predicts that, with the head stationary, the response of auditory neurons will not be affected by variations in eye position but will be determined by the location of the sound source. The hypothesis that auditory responses encode the trajectory of the eye movement required to look to the target (motor error) predicts that the response of auditory cells will depend on both the position of the sound source and the position of the eyes in the orbit. Extracellular single-unit recordings were obtained from neurons in the SC while monkeys made delayed saccades to auditory or visual targets in a darkened room. The coordinates of auditory signals were studied by plotting auditory receptive fields while the animal fixated one of three targets placed 24 degrees apart along the horizontal plane. For 99 of 121 SC cells, the spatial location of the auditory receptive field was significantly altered by the position of the eyes in the orbit. In contrast, the responses of five sound-sensitive cells isolated in the inferior colliculus were not affected by variations in eye position. 
The possibility that systematic variations in the position of the pinnae associated with different fixation positions could account for these findings was controlled for by plotting auditory receptive fields while the pinnae were mechanically restrained. Under these conditions, the position of the eyes in the orbit still had a significant effect on the responsiveness of collicular neurons to auditory stimuli. The average magnitude of the shift of the auditory receptive field with changes in eye position (12.9 degrees) did not correspond to the magnitude of the shift in eye position (24 degrees). Alternative explanations for this finding were considered. One possibility is that, within the SC, there is a gradual transition from auditory signals in head coordinates to signals in motor error coordinates.
Article
Contends that tests in repeated-measures designs based on MANOVA are free of sphericity assumptions, and with modern computing software, MANOVA is straightforward to use, even for complex designs and nontraditional hypotheses. A general strategy for implementing MANOVA within statistical computing packages is presented. Regular (preplanned) tests and simultaneous (post hoc) tests are illustrated for a variety of designs and hypotheses. Optimal contrasts for unbalanced repeated measures designs are appended.
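For a one-way repeated-measures design, the MANOVA approach reduces to Hotelling's T² on the k-1 within-subject difference scores, which needs no sphericity assumption. A minimal sketch (the function name and data layout are assumptions for the example, not taken from the paper):

```python
import numpy as np
from scipy import stats

def multivariate_rm_test(data):
    """Sphericity-free test for a (subjects x conditions) design:
    Hotelling's T^2 on the k-1 difference scores, converted to an
    exact F statistic with (k-1, n-k+1) degrees of freedom."""
    n, k = data.shape
    d = data[:, 1:] - data[:, :1]  # differences from condition 1
    p = k - 1
    dbar = d.mean(axis=0)
    s = np.atleast_2d(np.cov(d, rowvar=False))
    t2 = n * dbar @ np.linalg.solve(s, dbar)
    f = (n - p) / (p * (n - 1)) * t2
    return f, stats.f.sf(f, p, n - p)
```

Because the difference scores are treated as one multivariate observation per subject, their covariance structure is estimated rather than assumed, which is exactly what frees the test from the sphericity requirement.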
Article
Event-related brain potentials (ERPs) elicited by multidimensional auditory stimuli were recorded from the scalp in a selective-attention task. Subjects listened to tone pips varying orthogonally between two levels each of pitch, location, and duration and responded to longer duration target stimuli having specific values of pitch and location. The discriminability of the pitch and location attributes was varied between groups. By examining the ERPs to tones that shared pitch and/or locational cues with the designated target, we inferred interrelationships among the processing of these attributes. In all groups, stimuli that failed to match the target tone in an easily discriminable cue elicited only transitory ERP signs of selective processing. Tones sharing the "easy" but not the "hard" cue with the target elicited ERPs that indicated more extensive processing, but not as extensive as stimuli sharing both cues. In addition, reaction times and ERP latencies to the designated targets were not influenced by variations in the discriminability of pitch and location. This pattern of results is consistent with parallel, self-terminating models and holistic models of processing and contradicts models specifying either serial or exhaustive parallel processing of these dimensions. Both the parallel, self-terminating models and the holistic models provide a generalized mechanism for hierarchical stimulus selections that achieve an economy of processing, an underlying goal of classic, multiple-stage theories of selective attention.
Article
A research programme has been carried out that concerns the accuracy with which listeners can identify a speaker heard once before. The present study examined the voice-recognition abilities of blind listeners, and it was found that they could more accurately select target voices from the test arrays than could sighted people. However, the degree of blindness, the age at onset of blindness, and the number of years of blindness all failed to relate to voice-recognition accuracy.
Article
Clinical experience from humans deprived of vision at an early age indicates that recovery of vision is often rudimentary after late correction of the abnormalities in the eyes. Binocular deprivation in cats does not change the function of the visual cortex as much as monocular deprivation does. Therefore, the behavioural blindness observed after binocular deprivation is probably caused by changes central to area 17. We studied four monkeys deprived of vision by binocular lid closure from shortly after birth until the ages of 7 to 11 months and three control animals of the same age. After opening, the eyes of the deprived animals were normal, optokinetic nystagmus could be elicited, and microelectrode recordings of multiple unit activity in areas 17 and 19 indicated brisk responses to visual stimulation. Behaviourally, however, all deprived animals were blind. They bumped into obstacles, fell from tables and used their somatic sense for exploration. Only minute recovery of visual orientation was observed during several months after the opening of the eyes. Approximately 400 multiple unit recording penetrations were made in Brodmann's area 7 in the deprived monkeys and a similar number in the control monkeys using the transdural recording technique in conscious, behaving animals. The results indicated a profound decrease in the representation of vision in this area: the representation of visual mechanisms was reduced by 92% and the combined visual and somatic representation was reduced by 97% in the deprived monkeys. On the other hand, the representation of active somatic movement had increased by 117% and that of passive somaesthesia by 53%. Also the proportion of cell groups that could not be activated (only spontaneously active) increased fourfold. These results show that early visual deprivation alters the associative systems of the brain by reducing the efficiency of transmission along pathways that mediate visual influences.
Inputs from different sensory systems may compete for influence on the association cortex, disuse in one leading to its functional deterioration. Such changes may explain the lasting behavioural alterations that take place in man and monkeys after prolonged visual deprivation at an early age.
Article
The same 18 normally hearing students and 18 matched normal-sighted students, as in part I of this study, were compared in regard to pure-tone integration, speech discrimination ability and late cortical-evoked potentials. The blind subjects showed increased disinhibition ('cleaning') with broad-band noise and a decreased inhibition with the same stimulus at the contralateral ear, better speech discrimination, especially with regard to sentence tests without and with competing environment-simulating noise, and, by electric response audiometry (ERA), a shortened N1 latency. Thus, the hypothesis of a better utilization of auditory information after the loss of the visual information channel could be confirmed, and may be ascribed to the plasticity of the central nervous system.
Article
18 normally hearing blind students and 18 matched normal-sighted students were compared with regard to differential auditory sensitivity and acoustic reflex thresholds. Evaluation of interaural time differences for directional hearing and acoustic reflex thresholds showed no differences between the groups. The difference limen for intensity when interpreted by the SISI test, and the difference limen for frequency revealed a slightly optimizing trend in the blind group. However, differences missed statistical significance as to difference limen for intensity and, in the 3-orange, also as to difference limen for frequency. Any superior hearing of the blind was not mirrored by a corresponding superiority in peripheral supraliminal performance.
Article
Cats deprived visually from birth show few overt impairments in their natural behavior. Therefore, they seem well suited as an animal model for the study of compensatory plasticity after early vision loss. It can be demonstrated that binocularly deprived cats show improved abilities of auditory localization, and at least equal tactile behavior compared to normal controls. Within the anterior ectosylvian cortex of binocularly deprived cats, where different sensory modalities come together, the anterior ectosylvian visual area is completely taken over by auditory and somatosensory inputs. Furthermore, the auditory spatial tuning of single units in this cortical region is sharpened significantly as a result of visual deprivation. Somatosensory compensation for early loss of vision can be demonstrated by a hypertrophy of the facial vibrissae, and a corresponding expansion of their central representation in the somatosensory cortex of binocularly deprived animals. The compensatory changes in the cortex can be explained by a reorganization of sensory representations under the guidance of sensorimotor feedback rather than by instruction through an extraneous 'supervisory' signal. These processes might form the neural basis of sensory substitution in blind humans.
Article
Previous event-related potential (ERP) studies have suggested a possible participation of the visual cortex of the blind in auditory processing. In the present study, somatosensory and auditory ERPs of blind and sighted subjects were recorded when subjects were instructed to attend to stimuli of one modality and to ignore those of the other. Both modalities were stimulated with frequent ("standard") and infrequent ("deviant") stimuli, which differed from one another in their spatial locus of origin. In the sighted, deviant stimuli of the attended modality elicited N2 type of deflections (auditory N2b and somatosensory N250) over the lateral scalp areas. In contrast, in the blind, these ERP components were centroposteriorly distributed, suggesting an involvement of posterior brain areas in auditory and somatosensory stimulus discrimination. In addition, the mismatch negativity, elicited by deviant auditory stimuli even when the somatosensory stimuli were attended, was larger in the blind than in the sighted. This appears to indicate enhanced automatic processing of auditory stimulus changes in the blind. Thus, the present data suggest several compensatory changes in both auditory and somatosensory modalities after the onset of early visual deprivation.
Auditory event-related potentials (ERPs) were recorded in early blind subjects and sighted controls when they attended to stimuli delivered to a designated ear under dichotic conditions. The scalp distribution of the processing negativity (PN), the endogenous negativity elicited by attended stimuli, was in the blind posterior to that in the sighted. This suggests that posterior brain areas normally involved in vision participate in auditory selective attention in the early blind. Furthermore, occasional higher-frequency tones in the to-be-ignored ear elicited a negativity (presumably the mismatch negativity; MMN) that had a posterior scalp distribution in the blind as compared to controls. This suggests that the posterior brain areas of the blind also participate in processing of auditory stimulus changes occurring outside the focus of attention.
Article
The objective of the present study was to test if and to what extent phasic and tonic event-related potentials of the human EEG may reflect phenomena of cortical plasticity. In particular, it was tested whether the occipital cortex of blind subjects participates in the processing of non-visual stimuli. To this end, 12 blind and 12 blindfolded sighted subjects were tested in an auditory and a somatosensory discrimination task with 2 levels of discrimination difficulty. Slow and fast event-related potentials were recorded from 18 scalp electrodes. In addition to the negative slow waves found in sighted subjects over frontal and central sites during auditory and somatosensory discrimination, a pronounced negative wave was revealed in the blind also over occipital brain areas. These negative shifts were time-locked to the train of stimuli which had to be monitored with sustained attention, i.e. they rose and resolved with the beginning and the end of a 20-s discrimination time epoch. The P300 complex, on the other hand, which is a slow positive deflection over the posterior part of the scalp and which follows rare and task-relevant events 200-800 ms after stimulus onset, was significantly smaller at occipital electrodes in the blind than in the sighted subjects. Combined with neurophysiological and neuroanatomical evidence originating from studies with visually deprived animals, these data suggest that the occipital cortex of blind human subjects is coactivated whenever the system is engaged in a task which requires sustained attention and is less effectively inhibited at the end of a perceptual time epoch. In total, the data cast doubt on the hypothesis that the occipital cortex of blind subjects participates in modality-specific non-visual information processing.
Article
Although attention is distributed across time as well as space, the temporal allocation of attention has been less well researched than its spatial counterpart. A temporal analog of the covert spatial orientation task [Posner MI, Snyder CRR, Davidson BJ (1980) Attention and the detection of signals. J Exp Psychol Gen 109:160-174] was developed to compare the neural systems involved in directing attention to spatial locations versus time intervals. We asked whether there exists a general system for allocating attentional resources, independent of stimulus dimension, or whether functionally specialized brain regions are recruited for directing attention toward spatial versus temporal aspects of the environment. We measured brain activity in seven healthy volunteers by using positron emission tomography (PET) and in eight healthy volunteers by using functional magnetic resonance imaging (fMRI). The task manipulated cued attention to spatial locations (S) and temporal intervals (T) in a factorial design. Symbolic central cues oriented subjects toward S only (left or right), toward T only (300 msec or 1500 msec), toward both S and T simultaneously, or provided no information regarding S or T. Subjects also were scanned during a resting baseline condition. Behavioral data showed benefits and costs for performance during temporal attention similar to those established for spatial attention. Brain-imaging data revealed a partial overlap between neural systems involved in the performance of spatial versus temporal orientation of attention tasks. Additionally, hemispheric asymmetries revealed preferential right and left parietal activation for spatial and temporal attention, respectively. Parietal cortex was activated bilaterally by attending to both dimensions simultaneously. This is the first direct comparison of the neural correlates of attending to spatial versus temporal cues.
Article
Do blind persons develop capacities of their remaining senses that exceed those of sighted individuals? Besides anecdotal suggestions, two views based on experimental studies have been advanced. The first proposes that blind individuals should be severely impaired, given that vision is essential to develop spatial concepts. The second suggests that compensation occurs through the remaining senses, allowing them to develop an accurate concept of space. Here we investigate how an ecologically critical function, namely three-dimensional spatial mapping, is carried out by early-blind individuals with or without residual vision. Subjects were tested under monaural and binaural listening conditions. We find that early-blind subjects can map the auditory environment with equal or better accuracy than sighted subjects. Furthermore, unlike sighted subjects, they can correctly localize sounds monaurally. Surprisingly, blind individuals with residual peripheral vision localized sounds less precisely than sighted or totally blind subjects, confirming that compensation varies according to the aetiology and extent of blindness. Our results resolve a long-standing controversy in that they provide behavioural evidence that totally blind individuals have better auditory ability than sighted subjects, enabling them to compensate for their loss of vision.
Article
To test the hypothesis of auditory compensation after early visual deprivation, congenitally blind and sighted adults performed an auditory discrimination task. They had to detect a rare target tone among frequent standard tones. Stimuli were presented with different interstimulus intervals (ISIs) (200, 1000, 2000 ms), and the auditory event-related potentials to all tones and reaction times to targets were recorded. Increasing ISIs resulted in an increasing amplitude of the vertex response (N1-P2) in both groups, but this amplitude recovery was more pronounced in the blind. Furthermore, targets elicited larger and more posteriorly distributed N2 responses in the blind than in the sighted. Since target detection times were shorter in the blind as well, these findings imply compensatory adaptations within the auditory modality in humans blind from birth.
Article
Our attentional systems orient reflexively to novel environmental stimuli. Such attentive orienting is typically followed by a prolonged period of inhibition, known as inhibition of return (IOR), thought to be linked to the eye movement system. It is widely believed that IOR may provide a tagging mechanism that prevents perseveration, and thus facilitates attentional search. Using a tactile variant of the peripheral spatial cuing paradigm, we show IOR in congenitally blind adults and in an individual who had no eyes. These results demonstrate for the first time that spatial IOR can occur in the absence of oculomotor control.
Article
Blind people must rely more than sighted people on auditory input in order to acquire information about the world. The present study was designed to test the hypothesis that blind people have better memory than sighted individuals for auditory verbal material and specifically to determine whether memory encoding and/or retrieval are improved in blind adults. An incidental memory paradigm was employed in which 11 congenitally blind people and 11 matched sighted controls first listened to 80 sentences which ended either with a semantically appropriate or inappropriate word. Immediately following, the recognition phase occurred, in which all sentence terminal words were presented again randomly intermixed with the same number of new words. Participants indicated whether or not they had heard the word in the initial study phase. Event-related brain potentials (ERPs) were recorded from 28 electrode positions during both the encoding and the retrieval phase. Blind participants' memory performance was superior to that of sighted controls. In addition, during the recognition phase, previously presented words elicited ERPs with larger positive amplitudes than new words, particularly over the right hemisphere. During the study phase, words that would subsequently be recognized elicited a more pronounced late positive potential than words that were not subsequently recognized. These effects were reliable in the congenitally blind participants but could only be obtained in the subgroup of sighted participants who had the highest memory performance. These results imply that blind people encode auditory verbal material more efficiently than matched sighted controls and that this in turn allows them to recognize these items with a higher probability.
Article
My colleagues and I have investigated whether the temporal framework can be used to guide selective attention, and have applied non-invasive methodology to reveal the brain systems and mechanisms involved. Our findings show that we are able to orient attention selectively to different points in time, enhancing behavioral performance. These effects are mediated by a left-hemisphere dominant parietal-frontal system, which partially overlaps with the networks involved in spatial orienting. The neural system for temporal orienting also includes brain areas associated with motor preparation and anticipation, suggesting that sensorimotor areas with different specializations can contribute to attentional orienting depending on the stimulus attributes guiding selection. The optimization of behavior by temporal orienting involves enhancement of the latency and amplitude of event-related potentials that are associated with motor responses and decisions. The effects are distinct from those during visual spatial attention, indicating that behavioral advantages can be conferred by multiple types of neural mechanisms. Taken together, the findings illustrate the flexibility of attentional functions in the human brain.
Article
ERP researchers use differences in scalp distributions to infer differences in spatial configurations of neuroelectric generators. Since McCarthy and Wood (1985) demonstrated that a spatially fixed current source varying only in strength can yield a significant Condition x Electrode interaction in ANOVA, the recommended approach has been to normalize ERP amplitudes, for example, by vector length, prior to testing for interactions. The assumptions of this procedure are examined and it is shown via simulations that this application of vector scaling is both conceptually flawed and unsound in experimental practice. Because different spatial configurations of neural generators cannot reliably be inferred from different scalp topographies even after amplitude normalization, it is recommended that the procedure no longer be used for this purpose.
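The vector-scaling procedure this abstract critiques can be illustrated in a few lines. This is a minimal sketch of the standard normalization (dividing each condition's scalp distribution by its vector length across electrodes), not code from the paper; the function name `vector_scale` and the toy amplitude values are our own assumptions. It shows why the procedure was originally proposed: a topography that differs from another only in source strength becomes identical after scaling, removing the spurious Condition x Electrode interaction described by McCarthy and Wood (1985).

```python
import numpy as np

def vector_scale(amplitudes):
    """Normalize ERP amplitudes by vector length (McCarthy & Wood, 1985).

    amplitudes: 2-D array of shape (conditions, electrodes).
    Each row (one condition's scalp distribution) is divided by its
    Euclidean norm across electrodes, so only the shape of the
    topography remains, not its overall strength.
    """
    amplitudes = np.asarray(amplitudes, dtype=float)
    norms = np.linalg.norm(amplitudes, axis=1, keepdims=True)
    return amplitudes / norms

# Toy example: one generator recorded at three electrodes, and the same
# generator at doubled strength in a second condition.
cond_a = np.array([1.0, 2.0, 2.0])
cond_b = 2.0 * cond_a
scaled = vector_scale(np.vstack([cond_a, cond_b]))

# After scaling, the two topographies are identical: a pure strength
# difference no longer produces a Condition x Electrode interaction.
assert np.allclose(scaled[0], scaled[1])
# Each scaled row has unit vector length.
assert np.isclose(np.linalg.norm(scaled[0]), 1.0)
```

The abstract's point is that the converse inference fails: even after this normalization, differing scalp topographies cannot reliably be attributed to different generator configurations, which is why the authors recommend abandoning the procedure for that purpose.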
Article
An increasing number of animal and human studies suggests that different sensory systems share spatial representations in the brain. The aim of the present study was to test whether attending to auditory stimuli presented at a particular spatial location influences the processing of tactile stimuli at that position and vice versa (crossmodal attention). Moreover, it was investigated which processing stages are influenced by orienting attention to a certain stimulus modality (intermodal attention). Event-related brain potentials (ERPs) were recorded from 15 participants while tactile and auditory stimuli were presented at the left or right side of the body midline. The task of the participants was to attend to either the auditory or to the tactile modality, and to respond to infrequent double-stimuli of either the left or right side. Results showed that spatial attention modulated both early and late somatosensory and auditory ERPs when touch and tones were relevant, respectively. Moreover, early somatosensory (N70-100, N125-175) and auditory (N100-170) potentials, but not later deflections, were affected by spatial attention to the other modality, suggesting bi-directional crossmodal links between hearing and touch. Additionally, ERPs were modulated by intermodal selection mechanisms: stimuli elicited enhanced negative early and late ERPs when they belonged to the attended modality compared to those that belonged to the unattended modality. The present results provide evidence for the parallel influence of spatial and intermodal selection mechanisms at early processing stages while later processing steps are restricted to the relevant modality.
Article
Several recent reports suggest compensatory performance changes in blind individuals. It has, however, been argued that the lack of visual input leads to impoverished semantic networks resulting in the use of data-driven rather than conceptual encoding strategies on memory tasks. To test this hypothesis, congenitally blind and sighted participants encoded environmental sounds either physically or semantically. In the recognition phase, both conceptually as well as physically distinct and physically distinct but conceptually highly related lures were intermixed with the environmental sounds encountered during study. Participants indicated whether or not they had heard a sound in the study phase. Congenitally blind adults showed elevated memory both after physical and semantic encoding. After physical encoding blind participants had lower false memory rates than sighted participants, whereas the false memory rates of sighted and blind participants did not differ after semantic encoding. In order to address the question if compensatory changes in memory skills are restricted to critical periods during early childhood, late blind adults were tested with the same paradigm. When matched for age, they showed similarly high memory scores as the congenitally blind. These results demonstrate compensatory performance changes in long-term memory functions due to the loss of a sensory system and provide evidence for high adaptive capabilities of the human cognitive system.
Article
The present study investigated with event-related potentials whether attending to a moment in time modulates the processing of auditory stimuli at a similar early, perceptual level as attending to a location in space. The participants listened to short (600 ms) and long (1,200 ms) intervals marked by white noise bursts. The task was to attend in alternating runs either to the short or to the long intervals and to respond to rare offset markers that differed in intensity from the frequent standard offset markers. Prior to the to-be-attended moment, a slow negative potential developed over the frontal scalp. Stimuli presented at the attended compared to the unattended moments in time elicited an enhanced N1 and an enhanced posteriorly distributed positivity (300-370 ms). The results show that attention can be flexibly controlled in time and that not only late but also early perceptual processing stages are modulated by attending to a moment in time.
Article
Researchers have known for more than a century that crossing the hands can impair both tactile perception and the execution of appropriate finger movements. Sighted people find it more difficult to judge the temporal order of two tactile stimuli, one applied to either hand, when their hands are crossed over the midline than when they adopt a more typical uncrossed-hands posture. It has been argued that, because of the dominant role of vision in motor planning and execution, tactile stimuli are remapped into externally defined coordinates (predominantly determined by visual inputs), a remapping that takes longer when external and body-centered codes (determined primarily by somatosensory/proprioceptive inputs) are in conflict and that involves both multisensory parietal and visual cortex. Here, we show that the performance of late, but not of congenitally, blind people was impaired by crossing the hands. Moreover, we provide the first empirical evidence for superior temporal order judgments (TOJs) for tactile stimuli in the congenitally blind. These findings suggest a critical role of childhood vision in modulating the perception of touch that may arise from the emergence of specific crossmodal links during development.
Article
The performance in letter recognition of 21 blind participants was compared with that of 16 age-matched sighted participants in an n-back working memory task. Blind participants were tested tactually with series of raised letters and Braille characters, and sighted participants tactually with series of raised letters and visually with series of letters presented on a computer screen. With this approach, we wanted to compare the tactual performance of trained (blind) and non-trained (sighted) participants, and the tactual and visual performance of trained (blind and sighted, respectively) participants. Increments of mnemonic load significantly increased the number of incorrect responses in all n-back tasks. As expected, the blind participants significantly outperformed the sighted participants in the tactile raised-letters n-back task. The sighted participants produced significantly fewer incorrect responses in the visual task than the blind participants in the raised-letters task, whereas there was no such difference between sighted subjects' visual performance and blind subjects' tactile performance in the Braille task. These results demonstrate the degree and limits to which everyday practice develops perceptual skills, either in persons without sensory deficits (vision and a visual environment) or in persons with severe sensory loss (blindness and a tactile environment). The performance level of blind persons relying on their tactile skills is just about the same as that of sighted subjects relying on their visual skills.
Article
Absolute pitch (AP) is possessed by only a small percentage of musicians (typically < 20%). From a sample of 46 early blind subjects, we identified 21 who had musical training, 12 of whom (57.1%) reported having AP, reflecting markedly increased prevalence compared to sighted musicians, despite the fact that mean age of commencement of musical training was significantly later among blind than sighted AP musicians in our database. MR images acquired in a subset of blind AP musicians revealed greater variability in planum temporale asymmetry compared with the increased left-sided asymmetry previously described in sighted AP musicians. This suggests that neural mechanisms underlying AP in blind musicians could differ from those in sighted musicians.
Article
It has been shown that stimuli of a task-irrelevant modality receive enhanced processing when they are presented at an attended location in space (crossmodal attention). The present study investigated the effects of visual deprivation on the interaction of the intact sensory systems. Random streams of tactile and auditory stimuli were presented at the left or right index finger of congenitally blind participants. They had to attend to one modality (auditory or tactile) of one side (left or right) and had to respond to deviant stimuli of the attended modality and side. While in a group of sighted participants, early event-related potentials (ERPs) were negatively displaced to stimuli presented at the attended position, compared to the unattended, for both the task-relevant and the task-irrelevant modality, starting as early as 80 ms after stimulus onset (unimodal and crossmodal spatial attention effects, respectively), corresponding crossmodal effects could not be detected in the blind. In the sighted, spatial attention effects after 200 ms were only significant for the task-relevant modality, whereas a crossmodal effect for this late time window was observed in the blind. This positive rather than negative effect possibly indicates an active suppression of task-irrelevant stimuli at an attended location in space. The present data suggest that developmental visual input is essential for the use of space to integrate input of the non-visual modalities, possibly because of its high spatial resolution. Alternatively, enhanced perceptual skills of the blind within the intact modalities may result in reduced multisensory interactions ("inverse effectiveness of multisensory integration").
Article
Timing is essential to human behaviour, but the neural mechanisms underlying time perception are still unclear. New findings from a brain-imaging study by Coull et al. show that activity in a network of motor-related areas varies parametrically with attention to time. Given that a system in which timing is important (but not the primary function) is recruited when temporal judgements are required, we should perhaps reassess the notion of a dedicated timing system in the brain.
Article
Blind individuals manifest remarkable abilities in navigating through space despite their lack of vision. They have previously been shown to perform normally or even supra-normally in tasks involving spatial hearing in near space, a region that, however, can be calibrated with sensory-motor feedback. Here we show that blind individuals not only properly map auditory space beyond their peri-personal environment but also demonstrate supra-normal performance when subtle acoustic cues for target location and distance must be used to carry out the task. Moreover, it is generally postulated that such abilities rest in part on cross-modal cortical reorganizations, particularly in the immature brain, where important synaptogenesis is still possible. Nonetheless, we show for the first time that even late-onset blind subjects develop above-normal spatial abilities, suggesting that significant compensation can occur in the adult.
Article
Early-onset blindness (EB) produces measurable advantages in auditory perception, attention, memory and language. Neville and Bavelier [Neville, H. J., & Bavelier, D. (2001). Variability of developmental plasticity. In J. L. McClelland & R. S. Siegler (Eds.), Mechanisms of cognitive development: Behavioral and neural perspectives. Carnegie Mellon Symposia on Cognition (pp. 271-301)] hypothesized that faster temporal processing underlies many auditory compensatory effects in the blind. We tested this hypothesis by comparing early-onset blind individuals and sighted counterparts (SC), assessing their rates of perceptual consolidation, the accurate perceptual representation of auditory stimuli. First, we tested both groups on a temporal-order judgment task (TOJ). EB subjects had significantly lower TOJ thresholds than the SC subjects. Second, we assessed perceptual consolidation speed using auditory backward masking tasks, taking into account individual TOJ thresholds. Discrimination performance was unaffected at all mask delays in the EB group, while the SC subjects needed a mask delay of 160 ms to perform comparably. A backward masking task using single-tone stimuli found no differences between the EB and SC groups at any mask delay. A simultaneous masking task demonstrated that the mask effectively impaired discrimination in EB subjects at sensory stages. These results suggest that advantages in perceptual consolidation may reflect a mechanism responsible for the short response times and better performance reported in early blind individuals across a number of complex auditory tasks.