Different categories of living and non-living sound-sources activate distinct cortical networks.
ABSTRACT: With regard to hearing perception, it remains unclear whether, or to what extent, different conceptual categories of real-world sounds and related categorical knowledge are differentially represented in the brain. Semantic knowledge representations are reported to include the major divisions of living versus non-living things, plus more specific categories including animals, tools, biological motion, faces, and places: categories typically defined by their characteristic visual features. Here, we used functional magnetic resonance imaging (fMRI) to identify brain regions showing preferential activity to four categories of action sounds: non-vocal human and animal actions (living), plus mechanical and environmental sound-producing actions (non-living). The results showed a striking antero-posterior division in cortical representations for sounds produced by living versus non-living sources. Additionally, several categories showed significant differences depending on whether the task was category-specific (e.g. human or not) or non-specific (detect end-of-sound). In general, (1) human-produced sounds yielded robust activation in the bilateral posterior superior temporal sulci independent of task; task demands modulated activation of left-lateralized fronto-parietal regions, bilateral insular cortices, and sub-cortical regions previously implicated in observation-execution matching, consistent with "embodied" and mirror-neuron network representations subserving recognition. (2) Animal action sounds preferentially activated the bilateral posterior insulae. (3) Mechanical sounds activated the anterior superior temporal gyri and parahippocampal cortices. (4) Environmental sounds preferentially activated dorsal occipital and medial parietal cortices.
Overall, this multi-level dissociation of networks for preferentially representing distinct sound-source categories provides novel support for grounded cognition models that may underlie organizational principles for hearing perception.
Article: Intensity-related performances are modified by long-term hearing aid use: a functional plasticity?
ABSTRACT: It is now well established that the adult central nervous system can reorganize following various environmental changes. In particular, it has been hypothesized that auditory rehabilitation of sensorineural hearing-impaired adults may involve functional plasticity. The present study compared intensity-related performance between two groups of subjects matched for age, gender, and absolute thresholds in both ears: one group comprised long-term binaural hearing aid (HA) users, the other non-HA users. The effect of HA use was measured in two intensity tasks: a discrimination-limen-for-intensity (DLI) task and a loudness-scaling task. Results indicated significant differences in loudness perception between long-term HA users and non-HA users, with the latter rating intensities as louder than the former. Concerning intensity discrimination, a statistical tendency toward lower (i.e. better) DLIs in long-term HA users than in non-HA users was revealed. Moreover, significant differences between ears were observed in the loudness-scaling task, with the right ear showing a greater inter-group difference than the left ear. This additional result points to a lateralization of the acclimatization effect. Finally, this study suggests significant perceptual modification, and thus possible functional plasticity, entailed by HA use.
Hearing Research 04/2002; 165(1-2):142-51.
Article: Functional connectivity in the resting brain: a network analysis of the default mode hypothesis.
ABSTRACT: Functional imaging studies have shown that certain brain regions, including posterior cingulate cortex (PCC) and ventral anterior cingulate cortex (vACC), consistently show greater activity during resting states than during cognitive tasks. This finding led to the hypothesis that these regions constitute a network supporting a default mode of brain function. In this study, we investigate three questions pertaining to this hypothesis: Does such a resting-state network exist in the human brain? Is it modulated during simple sensory processing? How is it modulated during cognitive processing? To address these questions, we defined PCC and vACC regions that showed decreased activity during a cognitive (working memory) task, then examined their functional connectivity during rest. PCC was strongly coupled with vACC and several other brain regions implicated in the default mode network. Next, we examined the functional connectivity of PCC and vACC during a visual processing task and found that the resultant connectivity maps were virtually identical to those obtained during rest. Last, we defined three lateral prefrontal regions showing increased activity during the cognitive task and examined their resting-state connectivity. We report significant inverse correlations among all three lateral prefrontal regions and PCC, suggesting a mechanism for attenuation of default mode network activity during cognitive processing. This study constitutes, to our knowledge, the first resting-state connectivity analysis of the default mode and provides the most compelling evidence to date for the existence of a cohesive default mode network. Our findings also provide insight into how this network is modulated by task demands and what functions it might subserve.
Proceedings of the National Academy of Sciences 02/2003; 100(1):253-8.
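The core of the seed-based analysis described above is correlating a seed region's time series with every other voxel's time series. The sketch below illustrates that idea on synthetic data; it is a minimal illustration of the general technique, not the authors' actual pipeline, and the array shapes, coupling strength, and region labels are all assumed for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_timepoints, n_voxels = 200, 500

# Synthetic "resting-state" data: rows are time points, columns are voxels.
brain = rng.standard_normal((n_timepoints, n_voxels))

# A seed time series (e.g. the mean signal over a hypothetical PCC ROI).
# Make the first 50 voxels partially coupled to the seed to mimic a network.
seed = rng.standard_normal(n_timepoints)
brain[:, :50] += 0.8 * seed[:, None]

def seed_connectivity(seed_ts, data):
    """Pearson correlation of a seed time series with every voxel."""
    s = (seed_ts - seed_ts.mean()) / seed_ts.std()
    d = (data - data.mean(axis=0)) / data.std(axis=0)
    return (s @ d) / len(s)

r = seed_connectivity(seed, brain)
# Voxels coupled to the seed correlate more strongly than uncoupled ones;
# anti-correlated regions (like the lateral prefrontal finding) would show
# negative r values in the same map.
print(r.shape, round(r[:50].mean(), 2), round(r[50:].mean(), 2))
```

In a real analysis the seed time series would be extracted from preprocessed fMRI data after nuisance regression, and the correlation map thresholded for significance, but the arithmetic per voxel is this simple.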
ABSTRACT: One of the properties that most conspicuously distinguishes human language from any other form of animal communication is generativity. Language with this property therefore presumably evolved with the Homo line somewhere between H. habilis and H. sapiens sapiens. Some have suggested that it emerged relatively suddenly and completely with H. sapiens sapiens, and this view is consistent with (a) linguistic estimates as to when vocal language emerged, (b) the relatively late "explosion" of manufacture and cultural artifacts such as body ornamentation and cave drawings, and (c) evidence on changes in the vocal apparatus. However, evidence on brain size and developmental patterns of growth suggests an earlier origin and a more continuous evolution. I propose that these scenarios can be reconciled if it is supposed that generative language evolved, perhaps from H. habilis on, as a system of manual gestures, but switched to a predominantly vocal system with H. sapiens sapiens. The subsequent "cultural explosion" can then be attributed to the freeing of the hands from primary involvement in language, so that they could be exploited, along with generativity, for manufacture, art, and other activities.
Cognition 10/1992; 44(3):197-226.