CNS activation and regional connectivity during pantomime observation: No engagement of the mirror neuron system for deaf signers

Laboratory for Language and Cognitive Neuroscience, San Diego State University, 6495 Alvarado Road, San Diego, CA 92120, USA.
NeuroImage (Impact Factor: 6.36). 09/2009; 49(1):994-1005. DOI: 10.1016/j.neuroimage.2009.08.001
Source: PubMed


Deaf signers have extensive experience using their hands to communicate. Using fMRI, we examined the neural systems engaged during the perception of manual communication in 14 deaf signers and 14 hearing non-signers. Participants passively viewed blocked video clips of pantomimes (e.g., peeling an imaginary banana) and action verbs in American Sign Language (ASL) that were rated as meaningless by non-signers (e.g., TO-DANCE). In contrast to visual fixation, pantomimes strongly activated fronto-parietal regions (the mirror neuron system, MNS) in hearing non-signers, but only bilateral middle temporal regions in deaf signers. When contrasted with ASL verbs, pantomimes selectively engaged inferior and superior parietal regions in hearing non-signers, but right superior temporal cortex in deaf signers. The perception of ASL verbs recruited similar regions as pantomimes for deaf signers, with some evidence of greater involvement of left inferior frontal gyrus for ASL verbs. Functional connectivity analyses with left hemisphere seed voxels (ventral premotor, inferior parietal lobule, fusiform gyrus) revealed robust connectivity with the MNS for the hearing non-signers. For the fusiform gyrus seed voxel, deaf signers exhibited functional connectivity with the right hemisphere that was not observed for the hearing group. We suggest that life-long experience with manual communication, and/or auditory deprivation, may alter regional connectivity and brain activation when viewing pantomimes. We conclude that the lack of activation within the MNS for deaf signers does not support an account of human communication that depends upon automatic sensorimotor resonance between perception and action.

    • "In this sense, the MNS hypothesis has been proposed to provide a neural basis for this transition (Arbib, 2005). Interestingly, using functional neuroimaging, Emmorey et al. (2010) reported that deaf signers displayed different patterns of brain activation when passively viewing pantomimes and ASL signs compared to hearing non-signers. Pantomimes strongly activated frontoparietal regions (MNS) in hearing non-signers, but only bilateral middle temporal regions in deaf signers. "
    ABSTRACT: The capacity for language is arguably the most remarkable innovation of the human brain. A relatively recent interpretation proposes that part of the language-related circuits were co-opted from circuitry involved in hand control: the mirror neuron system (MNS), which is involved both in the perception and in the execution of voluntary grasping actions. A less radical view is that in early humans, communication was opportunistic and multimodal, using signs, vocalizations, or whatever means were available to transmit social information. However, one point that is not yet clear under either perspective is how learned communication acquired a semantic property, thereby allowing us to name objects and eventually describe our surrounding environment. Here we suggest a scenario involving both manual gestures and learned vocalizations that led to the development of a primitive form of conventionalized reference. This proposal is based on comparative evidence gathered from other species and on neurolinguistic evidence in humans, which points to a crucial role for vocal learning in the early development of language. Firstly, the capacity to direct the attention of others to a common object may have been crucial for developing a consensual referential system. Pointing, which is a ritualized grasping gesture, may have been crucial to this end. Vocalizations also served to generate joint attention among conversants, especially when combined with gaze direction. Another contributing element was the development of pantomimic actions resembling events or animals. In conjunction with this mimicry, the development of plastic neural circuits that support complex, learned vocalizations was probably a significant factor in the evolution of conventionalized semantics in our species. Thus, vocal imitations of sounds, as in onomatopoeias (words whose sound resembles their meaning), are possibly supported by mirror system circuits, and may have been relevant in the acquisition of early meanings.
    Full-text · Article · Aug 2014 · Frontiers in Human Neuroscience
    • "Additionally, functional connectivity analyses of TNN in hearing populations show that LTC activation has the weakest correlation with the other hubs in the default mode activation (Buckner, Andrews & Schacter, 2008; Andrews-Hanna et al., 2010), suggesting that it might not be central to TNN's function. Moreover, the observation that lifelong visual language experience leads to changes in TNN that include an increase in connectivity of right IPL and MTG cortices, and PCC and left MTG, contributes to the laterality debate surrounding sign language processing (Neville et al., 1998; MacSweeney et al., 2002; Allen et al., 2008; Emmorey et al., 2010), suggesting that the increase in bilateral activation during sign language processing, as compared to spoken language, is not task-specific. Rather, repeated exposure to, and practice in comprehension of, sign language appear to lead to profound alterations in functional connectivity, as demonstrated by our data, as well as structural changes, such as an increase in right hemisphere white matter volume (Allen et al., 2008). "
    ABSTRACT: Prior studies investigating cortical processing in Deaf signers suggest that life-long experience with sign language and/or auditory deprivation may alter the brain's anatomical structure and the function of brain regions typically recruited for auditory processing (Emmorey et al., 2010; Pénicaud et al., 2013 inter alia). We report the first investigation of the task-negative network in Deaf signers and its functional connectivity (the temporal correlations among spatially remote neurophysiological events). We show that Deaf signers manifest increased functional connectivity between posterior cingulate/precuneus and left medial temporal gyrus (MTG), but also inferior parietal lobe and medial temporal gyrus in the right hemisphere, areas that have been found to show functional recruitment specifically during sign language processing. These findings suggest that the organization of the brain at the level of inter-network connectivity is likely affected by experience with processing visual language, although sensory deprivation could be another source of the difference. We hypothesize that connectivity alterations in the task-negative network reflect predictive/automatized processing of the visual signal.
    Full-text · Article · Jun 2014 · PeerJ
    • "In addition to our speech capacity, gesturing is a flexible communicative tool which humans use to communicate both concrete and abstract information via the visual modality. Previous studies on object- or person-related gesture processing have either presented pantomimes of tool or object use, hands grasping for tools or objects (e.g., Decety et al., 1997; Faillenot et al., 1997; Decety and Grèzes, 1999; Grèzes and Decety, 2001; Buxbaum et al., 2005; Filimon et al., 2007; Pierno et al., 2009; Biagi et al., 2010; Davare et al., 2010; Emmorey et al., 2010; Jastorff et al., 2010); or symbolic gestures like "thumbs up" (Nakamura et al., 2004; Molnar-Szakacs et al., 2007; Husain et al., 2009; Xu et al., 2009; Andric et al., 2013). However, few studies have directly compared abstract-social (person-related) with concrete object-related gestures. "
    ABSTRACT: Abstractness and modality of interpersonal communication have a considerable impact on comprehension. They are relevant for determining thoughts and constituting internal models of the environment. Whereas concrete object-related information can be represented in mind irrespective of language, abstract concepts require a representation in speech. Consequently, modality-independent processing of abstract information can be expected. Here we investigated the neural correlates of abstractness (abstract vs. concrete) and modality (speech vs. gestures), to identify an abstractness-specific supramodal neural network. During fMRI data acquisition, 20 participants were presented with videos of an actor either speaking sentences with an abstract-social [AS] or concrete-object-related content [CS], or performing meaningful abstract-social emblematic [AG] or concrete-object-related tool-use gestures [CG]. Gestures were accompanied by a foreign language to increase the comparability between conditions and to frame the communication context of the gesture videos. Participants performed a content judgment task referring to the person- vs. object-relatedness of the utterances. The behavioral data suggest a comparable comprehension of contents communicated by speech or gesture. Furthermore, we found common neural processing for abstract information independent of modality (AS>CS ∩ AG>CG) in a left hemispheric network including the left inferior frontal gyrus, temporal pole, and medial frontal cortex. Modality-specific activations were found in bilateral occipital, parietal, and temporal as well as right inferior frontal brain regions for gesture (G>S), and in left anterior temporal regions and the left angular gyrus for the processing of speech semantics (S>G). These data support the idea that abstract concepts are represented in a supramodal manner. Consequently, gestures referring to abstract concepts are processed in a predominantly left hemispheric, language-related neural network.
    Full-text · Article · Sep 2013 · Frontiers in Behavioral Neuroscience