Unattended emotional faces elicit early lateralized amygdala–frontal and fusiform activations

Article in NeuroImage 50(2):727-733 · April 2010
Abstract
Human adaptive behaviour to potential threats involves specialized brain responses allowing rapid and reflexive processing of the sensory input and a more directed processing for later evaluation of the nature of the threat. The amygdalae are known to play a key role in emotion processing. It is suggested that the amygdalae process threat-related information through a fast subcortical route and slower cortical feedback. Evidence from human data supporting this hypothesis is lacking. The present study investigated event-related neural responses during processing of facial emotions in the unattended hemifield using magnetoencephalography (MEG) and found activations of the amygdala and anterior cingulate cortex to fear as early as 100 ms. The right amygdala exhibited temporally dissociated activations to input from different visual fields, suggesting early subcortical versus later cortical processing of fear. We also observed asymmetrical fusiform activity related to lateralized feed-forward processing of the faces in the visual-ventral stream. Results demonstrate fast, automatic, and parallel processing of unattended emotional faces, providing important insights into the specific and dissociated neural pathways in emotion and face perception.

  • Article
    Full-text available
    The role of consciousness in learning has been debated for nearly 50 years. Recent studies suggest that conscious awareness is needed to bridge the gap when learning about two events that are separated in time, as is true for trace fear conditioning. This has been repeatedly shown and seems to apply to other forms of classical conditioning as well. In contrast to these findings, we show that individuals can learn to associate a face with the later occurrence of a shock, even if they are unable to perceive the face. We used a novel application of magnetoencephalography (MEG) to non-invasively record neural activity from the amygdala, which is known to be important for fear learning. We demonstrate rapid (∼170-200 ms) amygdala responses during the stimulus free period between the face and the shock. These results suggest that unperceived faces can serve as signals for impending threat, and that rapid, automatic activation of the amygdala contributes to this process. In addition, we describe a methodology that can be applied in the future to study neural activity with MEG in other subcortical structures.
  • Article
    Full-text available
    Magnetoencephalography (MEG) is a neurophysiological technique that detects the magnetic fields associated with brain activity. Synthetic aperture magnetometry (SAM), a MEG magnetic source imaging technique, can be used to construct both detailed maps of global brain activity and virtual electrode signals, which provide information similar to invasive electrode recordings (a minimal beamformer virtual-electrode sketch appears after this list). This innovative approach has demonstrated utility in both clinical and research settings. For individuals with epilepsy, MEG provides valuable, nonredundant information. MEG accurately localizes the irritative zone associated with interictal spikes, often detecting epileptiform activity that other methods cannot, and may give localizing information when other methods fail. These capabilities potentially greatly increase the population eligible for epilepsy surgery and improve planning for those undergoing surgery. MEG methods can be readily adapted to research settings, allowing noninvasive assessment of whole-brain neurophysiological activity, with a theoretical spatial range down to submillimeter voxels, in both humans and nonhuman primates. The combination of clinical and research activities with MEG offers a unique opportunity to advance translational research from bench to bedside and back.
  • Article
    Full-text available
    It is well known that we continuously filter incoming sensory information, selectively allocating attention to what is important while suppressing distracting or irrelevant information. Yet questions remain about spatiotemporal patterns of neural processes underlying attentional biases toward emotionally significant aspects of the world. One index of affectively biased attention is an emotional variant of an attentional blink (AB) paradigm, which reveals enhanced perceptual encoding for emotionally salient over neutral stimuli under conditions of limited executive attention. The present study took advantage of the high spatial and temporal resolution of magnetoencephalography (MEG) to investigate neural activation related to emotional and neutral targets in an AB task. MEG data were collected while participants performed a rapid serial visual presentation task in which two target stimuli were embedded in a stream of distractor words. The first target (T1) was a number and the second (T2) either an emotionally salient or neutral word. Behavioural results replicated previous findings of greater accuracy for emotionally salient than neutral T2 words. MEG source analyses showed that activation in orbitofrontal cortex, characterized by greater power in the theta and alpha bands, and dorsolateral prefrontal activation were associated with successful perceptual encoding of emotionally salient relative to neutral words. These effects were observed between 250 and 550 ms, latencies associated with discrimination of perceived from unperceived stimuli. These data suggest that important nodes of both emotional salience and frontoparietal executive systems are associated with the emotional modulation of the attentional blink.
  • Article
    Full-text available
    The amygdala is a key brain region for face perception. While the role of the amygdala in the perception of facial emotion and gaze has been extensively highlighted with fMRI, the unfolding in time of amygdala responses to emotional versus neutral faces with different gaze directions is scarcely known. Here we addressed this question in healthy subjects using MEG combined with an original source imaging method based on individual amygdala volume segmentation and the localization of sources in the amygdala volume. We found an early peak of amygdala activity that was enhanced for fearful relative to neutral faces between 130 and 170 ms. The effect of emotion was again significant in a later time range (310-350 ms). Moreover, the amygdala response was greater for direct relative to averted gaze between 190 and 350 ms, and this effect was selective for fearful faces in the right amygdala. Altogether, our results show that the amygdala is involved in the processing and integration of emotion and gaze cues from faces in different time ranges, thus underlining its role in multiple stages of face perception.
  • Article
    Full-text available
    Clinical evidence of variable epileptic propagation in occipital lobe epilepsy (OLE) has been demonstrated by several studies. However, the exact localization of the epileptic focus sometimes represents a problem because of the rapid propagation to frontal, parietal, or temporal regions. Each white matter pathway close to the supposed initial focus can lead the propagation towards a specific direction, explaining the variable semiology of these rare epilepsy syndromes. Some new insights into occipital white matter anatomy are herein described by means of white matter dissection and compared to the classical epileptic patterns, mostly based on the central position of the primary visual cortex. The dissections showed a complex white matter architecture composed of vertical and longitudinal bundles, which are closely interconnected and segregated and are able to support specific high-order functions with parallel bidirectional propagation of the electric signal. The same sublobar lesions may hyperactivate different white matter bundles, re-emphasizing the importance of the ictal semiology as a specific clinical demonstration of the subcortical networks recruited. Merging semiology, white matter anatomy, and electrophysiology may lead us to a better understanding of these complex syndromes and tailored therapeutic options based on individual white matter connectivity.
  • Article
    Full-text available
    Emotional facial expressions provide important non-verbal cues as to the imminent behavioural intentions of a second party. Hence, within emotion science the processing of faces (emotional or otherwise) has been at the forefront of research. Notably, however, such research has led to a number of debates, including the ecological validity of utilising schematic faces in emotion research and the face-selectivity of the N170. In order to investigate these issues, we explored the extent to which the N170 is modulated by schematic faces, emotional expression and/or selective attention. Eighteen participants completed a three-stimulus oddball paradigm with two scrambled faces as the target and standard stimuli (counter-balanced across participants), and schematic angry, happy and neutral faces as the oddball stimuli. Results revealed that the magnitude of the N170 associated with the target stimulus was: (i) significantly greater than that elicited by the standard stimulus, (ii) comparable with the N170 elicited by the neutral and happy schematic face stimuli, and (iii) significantly reduced compared to the N170 elicited by the angry schematic face stimulus. These findings extend the current literature by demonstrating that the N170 can be modulated by events other than those associated with structural face encoding; i.e. here, the act of labelling a stimulus a ‘target’ to attend to modulated the N170 response. Additionally, the observation that schematic faces elicit N170 responses similar to those recorded for real faces and that, akin to real faces, angry schematic faces elicit heightened N170 responses suggests caution should be taken before disregarding schematic facial stimuli in emotion processing research per se.
  • Article
    Full-text available
    Emotion regulation has an important role in child development and psychopathology. Reappraisal, a cognitive regulation technique, can be used effectively by children. Moreover, the late positive potential (LPP), an ERP component known to reflect emotional processing, can be modulated by children using reappraisal, and this modulation is also related to children's emotional adjustment. The present study seeks to elucidate the neural generators of such LPP effects. To this end, children aged 8-14 years reappraised emotional faces while neural activity in an LPP time window was estimated using magnetoencephalography-based source localization. Additionally, neural activity was correlated with two indices of emotional adjustment and with age. Reappraisal reduced activity in the left dorsolateral prefrontal cortex during down-regulation and enhanced activity in the right parietal cortex during up-regulation. Activity in the visual cortex decreased with increasing age, more adaptive emotion regulation and less anxiety. Results demonstrate that reappraisal changed activity within a frontoparietal network in children. Decreasing activity in the visual cortex with increasing age is suggested to reflect neural maturation. A similar decrease with adaptive emotion regulation and less anxiety implies that better emotional adjustment may be associated with an advance in neural maturation.
  • Article
    Full-text available
    The prefrontal cortex is responsible for emotional conflict resolution, and this control mechanism is affected by the emotional valence of distracting stimuli. In the present study, we investigated the effects of negative and positive stimuli on emotional conflict control using a face-word Stroop task in combination with functional brain imaging. Emotional conflict was absent in the negative face context, in accordance with the null activation observed in areas involved in emotional face processing (fusiform face area, middle temporal/occipital gyrus). Importantly, these visual areas were negatively coupled with the dorsolateral prefrontal cortex (DLPFC). In contrast, a significant emotional conflict was observed in the positive face context, and this effect was accompanied by activation in areas associated with emotional face processing and in the default mode network (DMN); here, the DLPFC was mainly negatively coupled with the DMN rather than with visual areas. These results suggest that the conflict control mechanism operates differently for negative and positive faces: it is implemented more efficiently in the negative face condition, whereas it is more devoted to inhibiting internal interference in the positive face condition. This study thus suggests a plausible mechanism of emotional conflict resolution in which the rapid pathway for negative emotion processing efficiently triggers control mechanisms to resolve emotional conflict pre-emptively.
  • Article
    There is increasing appreciation that network-level interactions among regions produce components of face processing previously ascribed to individual regions. Our goals were to use an exhaustive data-driven approach to derive and quantify the topology of directed functional connections within a priori defined nodes of the face processing network and evaluate whether the topology is category-specific. Young adults were scanned with fMRI as they viewed movies of faces, objects, and scenes. We employed GIMME to model effective connectivity among core and extended face processing regions, which allowed us to evaluate all possible directional connections under each viewing condition (face, object, place). During face processing, we observed directional connections from the right posterior superior temporal sulcus to both the right occipital face area and right fusiform face area (FFA), which does not reflect the topology reported in prior studies. We observed connectivity between core and extended regions during face processing, but this was limited to a feed-forward connection from the FFA to the amygdala. Finally, the topology of connections was unique to face processing. These findings suggest that the pattern of directed functional connections within the face processing network, particularly in the right core regions, may not be as hierarchical and feed-forward as described previously. Our findings support the notion that topologies of network connections are specialized, emergent, and dynamically responsive to task demands.
  • Article
    Full-text available
    The processing of emotional as compared to neutral information is associated with different patterns of eye movement and neural activity. However, the 'emotionality' of a stimulus can be conveyed not only by its physical properties, but also by the information that is presented with it. There is very limited work examining how emotional information may influence the immediate perceptual processing of otherwise neutral information. We examined how presenting an emotion label for a neutral face may influence subsequent processing by using eye movement monitoring (EMM) and magnetoencephalography (MEG) simultaneously. Participants viewed a series of faces with neutral expressions. Each face was followed by a unique negative or neutral sentence to describe that person, and then the same face was presented in isolation again. Viewing of faces paired with a negative sentence was associated with increased early viewing of the eye region and increased neural activity between 600 and 1200 ms in emotion processing regions such as the cingulate, medial prefrontal cortex, and amygdala, as well as in posterior regions such as the precuneus and occipital cortex. Viewing of faces paired with a neutral sentence was associated with increased activity in the parahippocampal gyrus during the same time window. By monitoring behavior and neural activity within the same paradigm, these findings demonstrate that emotional information alters subsequent visual scanning and the neural systems that are presumably invoked to maintain a representation of the neutral information along with its emotional details.
  • Article
    Full-text available
    A plethora of research demonstrates that the processing of emotional faces is prioritised over non-emotive stimuli when cognitive resources are limited (this is known as 'emotional superiority'). However, there is debate as to whether competition for processing resources results in emotional superiority per se, or more specifically, threat superiority. Therefore, to investigate prioritisation of emotional stimuli for storage in visual short-term memory (VSTM), we devised an original VSTM report procedure using schematic (angry, happy, neutral) faces in which processing competition was manipulated. In Experiment 1, display exposure time was manipulated to create competition between stimuli. Participants (n = 20) had to recall a probed stimulus from a set size of four under high (150 ms array exposure duration) and low (400 ms array exposure duration) perceptual processing competition. For the high competition condition (i.e. 150 ms exposure), results revealed an emotional superiority effect per se. In Experiment 2 (n = 20), we increased competition by manipulating set size (three versus five stimuli), whilst maintaining a constrained array exposure duration of 150 ms. Here, for the five-stimulus set size (i.e. maximal competition), only threat superiority emerged. These findings demonstrate attentional prioritisation for storage in VSTM for emotional faces. We argue that task demands modulated the availability of processing resources and consequently the relative magnitude of the emotional/threat superiority effect, with only threatening stimuli prioritised for storage in VSTM under more demanding processing conditions. Our results are discussed in light of models and theories of visual selection; they not only combine the two strands of research (i.e. visual selection and emotion) but also highlight that a critical factor in the processing of emotional stimuli is the availability of processing resources, which is further constrained by task demands.
  • Data
    Full-text available
    Dataset for the preceding article. Citation: Simione L, Calabrese L, Marucci FS, Belardinelli MO, Raffone A, et al. (2014) Emotion Based Attentional Priority for Storage in Visual Short-Term Memory. PLoS ONE 9(5): e95261. doi:10.1371/journal.pone.0095261
  • Article
    Full-text available
    Impaired social interaction is one of the hallmarks of Autism Spectrum Disorder (ASD). Emotional faces are arguably the most critical visual social stimuli and the ability to perceive, recognize, and interpret emotions is central to social interaction and communication, and subsequently healthy social development. However, our understanding of the neural and cognitive mechanisms underlying emotional face processing in adolescents with ASD is limited. We recruited 48 adolescents, 24 with high functioning ASD and 24 typically developing controls. Participants completed an implicit emotional face processing task in the MEG. We examined spatiotemporal differences in neural activation between the groups during implicit angry and happy face processing. While there were no differences in response latencies between groups across emotions, adolescents with ASD had lower accuracy on the implicit emotional face processing task when the trials included angry faces. MEG data showed atypical neural activity in adolescents with ASD during angry and happy face processing, which included atypical activity in the insula, anterior and posterior cingulate and temporal and orbitofrontal regions. Our findings demonstrate differences in neural activity during happy and angry face processing between adolescents with and without ASD. These differences in activation in social cognitive regions may index the difficulties in face processing and in comprehension of social reward and punishment in the ASD group. Thus, our results suggest that atypical neural activation contributes to impaired affect processing, and thus social cognition, in adolescents with ASD.
  • Article
    Full-text available
    Autism spectrum disorder (ASD) is a neurodevelopmental disorder characterized by impairments in social cognition. The biological basis of deficits in social cognition in ASD, and their difficulty in processing emotional face information in particular, remains unclear. Atypical communication within and between brain regions has been reported in ASD. Interregional phase-locking is a neurophysiological mechanism mediating communication among brain areas and is understood to support cognitive functions. In the present study we investigated interregional magnetoencephalographic phase synchronization during the perception of emotional faces in adolescents with ASD. A total of 22 adolescents with ASD (18 males, mean age = 14.2 ± 1.15 years, 22 right-handed) with mild to no cognitive delay and 17 healthy controls (14 males, mean age = 14.4 ± 0.33 years, 16 right-handed) performed an implicit emotional processing task requiring perception of happy, angry and neutral faces while we recorded neuromagnetic signals. The faces were presented rapidly (80 ms duration) to the left or right of a central fixation cross and participants responded to a scrambled pattern that was presented concurrently on the opposite side of the fixation point. Task-dependent interregional phase-locking was calculated among source-resolved brain regions (a minimal sketch of this kind of phase-locking computation appears after this list). Task-dependent increases in interregional beta synchronization were observed. Beta-band interregional phase-locking in adolescents with ASD was reduced, relative to controls, during the perception of angry faces in a distributed network involving the right fusiform gyrus and insula. No significant group differences were found for happy or neutral faces, or for other analyzed frequency ranges. Significant reductions in task-dependent beta connectivity strength, clustering and eigenvector centrality (all P < 0.001) in the right insula were found in adolescents with ASD relative to controls. Reduced beta synchronization may reflect inadequate recruitment of task-relevant networks during emotional face processing in ASD. The right insula, specifically, was a hub of reduced functional connectivity and may play a prominent role in the inability to effectively extract emotional information from faces. These findings suggest that functional disconnection in brain networks mediating emotional processes may contribute to deficits in social cognition in this population.
  • Article
    Full-text available
    Recent evidence suggests that disruption of integrative processes in sensation and perception may play a critical role in cognitive and behavioural atypicalities characteristic of ASD. In line with this, ASD is associated with altered structural and functional brain connectivity and atypical patterns of inter-regional communication which have been proposed to contribute to cognitive difficulties prevalent in this group. The present MEG study used atlas–guided source space analysis of inter-regional phase synchronization in ASD participants, as well as matched typically developing controls, during a dot number estimation task. This task included stimuli with globally integrated forms (animal shapes) as well as randomly-shaped stimuli which lacked a coherent global pattern. Early task-dependent increases in inter-regional phase synchrony in theta, alpha and beta frequency bands were observed. Reduced long-range beta-band phase synchronization was found in participants with ASD at 70–145 ms during presentation of globally coherent dot patterns. This early reduction in task-dependent inter-regional connectivity encompassed numerous areas including occipital, parietal, temporal, and frontal lobe regions. These results provide the first evidence for inter-regional phase synchronization during numerosity estimation, as well as its alteration in ASD, and suggest that problems with communication among brain areas may contribute to difficulties with integrative processes relevant to extraction of meaningful ‘Gestalt’ features in this population.
  • Article
    Full-text available
    In this paper we discuss the work of Francis Bacon in the context of his declared aim of giving a "visual shock." We explore what this means in terms of brain activity and what insights into the brain's visual perceptive system his work gives. We do so especially with reference to the representation of faces and bodies in the human visual brain. We discuss the evidence that shows that both these categories of stimuli have a very privileged status in visual perception, compared to the perception of other stimuli, including man-made artifacts such as houses, chairs, and cars. We show that viewing stimuli that depart significantly from a normal representation of faces and bodies entails a significant difference in the pattern of brain activation. We argue that Bacon succeeded in delivering his "visual shock" because he subverted the normal neural representation of faces and bodies, without at the same time subverting the representation of man-made artifacts.
  • Article
    Low frequency theta band oscillations (4-8 Hz) are thought to provide a timing mechanism for hippocampal place cell firing and to mediate the formation of spatial memory. In rodents, hippocampal theta has been shown to play an important role in encoding a new environment during spatial navigation, but a similar functional role of hippocampal theta in humans has not been firmly established. To investigate this question, we recorded healthy participants' brain responses with a 160-channel whole-head MEG system as they performed two training sets of a virtual Morris water maze task. Environment layouts (except for platform locations) of the two sets were kept constant to measure theta activity during spatial learning in new and familiar environments (a minimal theta-band power sketch appears after this list). In line with previous findings, left hippocampal/parahippocampal theta showed more activation when navigating to a hidden platform relative to random swimming. Consistent with our hypothesis, right hippocampal/parahippocampal theta was stronger during the first training set compared to the second one. Notably, theta in this region during the first training set correlated with spatial navigation performance across individuals in both training sets. These results strongly argue for the functional importance of right hippocampal theta in initial encoding of the configural properties of an environment during spatial navigation. Our findings provide important evidence that right hippocampal/parahippocampal theta activity is associated with environmental encoding in the human brain.
  • Article
    Full-text available
    Gaze is one of the most important cues for human communication and social interaction. In particular, gaze contact is the most primary form of social contact and it is thought to capture attention. A very early differentiated brain response to direct versus averted gaze has been hypothesized. Here, we used high-density electroencephalography to test this hypothesis. Topographical analysis allowed us to uncover a very early topographic modulation (40-80 ms) of event-related responses to faces with direct as compared to averted gaze. This modulation was obtained only in the condition where intact broadband faces - as opposed to high-pass or low-pass filtered faces - were presented. Source estimation indicated that this early modulation involved the posterior parietal region, encompassing the left precuneus and inferior parietal lobule. This supports the idea that it reflected an early orienting response to direct versus averted gaze. Accordingly, in a follow-up behavioural experiment, we found faster response times to the direct gaze than to the averted gaze broadband faces. In addition, classical evoked potential analysis showed that the N170 peak amplitude was larger for averted gaze than for direct gaze. Taken together, these results suggest that direct gaze may be detected at a very early processing stage, involving a route parallel to the ventral occipito-temporal route of face perceptual analysis.
  • Article
    Full-text available
    The frontal lobes are involved in many higher-order cognitive functions such as social cognition, executive functions, and language and speech. These functions are complex and follow a prolonged developmental course from childhood through to early adulthood. Magnetoencephalography (MEG) is ideal for the study of the development of these functions, due to its combination of temporal and spatial resolution, which allows the determination of age-related changes in both neural timing and location. There are several challenges for MEG developmental studies: to design tasks appropriate to capture the neurodevelopmental trajectory of these cognitive functions, and to develop appropriate analysis strategies to capture various aspects of neuromagnetic frontal lobe activity. Here, we review our MEG research on social and executive functions, and speech, in typically developing children and in two clinical groups - children with autism spectrum disorder and children born very preterm. The studies include facial emotional processing, inhibition, visual short-term memory, speech production, and resting-state networks. We present data from event-related analyses as well as from oscillation and connectivity analyses, and review their contributions to understanding frontal lobe cognitive development. We also discuss the challenges of testing young children in the MEG and the development of age-appropriate technologies and paradigms.
  • Article
    Full-text available
    Individuals with autism spectrum disorder (ASD) demonstrate poor social functioning, which may be related to atypical emotional face processing. Altered functional connectivity among brain regions, particularly involving limbic structures, may be implicated. The current magnetoencephalography (MEG) study investigated whole-brain functional connectivity of eight a priori identified brain regions during the implicit presentation of happy and angry faces in twenty 7- to 10-year-old children with ASD and 22 typically developing controls. Findings revealed a network of increased alpha-band phase synchronization during the first 400 ms of happy face processing in children with ASD compared to controls. This network of increased alpha-band phase synchronization involved the left fusiform gyrus, right insula, and frontal regions critical for emotional face processing. In addition, greater connectivity strength of the left fusiform gyrus (maximal 85 to 208 ms) and right insula (maximal 73 to 270 ms) following happy face presentation was found in children with ASD compared to typically developing controls. These findings reflect altered neuronal communication in children with ASD only to happy faces during implicit emotional face processing.
  • Article
    Social information processing is a critical mechanism underlying children’s socio-emotional development. Central to this process are patterns of activation associated with one of our most salient socioemotional cues, the face. In this study, we obtained fMRI activation and high-density ERP source data evoked by parallel face dot-probe tasks from 9-to-12-year-old children. We then integrated the two modalities of data to explore the neural spatial-temporal dynamics of children’s face processing. Our results showed that the tomography of the ERP sources broadly corresponded with the fMRI activation evoked by the same facial stimuli. Further, we combined complementary information from fMRI and ERP by defining fMRI activation as functional ROIs and applying them to the ERP source data. Indices of ERP source activity were extracted from these ROIs at three a priori ERP peak latencies critical for face processing. We found distinct temporal patterns among the three time points across ROIs. The observed spatial-temporal profiles converge with a dual-system neural network model for face processing: a core system (including the occipito-temporal and parietal ROIs) supports the early visual analysis of facial features, and an extended system (including the paracentral, limbic, and prefrontal ROIs) processes the socio-emotional meaning gleaned and relayed by the core system. Our results for the first time illustrate the spatial validity of high-density source localization of ERP dot-probe data in children. By directly combining the two modalities of data, our findings provide a novel approach to understanding the spatial-temporal dynamics of face processing. This approach can be applied in future research to investigate different research questions in various study populations.
  • Article
    Any information represented in the brain holds the potential to influence behavior. It is therefore of broad interest to determine the extent and quality of neural processing of stimulus input that occurs with and without awareness. The attentional blink is a useful tool for dissociating neural and behavioral measures of perceptual visual processing across conditions of awareness. The extent of higher-order visual information beyond basic sensory signaling that is processed during the attentional blink remains controversial. To determine what neural processing at the level of visual-object identification occurs in the absence of awareness, electrophysiological responses to images of faces and houses were recorded both within and outside the attentional blink period during a rapid serial visual presentation stream. Electrophysiological results were sorted according to behavioral performance (correctly identified targets vs. missed targets) within these blink and nonblink periods. An early index of face-specific processing (the N170, 140- to 220-msec poststimulus) was observed regardless of whether the participant demonstrated awareness of the stimulus, whereas a later face-specific effect with the same topographic distribution (500- to 700-msec poststimulus) was only seen for accurate behavioral discrimination of the stimulus content. The present findings suggest a multistage process of object-category processing, with only the later phase being associated with explicit visual awareness.
  • Article
    Probabilistic diffusion tractography was used to provide the first direct evidence for a subcortical pathway from the retina to the amygdala, via the superior colliculus and pulvinar, that transmits visual stimuli signalling threat. A bias to orient toward threat was measured in a temporal order judgement saccade decision task, under monocular viewing, in a group of 19 healthy participants who also underwent diffusion weighted MR imaging. On each trial of the behavioural task a picture depicting threat was presented in one visual field and a competing non-threatening stimulus in the other. The onset interval between the two pictures was randomly varied and participants made a saccade toward the stimulus that they judged to have appeared first. The bias to orient toward threat was stronger when the threatening stimulus was in the temporal visual hemifield, suggesting that afferents via the retinotectal tract contributed to the bias. Probabilistic tractography was used to virtually dissect connections between the superior colliculus and the amygdala traversing the pulvinar. Individual differences in microstructure (fractional anisotropy) of the streamline predicted the magnitude of the bias to orient toward threat, providing supporting evidence for a functional role of the subcortical SC-amygdala pathway in processing threat in healthy humans.
  • Article
    Full-text available
    Background: People with anorexia nervosa (AN) have difficulties in a wide range of social-emotional processes. Previous work suggests atypical involvement of the prefrontal cortex (PFC), amygdala, insula, and fusiform gyri during social-emotional processing in AN. Methods: Twenty women with AN and twenty healthy comparison (HC) women were presented with happy, fearful, and neutral faces during a functional magnetic resonance imaging study. Group differences were investigated in the following regions of interest: lateral PFC, amygdala, insula, and fusiform gyri. Results: The HC participants showed significantly increased recruitment of the ventrolateral PFC and amygdala in the fearful > neutral contrast relative to the AN participants. The AN participants showed a significantly increased recruitment of a small cluster in the right posterior insula in the happy > neutral contrast. Conclusions: These findings are in line with the hypothesis that people with AN have a blunted response to negative and atypical exaggerated response to positive emotionally provoking stimuli.
  • Article
    Full-text available
    Introduction: Chronic alcohol abuse is associated with neurophysiological changes in brain activity; however, these changes are not well localized in humans. Non-human primate models of alcohol abuse enable control over many potential confounding variables associated with human studies. The present study utilized high-resolution magnetoencephalography (MEG) to quantify the effects of chronic EtOH self-administration on resting state (RS) brain function in vervet monkeys. Methods: Adolescent male vervet monkeys were trained to self-administer ethanol (n = 7) or an isocaloric maltodextrin solution (n = 3). Following training, animals received 12 months of free access to ethanol. Animals then underwent RS magnetoencephalography (MEG) and subsequent power spectral analysis of brain activity at 32 bilateral regions of interest associated with the chronic effects of alcohol use (a minimal band-power sketch appears after this list). Results demonstrate localized changes in brain activity in chronic heavy drinkers, including reduced power in the anterior cingulate cortex, hippocampus, and amygdala, as well as increased power in the right medial orbital and parietal areas. Discussion: The current study is the first demonstration of whole-head MEG acquisition in vervet monkeys. Changes in brain activity were consistent with human electroencephalographic studies; however, MEG was able to extend these findings by localizing the observed changes in power to specific brain regions. These regions are consistent with those previously found to exhibit volume loss following chronic heavy alcohol use. The ability to use MEG to evaluate changes in brain activity following chronic ethanol exposure provides a potentially powerful tool to better understand both the acute and chronic effects of alcohol on brain function.
  • Article
    Full-text available
    Background: The priority processing of peripherally presented affective stimuli was recently shown in healthy individuals to divert attentional resources dedicated to foveal processing. Here we investigated the influence of sub-clinical levels of anxiety and depression on this bias. Methods: Eighty-four participants completed psychological tests evaluating anxiety and depression levels. They then had to make speeded responses to the direction of left- or right-oriented arrows that were presented foveally at fixation. Each arrow was preceded by a peripherally presented pair of pictures, one neutral and one emotional, unpleasant or pleasant. Thus, the direction of the foveal arrow was either congruent or not with the peripheral location of the previously presented emotional picture. Data analysis focused on the differences in reaction times between congruent and incongruent conditions, which assess the spatial response bias in the task. Results: A main effect of state-anxiety was observed, suggesting that the higher the level of state-anxiety, the greater the congruence effect. Limitations: Since the obtained result relates to subclinical anxiety levels, its generalization to anxiety disorders remains tentative. Conclusions: State-anxiety appears to modulate the propensity to be influenced by emotionally salient information occurring in peripheral vision, independently of its relevance to the ongoing behavior. The long-term persistence of a high level of alertness for emotional cues in visual periphery could contribute to the causation and the maintenance of anxiety disorders.
  • Article
    Full-text available
    Background: Socio-emotional difficulties in autism spectrum disorder (ASD) are thought to reflect impaired functional connectivity within the “social brain”. Nonetheless, a whole-brain characterization of the fast responses in functional connectivity during implicit processing of emotional faces in adults with ASD is lacking. Methods: The present study used magnetoencephalography to investigate early responses in functional connectivity, as measured by interregional phase synchronization, during implicit processing of angry, neutral and happy faces. The sample (n = 44) consisted of 22 young adults with ASD and 22 age- and sex-matched typically developed (TD) controls. Results: Reduced phase-synchrony in the beta band around 300 ms emerged during processing of angry faces in the ASD compared to TD group, involving key areas of the social brain. In the same time window, de-synchronization in the beta band in the amygdala was reduced in the ASD group across conditions. Conclusions: This is the first demonstration of atypical global and local synchrony patterns in the social brain in adults with autism during implicit processing of emotional faces. The present results replicate and substantially extend previous findings on adolescents, highlighting that atypical brain synchrony during processing of socio-emotional stimuli is a hallmark of clinical sequelae in autism.
  • Article
    Full-text available
    The aim of the current study was to evaluate alterations in whole-brain resting-state networks associated with posttraumatic stress disorder (PTSD) and mild traumatic brain injury (mTBI). Networks were constructed from locations of peak statistical power on an individual basis from magnetoencephalography (MEG) source series data by applying the weighted phase lag index and surrogate data thresholding procedures (a minimal wPLI-with-surrogates sketch appears after this list). Networks representing activity in the alpha bandwidth as well as wide band activity (DC-80 Hz) were created. Statistical comparisons were adjusted for age and education level. Alpha network results demonstrate reductions in network structure associated with PTSD, but no differences associated with mTBI. Wide band network results demonstrate a shift in connectivity from the alpha to the theta bandwidth in both PTSD and mTBI. Also, contrasting alterations in network structure are noted, with increased randomness associated with PTSD and increased structure associated with mTBI. These results demonstrate the potential of the analysis of MEG resting-state networks to differentiate two highly comorbid conditions. The importance of the alpha bandwidth to resting-state connectivity is also highlighted, while the results demonstrate the necessity of considering activity in other bandwidths during network construction.
  • Article
    The influence of external factors on food preferences and choices is poorly understood. Knowing which food-external cues impact the sensory processing and cognitive valuation of food, and how, would strongly benefit a more integrative understanding of food intake behavior and potential means of interfering with deviant eating patterns to avoid detrimental health consequences for individuals in the long run. We investigated whether written labels with positive and negative (as opposed to 'neutral') valence differentially modulate the spatio-temporal brain dynamics in response to the subsequent viewing of high- and low-energy food images. Electrical neuroimaging analyses were applied to visual evoked potentials (VEPs) from 20 normal-weight participants. VEPs and source estimations in response to high- and low-energy foods were differentially affected by the valence of preceding word labels over the ~260-300 ms post-stimulus period. These effects were only observed when high-energy foods were preceded by labels with positive valence. Neural sources in occipital, as well as posterior frontal, insular and cingulate regions were down-regulated. These findings favor implicit cognitive-affective influences, especially on the visual responses to high-energy food cues, potentially indicating decreases in cognitive control and goal-adaptive behavior. Inverse correlations between insular activity and effectiveness in food classification further indicate that this down-regulation directly impacts food-related behavior.
  • Article
    Autism spectrum disorder (ASD) is increasingly understood to be associated with aberrant functional brain connectivity. Few studies, however, have described such atypical neural synchrony among specific brain regions. Here, we used magnetoencephalography (MEG) to characterize alterations in functional connectivity in adolescents with ASD through source space analysis of phase synchrony. Resting-state MEG data were collected from 16 adolescents with ASD and 15 age- and sex-matched typically developing (TD) adolescents. Atlas-guided reconstruction of neural activity at various cortical and subcortical regions was performed and inter-regional phase synchrony was calculated in physiologically relevant frequency bands. Using a multilevel approach, we characterized atypical resting-state synchrony within specific anatomically defined networks as well as altered network topologies at both regional and whole-network scales (a minimal graph-metric sketch appears after this list). Adolescents with ASD demonstrated frequency-dependent alterations in inter-regional functional connectivity. Hyperconnectivity was observed among the frontal, temporal, and subcortical regions in beta and gamma frequency ranges. In contrast, parietal and occipital regions were hypoconnected to widespread brain regions in the theta and alpha bands in ASD. Furthermore, we isolated a hyperconnected network in the gamma band in adolescents with ASD which encompassed orbitofrontal, subcortical, and temporal regions implicated in social cognition. Results from graph analyses confirmed that frequency-dependent alterations of network topologies exist at both global and local levels. We present the first source-space investigation of oscillatory phase synchrony in resting-state MEG in ASD. This work provides evidence of atypical connectivity at physiologically relevant time scales and indicates that alterations of functional connectivity in adolescents with ASD are frequency dependent and region dependent.
  • Article
    Since Babinski’s (1914) classical observation of lack of awareness (“anosognosia”) for a left-sided hemiplegia, the problem of the mechanisms underlying this surprising phenomenon has been raised. Most authors have stressed the links between the right hemisphere and emotional processes, considering anosognosia as an abnormal emotional reaction caused by disruption of the side of the brain crucially involved in emotional behavior. Theoretically motivated models of hemispheric asymmetries in emotional processing have proposed either a right-hemisphere dominance for specific components of emotions, or a different involvement of the right and left hemispheres in different levels of emotional processing. Following the latter line of thought, we have proposed that the right hemisphere may subserve the lower “schematic” level (where emotions are automatically generated and experienced as “true” emotions) and the left hemisphere the higher “conceptual” level (where emotions are consciously analyzed and submitted to intentional control). In agreement with this model, recent empirical data strongly suggest that the right hemisphere might play a major role in the automatic, unconscious generation of emotions, whereas the left hemisphere could be mainly involved in the conscious analysis and control of emotional processes.
  • Article
    This study investigated unconscious and conscious processes by which negative emotions arise. Participants (26 men, 47 women; M age = 20.3 yr.) evaluated target words that were primed with subliminally or supraliminally presented emotional pictures. Stimulus onset asynchrony was either 200 or 800 msec. With subliminal presentations, reaction times to negative targets were longer than reaction times to positive targets after negative primes for the 200-msec. stimulus onset asynchrony. Reaction times to positive targets after negative or positive primes were shorter when the stimulus onset asynchrony was 800 msec. For supraliminal presentations, reaction times were longer when evaluating targets that followed emotionally opposite primes. When emotional stimuli were consciously distinguished, the evoked emotional states might lead to emotional conflicts, whereas qualitatively different effects might arise when a subliminally presented emotion-evoking stimulus is appraised unconsciously; this possibility is discussed.
  • Article
    Recent anatomical and DTI data have demonstrated new aspects of the subcortical occipito-temporal connections. Although a direct (inferior longitudinal fasciculus, ILF) pathway has been previously described, its fine description is still a matter of debate. Moreover, a fast and direct subcortical connection between the limbic system and the occipital lobe has been previously recognized in many functional studies but it still remains poorly documented by anatomical images. We provide for the first time an extensive and detailed anatomical description of the ILF subcortical segmentation. We dissected four human hemispheres with a modified Klingler’s technique, from the basal to the lateral occipito-temporal surface in two steps, tracking the ILF fibers until their cortical termination. Pictures of this direct temporo-occipital pathway are discussed in the light of recent literature regarding the anatomy and functions of occipito-temporal areas. The dissection confirmed the classical originating branches of the ILF and allowed a fine description of two main subcomponents of this bundle, both characterized by separate hierarchical distribution: a dorsal ILF and a ventral ILF. Moreover, a direct pathway between the lingual cortex and the amygdala, not previously demonstrated, is here described with anatomical images. Although preliminary, this is the first fine-grained description of the ILF’s subcomponents. The complex but clearly segregated organization of the fibers of this bundle (dILF and vILF) supports different levels of function mediated by visual recognition. Moreover, the newly described direct pathway from the lingual cortex to the amygdala (Li-Am) seems to be involved in the limbic modulation of visual processing, so it may support, in physiological conditions, the crucial role of this connection in human social cognition. In pathological conditions, on the other hand, this may be one of the hyperactivated pathways in temporo-occipital epileptic and nonepileptic syndromes.
  • Article
    Full-text available
    Magnetoencephalography (MEG) provides useful and non-redundant information in the evaluation of patients with epilepsy, and in particular, during the pre-surgical evaluation of pharmaco-resistant epilepsy. Vagus nerve stimulation (VNS) is a common treatment for pharmaco-resistant epilepsy. However, interpretation of MEG recordings from patients with a VNS is challenging due to the severe magnetic artifacts produced by the VNS. We used synthetic aperture magnetometry kurtosis mapping [SAM(g2)], an adaptive beamformer that maps excess kurtosis, to map interictal spikes to the coregistered MRI image, despite the presence of contaminating VNS artifact (a minimal kurtosis-mapping sketch appears after this list). We present a series of eight patients with a VNS who underwent MEG recording. Localization of interictal epileptiform activity by SAM(g2) is compared to invasive electrophysiologic monitoring and other localizing approaches. While the raw MEG recordings were uninterpretable, analysis of the recordings with SAM(g2) identified foci of peak kurtosis and source signal activity that were unaffected by the VNS artifact. SAM(g2) analysis of MEG recordings in patients with a VNS produces interpretable results and expands the use of MEG for the pre-surgical evaluation of epilepsy.
  • Article
    Full-text available
    In humans, the neuropeptide oxytocin plays a critical role in social and emotional behavior. The actions of this molecule are dependent on a protein that acts as its receptor, which is encoded by the oxytocin receptor gene (OXTR). DNA methylation of OXTR, an epigenetic modification, directly influences gene transcription and is variable in humans. However, the impact of this variability on specific social behaviors is unknown. We hypothesized that variability in OXTR methylation impacts social perceptual processes often linked with oxytocin, such as perception of facial emotions. Using an imaging epigenetic approach, we established a relationship between OXTR methylation and neural activity in response to emotional face processing. Specifically, high levels of OXTR methylation were associated with greater amounts of activity in regions associated with face and emotion processing including amygdala, fusiform, and insula. Importantly, we found that these higher levels of OXTR methylation were also associated with decreased functional coupling of amygdala with regions involved in affect appraisal and emotion regulation. These data indicate that the human endogenous oxytocin system is involved in attenuation of the fear response, corroborating research implicating intranasal oxytocin in the same processes. Our findings highlight the importance of including epigenetic mechanisms in the description of the endogenous oxytocin system and further support a central role for oxytocin in social cognition. This approach linking epigenetic variability with neural endophenotypes may broadly explain individual differences in phenotype including susceptibility or resilience to disease.
  • Article
    Early face encoding, as reflected by the N170 ERP component, is sensitive to fixation on the eyes. We investigated whether this sensitivity varies with facial expressions of emotion and whether it can also be seen on other ERP components such as the P1 and EPN. Using eye-tracking to manipulate fixation on facial features, we found the N170 to be the only eye-sensitive component, and this was true for fearful, happy and neutral faces. A different effect of fixation to features was seen for the earlier P1, which likely reflected general sensitivity to face position. An early effect of emotion (∼120 ms) for happy faces was seen at occipital sites and was sustained until ∼350 ms post-stimulus. For fearful faces, an early effect was seen around 80 ms, followed by a later effect from ∼150 ms until ∼300 ms at lateral posterior sites. Results suggest that in this emotion-irrelevant gender discrimination task, processing of fearful and happy expressions occurred early and largely independently of the eye sensitivity indexed by the N170. Processing of the two emotions involved different underlying brain networks active at different times.
  • Article
    Human social, executive and language functions are complex and known to follow a prolonged developmental course from childhood through to early adulthood. These processes rely on the integrity and maturity of the frontal lobes, which also show protracted maturation into adulthood. MEG is the ideal modality to determine the development of these intricate and multi-faceted cognitive abilities; its exquisite temporal and spatial resolution allows investigators to track the age-related changes in both neural timing and location. The challenge for MEG has been two-fold: to develop appropriate tasks to capture the neurodevelopmental trajectory of these functions; and, to develop appropriate analysis strategies that can capture the subtle, often rapid, frontal lobe activity. In this chapter, we will review our MEG research on social, executive and language functions controlled by the frontal lobes in typically developing children and clinical groups. The studies include the examination of facial emotional processing, inhibition and verb generation. We end with a discussion on the challenges of testing young children in the MEG environment and the development of age-appropriate technologies and paradigms.
  • Chapter
    The thalamus has received renewed interest in systems neuroscience because emerging evidence indicates that the thalamus may modulate cortical responses according to behavioral demands. Moreover, there is evidence to suggest that in addition to normal brain functioning, thalamic–cortical (TC) interactions are critically implicated in neuropsychiatric disorders, such as schizophrenia. In this chapter, we discuss the possibility of examining TC interactions using magnetoencephalography (MEG), a technique commonly considered too unreliable to monitor activity generated by thalamic sources. Here, we argue that if certain requirements are met, MEG can be employed to investigate TC interactions by combining advanced source reconstruction techniques and novel connectivity measures. Specifically, we summarize evidence from MEG experiments that examined alpha–gamma coupling in TC networks during resting-state recordings as well as data from a study that tested the effects of ketamine on neural oscillations in healthy volunteers. We will discuss the implication of these findings for the understanding of normal and abnormal brain functioning as well as further steps to validate and improve MEG as a noninvasive technique to probe interactions in TC circuits.
  • Article
    Stimulus exposure duration in emotion perception research is often chosen pragmatically; however, little work exists on the consequences of stimulus duration for the processing of emotional faces. We utilized the spatiotemporal resolution capabilities of magnetoencephalography (MEG) to characterize early implicit processing of emotional and neutral faces in response to stimuli presented for 80 and 150 milliseconds. We found that the insula was recruited to a greater degree within the short (80ms) condition for all face categories, and this effect was more pronounced for emotional compared to neutral faces. The orbitofrontal cortex was more active in the 80ms condition for neutral faces only, suggesting a modulation of task difficulty by both the duration and the emotional category of the stimuli. No effects on reaction time or accuracy were observed. Our findings caution that differences in stimulus duration may result in differential neural processing of emotional faces and challenge the idea that neutral faces constitute a neutral baseline.
  • Article
    Full-text available
    Emotions, which are important for survival and social interaction, have been investigated widely and in depth. The application of fMRI to emotion processing has yielded substantial achievements with respect to the localization of emotion processes. The ERP method, which has high temporal resolution compared with fMRI, can be employed to investigate the time course of emotion processing, and emotional modulation of ERP components has been verified across numerous studies. Emotions develop dynamically with age and can be enhanced through learning (or training) or impaired by disturbances during development, processes underlain by the neural plasticity of emotion-relevant nervous systems. Mood disorders, with their typical symptoms of emotional discordance, are probably caused by dysfunctional neural plasticity.
  • Article
    Full-text available
    Social cognition is impaired in autism spectrum disorder (ASD). The ability to perceive and interpret affect is integral to successful social functioning and has an extended developmental course. However, the neural mechanisms underlying emotional face processing in ASD are unclear. Using magnetoencephalography (MEG), the present study explored neural activation during implicit emotional face processing in young adults with and without ASD. Twenty-six young adults with ASD and 26 healthy controls were recruited. Participants indicated the location of a scrambled pattern (target) that was presented alongside a happy or angry face. Emotion-related activation sources for each emotion were estimated using the Empirical Bayes Beamformer (pcorr ≤ 0.001) in Statistical Parametric Mapping 12 (SPM12). Emotional faces elicited elevated fusiform, amygdala and anterior insula and reduced anterior cingulate cortex (ACC) activity in adults with ASD relative to controls. Within group comparisons revealed that angry vs. happy faces elicited distinct neural activity in typically developing adults; there was no distinction in young adults with ASD. Our data suggest difficulties in affect processing in ASD reflect atypical recruitment of traditional emotional processing areas. These early differences may contribute to difficulties in deriving social reward from faces, ascribing salience to faces, and an immature threat processing system, which collectively could result in deficits in emotional face processing.
  • Article
    Models advanced to explain hemispheric asymmetries in the representation of emotions will be discussed following their historical progression. First, the clinical observations that have suggested a general dominance of the right hemisphere for all kinds of emotions will be reviewed. Then the experimental investigations that have led to the proposal of a different hemispheric specialization for positive versus negative emotions (valence hypothesis) or, alternatively, for approach versus avoidance tendencies (motivational hypothesis) will be surveyed. The discussion of these general models will be followed by a review of recent studies which have documented laterality effects within specific brain structures known to play a critical role in different components of emotions, namely the amygdala in the computation of emotionally laden stimuli, the ventromedial prefrontal cortex in the integration between cognition and emotion and in the control of impulsive reactions, and the anterior insula in the conscious experience of emotion. Results of these recent investigations support and provide an updated, integrated version of early models assuming a general right hemisphere dominance for all kinds of emotions.
  • Article
    Research on Autism Spectrum Disorder (ASD) has focused on processing of socially-relevant stimuli, such as faces. Nonetheless, before being ‘social’, faces are visual stimuli. The present magnetoencephalography study investigated the time course of brain activity during an implicit emotional task in visual emotion-related regions in 19 adults with ASD (mean age 26.3 ± 4.4) and 19 typically developing controls (26.4 ± 4). The results confirmed previously-reported differences between groups in brain responses to emotion and a hypo-activation in the ASD group in the right fusiform gyrus around 150 ms. However, the ASD group also presented early enhanced activity in the occipital region. These results support the view that impaired face processing in ASD might be sustained by atypical responses in primary visual areas.
  • Article
    The problem of the possible relationships between hemispheric asymmetries and aspects of the psychoanalytic model of mind has been repeatedly raised during the past century. In a rather arbitrary manner, we could even say that this problem has passed through three distinct stages. In the first stage (roughly covering the first part of the twentieth century), psychoanalytically oriented authors, although acknowledging that some clinical phenomena could suggest a special link between aspects of psychoanalytical theory and the right hemisphere, substantially rejected the specificity of these relationships. In the second period, influenced by the work of Sperry and co-workers on split-brain patients [1], the interest that the right hemisphere could hold for psychoanalysis was explicitly acknowledged, but attention was focused more on cognitive than on emotional features, and two main constructs of psychoanalytical theory, namely the primary and secondary processes, were mapped onto the right and left hemispheres in a rather global manner [2, 3]. Finally, in the most recent period, Kandel's paper [4], stressing the need for an intimate relationship between psychoanalysis and cognitive neuroscience, has suggested a new way of investigating the links between hemispheric asymmetries and the psychoanalytic model of mind.
  • Article
    The aim of the present study was to identify synaptic contacts from axons originating in the superior colliculus with thalamic neurons projecting to the lateral nucleus of the amygdala. Axons from the superior colliculus were traced with the anterograde tracers Phaseolus vulgaris leucoagglutinin or the biotinylated and fluorescent dextran amine "Miniruby." Thalamo-amygdaloid projection neurons were identified with the retrograde tracer Fluoro-Gold. Injections of Fluoro-Gold into the lateral nucleus of the amygdala labeled neurons in nuclei of the posterior thalamus which surround the medial geniculate body, viz. the suprageniculate nucleus, the medial division of the medial geniculate body, the posterior intralaminar nucleus, and the peripeduncular nucleus. Anterogradely labeled axons from the superior colliculus terminated in the same regions of the thalamus. Tecto-thalamic axons originating from superficial collicular layers were found predominantly in the suprageniculate nucleus, whereas axons from deep collicular layers were detected in equal density in all thalamic nuclei surrounding the medial geniculate body. Double-labeling experiments revealed an overlap of projection areas in the above-mentioned thalamic nuclei. Electron microscopy of areas of overlap confirmed synaptic contacts of anterogradely labeled presynaptic profiles originating in the superficial layers of the superior colliculus with retrogradely labeled postsynaptic profiles of thalamo-amygdaloid projection neurons. These connections may represent a subcortical pathway for visual information transfer to the amygdala.
  • Article
    Brain mapping studies using dynamic imaging methods demonstrate areas where regional cerebral blood flow (rCBF) decreases, as well as areas where it increases, during performance of various experimental tasks. Task specificity holds for both sets of cerebral blood flow (CBF) changes, providing the opportunity to investigate areas that become "deactivated" as well as "activated" in the experimental condition relative to the control state. Such data yield the intriguing observation that in areas implicated in emotional processing, such as the amygdala, the posteromedial cortex, and the ventral anterior cingulate cortex, although flow increases as expected during specific emotion-related tasks, flow decreases during performance of some attentionally demanding, cognitive tasks. Conversely, in some of the areas that appear to subserve cognitive functions, such as the dorsal anterior cingulate and the dorsolateral prefrontal cortices, flow increases while performing attentionally demanding cognitive tasks, but decreases during some experimentally induced and pathological emotional states. Although the specific nature of such reciprocal patterns of regional activity remains unclear, they may reflect an important cross-modal interaction during mental operations. The possibility that neural activity is lower in areas required for emotional processing during some higher cognitive processes holds implications for the mechanisms underlying interactions between cognition and emotion. Furthermore, the possibility that neural activity in some cognitive-processing areas is suppressed during intense emotional states suggests mechanisms by which extreme fear or other severe emotional distress may interfere with cognitive performance.
  • Article
    Full-text available
    The perception of faces is sometimes regarded as a specialized task involving discrete brain regions. In an attempt to identify face-specific cortex, we used functional magnetic resonance imaging (fMRI) to measure activation evoked by faces presented in a continuously changing montage of common objects or in a similar montage of nonobjects. Bilateral regions of the posterior fusiform gyrus were activated by faces viewed among nonobjects, but when viewed among objects, faces activated only a focal right fusiform region. To determine whether this focal activation would occur for another category of familiar stimuli, subjects viewed flowers presented among nonobjects and objects. While flowers among nonobjects evoked bilateral fusiform activation, flowers among objects evoked no activation. These results demonstrate that both faces and flowers activate large and partially overlapping regions of inferior extrastriate cortex. A smaller region, located primarily in the right lateral fusiform gyrus, is activated specifically by faces.
  • Article
    Neuroimaging studies have shown differential amygdala responses to masked ("unseen") emotional stimuli. How visual signals related to such unseen stimuli access the amygdala is unknown. A possible pathway, involving the superior colliculus and pulvinar, is suggested by observations of patients with striate cortex lesions who show preserved abilities to localize and discriminate visual stimuli that are not consciously perceived ("blindsight"). We used measures of right amygdala neural activity acquired from volunteer subjects viewing masked fear-conditioned faces to determine whether a colliculo-pulvinar pathway was engaged during processing of these unseen target stimuli. Increased connectivity between right amygdala, pulvinar, and superior colliculus was evident when fear-conditioned faces were unseen rather than seen. Right amygdala connectivity with fusiform and orbitofrontal cortices decreased in the same condition. By contrast, the left amygdala, whose activity did not discriminate seen and unseen fear-conditioned targets, showed no masking-dependent changes in connectivity with superior colliculus or pulvinar. These results suggest that a subcortical pathway to the right amygdala, via midbrain and thalamus, provides a route for processing behaviorally relevant unseen visual events in parallel to a cortical route necessary for conscious identification.
  • Article
    Repeated presentations of emotional facial expressions were used to assess habituation in the human brain using fMRI. Significant fMRI signal decrement was present in the left dorsolateral prefrontal and premotor cortex, and right amygdala. Within the left prefrontal cortex greater habituation to happy vs fearful stimuli was evident, suggesting devotion of sustained neural resources for processing of threat vs safety signals. In the amygdala, significantly greater habituation was observed on the right compared to the left. In contrast, the left amygdala was significantly more activated than the right to the contrast of fear vs happy. We speculate that the right amygdala is part of a dynamic emotional stimulus detection system, while the left is specialized for sustained stimulus evaluations.
  • Article
    Posttraumatic stress disorder (PTSD) may develop from impaired extinction of conditioned fear responses. Exposure-based treatment of PTSD is thought to facilitate extinction learning (Charney, 2004). Fear extinction is mediated by inhibitory control of the ventromedial prefrontal cortex (vmPFC) over amygdala-based fear processes (Phelps, Delgado, Nearing, & LeDoux, 2004; Quirk, Russo, Barron, & LeBron, 2000). Most neuroimaging studies of PTSD reveal reduced vmPFC activity (particularly in rostral anterior cingulate cortex, or rACC; Lanius et al., 2001; Shin et al., 2005), and some find increased amygdala activity during threat processing (Shin et al., 2005). In addition, increased amygdala activity during fear conditioning and reduced vmPFC activity during extinction have been reported in PTSD (Bremner et al., 2005). Although PTSD patients show increased orbitofrontal and medial prefrontal activity following treatment with serotonin reuptake inhibitors (SSRIs; Fernandez et al., 2001; Seedat et al., 2004), no studies have investigated neural networks before and after exposure-based treatment of PTSD. We report the first such study. We hypothesized that symptom reduction would be associated with increased rACC activity and reduced amygdala activity during fear processing.
  • Article
    Full-text available
    What makes us become aware? A popular hypothesis is that if cortical neurons fire in synchrony at a certain frequency band (gamma), we become aware of what they are representing. We tested this hypothesis adopting brain-imaging techniques with good spatiotemporal resolution and frequency-specific information. Specifically, we examined the degree to which increases in event-related synchronization (ERS) in the gamma band were associated with awareness of a stimulus (its detectability) and/or the emotional content of the stimulus. We observed increases in gamma band ERS within prefrontal-anterior cingulate, visual, parietal, posterior cingulate, and superior temporal cortices to stimuli available to conscious awareness. However, we also observed increases in gamma band ERS within the amygdala, visual, prefrontal, parietal, and posterior cingulate cortices to emotional relative to neutral stimuli, irrespective of their availability to conscious access. This suggests that increased gamma band ERS is related to, but not sufficient for, consciousness.
  • Article
    Full-text available
    The detection of threat is a role that the amygdala plays well, evidenced by its increased response to fearful faces in human neuroimaging studies. A critical element of the fearful face is an increase in eye white area (EWA), hypothesized to be a significant cue in activating the amygdala. However, another important social signal that can increase EWA is a lateral shift in gaze direction, which also serves to orient attention to potential threats. It is unknown how the amygdala differentiates between these increases in EWA and those that are specifically associated with fear. Using functional magnetic resonance imaging, we show that the left amygdala distinguished between fearful eyes and gaze shifts despite similar EWA increases whereas the right amygdala was less discriminatory. Additional analyses also revealed selective hemispheric response patterns in the left fusiform gyrus. Our data show clear hemispheric differences in EWA-based fear activation, suggesting the existence of parallel mechanisms that code for emotional face information.
  • Article
    Behavioural evidence indicates that individual differences in anxiety influence the response to facial signals of threat. Angry and fearful faces represent qualitatively different forms of threat. Fearful faces are thought to signal the presence of a significant, yet undetermined source of danger within the environment, referred to as 'ambiguous threat'. In contrast, angry faces represent a more direct form of threat, often used in face-to-face encounters to exert dominance. Given the inherent differences between anger and fear, we hypothesised that anxiety would modulate the amygdala response to angry faces to a greater extent when attended. Following previous research, we expected anxiety to show a stronger relationship with the amygdala response to unattended fearful faces. In an event-related fMRI study, we presented images of two houses and two faces (consisting of fearful, angry or neutral expressions) in horizontal and vertical pairs around a central fixation cross, with participants instructed to attend to either the face or house stimuli. The results showed that higher anxiety levels produced an increased right amygdala response to viewer directed angry facial expressions (versus neutral or fearful faces) only when attended. By contrast, increased anxiety was associated with a greater left amygdala response to fearful faces (versus neutral or angry faces) in the unattended condition, with only borderline evidence for attended fear (relative to neutral). Our findings demonstrate the striking effects of personality in a non-clinical population, and show how this can distinguish the neural coding of anger and fear faces.
  • Article
    Full-text available
    Successful control of affect partly depends on the capacity to modulate negative emotional responses through the use of cognitive strategies (i.e., reappraisal). Recent studies suggest the involvement of frontal cortical regions in the modulation of amygdala reactivity and the mediation of effective emotion regulation. However, within-subject inter-regional connectivity between amygdala and prefrontal cortex in the context of affect regulation is unknown. Here, using psychophysiological interaction analyses of functional magnetic resonance imaging data, we show that activity in specific areas of the frontal cortex (dorsolateral, dorsal medial, anterior cingulate, orbital) covaries with amygdala activity and that this functional connectivity is dependent on the reappraisal task. Moreover, strength of amygdala coupling with orbitofrontal cortex and dorsal medial prefrontal cortex predicts the extent of attenuation of negative affect following reappraisal. These findings highlight the importance of functional connectivity within limbic-frontal circuitry during emotion regulation.
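    As a rough illustration of the psychophysiological interaction (PPI) logic mentioned above, the sketch below fits a target time course on a seed regressor, a task regressor, and their product; it deliberately omits HRF deconvolution and nuisance modelling, and all variable names and data are illustrative rather than the authors' pipeline.

```python
# Schematic PPI regression: does seed-to-target coupling change with condition?
# HRF deconvolution and nuisance regressors are omitted for brevity;
# all variable names and data are illustrative.
import numpy as np

def ppi_interaction_beta(seed, target, task):
    """Fit target ~ 1 + seed + task + seed*task; return the interaction coefficient."""
    seed_c = seed - seed.mean()
    task_c = task - task.mean()
    X = np.column_stack([np.ones_like(seed), seed_c, task_c, seed_c * task_c])
    betas, *_ = np.linalg.lstsq(X, target, rcond=None)
    return betas[3]          # coefficient on the psychophysiological interaction term

rng = np.random.default_rng(1)
task = np.repeat([0.0, 1.0], 100)                      # e.g. attend vs. reappraise blocks
seed = rng.standard_normal(200)                        # seed-region time course
target = 0.2 * seed + 0.6 * seed * task + rng.standard_normal(200)
print("interaction beta ~", round(ppi_interaction_beta(seed, target, task), 2))
```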
  • Article
    Adaptive beamformer analyses of magnetoencephalograms (MEG) have shown promise as a method for functional imaging of cortical processes. Although recent evidence is encouraging, it is unclear whether these methods can both localize and reconstruct the time course of activity in subcortical structures such as the amygdala. Fourteen healthy participants (7 women) performed a perceptual matching task of negative emotional faces (angry and fearful) and geometric shapes that was designed for functional magnetic resonance imaging (fMRI) studies to maximize amygdala activation. Neuromagnetic data were collected with a 275-channel whole-head magnetometer, and event-related adaptive beamformer analyses were conducted to estimate broadband evoked responses to faces and shapes across the whole brain in 7 mm steps. Group analyses revealed greater left amygdala activity to faces over shapes, both when face-matching and shape-matching trials were presented in separate blocks and when they were randomly intermixed. This finding was replicated in a second experiment with 7 new participants (3 women). Virtual sensor time series showed clear evoked responses in the left amygdala and left fusiform gyrus in both runs and experiments. We conclude that amygdala activity can be resolved from MEGs with adaptive beamformers with temporal resolution superior to other neuroimaging modalities. This demonstration should encourage the use of MEG for elucidating functional networks mediating fear-related neural phenomena that likely unfold rapidly in time across cortical and subcortical structures.
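    The sketch below illustrates a generic LCMV-style adaptive beamformer of the kind referred to above, not the exact pipeline used in the study: weights are computed from a regularized data covariance and a single-orientation lead field, then applied to obtain a virtual-sensor time course. The lead field and data here are synthetic stand-ins.

```python
# Minimal LCMV-type beamformer sketch: w = C^-1 L / (L^T C^-1 L) for a single
# source orientation, then a virtual-sensor time course w^T . data.
# Lead field and covariance are synthetic stand-ins for forward-model / MEG inputs.
import numpy as np

def lcmv_virtual_sensor(data, leadfield, reg=0.05):
    """data: (n_channels, n_samples); leadfield: (n_channels,) for one source location."""
    C = np.cov(data)
    C_reg = C + reg * np.trace(C) / C.shape[0] * np.eye(C.shape[0])  # diagonal loading
    Cinv_L = np.linalg.solve(C_reg, leadfield)
    w = Cinv_L / (leadfield @ Cinv_L)          # unit-gain constraint: w^T L = 1
    return w @ data                             # reconstructed source time course

rng = np.random.default_rng(2)
leadfield = rng.standard_normal(275)            # e.g. a 275-channel system
source = np.sin(2 * np.pi * 10 * np.arange(600) / 600)
data = np.outer(leadfield, source) + 0.5 * rng.standard_normal((275, 600))
vs = lcmv_virtual_sensor(data, leadfield)
print("correlation with true source:", round(np.corrcoef(vs, source)[0, 1], 2))
```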
  • Article
    The purpose of the present investigation was to examine the topographical organization of efferent projections from the cytoarchitectonic divisions of the mPFC (the medial precentral, dorsal anterior cingulate and prelimbic cortices). We also sought to determine whether the efferents from different regions within the prelimbic division were organized topographically. Anterograde transport of Phaseolus vulgaris leucoagglutinin was used to examine the efferent projections from restricted injection sites within the mPFC.
  • Article
    A survey of previous literature suggests that asymmetries obtained in visual field recognition are larger when stimuli are presented bilaterally than when presented unilaterally. Three experiments confirm the existence of the effect, eliminate a number of published hypotheses used to explain it, and support a mechanism of hemispheric interaction in processing information. Both theoretical and methodological implications are discussed.
  • Article
    Full-text available
    1. We have previously identified face-selective areas in the mid-fusiform and inferior temporal gyri in electrophysiological recordings made from chronically implanted subdural electrodes in epilepsy patients. In this study, functional magnetic resonance imaging (fMRI) was used to study the anatomic extent of face-sensitive brain regions and to assess hemispheric laterality. 2. A time series of 128 gradient echo echoplanar images was acquired while subjects continuously viewed an alternating series of 10 unfamiliar faces followed by 10 equiluminant scrambled faces. Each cycle of this alternating sequence lasted 12 s and each experimental run consisted of 14 cycles. The time series of each voxel was transformed into the frequency domain using Fourier analysis. Activated voxels were defined by significant peaks in their power spectra at the frequency of stimulus alternation and by a 180 degrees phase shift that followed changes in stimulus alternation order. 3. Activated voxels to faces were obtained in the fusiform and inferior temporal gyri in 9 of 12 subjects and were approximately coextensive with previously identified face-selective regions. Nine subjects also showed activation in the left or right middle occipital gyri, or in the superior temporal or lateral occipital sulci. Cortical volumes activated in the left and right hemispheres were not significantly different. Activated voxels to scrambled faces were observed in six subjects at locations mainly in the lingual gyri and collateral sulci, medial to the regions activated by faces. 4. Face stimuli activated portions of the midfusiform and inferior temporal gyri, including adjacent cortex within occipitotemporal sulci.(ABSTRACT TRUNCATED AT 250 WORDS)
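    The frequency-domain detection scheme described above can be sketched as follows, assuming the 128 volumes evenly span the fourteen 12-s cycles; the activation criterion (power at the stimulus-alternation frequency relative to other bins) and the threshold are illustrative simplifications of the significance and phase tests used in the study.

```python
# Sketch of frequency-domain activation detection: a voxel is treated as
# 'activated' when power at the stimulus-alternation frequency dominates
# its spectrum. Timing follows the design above; the threshold is illustrative.
import numpy as np

n_cycles, cycle_s, n_vols = 14, 12.0, 128        # design values from the abstract
tr = n_cycles * cycle_s / n_vols                 # implied sampling interval (s)
t = np.arange(n_vols) * tr
stim_freq = 1.0 / cycle_s                        # frequency of stimulus alternation

def alternation_power_ratio(ts, dt, freq):
    """Power at `freq` divided by mean power at the other nonzero frequency bins."""
    ts = ts - ts.mean()
    power = np.abs(np.fft.rfft(ts)) ** 2
    freqs = np.fft.rfftfreq(ts.size, d=dt)
    k = int(np.argmin(np.abs(freqs - freq)))     # bin closest to the alternation frequency
    return power[k] / np.delete(power[1:], k - 1).mean()

rng = np.random.default_rng(3)
voxel_ts = np.sin(2 * np.pi * stim_freq * t) + rng.standard_normal(n_vols)
print("activated voxel?", alternation_power_ratio(voxel_ts, tr, stim_freq) > 5.0)
```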
  • Article
    Full-text available
    In order to compare the frontal cortex of rat and macaque monkey, cortical and subcortical afferents to subdivisions of the medial frontal cortex (MFC) in the rat were analyzed with fluorescent retrograde tracers. In addition to afferent inputs common to the whole MFC, each subdivision of the MFC has a specific pattern of afferent connections. The dorsally situated precentral medial area (PrCm) was the only area to receive inputs from the somatosensory cortex. The specific pattern of afferents common to the ventrally situated prelimbic (PL) and infralimbic (IL) areas included projections from the agranular insular cortex, the entorhinal and piriform cortices, the CA1–CA2 fields of the hippocampus, the subiculum, the endopiriform nucleus, the amygdalopiriform transition, the amygdalohippocampal area, the lateral tegmentum, and the parabrachial nucleus. In all these structures, the number of retrogradely labeled cells was larger when the injection site was located in area IL. The dorsal part of the anterior cingulate area (ACd) seemed to be connectionally intermediate between the adjacent areas PrCm and PL; it receives neither the somatosensory inputs characteristic of area PrCm nor the afferents characteristic of areas PL and IL, with the exception of the afferents from the caudal part of the retrosplenial cortex. A comparison of the pattern of afferent and efferent connections of the rat MFC with the pattern of macaque prefrontal cortex suggests that PrCm and ACd areas share some properties with the macaque premotor cortex, whereas PL and IL areas may have characteristics in common with the cingulate or with medial areas 24, 25, and 32 and with orbital areas 12, 13, and 14 of macaques.
  • Article
    Twelve normal subjects viewed alternating sequences of unfamiliar faces, unpronounceable nonword letterstrings, and textures while echoplanar functional magnetic resonance images were acquired in seven slices extending from the posterior margin of the splenium to near the occipital pole. These stimuli were chosen to elicit initial category-specific processing in extrastriate cortex while minimizing semantic processing. Overall, faces evoked more activation than did letterstrings. Comparing hemispheres, faces evoked greater activation in the right than the left hemisphere, whereas letterstrings evoked greater activation in the left than the right hemisphere. Faces primarily activated the fusiform gyrus bilaterally, and also activated the right occipitotemporal and inferior occipital sulci and a region of lateral cortex centered in the middle temporal gyrus. Letterstrings primarily activated the left occipitotemporal and inferior occipital sulci. Textures primarily activated portions of the collateral sulcus. In the left hemisphere, 9 of the 12 subjects showed a characteristic pattern in which faces activated a discrete region of the lateral fusiform gyrus, whereas letterstrings activated a nearby region of cortex within the occipitotemporal and inferior occipital sulci. These results suggest that different regions of ventral extrastriate cortex are specialized for processing the perceptual features of faces and letterstrings, and that these regions are intermediate between earlier processing in striate and peristriate cortex, and later lexical, semantic, and associative processing in downstream cortical regions.
  • Article
    Full-text available
    The amygdala is thought to play a crucial role in emotional and social behaviour. Animal studies implicate the amygdala in both fear conditioning and face perception. In humans, lesions of the amygdala can lead to selective deficits in the recognition of fearful facial expressions and impaired fear conditioning, and direct electrical stimulation evokes fearful emotional responses. Here we report direct in vivo evidence of a differential neural response in the human amygdala to facial expressions of fear and happiness. Positron-emission tomography (PET) measures of neural activity were acquired while subjects viewed photographs of fearful or happy faces, varying systematically in emotional intensity. The neuronal response in the left amygdala was significantly greater to fearful as opposed to happy expressions. Furthermore, this response showed a significant interaction with the intensity of emotion (increasing with increasing fearfulness, decreasing with increasing happiness). The findings provide direct evidence that the human amygdala is engaged in processing the emotional salience of faces, with a specificity of response to fearful facial expressions.
  • Article
    We measured amygdala activity in human volunteers during rapid visual presentations of fearful, happy, and neutral faces using functional magnetic resonance imaging (fMRI). The first experiment involved a fixed order of conditions both within and across runs, while the second one used a fully counterbalanced order in addition to a low level baseline of simple visual stimuli. In both experiments, the amygdala was preferentially activated in response to fearful versus neutral faces. In the counterbalanced experiment, the amygdala also responded preferentially to happy versus neutral faces, suggesting a possible generalized response to emotionally valenced stimuli. Rapid habituation effects were prominent in both experiments. Thus, the human amygdala responds preferentially to emotionally valenced faces and rapidly habituates to them.
  • Article
    If subjects are shown an angry face as a target visual stimulus for less than forty milliseconds and are then immediately shown an expressionless mask, these subjects report seeing the mask but not the target. However, an aversively conditioned masked target can elicit an emotional response from subjects without being consciously perceived. Here we study the mechanism of this unconsciously mediated emotional learning. We measured neural activity in volunteer subjects who were presented with two angry faces, one of which, through previous classical conditioning, was associated with a burst of white noise. In half of the trials, the subjects' awareness of the angry faces was prevented by backward masking with a neutral face. A significant neural response was elicited in the right, but not left, amygdala to masked presentations of the conditioned angry face. Unmasked presentations of the same face produced enhanced neural activity in the left, but not right, amygdala. Our results indicate that, first, the human amygdala can discriminate between stimuli solely on the basis of their acquired behavioural significance, and second, this response is lateralized according to the subjects' level of awareness of the stimuli.
  • Article
    Anterior cingulate cortex (ACC) is a part of the brain's limbic system. Classically, this region has been related to affect, on the basis of lesion studies in humans and in animals. In the late 1980s, neuroimaging research indicated that ACC was active in many studies of cognition. The findings from EEG studies of a focal area of negativity in scalp electrodes following an error response led to the idea that ACC might be the brain's error detection and correction device. In this article, these various findings are reviewed in relation to the idea that ACC is a part of a circuit involved in a form of attention that serves to regulate both cognitive and emotional processing. Neuroimaging studies showing that separate areas of ACC are involved in cognition and emotion are discussed and related to results showing that the error negativity is influenced by affect and motivation. In addition, the development of the emotional and cognitive roles of ACC are discussed, and how the success of this regulation in controlling responses might be correlated with cingulate size. Finally, some theories are considered about how the different subdivisions of ACC might interact with other cortical structures as a part of the circuits involved in the regulation of mental and emotional activity.
  • Article
    We assessed the effect of directed attention on early neurophysiological indices of face processing, measuring the N170 event-related potential (ERP). Twelve subjects were tested on two tasks each in which they attended either to eyes only or to faces with eyes closed, presented within series of facial and control stimuli. Consistent with the ERP literature, the N170 was recorded to facial stimuli at posterior temporal electrodes with a concomitant positive peak at the vertex, with latencies around 150 ms for faces and 174 ms for eyes. However, unlike fMRI studies, neither the latency nor the amplitude of the peaks was sensitive to the target/non-target status of either the eyes or the face stimuli. This suggests that early stages of face processing indexed by the N170 are automatic and unmodified by selective attention.
  • Article
    Full-text available
    An analysis of response latencies shows that when an image is presented to the visual system, neuronal activity is rapidly routed to a large number of visual areas. However, the activity of cortical neurons is not determined by this feedforward sweep alone. Horizontal connections within areas, and higher areas providing feedback, result in dynamic changes in tuning. The differences between feedforward and recurrent processing could prove pivotal in understanding the distinctions between attentive and pre-attentive vision as well as between conscious and unconscious vision. The feedforward sweep rapidly groups feature constellations that are hardwired in the visual brain, yet is probably incapable of yielding visual awareness; in many cases, recurrent processing is necessary before the features of an object are attentively grouped and the stimulus can enter consciousness.
  • Article
    In the present study we report a double dissociation between right and left medial temporal lobe damage in the modulation of fear responses to different types of stimuli. We found that right unilateral temporal lobectomy (RTL) patients, in contrast to control subjects and left temporal lobectomy (LTL) patients, failed to show potentiated startle while viewing negative pictures. However, the opposite pattern of impairment was observed during a stimulus that patients had been told signaled the possibility of shock. Control subjects and RTL patients showed potentiated startle while LTL patients failed to show potentiated startle. We hypothesize that the right medial temporal lobe modulates fear responses while viewing emotional pictures, which involves exposure to (emotional) visual information and is consistent with the emotional processing traditionally ascribed to the right hemisphere. In contrast, the left medial temporal lobe modulates fear responses when those responses are the result of a linguistic/cognitive representation acquired through language, which, like other verbally mediated material, generally involves the left hemisphere. Additional evidence from case studies suggests that, within the medial temporal lobe, the amygdala is responsible for this modulation.
  • Article
    Requiring only minimal assumptions for validity, nonparametric permutation testing provides a flexible and intuitive methodology for the statistical analysis of data from functional neuroimaging experiments, at some computational expense. Introduced into the functional neuroimaging literature by Holmes et al. ([1996]: J Cereb Blood Flow Metab 16:7-22), the permutation approach readily accounts for the multiple comparisons problem implicit in the standard voxel-by-voxel hypothesis testing framework. When the appropriate assumptions hold, the nonparametric permutation approach gives results similar to those obtained from a comparable Statistical Parametric Mapping approach using a general linear model with multiple comparisons corrections derived from random field theory. For analyses with low degrees of freedom, such as single subject PET/SPECT experiments or multi-subject PET/SPECT or fMRI designs assessed for population effects, the nonparametric approach employing a locally pooled (smoothed) variance estimate can outperform the comparable Statistical Parametric Mapping approach. Thus, these nonparametric techniques can be used to verify the validity of less computationally expensive parametric approaches. Although the theory and relative advantages of permutation approaches have been discussed by various authors, there has been no accessible explication of the method, and no freely distributed software implementing it. Consequently, there have been few practical applications of the technique. This article, and the accompanying MATLAB software, attempts to address these issues. The standard nonparametric randomization and permutation testing ideas are developed at an accessible level, using practical examples from functional neuroimaging, and the extensions for multiple comparisons described. Three worked examples from PET and fMRI are presented, with discussion, and comparisons with standard parametric approaches made where appropriate. Practical considerations are given throughout, and relevant statistical concepts are expounded in appendices.
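    As a compact illustration of the permutation approach with a maximal-statistic correction, the sketch below runs a one-sample (sign-flipping) permutation test over subject contrast maps; it is a simplified stand-in for the procedures described above, with illustrative data, settings, and thresholds.

```python
# One-sample permutation test with the maximal-statistic correction for
# multiple comparisons (sign-flipping of subject contrast maps).
# Data, number of permutations, and the alpha level are illustrative.
import numpy as np

def permutation_max_t(contrasts, n_perm=2000, seed=0):
    """contrasts: (n_subjects, n_voxels); returns the observed t-map and an FWE threshold."""
    rng = np.random.default_rng(seed)
    n_sub = contrasts.shape[0]

    def tmap(x):
        return x.mean(0) / (x.std(0, ddof=1) / np.sqrt(n_sub))

    observed = tmap(contrasts)
    max_null = np.empty(n_perm)
    for i in range(n_perm):
        signs = rng.choice([-1.0, 1.0], size=(n_sub, 1))   # random sign flips per subject
        max_null[i] = tmap(contrasts * signs).max()        # maximal statistic over voxels
    return observed, np.quantile(max_null, 0.95)           # corrected p < 0.05 threshold

rng = np.random.default_rng(4)
data = rng.standard_normal((12, 2000))
data[:, :50] += 1.0                                        # 50 truly active voxels
t_obs, thr = permutation_max_t(data)
print("voxels surviving correction:", int((t_obs > thr).sum()))
```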
  • Article
    Using event-related brain potentials (ERPs), we investigated the time course of facial expression processing in human subjects watching photographs of fearful and neutral faces. Upright fearful faces elicited a frontocentral positivity within 120 ms after stimulus presentation, which was followed by a broadly distributed sustained positivity beyond 250 ms post-stimulus. Emotional expression effects were delayed and attenuated when faces were inverted. In contrast, the face-specific N170 component was completely unaffected by facial expression. We conclude that emotional expression analysis and the structural encoding of faces are parallel processes. Early emotional ERP modulations may reflect the rapid activation of prefrontal areas involved in the analysis of facial expression.
  • Conference Paper
    We describe a new magnetic resonance (MR) image analysis tool that produces cortical surface representations with spherical topology from MR images of the human brain. The tool provides a sequence of low-level operations in a single package that can produce accurate brain segmentations in clinical time. The tools include skull and scalp removal, image nonuniformity compensation, voxel-based tissue classification, topological correction, rendering, and editing functions. The collection of tools is designed to require minimal user interaction to produce cortical representations. In this paper we describe the theory of each stage of the cortical surface identification process. We then present classification validation results using real and phantom data. We also present a study of interoperator variability.
  • Article
    Here we used magnetoencephalography (MEG) to investigate stages of processing in face perception in humans. We found a face-selective MEG response occurring only 100 ms after stimulus onset (the 'M100'), 70 ms earlier than previously reported. Further, the amplitude of this M100 response was correlated with successful categorization of stimuli as faces, but not with successful recognition of individual faces, whereas the previously-described face-selective 'M170' response was correlated with both processes. These data suggest that face processing proceeds through two stages: an initial stage of face categorization, and a later stage at which the identity of the individual face is extracted.
  • Article
    Full-text available
    The ability to cognitively regulate emotional responses to aversive events is important for mental and physical health. Little is known, however, about neural bases of the cognitive control of emotion. The present study employed functional magnetic resonance imaging to examine the neural systems used to reappraise highly negative scenes in unemotional terms. Reappraisal of highly negative scenes reduced subjective experience of negative affect. Neural correlates of reappraisal were increased activation of the lateral and medial prefrontal regions and decreased activation of the amygdala and medial orbito-frontal cortex. These findings support the hypothesis that prefrontal cortex is involved in constructing reappraisal strategies that can modulate activity in multiple emotion-processing systems.
  • Article
    High and low spatial frequency information in visual images is processed by distinct neural channels. Using event-related functional magnetic resonance imaging (fMRI) in humans, we show dissociable roles of such visual channels for processing faces and emotional fearful expressions. Neural responses in fusiform cortex, and effects of repeating the same face identity upon fusiform activity, were greater with intact or high-spatial-frequency face stimuli than with low-frequency faces, regardless of emotional expression. In contrast, amygdala responses to fearful expressions were greater for intact or low-frequency faces than for high-frequency faces. An activation of pulvinar and superior colliculus by fearful expressions occurred specifically with low-frequency faces, suggesting that these subcortical pathways may provide coarse fear-related inputs to the amygdala.
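    A minimal sketch of the spatial-frequency manipulation underlying such stimuli is given below: Gaussian blurring retains the low-spatial-frequency content and the residual retains the high frequencies. The image and cutoff are illustrative stand-ins for the actual face stimuli and filter parameters.

```python
# Split a 2-D grayscale image into low- and high-spatial-frequency versions.
# The placeholder image and the sigma cutoff are illustrative assumptions.
import numpy as np
from scipy.ndimage import gaussian_filter

def split_spatial_frequencies(image, sigma=6.0):
    """Return (low_sf, high_sf) components of a 2-D grayscale image."""
    low_sf = gaussian_filter(image.astype(float), sigma=sigma)   # coarse structure
    high_sf = image - low_sf                                     # fine detail
    return low_sf, high_sf

rng = np.random.default_rng(5)
face_like = gaussian_filter(rng.standard_normal((256, 256)), 3)  # placeholder "image"
low_sf, high_sf = split_spatial_frequencies(face_like)
print(low_sf.shape, high_sf.shape)
```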
  • Article
    Full-text available
    Neuroimaging studies have identified at least two bilateral areas of the visual extrastriate cortex that respond more to pictures of faces than objects in normal human subjects in the middle fusiform gyrus [the 'fusiform face area' (FFA)] and, more posteriorly, in the inferior occipital cortex ['occipital face area' (OFA)], with a right hemisphere dominance. However, it is not yet clear how these regions interact with each other and whether they are all necessary for normal face perception. It has been proposed that the right hemisphere FFA acts as an isolated ('modular') processing system for faces or that this region receives its face-sensitive inputs from the OFA in a feedforward hierarchical model of face processing. To test these proposals, we report a detailed neuropsychological investigation combined with a neuroimaging study of a patient presenting a deficit restricted to face perception, following bilateral occipito-temporal lesions. Due to the asymmetry of the lesions, the left middle fusiform gyrus and the right inferior occipital cortex were damaged but the right middle fusiform gyrus was structurally intact. Using functional MRI, we disclosed a normal activation of the right FFA in response to faces in the patient despite the absence of any feedforward inputs from the right OFA, located in a damaged area of cortex. Together, these findings show that the integrity of the right OFA is necessary for normal face perception and suggest that the face-sensitive responses observed at this level in normal subjects may arise from feedback connections from the right FFA. In agreement with the current literature on the anatomical basis of prosopagnosia, it is suggested that the FFA and OFA in the right hemisphere and their re-entrant integration are necessary for normal face processing.
  • Article
    Synthetic aperture magnetometry (SAM) is a nonlinear beamformer technique for producing 3D images of cortical activity from magnetoencephalography data. We have previously shown how SAM images can be spatially normalised and averaged to form a group image. In this paper we show how nonparametric permutation methods can be used to make robust statistical inference about group SAM data. Data from a biological motion direction discrimination experiment were analysed using both a nonparametric analysis toolbox (SnPM) and a conventional parametric approach utilising Gaussian field theory. In data from a group of six subjects, we were able to show robust group activation at the P < 0.05 (corrected) level using the nonparametric methods, while no significant clusters were found using the conventional parametric approach. Activation was found using SnPM in several regions of right occipital-temporal cortex, including the superior temporal sulcus, V5/MT, the fusiform gyrus, and the lateral occipital complex.
  • Article
    Facial emotions represent an important part of non-verbal communication used in everyday life. Recent studies on emotional processing have implicated differing brain regions for different emotions, but little has been determined on the timing of this processing. Here we presented a large number of unfamiliar faces expressing the six basic emotions, plus neutral faces, to 26 young adults while recording event-related potentials (ERPs). Subjects were naive with respect to the specific questions investigated; it was an implicit emotional task. ERPs showed global effects of emotion from 90 ms (P1), while latency and amplitude differences among emotional expressions were seen from 140 ms (N170 component). Positive emotions evoked N170 significantly earlier than negative emotions and the amplitude of N170 evoked by fearful faces was larger than neutral or surprised faces. At longer latencies (330-420 ms) at fronto-central sites, we also found a different pattern of effects among emotions. Localization analyses confirmed the superior and middle-temporal regions for early processing of facial expressions; the negative emotions elicited later, distinctive activations. The data support a model of automatic, rapid processing of emotional expressions.
  • Article
    Full-text available
    The amygdala is known to play an important role in conscious and unconscious processing of emotional and highly arousing stimuli. Neuroanatomical evidence suggests that the amygdala participates in the control of autonomic responses, such as skin conductance responses (SCRs), elicited by emotionally salient stimuli, but little is known regarding its functional role in such control. We investigated this issue by showing emotional visual stimuli of varying arousal to patients with left (n = 12), right (n = 8), and bilateral (n = 3) amygdala damage and compared their results with those from 38 normal controls. Stimuli were presented both subliminally (using backward masking) and supraliminally under lateralized presentation to one visual hemifield. We collected SCRs as a physiological index of emotional responses. Subjects subsequently rated each stimulus on valence and arousal under free viewing conditions. There were two key findings: (1) impaired overall SCR after right amygdala damage; and (2) impaired correlation of SCR with the rated arousal of the stimuli after left amygdala damage. The second finding was strengthened further by finding a positive correlation between the evoked SCR magnitude and postsurgery amygdala volume, indicating impaired autonomic responses with larger tissue damage. Bilateral amygdala damage resulted in severe impairments on both of the above measures. Our results provide support for the hypothesis that the left and right amygdalae subserve different functions in emotion processing: the left may decode the arousal signaled by the specific stimulus, whereas the right may provide a global level of autonomic activation triggered automatically by any arousing stimulus.
  • Article
    Full-text available
    The amygdala has been implicated in fundamental functions for the survival of the organism, such as fear and pain. In accord with this, several studies have shown increased amygdala activity during fear conditioning and the processing of fear-relevant material in human subjects. In contrast, functional neuroimaging studies of pain have shown decreased amygdala activity. It has previously been proposed that the observed deactivations of the amygdala in these studies indicate a cognitive strategy to adapt to a distressing painful event that is unavoidable in the experimental setting. In this positron emission tomography study, we show that a simple contextual manipulation, immediately preceding a painful stimulation, that increases the anticipated duration of the painful event leads to a decrease in amygdala activity and modulates the autonomic response during the noxious stimulation. On a behavioral level, 7 of the 10 subjects reported that they used coping strategies more intensely in this context. We suggest that the altered activity in the amygdala may be part of a mechanism to attenuate pain-related stress responses in a context that is perceived as being more aversive. The study also showed an increased activity in the rostral part of the anterior cingulate cortex in the same context in which the amygdala activity decreased, further supporting the idea that this part of the cingulate cortex is involved in the modulation of emotional and pain networks.
  • Article
    Non-parametric statistical methods, such as permutation, are flexible tools to analyze data when the population distribution is not known. With minimal assumptions and better statistical power compared to the parametric tests, permutation tests have recently been applied to the spatially filtered magnetoencephalography (MEG) data for group analysis. To perform permutation tests on neuroimaging data, an empirical maximal null distribution has to be found, which is free from any activated voxels, to determine the threshold to classify the voxels as active at a given probability level. An iterative procedure is used to determine the distribution by computing the null distribution, which is recomputed when a possible activated voxel is found within the current distributions. Besides the high computational costs associated with this approach, there is no guarantee that all activated voxels are excluded when constructing the maximal null distribution, which may reduce the statistical power. In this study, we propose a novel way to construct the maximal null distribution from the data of the resting period. The approach is tested on the MEG data from a somatosensory experiment, and demonstrated that the approach could improve the power of the permutation test while reducing the computational cost at the same time.
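    The core idea, drawing the maximal null distribution from a task-free resting period rather than iteratively pruning activated voxels, can be sketched as follows; the array shapes, resampling scheme, and alpha level are illustrative assumptions rather than the authors' exact procedure.

```python
# Sketch: corrected threshold taken from the distribution of per-resample maxima
# of statistic maps computed on resting (task-free) data. Shapes and names are
# illustrative assumptions, not the study's exact procedure.
import numpy as np

def resting_max_null_threshold(rest_stat_maps, alpha=0.05):
    """rest_stat_maps: (n_resamples, n_voxels) statistic maps from resting data."""
    max_per_resample = rest_stat_maps.max(axis=1)        # maximal statistic per resample
    return np.quantile(max_per_resample, 1.0 - alpha)

rng = np.random.default_rng(6)
rest_maps = rng.standard_normal((1000, 5000))            # e.g. resampled resting epochs
task_map = rng.standard_normal(5000)
task_map[:20] += 4.0                                      # simulated somatosensory response
thr = resting_max_null_threshold(rest_maps)
print("active voxels at corrected alpha:", int((task_map > thr).sum()))
```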
  • Article
    Full-text available
    The amygdala was more responsive to fearful (larger) eye whites than to happy (smaller) eye whites presented in a masking paradigm that mitigated subjects' awareness of their presence and aberrant nature. These data demonstrate that the amygdala is responsive to isolated elements of fearful facial expressions.
  • Article
    Traditional split-field studies and patient research indicate a privileged role for the right hemisphere in emotional processing [1-7], but there has been little direct fMRI evidence for this, despite many studies on emotional-face processing [8-10] (see Supplemental Background). With fMRI, we addressed differential hemispheric processing of fearful versus neutral faces by presenting subjects with faces bilaterally [11-13] and orthogonally manipulating whether each hemifield showed a fearful or neutral expression prior to presentation of a checkerboard target. Target discrimination in the left visual field was more accurate after a fearful face was presented there. Event-related fMRI showed right-lateralized brain activations for fearful minus neutral left-hemifield faces in right visual areas, as well as more activity in the right than in the left amygdala. These activations occurred regardless of the type of right-hemifield face shown concurrently, concordant with the behavioral effect. No analogous behavioral or fMRI effects were observed for fearful faces in the right visual field (left hemisphere). The amygdala showed enhanced functional coupling with right-middle and anterior-fusiform areas in the context of a left-hemifield fearful face. These data provide behavioral and fMRI evidence for right-lateralized emotional processing during bilateral stimulation involving enhanced coupling of the amygdala and right-hemispheric extrastriate cortex.
  • Article
    In the past few years, important contributions have been made to the study of emotional visual perception. Researchers have reported responses to emotional stimuli in the human amygdala under some unattended conditions (i.e. conditions in which the focus of attention was diverted away from the stimuli due to task instructions), during visual masking and during binocular suppression. Taken together, these results reveal the relative degree of autonomy of emotional processing. At the same time, however, important limitations to the notion of complete automaticity have been revealed. Effects of task context and attention have been shown, as well as large inter-subject differences in sensitivity to the detection of masked fearful faces (whereby briefly presented, target fearful faces are immediately followed by a neutral face that 'masks' the initial face). A better understanding of the neural basis of emotional perception and how it relates to visual attention and awareness is likely to require further refinement of the concepts of automaticity and awareness.
  • Article
    To investigate whether the processing of faces and emotional facial expression can be modulated by spatial attention, ERPs were recorded in response to stimulus arrays containing two faces and two non-face stimuli (houses). In separate trials, attention was focused on the face pair or on the house pair, and facial expression was either fearful or neutral. When faces were attended, a greater frontal positivity in response to arrays containing fearful faces was obtained, starting about 100 ms after stimulus onset. In contrast, with faces located outside the attentional focus, this emotional expression effect was completely eliminated. This differential result demonstrates for the first time a strong attentional gating of brain processes involved in the analysis of emotional facial expression. It is argued that while an initial detection of emotionally relevant events mediated by the amygdala may occur pre-attentively, subsequent stages of emotional processing require focal spatial attention. The face-sensitive N170 component was unaffected by emotional facial expression, but N170 amplitudes were enhanced when faces were attended, suggesting that spatial attention can modulate the structural encoding of faces.
  • Article
    We describe a novel spatial filtering approach to the localization of cortical activity accompanying voluntary movements. The synthetic aperture magnetometry (SAM) minimum-variance beamformer algorithm was used to compute spatial filters three-dimensionally over the entire brain from single trial neuromagnetic recordings of subjects performing self-paced index finger movements. Images of instantaneous source power ("event-related SAM") computed at selected latencies revealed activation of multiple cortical motor areas prior to and following left and right index finger movements in individual subjects, even in the presence of low-frequency noise (e.g., eye movements). A slow premovement motor field (MF) reaching maximal amplitude approximately 50 ms prior to movement onset was localized to the hand area of contralateral precentral gyrus, followed by activity in the contralateral postcentral gyrus at 40 ms, corresponding to the first movement-evoked field (MEFI). A novel finding was a second activation of the precentral gyrus at a latency of approximately 150 ms, corresponding to the second movement-evoked field (MEFII). Group averaging of spatially normalized images indicated additional premovement activity in the ipsilateral precentral gyrus and the left inferior parietal cortex for both left and right finger movements. Weaker activations were also observed in bilateral premotor areas and the supplementary motor area. These results show that event-related beamforming provides a robust method for studying complex patterns of time-locked cortical activity accompanying voluntary movements, and offers a new approach for the localization of multiple cortical sources derived from neuromagnetic recordings in single subject and group data. (A minimal sketch of the minimum-variance beamformer weights underlying this type of spatial filter appears after this list.)
  • Article
    Recent functional imaging, neuropsychological and electrophysiological studies on adults have provided evidence for a fast, low-spatial-frequency, subcortical face-detection pathway that modulates the responses of certain cortical areas to faces and other social stimuli. These findings shed light on an older literature on the face-detection abilities of newborn infants, and the hypothesis that these newborn looking preferences are generated by a subcortical route. Converging lines of evidence indicate that the subcortical face route provides a developmental foundation for what later becomes the adult cortical 'social brain' network, and that disturbances to this pathway might contribute to certain developmental disorders.
  • Article
    Full-text available
    We studied attentional modulation of cortical processing of faces and houses with functional MRI and magnetoencephalography (MEG). MEG detected an early, transient face-selective response. Directing attention to houses in “double-exposure” pictures of superimposed faces and houses strongly suppressed the characteristic, face-selective functional MRI response in the fusiform gyrus. By contrast, attention had no effect on the M170, the early, face-selective response detected with MEG. Late (>190 ms) category-related MEG responses elicited by faces and houses, however, were strongly modulated by attention. These results indicate that hemodynamic and electrophysiological measures of face-selective cortical processing complement each other. The hemodynamic signals reflect primarily late responses that can be modulated by feedback connections. By contrast, the early, face-specific M170 that was not modulated by attention likely reflects a rapid, feed-forward phase of face-selective processing.
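The following is a minimal, illustrative sketch of the max-statistic permutation idea that the group-analysis abstract above builds on. It uses generic sign-flipping of paired task-versus-rest difference images rather than the authors' resting-period construction of the null distribution, and all names (task_power, rest_power, n_perm, alpha) are assumptions introduced here for illustration.

import numpy as np

rng = np.random.default_rng(0)

def max_stat_permutation(task_power, rest_power, n_perm=1000, alpha=0.05):
    """task_power, rest_power: (n_subjects, n_voxels) arrays of source power.
    Returns voxel-wise t statistics, the FWE-corrected threshold, and the
    mask of voxels classified as active."""
    n_subj = task_power.shape[0]
    diff = task_power - rest_power
    observed = diff.mean(0) / (diff.std(0, ddof=1) / np.sqrt(n_subj))

    # Maximal null distribution: under the null hypothesis the sign of each
    # subject's difference image is exchangeable, so flip signs at random and
    # keep only the maximum statistic over all voxels for each permutation.
    max_null = np.empty(n_perm)
    for p in range(n_perm):
        signs = rng.choice([-1.0, 1.0], size=(n_subj, 1))
        flipped = diff * signs
        t_perm = flipped.mean(0) / (flipped.std(0, ddof=1) / np.sqrt(n_subj))
        max_null[p] = t_perm.max()

    threshold = np.quantile(max_null, 1.0 - alpha)  # controls family-wise error
    return observed, threshold, observed > threshold

Thresholding every voxel against the maximum-over-voxels null distribution is what provides family-wise error control; the study summarized above instead estimates that null distribution from resting-period data to avoid contamination by activated voxels, a refinement this sketch does not reproduce.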
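Below is a minimal, hedged sketch of the minimum-variance spatial filtering that underlies SAM-style beamforming as described in the event-related SAM abstract above. It shows a generic LCMV-type weight computation for a single, fixed-orientation source, not the authors' implementation; the names (leadfield, data_cov, reg) and the regularisation choice are assumptions made for illustration.

import numpy as np

def beamformer_weights(leadfield, data_cov, reg=0.05):
    """leadfield: (n_channels,) field of a unit dipole at the target location.
    data_cov: (n_channels, n_channels) sensor covariance over the analysis window.
    Returns weights w with unit gain at the target (w @ leadfield == 1) and
    minimum output variance, i.e. maximal suppression of activity from elsewhere."""
    n_chan = data_cov.shape[0]
    # Diagonal (Tikhonov) regularisation keeps the covariance matrix invertible.
    c_reg = data_cov + reg * np.trace(data_cov) / n_chan * np.eye(n_chan)
    gain = np.linalg.solve(c_reg, leadfield)   # equivalent to C^-1 @ leadfield
    return gain / (leadfield @ gain)

def source_power(weights, data_cov):
    """Source-power estimate at the target location (w' C w)."""
    return weights @ data_cov @ weights

# A 'virtual electrode' time course is the filter applied to the sensor data:
#   virtual = weights @ sensor_data   # sensor_data: (n_channels, n_times)

Computing such power estimates in narrow latency windows, and contrasting or normalizing them across conditions, is the kind of operation that yields the instantaneous source-power images referred to as event-related SAM in the abstract above.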