Article

Impaired fixation to eyes following amygdala damage arises from abnormal bottom-up attention


Abstract

SM is a patient with complete bilateral amygdala lesions who fails to fixate the eyes in faces and is consequently impaired in recognizing fear (Adolphs et al., 2005). Here we first replicated earlier findings of reduced gaze to the eyes when SM viewed whole faces. Examination of the time course of fixations revealed that SM's reduced eye contact is particularly pronounced in the first fixation to the face, and less abnormal in subsequent fixations. In a second set of experiments, we used a gaze-contingent presentation of faces with real-time eye tracking, wherein only a small region of the face is made visible at the center of gaze. In essence, viewers explore the face by moving a small searchlight over the face with their gaze. Under such viewing conditions, SM's fixations to the eye region of faces became entirely normalized. We suggest that this effect arises from the absence of bottom-up effects due to the facial features, allowing gaze location to be driven entirely by top-down control. Together with SM's failure to fixate the eyes in whole faces primarily at the very first saccade, these findings suggest that the saliency of the eyes normally attracts our gaze in an amygdala-dependent manner. Impaired eye gaze is also a prominent feature of several psychiatric illnesses in which the amygdala has been hypothesized to be dysfunctional, and our findings and experimental manipulation may hold promise for interventions in such populations, including autism and fragile X syndrome.
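The gaze-contingent "searchlight" display described in the abstract can be sketched in a few lines: only a small aperture around the current gaze sample stays visible, removing any bottom-up pull from unseen facial features. Below is a minimal, illustrative Python/NumPy sketch; the function name, circular aperture shape, and mid-grey fill value are assumptions for illustration, not the authors' implementation, and a real experiment would update the mask every frame from the eye tracker.

```python
import numpy as np

def gaze_contingent_mask(image, gaze_xy, radius):
    """Return a copy of `image` in which only a circular aperture
    centred on the current gaze position is visible; every other
    pixel is replaced by mid-grey (128)."""
    h, w = image.shape[:2]
    ys, xs = np.ogrid[:h, :w]          # column/row coordinate grids
    gx, gy = gaze_xy                   # gaze position in (x, y) pixels
    visible = (xs - gx) ** 2 + (ys - gy) ** 2 <= radius ** 2
    out = np.full_like(image, 128)     # uniform grey background
    out[visible] = image[visible]      # reveal only the aperture
    return out

# Toy example: a 100x100 grayscale "face", gaze at the centre.
face = np.random.randint(0, 256, (100, 100), dtype=np.uint8)
masked = gaze_contingent_mask(face, gaze_xy=(50, 50), radius=10)
```

In an actual gaze-contingent experiment the same masking step simply runs in a loop, fed with live samples from the eye tracker.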


... This line of research has highlighted the importance of the amygdala and the ventromedial prefrontal cortex (vmPFC) as key regions in face perception (Todorov, 2012) as well as in attention orienting to socially salient cues like the eye area (Adolphs et al., 2005; Wolf, Philippi, Motzkin, Baskaya, & Koenigs, 2014). Accordingly, eye tracking studies with lesion patients have shown that amygdala and vmPFC are involved in facial emotion processing and play a crucial role in spontaneous attention orienting to the eye region (Gamer, Schmitz, Tittgemeyer, & Schilbach, 2013; Kennedy & Adolphs, 2010; Wolf et al., 2014). Moreover, amygdala activation in response to emotional faces has been shown to be correlated with the fixation of the eye region in healthy adults (Gamer & Büchel, 2009). ...
... Second, no previous study examined whether these abnormalities are restricted to emotion categorization or generalize across different task demands. Finally, it is important to further delineate which attentional components are impaired and whether early attention orienting is affected as well (like for instance in brain lesion patients; Kennedy & Adolphs, 2010;Wolf et al., 2014). ...
... Task order was fixed for all participants, beginning with the gender discrimination task. The trial structure was as follows: To start the trial, participants were required to fixate a cross on the left or on the right side (presentation side was balanced) of the screen for a 300 ms interval (compare Kennedy & Adolphs, 2010). Since a central start position of gaze at stimulus onset might influence the position of the first detected fixation by allowing a more detailed processing of the fixated stimulus area prior to the initial saccade (Arizpe, Kravitz, Yovel, & Baker, 2012), we ensured that the actual fixation was not within the image at stimulus onset. ...
Article
Attention orienting to socially salient cues, such as the eyes of interaction partners, is assumed to be crucial for the development of intact social cognition. Dysfunctions in such basic processes that guide the perception of social cues have been suggested to play a role in the development of psychopathy. The present study investigated gaze patterns in two groups of incarcerated psychopathic and non-psychopathic offenders. While recording their eye movements, participants were asked to categorize either gender (task 1) or emotional expression (task 2) of facial images. Psychopaths exhibited significantly reduced attention orienting toward the eyes, as indicated by absolute dwell time as well as frequency of the initial fixation on the eye region. This pattern was evident across all emotional expressions and independent of the task. The present results suggest a pervasive impairment of attention orienting toward the eyes in psychopaths compared to non-psychopathic offenders. This impairment appears to affect not only general attention but also early attention shifts. Thus, our findings provide evidence that these dysfunctions might particularly contribute to the development of psychopathy rather than to antisocial behavior per se. Future studies should further examine the origin, emergence, and consequences of these impairments in order to develop targeted interventions.
... Under free-viewing conditions, macaques, like humans, explore the eye region of the face more than any other facial feature 13,14 . In humans with amygdala damage, the deficits in perception and recognition of facial expressions can be attributed to decreased fixation on the eyes and increased fixation on the mouth while viewing faces 15-17 . Studies in monkeys have also shown that neurons in the amygdala respond preferentially to fixations on the eyes of conspecifics 18 , as well as during the production of expressions 19 . ...
... Consistent with previous studies in monkeys 13,14 and human participants 17 , both groups spent more time exploring the eyes than the mouth. However, control animals spent more time exploring the eye region compared with monkeys with amygdala lesions, whereas the lesion group spent more time exploring the mouth compared with the controls, similar to previous studies in human participants with amygdala lesions 16,17 . A role for the amygdala in driving exploration of the eyes is further substantiated by neuroimaging and electrophysiological evidence. ...
... Slits were cut in the dura to allow passage of the injection needle. A mean of 18 injections (range 13-22), each consisting of 0.6-1.0 µl ibotenic acid (10-15 mg ml⁻¹; Biosearch Technologies or Sigma), were made into the amygdala via the 30-gauge needle of a Hamilton syringe held in a micromanipulator. The injection sites were roughly 2 mm apart in each plane. ...
Article
Full-text available
Evidence from animal and human studies has suggested that the amygdala plays a role in detecting threat and in directing attention to the eyes. Nevertheless, there has been no systematic investigation of whether the amygdala specifically facilitates attention to the eyes or whether other features can also drive attention via amygdala processing. The goal of the present study was to examine the effects of amygdala lesions in rhesus monkeys on attentional capture by specific facial features, as well as gaze patterns and changes in pupil dilation during free viewing. Here we show reduced attentional capture by threat stimuli, specifically the mouth, and reduced exploration of the eyes in free viewing in monkeys with amygdala lesions. Our findings support a role for the amygdala in detecting threat signals and in directing attention to the eye region of faces when freely viewing different expressions.
... In addition to distinctive patterns of eye looking in individuals with neurodevelopmental disorders associated with divergent socio-behavioural characteristics, reduced eye looking has been well documented in individuals with amygdala damage. For example, failure to spontaneously fixate to the eye region of static faces has been reported in a patient with bilateral amygdala damage [6]. Reduced eye contact during real social interactions has also been shown in a patient with amygdala damage [7], and a positive relationship between amygdala activation and looking to the eye region of faces has been documented in ASD [8]. ...
... In support of this, eye contact has been reported to predict performance on a facial emotion recognition task in individuals with ASD [14], and increased emotion recognition performance has also been reported in those looking longer to the eye region [15]. Furthermore, neuropsychological patients with damage to the amygdala have been found to exhibit both reduced looking to the eye region of static faces and reduced ability to discriminate facial expressions of emotion [6,16]. Both of these findings are in line with the hypothesis that looking to the eye region is important for successful emotion recognition, which in turn has been suggested to be important for successful social interaction [17]. ...
... These findings do not support the hypothesis of a difference between groups with contrasting profiles of social behaviour exhibiting different face processing techniques. Furthermore, as existing literature points to a role for amygdala dysfunction in reduced looking to the eye region of static faces [6], the results from the present study indicate that the documented differences in CdLS and RTS are unlikely to be subcortically mediated. ...
Article
Full-text available
Background Existing literature suggests differences in face scanning in individuals with different socio-behavioural characteristics. Cornelia de Lange syndrome (CdLS) and Rubinstein-Taybi syndrome (RTS) are two genetically defined neurodevelopmental disorders with unique profiles of social behaviour. Methods Here, we examine eye gaze to the eye and mouth regions of neutrally expressive faces, as well as the spontaneous visual preference for happy and disgusted facial expressions compared to neutral faces, in individuals with CdLS versus RTS. Results Results indicate that the amount of time spent looking at the eye and mouth regions of faces was similar in 15 individuals with CdLS and 17 individuals with RTS. Both participant groups also showed a similar pattern of spontaneous visual preference for emotions. Conclusions These results provide insight into two rare, genetically defined neurodevelopmental disorders that have been reported to exhibit contrasting socio-behavioural characteristics and suggest that differences in social behaviour may not be sufficient to predict attention to the eye region of faces. These results also suggest that differences in the social behaviours of these two groups may be cognitively mediated rather than subcortically mediated.
... For instance, congenital bilateral amygdalar pathology due to Urbach-Wiethe disease (UWD) [7] is usually accompanied by alterations in emotion recognition [8]. Case studies by Adolphs and colleagues [9-11] involving a patient with UWD and complete bilateral amygdalar damage revealed that the amygdalae are important for directing attention to the eye region of faces, in particular during the recognition of fearful faces. Some studies on patients with unilateral amygdalar pathology due to temporal lobectomy [10,12] did not find a selective deficit in fear recognition, and reported normal scanning of the eye region [10]. ...
... In concordance with prior research [15,16,26,27], the results of our exploratory study show an impaired capacity for recognizing facial emotional expressions in PWFE, while we did not find a selective impairment in fear recognition, as has been described in some studies [9,11], but not in others [15]. ...
Article
Background: Eye-movement patterns during facial emotion recognition are under-researched in patients with focal epilepsy (PWFE). Previous studies of other neurological patients indicate that bilateral mesiotemporal damage could be associated with impaired emotion recognition and abnormal eye-movement patterns. Aims: The current study addresses the question whether PWFE, in whom fronto-(mesio-)temporal networks are often disturbed, also show abnormal eye-movement patterns during facial emotion recognition. Method: 24 PWFE and a group of 29 healthy controls (HC) performed a facial emotion recognition task and a gender recognition task while eye movements were recorded with an eye-tracker. For this purpose, Areas of Interest (AOI) were defined in the presented faces: the eye region and the mouth region. In addition to the proportion of correctly recognized emotions, the following eye-tracking parameters were recorded: Relative fixation duration (FD)/fixation count (FC) in the mouth region/eye region (relative to the FD/FC on the entire screen). Results: PWFE showed an emotion recognition deficit compared to HC, whereas gender recognition performance did not differ between groups. In addition, PWFE showed significantly fewer and shorter fixations in the mouth region than HC, in both the emotion recognition task and the gender recognition task. Conclusions: When looking at faces, PWFE show eye-movement patterns different from those of healthy controls. Behaviorally, PWFE are only impaired in emotion recognition. Hence, PWFE possibly scan facial regions that are relevant to successful emotion recognition more diffusely and less efficiently than healthy control subjects. Future studies should investigate the etiology of such abnormal eye-movement patterns in PWFE.
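The relative fixation measures described in this Method (FD/FC within an AOI relative to the whole screen) reduce to simple bookkeeping over fixation records. The following toy Python sketch uses invented coordinates, durations, and AOI rectangles purely for illustration:

```python
# Hypothetical fixation records: (x, y, duration_ms), all values invented.
fixations = [(210, 140, 310), (400, 150, 250), (300, 420, 180), (50, 50, 120)]
# Illustrative AOI boxes as (x0, y0, x1, y1) in screen pixels.
aois = {"eyes": (150, 100, 450, 220), "mouth": (220, 380, 380, 470)}

def in_aoi(x, y, box):
    x0, y0, x1, y1 = box
    return x0 <= x <= x1 and y0 <= y <= y1

total_dur = sum(d for _, _, d in fixations)   # FD on the entire screen
total_cnt = len(fixations)                    # FC on the entire screen
for name, box in aois.items():
    hits = [d for x, y, d in fixations if in_aoi(x, y, box)]
    rel_fd = sum(hits) / total_dur            # relative fixation duration
    rel_fc = len(hits) / total_cnt            # relative fixation count
    print(f"{name}: FD={rel_fd:.2f}, FC={rel_fc:.2f}")
```

With these toy values, two of four fixations fall in the eye AOI, so its relative FC is 0.50; the group comparisons in the study are then run on such per-participant ratios.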
... This, in turn, raises the question whether a special neuro-cognitive system, distinct from the ventral or dorsal network suggested for bottom-up and top-down attention, mediates social attention and its rapid allocation. The study of Kennedy & Adolphs (2010), who showed that irregular bottom-up processing caused by amygdala lesions can be overcome using a gaze-contingent paradigm, indicates how important the disentanglement of these processes is. Furthermore, such patient studies can offer insight into underlying mechanisms and further our understanding of brain areas involved in social processing. ...
... End and Gamer (adapted from Kennedy & Adolphs, 2010). The stimuli in the gaze-contingent condition were masked with a fixed grid of small dots located 2.2° from one another with a 3-pixel diameter to offer a sense of coordination during stimulus exploration (Figure 8). ...
Thesis
Full-text available
Humans in our environment are of special importance to us. Even if our minds are fixated on tasks unrelated to their presence, our attention will likely be drawn towards other people’s appearances and their actions. While we might remain unaware of this attentional bias at times, various studies have demonstrated the preferred visual scanning of other humans by recording eye movements in laboratory settings. The present thesis aims to investigate the circumstances under which, and the mechanisms by which, this so-called social attention operates. The first study demonstrates that social features in complex naturalistic scenes are prioritized in an automatic fashion. After 200 milliseconds of stimulus presentation, which is too brief for top-down processing to intervene, participants targeted image areas depicting humans significantly more often than would be expected from a chance distribution of saccades. Additionally, saccades towards these areas occurred earlier in time than saccades towards non-social image regions. In the second study, we show that human features receive most fixations even when bottom-up information is restricted; that is, even when only the fixated region was visible and the remaining parts of the image masked, participants still fixated on social image regions longer than on regions without social cues. The third study compares the influence of real and artificial faces on gaze patterns during the observation of dynamic naturalistic videos. Here we find that artificial faces, belonging to humanlike statues or machines, significantly predicted gaze allocation but to a lesser extent than real faces. In the fourth study, we employed functional magnetic resonance imaging to investigate the neural correlates of reflexive social attention. Analyses of the evoked blood-oxygenation level dependent responses pointed to an involvement of striate and extrastriate visual cortices in the encoding of social feature space.
Collectively, these studies help to elucidate under which circumstances social features are prioritized in a laboratory setting and how this prioritization might be achieved on a neuronal level. The final experimental chapter addresses the question whether these laboratory findings can be generalized to the real world. In this study, participants were introduced to a waiting room scenario in which they interacted with a confederate. Eye movement analyses revealed that gaze behavior heavily depended on the social context and was influenced by whether an interaction was currently desired. We further did not find any evidence for altered gaze behavior in socially anxious participants. Alleged gaze avoidance or hypervigilance in social anxiety might thus represent a laboratory phenomenon that occurs only under very specific real-life conditions. Altogether the experiments described in the present thesis thus refine our understanding of social attention and simultaneously challenge the inferences we can draw from laboratory research.
... This manipulation places demands on top-down control to selectively direct eye-movements to the location of facial features that are informative about the emotional expression in the absence of a stimulus. Previous work has demonstrated that bilateral amygdala damage decreases fixations to the eyes of face stimuli during free viewing, but not when faces are viewed through a gaze-contingent aperture (Kennedy & Adolphs, 2010), suggesting that bottom-up and top-down control over fixations to face stimuli are under the control of different neural processes. Here, we tested if directed control of fixations to face stimuli is affected following VMF damage. ...
... These findings complement the extensive work characterizing the deficits of SM, a patient with selective bilateral amygdala lesions. SM showed reduced fixations to the eyes during free viewing and naturalistic conversations, but looked to this area in gaze-contingent viewing, and showed an improvement in fear recognition when instructed to look at the eyes (Adolphs et al., 2005; Kennedy & Adolphs, 2010; Spezio, Huang, Castelli, & Adolphs, 2007). Thus, unlike VMF damage, deficits in fear recognition after amygdala lesions appear to be due to disruption of a bottom-up biasing of gaze towards informative facial features during more naturalistic viewing, which can be corrected by instruction or by the need for a top-down strategy in searching facial expressions. ...
Article
Full-text available
Recognizing and distinguishing the emotional states of those around us is crucial for adaptive social behavior. Previous work has shown that damage to the ventromedial frontal lobe (VMF) impairs recognition of subtle emotional facial expressions and affects fixation patterns to face stimuli. However, whether this relates to deficits in acquiring or interpreting facial expression information remains unclear. We tested 37 patients with frontal lobe damage, including 17 subjects with VMF lesions, in a series of emotion recognition tasks with different gaze manipulations. Subjects were asked to rate neutral, subtle and extreme emotional expressions while freely examining faces, while instructed to look only at the eyes, and in a gaze-contingent condition that required top-down direction of eye movements to reveal the stimulus. People with VMF damage were worse at detecting subtle disgust during free viewing and confused extreme emotional expressions more than healthy controls. However, fixation patterns did not differ systematically between groups during free or gaze-contingent viewing conditions. Moreover, instruction to fixate only the eyes did not improve the performance of VMF damaged subjects. These data argue that VMF is not necessary for normal fixations to emotional face stimuli, and that impairments in emotion recognition after VMF damage do not stem from impaired information gathering, as indexed by patterns of fixation.
... This is thought to be due to impaired bottom-up (e.g. stimulus-driven or feature-based) attention in patients with amygdala lesions (Kennedy & Adolphs, 2010). ...
... Patients lacking bilateral amygdalae suffer from impaired automatic orientation towards the salient portions of a face, potentially due to impaired stimulus-driven attention (Tsuchiya et al., 2009; Kennedy and Adolphs, 2010). Within the realm of non-conscious vision research, connections between the amygdala and the pulvinar nucleus of the thalamus are associated with a hypothesized fast-track route for processing emotional stimuli. ...
Article
Motivationally-relevant stimuli summon our attention and benefit from enhanced processing, but the neural mechanisms underlying this prioritization are not well understood. Using an interocular suppression technique and functional neuroimaging, this work has the ultimate aim of understanding how motivation impacts visual perception. In Chapter 2a, we demonstrate that novel objects with a more rich reward history are prioritized in awareness more quickly than objects with a lean reward history. In Chapter 2b, we show that faces are prioritized in awareness following social rejection, and that the amount faces are prioritized correlates with individual differences in social motivation. Chapters 3 & 4 use a combination of functional neuroimaging and flash suppression to suppress fearful faces and houses from awareness. Using binocular rivalry and motion flash suppression in Chapter 3, we find that suppressed fearful faces activate the amygdala relative to suppressed houses, and the amygdala increases coherence with a network of regions involved in attention, including bilateral pulvinar, bilateral insula, left frontal eye fields, left inferior parietal cortex, and early visual cortex. Using the more robust technique, continuous flash suppression, in Chapter 4, we find no differentiation between stimuli based on mean amygdala responses. However, we show increased connectivity between the amygdala, the pulvinar, and inferior parietal cortex specific to fearful faces. Overall, these results indicate that motivationally-relevant stimuli activate the amygdala prior to awareness. Enhanced connectivity between the amygdala and regions involved in attention may underlie the enhanced processing seen for salient stimuli.
... Like in ASD, SM exhibits reduced gaze to the eye region of faces (Adolphs et al., 2005;Spezio, Huang, Castelli, & Adolphs, 2007b), with this effect being particularly pronounced in the first few fixations of a face. A study by one of us (Kennedy and Adolphs, 2010) found that when SM was shown a picture of a face to which she had to saccade, she only initially fixated the eyes on 15 % of the trials, in contrast to controls who first fixated eyes 78 % of the time. However, when this same patient was explicitly instructed to fixate the eye region, she was able to do so, and her previously deficient ability to identify fearful facial expressions became normal (Adolphs et al., 2005). ...
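The first-fixation statistic quoted in the excerpt above (eyes fixated first on 15% of trials for SM versus 78% for controls) is simply the proportion of trials whose initial fixation lands in the eye region. A toy Python sketch with made-up trial data (the AOI labels and counts are invented for illustration):

```python
# Each entry is the AOI of the first fixation after stimulus onset
# on one trial; all trial data here are hypothetical.
def first_fixation_rate(first_aois, target="eyes"):
    """Proportion of trials whose first fixation landed on `target`."""
    return sum(a == target for a in first_aois) / len(first_aois)

sm_trials = ["mouth", "eyes", "nose", "mouth", "mouth"]
ctrl_trials = ["eyes", "eyes", "eyes", "mouth", "eyes"]
print(first_fixation_rate(sm_trials))    # 0.2 in this toy sample
print(first_fixation_rate(ctrl_trials))  # 0.8 in this toy sample
```

Per-participant rates of this kind are what get compared between the patient and the control group.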
... This points toward a role of the amygdala in directing one's gaze to important social information. Kennedy and Adolphs (2010) further studied the relationship between the amygdala and gaze by having SM and controls explore faces using a gaze-contingent eye tracking paradigm that only revealed a small region of the face in real time at the location being fixated. This task eliminates the competition between facial features and forces the participant to deliberately seek out features to fixate using top-down attentional control, as opposed to bottom-up attentional processes. ...
Chapter
Preferential attention to social features in the environment emerges early in development. These early preferences are thought to set the stage for the development of complex social skills critical for social competence throughout the lifetime. One group of individuals that exhibits striking abnormalities in social attention is individuals with autism spectrum disorder (ASD). Their abnormalities emerge within the first year of life and persist over the life span. Complementary to the work described in Chap. 7 (Shultz, Jones & Klin, this volume), which explores the early development of social attentional abnormalities in ASD, here we examine how differences in social attention are manifested in older adolescents and adults with ASD. We first review the behavioral literature detailing these differences in social attention, discuss the potential downstream consequences of social attentional disruptions, and examine the neural correlates of atypical social attention. We conclude by discussing possible future directions and outstanding questions in this field.
... Over the last decade, the neuropeptide oxytocin (OXT) has received considerable attention for its crucial role in social behavior and social cognition (Adolphs, 2010;Ebstein, Israel, Chew, Zhong, & Knafo, 2010;Insel, 2010). Moreover, OXT has been regarded as a promising target for the treatment of mental disorders that are characterized by marked deficits in social behavior and social cognition (Meyer-Lindenberg, Domes, Kirsch, & Heinrichs, 2011), such as autism spectrum disorders (ASD), borderline personality disorder (BPD) or social anxiety disorder (SAD). ...
... Normally, the eye region conveys the most important features for emotion recognition (Schyns et al., 2002, 2007, 2009; Smith et al., 2005). This is particularly true for static expressions, but in dynamic expressions such features are also provided by other regions, such as the mouth region (Kennedy & Adolphs, 2010). OXT may, thus, direct gaze to the eye region of static rather than dynamic expressions, especially when time is too short to gaze at regions other than those that provide the most relevant information for emotion recognition. ...
Article
The present thesis comprises a series of studies investigating the effects of oxytocin on facial emotion recognition. Study I and Study II examined whether the effects of oxytocin on emotion recognition are related to shifts in overt and/or covert attention. To this end, participants’ eye gaze and pupil size were recorded while they performed an emotion recognition task that involved the presentation of dynamically changing expressions. Oxytocin enhanced participants’ recognition sensitivity for all expressions, irrespective of the expressions’ emotional valence. These effects appeared to be due to shifts in covert rather than overt attention because oxytocin affected participants’ pupil size but not eye gaze during face processing. Study III further examined whether oxytocin-induced changes in emotion recognition are unrelated to shifts in overt attention. To this end, participants performed an emotion recognition task that involved the masked presentation of static expressions. Oxytocin enhanced participants’ recognition accuracy for all expressions, presumably due to shifts in covert rather than overt attention because the task design precluded any gaze changes during face processing. Taken together these studies suggest that oxytocin generally improves the recognition of various facial expressions, even in the absence of overt attention shifts.
... Variants of this technique have been used to study the processing of objects within the focus of attention (e.g., Smilek, Frischen, Reynolds, Gerritsen, & Eastwood, 2007) and in combination with eye tracking to examine attentional allocation during scene viewing (Dalrymple, Birmingham, Bischof, Barton, & Kingstone, 2011). The MWT offers similar advantages to the gaze-contingent paradigm, in which the position of the window is linked to the observer's eye movements, with information outside the window masked or degraded (e.g., Caldara et al., 2010;Kennedy & Adolphs, 2010;McConkie & Rayner, 1975). However, the gaze-contingent technique eliminates the normal use of peripheral vision to guide eye movements, creating an unnatural viewing situation and abnormally shortened saccades (Foulsham, Teszka, & Kingstone, 2011;Loschky & McConkie, 2002). ...
... Finally, it is clear that during free viewing of full-resolution faces, such as in most eye tracking studies, gaze is guided by both bottom-up visual information and top-down control of attention. The eyes contain high-contrast information (Kobayashi & Koshima, 1997) and are likely to attract attention in part because they are visually salient (Kennedy & Adolphs, 2010), at least when the face is presented in isolation (Birmingham, Bischof, & Kingstone, 2009). In contrast, in the MWT paradigm exploration is largely under strategic control because there is a lack of high-resolution peripheral information to capture attention in a bottom-up manner. ...
Article
Full-text available
The strategies children employ to selectively attend to different parts of the face may reflect important developmental changes in facial emotion recognition. Using the Moving Window Technique (MWT), children aged 5–12 years and adults (N = 129) explored faces with a mouse-controlled window in an emotion recognition task. An age-related increase in attention to the left eye emerged at age 11–12 years and reached significance in adulthood. This left-eye bias is consistent with previous eye tracking research and findings of a perceptual bias for the left side of faces. These results suggest that a strategic attentional bias to the left eye begins to emerge at age 11–12 years and is likely established sometime in adolescence.
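The left-eye bias reported in this abstract can be quantified from Moving Window Technique samples as the difference between the shares of exploration falling on each eye. A toy Python sketch with invented sample counts (the region labels and numbers are hypothetical, not data from the study):

```python
# Window-position samples from one hypothetical MWT trial, already
# bucketed into face regions; counts are invented for illustration.
samples = ["left_eye"] * 40 + ["right_eye"] * 25 + ["mouth"] * 35

def region_share(samples, region):
    """Fraction of window samples that fell on `region`."""
    return samples.count(region) / len(samples)

# Positive values indicate more exploration of the left eye.
left_bias = region_share(samples, "left_eye") - region_share(samples, "right_eye")
print(round(left_bias, 2))  # 0.15 for this toy sample
```

Tracking such a bias index per age group is one simple way to express the developmental trend described above.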
... A method called the gaze-contingent paradigm (GC) was introduced to measure the actual visited regions on faces and more precisely control extrafoveal vision. The GC is similar to the MWT, except that the window is contingent on the viewer's eye-gaze instead of the computer mouse (Caldara et al., 2010;Kennedy and Adolphs, 2010;Kim et al., 2017a,b). The GC is similar to the combination of the MWT and eye-tracking methods. ...
Article
Full-text available
With regard to facial emotion recognition, previous studies found that specific facial regions were attended more in order to identify certain emotions. We investigated whether a preferential search for emotion-specific diagnostic regions could contribute toward the accurate recognition of facial emotions. Twenty-three neurotypical adults performed an emotion recognition task using six basic emotions: anger, disgust, fear, happiness, sadness, and surprise. The participants’ exploration patterns for the faces were measured using the Moving Window Technique (MWT). This technique presented a small window on a blurred face, and the participants explored the face stimuli through a mouse-controlled window in order to recognize the emotions on the face. Our results revealed that when the participants explored the diagnostic regions for each emotion more frequently, the correct recognition of the emotions occurred at a faster rate. To the best of our knowledge, this current study is the first to present evidence that an exploration of emotion-specific diagnostic regions can predict the reaction time of accurate emotion recognition among neurotypical adults. Such findings can be further applied in the evaluation and/or training (regarding emotion recognition functions) of both typically and atypically developing children with emotion recognition difficulties.
... Adolphs and colleagues report a case study of SM, a patient who has rare bilateral amygdala damage, which resulted in a specific deficit in recognizing fear. In a line of elegant studies, the authors show that this deficit is related to an early attentional bias, specifically to a lack of spontaneous fixations on the eyes during free viewing of faces (Adolphs et al., 2005; Gosselin et al., 2011; Kennedy & Adolphs, 2010). To control for such attentional biases, we used an eye-tracker to track eye-movements of our participants while presented with the EA stimuli. ...
Article
Failing to understand others accurately can be extremely costly. Unfortunately, events such as strokes can lead to a decline in emotional understanding. Such impairments have been documented in stroke patients and are widely hypothesized to be related to right-hemisphere lesions, as well as to the amygdala, and are thought to be driven in part by attentional biases, for example, less fixation on the eyes. Notably, most of the previous research relied on measurements of emotional understanding from simplified cues, such as facial expressions or prosody. We hypothesize that chronic damage to the left hemisphere could hinder empathic accuracy and emotion recognition in naturalistic social settings that require complex language comprehension, even after a patient regains core language capacities. To assess this notion, we use an empathic accuracy task and eye-tracking measurements with chronic stroke patients with either right (N = 13) or left (N = 11) hemispheric damage, together with age-matched controls (N = 15), to explore the patients' understanding of others' affect inferred from stimuli that separate audio and visual cues. While we find that patients with right-hemisphere lesions show a visual attention bias compared with the other two groups, we uncover a disadvantage for patients with left-hemisphere lesions in empathic accuracy, especially when only auditory cues are present. These results suggest that patients with left-hemisphere damage have long-lasting difficulties comprehending real-world complex emotional situations.
... This is likely associated with feedback signals used to adjust human behavior after acknowledging and integrating emotional information from a stimulus. The amygdala is involved in attending to affective stimuli 13,40–44 and in generating appropriate responses 20,45,46. In this study, we observed that the amygdala plays a significant role in modulating these neural pathways and exerts a dominant influence on the pons and other corticolimbic regions during sad information processing. ...
Article
Full-text available
Knowledge of the neural underpinnings of processing sad information and how it differs in people with depression could elucidate the neural mechanisms perpetuating sad mood in depression. Here, we conduct a 7 T fMRI study to delineate the neural correlates involved only in processing sad information, including pons, amygdala, and corticolimbic regions. We then conduct a 3 T fMRI study to examine the resting-state connectivity in another sample of people with and without depression. Only clinically depressed people demonstrate hyperactive amygdala–pons connectivity. Furthermore, this connectivity is related to depression symptom severity and is a significant indicator of depression. We speculate that visual sad information reinforces depressed mood and stimulates the pons, strengthening the amygdala–pons connectivity. The relationship between this connectivity and depressive symptom severity suggests that guiding one’s visual attention and processing of sad information may benefit mood regulation.
... Indeed, its scores have been associated with both attentional biases while inspecting angry facial expressions (Honk et al., 2001; Veenstra et al., 2016) and bilateral dorsal amygdala activity (Carré et al., 2010). In fact, amygdala lesions impair the automatic allocation of attention to aversive stimuli (Anderson & Phelps, 2001) and socially relevant stimuli (Kennedy & Adolphs, 2010; Piretti et al., 2020). ...
Article
Full-text available
The ability to experience, use, and eventually control anger is crucial to maintaining well-being and building healthy relationships. Despite its relevance, the neural mechanisms behind individual differences in experiencing and controlling anger are poorly understood. To elucidate these points, we employed an unsupervised machine learning approach based on Independent Component Analysis to test the hypothesis that specific functional and structural networks are associated with individual differences in trait anger and anger control. Structural and functional resting-state images of 71 subjects, as well as their scores on the State-Trait Anger Expression Inventory, entered the analyses. At a structural level, the concentration of gray matter in a network including ventromedial temporal areas, the posterior cingulate, the fusiform gyrus, and the cerebellum was associated with trait anger: the higher the concentration, the higher the proneness to experience anger in daily life, owing to a greater tendency to orient attention toward aversive events and to interpret them with greater hostility. At a functional level, the activity of the Default Mode Network (DMN) was associated with anger control: the higher the DMN temporal frequency, the stronger the exerted control over anger, extending previous evidence on the role of the DMN in regulating cognitive and emotional functions to the domain of anger. Taken together, these results show, for the first time, two specialized brain networks encoding individual differences in trait anger and anger control.
... Valuable insights have been acquired from the study of patients with Urbach-Wiethe disease, a rare developmental disorder characterized by selective bilateral damage to the amygdala. When one such patient was instructed to look at the eye region, her ability to identify fearful facial expressions was reinstated [182]. Thus, the amygdala may be important for directing gaze to informative regions of the face, such as the eyes, perhaps specifically for fear recognition. ...
Article
Full-text available
This review addresses functional interactions between the primate prefrontal cortex (PFC) and the amygdala, with emphasis on their contributions to behavior and cognition. The interplay between these two telencephalic structures contributes to adaptive behavior and to the evolutionary success of all primate species. In our species, dysfunction in this circuitry creates vulnerabilities to psychopathologies. Here, we describe amygdala–PFC contributions to behaviors that have direct relevance to Darwinian fitness: learned approach and avoidance, foraging, predator defense, and social signaling, which have in common the need for flexibility and sensitivity to specific and rapidly changing contexts. Examples include the prediction of positive outcomes, such as food availability, food desirability, and various social rewards, or of negative outcomes, such as threats of harm from predators or conspecifics. To promote fitness optimally, these stimulus–outcome associations need to be rapidly updated when an associative contingency changes or when the value of a predicted outcome changes. We review evidence from nonhuman primates implicating the PFC, the amygdala, and their functional interactions in these processes, with links to experimental work and clinical findings in humans where possible.
... Finally, because engagement mechanisms toward emotional features have been repeatedly associated with amygdala activity (Anderson and Phelps, 2001; Kennedy and Adolphs, 2010; Vuilleumier, 2005; Vuilleumier et al., 2004), and more particularly with increased search efficiency for negative information (Bach et al., 2015; Jacobs et al., 2012; Ohrmann et al., 2007), our findings could reflect amygdala alterations, which have previously been reported in AD (Klein-Koerkamp et al., 2014; Poulin et al., 2011). Now that the existence of emotional attention alterations in AD is established, it is crucial to conduct emotional attention paradigms in functional magnetic resonance imaging studies to characterize the anatomical and functional sources of the impairment in AD. ...
Article
Impairments of emotional processing have been reported in Alzheimer's disease (AD), consistent with the early amygdala atrophy seen in the pathology. In this study, we hypothesized that patients with AD might show a deficit of orientation toward emotional information under conditions of visual search. Eighteen patients with AD, 24 age-matched controls, and 35 young controls were eye-tracked while they performed a visual search task on a computer screen. The target was a vehicle with implicit (negative or neutral) emotional content, presented concurrently with one, three, or five non-vehicle neutral distractors. The task was to find the target and to report whether a break in the target frame was on the left or on the right side. Both control groups detected negative targets more efficiently than neutral targets, showing facilitated engagement toward negative information. In contrast, patients with AD showed no influence of emotional information on engagement delays. However, all groups reported the frame-break location more slowly for negative than for neutral targets (after accounting for the last fixation delay), showing more difficult disengagement from negative information. These findings are the first to highlight a selective lack of emotional influence on engagement processes in patients with AD. The involvement of amygdala alterations in this behavioral impairment remains to be investigated.
... Several human studies also imply an influence of the amygdala on motor and autonomic responses. Bilateral amygdala lesions led to impaired recognition of fearful faces owing to a failure to fixate the eyes (Adolphs et al., 2005; Kennedy and Adolphs, 2010). The amygdala is also responsible for defensive behaviors in response to acute threats (Klumpers et al., 2017), and increased connectivity between the amygdala and cortical regions is predictive of heart rate variability in patients suffering from generalized anxiety disorder (Makovac et al., 2016). ...
Article
Full-text available
The amygdala is a structure involved in emotions, fear, learning, and memory, and is highly interconnected with other brain regions, such as the motor cortex and the basal ganglia, which are often targets of treatments involving electrical stimulation. Deep brain stimulation of the basal ganglia is successfully used to treat movement disorders but can be accompanied by non-motor side effects. The origin of these non-motor side effects is not yet fully understood but might involve altered oscillatory communication between specific motor areas and the amygdala. Oscillations in various frequency bands have been detected in the amygdala during cognitive and emotional tasks, and these can couple with oscillations in cortical regions or the hippocampus. However, data on oscillatory coupling between the amygdala and motor areas are still lacking. This review provides a summary of oscillation frequencies measured in the amygdala and their possible functional relevance in different species, followed by evidence for connectivity between the amygdala and motor areas, such as the basal ganglia and the motor cortex. We hypothesize that the amygdala could communicate with motor areas through coherence in low-frequency bands in the theta-alpha range. Furthermore, we discuss a potential role of the amygdala in therapeutic approaches based on electrical stimulation.
... Interestingly, studies of Patient SM, an informative neurological case in which bilateral calcification of the amygdala occurred as a result of the rare genetic condition Urbach-Wiethe disease, have noted her inability to spontaneously direct her visual attention to the eyes of others. These data suggest that this may be the result of impaired amygdala-dependent bottom-up control of attentional processes, which alters the balance of attention largely in favor of top-down processes (Adolphs et al., 2005; Kennedy and Adolphs, 2010). Other studies in amygdala-damaged humans note the dynamic interaction of top-down and bottom-up processes in emotional processing and suggest that subjective ratings of emotional events via top-down processing impact amygdala activity (Hsu and Pessoa, 2007; Pessoa, 2010; Taylor et al., 2003). ...
Article
Threat processing is central to understanding debilitating fear- and trauma-related disorders such as posttraumatic stress disorder (PTSD). Progress has been made in understanding the neural circuits underlying the “engram” of threat or fear memory formation that complements a decades-old appreciation of the neurobiology of fear and threat involving hub structures such as the amygdala. In this review, we examine key recent findings, as well as integrate the importance of hormonal and physiological approaches, to provide a broader perspective of how bodily systems engaged in threat responses may interact with amygdala-based circuits in the encoding and updating of threat-related memory. Understanding how trauma-related memories are encoded and updated throughout the brain and the body will ultimately lead to novel biologically-driven approaches for treatment and prevention. Threat processing is central to understanding posttraumatic stress disorder and other fear- and stress-related disorders. Maddox et al. describe recent progress in this area, understanding the neural circuits underlying trauma-related memories encoded throughout the brain and the body.
... When facing other humans, they rarely look at their eyes, unlike control subjects. This appears to be dependent on social context (in which fearful emotions might be evoked), because such saccade suppression can be eliminated when only a small region of the face is made visible [62]. Similar changes in social-emotional saccades occur in human subjects with amygdala dysfunctions (e.g., autism) [63,64]. ...
Article
Full-text available
Author summary The amygdala is known to control passive fear responses (e.g., freezing), but it is unclear if it also contributes to active behaviors. To reach certain goals, we (humans and animals) often need to go through fearful environments. We hypothesized that the amygdala contributes to such an active behavior and devised a new foraging task for macaque monkeys in which various emotional contexts changed across many environments. This “exciting” task provoked extremely fast learning and high-capacity memory of objects and environments, and thereby caused extremely fast goal-directed behaviors. We found that the goal-directed behavior was affected by the emotional context in two dimensions (dangerous–safe and rich–poor) separately from the object values. Then, many neurons in the amygdala responded to the environments before any object appeared and did so selectively, depending on the emotional context of the environment. The neuronal activity was tightly correlated with the reaction time of goal-directed behavior across the contexts: faster behavior in dangerous or rich context. These results suggest that the amygdala facilitates goal-directed behavior by focusing on emotional contexts. Such a function is also important for emotional–social behavior and its disorder, including averted eye gaze in autism.
... This is particularly important given that individuals can attend to areas that they are not directly fixating (Posner, 1980). Some work has used this method to demonstrate that simultaneous availability of information from the entire face is crucial for efficient (holistic) face recognition (Maw & Pomplun, 2004; Van Belle, De Graef, Verfaillie, Rossion, & Lefèvre, 2010), that there are cross-cultural universals in face processing (Caldara, Zhou, & Miellet, 2010), and even that fixations to the eyes can be restored in a patient with amygdala damage (Kennedy & Adolphs, 2010). Still other research has used moving windows, for example, where a mouse (held by the participant) controls the perceptual input rather than the participant's gaze. ...
Article
Full-text available
Although some facial expressions provide clear information about people's emotions and intentions (happy, angry), others (surprise) are ambiguous because they can signal both positive (e.g., surprise party) and negative outcomes (e.g., witnessing an accident). Without a clarifying context, surprise is interpreted as positive by some and negative by others, and this valence bias is stable across time. Compared with fearful expressions, which are consistently rated as negative, surprise shares similar morphological features (e.g., widened eyes), primarily in the upper part of the face. Recently, we demonstrated that the valence bias was associated with a specific pattern of eye movements (a positive bias associated with faster fixation to the lower part of the face). In this follow-up, we identified the two participants from our previous study who had the most positive and most negative valence bias. We used their eye movements to create a moving window such that new participants viewed faces through the eyes of one of our previous participants (subjects saw only the areas of the face that were directly fixated by the original participants, in the exact order they were fixated; i.e., a Simulated Eye-movement Experience). The input provided by these windows modulated the valence ratings of surprise, but not fear, faces. These findings suggest there are meaningful individual differences in how people process faces, and that these differences impact our emotional perceptions. Furthermore, this study is unique in its approach to examining individual differences in emotion by creating a new methodology adapted from those used primarily in the vision/attention domain.
... Interestingly, the amygdala, among its other functions, forms part of a system that automatically guides visual attention to the eyes whenever we encounter a facial expression. Patients with selective damage to the amygdala direct their visual attention elsewhere, such as to the nose, and struggle more to recognize emotions that are characterized primarily by changes in the top half of the face, such as fear (Kennedy and Adolphs 2010). We could therefore hypothesize that, from an evolutionary point of view, a system was selected that makes us use the "window of the gaze" as the most important source of information about others' states and, probably, their more or less friendly intentions. ...
Chapter
Full-text available
Empathy plays a key role in the patient-clinician relationship. Here we address this issue from a neuroscientific perspective, as neuroscience research is shedding much light on the mechanisms underlying empathy. In particular, we focus on the relationship between clinicians and suicidal individuals, a difficult category of patients that puts the capacity for emotional and empathic regulation to the test. We therefore provide the reader with an overview of current neuroscientific knowledge about empathy, aiming to offer an interpretative key and foster new intuitions that promote better comprehension, based on a scientific use of empathy in the patient-clinician relationship. We then propose the concept of “empathic disconnection”, referring to those situations in which the clinician, automatically and unconsciously, puts him/herself in the position of not taking any advantage of the empathic relationship with the patient. We propose the concept of the “empathic moment” as a communicative strategy whose goal is to intentionally use empathic mechanisms to gather information directed at identifying the inner state of the patient. We finally suggest the use of vitality forms as a relevant element for the cognitive analysis of the patient’s inner states. We conclude with some practical, applied considerations based on what is discussed.
... This assumption is based on a series of physiological (LeDoux, 2000), patient (Adolphs, Tranel, Damasio, & Damasio, 1994; de Gelder, Vroomen, Pourtois, & Weiskrantz, 1999), and neuroimaging studies (Morris, Ohman, & Dolan, 1999; for a review, see Vuilleumier & Pourtois, 2007), which have suggested that unconscious perception of emotional faces can bypass sensory cortices (but see Pessoa & Adolphs, 2010). The amygdala, a structure in the medial temporal lobe, is also involved in the detection of eye gaze and eye contact (Kawashima et al., 1999), and it may guide attention toward the eyes of perceived faces (Adolphs et al., 2005; Kennedy & Adolphs, 2010). This is relevant because the eye region carries important emotional cues (Baron-Cohen, Wheelwright, Hill, Raste, & Plumb, 2001), and because as we proposed earlier, eye contact appears to automatically trigger facial mimicry (Niedenthal et al., 2010; Schrammel, Pannasch, Graupner, Mojzisch, & Velichkovsky, 2009). ...
... Remarkably, the failure in top-down guidance of attention seems to be the basis of the impaired recognition of fearful faces that is the hallmark deficit of amygdala-damaged patients. Studies of SM, the single most studied UWD case (2008; 2005), suggest that amygdala-damaged patients are not impaired in the perception of fear per se, but fail to properly attend to the parts of face images that are relevant for correct expression recognition (Kennedy & Adolphs, 2011). Intriguingly, SM's fear recognition deficit was corrected after an explicit instruction to attend to the eye region of faces (Adolphs et al., 2005). ...
Article
Full-text available
The amygdala is believed to play a major role in orienting attention toward threat-related stimuli. However, behavioral studies of amygdala-damaged patients have given inconsistent results, variously reporting decreased, preserved, or increased attention toward threat. Here we aimed to characterize the impact of developmental amygdala damage on emotion perception and on the nature and time-course of spatial attentional bias toward fearful faces. We investigated SF, a 14-year-old with selective bilateral amygdala damage due to Urbach-Wiethe disease, and ten healthy controls. Participants completed a fear sensitivity questionnaire, a facial expression classification task, and a dot-probe task with fearful or neutral faces for spatial cueing. Three cue durations were used to assess the time-course of attentional bias. SF expressed significantly lower fear sensitivity and showed a selective impairment in classifying fearful facial expressions. Despite this impairment in fear recognition, very brief (100 ms) fearful cues could orient SF's spatial attention. In healthy controls, the attentional bias emerged later and persisted longer. SF's attentional bias was due solely to facilitated engagement to fear, while controls showed the typical phenomenon of difficulty in disengaging from fear. Our study is the first to demonstrate the separable effects of amygdala damage on engagement and disengagement of spatial attention. The findings indicate that multiple mechanisms contribute to biasing attention toward fear, and these vary in their timing and dependence on amygdala integrity. It seems that the amygdala is not essential for rapid attention to emotion, but probably has a role in the assessment of biological relevance.
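The spatial-cueing logic of the dot-probe task reduces to a simple bias score. This is a generic sketch of the standard computation, not the authors' analysis code; the trial lists and the sign convention are assumptions:

```python
def dot_probe_bias(rt_congruent_ms, rt_incongruent_ms):
    """Attentional bias score in ms: mean RT on incongruent trials
    (probe replaces the neutral face) minus mean RT on congruent
    trials (probe replaces the fearful face). Positive values
    indicate attention drawn toward the fearful face."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(rt_incongruent_ms) - mean(rt_congruent_ms)

# Hypothetical reaction times (ms) for one cue duration
congruent = [310, 295, 305, 290]    # probe at fearful-face location
incongruent = [330, 340, 325, 345]  # probe at neutral-face location
bias = dot_probe_bias(congruent, incongruent)  # 335.0 - 300.0 = 35.0
```

Computing this score separately per cue duration is what allows the time-course of the bias (early engagement vs. later disengagement) to be characterized.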
... An individual with bilateral amygdala lesions failed to report experiences of fear when brought into contact with snakes and spiders, or when startled (Feinstein, Adolphs, Damasio & Tranel, 2011). Individuals with amygdala lesions do not automatically allocate attention to aversive stimuli or to socially relevant stimuli (Kennedy & Adolphs, 2010), as individuals with an intact amygdala do. The amygdala's role in detecting motivationally salient stimuli could also explain why greater amygdala activity is observed in situations that do not involve the experience of fear, such as when stimuli are experienced as valuable (Jenison, Rangel, Oya, Kawasaki & Howard, 2011). ...
... Behavioral studies in patients with amygdala damage have also reported impaired performance in visual and attentional tasks with emotional stimuli. For example, in patient PS, who has selective destruction of the amygdala (Kennedy & Adolphs, 2010), a deficit in recognizing fearful expressions in faces was found to be primarily caused by a lack of attention to the eye region (which contains the most distinctive information), suggesting that amygdala damage did not abolish an internal representation of the fear expression but rather disrupted oculomotor exploration of facial features. Furthermore, in an attentional blink task with a rapid succession of written words, no emotional advantage was observed in patients suffering from left amygdala damage after temporal lobectomy (Anderson & Phelps, 2001), even though these patients could still recognize the affective semantic meaning of words. ...
... To limit the number of statistical comparisons, we report eye-movement data only for ... Recently, it has been shown that the lack of visual attention to the eye region of faces after complete focal bilateral amygdala damage was mainly observed in the first fixations on a newly presented static face (Kennedy & Adolphs, 2010). Therefore, we analyzed the first three fixations separately. ...
Thesis
Full-text available
The terms approach and avoidance are used to describe appetitive motivation and fear of punishment. In a social context fear is indeed linked to avoidance, but approach motivation is also expressed with anger and aggression as means to achieve goals at the expense, or through social correction, of others. Therefore, in the social competition of most mammalian species approach-avoidance is strongly linked to dominance-submissiveness, whereby the motivation to dominate others reflects appetitive motivation as well as socially aggressive tendencies. The steroid testosterone is a social hormone that is heavily involved in approach-avoidance and dominance-submissiveness, and is in many species associated with reduced fear and increased reactive social aggression. Recent evidence indicates, however, that testosterone can also promote fairness in humans. In this thesis we first describe a neural framework rooted in animal research and based on a study in five human subjects with selective brain damage in the basolateral amygdala, in which we show that this structure is heavily involved in the inhibition of fear-vigilance. Next, we use several newly developed interactive eye-tracking paradigms to show that anger is indeed related to reward sensitivity, and anxiety to threat avoidance. Furthermore, when confronted with angry facial expressions, submissiveness predicts rapid gaze aversion from eye-contact, and dominance motives as well as testosterone administration lead to reflexive preservation of eye-contact. In sum, dominance-submissiveness seems to involve reflexive mechanisms, and testosterone induces reactive dominance behavior. In the next part of this thesis we show that testosterone also influences cognitive behavior and decision-making in the absence of a direct status-threat. 
Earlier research already showed that testosterone can reduce cognitive empathy, and here we show that testosterone adaptively reduces trust but can also increase social cooperative behavior. Translating evidence from rodent and primate research, we therefore argue that reactive-reflexive dominance and deliberate social cooperation are both approach behaviors by which testosterone promotes survival and reproduction through an increase in social status. First, testosterone inhibits basal fear responsivity at the level of the basolateral amygdala and hypothalamus, providing a general fearlessness that facilitates approach-oriented behavior. Second, when social status is directly challenged, testosterone promotes reactive aggression and reflexive dominance by upregulating vasopressin gene expression in the central-medial amygdala. Third, testosterone reduces cortical control over the amygdala, resulting in the general social vigilance that underlies conscious decision making beneficial to social status. In sum, testosterone boosts unconscious-reflexive dominance but can also increase cooperative behaviors depending on the social context. However, both at the level of this reflexive behavior and in conscious decision making, testosterone promotes approach-oriented behavior that helps to defend and increase social status.
... Previous studies have similarly manipulated the amount of visual information presented to a participant via gaze-contingent or mouse-contingent windowed viewing paradigms, to study how people utilize foveal information in isolation from the periphery [9][10][11]. Yet, these studies differed from the current one in that viewers still maintained control over the visual information they wished to foveate. ...
Article
Full-text available
We present a novel "Gaze-Replay" paradigm that allows the experimenter to directly test how particular patterns of visual input, generated from people's actual gaze patterns, influence the interpretation of the visual scene. Although this paradigm can potentially be applied across domains, here we applied it specifically to social comprehension. Participants viewed complex, dynamic scenes through a small window displaying only the foveal gaze pattern of a gaze "donor". This was intended to simulate the donor's visual selection, such that a participant could effectively view scenes "through the eyes" of another person. Throughout the presentation of scenes presented in this manner, participants completed a social comprehension task assessing their abilities to recognize complex emotions. The primary aim of the study was to assess the viability of this novel approach by examining whether these Gaze-Replay windowed stimuli contain sufficient and meaningful social information for the viewer to complete this social perceptual and cognitive task. The results of the study suggested this to be the case: participants performed better in the Gaze-Replay condition than in a temporally disrupted control condition, and better than when they were provided with no visual input. This approach has great future potential for the exploration of experimental questions aiming to unpack the relationship between visual selection, perception, and cognition.
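The Gaze-Replay presentation and its temporally disrupted control can be sketched as follows. This is an illustrative reconstruction under stated assumptions (circular aperture, blanked background, fixed shuffle seed), not the authors' stimulus code:

```python
import random
import numpy as np

def windowed_frame(image, gaze_xy, radius):
    """One frame of a Gaze-Replay display: only a small aperture
    around the donor's recorded gaze position is visible."""
    h, w = image.shape[:2]
    ys, xs = np.ogrid[:h, :w]
    gx, gy = gaze_xy
    inside = (xs - gx) ** 2 + (ys - gy) ** 2 <= radius ** 2
    out = np.zeros_like(image)
    out[inside] = image[inside]
    return out

def replay(image, fixations, radius, disrupted=False, seed=0):
    """Render the donor's fixation sequence as windowed frames.
    With disrupted=True the temporal order is shuffled, mimicking
    the temporally disrupted control condition."""
    order = list(fixations)
    if disrupted:
        random.Random(seed).shuffle(order)
    return [windowed_frame(image, xy, radius) for xy in order]

# Hypothetical donor fixations on a 60x60 scene
scene = np.ones((60, 60))
donor_fixations = [(10, 10), (30, 30), (50, 50)]
frames = replay(scene, donor_fixations, radius=5)
```

Shuffling preserves the low-level visual content of the frames while destroying their temporal structure, which is what makes the disrupted sequence a suitable control for the intact replay.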
... However, relevant stimuli such as faces are preferentially processed in central vision [2] [3]. The amygdala seeks information from the eye region in human faces (e.g., [4] [5] [6] [7] [8]), receiving direct input from ventral areas (e.g., [9] [10] [11]) which are known to be biased to foveal (central) input [12]. Clinical evidence also shows that foveal vision deficits are reflected in face-selective regions [13]. ...
Article
Full-text available
Introduction: Visual processing of ecologically relevant stimuli involves a central bias for stimuli demanding detailed processing (e.g., faces), whereas peripheral object processing is based on coarse identification. Fast detection of animal shapes holding significant phylogenetic value, such as snakes, may benefit from peripheral vision. The amygdala, together with the pulvinar and the superior colliculus, is implicated in an ongoing debate regarding their roles in automatic and deliberate spatial processing of threat signals. Methods: Here we tested twenty healthy participants in an fMRI task and investigated the role of spatial demands (the main effect of central vs. peripheral vision) in the processing of fear-relevant ecological features. We controlled for stimulus dependence using true or false snakes (snake shapes or snake faces) and for task constraints (implicit or explicit). The main idea justifying this double task is that the amygdala and superior colliculus are involved in both automatic and controlled processes. Moreover, the explicit/implicit instruction in the task with respect to emotion is not necessarily equivalent to explicit vs. implicit in the sense of endogenous vs. exogenous attention, or of controlled vs. automatic processes. Results: We found that stimulus-driven processing led to increased amygdala responses specifically to true snake shapes presented in the centre or in the peripheral left hemifield (right hemisphere). Importantly, the superior colliculus showed significantly biased and explicit central responses to snake-related stimuli. Moreover, the pulvinar, which also contains foveal representations, showed strong central responses, extending the results of a recent single-cell pulvinar study in monkeys. 
Similar hemispheric specialization was found across structures: increased amygdala responses occurred to true snake shapes presented to the right hemisphere, with this pattern being closely followed by the superior colliculus and the pulvinar. Conclusion: These results show that subcortical structures containing foveal representations such as the amygdala, pulvinar and superior colliculus play distinct roles in the central and peripheral processing of snake shapes. Our findings suggest multiple phylogenetic fingerprints in the responses of subcortical structures to fear-relevant stimuli.
... In fact, some have shown that specifically the eye region of the face can evoke similar effects (Whalen et al., 2004; Gamer and Büchel, 2009). Accordingly, individuals with bilateral amygdala lesions have difficulty spontaneously using the eye region of the face to identify fearful facial expressions (Kennedy and Adolphs, 2010). In addition to faces, other types of stimuli have been shown to receive preferential processing by the amygdala. ...
Article
Full-text available
Although the amygdala is often directly linked with fear and emotion, amygdala neurons are activated by a wide variety of emotional and non-emotional stimuli. Different subregions within the amygdala may be engaged preferentially by different aspects of emotional and non-emotional tasks. To test this hypothesis we measured and compared the effects of novelty and fear on amygdala activity. We used high-resolution BOLD imaging and streamline tractography to subdivide the amygdala into three distinct functional subunits. We identified a laterobasal subregion connected with the visual cortex that responds generally to visual stimuli, a non-projecting region that responds to salient visual stimuli, and a centromedial subregion connected with the diencephalon that responds only when a visual stimulus predicts an aversive outcome. We provide anatomical and functional support for a model of amygdala function where information enters through the laterobasal subregion, is processed by intrinsic circuits in the interspersed tissue, and is then passed to the centromedial subregion, where activation leads to behavioral output.
... Dementia, Parkinson disease, progressive supranuclear palsy, schizophrenia, amyotrophic lateral sclerosis, autism, and fragile X syndrome are among the numerous diseases with characteristic anomalies that are detectable using eye-movement tracking technology.4,20,21,23,24,27,32 Conclusions: In this paper we present proof-of-concept data supporting a method for tracking eye movements, while a subject watches television or a video, that reveals weakness of the abducent and oculomotor nerves. This represents the first report in the literature of the use of eye tracking to assess physiological functioning of the CNS. ...
Article
Full-text available
Object: Automated eye movement tracking may provide clues to nervous system function at many levels. Spatial calibration of the eye tracking device requires the subject to have relatively intact ocular motility that implies function of cranial nerves (CNs) III (oculomotor), IV (trochlear), and VI (abducent) and their associated nuclei, along with the multiple regions of the brain imparting cognition and volition. The authors have developed a technique for eye tracking that uses temporal rather than spatial calibration, enabling detection of impaired ability to move the pupil relative to normal (neurologically healthy) control volunteers. This work was performed to demonstrate that this technique may detect CN palsies related to brain compression and to provide insight into how the technique may be of value for evaluating neuropathological conditions associated with CN palsy, such as hydrocephalus or acute mass effect. Methods: The authors recorded subjects' eye movements by using an Eyelink 1000 eye tracker sampling at 500 Hz over 200 seconds while the subject viewed a music video playing inside an aperture on a computer monitor. The aperture moved in a rectangular pattern over a fixed time period. This technique was used to assess ocular motility in 157 neurologically healthy control subjects and 12 patients with either clinical CN III or VI palsy confirmed by neuro-ophthalmological examination, or surgically treatable pathological conditions potentially impacting these nerves. The authors compared the ratio of vertical to horizontal eye movement (height/width defined as aspect ratio) in normal and test subjects. Results: In 157 normal controls, the aspect ratio (height/width) for the left eye had a mean value ± SD of 1.0117 ± 0.0706. For the right eye, the aspect ratio had a mean of 1.0077 ± 0.0679 in these 157 subjects. There was no difference between sexes or ages. 
A patient with known CN VI palsy had a significantly increased aspect ratio (1.39), whereas 2 patients with known CN III palsy had significantly decreased ratios of 0.19 and 0.06, respectively. Three patients with surgically treatable pathological conditions impacting CN VI, such as infratentorial mass effect or hydrocephalus, had significantly increased ratios (1.84, 1.44, and 1.34, respectively) relative to normal controls, and 6 patients with supratentorial mass effect had significantly decreased ratios (0.27, 0.53, 0.62, 0.45, 0.49, and 0.41, respectively). These alterations in eye tracking all reverted to normal ranges after surgical treatment of underlying pathological conditions in these 9 neurosurgical cases. Conclusions: This proof of concept series of cases suggests that the use of eye tracking to detect CN palsy while the patient watches television or its equivalent represents a new capacity for this technology. It may provide a new tool for the assessment of multiple CNS functions that can potentially be useful in the assessment of awake patients with elevated intracranial pressure from hydrocephalus or trauma.
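The aspect-ratio comparison described in this abstract reduces to a simple computation. Below is a minimal Python sketch, assuming gaze samples arrive as lists of x and y pupil positions; the normative mean and SD are the left-eye values reported above, while the function names and the 2-SD cutoff are illustrative assumptions, not taken from the paper.

```python
def aspect_ratio(xs, ys):
    """Height/width of the region swept by the tracked pupil position."""
    return (max(ys) - min(ys)) / (max(xs) - min(xs))

# Normative left-eye values reported above (n = 157 controls).
NORM_MEAN, NORM_SD = 1.0117, 0.0706

def is_abnormal(ratio, z_cut=2.0):
    """Flag ratios more than z_cut SDs from the normative mean.

    z_cut = 2.0 is an illustrative threshold, not from the paper.
    """
    return abs(ratio - NORM_MEAN) / NORM_SD > z_cut
```

For example, the CN VI palsy patient's reported ratio of 1.39 lies roughly 5 SDs above the normative mean and would be flagged, whereas a ratio of 1.01 would not.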
... Based on studies in amygdala-damaged patients, it is clear that the amygdala plays a role in perceptual and motivational aspects of social behavior. In line with the view that the amygdala is important for directing attentional resources to relevant social stimuli, amygdala-damaged patients tend to make less eye-contact (Spezio et al., 2007), are insensitive to personal boundaries (Kennedy, Glascher, Tyszka, & Adolphs, 2009), and have difficulty spontaneously orienting to (Adolphs et al., 2005) and interpreting signals from the eye region of others' faces (Kennedy & Adolphs, 2010;Young et al., 1995). Furthermore, such patients do not exhibit the normal enhancement of fMRI signal in the fusiform gyrus and superior temporal sulcus to facial expressions (Vuilleumier et al., 2004). ...
Article
A growing body of evidence suggests that the amygdala is central to handling the demands of complex social life in primates. In this paper, we synthesize extant anatomical and functional data from rodents, monkeys, and humans to describe the topography of three partially distinct large-scale brain networks anchored in the amygdala that each support unique functions for effectively managing social interactions and maintaining social relationships. These findings provide a powerful componential framework for parsing social behavior into partially distinct neural underpinnings that differ among healthy people and disintegrate or fail to develop in neuropsychiatric populations marked by social impairment, such as autism, antisocial personality disorder, and frontotemporal dementia.
... These distinctions may be clinically relevant: bottom-up and top-down abnormalities in gaze may be dissociable in patient groups that share abnormal fixation patterns to faces (Birmingham, Cerf, & Adolphs, 2011). Kennedy and Adolphs (2010) have suggested that bilateral amygdala damage impairs bottom-up effects of face stimuli on gaze, whereas top-down modulation of gaze might be impaired in autism spectrum disorders (Neumann, Spezio, Piven, & Adolphs, 2006). ...
... The causal mechanism operating between amygdala activation and eye gaze movements remains unclear. The first neural response could precede attention deployment, guiding the eyes to explore the detected stimuli (Kennedy and Adolphs, 2011), but time spent gazing over the stimuli could equally increase the limbic neural signal (Pessoa and Adolphs, 2010). SSRI-driven modifications could occur at one or both stages of this process. ...
Article
Full-text available
Background: Anxiety and depression are associated with altered ocular exploration of facial stimuli, which could play a role in the misinterpretation of ambiguous emotional stimuli. However, it is unknown whether a similar pattern is seen in individuals at risk for psychopathology, and whether this can be modified by pharmacological interventions used in these disorders. Methods: In Study 1, eye gaze movement during face discrimination was compared in volunteers with high vs. low neuroticism scores on the Eysenck Personality Questionnaire. Facial stimuli displayed either a neutral, happy or fearful expression. In Study 2, volunteers with high neuroticism were randomised in a double-blind design to receive the selective serotonin reuptake inhibitor citalopram (20 mg) or placebo for 7 days. On the last day of treatment, eye gaze movement during face presentation and the recognition of different emotional expressions were assessed. Results: In Study 1, highly neurotic volunteers showed reduced eye gaze towards the eyes vs. the mouth region of the face compared to low neurotic volunteers. In Study 2, citalopram increased gaze maintenance over the face stimuli compared to placebo and enhanced recognition of positive vs. negative facial expressions. Longer ocular exploration of happy faces correlated positively with recognition of positive emotions. Conclusions: Individuals at risk for psychopathology presented an avoidant pattern of ocular exploration of faces. Short-term SSRI administration reversed this bias before any mood or anxiety changes. This treatment effect may improve the capacity to scan social stimuli and contribute to the remediation of clinical symptoms related to interpersonal difficulties. Neuropsychopharmacology accepted article preview online, 18 July 2014; doi:10.1038/npp.2014.159.
... The best characterized neural systems are those involved in orienting attention to faces. Together, the superior colliculus, pulvinar and amygdala use coarse spatial frequency information to orient attention to faces or eyes (Kennedy & Adolphs, 2010;Nguyen et al., 2013). Difficulties with orienting towards and engaging with people and their faces and voices have been frequently described in older children with autism (Chawarska, Macari, & Shic, 2012;Kikuchi, Senju, Tojo, & Osanai, 2009). ...
Article
Full-text available
A fast-growing field, the study of infants at risk because of having an older sibling with autism (i.e. infant sibs) aims to identify the earliest signs of this disorder, which would allow for earlier diagnosis and intervention. More importantly, we argue, these studies offer the opportunity to validate existing neuro-developmental models of autism against experimental evidence. Although autism is mainly seen as a disorder of social interaction and communication, emerging early markers do not exclusively reflect impairments of the "social brain". Evidence for atypical development of sensory and attentional systems highlights the need to move away from localized deficits to models suggesting brain-wide involvement in autism pathology. We discuss the implications infant sibs findings have for future work into the biology of autism and the development of interventions. © 2014 The Authors. Published by Elsevier Inc. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/3.0/).
... The superior temporal sulcus (STS) (Kamphuis et al., 2009; Laube et al., 2011) and amygdala (Emery, 2000; Tazumi et al., 2010; Gordon et al., 2013), in monkeys and humans, respond to the sight of other individuals orienting in a particular direction. Further, impaired amygdala function in monkeys and humans disrupts gaze-following behavior (Kennedy and Adolphs, 2010; Roy et al., 2012). In macaques, the activity of neurons in the lateral intraparietal area, a brain region implicated in attention and orienting, is modulated by the gaze of others, a potential mechanism for directing attention to objects and locations attended by them. ...
Article
Full-text available
Decisions made by individuals can be influenced by what others think and do. Social learning includes a wide array of behaviors such as imitation, observational learning of novel foraging techniques, peer or parental influences on individual preferences, as well as outright teaching. These processes are believed to underlie an important part of cultural variation among human populations and may also explain intraspecific variation in behavior between geographically distinct populations of animals. Recent neurobiological studies have begun to uncover the neural basis of social learning. Here we review experimental evidence from the past few decades showing that social learning is a widespread set of skills present in multiple animal species. In mammals, the temporoparietal junction, the dorsomedial, and dorsolateral prefrontal cortex, as well as the anterior cingulate gyrus, appear to play critical roles in social learning. Birds, fish, and insects also learn from others, but the underlying neural mechanisms remain poorly understood. We discuss the evolutionary implications of these findings and highlight the importance of emerging animal models that permit precise modification of neural circuit function for elucidating the neural basis of social learning.
... Ventromedial PFC and amygdala share robust bidirectional projections with each other; both regions receive projections from high-level visual areas in temporal cortex; and both regions interconnect densely with areas of posterior lateral orbital cortex, which in turn project to the lateral frontal eye fields that control eye movement (Barbas, 2000;Cavada et al., 2000). This proposed early detection/attention-allocation function is consistent with previous neuropsychological data showing that bilateral amygdala damage specifically impairs visual attention to the eye region of faces for the first fixation following stimulus onset (Kennedy and Adolphs, 2010), as well as our own follow-up analyses with ventromedial PFC lesion patients demonstrating the most pronounced deficit of eye fixations during the first second of face viewing. To further explore the putative relationship between ventromedial PFC and amygdala for this function, an important follow-up study in ventromedial PFC patients will be to determine whether some type of exogenous direction of attention to the eye region of the face rescues the observed deficits in emotion recognition, as was the case for a patient with bilateral amygdala lesion (Adolphs et al., 2005). ...
Article
Full-text available
The ventromedial prefrontal cortex is known to play a crucial role in regulating human social and emotional behaviour, yet the precise mechanisms by which it subserves this broad function remain unclear. Whereas previous neuropsychological studies have largely focused on the role of the ventromedial prefrontal cortex in higher-order deliberative processes related to valuation and decision-making, here we test whether ventromedial prefrontal cortex may also be critical for more basic aspects of orienting attention to socially and emotionally meaningful stimuli. Using eye tracking during a test of facial emotion recognition in a sample of lesion patients, we show that bilateral ventromedial prefrontal cortex damage impairs visual attention to the eye regions of faces, particularly for fearful faces. This finding demonstrates a heretofore unrecognized function of the ventromedial prefrontal cortex: the basic attentional process of controlling eye movements to faces expressing emotion.
... Characterization of the scanning patterns indicated that rhesus monkeys attended to the eye regions of the stimulus animals as they evaluated the dynamic, bimodal vocalizations. This interest in the eye region adds to a number of previous studies reporting that both humans and monkeys preferentially investigate the eye regions of conspecifics presented either in static images [24][25][26][27][28][29][30][31][32][33][34] or dynamic, naturalistic videos [18,[35][36][37]. Both humans and rhesus monkeys broadcast important socioemotional information through their eyes (e.g., their emotional or mental state, social intentions, or focus of their attention), thus attending to the eye region provides the observer with a wealth of socially relevant information [38]. ...
Article
Full-text available
Crossmodal integration of audio/visual information is vital for recognition, interpretation and appropriate reaction to social signals. Here we examined how rhesus macaques process bimodal species-specific vocalizations by eye tracking, using an unconstrained preferential looking paradigm. Six adult rhesus monkeys (3M, 3F) were presented two side-by-side videos of unknown male conspecifics emitting different vocalizations, accompanied by the audio signal corresponding to one of the videos. The percentage of time animals looked to each video was used to assess crossmodal integration ability and the percentages of time spent looking at each of the six a priori ROIs (eyes, mouth, and rest of each video) were used to characterize scanning patterns. Animals looked more to the congruent video, confirming reports that rhesus monkeys spontaneously integrate conspecific vocalizations. Scanning patterns showed that monkeys preferentially attended to the eyes and mouth of the stimuli, with subtle differences between males and females such that females showed a tendency to differentiate the eye and mouth regions more than males. These results were similar to studies in humans indicating that when asked to assess emotion-related aspects of visual speech, people preferentially attend to the eyes. Thus, the tendency for female monkeys to show a greater differentiation between the eye and mouth regions than males may indicate that female monkeys were slightly more sensitive to the socio-emotional content of complex signals than male monkeys. The current results emphasize the importance of considering both the sex of the observer and individual variability in passive viewing behavior in nonhuman primate research.
... Urbach-Wiethe disease is primarily a dermatological condition which, however, in most cases leads in adulthood to amygdala calcification. In the United States of America, patient SM in particular has been studied extensively with respect to deficits resulting from this disease [299][300][301][302][303][304][305][306]. In Europe and South Africa, larger numbers of patients with Urbach-Wiethe disease have been studied in recent years [18, 20, 282, 285-287, 307, 308]. ...
Article
Full-text available
Relations between memory and the self are framed from a number of perspectives: developmental aspects, forms of memory, interrelations between memory and the brain, and interactions between the environment and memory. The self is seen as divisible into more rudimentary and more advanced aspects. Special emphasis is laid on memory systems and, within them, on episodic autobiographical memory, which is seen as a purely human form of memory that depends on proper ontogenetic development and is shaped by the social environment, including culture. Self and episodic autobiographical memory are seen as interlocked in their development and later manifestation. Aside from content-based aspects of memory, time-based aspects are seen along two lines: the division between short-term and long-term memory, and between anterograde (future-oriented) and retrograde (past-oriented) memory. The state dependency of episodic autobiographical memory is stressed, and its implications (for example, with respect to the occurrence of false memories and forensic aspects) are outlined. At the brain level, structural networks for encoding, consolidation, storage, and retrieval are discussed by referring both to patient data and to data obtained from normal participants with functional brain imaging methods. It is elaborated why descriptions from patients with functional or dissociative amnesia are particularly apt to demonstrate the facets in which memory, self, and personal temporality are interwoven.
Article
The amygdala is a central focus in the study of the psychological mechanisms of emotion and plays an important role in fear. This study discussed the effect of the amygdala on the emotion of fear from two aspects: fear recognition and fear experience. Conclusions from recent and earlier studies are reviewed. The amygdala has no specific effect on fear recognition but engages higher cognitive function. In addition, the amygdala plays a substantial role in fear experience, through the acquisition, retention, and expression of fear. Future research should focus on the following areas: the effect of the amygdala on fear recognition and its role in connecting cognition and experience, the effect of brain regions surrounding the amygdala on fear, and the effect of brain regions other than the amygdala involved in fear responses triggered by interoceptive stimuli.
Article
The neural basis of empathy and prosociality has received much interest over the past decades. Neuroimaging studies localized a network of brain regions with activity that correlates with empathy. Here, we review how the emergence of rodent and nonhuman primate models of empathy-related phenomena supplements human lesion and neuromodulation studies providing evidence that activity in several nodes is necessary for these phenomena to occur. We review proof that (i) affective states triggered by the emotions of others, (ii) motivations to act in ways that benefit others, and (iii) emotion recognition can be altered by perturbing brain activity in many nodes identified by human neuroimaging, with strongest evidence for the cingulate and the amygdala. We also include evidence that manipulations of the oxytocin system and analgesics can have such effects, the latter providing causal evidence for the recruitment of an individual's own nociceptive system to feel with the pain of others.
Article
Looking at other people allows us to collect information about them, but it can also reveal our attentional state when we would rather conceal it. We report that individuals spontaneously employ sustained covert monitoring, rather than direct looking, when evaluating the actions of a live stranger. In contrast, individuals look directly at the actions of a stranger on video. We argue that the ability to secretly monitor live others without executing a look towards them is an important process that compensates for the risk of looking directly during certain social situations. Covert monitoring allows people to avoid visually communicating to others that they are the focus of one's attention. This represents a previously undocumented function of covert attention outside of the laboratory. It suggests that the relationship between covert attention and looking is dynamic and likely to be foundational to the successful navigation of real-world social situations.
Article
Full-text available
To further assess orbitofrontal cortex (OFC) contribution to the processing of socioemotional signals, spontaneous scanning patterns and pupil diameter variations were measured while adult rhesus macaques with either bilateral lesions of OFC areas 11 and 13 (Group O-asp) or sham-operations (Group C) freely viewed pictures of neutral and expressive faces of conspecifics, of other nonhuman primates and humans, and of objects with and without facial features. As compared to Group C, Group O-asp displayed (a) increased overall spontaneous visual exploration and increased scanning of primate neutral faces regardless of species and face orientation (upright/inverted), (b) longer gazes at the eyes of faces and of objects with facial features, and (c) intact ability to discriminate emotional from neutral faces, but (d) altered scanning patterns at emotional macaque faces coupled with (e) increased pupil dilation for conspecific faces according to face emotion and orientation (profile/stare). Thus, the primate OFC appears essential in the attention to and processing of faces, especially attention to the eyes and arousal self-regulation.
Article
Full-text available
Studies of humans with focal brain damage and non-human animals with experimentally induced brain lesions have provided pivotal insights into the neural basis of behavior. As the repertoire of neural manipulation and recording techniques expands, the utility of studying permanent brain lesions bears re-examination. Studies on the effects of permanent lesions provide vital data about brain function that are distinct from those of reversible manipulations. Focusing on work carried out in humans and nonhuman primates, we address the inferential strengths and limitations of lesion studies, recent methodological developments, the integration of this approach with other methods, and the clinical and ecological relevance of this research. We argue that lesion studies are essential to the rigorous assessment of neuroscience theories.
Article
Depending on a subject's attentional bias, robust changes in emotional perception occur when facial blends (different emotions expressed on upper/lower face) are presented tachistoscopically. If no instructions are given, subjects overwhelmingly identify the lower facial expression when blends are presented to either visual field. If asked to attend to the upper face, subjects overwhelmingly identify the upper facial expression in the left visual field but remain slightly biased to the lower facial expression in the right visual field. The current investigation sought to determine whether differences in initial saccadic targets could help explain the perceptual biases described above. Ten subjects were presented with full and blend facial expressions under different attentional conditions. No saccadic differences were found for left versus right visual field presentations or for full facial versus blend stimuli. When asked to identify the presented emotion, saccades were directed to the lower face. When asked to attend to the upper face, saccades were directed to the upper face. When asked to attend to the upper face and try to identify the emotion, saccades were directed to the upper face but to a lesser degree. Thus, saccadic behavior supports the concept that there are cognitive-attentional pre-attunements when subjects visually process facial expressions. However, these pre-attunements do not fully explain the perceptual superiority of the left visual field for identifying the upper facial expression when facial blends are presented tachistoscopically. Hence other perceptual factors must be in play, such as the phenomenon of virtual scanning.
Article
The eye region conveys important emotional information that we spontaneously attend to. Socially submissive individuals avoid others' gaze, which is regarded as avoidance of others' emotional facial expressions. But this interpretation ignores the fact that there are other sources of emotional information besides the face. Here we investigate whether gaze aversion is associated with increased attention to emotional signals from the hands. We used eye tracking to compare eye fixations of pre-selected high and low socially anxious students when labeling bodily expressions (Experiment 1), with (non-)matching facial expressions (Experiment 2), and during passive viewing (Experiment 3). High compared to low socially anxious individuals attended more to hand regions. Our findings demonstrate that socially anxious individuals do attend to emotions, albeit to different signals than the eyes and the face. Our findings call for a closer investigation of alternative viewing patterns explaining gaze avoidance and underscore that other signals besides the eyes and face must be considered to reach conclusions about social anxiety.
Article
Full-text available
Fragile X syndrome (FXS) and autism spectrum disorders (ASD) are characterized by impaired social functioning. We examined the spontaneous discrimination of happy and disgusted facial expressions, from neutral faces, in individuals with FXS (n = 13, Mage = 19.70) and ASD (n = 15, Mage = 11.00) matched on adaptive behavior and verbal abilities measured by the Vineland Adaptive Behavior Scale. Eye gaze to the eyes and mouth of neutral faces was also measured. Results suggest individuals with FXS and ASD distinguish facial expressions spontaneously in the same way. Individuals with FXS looked significantly less at the eye region of neutral faces than individuals with ASD. These results provide insight into similarities and differences in face processing in two neurodevelopmental disorders noted for their similarities in social behavior.
Article
Neuroscience research is making rapid gains toward identifying associations between atypical brain development and mental health problems in children. The goal of such research is to identify risk and protective factors that affect treatment outcome, to personalize treatment to meet specific individual needs. In this review, we discuss the potential for developmental neuroscience research to directly inform professional practice in child and adolescent psychology. Beginning with the case presentation of an 11-year-old girl with multiple developmental difficulties, we highlight research using animal models along with studies of genetics and human neuroimaging that target three different dimensions: (a) face processing in autism; (b) response inhibition and impulsive behavior; and (c) the problems caused by anxiety that can cut across all child mental health conditions. We note that neuroscience research necessarily depends on dimensional rather than categorical diagnostic approaches and discuss the benefits as well as some drawbacks to the clinical utility of dimensional definitions. As the technology to study brain development continues to improve, the need for basic researchers and practicing clinicians to communicate effectively with each other becomes all the more critical.
Article
Autism spectrum disorders (ASD) are neurodevelopmental disorders characterized by social communication impairments and repetitive behaviors. Although the prevalence of ASD is estimated at 1 in 88, understanding of the neural mechanisms underlying the disorder is still emerging. Regions including the amygdala, superior temporal sulcus, orbitofrontal cortex, fusiform gyrus, medial prefrontal cortex, and insula have been implicated in social processing. Neuroimaging studies have demonstrated both anatomical and functional differences in these areas of the brain in individuals with ASD when compared to controls; however, research on the neural basis for response to treatment in ASD is limited. Results of the three studies that have examined the neural mechanisms underlying treatment response are promising; following treatment, the brains of individuals with ASD seem to "normalize," responding more similarly to those of typically developing individuals. The research in this area is in its early stages, and thus a focused effort examining the neural basis of treatment response in ASD is crucial.
Article
Full-text available
The standard method for comparing an individual's test score with a normative sample involves converting the score to a z score and evaluating it using a table of the area under the normal curve. When the normative sample is small, a more appropriate method is to treat the individual as a sample of N = 1 and use a modified t test described by Sokal and Rohlf (1995). The use of this t test is illustrated with examples and its results compared to those from the standard procedure. It is suggested that the t test be used when the N of the normative sample is less than 50. Finally, a computer program that implements the modified t-test procedure is described. This program can be downloaded from the first author's website.
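The modified t statistic described in the abstract above treats the single case as a sample of N = 1 drawn from the normative population: t = (x − x̄) / (s·√((N + 1)/N)), with N − 1 degrees of freedom, where x̄ and s are the mean and standard deviation of the normative sample. A minimal Python sketch, using entirely hypothetical scores (the function name and the data are illustrative, not from the paper):

```python
from math import sqrt
from statistics import mean, stdev

def crawford_howell_t(case_score, norms):
    """Modified t statistic for comparing a single case with a small
    normative sample, treating the case as a sample of N = 1."""
    n = len(norms)
    m, s = mean(norms), stdev(norms)  # sample SD (denominator n - 1)
    t = (case_score - m) / (s * sqrt((n + 1) / n))
    return t, n - 1  # t statistic and its degrees of freedom

# Hypothetical normative sample (N = 10) and one patient score.
norms = [100, 103, 97, 105, 99, 101, 98, 104, 102, 96]
t, df = crawford_howell_t(85, norms)

# Naive z approach for comparison: with small N it overstates the
# abnormality of the case, because it ignores sampling error in the norms.
z = (85 - mean(norms)) / stdev(norms)
```

Here |t| < |z| because the √((N + 1)/N) term widens the standard error; the resulting t is then referred to a t distribution with N − 1 (rather than infinite) degrees of freedom, which is also heavier-tailed, so both corrections make the test more conservative for small normative samples.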
Article
Full-text available
The human amygdala can be robustly activated by presenting fearful faces, and it has been speculated that this activation has functional relevance for redirecting the gaze toward the eye region. To clarify this relationship between amygdala activation and gaze-orienting behavior, functional magnetic resonance imaging data and eye movements were simultaneously acquired in the current study during the evaluation of facial expressions. Fearful, angry, happy, and neutral faces were briefly presented to healthy volunteers in an event-related manner. We controlled for the initial fixation by unpredictably shifting the faces downward or upward on each trial, such that the eyes or the mouth were presented at fixation. Across emotional expressions, participants showed a bias to shift their gaze toward the eyes, but the magnitude of this effect followed the distribution of diagnostically relevant regions in the face. Amygdala activity was specifically enhanced for fearful faces with the mouth aligned to fixation, and this differential activation predicted gazing behavior preferentially targeting the eye region. These results reveal a direct role of the amygdala in reflexive gaze initiation toward fearfully widened eyes. They mirror deficits observed in patients with amygdala lesions and open a window for future studies on patients with autism spectrum disorder, in which deficits in emotion recognition, probably related to atypical gaze patterns and abnormal amygdala activation, have been observed.
Article
Full-text available
Gaze avoidance is a hallmark behavioral feature of fragile X syndrome (FXS), but little is known about whether abnormalities in the visual processing of faces, including disrupted autonomic reactivity, may underlie this behavior. Eye tracking was used to record fixations and pupil diameter while adolescents and young adults with FXS and sex- and age-matched typically developing controls passively viewed photographs of faces containing either a calm, happy, or fearful expression, preceded by a scrambled face matched on luminance. Results provide quantitative evidence for significant differences in gaze patterns and increased pupillary reactivity when individuals with FXS passively view static faces. Such abnormalities have significant implications in terms of understanding causes of gaze avoidance observed in individuals with FXS.
Article
Full-text available
People with autism are impaired in their social behavior, including their eye contact with others, but the processes that underlie this impairment remain elusive. We combined high-resolution eye tracking with computational modeling in a group of 10 high-functioning individuals with autism to address this issue. The group fixated the location of the mouth in facial expressions more than did matched controls, even when the mouth was not shown, even in faces that were inverted and most noticeably at latencies of 200–400 ms. Comparisons with a computational model of visual saliency argue that the abnormal bias for fixating the mouth in autism is not driven by an exaggerated sensitivity to the bottom-up saliency of the features, but rather by an abnormal top-down strategy for allocating visual attention.
Article
Full-text available
Social contact often initially depends on ascertaining the direction of the other person's gaze. We determined the brain areas involved in gaze monitoring in a functional neuroimaging study. Discriminating the direction of gaze significantly activated a region in the left amygdala to the same extent during both eye-contact and no-eye-contact tasks. However, a region in the right amygdala was specifically activated only during the eye-contact task. The results confirm that the left amygdala plays a general role in the interpretation of eye gaze direction, and that activity in the subject's right amygdala increases when another individual's gaze is directed towards him. This suggests that the human amygdala plays a role in reading social signals from the face.
Article
Full-text available
The visual scanpaths of five high-functioning adult autistic males and five adult male controls were recorded using an infrared corneal reflection technique as they viewed photographs of human faces. Analyses of the scanpath data revealed marked differences in the scanpaths of the two groups. The autistic participants viewed nonfeature areas of the faces significantly more often and core feature areas of the faces (i.e., eyes, nose, and mouth) significantly less often than did control participants. Across both groups of participants, scanpaths generally did not differ as a function of the instructions given to the participants (i.e., "Please look at the faces in any manner you wish." vs. "Please identify the emotions portrayed in these faces."). Autistic participants showed a deficit in emotion recognition, but this effect was driven primarily by deficits in the recognition of fear. Collectively, these results indicate disorganized processing of face stimuli in autistic individuals and suggest a mechanism that may subserve the social information processing deficits that characterize autism spectrum disorders.
Article
Full-text available
The Eyelink Toolbox software supports the measurement of eye movements. The toolbox provides an interface between a high-level interpreted language (MATLAB), a visual display programming toolbox (Psychophysics Toolbox), and a video-based eyetracker (Eyelink). The Eyelink Toolbox enables experimenters to measure eye movements while simultaneously executing the stimulus presentation routines provided by the Psychophysics Toolbox. Example programs are included with the toolbox distribution. Information on the Eyelink Toolbox can be found at http://psychtoolbox.org/.
Article
Full-text available
The amygdala was more responsive to fearful (larger) eye whites than to happy (smaller) eye whites presented in a masking paradigm that mitigated subjects' awareness of their presence and aberrant nature. These data demonstrate that the amygdala is responsive to isolated elements of facial expressions, such as the eye whites, outside of explicit awareness.
Article
Full-text available
Ten years ago, we reported that SM, a patient with rare bilateral amygdala damage, showed an intriguing impairment in her ability to recognize fear from facial expressions. Since then, the importance of the amygdala in processing information about facial emotions has been borne out by a number of lesion and functional imaging studies. Yet the mechanism by which amygdala damage compromises fear recognition has not been identified. Returning to patient SM, we now show that her impairment stems from an inability to make normal use of information from the eye region of faces when judging emotions, a defect we trace to a lack of spontaneous fixations on the eyes during free viewing of faces. Although SM fails to look normally at the eye region in all facial expressions, her selective impairment in recognizing fear is explained by the fact that the eyes are the most important feature for identifying this emotion. Notably, SM's recognition of fearful faces became entirely normal when she was instructed explicitly to look at the eyes. This finding provides a mechanism to explain the amygdala's role in fear recognition, and points to new approaches for the possible rehabilitation of patients with defective emotion perception.
Article
Full-text available
This article examines the human face as a transmitter of expression signals and the brain as a decoder of these expression signals. If the face has evolved to optimize transmission of such signals, the basic facial expressions should have minimal overlap in their information. If the brain has evolved to optimize categorization of expressions, it should be efficient with the information available from the transmitter for the task. In this article, we characterize the information underlying the recognition of the six basic facial expression signals and evaluate how efficiently each expression is decoded by the underlying brain structures.
Article
Full-text available
We review the evidence implicating the amygdala as a critical component of a neural network of social cognition, drawing especially on research involving the processing of faces and other visual social stimuli. We argue that, although it is clear that social behavioral representations are not stored in the amygdala, the most parsimonious interpretation of the data is that the amygdala plays a role in guiding social behaviors on the basis of socioenvironmental context. Thus, it appears to be required for normal social cognition. We propose that the amygdala plays this role by attentionally modulating several areas of visual and somatosensory cortex that have been implicated in social cognition, and in helping to direct overt visuospatial attention in face gaze. We also hypothesize that the amygdala exerts attentional modulation of simulation in somatosensory cortices such as supramarginal gyrus and insula. Finally, we argue that the term emotion be broadened to include increased attention to bodily responses and their representation in cortex.
Article
Full-text available
The role of the human amygdala in real social interactions remains essentially unknown, although studies in nonhuman primates and studies using photographs and video in humans have shown it to be critical for emotional processing and suggest its importance for social cognition. We show here that complete amygdala lesions result in a severe reduction in direct eye contact during conversations with real people, together with an abnormal increase in gaze to the mouth. These novel findings from real social interactions are consistent with a hypothesized role for the amygdala in autism, and the approach taken here opens up new directions for quantifying social behavior in humans.
Article
Accurate and efficient interpretation of facial expressions of emotion is essential for humans to socially interact with others. Facial expressions communicate information from which we can quickly infer the state of mind of our peers and adjust our behavior accordingly. Considering the face as a transmitter of emotion signals and the brain as a decoder we expect minimal overlap in the specific information used for each expression. Here we characterize the information underlying the recognition of the six basic facial expressions (fear, anger, sadness, happiness, surprise and disgust) and evaluate how well each expression is interpreted. Using the Bubbles method with human observers and a model observer for benchmarking we characterize the specific information subsets corresponding to diagnostic (decoded, human) and available (transmitted, model) information for each expression and neutral. We found in general low correlations (m = .28, s = .14) in the available informative regions across expressions with further de-correlations in the diagnostic regions of human observers (m = .12, s = .09). In particular, for human observers we found the informative regions for anger and fear to be orthogonal to all other expressions. Furthermore, for each expression, we determine the optimality of information use by human observers from a pixel-wise comparison of the human and model informative regions. The de-correlated information subsets of human observers can be considered as optimized inputs with which the specific response of brain structures to facial features transmitting emotion signals can be isolated.
Article
The VideoToolbox is a free collection of two hundred C subroutines for Macintosh computers that calibrates and controls the computer-display interface to create accurately specified visual stimuli. High-level platform-independent languages like MATLAB are best for creating the numbers that describe the desired images. Low-level, computer-specific VideoToolbox routines control the hardware that transforms those numbers into a movie. Transcending the particular computer and language, we discuss the nature of the computer-display interface, and how to calibrate and control it.
Article
The amygdala has received intense recent attention from neuroscientists investigating its function at the molecular, cellular, systems, cognitive, and clinical level. It clearly contributes to processing emotionally and socially relevant information, yet a unifying description and computational account have been lacking. The difficulty of tying together the various studies stems in part from the sheer diversity of approaches and species studied, in part from the amygdala's inherent heterogeneity in terms of its component nuclei, and in part because different investigators have simply been interested in different topics. Yet, a synthesis now seems close at hand in combining new results from social neuroscience with data from neuroeconomics and reward learning. The amygdala processes a psychological stimulus dimension related to saliency or relevance; mechanisms have been identified to link it to processing unpredictability; and insights from reward learning have situated it within a network of structures that include the prefrontal cortex and the ventral striatum in processing the current value of stimuli. These aspects help to clarify the amygdala's contributions to recognizing emotion from faces, to social behavior toward conspecifics, and to reward learning and instrumental behavior.
Article
The presence of neurons in macaque temporal cortex and amygdala which fire selectively in response to social stimuli has been demonstrated by several investigators. The extent to which such neuronal populations may respond to a broad range of social features, including expressive movements and interactions, has not been fully explored due to the difficulty of presenting such complex stimuli in a controlled fashion. We describe a method for presenting moving segments of macaque behavior, visual and auditory, to animal subjects during single unit recording. The method permits a broad range of stimuli to be used both as probes and as controls. In addition, a novel technique for monitoring eye position in alert macaque subjects is described. We present results from the medial amygdala and adjacent cortex, demonstrating that neurons in these regions respond selectively to features of the social environment.
Article
Studies in animals have shown that the amygdala receives highly processed visual input, contains neurons that respond selectively to faces, and that it participates in emotion and social behaviour. Although studies in epileptic patients support its role in emotion, determination of the amygdala's function in humans has been hampered by the rarity of patients with selective amygdala lesions. Here, with the help of one such rare patient, we report findings that suggest the human amygdala may be indispensable for (1) recognizing fear in facial expressions and (2) recognizing multiple emotions in a single facial expression, but (3) is not required to recognize personal identity from faces. These results suggest that damage restricted to the amygdala causes very specific recognition impairments, and thus constrains the broad notion that the amygdala is involved in emotion.
Article
The Psychophysics Toolbox is a software package that supports visual psychophysics. Its routines provide an interface between a high-level interpreted language (MATLAB on the Macintosh) and the video display hardware. A set of example programs is included with the Toolbox distribution.
Article
Findings from several case studies have shown that bilateral amygdala damage impairs recognition of emotions in facial expressions, especially fear. However, one study did not find such an impairment, and, in general, comparison across studies has been made difficult because of the different stimuli and tasks employed. In a collaborative study to facilitate such comparisons, we report here the recognition of emotional facial expressions in nine subjects with bilateral amygdala damage, using a sensitive and quantitative assessment. Compared to controls, the subjects as a group were significantly impaired in recognizing fear, although individual performances ranged from severely impaired to essentially normal. Most subjects were impaired on several negative emotions in addition to fear, but no subject was impaired in recognizing happy expressions. An analysis of response consistency showed that impaired recognition of fear could not be attributed simply to mistaking fear for another emotion. While it remains unclear why some subjects with amygdala damage included here are not impaired on our task, the results overall are consistent with the idea that the amygdala plays an important role in triggering knowledge related to threat and danger signaled by facial expressions.
Article
Gaze is an important component of social interaction. The function, evolution and neurobiology of gaze processing are therefore of interest to a number of researchers. This review discusses the evolutionary role of social gaze in vertebrates (focusing on primates), and a hypothesis that this role has changed substantially for primates compared to other animals. This change may have been driven by morphological changes to the face and eyes of primates, limitations in the facial anatomy of other vertebrates, changes in the ecology of the environment in which primates live, and a necessity to communicate information about the environment, emotional and mental states. The eyes represent different levels of signal value depending on the status, disposition and emotional state of the sender and receiver of such signals. There are regions in the monkey and human brain which contain neurons that respond selectively to faces, bodies and eye gaze. The ability to follow another individual's gaze direction is affected in individuals with autism and other psychopathological disorders, and after particular localized brain lesions. The hypothesis that gaze following is "hard-wired" in the brain, and may be localized within a circuit linking the superior temporal sulcus, amygdala and orbitofrontal cortex is discussed.
Article
Fearful facial expressions evoke increased neural responses in human amygdala. We used event-related fMRI to investigate whether eye or mouth components of a fearful face are critical in evoking this increased amygdala activity. In addition to prototypical fearful (FF) and neutral (NN) faces, subjects viewed two types of chimerical face: fearful eyes combined with a neutral mouth (FN), and neutral eyes combined with a fearful mouth (NF). FF faces evoked specific responses in left anterior amygdala. FN faces evoked responses in bilateral posterior amygdala and superior colliculus. Responses in right amygdala, superior colliculus, and pulvinar exhibited significant time × condition interactions with respect to faces with fearful eyes (FF, FN) vs neutral eyes (NF, NN). These data indicate that fearful eyes alone are sufficient to evoke increased amygdala activity. In addition, however, left amygdala displayed discriminatory responses to fearful eyes in different configural contexts (i.e., in FF and FN faces). These results suggest, therefore, that human amygdala responds to both feature-specific and configural aspects of fearful facial expressions.
Article
We combined eye-tracking technology with a test of facial affect recognition and a measure of self-reported social anxiety in order to explore the aetiology of social-perceptual deficits in Asperger's syndrome (AS). Compared to controls matched for age, IQ and visual-perceptual ability, we found a group of AS adults was impaired in their recognition of fearful and sad expressions and spent significantly less time fixating the eye region of all faces. For AS subjects, but not controls, the extent of the failure to fixate the eyes predicted the degree of impairment at recognising fearful expressions. In addition, poor fear recognition and reduced fixation of the eyes were independently associated with greater levels of social anxiety in AS individuals. These findings support the hypothesis that avoidance of emotionally arousing stimuli, such as eyes, contributes to social-perceptual impairment in AS. Furthermore, our findings are consistent with theories implicating amygdala-mediated over-arousal and anxiety in the development of these social-perceptual deficits.
Article
The amygdala is clearly implicated in the processing of biologically relevant stimuli, particularly those that can lead to a state of fear. A new study by Herry, Bach and colleagues using both mouse and human subjects seemingly throws a wrench in the spokes by demonstrating that the amygdala is sensitive to non-biologically relevant stimuli (i.e. tones) when they occur in an unpredictable fashion. The implications of this finding for understanding the role of the amygdala in vigilance, threat assessment and anxiety are considered here.
Article
The genetic disorder Williams syndrome (WS) is associated with a propulsion towards social stimuli and interactions with people. In contrast, the neuro-developmental disorder autism is characterised by social withdrawal and lack of interest in socially relevant information. Using eye-tracking techniques we investigate how individuals with these two neuro-developmental disorders associated with distinct social characteristics view scenes containing people. The way individuals with these disorders view social stimuli may impact upon successful social interactions and communication. Whilst individuals with autism spend less time than is typical viewing people and faces in static pictures of social interactions, the opposite is apparent for those with WS whereby exaggerated fixations are prevalent towards the eyes. The results suggest more attention should be drawn towards understanding the implications of atypical social preferences in WS, in the same way that attention has been drawn to the social deficits associated with autism.
Yarbus, A. L. (1967). Eye Movements and Vision (B. Haigh, Trans.). New York: Plenum Press. (Original work published in Russian, 1965.)
Buchanan, T. W., Tranel, D., & Adolphs, R. (2009). The human amygdala in social function. In P. J. Whalen & E. A. Phelps (Eds.), The human amygdala (pp. 289–320). New York: Oxford UP.