Tuning the developing brain to social signals of emotion

Article · January 2009 · 66 Reads
DOI: 10.1038/nrn2554 · Source: PubMed
Abstract
Humans in different cultures develop a similar capacity to recognize the emotional signals of diverse facial expressions. This capacity is mediated by a brain network that involves emotion-related brain circuits and higher-level visual-representation areas. Recent studies suggest that the key components of this network begin to emerge early in life. The studies also suggest that initial biases in emotion-related brain circuits and the early coupling of these circuits and cortical perceptual areas provide a foundation for a rapid acquisition of representations of those facial features that denote specific emotions.
    • The relationship between memory load and BOLD signal changes in the inferior cerebellum has also been demonstrated (Kirschen, Chen, & Desmond, 2010; Ng et al., 2016; Tamagni et al., 2010). The SOFG is one of the core components of the emotional processing network and is involved in the perceptual processing of emotions in facial expressions (Leppänen & Nelson, 2009). Furthermore, it has been linked to the experience of anger and aggression (Lindquist, Wager, Kober, Bliss-Moreau, & Barrett, 2012; Vytal & Hamann, 2010). The association of the SOFG with negative emotion processing is consistent with the finding that impaired white matter is linked to anxiety and depression (Coplan et al., 2010).
    ABSTRACT: Introduction: This study aimed to investigate the cerebral function deficits in patients with leukoaraiosis (LA) and the correlation with white matter hyperintensity (WMH) using functional MRI (fMRI) technology. Materials and Methods: Twenty-eight patients with LA and 30 volunteers were enrolled in this study. All patients underwent structural MRI and resting-state functional MRI (rs-fMRI) scanning. The amplitude of low-frequency fluctuations (ALFF) of rs-fMRI signals for the two groups was compared using two-sample t tests. A one-sample t test was performed on the individual z-value maps to identify the functional connectivity of each group. The z values were compared between the two groups using a two-sample t test. Partial correlations between ALFF values and functional connectivity of the brain regions that showed group differences and Fazekas scores of the WMH were analyzed. Results: Compared with the control group, the LA group showed a significant decrease in the ALFF in the left parahippocampal gyrus (PHG) and an increased ALFF in the left inferior semi-lunar lobule and right superior orbital frontal gyrus (SOFG). The patients with LA showed an increased functional connectivity between the right insular region and the right SOFG and between the right calcarine cortex and the left PHG. After the effects of age, gender, and years of education were corrected as covariates, the functional connectivity strength of the right insular and the right SOFG showed close correlations with the Fazekas scores. Conclusion: Our results enhance the understanding of the pathomechanism of LA. Leukoaraiosis is associated with widespread cerebral function deficits, which show a close correlation with WMH and can be measured by rs-fMRI.
    Full-text · Article · May 2017
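    The ALFF computation and two-sample group comparison summarized in the abstract above can be illustrated with a minimal sketch on simulated data; the frequency band, repetition time, and array names are assumptions for illustration, not parameters of the cited study.

```python
# Minimal sketch of an ALFF computation and group comparison (simulated data;
# the 0.01-0.08 Hz band, TR and group arrays are illustrative assumptions).
import numpy as np
from scipy.signal import periodogram
from scipy.stats import ttest_ind

def alff(timeseries, tr, low=0.01, high=0.08):
    """Amplitude of low-frequency fluctuations for one voxel/ROI time series."""
    freqs, power = periodogram(timeseries, fs=1.0 / tr)
    band = (freqs >= low) & (freqs <= high)
    # ALFF is commonly taken as the mean amplitude (sqrt of power) in the band.
    return np.sqrt(power[band]).mean()

rng = np.random.default_rng(0)
tr = 2.0                                    # repetition time in seconds (assumed)
patients = rng.standard_normal((28, 200))   # 28 simulated ROI time series
controls = rng.standard_normal((30, 200))   # 30 simulated ROI time series

alff_patients = np.array([alff(ts, tr) for ts in patients])
alff_controls = np.array([alff(ts, tr) for ts in controls])

# Two-sample t test between groups, analogous to the abstract's voxelwise comparison.
t, p = ttest_ind(alff_patients, alff_controls)
print(f"t = {t:.2f}, p = {p:.3f}")
```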
    • The voice not only carries speech information but it can also be seen as an "auditory face" that enables us to recognize individuals and their emotional states (Belin et al., 2004). Responding to others' emotional expressions is an essential and early developing social skill among humans (Grossmann, 2010; Grossmann and Friederici, 2012; Leppänen and Nelson, 2009). Behavioral and neuroscientific studies provide evidence that within the first year of life, infants begin to reliably discriminate between a variety of positive and negative emotional facial expressions, such as happy, sad, angry and fearful expressions (Grossmann, 2013; Kotsoni et al., 2001; LaBarbera et al., 1976; Nelson and de Haan, 1996; Serrano et al., 1992).
    ABSTRACT: Responding to others' emotional expressions is an essential and early developing social skill among humans. Much research has focused on how infants process facial expressions, while much less is known about infants' processing of vocal expressions. We examined 8-month-old infants' processing of other infants' vocalizations by measuring event-related brain potentials (ERPs) to positive (infant laughter), negative (infant cries), and neutral (adult hummed speech) vocalizations. Our ERP results revealed that hearing another infant cry elicited an enhanced negativity (N200) at temporal electrodes around 200 ms, whereas listening to another infant laugh resulted in an enhanced positivity (P300) at central electrodes around 300 ms. This indexes that infants' brains rapidly respond to a crying peer during early auditory processing stages, but also selectively respond to a laughing peer during later stages associated with familiarity detection processes. These findings provide evidence for infants' sensitivity to vocal expressions of peers and shed new light on the neural processes underpinning emotion processing in infants.
    Full-text · Article · Apr 2017
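    The condition-wise ERP averaging described in the abstract above follows a standard epoch-and-average logic; the sketch below is a minimal illustration on simulated data, and the sampling rate, epoch window, baseline interval, scoring window, and condition labels are assumptions rather than the study's actual parameters.

```python
# Minimal sketch of condition-wise ERP averaging (simulated data; sampling rate,
# epoch window, baseline and channel count are illustrative assumptions).
import numpy as np

fs = 500                                      # sampling rate in Hz (assumed)
times = np.arange(-0.2, 0.8, 1 / fs)          # epoch from -200 ms to 800 ms
n_trials, n_channels = 40, 32

rng = np.random.default_rng(1)
# Simulated single-trial epochs per condition: trials x channels x samples.
epochs = {
    "cry": rng.standard_normal((n_trials, n_channels, times.size)),
    "laugh": rng.standard_normal((n_trials, n_channels, times.size)),
}

def evoked(trials, times):
    """Baseline-correct each trial on the pre-stimulus interval, then average."""
    baseline = trials[..., times < 0].mean(axis=-1, keepdims=True)
    return (trials - baseline).mean(axis=0)   # channels x samples

# Mean amplitude in a 150-250 ms window (roughly where an N200 would be scored).
win = (times >= 0.15) & (times <= 0.25)
for cond, trials in epochs.items():
    erp = evoked(trials, times)
    print(cond, erp[:, win].mean(axis=1)[:3])  # first three channels
```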
    • Therefore, studying Amg's role in emotion perception without stimulus awareness enables us to focus on processes representing 'primitives' that evolved before, and likely shaped, more sophisticated functions, such as those involved in sustaining perceptual awareness, core feelings or intentionality. Likewise, these primordial Amg functions have been implicated in the specialization of more recent cortical functions across the primate lineage as well as during development and maturation (Leppänen and Nelson, 2009), including the present-day organization of the cortical visual system specialized for face and body processing (Johnson, 2005; Liddell et al., 2005). Hence, this also provides a valuable testing ground for gauging cross-species continuity of functions and comparison.
    ABSTRACT: Over the past two decades, evidence has accumulated that the human amygdala exerts some of its functions also when the observer is not aware of the content, or even presence, of the triggering emotional stimulus. Nevertheless, there is as yet no consensus on the limits and conditions that affect the extent of the amygdala's response without focused attention or awareness. Here we review past and recent studies on this subject, examining neuroimaging literature on healthy participants as well as brain-damaged patients, and we comment on their strengths and limits. We propose a theoretical distinction between processes involved in attentional unawareness, wherein the stimulus is potentially accessible to enter visual awareness but fails to do so because attention is diverted, and in sensory unawareness, wherein the stimulus fails to enter awareness because its normal processing in the visual cortex is suppressed. We argue that this distinction, along with data sampling amygdala responses at high temporal resolution, helps to appreciate the multiplicity of functional and anatomical mechanisms centered on the amygdala and supporting its role in non-conscious emotion processing. Separate, but interacting, networks relay visual information to the amygdala exploiting different computational properties of subcortical and cortical routes, thereby supporting amygdala functions at different stages of emotion processing. This view reconciles some apparent contradictions in the literature, as well as seemingly contrasting proposals, such as the dual stage and the dual route model. We conclude that evidence in favor of the amygdala response without awareness is solid, albeit this response originates from different functional mechanisms and is driven by more complex neural networks than commonly assumed. Acknowledging the complexity of such mechanisms can foster new insights on the varieties of amygdala functions without awareness and their impact on human behavior.
    Full-text · Article · Jan 2017
    • Numerous studies suggest that humans are responsive to face-like stimuli soon after birth, likely by virtue of the biological and social relevance of faces to infant survival (Serrano, Iglesias, & Loeches, 1992; Young-Browne, Rosenfeld, & Horowitz, 1997). Although the development of face processing has often been studied within a visual context (Haxby, Hoffman, & Gobbini, 2000; Leppänen & Nelson, 2009), human faces are rarely encountered in sensory isolation; rather, they are embedded in a multi-sensory environment. 'Talking faces', for example, provide both audio and visual information that, when integrated, may support or hinder face processing.
    ABSTRACT: Recognition of emotional facial expressions is a crucial skill for adaptive behavior that most often occurs in a multi-sensory context. Affective matching tasks have been used across development to investigate how people integrate facial information with other senses. Given the relative affective strength of olfaction and its relevance in mediating social information since birth, we assessed olfactory-visual matching abilities in a group of 140 children between the ages of 3 and 11 years old. We presented one of three odor primes (rose, fish and no-odor, rated as pleasant or unpleasant by individual children) before a facial choice task (happy vs. disgusted face). Children were instructed to select one of two faces. As expected, children of all ages tended to choose happy faces. Children younger than 5 years of age were biased towards choosing the happy face, irrespective of the odor smelled. After age 5, an affective matching strategy guided children's choices. Smelling a pleasant odor predicted the choice of happy faces, whereas smelling the unpleasant or fish odor predicted the choice of disgusted faces. The present study fills a gap in the developmental literature on olfactory-visual affective strategies that affect decision-making, and represents an important step towards understanding the underlying developmental processes that shape the typical social mind.
    Full-text · Article · Nov 2016
    • This suggests that infants, by the second half of their first year, possess detection mechanisms that operate outside conscious awareness (Jessen and Grossmann, 2015). Collectively (for review, see Leppänen and Nelson, 2009), this evidence suggests that the neural systems that underlie more automatic components of affect-biased attention are functional at the time these biases are observed behaviorally, during the first year of life (Leppänen et al., 2007).
    ABSTRACT: There is growing interest regarding the impact of affect-biased attention on psychopathology. However, most of the research to date lacks a developmental approach. In the present review, we examine the role affect-biased attention plays in shaping socioemotional trajectories within a developmental neuroscience framework. We propose that affect-biased attention, particularly if stable and entrenched, acts as a developmental tether that helps sustain early socioemotional and behavioral profiles over time, placing some individuals on maladaptive developmental trajectories. Although most of the evidence is found in the anxiety literature, we suggest that these relations may operate across multiple domains of interest, including positive affect, externalizing behaviors, drug use, and eating behaviors. We also review the general mechanisms and neural correlates of affect-biased attention, as well as the current evidence for the co-development of attention and affect. Based on the reviewed literature, we propose a model that may help us better understand the nuances of affect-biased attention across development. The model may serve as a strong foundation for ongoing attempts to identify neurocognitive mechanisms and intervene with individuals at risk. Finally, we discuss open issues for future research that may help bridge existing gaps in the literature.
    Full-text · Article · Sep 2016
    • In view of previous observations that amygdalar connectivity is reduced in individuals with mood disorders, and that this dysconnectivity relates to clinical symptoms and emotion-processing performance (e.g. Peng et al. 2014), we hypothesized that VPT individuals would exhibit: (1) reduced rs-fc between the amygdala and key nodes of an emotion-processing network (Leppänen & Nelson, 2009); (2) lower accuracy and longer reaction times than controls at recognizing emotions at lower intensity levels; and (3) an association between the functional integrity of the amygdala connectivity network and performance on the ERT. Post-hoc exploratory analyses investigating associations between rs-fc, emotion recognition, full-scale IQ and perinatal variables were additionally conducted.
    ABSTRACT: Background: Very preterm birth (VPT; <32 weeks of gestation) has been associated with impairments in emotion regulation, social competence and communicative skills. However, the neuroanatomical mechanisms underlying such impairments have not been systematically studied. Here we investigated the functional integrity of the amygdala connectivity network in relation to the ability to recognize emotions from facial expressions in VPT adults. Method: Thirty-six VPT-born adults and 38 age-matched controls were scanned at rest in a 3-T MRI scanner. Resting-state functional connectivity (rs-fc) was assessed with SPM8. A seed-based analysis focusing on three amygdalar subregions (centro-medial/latero-basal/superficial) was performed. Participants' ability to recognize emotions was assessed using dynamic stimuli of human faces expressing six emotions at different intensities with the Emotion Recognition Task (ERT). Results: VPT individuals compared to controls showed reduced rs-fc between the superficial subregion of the left amygdala, and the right posterior cingulate cortex (p = 0.017) and the left precuneus (p = 0.002). The VPT group further showed elevated rs-fc between the left superficial amygdala and the superior temporal sulcus (p = 0.008). Performance on the ERT showed that the VPT group was less able than controls to recognize anger at low levels of intensity. Anger scores were significantly associated with rs-fc between the superficial amygdala and the posterior cingulate cortex in controls but not in VPT individuals. Conclusions: These findings suggest that alterations in rs-fc between the amygdala, parietal and temporal cortices could represent the mechanism linking VPT birth and deficits in emotion processing.
    Full-text · Article · Aug 2016
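    The seed-based resting-state connectivity comparison described in the abstract above can be sketched in a few lines; the simulated time series, coupling values, and volume count below are assumptions for illustration, and only the group sizes (36 VPT adults, 38 controls) come from the abstract.

```python
# Minimal sketch of a seed-based resting-state connectivity comparison
# (simulated time series; coupling strengths and volume count are assumptions).
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(2)
n_vpt, n_ctrl, n_tp = 36, 38, 180   # group sizes from the abstract; 180 volumes assumed

def seed_fc(seed_ts, target_ts):
    """Fisher z-transformed Pearson correlation between seed and target series."""
    r = np.corrcoef(seed_ts, target_ts)[0, 1]
    return np.arctanh(r)

def simulate_group(n_subjects, coupling):
    """Simulate per-subject seed-target connectivity driven by a shared signal."""
    z = []
    for _ in range(n_subjects):
        shared = rng.standard_normal(n_tp)
        seed = shared + rng.standard_normal(n_tp)
        target = coupling * shared + rng.standard_normal(n_tp)
        z.append(seed_fc(seed, target))
    return np.array(z)

# Hypothetically weaker amygdala-posterior cingulate coupling in the VPT group.
z_vpt = simulate_group(n_vpt, coupling=0.3)
z_ctrl = simulate_group(n_ctrl, coupling=0.6)

t, p = ttest_ind(z_vpt, z_ctrl)
print(f"group difference in seed connectivity: t = {t:.2f}, p = {p:.3f}")
```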