Tuning the developing brain to social signals of emotion

Department of Psychology, University of Tampere, Tampere, Finland.
Nature Reviews Neuroscience (Impact Factor: 31.43). 01/2009; 10(1):37-47. DOI: 10.1038/nrn2554
Source: PubMed


Humans in different cultures develop a similar capacity to recognize the emotional signals of diverse facial expressions. This capacity is mediated by a brain network that involves emotion-related brain circuits and higher-level visual-representation areas. Recent studies suggest that the key components of this network begin to emerge early in life. The studies also suggest that initial biases in emotion-related brain circuits and the early coupling of these circuits and cortical perceptual areas provide a foundation for a rapid acquisition of representations of those facial features that denote specific emotions.

  • Source
    • "Similar biases are proposed for the representation of emotional expressions (Lepp€ anen & Nelson, 2009). Humans convey emotional states not only through facial expressions but also through vocalizations as well as body motion, and consistently interpret emotions of moving agents (Atkinson, Dittrich, Gemmell, & Young, 2004; Crane & Gross, 2007; Karg, Kühnlenz, & Buss, 2010), independent of shape (McDonnell, J€ org, McHugh, Newell, & O'Sullivan, 2009). "
    ABSTRACT: Humans readily attribute intentionality and mental states to living and nonliving entities, a phenomenon known as anthropomorphism. Recent efforts to understand the driving forces behind anthropomorphism have focused on its motivational underpinnings. By contrast, the underlying cognitive and neuropsychological processes have not been considered in detail so far. The marked increase in interest in anthropomorphism and its consequences for animal welfare, conservation and even as a potential constraint in animal behaviour research call for an integrative review. We identify a set of potential cognitive mechanisms underlying the attribution of mental states to nonhuman animals using a dual process framework. We propose that mental state attributions are supported by processes evolved in the social domain, such as motor matching mechanisms and empathy, as well as by domain-general mechanisms such as inductive and causal reasoning. We conclude that the activation of these domain-specific and domain-general mechanisms depends on the type of information available to the observer, and suggest a series of hypotheses for testing the proposed model.
    Animal Behaviour 11/2015; 109:167-176. DOI:10.1016/j.anbehav.2015.08.011 · 3.14 Impact Factor
  • Source
    • "However, from a developmental perspective, it is not known when in infancy this unconscious fear processing from eyes emerges. More specifically, it is unclear whether it is present before 7 months of age or, similar to the conscious perception of fearful faces (Peltola, Leppänen, Mäki, et al., 2009), only emerges around 7 months of age. In order to test between these two possibilities, we measured 5-month-old infants' ERPs in response to subliminally presented fearful and non-fearful eyes and compared these to 7-month-old infants' ERP responses from a previous study (Jessen & Grossmann, 2014). "
    ABSTRACT: From early in life, emotion detection plays an important role during social interactions. Recently, 7-month-old infants have been shown to process facial signs of fear in others without conscious perception and solely on the basis of their eyes. However, it is not known whether unconscious fear processing from eyes is present before 7 months of age or only emerges at around 7 months. To investigate this question, we measured 5-month-old infants' event-related potentials (ERPs) in response to subliminally presented fearful and non-fearful eyes and compared these with 7-month-old infants' ERP responses from a previous study. Our ERP results revealed that only 7-month-olds, but not 5-month-olds, distinguished between fearful and non-fearful eyes. Specifically, 7-month-olds' processing of fearful eyes was reflected in early visual processes over occipital cortex and later attentional processes over frontal cortex. This suggests that, in line with prior work on the conscious detection of fearful faces, the brain processes associated with the unconscious processing of fearful eyes develop between 5 and 7 months of age. More generally, these findings support the notion that emotion perception and the underlying brain processes undergo critical change during the first year of life. Therefore, the current data provide further evidence for viewing infancy as a formative period in human socioemotional functioning.
    Journal of Experimental Child Psychology 10/2015; DOI:10.1016/j.jecp.2015.09.009 · 3.12 Impact Factor
  • Source
    • "An early review states that children have difficulty recognizing neutral expressions, and to our knowledge no recent behavioral studies have addressed the development of this expression specifically (Gross & Ballif, 1991). Our finding of a steep increase in improvement between the youngest and oldest age groups accords with this reported early difficulty and could be explained by a general bias to attend more to emotive faces throughout our social experiences (Leppänen & Nelson, 2009). "
    ABSTRACT: Reading the non-verbal cues from faces to infer the emotional states of others is central to our daily social interactions from very early in life. Despite the relatively well-documented ontogeny of facial expression recognition in infancy, our understanding of the development of this critical social skill throughout childhood into adulthood remains limited. To this end, using a psychophysical approach we implemented the QUEST threshold-seeking algorithm to parametrically manipulate the quantity of signals available in faces normalized for contrast and luminance displaying the six emotional expressions, plus neutral. We thus determined observers' perceptual thresholds for effective discrimination of each emotional expression from 5 years of age up to adulthood. Consistent with previous studies, happiness was most easily recognized with minimum signals (35% on average), whereas fear required the maximum signals (97% on average) across groups. Overall, recognition improved with age for all expressions except happiness and fear, for which all age groups including the youngest remained within the adult range. Uniquely, our findings characterize the recognition trajectories of the six basic emotions into three distinct groupings: expressions that show a steep improvement with age - disgust, neutral, and anger; expressions that show a more gradual improvement with age - sadness, surprise; and those that remain stable from early childhood - happiness and fear, indicating that the coding for these expressions is already mature by 5 years of age. Altogether, our data provide for the first time a fine-grained mapping of the development of facial expression recognition. This approach significantly increases our understanding of the decoding of emotions across development and offers a novel tool to measure impairments for specific facial expressions in developmental clinical populations. © 2015 John Wiley & Sons Ltd.
    Developmental Science 02/2015; DOI:10.1111/desc.12281 · 3.89 Impact Factor