Context Is Routinely Encoded During Emotion Perception

Department of Psychology, Boston College, Chestnut Hill, MA 02467, USA.
Psychological Science (Impact Factor: 4.43). 04/2010; 21(4):595-9. DOI: 10.1177/0956797610363547
Source: PubMed

ABSTRACT: In the present study, we investigated whether context is routinely encoded during emotion perception. For the first time, we show that people remember the context more often when asked to label an emotion in a facial expression than when asked to judge the expression's simple affective significance (which can be done on the basis of the structural features of the face alone). Our findings are consistent with an emerging literature showing that facial muscle actions (i.e., structural features of the face), when viewed in isolation, might be insufficient for perceiving emotion.

  • ABSTRACT: Qualitative behavioural assessment (QBA) is based on observers' ability to capture the dynamic complexity of an animal's demeanour as it interacts with the environment, in terms such as tense, anxious or relaxed. Sensitivity to context is part of QBA's integrative capacity and discriminatory power; however, when not properly managed it can also be a source of undesirable variability and bias. This study investigated the sensitivity of QBA to variations in the visual or verbal information provided to observers, using free-choice profiling (FCP) methodology. FCP allows observers to generate their own descriptive terms for animal demeanour, against which each animal's expressions are quantified on a visual analogue scale. The resulting scores were analysed with Generalised Procrustes Analysis (GPA), generating two or more multivariate dimensions of animal expression. Study 1 examined how 63 observers rated the same video clips of individual sheep during land transport when these clips were interspersed with two different sets of video footage. Scores attributed to the sheep in the two viewing sessions correlated significantly (GPA dimension 1: r_s = 0.95, P < 0.001; GPA dimension 2: r_s = 0.66, P = 0.037), indicating that comparative rankings of animals on expressive dimensions were highly similar; however, the mean numerical scores on these dimensions had shifted (RM-ANOVA: Dim1: P < 0.001; Dim2: P < 0.001). Study 2 investigated the effect of giving different amounts of background information to two separate groups of observers assessing footage of 22 individual sheep in a behavioural demand facility. One group was given no contextual information about this facility, whereas the second group was told that the animals were moving towards and away from a feeder (in view) to access feed. Scores attributed to individual sheep by the two observer groups correlated significantly (Dim1: r_s = 0.92, P < 0.001; Dim2: r_s = 0.52, P = 0.013). A number of descriptive terms were generated by both observer groups and used in similar ways; other terms were unique to each group. The group given additional information about the experimental facility scored the sheep's behaviour as more 'directed' and 'focused' than observers who had not been told. Thus, in neither of the two studies did the experimentally imposed variations in context alter the characterisations of animals relative to each other, but in Study 1 they did affect the mean numerical values underlying these characterisations, indicating a need for careful attention to the use of visual analogue scales.
    animal 01/2015; DOI:10.1017/S1751731114003164 · 1.78 Impact Factor
  • ABSTRACT: This article presents six ideas about the construction of emotion: (a) Emotions are more readily distinguished by the situations they signify than by patterns of bodily responses; (b) emotions emerge from, rather than cause, emotional thoughts, feelings, and expressions; (c) the impact of emotions is constrained by the nature of the situations they represent; (d) in the OCC account (the model proposed by Ortony, Clore, and Collins in 1988), appraisals are psychological aspects of situations that distinguish one emotion from another, rather than triggers that elicit emotions; (e) analyses of the affective lexicon indicate that emotion words refer to internal mental states focused on affect; (f) the modularity of emotion, long sought in biology and behavior, exists as mental schemas for interpreting human experience in story, song, drama, and conversation.
  • ABSTRACT: Emotional facial expressions play a critical role in theories of emotion and figure prominently in research on almost every aspect of emotion. This article provides the background for a new database of basic emotional expressions. The goal in creating this set was to provide high-quality photographs of genuine facial expressions. Thus, after proper training, participants were encouraged to express "felt" emotions. The novel approach taken in this study was also used to establish whether a given expression was perceived as intended by untrained judges. The judgment task for perceivers was designed to be sensitive to subtle changes in meaning caused by the way an emotional display was evoked and expressed. Consequently, this allowed us to measure the purity and intensity of emotional displays, which are parameters that the validation methods used by other researchers do not capture. The final set comprises those pictures that received the highest recognition marks (i.e., accuracy with respect to the intended display) from independent judges, totaling 210 high-quality photographs of 30 individuals. Descriptions of the accuracy, intensity, and purity of the displayed emotion, as well as FACS AU codes, are provided for each picture. Given the unique methodology applied to gathering and validating this set of pictures, it may be a useful tool for research using face stimuli. The Warsaw Set of Emotional Facial Expression Pictures (WSEFEP) is freely accessible to the scientific community for noncommercial use by request at
    Frontiers in Psychology 12/2014; 5. DOI:10.3389/fpsyg.2014.01516 · 2.80 Impact Factor
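The between-session and between-group agreement statistics reported in the QBA abstract above are Spearman rank correlations (r_s). As a minimal illustration of what that statistic measures — agreement in the *ranking* of animals, independent of shifts in mean score — the following sketch computes r_s for two hypothetical score vectors; the numbers are invented for illustration and are not data from the study.

```python
# Illustrative sketch: Spearman's rank correlation (r_s), the agreement
# statistic reported for the two viewing sessions / observer groups.
# Score vectors below are hypothetical, not data from the study.

def rank(xs):
    """Return 1-based ranks (no tie handling; values assumed distinct)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0] * len(xs)
    for r, i in enumerate(order, start=1):
        ranks[i] = r
    return ranks

def spearman(x, y):
    """r_s = 1 - 6 * sum(d^2) / (n * (n^2 - 1)) for untied data,
    where d is the per-item difference in ranks."""
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rank(x), rank(y)))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical GPA dimension-1 scores for the same five sheep rated in
# two sessions: session 2 is uniformly shifted upward, but the ranking
# of the sheep is identical, so r_s = 1 even though the means differ.
session1 = [0.12, 0.85, 0.40, 0.66, 0.05]
session2 = [0.20, 0.90, 0.55, 0.70, 0.10]
print(spearman(session1, session2))  # → 1.0
```

This is why the abstract can report near-perfect correlations between sessions while the RM-ANOVA still detects a shift in mean numerical scores: r_s is insensitive to a uniform displacement on the visual analogue scale.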

Full-text available from 2 sources since Jun 6, 2014.