Are facial displays social? Situational influences in the attribution of emotion to facial expressions.

Facultad de Psicologia, Universidad Autónoma de Madrid, Ciudad Universitaria de Cantoblanco, 28049, Madrid, Spain.
The Spanish Journal of Psychology (Impact Factor: 0.74). 12/2002; 5(2):119-24. DOI: 10.1017/S1138741600005898
Source: PubMed

ABSTRACT Observers are remarkably consistent in attributing particular emotions to particular facial expressions, at least in Western societies. Here, we suggest that this consistency is an instance of the fundamental attribution error. We therefore hypothesized that a small variation in the procedure of the recognition study, one that emphasizes situational information, would change the participants' attributions. In two studies, participants were asked to judge whether a prototypical "emotional facial expression" was more plausibly associated with a social-communicative situation (one involving communication to another person) or with an equally emotional but nonsocial situation. Participants were more likely to associate each facial display with the social than with the nonsocial situation. This result was found across all emotions presented (happiness, fear, disgust, anger, and sadness) and for both Spanish and Canadian participants.

  • Source
    ABSTRACT: DynEmo is a database available to the scientific community. It contains dynamic and natural emotional facial expressions (EFEs) displaying subjective affective states rated by both the expresser and observers. Methodological and contextual information is provided for each expression. This multimodal corpus meets psychological, ethical, and technical criteria. It is quite large, containing two sets of 233 and 125 recordings of EFEs of ordinary Caucasian people (ages 25 to 65; 182 females and 176 males) filmed in natural but standardized conditions. In Set 1, EFE recordings are associated with the affective state of the expresser (self-reported after the emotion-inducing task, using dimensional, action-readiness, and emotional-label items). In Set 2, EFE recordings are associated both with the affective state of the expresser and with the time line (continuous annotations) of observers' ratings of the emotions displayed throughout the recording. The time line allows any researcher interested in analysing non-verbal human behavior to segment the expressions into emotions.
    10/2013; 5(5):61-80. DOI:10.5121/ijma.2013.5505
  • Source
    ABSTRACT: Collective rituals are biologically ancient and culturally pervasive, yet few studies have quantified their effects on participants. We assessed two plausible models from qualitative anthropology: ritual empathy predicts affective convergence among all ritual participants irrespective of ritual role; rite-of-passage predicts emotional differences, specifically that ritual initiates will express relatively negative valence when compared with non-initiates. To evaluate model predictions, images of participants in a Spanish fire-walking ritual were extracted from video footage and assessed by nine Spanish raters for arousal and valence. Consistent with rite-of-passage predictions, we found that arousal jointly increased for all participants but that valence differed by ritual role: fire-walkers exhibited increasingly positive arousal and increasingly negative valence when compared with passengers. This result offers the first quantified evidence for rite of passage dynamics within a highly arousing collective ritual. Methodologically, we show that surprisingly simple and non-invasive data structures (rated video images) may be combined with methods from evolutionary ecology (Bayesian Generalized Linear Mixed Effects models) to clarify poorly understood dimensions of the human condition.
    Frontiers in Psychology 01/2013; 4:960. DOI:10.3389/fpsyg.2013.00960 · 2.80 Impact Factor
  • Source
    Psychological Inquiry 07/2011; 22(3):210-216. DOI:10.1080/1047840X.2011.567960 · 4.73 Impact Factor

Full-text (3 sources), available from May 17, 2014.