The Face in the Crowd Effect Unconfounded: Happy Faces, Not Angry Faces, Are More Efficiently Detected in Single- and Multiple-Target Visual Search Tasks

Department of Psychology, Arizona State University, AZ, USA.
Journal of Experimental Psychology: General. 07/2011; 140(4):637-659. DOI: 10.1037/a0024060
Source: PubMed

ABSTRACT: Is it easier to detect angry or happy facial expressions in crowds of faces? The present studies used several variations of the visual search task to assess whether people selectively attend to expressive faces. Contrary to widely cited studies (e.g., Öhman, Lundqvist, & Esteves, 2001) that suggest angry faces "pop out" of crowds, our review of the literature found inconsistent evidence for the effect and suggested that low-level visual confounds could not be ruled out as the driving force behind the anger superiority effect. We then conducted 7 experiments, carefully designed to eliminate many of the confounding variables present in past demonstrations. These experiments showed no evidence that angry faces popped out of crowds or even that they were efficiently detected. These experiments instead revealed a search asymmetry favoring happy faces. Moreover, in contrast to most previous studies, the happiness superiority effect was shown to be robust even when obvious perceptual confounds--like the contrast of white exposed teeth that are typically displayed in smiling faces--were eliminated in the happy targets. Rather than attribute this effect to the existence of innate happiness detectors, we speculate that the human expression of happiness has evolved to be more visually discriminable because its communicative intent is less ambiguous than other facial expressions.

Available from: D. Vaughn Becker, Nov 21, 2014
  • ABSTRACT: Prior reports of preferential detection of emotional expressions in visual search have yielded inconsistent results, even for face stimuli that avoid obvious expression-related perceptual confounds. The current study investigated inconsistent reports of anger and happiness superiority effects using face stimuli drawn from the same database. Experiment 1 excluded procedural differences as a potential factor, replicating a happiness superiority effect in a procedure that had previously yielded an anger superiority effect. Experiments 2a and 2b confirmed that image colour and poser gender did not account for the prior inconsistent findings. Experiments 3a and 3b identified the stimulus set as the critical variable, revealing happiness or anger superiority effects for two partially overlapping sets of face stimuli. These results highlight the critical role of stimulus selection in the observation of happiness or anger superiority effects in visual search, even for face stimuli that avoid obvious expression-related perceptual confounds and are drawn from a single database.
    Cognition and Emotion, 04/2015; DOI: 10.1080/02699931.2015.1027663
  • ABSTRACT: Recently, D. V. Becker, Anderson, Mortensen, Neufeld, and Neel (2011) proposed recommendations for avoiding methodological confounds in visual search studies using emotional photographic faces. These confounds were argued to cause the frequently observed anger superiority effect (ASE), the faster detection of angry than happy expressions, and to conceal a true happiness superiority effect (HSE). In Experiment 1, we applied these recommendations, for the first time, to visual search among schematic faces, which had previously yielded a consistently robust ASE. Contrary to the prevailing literature, but consistent with D. V. Becker et al. (2011), we observed an HSE with schematic faces. This HSE was replicated in Experiments 2 and 3 using a similar method in discrimination tasks rather than fixed-target searches. Experiment 4 isolated background heterogeneity as the key determinant of the HSE.
    Emotion, 05/2014; 14(4). DOI: 10.1037/a0036043
  • ABSTRACT: This study investigated the neurocognitive mechanisms underlying the role of the eye and mouth regions in the recognition of facial happiness, anger, and surprise. To this end, face stimuli were shown in three formats (whole face, upper half visible, and lower half visible), and behavioral categorization, computational modeling, and ERP (event-related potential) measures were combined. The N170 (150-180ms post-stimulus; right hemisphere) and EPN (early posterior negativity; 200-300ms; mainly right hemisphere) were modulated by the expression of whole faces but not by separate halves, suggesting that expression encoding (N170) and emotional assessment (EPN) require holistic processing, mainly in the right hemisphere. In contrast, the mouth region of happy faces enhanced left temporo-occipital activity (150-180ms) and also enhanced LPC (late positive complex; centro-parietal) activity earlier (350-450ms) than the angry eyes (450-600ms) or other face regions. Relatedly, computational modeling revealed that the mouth region of happy faces was visually salient by 150ms after stimulus onset. This suggests that analytical or part-based processing of the salient smile occurs early (150-180ms), is left-lateralized, and is subsequently used as a shortcut to identify the expression of happiness (350-450ms). This would account for the happy-face advantage in behavioral recognition tasks when the smile is visible.
    NeuroImage, 02/2014; 92. DOI: 10.1016/j.neuroimage.2014.01.048