Processing emotional category congruency between emotional facial expressions and emotional words.

Macquarie Centre for Cognitive Science (MACCS), Macquarie University, NSW, 2109 Australia.
Cognition and Emotion (Impact Factor: 2.52). 02/2011; 25(2):369-79. DOI: 10.1080/02699931.2010.488945
Source: PubMed

ABSTRACT: Facial expressions are critical for effective social communication, and as such may be processed by the visual system even when it might be advantageous to ignore them. Previous research has shown that categorising emotional words was impaired when faces of a conflicting valence were simultaneously presented. In the present study, we examined whether emotional word categorisation would also be impaired when faces of the same (negative) valence but different emotional category (either angry, sad or fearful) were simultaneously presented. Behavioural results provided evidence for involuntary processing of basic emotional facial expression category, with slower word categorisation when the face and word categories were incongruent (e.g., angry word and sad face) than congruent (e.g., angry word and angry face). Event-related potentials (ERPs) time-locked to the presentation of the word-face pairs also revealed that emotional category congruency effects were evident from approximately 170 ms after stimulus onset.

  • "Targets and distractors (either faces or lions) were superimposed to ensure that the spatial location of the images was held constant (similar to previous studies with words: [5]–[8]). In contrast to other phenomena such as bistable perception or binocular rivalry (e.g., [17], [20], [21]), the superimposing of targets and distractors was also important for ensuring equivalence between the veridical and perceived stimuli."
    ABSTRACT: Facial expressions play an important role in successful social interactions, with previous research suggesting that facial expressions may be processed involuntarily. In the current study, we investigate whether involuntary processing of facial expressions would also occur when facial expression distractors are simultaneously presented in the same spatial location as facial expression targets. Targets and distractors from another stimulus class (lions) were also used. Results indicated that angry facial expression distractors interfered more than neutral face distractors with the ability to respond to both face and lion targets. These findings suggest that information from angry facial expressions can be extracted rapidly from a very brief presentation (50 ms), providing compelling evidence that angry facial expressions are processed involuntarily.
    PLoS ONE 07/2011; 6(7):e22287. DOI: 10.1371/journal.pone.0022287 · 3.23 Impact Factor
  • ABSTRACT: Decades ago, the "New Look" movement challenged how scientists thought about vision by suggesting that conceptual processes shape visual perceptions. Currently, affective scientists are likewise debating the role of concepts in emotion perception. Here, we utilized a repetition-priming paradigm in conjunction with signal detection and individual difference analyses to examine how providing emotion labels, which correspond to discrete emotion concepts, affects emotion recognition. In Study 1, pairing emotional faces with emotion labels (e.g., "sad") increased individuals' speed and sensitivity in recognizing emotions. Additionally, individuals with alexithymia, who have difficulty labeling their own emotions, struggled to recognize emotions based on visual cues alone, but not when emotion labels were provided. Study 2 replicated these findings and further demonstrated that emotion concepts can shape perceptions of facial expressions. Together, these results suggest that emotion perception involves conceptual processing. We discuss the implications of these findings for affective, social, and clinical psychology. (PsycINFO Database Record (c) 2015 APA, all rights reserved).
    Emotion 05/2015; DOI: 10.1037/a0039166 · 3.88 Impact Factor