Article

Impact of depression on response to comedy: A dynamic facial coding analysis

Department of Psychology, University of Pittsburgh, Pittsburgh, PA 15260, USA.
Journal of Abnormal Psychology (Impact Factor: 4.86). 11/2007; 116(4):804-9. DOI: 10.1037/0021-843X.116.4.804
Source: PubMed

ABSTRACT: Individuals suffering from depression show diminished facial responses to positive stimuli. Recent cognitive research suggests that depressed individuals may appraise emotional stimuli differently than do nondepressed persons. Prior studies do not indicate whether depressed individuals respond differently when they encounter positive stimuli that are difficult to avoid. The authors investigated dynamic responses of individuals varying in both history of major depressive disorder (MDD) and current depressive symptomatology (N = 116) to robust positive stimuli. The Facial Action Coding System (Ekman & Friesen, 1978) was used to measure affect-related responses to a comedy clip. Participants reporting current depressive symptomatology were more likely to evince affect-related shifts in expression following the clip than were those without current symptomatology. This effect of current symptomatology emerged even when the contrast focused only on individuals with a history of MDD. Specifically, persons with current depressive symptomatology were more likely than those without current symptomatology to control their initial smiles with negative affect-related expressions. These findings suggest that integration of emotion science and social cognition may yield important advances for understanding depression.
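To make the coding logic concrete, below is a minimal, hypothetical sketch (in Python; not the authors' code) of how a "controlled" smile might be flagged from FACS event data: a smile onset (AU12) followed shortly by a dampening or negative-affect action unit. The data structure, the choice of dampening AUs (AU14, AU15, AU17), and the time window are illustrative assumptions, not details taken from the study.

# Minimal sketch (not the authors' code): flagging "controlled" smiles in
# FACS event data. Assumes each participant's coding is a time-ordered list
# of (onset_seconds, action_unit) events; AU numbers follow standard FACS
# conventions (AU12 = lip corner puller / smile). Thresholds are illustrative.

SMILE_AU = 12
DAMPENING_AUS = {14, 15, 17}   # dimpler, lip corner depressor, chin raiser
CONTROL_WINDOW_S = 4.0         # how soon after smile onset a dampening AU must appear

def has_controlled_smile(events):
    """Return True if any smile onset is followed by a dampening AU within the window."""
    smile_onsets = [t for t, au in events if au == SMILE_AU]
    for onset in smile_onsets:
        for t, au in events:
            if au in DAMPENING_AUS and onset < t <= onset + CONTROL_WINDOW_S:
                return True
    return False

# Hypothetical data: compare rates of controlled smiles across symptom groups.
participants = {
    "P01": {"symptomatic": True,  "events": [(2.1, 12), (4.0, 15)]},
    "P02": {"symptomatic": False, "events": [(1.8, 12)]},
}

for group in (True, False):
    members = [p for p in participants.values() if p["symptomatic"] == group]
    rate = sum(has_controlled_smile(p["events"]) for p in members) / len(members)
    label = "current symptoms" if group else "no current symptoms"
    print(f"{label}: {rate:.0%} showed controlled smiles")

In the study itself, trained FACS coders identified these expression shifts manually; an event-based rule like this only illustrates the kind of pattern being counted, not the authors' procedure.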

  • ABSTRACT: In this paper we present the design of a wearable device that reads positive facial expressions using physiological signals. We first analyze facial morphology in three dimensions and facial electromyographic (EMG) signals at different facial locations, and show that high-amplitude EMG signals can be detected over areas of low facial mobility on the side of the face; these signals correlate with those obtained from electrodes at traditional surface-EMG capture positions over the facial muscles on the front of the face. We use a multi-attribute decision-making method to find suitable electrode positions on the side of the face for capturing these signals. Based on this analysis, we design and implement an ergonomic, highly reliable wearable device. Because the signals are recorded distally, the proposed device uses independent component analysis and an artificial neural network to analyze them, achieving a high facial expression recognition rate from the side of the face. Emotional facial expressions recognized through the wearable interface can be recorded during therapeutic interventions and used for long-term facial expression recognition to quantify and infer the user's affective state in support of medical professionals.
    IEEE Transactions on Affective Computing 03/2014; 5(3):227-237. DOI:10.1109/TAFFC.2014.2313557 · 3.47 Impact Factor
  • ABSTRACT: Methods to assess individual facial actions have the potential to shed light on important behavioral phenomena ranging from emotion and social interaction to psychological disorders and health. However, manual coding of such actions is labor intensive and requires extensive training. To date, establishing reliable automated coding of unscripted facial actions has been a daunting challenge, impeding development of psychological theories and applications that require facial expression assessment. It is therefore essential that automated coding systems be developed with enough precision and robustness to ease the burden of manual coding in challenging data involving variation in participant gender, ethnicity, head pose, speech, and occlusion. We report a major advance in automated coding of spontaneous facial actions during an unscripted social interaction involving three strangers. For each participant (n = 80, 47% women, 15% nonwhite), 25 facial action units (AUs) were manually coded from video using the Facial Action Coding System. Twelve AUs occurred more than 3% of the time and were processed using automated FACS coding. Automated coding showed very strong reliability for the proportion of time that each AU occurred (mean intraclass correlation = 0.89), and the more stringent criterion of frame-by-frame reliability was moderate to strong (mean Matthews correlation = 0.61). With few exceptions, differences in AU detection related to gender, ethnicity, pose, and average pixel intensity were small. Fewer than 6% of frames could be coded manually but not automatically. These findings suggest that automated FACS coding has progressed sufficiently to be applied to observational research in emotion and related areas of study.
    Behavior Research Methods 10/2014; DOI:10.3758/s13428-014-0536-1 · 2.12 Impact Factor (a reliability-scoring sketch appears after this list)
  • ABSTRACT: Recently there has been growing interest in automatically recognizing nonverbal behaviors that are linked with psychological conditions. Work in this direction has shown great potential for conditions such as depression and post-traumatic stress disorder (PTSD); however, gender differences have mostly not been explored. In this paper, we show that gender plays an important role in the automatic assessment of psychological conditions such as depression and PTSD. We identify a directly interpretable and intuitive set of predictive indicators, selected from three general categories of nonverbal behaviors: affect, expression variability, and motor variability. For the analysis, we employ a semi-structured virtual-human interview dataset that includes 53 video-recorded interactions. Our experiments on automatic classification of psychological conditions show that a gender-dependent approach significantly improves performance over a gender-agnostic one.
    Journal on Multimodal User Interfaces 03/2014; 9(1). DOI:10.1007/s12193-014-0161-4 · 0.46 Impact Factor
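
The frame-by-frame reliability figures in the Behavior Research Methods abstract above can be made concrete with a small sketch: given binary per-frame labels for a single action unit from a manual coder and an automated system, the Matthews correlation coefficient scores frame-level agreement, while each coder's base rate corresponds to the coarser proportion-of-time criterion. The data below are invented for illustration, and scikit-learn's matthews_corrcoef is used for the computation; this is not the cited study's pipeline.

# Minimal sketch (not the cited study's pipeline): scoring frame-by-frame
# agreement between manual and automated FACS coding of one action unit.
# Assumes both coders produce one binary label per video frame (1 = AU present).
import numpy as np
from sklearn.metrics import matthews_corrcoef

manual    = np.array([0, 0, 1, 1, 1, 0, 0, 1, 1, 0])  # hypothetical manual codes
automated = np.array([0, 0, 1, 1, 0, 0, 0, 1, 1, 1])  # hypothetical automated codes

# Frame-by-frame reliability: Matthews correlation coefficient (the stricter
# criterion, reported as 0.61 on average in the abstract above).
mcc = matthews_corrcoef(manual, automated)

# Coarser reliability: agreement on the proportion of time the AU occurred
# (summarized across AUs in the abstract with an intraclass correlation).
manual_rate, automated_rate = manual.mean(), automated.mean()

print(f"MCC = {mcc:.2f}; base rates: manual {manual_rate:.0%}, automated {automated_rate:.0%}")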
