Two techniques for inducing depressed mood in the laboratory are described and evaluated. The Velten mood induction procedure has been shown to mimic the effects of naturally occurring depressed mood on a wide range of variables, some of which are unlikely to be susceptible to faking. It therefore appears that the Velten depression induction produces a state which is a good analogue of mild, naturally occurring retarded depression. However, between 30% and 50% of subjects fail to respond to the Velten. This makes it cumbersome for research purposes and raises questions about the generalizability of results obtained using it. The Musical mood induction procedure has been less extensively researched than the Velten. However, the available evidence suggests that it also produces a good analogue of mild, naturally occurring retarded depression. In addition, it has the advantage that almost all subjects respond to it. Some commentators have taken the fact that the Velten procedure can induce depressed mood as evidence for the cognitive theory of depression. It is argued that this conclusion is invalid, as it makes unwarranted assumptions about the strategies subjects use in order to change mood during the Velten procedure. Several practical points relating to the use of the Velten and Musical induction procedures are discussed.
"Remarkably, our effect suggests that comfortable/uncomfortable actions can be conceived as a new powerful mood inducer. Hence, our Motor Action Mood Induction Procedure, MAMIP, should be added to the list including the Musical Mood Induction Technique, MMIT, the Velten Mood Induction Procedure, VMIP, and the self-referential mood induction, to name only a few procedures used in controlled settings."
ABSTRACT: Perception, cognition, and emotion do not operate along segregated pathways; rather, their adaptive interaction is supported by various sources of evidence. For instance, the aesthetic appraisal of powerful mood inducers like music can bias the facial expression of emotions towards mood congruency. In four experiments we showed similar mood-congruency effects elicited by the comfort/discomfort of body actions. Using a novel Motor Action Mood Induction Procedure, we let participants perform comfortable/uncomfortable visually-guided reaches and tested them in a facial emotion identification task. Through the alleged mediation of motor action induced mood, action comfort enhanced the quality of the participant's global experience (a neutral face appeared happy and a slightly angry face neutral), while action discomfort made a neutral face appear angry and a slightly happy face neutral. Furthermore, uncomfortable (but not comfortable) reaching improved the sensitivity for the identification of emotional faces and reduced the identification time of facial expressions, as a possible effect of hyper-arousal from an unpleasant bodily experience.
PLoS ONE 09/2014; 9(9):e108211. DOI:10.1371/journal.pone.0108211
"Taken together, these experimental comparisons and meta-analysis attempts gradually add to our knowledge of different affect/emotion induction procedures' efficacy. However, most studies to date either only focused on a couple of affect induction methods (Brewer et al., 1980; Clark, 1983; Baumgartner et al., 2006a; Jallais and Gilet, 2010) or just targeted one aspect of affective experience (e.g., anger in Lobbestael et al., 2008; sadness in Vuoskoski and Eerola, 2012). To provide a better-rounded picture of major affect induction techniques, a more extensive …"
ABSTRACT: Affect is a fundamental aspect of the human mind. An increasing number of experiments attempt to examine the influence of affect on other psychological phenomena. To accomplish this research, it is necessary to experimentally modify participants' affective states. In the present experiment, we compared the efficacy of four commonly used affect induction procedures. Participants (38 healthy undergraduate students: 18 males) were randomly assigned to either a pleasant or an unpleasant affect induction group, and then underwent four different affect induction procedures: (1) recall of an affectively salient event accompanied by affectively congruent music, (2) script-driven guided imagery, (3) viewing images while listening to affectively congruent music, and (4) posing affective facial actions, body postures, and vocal expressions. All four affect induction methods were successful in inducing both pleasant and unpleasant affective states. The viewing-images-with-music and recall-with-music procedures were most effective in enhancing positive affect, whereas the viewing-images-with-music procedure was most effective in enhancing negative affect. Implications for the scientific study of affect are discussed.
Frontiers in Psychology 07/2014; 5:689. DOI:10.3389/fpsyg.2014.00689
"Interviews elicit a wide range of emotion and interpersonal behavior. Film clips and games are well-validated approaches to elicit …"
Fig. 1 caption: Upper-left: general view from a regular camera; Upper-right: 2D video; Lower-left: 3D dynamic geometric model; Lower-right: 3D dynamic geometric model with mapped texture.
ABSTRACT: Facial expression is central to human experience. Its efficient and valid measurement are challenges that automated facial image analysis seeks to address. Most publicly available databases are limited to 2D static images or video of posed facial behavior. Because posed and un-posed (aka "spontaneous") facial expressions differ along several dimensions including complexity and timing, well-annotated video of un-posed facial behavior is needed. Moreover, because the face is a three-dimensional deformable object, 2D video may be insufficient, and therefore 3D video archives are required. We present a newly developed 3D video database of spontaneous facial expressions in a diverse group of young adults. Well-validated emotion inductions were used to elicit expressions of emotion and paralinguistic communication. Frame-level ground-truth for facial actions was obtained using the Facial Action Coding System. Facial features were tracked in both 2D and 3D domains. To the best of our knowledge, this new database is the first of its kind for the public. The work promotes the exploration of 3D spatiotemporal features in subtle facial expression, better understanding of the relation between pose and motion dynamics in facial action units, and deeper understanding of naturally occurring facial action.