Affective engagement for facial expressions and emotional scenes: The influence of social anxiety

University of Florida, Gainesville, FL 32608, United States.
Biological Psychology, 05/2012; 91(1):103-110. DOI: 10.1016/j.biopsycho.2012.05.002
Source: PubMed

ABSTRACT: Pictures of emotional facial expressions or natural scenes are often used as cues in emotion research. We examined the extent to which these different stimuli engage emotion and attention, and whether the presence of social anxiety symptoms influences responding to facial cues. Sixty participants reporting high or low social anxiety viewed pictures of angry, neutral, and happy faces, as well as violent, neutral, and erotic scenes, while skin conductance and event-related potentials were recorded. Acoustic startle probes were presented throughout picture viewing, and blink magnitude, probe P3, and reaction time to the startle probe were also measured. Results indicated that viewing emotional scenes prompted strong reactions in autonomic, central, and reflex measures, whereas pictures of faces were generally weak elicitors of measurable emotional response. However, higher social anxiety was associated with modest electrodermal changes when viewing angry faces and mild startle potentiation when viewing either angry or smiling faces, compared to neutral. Taken together, pictures of facial expressions do not strongly engage fundamental affective reactions, but these cues appear to be effective in distinguishing between high and low social anxiety participants, supporting their use in anxiety research.

Available from: Bethany Wangelin, Mar 06, 2015
  • Source
    ABSTRACT: There is emerging evidence for a positivity effect in healthy aging, which describes an age-specific increased focus on positive compared to negative information. Life-span researchers have attributed this effect to the selective allocation of cognitive resources in the service of prioritized emotional goals. We explored the basic principles of this assumption by assessing selective attention and memory for visual stimuli, differing in emotional content and self-relevance, in young and old participants. To specifically address the impact of cognitive control, voluntary attentional selection during the presentation of multiple-item displays was analyzed and linked to participants' general ability of cognitive control. Results revealed a positivity effect in older adults' selective attention and memory, which was particularly pronounced for self-relevant stimuli. Focusing on positive and ignoring negative information was most evident in older participants with a generally higher ability to exert top-down control during visual search. Our findings highlight the role of controlled selectivity in the occurrence of a positivity effect in aging. Since the effect has been related to well-being in later life, we suggest that the ability to selectively allocate top-down control might represent a resilience factor for emotional health in aging.
    PLoS ONE 08/2014; 9(8):e104180. DOI: 10.1371/journal.pone.0104180
  • Source
    ABSTRACT: Time perception has been shown to be altered by emotions. This study employed event-related potentials (ERPs) to examine the effects of two threat-related emotions on the judgment of time intervals in the range of 490-910 ms. We demonstrated that disgust and fear have distinct influences on time perception. At the behavioral level, disgusted faces were estimated as longer and fearful faces as shorter (i.e., the generalization gradient for disgusted faces was shifted left, while the generalization gradient for fearful faces was shifted right) when compared with neutral faces. Accordingly, the contingent negative variation, an online ERP index of timing, displayed a larger area in the disgust condition and a smaller area in the fear condition compared with the neutral condition (disgust = 1.94 ± 2.35 μV·s, neutral = 1.40 ± 2.5 μV·s, fear = 1.00 ± 2.26 μV·s). These findings indicate that specific neural mechanisms may underlie the attentional effects of different subtypes of threat-related emotions on timing; compared with neutral faces, fearful faces are likely to attract more attentional resources, while disgusted faces may attract fewer attentional resources for emotional processing. The major contribution of the current study is to provide neural correlates of the fear vs. disgust divergence in time perception and to demonstrate, beyond the behavioral level, that the categorization of threat-related emotions should be refined so as to highlight the adaptability of the human defense system.
    Frontiers in Behavioral Neuroscience 08/2014; 8:293. DOI: 10.3389/fnbeh.2014.00293
  • ABSTRACT: The paper addresses improving the accuracy of voice authentication methods. The developed algorithm first extracts segmental parameters from the voice signal: the zero-crossing rate, the fundamental frequency, and mel-frequency cepstral coefficients. Based on these parameters, a neural-network classifier detects the speaker's emotional state: the parameters shape the distribution of neurons in Kohonen self-organizing maps, forming clusters of neurons that characterize a particular emotional state. Using regression analysis, a function of the parameters for the individual emotional states can be calculated. Accounting for this relationship increases voice authentication accuracy and prevents false rejections. © 2013 SPIE.
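The feature-extraction step named in that abstract starts with the zero-crossing rate. As a minimal, hypothetical sketch (this code, the function name, and the test tone are illustrative assumptions, not material from the paper), the zero-crossing rate of a sampled signal can be computed as the fraction of adjacent sample pairs whose signs differ:

```python
import numpy as np

def zero_crossing_rate(signal):
    """Fraction of adjacent sample pairs with differing sign."""
    signs = np.sign(signal)
    signs[signs == 0] = 1  # treat exact zeros as positive to avoid double-counting
    crossings = np.count_nonzero(signs[:-1] != signs[1:])
    return crossings / (len(signal) - 1)

# 100 ms of a 100 Hz tone sampled at 8 kHz: a pure tone of frequency f
# crosses zero about 2*f times per second, so its ZCR is roughly 2*f/fs.
fs = 8000
t = np.arange(0, 0.1, 1 / fs)
tone = np.sin(2 * np.pi * 100 * t)
print(zero_crossing_rate(tone))  # close to 2*100/8000 = 0.025
```

Voiced speech (low ZCR) separates from unvoiced fricatives (high ZCR) on this measure alone, which is why it is a common companion to the fundamental frequency and MFCCs in pipelines like the one described.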