Article

Eyes Wide Shut: Amygdala Mediates Eyes-Closed Effect on Emotional Experience with Music

University of Leuven, Belgium
PLoS ONE (Impact Factor: 3.23). 02/2009; 4(7):e6230. DOI: 10.1371/journal.pone.0006230
Source: PubMed

ABSTRACT

The perceived emotional value of stimuli and, as a consequence, the subjective emotional experience of them can be affected by context-dependent styles of processing. Investigating the neural correlates of emotional experience therefore requires accounting for this variable, which poses an experimental challenge. Closing the eyes changes the style of attending to auditory stimuli by modifying the perceptual relationship with the environment without changing the stimulus itself. In the current study, we used fMRI to characterize the neural mediators of this modification on the experience of emotionality in music. We assumed that the eyes-closed condition would reveal an interplay between different levels of neural processing of emotions. More specifically, we focused on the amygdala as a central node of the limbic system and on its co-activation with the Locus Ceruleus (LC) and Ventral Prefrontal Cortex (VPFC), regions involved in processing 'low', visceral-related and 'high', cognitive-related values of emotional stimuli, respectively. Fifteen healthy subjects listened to negative and neutral music excerpts with eyes closed or open. As expected, behavioral results showed that closing the eyes while listening to emotional music enhanced ratings of emotionality, specifically for negative music. Correspondingly, fMRI results showed greater activation in the amygdala when subjects listened to the emotional music with eyes closed relative to eyes open. Moreover, using voxel-based correlation and dynamic causal modeling analyses, we demonstrated that increased amygdala activation to negative music with eyes closed led to increased activations in the LC and VPFC. This finding supports a system-based model of perceived emotionality in which the amygdala plays a central role in mediating the effect of context-based processing style by recruiting neural operations involved in both visceral (i.e. 'low') and cognitive (i.e. 'high') processing of emotions.
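
The abstract mentions a voxel-based correlation analysis seeded in the amygdala. As a rough illustration only, the minimal Python sketch below shows what a seed-based correlation between an amygdala time course and other brain voxels could look like; the array names, shapes, and simulated data are assumptions for illustration and do not reproduce the authors' actual pipeline.

    # Minimal sketch of a seed-based correlation analysis (illustrative only).
    # Assumes pre-extracted, preprocessed BOLD time courses; names and shapes
    # are hypothetical, not taken from the paper's actual analysis.
    import numpy as np

    def seed_correlation(seed_ts, voxel_ts):
        """Pearson r between a seed time course (n_timepoints,) and each
        column of voxel_ts (n_timepoints, n_voxels)."""
        seed = (seed_ts - seed_ts.mean()) / seed_ts.std()
        vox = (voxel_ts - voxel_ts.mean(axis=0)) / voxel_ts.std(axis=0)
        return vox.T @ seed / len(seed)

    rng = np.random.default_rng(0)
    amygdala_ts = rng.standard_normal(200)            # hypothetical amygdala ROI average
    target_ts = rng.standard_normal((200, 5000))      # hypothetical target voxels (e.g. LC, VPFC masks)
    r_map = seed_correlation(amygdala_ts, target_ts)  # one correlation value per voxel
    print(r_map.shape)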

    • "It is generally agreed that music evokes emotions and stimulates physiological and behavioral responses (Habibi and Damasio, 2014). In addition, more recent neuroscience research indicates there are shared neural networks implicated in both emotion and music processing (Blood and Zatorre, 2001;Satoh et al., 2001;Brown et al., 2004;Menon and Levitin, 2005;Baumgartner et al., 2006;Koelsch et al., 2006Koelsch et al., , 2008Bengtsson et al., 2007;Brown and Martinez, 2007;Foss et al., 2007;Kleber et al., 2007;Mitterschiffthaler et al., 2007;Mizuno and Sugishita, 2007;Berkowitz and Ansari, 2008; Limb andBraun, 2008;Lerner et al., 2009), and specifically between music and ER processing (Sena Moore, 2013). There is also evidence to support the developmentally appropriate use of music-based experiences to target ER development in preschoolers because music stimulates physiologic arousal and induces emotions, and assumes a natural role in bonding and social interactions. "
    ABSTRACT: Emotion regulation (ER) is an umbrella term describing the interactive, goal-dependent, explicit and implicit processes that are intended to help an individual manage and shift an emotional experience. The primary window for appropriate ER development occurs during the infant, toddler, and preschool years. Atypical ER development is considered a risk factor for mental health problems and has been implicated as a primary mechanism underlying childhood pathologies. Current treatments are predominantly verbal- and behavioral-based and lack the opportunity to practice in-the-moment management of emotionally charged situations. There is also an absence of caregiver–child interaction in these treatment strategies. Based on behavioral and neural support for music as a therapeutic mechanism, the incorporation of intentional music experiences, facilitated by a music therapist, may be one way to address these limitations. Musical Contour Regulation Facilitation (MCRF) is an interactive therapist–child music-based intervention for ER development practice in preschoolers. The MCRF intervention uses the deliberate contour and temporal structure of a music therapy session to mirror the changing flow of the caregiver–child interaction through the alternation of high-arousal and low-arousal music experiences. The purpose of this paper is to describe the Therapeutic Function of Music (TFM), a theory-based description of the structural characteristics of a music-based stimulus that musically facilitates developmentally appropriate high-arousal and low-arousal in-the-moment ER experiences. The TFM analysis is based on a review of the music theory, music neuroscience, and music development literature and provides a preliminary model of the structural characteristics of the music as a core component of the MCRF intervention.
    Full-text · Article · Oct 2015 · Frontiers in Human Neuroscience
    • "open state while listening to negative-valence music. This result may stem from participants' experiencing stronger feelings of negative emotion with their eyes closed. This view was supported by participants' evaluation of the emotional stimuli; their experienced emotion valence was more negative when they listened to music with their eyes-closed.Lerner et al. (2009)demonstrated that the eyes-closed state enhanced the negative emotion and arousal level during exposure to affective music, and a greater amygdala response to negative-valence music was found with closed eyes compared to open eyes. They therefore concluded that specific styles of attending can modify the activation of the amygdala in res"
    ABSTRACT: In real life, listening to music may be associated with an eyes-closed or eyes-open state. The effect of eye state on listeners' reactions to music has attracted some attention, but its influence on brain activity has not been fully investigated. The present study aimed to evaluate electroencephalographic (EEG) markers for the emotional valence of music in different eye states. Thirty participants listened to musical excerpts with different emotional content in the eyes-closed and eyes-open states. The results showed that participants rated the music as more pleasant, or with more positive valence, in the eyes-open state. In addition, we found that the alpha asymmetry indices calculated at parietal and temporal sites reflected emotional valence in the eyes-closed and eyes-open states, respectively. Theta power in the frontal area significantly increased while listening to emotionally positive music compared to emotionally negative music in the eyes-closed condition. These effects of eye state on EEG markers are discussed in terms of brain mechanisms underlying attention and emotion.
    Full-text · Article · Aug 2015 · Frontiers in Psychology
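
The alpha asymmetry index referred to in the entry above is conventionally computed as the difference in log-transformed alpha-band (roughly 8–13 Hz) power between homologous right and left electrodes. The sketch below is a minimal illustration under that assumption, using synthetic signals and SciPy's Welch estimator; the channel pair, sampling rate, and band limits are assumptions, not details taken from the study.

    # Sketch of an alpha asymmetry index: ln(right alpha power) - ln(left alpha power).
    # Synthetic signals and the channel pairing are assumptions for illustration.
    import numpy as np
    from scipy.signal import welch

    def alpha_power(x, fs, fmin=8.0, fmax=13.0):
        """Mean alpha-band power from a Welch PSD estimate."""
        freqs, psd = welch(x, fs=fs, nperseg=int(fs) * 2)
        band = (freqs >= fmin) & (freqs <= fmax)
        return psd[band].mean()

    fs = 250                                     # assumed sampling rate (Hz)
    rng = np.random.default_rng(1)
    left = rng.standard_normal(30 * fs)          # e.g. a left parietal channel (P3)
    right = rng.standard_normal(30 * fs)         # e.g. a right parietal channel (P4)

    asymmetry = np.log(alpha_power(right, fs)) - np.log(alpha_power(left, fs))
    print(f"alpha asymmetry index: {asymmetry:.3f}")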
    • "Visual imagery is considered as one basic emotion-evoking principle during music listening [68] and anatomical studies indicate that auditory core, belt and parabelt regions project to V1 and V2 of the visual cortex, and that neurons in V2 project back into these auditory regions [69]. Note that the eyes-closed requirement of the experimental task used in the present study was motivated by evidence suggesting that affective activity is enhanced when the eyes are closed [70], a condition that practically minimizes any vision-specific sensory contributions to visual cortex activity. Evidence suggesting that the occipital visual cortex is also involved in spatial hearing, in people with normal sight, have also been observed during several different auditory tasks (for details see 71). "
    ABSTRACT: The purpose of the present study was to investigate interaction effects between functional MRI scanner noise and affective neural processes. Stimuli comprised psychoacoustically balanced musical pieces expressing three different emotions (fear, neutral, joy). Participants (N=34, 19 female) were split into two groups, one subjected to continuous scanning and the other to sparse temporal scanning, which features decreased scanner noise. Tests for interaction effects between scanning group (sparse/quieter vs. continuous/noisier) and emotion (fear, neutral, joy) were performed. Results revealed interactions between the affective expression of stimuli and scanning group localized in bilateral auditory cortex, insula, and visual cortex (calcarine sulcus). Post-hoc comparisons revealed that during sparse scanning, but not during continuous scanning, BOLD signals were significantly stronger for joy than for fear, as well as stronger for fear than for neutral, in bilateral auditory cortex. During continuous scanning, but not during sparse scanning, BOLD signals were significantly stronger for joy than for neutral in the left auditory cortex and for joy than for fear in the calcarine sulcus. To the authors' knowledge, this is the first study to show a statistical interaction effect between scanner noise and affective processes; it extends evidence suggesting that scanner noise is an important factor in functional MRI research that can affect and distort affective brain processes.
    Full-text · Article · Nov 2013 · PLoS ONE
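
The interaction test described in the entry above crosses a between-subjects factor (scanning group) with a within-subjects factor (emotion). As a simplified illustration only, the sketch below fits a plain two-way ANOVA on simulated ROI values; it ignores the repeated-measures structure of the real design and is not the authors' analysis.

    # Sketch of testing a scanning-group x emotion interaction on simulated ROI signal.
    # Simulated values and a plain two-way ANOVA are used for illustration only.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm

    rng = np.random.default_rng(2)
    rows = [
        {"group": g, "emotion": e, "bold": rng.normal(loc=0.5 if g == "sparse" else 0.3)}
        for g in ("sparse", "continuous")
        for e in ("fear", "neutral", "joy")
        for _ in range(17)                       # 17 simulated subjects per group (N = 34)
    ]
    df = pd.DataFrame(rows)

    # Two-way ANOVA; the row labelled C(group):C(emotion) is the interaction term.
    model = smf.ols("bold ~ C(group) * C(emotion)", data=df).fit()
    print(anova_lm(model, typ=2))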