Article

Functional asymmetry and interhemispheric cooperation in the perception of emotions from facial expressions.

Department of Psychology and Center for Cognitive Science, University of Turin, Via Po 14, 10123, Turin, Italy.
Experimental Brain Research (Impact Factor: 2.17). 06/2006; 171(3):389-404. DOI: 10.1007/s00221-005-0279-4
Source: PubMed

ABSTRACT: The present study used the redundant target paradigm on healthy subjects to investigate functional hemispheric asymmetries and interhemispheric cooperation in the perception of emotions from faces. In Experiment 1 participants responded to checkerboards presented either unilaterally to the left (LVF) or right visual half field (RVF), or simultaneously to both hemifields (BVF), while performing a pointing task for the control of eye movements. As previously reported (Miniussi et al. in J Cogn Neurosci 10:216-230, 1998), redundant stimulation led to shorter latencies for stimulus detection (bilateral gain or redundant target effect, RTE) that exceeded the limit for a probabilistic interpretation, thereby validating the pointing procedure and supporting interhemispheric cooperation. In Experiment 2 the same pointing procedure was used in a go/no-go task requiring subjects to respond when seeing a target emotional expression (happy or fearful, counterbalanced between blocks). Faster reaction times to unilateral LVF than to RVF emotions, regardless of valence, indicate that the perception of both positive and negative emotional faces is lateralized toward the right hemisphere. Simultaneous presentation of two congruent emotional faces, either happy or fearful, produced an RTE that cannot be explained by probability summation, suggesting interhemispheric cooperation and neural summation. No such effect was present with BVF incongruent facial expressions. In Experiment 3 we asked whether the RTE for emotional faces depends on physical identity between BVF stimuli, adding a second BVF congruent condition in which the stimuli shared only the emotion and not physical or gender identity (i.e. two different faces expressing the same emotion). The RTE and interhemispheric cooperation were also present in this second BVF congruent condition. This shows that emotional congruency is a sufficient condition for the RTE in the intact brain and that the cerebral hemispheres can interact despite physical differences between stimuli.
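
Note on the statistical criterion (added for context; the abstract does not name the specific test): a standard bound for deciding whether an RTE exceeds "the limit for a probabilistic interpretation" is Miller's (1982) race model inequality, which a purely probabilistic (race) account requires the cumulative response-time distributions to satisfy:

\[ P(\mathrm{RT} \le t \mid \mathrm{BVF}) \;\le\; P(\mathrm{RT} \le t \mid \mathrm{LVF}) + P(\mathrm{RT} \le t \mid \mathrm{RVF}) \quad \text{for all latencies } t. \]

Redundant-target latencies that violate this bound at some t cannot be attributed to probability summation alone and are instead taken as evidence of neural summation between the hemispheres, which is the interpretation adopted in the abstract above.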

  • ABSTRACT: Studies indicate that perceiving emotional body language recruits fronto-parietal regions involved in action execution. However, the nature of such motor activation is unclear. Using transcranial magnetic stimulation (TMS), we provide correlational and causative evidence of two distinct stages of motor cortex engagement during emotion perception. Participants observed pictures of body expressions and categorized them as happy, fearful or neutral while receiving TMS over the left or right motor cortex at 150 and 300 ms after picture onset. In the early phase (150 ms), we observed a reduction of excitability for happy and fearful emotional bodies that was specific to the right hemisphere and correlated with participants' disposition to feel personal distress. This 'orienting' inhibitory response to emotional bodies was also paralleled by a general drop in categorization accuracy when stimulating the right but not the left motor cortex. Conversely, at 300 ms, greater excitability for negative, positive and neutral movements was found in both hemispheres. This later motor facilitation marginally correlated with participants' tendency to assume the psychological perspectives of others and reflected simulation of the movement implied in the neutral and emotional body expressions. These findings highlight the motor system's involvement during perception of emotional bodies. They suggest that fast orienting reactions to emotional cues, reflecting neural processing necessary for visual perception, occur before motor features of the observed emotional expression are simulated in the motor system, and that distinct empathic dispositions influence these two neural motor phenomena. Implications for theories of embodied simulation are discussed.
    Brain Structure and Function 07/2014; DOI:10.1007/s00429-014-0825-6 · 7.84 Impact Factor
  • ABSTRACT: Visual threat-related signals are not only processed via a cortical geniculo-striatal pathway to the amygdala but also via a subcortical colliculo-pulvinar-amygdala pathway, which presumably mediates implicit processing of fearful stimuli. Indeed, hemianopic patients with unilateral damage to the geniculo-striatal pathway have been shown to respond faster to seen happy faces in their intact visual field when unseen fearful faces were concurrently presented in their blind field [Bertini, C., Cecere, R., & Làdavas, E. I am blind, but I "see" fear. Cortex, 49, 985-993, 2013]. This behavioral facilitation in the presence of unseen fear might reflect enhanced processing of consciously perceived faces because of early activation of the subcortical pathway for implicit fear perception, which possibly leads to a modulation of cortical activity. To test this hypothesis, we examined ERPs elicited by fearful and happy faces presented to the intact visual field of right and left hemianopic patients, while fearful, happy, or neutral faces were concurrently presented in their blind field. Results showed that the amplitude of the N170 elicited by seen happy faces was selectively increased when an unseen fearful face was concurrently presented in the blind field of right hemianopic patients. These results suggest that when the geniculo-striate visual pathway is lesioned, the rapid and implicit processing of threat signals can enhance facial encoding. Notably, the N170 modulation was only observed in left-lesioned patients, favoring the hypothesis that implicit subcortical processing of fearful signals can influence face encoding only when the right hemisphere is intact.
    Journal of Cognitive Neuroscience 06/2014; 26(11):1-14. DOI:10.1162/jocn_a_00671 · 4.69 Impact Factor
  • ABSTRACT: Background and objectives: Studies suggest that the right hemisphere is dominant for emotional facial recognition. In addition, whereas some studies suggest that the right hemisphere mediates the processing of all emotions (dominance hypothesis), other studies suggest that the left hemisphere mediates positive emotions and the right mediates negative emotions (valence hypothesis). Since each hemisphere primarily attends to contralateral space, the goals of this study were to learn whether emotional faces induce a leftward deviation of attention and whether the valence of facial emotional stimuli can influence the normal viewer's spatial direction of attention. Methods: Seventeen normal right-handed participants were asked to bisect horizontal lines that had all combinations of sad, happy or neutral faces at the ends of these lines. During this task the subjects were never requested to look at these faces and there were no task demands that depended on viewing them. Results: Presentation of emotional faces induced a greater leftward deviation compared to neutral faces, independent of where (spatial position) these faces were presented. However, faces portraying negative emotions tended to induce a greater leftward bias than positive emotions. Conclusions: Independent of location, the presence of emotional faces influenced the spatial allocation of attention, such that normal subjects shift the direction of their attention toward left hemispace, and this attentional shift appears to be greater for negative (sad) than positive (happy) faces.
    Brain and Cognition 10/2014; 91C:108-112. DOI:10.1016/j.bandc.2014.09.006 · 2.68 Impact Factor
