Scientific RepoRts | 6:23083 | DOI: 10.1038/srep23083
Hemodynamic (fNIRS) and EEG (N200) correlates of emotional inter-species interactions modulated by visual and auditory stimulation

Michela Balconi & Maria Elide Vanutelli
The brain activity, considered in its hemodynamic (optical imaging: functional Near-Infrared Spectroscopy, fNIRS) and electrophysiological components (event-related potentials, ERPs, N200), was monitored while subjects observed (visual stimulation, V) or observed and heard (visual + auditory stimulation, VU) situations representing inter-species (human-animal) interactions with an emotionally positive (cooperative) or negative (uncooperative) content. In addition, the cortical lateralization effect (more left or right dorsolateral prefrontal cortex, DLPFC) was explored. Both ERP and fNIRS measures showed significant effects due to emotional interactions, which are discussed in light of cross-modal integration effects. The significance of the inter-species effect for emotional behavior is considered. In addition, the consonant hemodynamic and EEG results and their value as integrated measures are discussed in light of the valence effect.
Previous research has revealed that the processing of emotional visual and auditory stimuli leads to increased activation of various cortical areas, including the amygdala, the prefrontal cortex (PFC), the dorsolateral prefrontal cortex (DLPFC) and the specific sensory areas. More recent studies have identified the DLPFC as a key region in the experience and regulation of visual emotional responses. Auditory emotional stimulation has also been examined, showing an effect similar to that found for the visual condition. However, some studies that focused on EEG frequency band analysis or event-related potentials (ERPs) found that frontal areas are responsive to specific valence and, in other cases, mainly to the arousal of auditory stimuli more than to their specific valence.

Limited research has explored the effect of combined emotional visual and auditory stimulation. A main caveat of previous studies was their focus on typically human conditions or situations (stimuli with emotional value, i.e. human faces and voices), and limited research has considered the effect of visual and auditory emotional stimulation in human/non-human social interactions. Some of these studies focused on the empathic emotional response of humans to different species, or on the brain correlates of pain perception during observation.
Therefore, there is currently a need to improve our knowledge about the nature and the cortical correlates of emotional behavior in response to inter-species emotional conditions. In the present study we specifically considered the cortical response to emotional interactions induced by visual and visual-auditory stimulation directly related to inter-species relationships, where emotionally positive or negative human-animal interactions were represented. Indeed, the great majority of previous studies focused only on the human-human context, even though we do not interact exclusively with other people: as part of our everyday life, we also share our social contexts with non-human animals. Few previous studies have explored emotions in inter-species contexts, examining the differences between the emotional response to other humans or to animals but none of them, to our knowledge,
considered the social meaning of intra- and inter-species contexts. More specifically, the contribution of specific brain areas implicated in human/non-human interactions has scarcely been considered.

Research Unit in Affective and Social Neuroscience, Department of Psychology, Catholic University of Milan, Milan; Department of Psychology, Catholic University of the Sacred Heart, Milan. Correspondence and requests for materials should be addressed to M.B.

Received: 16 December 2015; Accepted: 01 March 2016; Published: 15 March 2016

The comparison between different species may or may not support the homogeneity of emotional behavior and of the emotional brain response in the inter-species interactional context. In addition, the specificity of visual and auditory stimulation was not explored, since many studies focused on each stimulation condition separately, or took into consideration only the inter-species condition.
Previous ndings on human-human interactions provided compelling evidence of an early integration of
visual (i.e. face) and auditory (i.e. voice) information while processing emotional features
. Both behavioral
and electroencephalographic (ERPs) data revealed some, although non-identical, patterns of cross-modal inu-
ences: for example, a modulation of P200 ERP-component suggested the presence of a relatively early integration
. Also, congruous and incongruous cross-modal stimulation was found to induce specic and distinct
. e largest EEG alpha-power-density was observed for the sound conditions, with intermediate
values for the picture conditions, and the lowest ones for the combined conditions, thus indicating the strong-
est activation (alpha decreasing) in the combined condition within a distributed emotion and arousal network
comprising frontal, temporal, parietal and occipital neural structures
. Also, strong similarities in alpha and
beta bands during visual and audiovisual conditions were found, suggesting the intervention of a strong visual
processing component in the perception of audiovisual stimuli
In addition, recent results mainly focused on visual stimulation indicated a significant and specific lateralization effect of PFC activation, based on the positive (more directly processed by the left hemisphere) or negative (more directly processed by the right hemisphere) valence of emotions. Indeed, the valence model supposes that cortical differences between the two hemispheres are attributable to the positive vs. negative valence of emotions. However, other perspectives suggested a dichotomy based on approach/avoidance motivation, the first being more left frontal-related and the second more right frontal-related. Indeed, based on the approach-withdrawal model of emotion regulation, emotional behavior should be associated with a balanced activity in the left and right frontal brain areas, which can be expressed in an asymmetry measurement. However, previous research did not simultaneously verify the specific visual and auditory lateralization effect based on valence; more often only auditory or only visual stimuli were used.
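The asymmetry measurement mentioned above is conventionally computed as the difference of log-transformed alpha power between homologous right and left frontal sites. A minimal sketch with hypothetical power values (the formula is the standard frontal alpha asymmetry index, not one reported in this paper):

```python
import numpy as np

def frontal_alpha_asymmetry(left_alpha, right_alpha):
    """Standard frontal asymmetry index: ln(right) - ln(left) alpha power.

    Because alpha power is inversely related to cortical activation,
    positive values indicate relatively greater LEFT-hemisphere activation
    (approach-related); negative values, greater RIGHT-hemisphere
    activation (withdrawal-related).
    """
    return np.log(right_alpha) - np.log(left_alpha)

# Hypothetical alpha power (arbitrary units) at left (F3) and right (F4) sites
score = frontal_alpha_asymmetry(left_alpha=4.0, right_alpha=6.0)
print(round(score, 3))  # positive: relative left (approach-related) activation
```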
Among the different modalities available for monitoring brain activity, EEG/ERPs and Near-Infrared Spectroscopy (NIRS) are non-invasive and particularly well-suited for evaluating PFC activity. In fact, although previous studies have provided functional images of activated brain areas associated with emotional tasks, they have seldom addressed the temporal course of such activation. Due to its fast temporal evolution and its representation and integration among widespread neural networks, emotion representation, together with its neurobiological correlates, should preferably be examined by means of imaging and EEG methods that offer good resolution in both the temporal and spatial domains.
To verify these effects, we applied ERP analysis to investigate the neural correlates of responses to emotional contexts and the relation between these brain potentials and simple (visual only) or cross-modal (visual and auditory) stimulation. One specific ERP deflection was analyzed, namely the N200 effect. This deflection has been considered a specific marker of the emotional value, relevance and salience of the situation, as well as of the emotional involvement (arousal) induced by the affective condition. Indeed, the motivational significance of emotions affects subjects' cortical responses also at longer latencies, since emotionally salient stimuli have been found to generate greater ERP amplitudes for the N200, for the positive deflection P300, and also for late positivity (LPP) measures.
In some studies that examined the contextual emotional impact on behavior, the N200 component was found to be directly related to the emotional content of stimuli or situations. Specifically, it was previously related to the degree of attentional and emotional relevance of the context, and it was observed to be related to subjects' emotional involvement in terms of arousal. More specifically, the N200 was found to be induced by emotional cues (such as faces) more than by neutral cues in both explicit and implicit tasks, and it was interpreted as a task-specific emotional index able to highlight the comprehension of the emotional significance of the stimulus. Indeed, this ERP component was found to be modulated by the judgment of the affective arousal and valence of emotional stimuli.
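As an illustration of how such a peak measure is typically obtained, the sketch below picks the most negative deflection of an averaged ERP inside a fixed post-stimulus window (the 180–300 ms window, sampling rate and array layout are assumptions for the example, not parameters reported here):

```python
import numpy as np

def n200_peak(erp, times, window=(0.18, 0.30)):
    """Return (latency_s, amplitude_uV) of the N200, taken as the most
    negative sample of an averaged ERP within a post-stimulus window."""
    mask = (times >= window[0]) & (times <= window[1])
    idx = np.flatnonzero(mask)
    peak = idx[np.argmin(erp[idx])]  # index of the most negative sample
    return times[peak], erp[peak]

# Synthetic averaged ERP sampled at 500 Hz with a negativity around 220 ms
times = np.arange(-0.1, 0.5, 0.002)
erp = -8.0 * np.exp(-((times - 0.22) ** 2) / (2 * 0.02**2))
latency, amplitude = n200_peak(erp, times)
```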
Regarding the second measure, the temporal resolution of fNIRS is high enough to measure event-related hemodynamic responses. In fact, one of the most relevant features of NIRS is its high temporal resolution compared to other imaging techniques. This is an important feature for the present study, since the integration of EEG and hemodynamic measures requires an adequately comparable temporal resolution. Previous studies have provided functional images of brain areas activated during emotional tasks, but they have seldom addressed the temporal course of such activation. Therefore, to study the integration between electrophysiological and hemodynamic data, these specific measures seem well suited.

Although other neuroimaging techniques (such as fMRI) may offer a complete view of the cortical and sub-cortical areas implicated in emotional processing, their low temporal resolution prevents a deep exploration of the dynamics of the emotional experience. Therefore, neither classical neuroimaging nor electrophysiological measures alone seem to describe the nature of the emotional correlates in depth. For this reason, the integration of hemodynamic and EEG measures may offer a complete overview of brain activity modulation with more adequate spatial and temporal resolution.
In addition, some specific areas more directly related to emotional processing, i.e. the PFC, are easily accessible for fNIRS measurements. Interestingly, recent fNIRS studies investigating the neural correlates of emotion regulation processes also described an activation of the PFC. Recently, fNIRS has been successfully applied to investigate the emotional modulation of the visual cortex, but further investigations in the auditory domain are pending. It should be noted that fNIRS studies offer some advantages with auditory stimulation, whereas it can be difficult to explore the effects of emotional sounds in conventional functional Magnetic Resonance Imaging (fMRI) scanners. As shown by Plichta et al., auditory cortex activation is modulated by emotionally arousing complex auditory stimuli. Consistent with this hypothesis, pleasant as well as unpleasant sounds led to higher auditory cortex activation compared to neutral sounds. This finding is in line with previous functional imaging studies on visual stimuli, where occipital regions were found to be more activated by emotionally arousing than by neutral stimuli.
However, at present no specific fNIRS study has analyzed the PFC contribution to auditory emotional processing. In addition, no previous ERP and fNIRS research has considered the effect of the auditory and visual conditions together in the case of emotional interactive situations where inter-species relationships are represented. Thus, the first goal of this study was to investigate the brain response to interpersonal inter-species contexts with different emotional valence by examining the ERP and fNIRS components elicited in individuals' responses. Specifically, three different emotional situations were included: a first condition where subjects observed a conflictual and emotionally uncomfortable situation (negative situations); a second condition where subjects observed a cooperative and emotionally comfortable situation (positive situations); and a third (control) condition where a more neutral interaction was represented (neither a positive nor a negative emotional situation). Moreover, the effect of simple visual (V) stimulation and of combined visual and auditory (VU) stimulation was considered during these differently valenced situations.
We hypothesized that the N200 and fNIRS-measured oxygenated haemoglobin (O2Hb) changes may be related to the emotional content of the stimuli when subjects observe emotional inter-species interactions. Specifically, based on previous evidence, we supposed an increased left or right PFC activity as a function of valence. Indeed, we expected significant differences in response to differently valenced situations: based on the valence and approach/withdrawal models of emotions, a significantly and consistently higher left prefrontal activation (increased O2Hb; higher N200) was expected for positive emotional conditions, whereas a consistently higher right prefrontal activation was expected in response to negative conditions. Moreover, this relationship should be accompanied by a specific stimulation effect: compared to V, VU could show an increased brain response for both ERP and fNIRS in concomitance with valence, since cross-modal perception could increase and strengthen the basic effects expected for the simple emotional visual condition. Finally, a significant correlation was expected between the ERP and fNIRS components, considered as markers of the electrophysiological and hemodynamic measures, respectively.
The following set of analyses was performed on the data: a first repeated-measures ANOVA was applied to the ERP peak amplitude (N200); a second analysis was applied to the O2Hb d values. Finally, correlational analysis was applied to the N200 and d measures.

A preliminary analysis tested the significance of differences between the baseline (neutral) and emotional conditions. For both the EEG and NIRS measures, the negative and positive conditions revealed significant differences (P ≤ 0.01) compared with the neutral condition. Due to the systematic effect and the preliminary value of this comparison, we did not include the neutral condition in the subsequent analyses.
EEG. ERP data were entered into a four-way repeated-measures ANOVA with factors condition (2: V, VU) × lateralization (2: left, right) × valence (2: positive, negative) × localization (3: frontal, temporo-central, parietal), applied to the peak amplitude. Type I errors associated with inhomogeneity of variance were controlled by decreasing the degrees of freedom using the Greenhouse-Geisser epsilon. Bonferroni correction was applied for multiple comparisons. Post hoc comparisons (contrast analyses for repeated-measures ANOVA) were subsequently applied to the data.
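The Greenhouse-Geisser correction works by estimating an epsilon from the covariance matrix of the repeated measures and multiplying the degrees of freedom by it. A minimal illustration of the standard estimator (a sketch, not the authors' analysis code):

```python
import numpy as np

def greenhouse_geisser_epsilon(data):
    """Greenhouse-Geisser epsilon for an (n subjects x k conditions) array.

    epsilon = tr(S~)^2 / ((k - 1) * tr(S~ @ S~)), where S~ is the
    double-centered covariance matrix of the k repeated measures.
    epsilon = 1 under sphericity; the corrected degrees of freedom are
    epsilon*(k - 1) and epsilon*(k - 1)*(n - 1).
    """
    S = np.cov(np.asarray(data, float), rowvar=False)  # k x k covariance
    k = S.shape[0]
    C = np.eye(k) - np.ones((k, k)) / k                # centering matrix
    St = C @ S @ C                                     # double-centered
    return np.trace(St) ** 2 / ((k - 1) * np.trace(St @ St))
```

For a perfectly spherical covariance structure the estimator returns 1 (no correction is applied); its lower bound is 1/(k − 1).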
As shown by the ANOVA, the peak amplitude was modulated by localization (F(1,14) = 7.09, P ≤ 0.001, η² = 0.34), valence (F(1,14) = 6.56, P ≤ 0.001, η² = 0.33), and valence × condition × lateralization (F(1,28) = 9.13, P ≤ 0.001, η² = 0.42). No other main or interaction effect was statistically significant. As observed, peak amplitude was higher for negative than for positive stimuli. In addition, post-hoc comparisons revealed that the frontal areas showed higher peak amplitude compared to the other cortical sites: temporo-central (F(1,14) = 8.71, P ≤ 0.001, η² = 0.39) and parietal (F(1,14) = 7.56, P ≤ 0.001, η² = 0.37) (Fig. 1).

In addition, regarding the simple effects for the three-way interaction, significant differences were observed between VU (higher peak deflection) and V in response to negative stimuli within the right side (F(1,14) = 6.98, P ≤ 0.001, η² = 0.35). Secondly, in VU significant differences were found between positive and negative stimuli within the right side (F(1,14) = 8.87, P ≤ 0.001, η² = 0.41), with increased peak amplitude for negative compared to positive stimuli.
fNIRS. The statistical analysis was applied to the dependent measure d for O2Hb concentration. Since HHb was not significant, we report only the O2Hb results. d was subjected to a three-factor (condition: 2 × lateralization: 2 × valence: 2) repeated-measures ANOVA. Data were averaged over the left (Ch1: AF3-F3; Ch2: AF3-AFF1; Ch3: F5-F3) and right (Ch4: AF4-F4; Ch5: AF4-AFF2; Ch6: F6-F4) channels.

As shown by the ANOVA, the condition × valence interaction effect (F(1,14) = 6.92, P ≤ 0.001, η² = 0.33) was significant (Fig. 2). No other main or interaction effect was significant. Indeed, as shown by paired comparisons, for VU negative stimuli revealed increased d values in comparison to positive stimuli (F(1,14) = 6.79, P ≤ 0.001, η² = 0.33). In contrast, for V positive stimuli showed increased d values in comparison with negative stimuli (F(1,14) = 6.13, P ≤ 0.001, η² = 0.32).
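The d dependent measure is not defined in this excerpt; a common convention in fNIRS research (assumed here, not stated in the text) is an effect-size-like score: the mean task-period concentration change relative to baseline, scaled by the baseline standard deviation. A sketch under that assumption, including the left/right channel averaging used above:

```python
import numpy as np

def d_value(task, baseline):
    """Effect-size-like d for one channel's O2Hb series (assumed
    convention): (mean(task) - mean(baseline)) / sd(baseline)."""
    task, baseline = np.asarray(task, float), np.asarray(baseline, float)
    return (task.mean() - baseline.mean()) / baseline.std(ddof=1)

def hemisphere_means(d_by_channel):
    """Average d over the left (Ch1-Ch3) and right (Ch4-Ch6) channel groups."""
    left = np.mean([d_by_channel[c] for c in ("Ch1", "Ch2", "Ch3")])
    right = np.mean([d_by_channel[c] for c in ("Ch4", "Ch5", "Ch6")])
    return left, right
```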
Correlational analysis between fNIRS and EEG. Pearson's correlation analysis (across-subject correlations) was applied to the N200 and d values. Correlation values were calculated separately for each condition (V/VU), emotional valence (positive/negative stimuli) and lateralization (left/right).
There was a significant positive correlation between d values and N200 for negative patterns within the right site for VU (r = 0.513; P ≤ 0.01). That is, in the case of an increased peak amplitude, a concomitant O2Hb increase was observed within the right hemisphere in response to negative stimuli. No other correlation was significant (Fig. 3).
The present research explored the role of the prefrontal cortex during the processing of inter-species emotional interactions. Visual and visuo-auditory emotional stimulation was provided in different relational contexts (positive, negative and neutral). The DLPFC was mainly implicated in the affective response to inter-species emotional relationships. However, significant differences were found in inter-species relationships based on stimulation type, with an increased cortical response for cross-modal compared to simple visual stimulation. Moreover, stimulus valence affected both the N200 ERP and the hemodynamic response, with a more relevant impact of negative valence. This cortical activity was shown to be lateralized within the right hemisphere in response to negative situations. Finally, the systematic relationship between hemodynamic and EEG measures was elucidated.

Figure 1. N200 peak amplitude (μV) for the left (top) and right (bottom) sides in response to positive and negative interactions as a function of V and VU.

Figure 2. Cortical maps of O2Hb as a function of VU (top) and V (bottom) in response to positive (right heads) and negative (left heads) interactions.
In general, the present results confirmed the crucial role of the PFC in processing emotional interactions also in the case of inter-species conditions. Indeed, for the first time this study demonstrated the contribution of this cortical area in response to human/non-human interactions, considering both the O2Hb and N200 effects. The specific DLPFC contribution to processing emotional interactions was supported by the hemodynamic (increased O2Hb) and ERP (higher N200 peak amplitude) profiles. This result confirmed previous research within the visual domain in the case of human contexts, which found that the PFC plays a crucial role in integrating different aspects of emotional regulation by managing cognitive control over emotional stimuli and behavior. Therefore, we may suggest that a specific prefrontal cortical area mediates the emotional processing and behavior of subjects who are observing inter-subjective interactions, independently of the human specificity of such interactions. In fact, whereas previous research monitored only human conditions (single subjects or interactive situations), the present study analyzed species-nonspecific human-animal interactions. In addition, previous studies that included an inter-species condition did not consider visuo-auditory stimulation; therefore, a direct comparison between these two stimulation types was not made. Based on our results, we may suppose that the DLPFC is involved in processing situations where emotions are represented, independently of the exclusive presence of human actors.
Specically, about EEG, within this prefrontal network, the N200 amplitude appeared to be signicantly mod-
ulated over the anterior brain sectors and to be valence-related. Indeed, the N200 higher amplitude within the
anterior brain region was directly associated with negatively valenced situations. Based on previous results, and in
accordance with its frontal localization, we may suggest that the N200 is involved in the detection and evaluation
of relevant and threatening patterns
. is observation is in accordance with other ndings which showed that
the N200 is most pronounced over the frontal cortex for negative related stimuli
. About the fNIRS, the PFC
contribution in processing emotional interactions was conrmed by the hemodynamic measure. Indeed a similar
prole was observed for O2Hb, with more intense DLPFC responses for negative categories. As shown in previous
research on visual stimulation, negative emotional conditions produced increased cortical response
, pro-
cessed as being more salient for the subjective safeguard
. ese results appear partially in contrast with a previ-
ous study, which more directly compared intra- and inter-species stimuli
. Indeed it was found that intra-species
compared to inter-species conditions elicited higher responses for negative stimuli, whereas the opposite result
was found for positive stimuli (with higher response in the case of inter-species interactions). However, also due
to the specic empathic task, in that case it was supposed that in negative situations human-human interactions
may produce higher controlled cortical activity than human-animal ones, because they could raise more cogni-
tive and mediated processes to represent higher social and culture-based interactions. In addition, in the present
study we did not directly compare intra- and inter-species interactions and, therefore, the direction of potential
dierences between these two specie-specic and specie-aspecic situations cannot be appropriately discussed.
Finally, only visual stimulation was included in this previous study, choice that may prevent to produce a com-
plete comparison between the results.
A second main factor able to affect the prefrontal response was related to the stimulation type in integration with the stimulus valence, that is, the presence of a simple visual emotional display vs. a cross-modal stimulation. Indeed, the integrated visual-auditory condition was able to induce a sort of "reinforcement effect" for specific situations, namely the negative ones. Whereas negative stimuli generally supported an increased prefrontal response, in the case of VU subjects showed a higher N200 peak amplitude and an O2Hb increase during interaction processing compared to simple visual stimulation.
The correlational results reinforced this hypothesis. Indeed, we found a higher and coherent response across EEG and fNIRS in concomitance with the cross-modal stimulation when negative situations were represented. In contrast, positive valence revealed a significant effect only for the hemodynamic profile, with an O2Hb increase for the positive category in the case of simple V. For the first time, using two specific cortical measures that allow adequate spatial and temporal resolution, we demonstrated the direct implication of the DLPFC in response to inter-species interactions when different stimulus modalities were considered (V and VU).
Figure 3. Scatterplot of the EEG (N200 amplitude) and fNIRS (d values) measures in relation to negative interactions for VU within the right side.
Therefore, as shown in previous studies, we may suppose that the auditory component induces a higher emotional value in the subjective perception of the interactions, especially when the represented emotional content is negative and aversive. However, for the first time this effect was shown in the case of cross-modal integration and not exclusively in relation to visual or auditory stimulation per se. Specifically, we may suppose that the negative and aggressive valence of the integrated VU conditions made the emotional relevance of those contexts more salient. This result is consistent with the position that an approach motivational tendency may exist toward stimuli with negative emotional valence and, more generally, that an active response may be associated with approach motivation in the case of more prominent emotional cues, such as the negative and unpleasant situations the subject is processing.
This may also be due to the specificity of the inter-species context, which produces a potentially uncontrolled situation for the subject: in that case, the unpredictable outcomes of aggressive inter-species interactions may be evaluated as more dangerous than positive interactions. In addition, in relation to the prefrontal localization of the cortical network, the general electrophysiological response and the increased hemodynamic responsiveness for VU in concomitance with negative patterns may suggest the existence of a frontally distributed vigilance mechanism activated during the detection and evaluation of potentially more dangerous emotional interactions, which is likely to be located at extended anterior sites. That is, an attentional network involving the frontal site is argued to maintain a state of alertness when salient and more negative interpersonal conditions are encountered.
It has to be noted that, compared with some previous neuroimaging and NIRS research, we found that valence was more relevant in processing emotional cues, and it may be related to hemispheric lateralization. Indeed, regarding the contribution of the two hemispheres, the specific right lateralization found in response to negative stimuli, as shown by the N200 peak distribution, may indicate that a clearer cortical lateralization largely concerns this specific negative valence category. Indeed, this effect was not observed indistinctly, but mainly in response to certain emotional categories, such as aggressive negative conditions, that is, emotions with a potentially involving, arousing power for the subjects. To summarize, a general right/negative association was observed in the subjects, and it was mainly supported by the EEG modulation. That is, negative, aversive interactions showed a more consistently lateralized brain activation in comparison with the other emotional categories (i.e. positive).
This fact may be explained taking into account the impact of threatening and negative interactions, which may be more "salient" for the subject's safeguard, with a more specific contribution of the right DLPFC. Therefore, based on these effects, it should be noted that the valence model, with an expected lateralized response (right-negative; left-positive distinction), is only partially supported by the present results. This "lateralized mechanism" may be seen as aimed at alerting the emotional behavior in response to highly significant emotional situations the subjects are faced with. However, due to the partial verification of the underlying valence model of emotions, future research should better explore the significance of the positive/negative valence distinction in concomitance with the approach/avoidance attitude model of emotions, to better clarify the contribution of the left vs. right hemisphere in response to emotional visual/auditory cues. In addition, the specific role that arousal may have in affecting the lateralized emotional response should be considered to explain these data more deeply.
A relevant result was also the presence of a general direct link between the different levels of analysis (ERP and fNIRS measures), taking into account that EEG activity was systematically associated with the cortical hemodynamic responsiveness to the emotional situations. Indeed, important effects were derived from the correlational analyses between the hemodynamic and EEG measures. The joint EEG-NIRS measurement revealed significant linear associations between the hemodynamic O2Hb values and the N200 peak amplitude. The significant and consistent positive relation between the NIRS and EEG measures, mainly in response to negative stimuli, may suggest, on the one hand, a strong relation between the PFC and the processing of emotional stimuli, with a significant effect of the valence factor (more for negative); on the other hand, it may support the consistency of these two brain measures. More generally, the simultaneous application of EEG and fNIRS was found to be particularly useful for emotional studies. Specifically, the use of fNIRS in a topographic approach for measuring responses to emotions allows the investigation of regional cortical activation changes related to emotional manipulations in general, and the linking of certain EEG effects to regional hemodynamic changes. The latter strategy enabled us to investigate whether and how specific emotion-related electrophysiological effects are associated with distinct cortical activation patterns within the emotion perception network.
However, some limitations of the present research should be underlined. Firstly, the limited number of subjects involved in the study requires further research to generalize the present results. Secondly, a more direct comparison between different types of relationships (i.e. intra-species and inter-species) should be included to better distinguish the two domains and to extend our results to different human-human/human-animal contexts. Finally, a complete research design, adding to the V and VU conditions also a simple auditory (U) condition, could furnish important elements to discriminate between simple sensory and cross-modal perception.
Subjects. Fifteen subjects, 8 females and 7 males (M age = 26.33; SD = 2.5; range = 23–33), participated in the experiment. All subjects were right-handed, with normal or corrected-to-normal visual acuity. Exclusion criteria were neurological or psychiatric pathologies. Subjects gave written informed consent to participate in the study, and the research protocol was approved by the Ethical Committee of the institution where the work was carried out (Department of Psychology, Catholic University of Milan, Italy). The experiment was conducted in accordance with the Declaration of Helsinki, and all procedures were carried out with the adequate understanding of the subjects (a Research Consent Form was submitted before participation). No payment was provided for participation.
Scientific Reports | 6:23083 | DOI: 10.1038/srep23083
Stimuli. Subjects were required to view affective pictures depicting human-animal (for animals: cats and dogs) interactions. 48 colored pictures were selected, representing positive (24) and negative (24) interactions. Positive pictures represented emotionally comfortable interactions between humans and animals; negative pictures represented emotionally uncomfortable interactions between humans and animals. 24 neutral stimuli (interactions without a specific emotional significance) were used as a control condition. Another 48 pictures (24 positive, 24 negative) were associated with sounds simulating animals' positive (for cats, meow/purr; for dogs, joyful barks) or negative (for cats, growl; for dogs, snarl) noises (Fig. 4). The neutral condition included both visual and auditory neutral stimuli. Therefore, each subject was exposed to the visual (72 stimuli) and visual/auditory (72 stimuli) conditions, for a total of 144 stimuli. All pictures had the same size (14 cm in width × 10 cm in height) and were similar in perceptual features (luminance; complexity, i.e., number of details in the scene; characters' gender: half male and half female actors; animals' species: half dogs and half cats). Sounds were taken from internet databases and downloaded as wav files. They were reproduced taking into account some acoustic parameters (pitch; intensity; range) to guarantee a similar profile across the noises.
A pre-experimental procedure was adopted to validate the picture/sound dataset. Each stimulus (visual or auditory) was evaluated by six judges on the valence and arousal dimensions, using the Self-Assessment Manikin Scale with a five-point Likert scale. Ratings were averaged across all presented pictures/sounds for each category.
As shown by statistical analysis (repeated measures ANOVA), both visual and auditory stimuli differed in terms of valence (for all significant contrast comparisons P = 0.01): valence was more positive for positive pictures/sounds than for the other two categories, more negative for negative pictures/sounds than for the other two categories, and intermediate for neutral pictures/sounds. Regarding arousal, positive and negative pictures/sounds were more arousing than neutral pictures/sounds; however, negative and positive stimuli did not differ in terms of arousal. The cross-modal stimulation was perceived as coherent in terms of valence (negative valence for the negative stimulus combination; positive valence for the positive stimulus combination).
Procedure. Subjects were seated in a dimly lit room, facing a computer monitor placed 70 cm from the subject. Stimuli were presented using E-Prime 2.0 software (Psychology Software Tools, Inc., Sharpsburg, PA, USA) running on a personal computer with a 15-inch screen. Participants were required to process each stimulus during fNIRS/EEG recording; they were asked to attend to the pictures/sounds for the entire exposure time, focusing on the emotional conditions characterizing the represented human actors. Subjects underwent V and VU blocks in a random order to avoid condition-order effects (randomization of the six blocks across subjects). Within each block, pictures or picture/sound pairs were displayed in a random order across subjects in the center of the computer monitor for 6 seconds, with an inter-stimulus interval of 8 seconds (Fig. 5). Auditory stimuli were reproduced by the PC loudspeakers at a listening level of approximately 70 dB. V and VU conditions were randomized across subjects.
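The randomization scheme just described can be sketched as follows. This is a hypothetical illustration, not the authors' E-Prime script: it assumes six blocks (three V, three VU) ordered randomly per subject, each containing 24 stimuli (8 positive, 8 negative, 8 neutral) shown in random order, which matches the stated totals (72 + 72 = 144).

```python
import random

def build_session(seed=None):
    """Return a randomized trial list: 6 blocks x 24 trials = 144 trials.

    Block/trial structure is an assumption for illustration only.
    """
    rng = random.Random(seed)
    # Three visual-only and three visual+auditory blocks, order randomized
    # across subjects to avoid condition-order effects.
    blocks = ["V"] * 3 + ["VU"] * 3
    rng.shuffle(blocks)
    session = []
    for block in blocks:
        # 8 stimuli per valence category in each block.
        trials = [(block, valence, idx)
                  for valence in ("positive", "negative", "neutral")
                  for idx in range(8)]
        rng.shuffle(trials)  # randomize stimulus order within the block
        session.extend(trials)
    return session
```

Passing a per-subject seed makes each subject's randomization reproducible while still differing across subjects.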
A 120-second resting period was recorded at the beginning of the experiment, before the picture/sound series. After the experimental phase, subjects were required to rate the pictures/sounds on the SAM, evaluating valence and arousal on a five-point Likert scale. As shown by statistical analysis (repeated measures ANOVAs), pictures/sounds differed in terms of valence (for all paired comparisons P = 0.01; for visual: positive M = 4.22, SD = 0.06; negative M = 2.08, SD = 0.07; neutral M = 3.17, SD = 0.05; for auditory: positive M = 4.32, SD = 0.04; negative M = 2.13, SD = 0.05; neutral M = 3.12, SD = 0.08) and arousal (significant differences between neutral/positive and neutral/negative comparisons for both visual and auditory stimuli, P = 0.01; for visual: positive M = 4.21, SD = 0.07; negative M = 4.72, SD = 0.03; neutral M = 3.11, SD = 0.04; for auditory: positive M = 4.02, SD = 0.04; negative M = 4.80, SD = 0.08; neutral M = 3.44, SD = 0.09). In contrast, no differences were found between positive and negative stimuli in terms of arousal, for either the visual or the auditory condition.

Figure 4. Examples of stimuli (neutral, positive, negative) (photographed by Maria Elide Vanutelli).

Figure 5. Experimental procedure (EEG and fNIRS acquisition) (photographed by Maria Elide Vanutelli).
EEG recordings and data reduction. A 32-channel portable EEG system (V-AMP: Brain Products, München) was used for data acquisition. A NIRS-EEG compatible ElectroCap with Ag/AgCl electrodes was used to record EEG from active scalp sites referenced to the earlobe (10/5 system of electrode placement). EEG activity was recorded from channels at the following positions: AFF3, AFF4, Fz, AFp1, AFp2, C3, C4, Cz, P3, P4, Pz, T7, T8, O1, O2 (Fig. 6). The cap was fixed with a chin strap to prevent shifting during the task. The data were recorded during stimulation using a sampling rate of 500 Hz, with a frequency band of 0.01–50 Hz and a notch filter at 50 Hz. The impedance of the recording electrodes was monitored for each subject prior to data collection and was always kept below 5 kΩ. Additionally, one EOG electrode was placed on the outer canthus to detect eye movements. Ocular artifacts (eye movements and blinks) were corrected using an eye-movement correction algorithm that employs regression analysis in combination with artifact averaging. After performing EOG correction and visual inspection, only artifact-free trials were considered (rejected epochs: 4%). The signal was visually scored, and portions of the data containing artifacts were removed to increase specificity. Artifact-free epochs (850 ms) were considered. An averaged waveform (off-line) was obtained for each condition (not less than 22 epochs were averaged). The peak amplitude (highest peak amplitude from baseline) was quantified relative to the 150 ms pre-stimulus interval. The onset coincided with the appearance of the stimulus on the monitor, and the most negative peak value within the temporal window of 150–250 ms post-stimulus was taken into account, since morphological analysis of the EEG profile revealed that the peak deflection fell within this time range. Since the latency measure had previously been tested without showing significant differences across conditions, we did not include this variable in the final analysis. The mean latency of the N200 was approximately 210 ms. Subsequently, localization (three sites: frontal, AFF3/AFF4 and AFp1/AFp2; temporo-central, C3/C4 and T7/T8; and parietal, P3/P4 and P7/P8) and lateralization (two sides: left channels and right channels) factors were considered for the statistical analyses.

Figure 6. Locations of measurement channels. fNIRS: Emitters were placed at positions AF3-AF4 and F5-F6 (red dots), while detectors were placed at AFF1-AFF2 and F3-F4 (pink dots). The six resulting channels are displayed with yellow dots. EEG: EEG activity was recorded from channels at the following positions: AFF3, AFF4, Fz, AFp1, AFp2, C3, C4, Cz, P3, P4, Pz, T7, T8, O1, O2 (green dots). fNIRS optodes and EEG channels were attached to the subject's head using a NIRS-EEG compatible cap, with respect to the international 10/5 system.
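The N200 peak measure described above (baseline over the 150 ms pre-stimulus interval, then the most negative deflection in the 150–250 ms post-stimulus window at a 500 Hz sampling rate) can be sketched in a few lines. This is an assumed minimal implementation for illustration, not the authors' analysis pipeline:

```python
import numpy as np

FS = 500                  # sampling rate (Hz), as reported in the text
BASELINE_MS = 150         # pre-stimulus baseline interval
WINDOW_MS = (150, 250)    # N200 search window, post-stimulus

def n200_peak(epoch, fs=FS):
    """Return (amplitude, latency_ms) of the most negative peak.

    epoch: 1-D array starting 150 ms before stimulus onset.
    Amplitude is expressed relative to the pre-stimulus baseline mean.
    """
    n_base = int(BASELINE_MS / 1000 * fs)
    baseline = epoch[:n_base].mean()
    onset = n_base  # sample index of stimulus onset
    lo = onset + int(WINDOW_MS[0] / 1000 * fs)
    hi = onset + int(WINDOW_MS[1] / 1000 * fs)
    window = epoch[lo:hi] - baseline      # baseline-corrected segment
    i = int(np.argmin(window))            # most negative deflection
    amplitude = float(window[i])
    latency_ms = WINDOW_MS[0] + i / fs * 1000
    return amplitude, latency_ms
```

In practice this would be applied to the condition-averaged waveform (at least 22 artifact-free epochs per average, as stated above).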
fNIRS. fNIRS measurements were conducted with the NIRScout System (NIRx Medical Technologies, LLC, Los Angeles, California) using a 6-channel array of optodes (4 light sources/emitters and 4 detectors) covering the prefrontal area. Emitters were placed at positions AF3-AF4 and F5-F6, while detectors were placed at AFF1-AFF2 and F3-F4. The emitter-detector distance was 30 mm for contiguous optodes, and near-infrared light of two wavelengths (760 and 850 nm) was used. NIRS optodes were attached to the subject's head using a NIRS-EEG compatible cap, with respect to the international 10/5 system.
With the NIRStar acquisition software, changes in the concentration of oxygenated (O2Hb) and deoxygenated hemoglobin (HHb) were recorded, starting from a 120 s resting phase. Signals obtained from the 6 NIRS channels were measured with a sampling rate of 6.25 Hz, then analyzed and transformed according to their wavelength and location, resulting in values for the changes in the concentration of oxygenated and deoxygenated hemoglobin for each channel. Hemoglobin quantity is scaled in mmol × mm, implying that all concentration changes depend on the path length of the NIR light in the brain.
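The wavelength-to-concentration transformation mentioned above is conventionally done with the modified Beer-Lambert law: optical density changes at the two wavelengths (760 and 850 nm) are solved as a 2 × 2 linear system for the O2Hb and HHb concentration changes. The sketch below is an assumed illustration, not the NIRStar implementation; in particular, the extinction coefficients and the DPF value are placeholders, not calibrated values (real analyses use tabulated coefficients):

```python
import numpy as np

# Rows: wavelengths (760, 850 nm); columns: (O2Hb, HHb) extinction
# coefficients. HYPOTHETICAL values for illustration only.
EXT = np.array([[0.6, 1.5],
                [1.2, 0.8]])

def mbll(delta_od, distance_mm=30.0, dpf=6.0):
    """Solve delta_od = EXT @ delta_c * distance * DPF for delta_c.

    distance_mm = 30.0 matches the emitter-detector distance in the text;
    dpf = 6.0 is a placeholder differential pathlength factor.
    Returns (dO2Hb, dHHb) concentration changes.
    """
    return np.linalg.solve(EXT * distance_mm * dpf, delta_od)
```

Because the system is solved per channel, each channel yields its own pair of O2Hb/HHb time series, as described in the text.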
The raw O2Hb and HHb data from individual channels were digitally band-pass filtered at 0.01–0.3 Hz. Then, the mean concentration of each channel within a subject was calculated by averaging data across trials, from trial onset for 6 s. Based on the mean concentrations in the time series, we calculated the effect size in every condition for each channel within a subject. The effect sizes (Cohen's d) were calculated as the difference of the means of the baseline and trial divided by the standard deviation (SD) of the baseline: d = (m1 − m2)/s, where m1 and m2 are the mean concentration values during the baseline and trial, and s is the SD of the baseline. Then, the effect sizes obtained from the 6 channels were averaged in order to increase the signal-to-noise ratio. Although the raw NIRS data were originally relative values and could not be averaged directly across subjects or channels, normalized data such as effect sizes could be averaged regardless of the unit. In fact, the effect size is not affected by the differential pathlength factor (DPF).
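The filtering and effect-size steps above can be sketched as follows. This is a numpy-only illustration under stated assumptions: the crude FFT band-pass stands in for the digital filter actually used, the sign convention follows the d = (m1 − m2)/s formula quoted in the text (baseline mean minus trial mean), and the sample SD (ddof = 1) is an assumption:

```python
import numpy as np

FS = 6.25  # fNIRS sampling rate (Hz), as reported in the text

def bandpass(signal, fs=FS, low=0.01, high=0.3):
    """Crude FFT band-pass: zero all bins outside [low, high] Hz."""
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spec[(freqs < low) | (freqs > high)] = 0.0
    return np.fft.irfft(spec, n=len(signal))

def effect_size(baseline, trial):
    """Cohen's d as quoted in the text: d = (m1 - m2)/s, with m1/m2 the
    mean concentrations during baseline and trial and s the baseline SD."""
    m1, m2 = np.mean(baseline), np.mean(trial)
    return (m1 - m2) / np.std(baseline, ddof=1)  # sample SD (assumption)
```

Per the text, the six per-channel effect sizes would then be averaged; being dimensionless, they can be pooled across subjects and channels even though the raw concentration changes cannot.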
References
1. Pessoa, L., Kastner, S. & Ungerleider, L. G. Attentional control of the processing of neutral and emotional stimuli. Cogn. Brain Res. 15, 31–45 (2002).
2. Phan, K. L., Wager, T., Taylor, S. F. & Liberzon, I. Functional neuroanatomy of emotion: a meta-analysis of emotion activation studies in PET and fMRI. Neuroimage 16, 331–348 (2002).
3. Davis, M. & Whalen, P. J. The amygdala: vigilance and emotion. Mol. Psychiatry 6, 13–34 (2001).
4. Balconi, M. & Bortolotti, A. Detection of the facial expression of emotion and self-report measures in empathic situations are influenced by sensorimotor circuit inhibition by low-frequency rTMS. Brain Stimul. 5, 330–336 (2012).
5. Balconi, M., Bortolotti, A. & Gonzaga, L. Emotional face recognition, EMG response, and medial prefrontal activity in empathic behaviour. Neurosci. Res. 71, 251–259 (2011).
6. Damasio, A. R., Everitt, B. J. & Bishop, D. The Somatic Marker Hypothesis and the Possible Functions of the Prefrontal Cortex. Philos. Trans. R. Soc. London B Biol. Sci. 351, 1413–1420 (1996).
7. Davidson, R. J. Anxiety and affective style: Role of prefrontal cortex and amygdala. Biol. Psychiatry 51, 68–80 (2002).
8. Ochsner, K. N. & Gross, J. J. The cognitive control of emotion. Trends Cogn. Sci. 9, 242–249 (2005).
9. Sauter, D. A. & Eimer, M. Rapid detection of emotion from human vocalizations. J. Cogn. Neurosci. 22, 474–481 (2010).
10. Surakka, V., Tenhunen-Eskelinen, M., Hietanen, J. K. & Sams, M. Modulation of human auditory information processing by emotional visual stimuli. Brain Res. Cogn. Brain Res. 7, 159–163 (1998).
11. Bekkedal, M. Y. V., Rossi, J. & Panksepp, J. Human brain EEG indices of emotions: Delineating responses to affective vocalizations by measuring frontal theta event-related synchronization. Neurosci. Biobehav. Rev. 35, 1959–1970 (2011).
12. Balconi, M., Bortolotti, A. & Crivelli, D. Self-report measures, facial feedback, and personality differences (BEES) in cooperative vs. noncooperative situations: Contribution of the mimic system to the sense of empathy. Int. J. Psychol. 48, 1–10 (2012).
13. Balconi, M. & Canavesio, Y. Prosocial attitudes and empathic behavior in emotional positive versus negative situations: Brain response (ERPs) and source localization (LORETA) analysis. Cogn. Process. 14, 63–72 (2013).
14. Preston, S. D. & de Waal, F. B. M. Empathy: Its ultimate and proximate bases. Behav. Brain Sci. 25, 1–20 (2002).
15. Ruby, P. & Decety, J. How would you feel versus how do you think she would feel? A neuroimaging study of perspective-taking with social emotions. J. Cogn. Neurosci. 16, 988–999 (2004).
16. Westbury, H. R. & Neumann, D. L. Empathy-related responses to moving film stimuli depicting human and non-human animal targets in negative circumstances. Biol. Psychol. 78, 66–74 (2008).
17. Balconi, M. & Vanutelli, M. E. Vocal and visual stimulation, congruence and lateralization affect brain oscillations in interspecies emotional positive and negative interactions. Soc. Neurosci.
18. Franklin, R. G. et al. Neural responses to perceiving suffering in humans and animals. Soc. Neurosci. 8, 217–227 (2013).
19. Vanutelli, M. E. & Balconi, M. Perceiving emotions in human–human and human–animal interactions: Hemodynamic prefrontal activity (fNIRS) and empathic concern. Neurosci. Lett. 605, 1–6 (2015).
20. De Gelder, B., Böcker, K. B. E., Tuomainen, J., Hensen, M. & Vroomen, J. The combined perception of emotion from voice and face: Early interaction revealed by human electric brain responses. Neurosci. Lett. 260, 133–136 (1999).
21. Spreckelmeyer, K. N., Kutas, M., Urbach, T. P., Altenmüller, E. & Münte, T. F. Combined perception of emotion in pictures and musical sounds. Brain Res. 1070, 160–170 (2006).
22. Balconi, M. & Carrera, A. Cross-modal perception (face and voice) in emotions. ERPs and behavioural measures. Neuropsychol. Trends 1, 43–64 (2007).
23. Balconi, M. & Carrera, A. Cross-modal integration of emotional face and voice in congruous and incongruous pairs: The P2 ERP effect. J. Cogn. Psychol. 23, 132–139 (2011).
24. Kokinous, J., Kotz, S. A., Tavano, A. & Schröger, E. The role of emotion in dynamic audiovisual integration of faces and voices. Soc. Cogn. Affect. Neurosci. 10, 713–720 (2014).
25. Grossmann, T., Striano, T. & Friederici, A. D. Crossmodal integration of emotional information from face and voice in the infant brain. Dev. Sci. 9, 309–315 (2006).
26. Baumgartner, T., Esslen, M. & Jäncke, L. From emotion perception to emotion experience: Emotions evoked by pictures and classical music. Int. J. Psychophysiol. 60, 34–43 (2006).
27. Jessen, S. & Kotz, S. A. The temporal dynamics of processing emotions from vocal, facial, and bodily expressions. Neuroimage 58, 665–674 (2011).
28. Balconi, M. & Mazza, G. Lateralisation effect in comprehension of emotional facial expression: a comparison between EEG alpha band power and behavioural inhibition (BIS) and activation (BAS) systems. Laterality 15, 361–384 (2010).
29. Everhart, D. E., Demaree, H. A. & Wuensch, K. L. Healthy high-hostiles evidence low-alpha power (7.5–9.5 Hz) changes during negative affective learning. Brain Cogn. 52, 334–342 (2003).
30. Russell, J. A. Core affect and the psychological construction of emotion. Psychol. Rev. 110, 145–172 (2003).
31. Silberman, E. K. & Weingartner, H. Hemispheric lateralization of functions related to emotion. Brain Cogn. 5, 322–353 (1986).
32. Balconi, M. & Mazza, G. Brain oscillations and BIS/BAS (behavioral inhibition/activation system) effects on processing masked emotional cues. ERS/ERD and coherence measures of alpha band. Int. J. Psychophysiol. 74, 158–165 (2009).
33. Davidson, R. J. In Brain Asymmetry (eds Davidson, R. J. & Hugdahl, K.) 361–387 (MIT Press, 1995).
34. Harmon-Jones, E. Clarifying the emotive functions of asymmetrical frontal cortical activity. Psychophysiology 40, 838–848 (2003).
35. Sutton, S. K. & Davidson, R. J. Prefrontal brain electrical asymmetry predicts the evaluation of affective stimuli. Neuropsychologia 38, 1723–1733 (2000).
36. Harmon-Jones, E. & Allen, J. J. Behavioral activation sensitivity and resting frontal EEG asymmetry: covariation of putative indicators related to risk for mood disorders. J. Abnorm. Psychol. 106, 159–163 (1997).
37. Schmidt, L. A. & Trainor, L. J. Frontal brain electrical activity (EEG) distinguishes valence and intensity of musical emotions. Cogn. Emot. 15, 487–500 (2001).
38. Balconi, M. & Bortolotti, A. Resonance mechanism in empathic behavior. BEES, BIS/BAS and psychophysiological contribution. Physiol. Behav. 105, 298–304 (2012).
39. Junghöfer, M., Bradley, M. M., Elbert, T. R. & Lang, P. J. Fleeting images: a new look at early emotion discrimination. Psychophysiology 38, 175–178 (2001).
40. Balconi, M. & Lucchiari, C. Event-related potentials related to normal and morphed emotional faces. J. Psychol. 139, 176–192.
41. Morita, Y., Morita, K., Yamamoto, M., Waseda, Y. & Maeda, H. Effects of facial affect recognition on the auditory P300 in healthy subjects. Neurosci. Res. 41, 89–95 (2001).
42. Schupp, H. T. et al. Affective picture processing: the late positive potential is modulated by motivational relevance. Psychophysiology 37, 257–261 (2000).
43. Cuthbert, B. N., Schupp, H. T., Bradley, M. M., Birbaumer, N. & Lang, P. J. Brain potentials in affective picture processing: Covariation with autonomic arousal and affective report. Biol. Psychol. 52, 95–111 (2000).
44. Moser, J. S., Hajcak, G., Bukay, E. & Simons, R. F. Intentional modulation of emotional responding to unpleasant pictures: An ERP study. Psychophysiology 43, 292–296 (2006).
45. Schupp, H. T., Junghöfer, M., Weike, A. I. & Hamm, A. O. The selective processing of briefly presented affective pictures: An ERP analysis. Psychophysiology 41, 441–449 (2004).
46. Balconi, M. & Pozzoli, U. Event-related oscillations (EROs) and event-related potentials (ERPs) comparison in facial expression recognition. J. Neuropsychol. 1, 283–294 (2007).
47. Balconi, M. & Pozzoli, U. Event-related oscillations (ERO) and event-related potentials (ERP) in emotional face recognition. Int. J. Neurosci. 118, 1412–1424 (2008).
48. Balconi, M., Brambilla, E. & Falbo, L. Appetitive vs. defensive responses to emotional cues. Autonomic measures and brain oscillation modulation. Brain Res. 1296, 72–84 (2009).
49. Kanske, P. & Kotz, S. A. Modulation of early conflict processing: N200 responses to emotional words in a flanker task. Neuropsychologia 48, 3661–3664 (2010).
50. Balconi, M. & Pozzoli, U. Arousal effect on emotional face comprehension. Frequency band changes in different time intervals. Physiol. Behav. 97, 455–462 (2009).
51. Streit, M., Wölwer, W., Brinkmeyer, J., Ihl, R. & Gaebel, W. Electrophysiological correlates of emotional and structural face processing in humans. Neurosci. Lett. 278, 13–16 (2000).
52. Sato, W., Kochiyama, T., Yoshikawa, S. & Matsumura, M. Emotional expression boosts early visual processing of the face: ERP recording and its decomposition by independent component analysis. Neuroreport 12, 709–714 (2001).
53. Balconi, M. & Pozzoli, U. Face-selective processing and the effect of pleasant and unpleasant emotional expressions on ERP correlates. Int. J. Psychophysiol. 49, 67–74 (2003).
54. Herrmann, M. J. et al. Face-specific event-related potential in humans is independent from facial expression. Int. J. Psychophysiol. 45, 241–244 (2002).
55. Bradley, M. M. & Lang, P. J. In Handbook of Psychophysiology (eds Cacioppo, J. T., Tassinary, L. G. & Berntson, G. G.) 581–607.
56. Carretié, L., Iglesias, J. & García, T. A study on the emotional-processing of visual stimuli through event-related potentials. Brain Cogn. 34, 207–217 (1997).
57. Eimer, M. & Holmes, A. Event-related brain potential correlates of emotional face processing. Neuropsychologia 45, 15–31 (2007).
58. Pastor, M. C. et al. Affective picture perception: Emotion, context, and the late positive potential. Brain Res. 1189, 145–151 (2008).
59. Johnston, V. S., Miller, D. R. & Burleson, M. H. Multiple P3s to emotional stimuli and their theoretical significance. Psychophysiology 23, 684–693 (1986).
60. Balconi, M. & Molteni, E. Past and future of near-infrared spectroscopy in studies of emotion and social neuroscience. J. Cogn. Psychol. 28, 129–146 (2015).
61. Hoshi, Y., Shinba, T. & Doi, N. Resting hypofrontality in schizophrenia: A study using near-infrared time-resolved spectroscopy. Schizophr. Res. 84, 411–420 (2006).
62. Blokland, Y. et al. Combined EEG-fNIRS decoding of motor attempt and imagery for brain switch control: an offline study in patients with tetraplegia. IEEE Trans. Neural Syst. Rehabil. Eng. 22, 222–229 (2014).
63. Fazli, S. et al. Enhanced performance by a hybrid NIRS-EEG brain computer interface. Neuroimage 59, 519–529 (2012).
64. Liu, Y., Ayaz, H., Curtin, A., Onaral, B. & Shewokis, P. In Foundations of Augmented Cognition (eds Schmorrow, D. & Fidopiastis, C.) 335–344 (Springer Berlin Heidelberg, 2013).
65. Balconi, M., Grippa, E. & Vanutelli, M. E. What hemodynamic (fNIRS), electrophysiological (EEG) and autonomic integrated measures can tell us about emotional processing. Brain Cogn. 95, 67–76 (2015).
66. Herrmann, M. J. et al. Enhancement of activity of the primary visual cortex during processing of emotional stimuli as measured with event-related functional near-infrared spectroscopy and event-related potentials. Hum. Brain Mapp. 29, 28–35 (2008).
67. Hongyu, Y., Zhenyu, Z., Yun, L. & Zongcai, R. Gender difference in hemodynamic responses of prefrontal area to emotional stress by near-infrared spectroscopy. Behav. Brain Res. 178, 172–176 (2007).
68. Minati, L., Jones, C. & Gray, M. Emotional modulation of visual cortex activity: a functional near-infrared spectroscopy study. Neuroreport 20, 1344–1350 (2009).
69. Plichta, M. M. et al. Auditory cortex activation is modulated by emotion: A functional near-infrared spectroscopy (fNIRS) study. Neuroimage 55, 1200–1207 (2011).
70. Balconi, M. & Ferrari, C. Emotional memory retrieval. rTMS stimulation on left DLPFC increases the positive memories. Brain Imaging Behav. 6, 454–461 (2012).
71. Hariri, A. R., Bookheimer, S. Y. & Mazziotta, J. C. Modulating emotional responses: effects of a neocortical network on the limbic system. Neuroreport 11, 43–48 (2000).
72. Kalish, Y. & Robins, G. Psychological predispositions and network structure: The relationship between individual predispositions, structural holes and network closure. Soc. Networks 28, 56–84 (2006).
73. Knight, R. T., Staines, W. R., Swick, D. & Chao, L. L. Prefrontal cortex regulates inhibition and excitation in distributed neural networks. Acta Psychol. (Amst.) 101, 159–178 (1999).
74. Miller, E. K. & Cohen, J. D. An integrative theory of prefrontal cortex function. Annu. Rev. Neurosci. 24, 167–202 (2001).
75. Stoeckel, L. E., Palley, L. S., Gollub, R. L., Niemi, S. M. & Evins, A. E. Patterns of Brain Activation when Mothers View Their Own Child and Dog: An fMRI Study. PLoS One 9, e107205 (2014).
76. Balconi, M. & Canavesio, Y. Is empathy necessary to comprehend the emotional faces? The empathic effect on attentional mechanisms (eye movements), cortical correlates (N200 event-related potentials) and facial behaviour (electromyography) in face processing. Cogn. Emot. 1–15 (2014). doi: 10.1080/02699931.2014.993306
77. Posner, M. I. & Raichle, M. E. Images of Mind. (Scientific Medical Library, 1997).
78. Balconi, M. & Lucchiari, C. Consciousness and emotional facial expression recognition: Subliminal/supraliminal stimulation effect on N200 and P300 ERPs. J. Psychophysiol. 21, 100–108 (2007).
79. Marumo, K., Takizawa, R., Kawakubo, Y., Onitsuka, T. & Kasai, K. Gender difference in right lateral prefrontal hemodynamic response while viewing fearful faces: A multi-channel near-infrared spectroscopy study. Neurosci. Res. 63, 89–94 (2009).
80. Morinaga, K. et al. Anticipatory anxiety-induced changes in human lateral prefrontal cortex activity. Biol. Psychol. 74, 34–38 (2007).
81. Tuscan, L. A. et al. Exploring frontal asymmetry using functional near-infrared spectroscopy: A preliminary study of the effects of social anxiety during interaction and performance tasks. Brain Imaging Behav. 7, 140–153 (2013).
82. Balconi, M., Grippa, E. & Vanutelli, M. E. Resting lateralized activity predicts the cortical response and appraisal of emotions: an fNIRS study. Soc. Cogn. Affect. Neurosci. 10, 1607–1614 (2015).
83. Berkowitz, L. In Handbook of Cognition and Emotion (eds Dalgleish, T. & Power, M. J.) 411–428 (1999).
84. Karakaş, S., Erzengin, Ö. U. & Başar, E. The genesis of human event-related responses explained through the theory of oscillatory neural assemblies. Neurosci. Lett. 285, 45–48 (2000).
85. Hoshi, Y. In Neural Correlates of Thinking. On Thinking. (eds Kraft, E., Gulyás, B. & Pöppel, E.) 83–93 (Springer-Verlag Berlin Heidelberg, 2009). doi: 10.1007/978-3-540-68044-4_6
86. Schneider, S. et al. Show me how you walk and I tell you how you feel - A functional near-infrared spectroscopy study on emotion perception based on human gait. Neuroimage 85, 380–390 (2014).
87. Bradley, M. M. & Lang, P. J. Measuring emotion: the Self-Assessment Manikin and the Semantic Differential. J. Behav. Ther. Exp. Psychiatry 25, 49–59 (1994).
88. Oostenveld, R. & Praamstra, P. The five percent electrode system for high-resolution EEG and ERP measurements. Clin. Neurophysiol. 112, 713–719 (2001).
89. Matsuda, G. & Hiraki, K. Sustained decrease in oxygenated hemoglobin during video games in the dorsal prefrontal cortex: A NIRS study of children. Neuroimage 29, 706–711 (2006).
90. Schroeter, M. L., Zysset, S., Kruggel, F. & Von Cramon, D. Y. Age dependency of the hemodynamic response as measured by functional near-infrared spectroscopy. Neuroimage 19, 555–564 (2003).
91. Shimada, S. & Hiraki, K. Infants' brain responses to live and televised action. Neuroimage 32, 930–939 (2006).
Author Contributions
M.B. supervised the experimental paradigms; M.E.V. executed the experiments. M.B. wrote the main manuscript text and M.E.V. prepared the figures. All authors reviewed the manuscript. The submission has been approved by all of the authors and by the institution where the work was carried out (Catholic University of Milan), and all the subjects who participated in the experiment gave their informed consent. All methods were in accordance with the approved guidelines. The manuscript was not submitted elsewhere for publication.
Additional Information
Competing financial interests: The authors declare no competing financial interests.
How to cite this article: Balconi, M. & Vanutelli, M. E. Hemodynamic (fNIRS) and EEG (N200) correlates of emotional inter-species interactions modulated by visual and auditory stimulation. Sci. Rep. 6, 23083; doi: 10.1038/srep23083 (2016).
This work is licensed under a Creative Commons Attribution 4.0 International License. The images or other third party material in this article are included in the article's Creative Commons license, unless indicated otherwise in the credit line; if the material is not included under the Creative Commons license, users will need to obtain permission from the license holder to reproduce the material. To view a copy of this license, visit
... The current study elected to use affective multimodal stimuli to investigate the neural activity underlying multimodal integration during emotion perception using functional neuroimaging and psychophysical measures. Functional near-infrared spectroscopy (fNIRS) is well-suited to study the cortical activity associated with emotion perception, as it allows for the measurement of both oxygenated and deoxygenated blood flow, it does not require immobilization, is resilient to movement, and is a reliable measure of neural activity 54,55 . Additionally, fNIRS measures the hemodynamic response at a higher temporal resolution than fMRI, which makes it wellsuited to capture the temporal complexities of emotion perception. ...
... Additionally, fNIRS measures the hemodynamic response at a higher temporal resolution than fMRI, which makes it wellsuited to capture the temporal complexities of emotion perception. While fNIRS has been used in a variety of cognitive tasks, it has been used less frequently in tasks investigating emotion, with very few investigating the integration of emotional audiovisual stimuli [54][55][56][57] . Importantly, fNIRS data acquisition is virtually silent, which gives it a significant advantage in sound sensitive studies examining auditory processing as fMRI scanners are inherently limited by the loud noise generated during scans. ...
Full-text available
Emotion is communicated via the integration of concurrently presented information from multiple information channels, such as voice, face, gesture and touch. This study investigated the neural and perceptual correlates of emotion perception as influenced by facial and vocal information by measuring changes in oxygenated hemoglobin (HbO) using functional near-infrared spectroscopy (fNIRS) and acquiring psychometrics. HbO activity was recorded from 103 channels while participants (n=39, age=20.37years) were presented with vocalizations produced in either a happy, angry or neutral prosody. Voices were presented alone or paired with an emotional face and compared with a face-only condition. Behavioral results indicated that when voices were paired with faces, a bias in the direction of the emotion of the voice was present. Subjects’ responses also showed greater variance and longer reaction times when responding to the bimodal conditions when compared to the face-only condition. While both the happy and angry prosody conditions exhibited right lateralized increases in HbO compared to the neutral condition, these activations were segregated into posterior-anterior subdivisions by emotion. Specific emotional prosodies may therefore differentially influence emotion perception, with happy voices exhibiting posterior activity in receptive emotion areas and angry voices displaying activity in anterior expressive emotion areas.
... The N200, an ERP conflict-related negativity at the frontocentral electrodes at 200ms after stimulus onset, is a prominent marker in the Flanker task and has been linked to conflict processing at the anterior cingulate cortex (ACC) [8,12]. In experimental studies, the N200 window is related to emotional-cognitive interactions [13,14] and has also been associated with anxietyrelated disorders [15][16][17]. Overall, there seems to be some evidence to indicate the role of the N200 window with emotion-cognition interations in individuals with anxiety-related disorders. ...
Full-text available
The aim: This study used reaction time (RT) and event-related potential (ERP) analysis in an emotion-cognition Eriksen-Flanker (ECEF) task to investigate behavioral and neural abnormalities in individuals with public speaking anxiety (PSA). Although 25 per cent of people worldwide suffer from PSA, there is currently a lack of standardized assessment or biomarkers to detect emotion-cognition abnormalities in individuals with PSA. Material and method: RT and ERP were compared between 12 subjects with high (H) PSA and 12 subjects with low (L) PSA in the ECEF experiment. EEG was recorded with the 14-channel Emotiv EPOC+. Results: RT data showed a significant Flanker Effect across groups in the neutral and emotional (PSA-related) conditions, with increased Flanker effect in the HPSA group. On average, LPSA subjects were faster than the HPSA subjects in the ECEF task. HPSA subjects showed aberrant ERP responses in two ways. Firstly in the reversed N200 conflict effect with increased frontocentral amplitude in the incongruent compared to the congruent condition. Secondly, in the absence of the P200 frontocentral emotional modulation found in LPSA subjects. In the HPSA group, decreased P200 amplitude is significantly related to impaired behavioral performance in the neutral congruent condition. Conclusions: RT and ERP are useful in modern medicine because they successfully unveiled the biomarkers of abnormalities during the interaction of emotion and cognition. Impaired conflict processing in PSA-related conditions was found at the N200 and P200 windows in HPSA individuals.
... modulated by cognitive empathy and top-down evaluation of empathic stimuli (Saxe and Kanwisher 2003); however, this right-hemisphere activation seems to be related specifically to negative-valenced stimuli. In line with this, several studies previously highlighted that cortical right-hemisphere hemodynamic activity seems to be more involved than the left side in processing negative cues (Balconi and Vanutelli 2016; Balconi et al. 2015; Balconi and Mazza 2010). The valence-specific hypothesis could be adopted to explain this result (Balconi and Pozzoli 2003; Junghöfer et al. 2001; Morita et al. 2001; Wedding and Stalans 1985). ...
Empathy for pain is at the basis of altruistic behaviors and is known to be modulated by variables such as group membership, pleasantness or unpleasantness of situations and social relationships. Also, face attractiveness and aesthetic judgment might play a role when observing a person in painful conditions, by increasing individuals’ empathic responsiveness. Indeed, physical attractiveness can modify both the perception of the face itself and its reception in a social context. In the present study, we aimed to assess cortical activity when attention is focused on the aesthetic features of an individual showing painful feelings. Brain activity (optical imaging: functional near-infrared spectroscopy, fNIRS), considered in its hemodynamic components (oxygenated [oxy-Hb] and deoxygenated hemoglobin [deoxy-Hb]) was monitored when 22 subjects (Mage = 24.9; SD = 3.6) observed faces (attractive; unattractive) that received painful stimulations (pain; no pain) and were asked to judge the attractiveness and pain condition of the face. Specifically, we targeted the left and right inferior frontal gyrus (IFG), sensory cortex, and temporo-parietal junction (TPJ). Analyses revealed significant lower oxy-Hb levels in left IFG compared to right hemispheric channels when asking participants to rate faces attractiveness independently from the stimulus features. Besides, lower levels of deoxy-Hb were detected in the right TPJ for unattractive faces compared to attractive faces. Overall, present findings highlighted that the formulation of an aesthetic judgment and face attractiveness plays a relevant role in empathic concerns and this seems to be able to overlay painful appraisal.
... For instance, emotion recognition from facial expression has been used in several studies [9,10]. Some other researchers have used peripheral physiological data such as blood pressure [11], hemodynamic data acquired by fNIRS [12], electrocardiography signal (ECG) [13], speech [14], skin conductance and electromyography [15] and the combination of these signals [16]. Among these techniques, electroencephalography (EEG) which measures brain electrical activities has been one of the most interesting tools for emotion recognition [17]. ...
Emotion is a fundamental factor that influences human cognition, motivation, decision making and social interactions. This psychological state arises spontaneously and goes with physiological changes that can be recognized by computational methods. In this study, changes in minimum spanning tree (MST) structure of brain functional connectome were used for emotion classification based on EEG data and the obtained results were employed for interpretation about the most informative frequency content of emotional states. For estimation of interaction between different brain regions, several connectivity metrics were applied and interactions were calculated in different frequency bands. Subsequently, the MST graph was extracted from the functional connectivity matrix and its features were used for emotion recognition. The results showed that the accuracy of the proposed method for separating emotions with different arousal levels was 88.28%, while for different valence levels it was 81.25%. Interestingly, the system performance for binary classification of emotions based on quadrants of arousal-valence space was also higher than 80%. The MST approach allowed us to study the change of brain complexity and dynamics in various emotional states. This capability provided us enough knowledge to claim lower-alpha and gamma bands contain the main information for discrimination of emotional states.
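The MST pipeline summarized above can be sketched roughly as follows. The connectivity metric (absolute Pearson correlation) and the particular graph features are illustrative assumptions, not the authors' exact choices:

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

# Simulated multichannel EEG: 8 channels x 256 samples (illustrative data).
rng = np.random.default_rng(0)
signals = rng.standard_normal((8, 256))

# Functional connectivity matrix: absolute Pearson correlation between channels.
conn = np.abs(np.corrcoef(signals))

# MST algorithms minimize total weight, so convert similarity to distance.
dist = 1.0 - conn
np.fill_diagonal(dist, 0.0)  # zero entries are treated as absent edges

mst = minimum_spanning_tree(dist).toarray()
edges = mst > 0  # an MST over n nodes has n - 1 edges

# Example MST features for a classifier: leaf fraction, max degree, mean edge weight.
degree = (edges | edges.T).sum(axis=1)
leaf_fraction = np.mean(degree == 1)
features = np.array([leaf_fraction, degree.max(), mst[edges].mean()])
print(features.shape)
```

Repeating this per frequency band (after band-pass filtering) would yield the band-specific feature sets the abstract compares.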
... The measurement of neural correlates of cognitive and affective processes using concurrent EEG and fNIRS, multimodal functional neuroimaging, has seen growing interest [43][44][45][46]. As fNIRS and EEG measure complementary aspects of brain activity (hemodynamic and electrophysiological, respectively), hybrid brain data incorporate more information, enabling higher mental decoding accuracy [43] and confirming earlier findings [47]. ...
Human facial expressions are regarded as a vital indicator of one's emotion and intention, and even reveal the state of health and wellbeing. Emotional states have been associated with information processing within and between subcortical and cortical areas of the brain, including the amygdala and prefrontal cortex. In this study, we evaluated the relationship between spontaneous human facial affective expressions and multi-modal brain activity measured via non-invasive and wearable sensors: functional near-infrared spectroscopy (fNIRS) and electroencephalography (EEG) signals. The affective states of twelve male participants detected via fNIRS, EEG, and spontaneous facial expressions were investigated in response to both image-content stimuli and video-content stimuli. We propose a method to jointly evaluate fNIRS and EEG signals for affective state detection (emotional valence as positive or negative). Experimental results reveal a strong correlation between spontaneous facial affective expressions and the perceived emotional valence. Moreover, the affective states were estimated by the fNIRS, EEG, and fNIRS + EEG brain activity measurements. We show that the proposed EEG + fNIRS hybrid method outperforms fNIRS-only and EEG-only approaches. Our findings indicate that the dynamic (video-content based) stimuli trigger a larger affective response than the static (image-content based) stimuli. These findings also suggest the joint utilization of facial expression and wearable neuroimaging, fNIRS, and EEG, for improved emotional analysis and affective brain-computer interface applications.
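One common way to realize an EEG + fNIRS hybrid classifier is feature-level fusion: concatenate the two feature sets before classification. The sketch below uses fabricated features and a generic classifier; it is not the authors' pipeline, and the feature names and dimensions are assumptions:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_trials = 60
eeg_feats = rng.standard_normal((n_trials, 16))   # e.g. band powers per channel
fnirs_feats = rng.standard_normal((n_trials, 8))  # e.g. mean oxy-Hb per channel
y = rng.integers(0, 2, n_trials)                  # binary valence labels

# Feature-level fusion: stack EEG and fNIRS features per trial.
hybrid = np.hstack([eeg_feats, fnirs_feats])

clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, hybrid, y, cv=5)
print(scores.mean())  # chance-level on random data; real features should exceed it
```

Comparing this score against classifiers trained on `eeg_feats` or `fnirs_feats` alone reproduces the hybrid-versus-unimodal comparison described in the abstract.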
... The clinical group showed a generally stronger activation in emotion-related neural activity regions compared to the healthy controls. Balconi et al. (2016) applied functional near-infrared spectroscopy (fNIRS) and electroencephalography (EEG) to explore the cortical lateralization effect in the dorsolateral prefrontal cortex (DLPFC) region of the brain under emotional visual and auditory stimulation. They confirmed that the DLPFC plays an important role in processing emotional interactions, including inter-species ones. ...
... Specifically, while the 90 images were displayed randomly, the subjects counted how many images of a specified class were displayed. Each of the images was displayed for six seconds with an inter-stimulus interval of eight seconds in order to alleviate the influence of previously displayed images in the same manner as [2]. For the inter-stimulus interval, a cross mark was displayed in the center of the monitor. ...
In this paper, we propose a human-centered image classification via a neural network considering visual and biological features. The proposed method has two novelties. Firstly, we apply Group-Sparse Local Fisher Discriminant Analysis (GS-LFDA) to biological features. GS-LFDA realizes dimensionality reduction and noise elimination for biological features with consideration of local structures and class information. Secondly, we construct a Canonical Correlation Analysis (CCA)-based hidden layer via Discriminative Locality Preserving CCA (DLPCCA). DLPCCA transforms visual features into effective features by considering the relationships with biological information and class information. The CCA-based hidden layer enables transformation of visual features into effective features for image classification from a small number of training samples. Furthermore, once the projection can be obtained in the training phase, elimination of the need for biological data acquisition in the test phase is realized. This is another merit of our method.
Neurovascular coupling is a key physiological mechanism that occurs in the healthy human brain, and understanding this process has implications for understanding the aging and neuropsychiatric populations. Combined electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS) has emerged as a promising, noninvasive tool for probing neurovascular interactions in humans. However, the utility of this approach critically depends on the methodological quality used for multimodal integration. Despite a growing number of combined EEG–fNIRS applications reported in recent years, the methodological rigor of past studies remains unclear, limiting the accurate interpretation of reported findings and hindering the translational application of this multimodal approach. To fill this knowledge gap, we critically evaluated various methodological aspects of previous combined EEG–fNIRS studies performed in healthy individuals. A literature search was conducted using PubMed and PsycINFO on June 28, 2021. Studies involving concurrent EEG and fNIRS measurements in awake and healthy individuals were selected. After screening and eligibility assessment, 96 studies were included in the methodological evaluation. Specifically, we critically reviewed various aspects of participant sampling, experimental design, signal acquisition, data preprocessing, outcome selection, data analysis, and results presentation reported in these studies. Altogether, we identified several notable strengths and limitations of the existing EEG–fNIRS literature. In light of these limitations and the features of combined EEG–fNIRS, recommendations are made to improve and standardize research practices to facilitate the use of combined EEG–fNIRS when studying healthy neurovascular coupling processes and alterations in neurovascular coupling among various populations.
Purpose: Here we aimed to automatically classify human emotion earlier than is typically attempted. There is increasing evidence that the human brain differentiates emotional categories within 100-300 ms after stimulus onset. Therefore, here we evaluate the possibility of automatically classifying human emotions within the first 300 ms after the stimulus and identify the time interval of the highest classification performance. Methods: To address this issue, MEG signals of 17 healthy volunteers were recorded in response to three different picture stimuli (pleasant, unpleasant, and neutral pictures). Six Linear Discriminant Analysis (LDA) classifiers were used based on two binary comparisons (pleasant versus neutral and unpleasant versus neutral) and three different time intervals (100-150 ms, 150-200 ms, and 200-300 ms post-stimulus). The selection of the feature subsets was performed by Genetic Algorithm and LDA. Results: We demonstrated significant classification performances in both comparisons. The best classification performance was achieved with a median AUC of 0.83 (95% CI [0.71; 0.87]), classifying brain responses evoked by unpleasant and neutral stimuli within 100-150 ms, which is at least 850 ms earlier than attempted by other studies. Conclusion: Our results indicate that, using the proposed algorithm, brain emotional responses can be significantly classified at very early stages of cortical processing (within 300 ms). Moreover, our results suggest that emotional processing in the human brain occurs within the first 100-150 ms.
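One binary comparison from the design above (e.g. unpleasant versus neutral in the 100-150 ms window) can be sketched as follows. The sampling rate, data dimensions, and simulated signals are assumptions for illustration, and the genetic-algorithm feature selection is omitted:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(1)
fs = 1000                                    # assumed sampling rate (Hz)
epochs = rng.standard_normal((80, 20, 500))  # trials x sensors x samples
y = np.repeat([0, 1], 40)                    # 0 = neutral, 1 = unpleasant

# Feature per trial: mean amplitude per sensor in the 100-150 ms window.
w = slice(int(0.100 * fs), int(0.150 * fs))
X = epochs[:, :, w].mean(axis=2)

# Cross-validated LDA probabilities, summarized as AUC.
lda = LinearDiscriminantAnalysis()
proba = cross_val_predict(lda, X, y, cv=5, method="predict_proba")[:, 1]
auc = roc_auc_score(y, proba)
print(auc)  # ~0.5 on random data; the study reports up to 0.83 on real MEG
```

The other comparisons and windows follow by swapping the labels and the slice bounds.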
This resting-state functional magnetic resonance imaging (rs-fMRI) study in tinnitus patients was conducted to observe the spontaneous neural activity of the central auditory system using a derived index, mean amplitude of low-frequency fluctuation (mALFF). Tinnitus subjects with right-ear hearing impairment (THL) and without hearing loss (TNH) and two age-, sex-, and education-matched control groups (NC1 and NC2) were recruited for rs-fMRI. mALFF maps of the tinnitus and matched NC groups were plotted in the central auditory system, including the primary auditory cortex (PAC), higher auditory cortex (HAC), and hubs of the central auditory pathway. mALFF values of the activity clusters in the central auditory system of THL and TNH patients were extracted and correlated with each clinical characteristic. Significantly increased mALFF clusters were found in bilateral PAC and HAC of THL-NC1 maps and in the left inferior colliculus and right HAC of TNH-NC2 maps. Thus, subgroups of tinnitus with and without hearing impairment might exhibit different homeostatic plasticity in the central auditory system. mALFF values of aberrant active clusters in the central auditory system are partly associated with specific clinical tinnitus characteristics.
Prefrontal cortex provides both inhibitory and excitatory input to distributed neural circuits required to support performance in diverse tasks. Neurological patients with prefrontal damage are impaired in their ability to inhibit task-irrelevant information during behavioral tasks requiring performance over a delay. The observed enhancements of primary auditory and somatosensory cortical responses to task-irrelevant distractors suggest that prefrontal damage disrupts inhibitory modulation of inputs to primary sensory cortex, perhaps through abnormalities in a prefrontal-thalamic sensory gating system. Failure to suppress irrelevant sensory information results in increased neural noise, contributing to the deficits in decision making routinely observed in these patients. In addition to a critical role in inhibitory control of sensory flow to primary cortical regions, tertiary prefrontal cortex also exerts excitatory input to activity in multiple sub-regions of secondary association cortex. Unilateral prefrontal damage results in multi-modal decreases in neural activity in posterior association cortex in the hemisphere ipsilateral to the damage. This excitatory modulation is necessary to sustain neural activity during working memory. Thus, prefrontal cortex is able to sculpt behavior through parallel inhibitory and excitatory regulation of neural activity in distributed neural networks.
Near-infrared spectroscopy (NIRS) enables the non-invasive measurement of spatiotemporal characteristics of brain function, and has received increasing attention during the last years. This renewed interest is attributable to unique characteristics of the NIRS technique, which can be summarised as a set of technical advantages: its experimental and ecological validity and its applicability to clinical samples. This paper presents the main applications of the NIRS technique, which measures changes in brain activation, to the study of emotions and the social neuroscience field. In the first part of this paper, we discuss the basic principles, strengths, and limitations of NIRS for the study of principal emotional functions. In the second part, we focus on the actual applications of NIRS in emotional and social research. In this regard, first, we consider some main topics of emotional contexts, such as visual (facial expression) and auditory cue recognition, and the social neuroscience field. Second, we discuss the utility of applying NIRS simultaneously with other techniques (electroencephalography, Transcranial Magnetic Stimulation, and functional Magnetic Resonance Imaging) to improve the intrinsic power of such measures. Third, we consider possible applications of NIRS devices to study specific emotion-related functions (such as connectivity and plasticity applications).
Using recent regional brain activation/emotion models as a theoretical framework, we examined whether the pattern of regional EEG activity distinguished emotions induced by musical excerpts which were known to vary in affective valence (i.e., positive vs. negative) and intensity (i.e., intense vs. calm) in a group of undergraduates. We found that the pattern of asymmetrical frontal EEG activity distinguished valence of the musical excerpts. Subjects exhibited greater relative left frontal EEG activity to joy and happy musical excerpts and greater relative right frontal EEG activity to fear and sad musical excerpts. We also found that, although the pattern of frontal EEG asymmetry did not distinguish the intensity of the emotions, the pattern of overall frontal EEG activity did, with the amount of frontal activity decreasing from fear to joy to happy to sad excerpts. These data appear to be the first to distinguish valence and intensity of musical emotions on frontal electrocortical measures.
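The asymmetry findings above rest on the standard frontal alpha asymmetry index, ln(right alpha power) minus ln(left alpha power). Since alpha power is inversely related to cortical activity, a positive index indicates greater relative left frontal activity. A minimal sketch with hypothetical power values (channel names and numbers are illustrative):

```python
import numpy as np

# Hypothetical alpha-band power at homologous frontal sites (arbitrary units).
alpha_power = {"F3": 4.0, "F4": 6.0}  # left (F3) lower alpha = more active left

# Standard asymmetry index: ln(right) - ln(left).
asymmetry = np.log(alpha_power["F4"]) - np.log(alpha_power["F3"])
print(asymmetry > 0)  # positive index: greater relative left frontal activity
```

Under the valence model described above, a positive index would be expected for joy or happy excerpts and a negative index for fear or sad excerpts.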
Next generation brain computer interfaces (BCI) are expected to provide a robust and continuous control mechanism. In this study, we assessed the integration of optical brain imaging (fNIR: functional near infrared spectroscopy) into a P300-BCI for improving BCI usability by monitoring cognitive workload and performance. fNIR is a safe and wearable neuroimaging modality that tracks cortical hemodynamics in response to sensory, motor, or cognitive activation. Eight volunteers participated in the study, where simultaneous EEG and 16-optode fNIR signals from the anterior prefrontal cortex were recorded while participants engaged with the P300-BCI for spatial navigation. The results showed a significant response in fNIR signals during high, medium and low performance, indicating a positive correlation between prefrontal oxygenation changes and BCI performance. This preliminary study provided evidence that the performance of a P300-BCI can be monitored by fNIR, which in turn can help improve the robustness of the BCI classification.
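The reported positive relationship between prefrontal oxygenation changes and BCI performance amounts to a simple correlation across sessions or blocks. A sketch with fabricated values (the numbers are illustrative, not the study's data):

```python
import numpy as np

# Hypothetical per-block values: prefrontal HbO change (a.u.) and P300-BCI accuracy.
oxy_change = np.array([0.2, 0.5, 0.9, 1.1, 1.4])
bci_accuracy = np.array([0.55, 0.62, 0.75, 0.80, 0.90])

# Pearson correlation between oxygenation and performance.
r = np.corrcoef(oxy_change, bci_accuracy)[0, 1]
print(r > 0)  # a positive r means oxygenation tracks BCI performance
```

In an online system, such a tracked relationship could trigger adaptive measures (e.g. slowing stimulus presentation) when predicted performance drops.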
At the heart of emotion, mood, and any other emotionally charged event are states experienced as simply feeling good or bad, energized or enervated. These states - called core affect - influence reflexes, perception, cognition, and behavior and are influenced by many causes internal and external, but people have no direct access to these causal connections. Core affect can therefore be experienced as free-floating (mood) or can be attributed to some cause (and thereby begin an emotional episode). These basic processes spawn a broad framework that includes perception of the core-affect-altering properties of stimuli, motives, empathy, emotional meta-experience, and affect versus emotion regulation; it accounts for prototypical emotional episodes, such as fear and anger, as core affect attributed to something plus various nonemotional processes.
Emotion decoding constitutes a case of multimodal processing of cues from multiple channels. Previous behavioural and neuropsychological studies indicated that, when we have to decode emotions on the basis of multiple perceptive information, a cross-modal integration takes place. The present study investigates the simultaneous processing of emotional tone of voice and emotional facial expression by event-related potentials (ERPs), through a wide range of different emotions (happiness, sadness, fear, anger, surprise, and disgust). Auditory emotional stimuli (a neutral word pronounced in an affective tone) and visual patterns (emotional facial expressions) were matched in congruous (the same emotion in face and voice) and incongruous (different emotions) pairs. Subjects (N=30) were required to process the stimuli and to indicate their comprehension (by stimpad). ERP variations and behavioural data (response times, RTs) were submitted to repeated-measures analysis of variance (ANOVA). We considered two time intervals (150-250; 250-350 ms post-stimulus) in order to explore the ERP variations. ANOVA showed two different ERP effects: a negative deflection (N2), more anteriorly distributed (Fz), and a positive deflection (P2), more posteriorly distributed, with different cognitive functions. In the first case, N2 may be considered a marker of the emotional content (sensitive to type of emotion), whereas P2 may represent a cross-modal integration marker, as it varied as a function of the congruous/incongruous condition, showing a higher peak for congruous stimuli than incongruous stimuli. Finally, an RT reduction was found for some emotion types in the congruous condition (i.e. sadness) and an inverted effect for other emotions (i.e. fear, anger, and surprise).
In the last years, social neuroscience research has attempted to identify the neural networks underlying the human ability to perceive others' emotions, a core process in establishing meaningful social bonds. A large number of papers arose and identified common and specific empathy-based networks with respect to stimulus type and task. Although the great majority of studies focused on human-human contexts, we do not establish relations with only other humans, but also with non-human animals. The aim of the present work was to explore the brain mechanisms involved in empathic concern for people who interact with both peers and other species. Participants were assessed by functional near-infrared spectroscopy (fNIRS) while viewing pictures depicting humans interacting with both other men and women (human-human condition: HH), or with dogs and cats (human-animal: HA). Results showed that aggressive HH interactions elicited greater prefrontal (PFC) activity than HA ones while, when considering HA interactions, friendly ones were related to higher cortical activity. Finally, increases in oxy- (O2Hb) and deoxyhemoglobin (HHb) related to the processing of aggressive interactions positively correlated with different empathic measures within more specific brain regions. Results were elucidated with respect to available evidence on emotion perception, empathic neural mechanisms, and their functional meaning for human-animal contexts.
The present research explored the effect of cross-modal integration of emotional cues (auditory and visual, AV) compared with only visual (V) emotional cues in observing inter-species interactions. Brain activity was monitored while subjects processed AV and V situations which represented emotional (positive or negative), inter-species (human-animal) interactions. Congruence (emotionally congruous or incongruous visual and auditory patterns) was also modulated. EEG brain oscillations (from delta to beta) were analyzed and cortical source localization (by sLORETA) was applied to the data. Frequency band analysis (mainly low-frequency delta and theta) showed a significant increase in brain activity in response to negative compared to positive interactions within the right hemisphere. Moreover, differences were found based on stimulation type, with an increased effect for AV compared with V. Finally, the delta band supported a lateralized right DLPFC activity in response to negative and incongruous inter-species interactions, mainly for AV. The contribution of cross-modality, congruence (incongruous patterns) and lateralization (right DLPFC) in response to inter-species emotional interactions was discussed in light of a "negative lateralized effect".
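The frequency-band analyses described above start from band power estimates per channel. A hedged sketch of delta and theta power from one simulated channel via Welch's PSD; the signal, sampling rate, and band limits are illustrative assumptions:

```python
import numpy as np
from scipy.signal import welch

fs = 256
rng = np.random.default_rng(7)
eeg = rng.standard_normal(fs * 10)  # 10 s of simulated single-channel EEG

# Welch power spectral density with 2 s segments (0.5 Hz resolution).
freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)

def band_power(freqs, psd, lo, hi):
    """Approximate power in [lo, hi) Hz by summing PSD bins times bin width."""
    mask = (freqs >= lo) & (freqs < hi)
    df = freqs[1] - freqs[0]
    return psd[mask].sum() * df

delta = band_power(freqs, psd, 0.5, 4.0)   # delta band
theta = band_power(freqs, psd, 4.0, 8.0)   # theta band
print(delta > 0 and theta > 0)
```

Comparing such band powers between conditions (negative vs. positive, AV vs. V) and between homologous left/right channels reproduces the kind of contrasts the abstract reports.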