Multimedia Tools and Applications (2024) 83:67673–67686
https://doi.org/10.1007/s11042-024-18218-8
Haptic stimulation during the viewing of a film: an EEG-based study
VíctorCerdán‑Martínez1 · ÁlvaroGarcía‑López2 · PabloRevuelta‑Sanz3 ·
TomásOrtiz1 · RicardoVergaz4
Received: 11 December 2022 / Revised: 8 October 2023 / Accepted: 8 January 2024 /
Published online: 27 January 2024
© The Author(s) 2024
Abstract
Recent psychology and neuroscience studies have applied tactile stimuli to patients, concluding that touch is a sense tightly linked to emotions. In parallel, a new way of seeing films, 4D cinema, has added new stimuli to the traditional audiovisual channel, including tactile vibration. In this work, we studied the brain activity of an audience while viewing an emotionally charged scene, filmed and directed by us, under two different conditions: 1) image + sound; 2) image + sound + vibro-tactile stimulation. We designed a glove in which pulse trains are generated in coin motors at specific moments, and recorded 35 viewers' electroencephalograms (EEGs) to evaluate the impact of the vibro-tactile stimulation during the film projection. Hotelling's T-squared results show higher brain intensity when the tactile stimulus is received during the viewing than when no tactile stimulus is injected. Condition 1 participants showed activation in left and right orbitofrontal areas, whereas in Condition 2 they also showed activity in right superior frontal and right medial frontal areas. We conclude that the addition of the vibrotactile stimulus increases brain activity in areas linked to attentional processes, while producing higher intensity in those related to emotional processes.
Keywords EEG · Emotions · Attention · Neurocinematics · Vibro-tactile · Multisensorial · Temporal cortex · Orbito-frontal cortex · Audiovisual
1 Introduction
Traditional films are designed to be perceived by two senses: sight and hearing [1, 2]. Technological development in recent years has allowed the transformation of the audiovisual experience into a multisensorial one, named 4D cinema in some works [3, 5, 6, 74, 76]. Several experiments have used smell, taste, or touch stimuli in films or video games [7–9]. For instance, Lemmens et al. [10] designed a suit that provided vibrotactile stimulation to boost the emotions of the viewers during the projection of audiovisual works. None of the above studies used brain activity measurements, basing their results only on the subjective impressions of the participants.
Extended author information available on the last page of the article
To evaluate the impact of viewing films on humans, several neuroscience studies have used electroencephalography (EEG) to investigate viewers' brain activity [4, 11–16, 92]. Some of them have focused on emotions, showing that temporal and orbitofrontal areas are the most important ones for processing emotional content.
Regarding hemispheric lateralization of emotion, several competing hypotheses remain under debate. One suggests greater participation of the right hemisphere in emotion perception, regardless of its valence, i.e., whether the emotion is positive or negative [17]. Alternatively, a different participation of each hemisphere as a function of emotional valence has been proposed: the left hemisphere is more closely related to the valence of emotions, and the right one to their intensity [17]. An additional proposal links positive emotions with activation of the left hemisphere, while activation of the right one would be driven by negative emotions [18–22], though further studies are needed to reinforce these valence differentiations. Notwithstanding, we can find at least one consensus in the literature: temporal lobes and orbitofrontal areas are the brain regions activated in any emotional processing [23, 24].
Although the number of basic emotions is still a controversial topic, the consensus reaches at least the following six: anger, fear, sadness, happiness, disgust, and surprise [25–27, 77]. Emotions can also be analyzed according to their valence (negative/unpleasant to positive/pleasant) and arousal (low to high) [28, 96].
Regarding the relation between emotions and brainwaves, Jung et al. showed that explicit violent content in films increases EEG activity mainly in the delta, theta, and alpha bands [97], but other studies widen the range of frequencies related to emotional reactions [29, 98], also showing the preponderance of the theta band [72]. Some EEG-based studies support the hypothesis of a higher intensity of brain activity under negative emotions compared with that produced by positive ones [30, 31]. The origin of this activity could be located in cortical-limbic structures and in anterior orbitofrontal and temporal areas [32, 89].
Espenhahn et al. analyzed brain activity in young people while viewing films, using somatosensory evoked potentials measured with EEG [78]. Their results, however, were not focused on evaluating emotional responses under different ways of watching the movies, as in the work presented here. Closer to it, Raheel et al. [33, 99] performed a study in which brain activity was analyzed while watching a film with and without image and sound, and with additional stimulation such as heating or ventilation. Their conclusion was that those stimuli increased the emotional intensity induced by the movie.
Furthermore, other studies have designed tactile devices, such as vests to enhance people's affective communication through multimedia [34], or to evoke specific emotions through vibration and heat [35]. Additional works have focused on enhancing the movie-watching experience through tactile stimulation [10, 36, 37], similar to Lemmens et al.'s [10] multisensory jacket, designed with 16 segments of 4 motors each, capable of generating vibrotactile stimuli synchronized with moments in a film. Other studies have designed haptic vests applied to video games and augmented reality [38]; in one such vibrotactile feedback jacket, fourteen vibration actuators were integrated to generate an immersive experience by activating different vibration patterns and intensity levels according to the game scenes.
Other recent studies have used vibrotactile stimuli to boost the attentional or emotional response in people with disabilities [39, 40], or even in non-disabled people [14, 41]. Recently, a study was conducted involving hearing-impaired individuals,
in which the participants' cortical activity was recorded using EEG [42]. While they viewed a video of a neutral landscape, two different soundtracks were played, each evoking a clearly different emotion. In one mode, subtitles were projected indicating the type of music, its title, and its authorship. In the other mode, the same video was projected along with synchronized vibrotactile stimulation through a haptic glove instead of subtitles. The hearing-impaired participants' response proved highly attentional in the first case (frontal lobe activation), but highly emotional in the second (temporal lobe activity).
Nevertheless, to our knowledge, no relevant studies have examined brain activity in people without disabilities when tactile stimulation alone is added while viewing films. The goal of this work is to analyze this activity while viewing a traditional film with sound and image, and to compare it with the activity when tactile stimulation is added on the hands, focusing on the differences in order to understand how the brain processes this new stimulus. This will pave the way to separating different neural processes, and to distinguishing the exact role of the involved brain areas, as a foundation for analyses when working with people with disabilities.
Despite the existing studies analyzing viewers' brain activity and emotions, there are no EEG studies to date that assess the differences between traditional viewing of audiovisual works and multisensory viewing, such as viewing that involves tactile stimulation. We therefore conducted a study that provides results regarding these differences, considering that the perception of movies, video games, and other audiovisual content will increasingly be influenced by multisensory stimuli in the near future.
2 Materials andMethods
Thirty-five participants, aged between 18 and 75, make up our sample, in accordance with other similar EEG-based studies [12, 43, 44]. Prior to the experiments, subjects were asked about phobias, mental disorders, psychiatric and neurological pathologies, and/or use of psychotropic substances. None of the participants reported any of the above conditions. This study is part of a line of research whose clinical trials were approved on April 4, 2019 by the Clinical Research Ethics Committee (CEIC) of the San Carlos Clinical Hospital, Madrid (Spain).
2.1 Audiovisual material
Following the conclusions of studies that used audiovisual stimulation to analyze emotions with EEG [4, 45], and, above all, that of Pereira et al. [46], which showed that videos longer than one minute are advisable when trying to detect emotions through EEG, we used a film sequence lasting 5 min. We shot it ourselves with a small film crew: two professional actors (an actress and an actor) played a tender scene, in a well-lit but intimate atmosphere, kissing and caressing on the bed of a room. The director's aim was to create an environment that defined the personal relationship between two lovers.
The video was produced in a hotel room in the central area of Madrid. It involved a
director, a camera operator/director of photography, a sound technician, and a production
assistant. The technical equipment used included a Sony PXW-FS5 camcorder, LED
lighting, and a Zoom H5 sound recorder with Rode NTG-4 microphones. Recording
files were stored in MXF format with the XAVC codec, at a resolution of 1920 × 1080
in progressive mode, 25 frames per second, and a bitrate of 50 Mbps. Subsequently,
these files were converted to the Apple ProRes 422 codec so they could be edited in Final
Cut Pro X. The final export of this process was an MP4 file with the H.264 codec, at a
resolution of 1920 × 1080 and a bitrate of 8 Mbps.
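The conversions themselves were performed with professional tools (Final Cut Pro X for the edit and export). As an illustration only, an equivalent transcode chain can be reproduced from the command line with ffmpeg; the file names below are hypothetical, and the encoder settings are a sketch of the reported parameters, not the authors' actual export settings:

```python
import subprocess

# Stage 1 (illustrative): XAVC/MXF camera master -> Apple ProRes 422
# intermediate for editing. prores_ks profile 2 is standard ProRes 422.
subprocess.run([
    "ffmpeg", "-i", "master.mxf",            # hypothetical input file
    "-c:v", "prores_ks", "-profile:v", "2",
    "-c:a", "pcm_s16le",
    "intermediate.mov",
], check=True)

# Stage 2 (illustrative): edited intermediate -> 1080p25 H.264 MP4 at ~8 Mbps.
subprocess.run([
    "ffmpeg", "-i", "intermediate.mov",
    "-c:v", "libx264", "-b:v", "8M",
    "-s", "1920x1080", "-r", "25",
    "-c:a", "aac",
    "final.mp4",
], check=True)
```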
To check its validity, 110 students from the Information Sciences Faculty of the Complutense University of Madrid were surveyed to assess the kind of emotion transmitted by the film. A questionnaire was applied using a model of emotions that defines a space of limited, discrete, basic emotions, as well as some complex emotions [28, 47, 96]. The students watched the video in a classroom at the Faculty of Information Sciences in three different sessions. When the screening was over, the attending students were instructed to complete a questionnaire posted on the Virtual Campus using their computers or mobile devices. In this preliminary study, students rated the film as 'pleasant' as opposed to 'unpleasant', with an average score of 4.1 (5 being 'pleasant' and 1 being 'unpleasant'). In the questionnaire, we also asked them to rate the perceived intensity of specific emotions on a scale of 1 to 5: happiness, relaxation, satisfaction, and surprise. The students associated the video more with feelings of relaxation (3.56) and satisfaction (3.01) than with happiness (1.72) and surprise (1.06).
After the preliminary study, we projected the same film to 35 participants (different from the previous 110) in two different modes: 1) image and sound, and 2) image, sound, and tactile stimulation (Fig. 2). Thus, in mode 1 viewers watched and heard the video in the traditional way, whereas in mode 2 the same video was synchronized with tactile stimuli.
The director of the film chose the exact times at which the tactile stimuli had to be triggered (Fig. 1). These were mainly the images in which the action included explicit physical contact between the actors, such as kisses and caresses. Once these moments were determined in milliseconds, they were included in the protocol so that, through the "EEG Control" software, they could be launched in complete synchronization with the video. Before conducting the experiment, the director viewed the audiovisual work with tactile stimulation to ensure that all stimuli were correctly placed. It is worth noting that the exact placement and duration of the stimuli were established as a purely creative film decision, just as the lighting, framing, or setup could have been [48, 79].
Fig. 1 Sample of frames from the video where there is tactile interaction between the two video characters
We employed a protocol that encompassed all commands for automatic synchronization with the video through our proprietary software ("EEG Control"). The software serves as a protocol hub that unifies communication across multiple systems. The protocol consists of millisecond timestamps paired with commands such as "play video," "glove vibration," or "EEG marker" (Fig. 2). This ensured precise synchronization between the video and the tactile stimuli or EEG markers (brain activity data collection).
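As an illustration of how such a hub can work, here is a minimal host-side sketch in Python. The command names follow the examples quoted above; the serial transport, port, baud rate, and the player/EEG hooks are assumptions, since the "EEG Control" software is proprietary:

```python
import time

import serial  # pyserial, an assumed transport to the Arduino


def start_player():
    """Placeholder: start video playback in the media player."""


def send_eeg_marker():
    """Placeholder: write an event marker into the EEG acquisition stream."""


# Hypothetical protocol table in the spirit of the paper's "EEG Control" hub:
# millisecond timestamps mapped to commands. The real command names,
# timestamps, and transport are not published.
PROTOCOL = [
    (0, "play video"),
    (12_340, "glove vibration"),
    (12_340, "EEG marker"),
]


def run_protocol(events, port="/dev/ttyACM0"):
    link = serial.Serial(port, 115200)        # trigger line to the Arduino
    t0 = time.monotonic()
    for ts_ms, cmd in sorted(events):
        while (time.monotonic() - t0) * 1000 < ts_ms:
            time.sleep(0.0005)                # spin until the scheduled offset
        if cmd == "play video":
            start_player()
        elif cmd == "glove vibration":
            link.write(b"V")                  # one-byte vibration trigger
        elif cmd == "EEG marker":
            send_eeg_marker()

# run_protocol(PROTOCOL)  # requires the Arduino to be connected
```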
We must remark that the order of the modes was counterbalanced across viewers, thus avoiding any memory effect in the brain responses. Half of the participants viewed Condition 1 first and then Condition 2, while the other half did the opposite: the first participant watched the video in Condition 1 and then in Condition 2, participant 2 watched it in Condition 2 and then in Condition 1, and so on.
2.2 Tactile stimulation and protocol
Two haptic gloves were used for the vibrotactile stimulation. Each glove was an Inesis Golf Glove 100 on which 6 Uxcell 1030 micromotors were placed, one on each of the finger pads and one on the palm. These micromotors provided haptic stimulation by vibration (Fig. 3). The motors operated at 3 V DC and consumed up to 70 mA each, with a maximum of 630 mA per glove. All other main electronic elements were placed
Fig. 2 Schematic representation of the two modes of the experiment. Mode 1: image and sound. Mode 2: image, sound, and tactile stimulation. In both modes there was simultaneous EEG recording
Fig. 3 Gloves used with the coin
motors locations clearly visible
on a printed circuit board (PCB) or Arduino shield, with three L293D motor controllers (to control the 12 motors for the 2 gloves) and all the appropriate connections to the gloves and power supply. In order to provide enough power and to avoid damaging the Arduino board due to excessive current demand by the haptic stimulation devices, we designed a drive circuit to control each motor switch. This circuit was made up of a power bank, another L293D motor controller (which controlled up to 4 motors and provided up to 0.6 A per channel), and a flyback diode to protect against motor flyback currents.
Each positive haptic stimulation lasts 1.6 s. The haptic stimuli are modulated as a square PWM signal with a variable duty cycle for intensity control, in bursts at 1 kHz. Stimulation is applied finger by finger, with the following order of cumulative activations: palm, thumb, index, middle, ring, little finger, followed by reverse deactivation back towards the palm, in a full 1.6 s sequence. This pattern is produced in both hands simultaneously. To generate the stimuli, an Arduino UNO rev3 was used, triggered in turn by a control PC and synchronized with the viewing of the film. Exact timing was mandatory, as was a snug fit of the glove on the user's hand, to ensure correct haptic stimulation through the motors.
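To make the timing concrete, the following sketch computes when each motor switches on and off during one sequence. It assumes the on/off transitions are evenly spaced across the 1.6 s, since the paper gives only the total duration:

```python
# Timing sketch for one 1.6 s stimulation sequence: cumulative activation
# palm -> little finger, then reverse deactivation back towards the palm.
# Equal spacing of the transitions is an assumption.
MOTORS = ["palm", "thumb", "index", "middle", "ring", "little"]
SEQUENCE_S = 1.6
N_INTERVALS = 2 * len(MOTORS) - 1   # 11 equal intervals between 12 transitions
STEP_S = SEQUENCE_S / N_INTERVALS


def motor_schedule():
    """Return (motor, on_time_s, off_time_s) tuples for one sequence."""
    events = []
    for i, motor in enumerate(MOTORS):
        on = i * STEP_S                  # palm switches on first...
        off = (N_INTERVALS - i) * STEP_S  # ...and switches off last
        events.append((motor, round(on, 3), round(off, 3)))
    return events


for motor, on, off in motor_schedule():
    print(f"{motor:>6}: on at {on:.3f} s, off at {off:.3f} s")
```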
At every moment where the director of the film decided to insert an emotional tactile stimulus, a mark was generated in the EEG so that the following 200 ms could be analyzed (Fig. 4). We selected this analysis window in the EEG recordings because, according to several studies, it is suitable for analyzing the emotional activity of viewers during the viewing of a visual or audiovisual work [42, 49–52].
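For clarity, here is a minimal sketch of this marker-locked windowing in Python with NumPy; the array layout and marker format are our assumptions, not the authors' actual pipeline:

```python
import numpy as np


def extract_epochs(eeg, marker_samples, fs=1000, win_ms=200):
    """Cut the win_ms window following each stimulus marker.

    eeg: (n_channels, n_samples) array sampled at fs Hz (1 kHz here).
    marker_samples: sample indices at which tactile-stimulus marks occur.
    Returns an (n_epochs, n_channels, win_samples) array.
    """
    win = int(fs * win_ms / 1000)
    epochs = [eeg[:, m:m + win] for m in marker_samples
              if m + win <= eeg.shape[1]]
    return np.stack(epochs)
```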
Each participant underwent a control test to verify that the vibration itself did not produce brain activity beyond tactile detection. This vibration consisted of a signal with a constant frequency of 2 Hz and a duty cycle of 10%, with the 6 motors of each hand on for 50 ms and off for 450 ms. The test lasted 3 min. The reason for using such a low duty cycle was twofold: to avoid discomfort over the long duration, and to keep the tactile stimulus constant.
Fig. 4 Design of the marks and tactile stimuli for the experiment recording through EEG. First step: tactile stimuli added, synched with the actors' gestures (kisses, caresses, etc.). Second step: marks added at those very moments in the mode 2 EEG recording (image, sound, and touch). Third step: the same marks applied in the mode 1 EEG recording (image and sound)
All the stimulations were tracked in this way to evaluate the differences in brain activity among viewers. Once the EEG record was obtained, a Self-Assessment Manikin (SAM) questionnaire was administered to evaluate the emotional experience of the viewers regarding valence, arousal, and dominance [28, 47, 53]. Each participant individually reported their emotional experience for each mode of the film on a scale of up to 5 points. For valence they chose between pleasant (5 points), pleased (4 points), neutral (3 points), unsatisfied (2 points), and unpleasant (1 point). For arousal they selected between excited (5 points), wide-awake (4 points), neutral (3 points), dull (2 points), and calm (1 point). Finally, for dominance the participants chose between dependent (5 points), powerlessness (4 points), neutral (3 points), powerful (2 points), and independent (1 point). The duration of the whole experiment, including EEG instrumentation, glove setup and initialization, and the questionnaires, was around 60 min per participant. No payment was made to anyone in the sample group for this study.
2.3 EEG recording method
A 64-channel Neuroscan Quik-Cap was used to acquire the EEG recordings, with ATI Pentatek © software (from Advantel SRL). Before each recording, electrode impedance was checked to be under 5 kΩ. Reference electrodes were placed on the two mastoids. The sampling frequency was 1 kHz. The data were re-referenced to the common average. Each register was visually inspected to clean the data of artifacts due to eye or muscle movements, and noisy channels were replaced by a linear interpolation of adjacent channels. Additionally, channels whose squared magnitude was higher than four standard deviations of their mean power were replaced by the mean of the adjacent channels [54].
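A minimal sketch of the power-based channel rejection just described (Python/NumPy); the neighbour lookup and the exact form of the 4-SD criterion are our reading of the text, not published code:

```python
import numpy as np


def clean_channels(segment, neighbors):
    """Replace outlier channels by the mean of their spatial neighbours.

    segment: (n_channels, n_samples) average-referenced EEG segment.
    neighbors: dict mapping channel index -> list of adjacent channel
               indices (an assumed montage lookup).
    """
    power = (segment ** 2).mean(axis=1)            # per-channel mean power
    bad = power > power.mean() + 4 * power.std()   # the 4-SD criterion
    cleaned = segment.copy()
    for ch in np.flatnonzero(bad):
        good = [n for n in neighbors[ch] if not bad[n]]
        if good:                                   # average the clean neighbours
            cleaned[ch] = segment[good].mean(axis=0)
    return cleaned
```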
2.4 Brain source localization
To locate the origin of the neural activity, the EEG inverse problem was solved using the Low Resolution Electromagnetic Tomography (LORETA) method [55]. The solution for each model was restricted to one specific anatomical structure, or a combination of more than one. Those restrictions derived from segmenting the average brain atlas of the Montreal Neurological Institute (MNI) [56] into 90 parts, following the automated anatomical labeling (AAL) atlas [57].
LORETA was applied through the Neuronic software [49, 58, 59, 90] to 50 to 70 artifact-free 200 ms windows per participant and mode. It produced a series of bioelectrical activation maps revealing the maximum activation areas for each group. Once those areas were located, statistical parametric maps (SPMs) were computed through Hotelling's T-squared test against zero, voxel by voxel, to determine statistically significant sources. Applied to independent groups [60], this allowed us to obtain probability maps thresholded at a false discovery rate (FDR) of q = 0.05 [61], depicted as 3D activation images overlaid on an MNI brain model. Once the probability maps were obtained, anatomical structures larger than 10 voxels and over the threshold (according to the AAL atlas) were identified and highlighted [61]. Subsequently, local maxima were located using the MNI XYZ coordinate system.
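To make the statistics concrete, here is a sketch of a voxel-wise one-sample Hotelling's T-squared test against zero with Benjamini-Hochberg FDR thresholding (Python with NumPy/SciPy). It assumes each voxel contributes a p-dimensional source vector per participant, e.g. three dipole components; it is not the Neuronic implementation itself:

```python
import numpy as np
from scipy import stats


def hotelling_t2_vs_zero(x):
    """One-sample Hotelling's T2 test of mean vector = 0.

    x: (n_subjects, p) array, e.g. the p = 3 dipole components of one
    voxel across participants. Returns (T2, p_value).
    """
    n, p = x.shape
    mean = x.mean(axis=0)
    cov = np.cov(x, rowvar=False)            # (p, p) sample covariance
    t2 = n * mean @ np.linalg.solve(cov, mean)
    f = t2 * (n - p) / (p * (n - 1))         # T2 relates to F(p, n - p)
    return t2, stats.f.sf(f, p, n - p)


def fdr_threshold(pvals, q=0.05):
    """Benjamini-Hochberg: largest sorted p(k) <= k * q / m."""
    p = np.sort(np.asarray(pvals))
    m = p.size
    below = p <= q * np.arange(1, m + 1) / m
    return p[below].max() if below.any() else 0.0
```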
3 Results
LORETA results reveal maximum activity in both the left orbitofrontal area (X = 2, Y = 51, Z = –5, T2 = 4.07) and the right orbitofrontal area (X = 3, Y = 52, Z = –4, T2 = 4.21), with a maximum intensity of 4.21, in mode 1 (image + sound). In mode 2 (image + sound + touch), the average brain activity was located in the right superior frontal area (X = 27, Y = 62, Z = 6, T2 = 26.40) and the right medial frontal area (X = 40, Y = 53, Z = 8, T2 = 22.99), as well as the left orbitofrontal (X = –5, Y = 62, Z = –5, T2 = 18.054) and right orbitofrontal (X = –4, Y = 62, Z = –43, T2 = 19.718) areas, with a maximum intensity of 26.40. These results are shown in the brain activity maps in Fig. 5.
The questionnaires administered after the EEG experiment show differences between the emotional responses of the viewers in mode 1 and mode 2. While in condition 1 the viewers scored the emotional valence at 4.2 (pleasant), in mode 2 it was slightly higher at 4.3 (pleasant). Regarding arousal, the differences were larger: the average score in mode 1 was 3 (neutral), whereas the multisensorial condition 2 reached 4.1 (wide awake). Moreover, participants scored dominance at 4.1 (powerlessness) in mode 1, and 2.1 (powerful) in condition 2.
4 Discussion
Our results show that tactile stimulation produces higher activity in frontal and orbitofrontal areas during the viewing of a film with multisensorial stimuli. According to several studies [59, 62], emotions are located in orbitofrontal brain areas, while attentional processes arise from the frontal areas.
Interestingly, we find a remarkable increase in the brain activity of condition 2 (image + sound + touch) with respect to mode 1 (image + sound) in superior frontal areas. Several studies have related superior frontal areas with attentional processes [63, 64]. Therefore, the reason for the observed increment in those areas may be the generation of attentional processes in viewers when a new stimulus is added to the traditional viewing of a film, as they are not used to that way of perceiving audiovisual works, as some authors state [65, 66, 83]. Furthermore, although there is a significant difference in brain activity between the two conditions (4.21 in Condition 1 compared to 26.4 in Condition 2), the results are consistent with studies [67] suggesting that when we are accustomed to a stimulus, the introduction of a new one has a greater impact on brain activity. In the case of
Fig. 5 EEG results in: left, Mode 1 (image and sound); right, Mode 2 (image + sound + touch). Maximal intensity projection areas are displayed in yellow/red. SPMs were computed based on a voxel-by-voxel Hotelling's T2 test against zero
this experiment, participants were accustomed to viewing movies with sound and images,
but not with a tactile stimulus.
This conclusion can readily be linked with research in habituation psychology [68–70, 76]. In a cinematographic environment, the incorporation of a new stimulus can be compared to the situation at the early stages of the medium's development, at the beginning of the 20th century, when viewers saw the first moving images on a screen. Among the audience, some were unable to perceive persons, only "flying heads," because the actors' whole bodies were not visible on screen [71]. This was a new kind of stimulation for which the brain needed a certain training, hence requiring a volitional attentional process.
On the other hand, it is remarkable that the results show a higher lateralization of brain activity towards the right hemisphere in condition 2 (image + sound + touch) compared to mode 1. According to Pralus et al. [17], right-hemisphere activation is related to a higher perceived intensity of an audiovisual work, while left-hemisphere activation is more related to the valence of the emotion. Our EEG recording results seem to reinforce these conclusions, because they match the viewers' answers in the questionnaires [28, 96]. Although the participants provided a similar valence score in both conditions (pleasant), they differed remarkably in arousal and dominance. While arousal was neutral in condition 1, viewers felt more awake in condition 2. Indeed, on the dominance scale they qualified the tactile stimulation condition as "powerful," whereas condition 1 was rated "powerlessness."
Finally, all these conclusions agree with the increase in frontal-area activity in condition 2, which several authors have linked with attentional processes [63].
5 Conclusions
We conclude that tactile stimulation increases the intensity of viewers' brain activity while watching a film with emotional content. Viewers perceive a higher emotional intensity, as well as developing more cognitive attention focused on the projected film.
We would like to note some limitations of the present study. First, the tactile stimuli were applied only to the hands; further research is needed to analyze brain activity with additional olfactory or gustatory stimuli, and/or tactile stimuli on other parts of the body. Second, for future research, a comparison should be made between two groups: one stimulated with a sequence similar to that of our mode 2, and another previously trained on the sensory stimuli. In this way, we could assess the real effect of the tactile stimulation on viewers perceiving works with multisensorial stimuli.
Funding Open Access funding provided thanks to the CRUE-CSIC agreement with Springer Nature.
Data availability The datasets generated during and/or analysed during the current study are available from
the corresponding author on reasonable request.
Declarations
Informed consent The participants of the experiment signed a consent document.
Conflicts of interest We do not have conflicts of interest.
Research involving Human Participants and/or Animals This study is part of a line of research whose clinical trials were approved on April 4, 2019 by the Clinical Research Ethics Committee (CEIC) of the San Carlos Clinical Hospital, Madrid (Spain).
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
References
1. Akiba HT, Costa MF, Gomes JS, Oda E, Simurro PB, Dias AM (2019) Neural correlates of preference: a transmodal validation study. Front Hum Neurosci 13:73. https://doi.org/10.3389/fnhum.2019.00073
2. Huang RS, Chen CF, Sereno MI (2018) Spatiotemporal integration of looming visual and tactile stimuli near the face. Human Brain Mapp 39(5):1256–2176. https://doi.org/10.1002/hbm.23995
3. Biazon LC, Martinazzo AAG, Jose MA, Ficheman IK, Zuffo MK, Lopes RD (2016) Developing an interactive 4D cinema platform with open source tools. 2016 IEEE International Symposium on Consumer Electronics (ISCE). https://doi.org/10.1109/isce.2016.7797404
4. Kang D, Kim J, Jang D-P, Cho YS, Kim S-P (2015) Investigation of engagement of viewers in movie
trailers using electroencephalography. Brain-Computer Interfaces 2:193–201. https:// doi. org/ 10. 1080/
23262 63X. 2015. 11035 91
5. Terlutter R, Diehl S, Koinig I, Waiguny MKJ (2016) Positive or Negative Effects of Technology
Enhancement for Brand Placements? Memory of Brand Placements in 2D, 3D, and 4D Movies. Media
Psychol 19(4):505–533. https:// doi. org/ 10. 1080/ 15213 269. 2016. 11423 77
6. Zhou Y, Tapaswi M, Fidler S (2018) Now you shake me: towards automatic 4D cinema. 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp 7425–7434. https://openaccess.thecvf.com/content_cvpr_2018/html/Zhou_Now_You_Shake_CVPR_2018_paper.html
7. Genna C, Oddo C, Fanciullacci C, Chisari C, Micera S, Artoni F (2018) Bilateral cortical representation
of tactile roughness. Brain Res 1699:79–88. https:// doi. org/ 10. 1016/j. brain res. 2018. 06. 014
8. Choi B, Lee E-S, Yoon K (2011) Streaming media with sensory effect. 2011 International Conference on Information Science and Applications. https://doi.org/10.1109/icisa.2011.5772390
9. Rangel ML, Souza L, Rodrigues EC, Oliveira JM, Miranda MF, Galves A, Vargas CD (2021) Predict-
ing upcoming events occurring in the space surrounding the Hand. Neural Plast 2021:6649135
10. Lemmens P, Crompvoets F, Brokken D, van den Eerenbeemd J, de Vries G-J (2009) A body-conforming tactile jacket to enrich movie viewing. World Haptics 2009 - Third Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems. https://doi.org/10.1109/whc.2009.4810832
11. Simons RF, Detenber BH, Cuthbert BN, Schwartz DD, Reiss JE (2003) Attention to television: Alpha
power and its relationship to image motion and emotional content. Media Psychol 5(3):283–301.
https:// doi. org/ 10. 1207/ S1532 785XM EP0503_ 03
12. Jalilifard A, Rastegarnia A, Pizzolato EB, Islam MK (2020) Classification of emotions induced by horror
and relaxing movies using single-channel EEG recordings. Int J Electr Comput Eng (IJECE) 10(4):3826
13. Kauttonen J, Hlushchuk Y, Jääskeläinen IP, Tikka P (2018) Brain mechanisms underlying cue-based
memorizing during free viewing of movie Memento. Neuroimage 172:313–325. https:// doi. org/ 10.
1016/j. neuro image. 2018. 01. 068
14. Masood N, Farooq H (2021) Comparing Neural Correlates of Human Emotions across Multiple Stimu-
lus Presentation Paradigms. Brain Sci 11(6):696. https:// doi. org/ 10. 3390/ brain sci11 060696
15. Raz G, Hendler T (2014) Forking cinematic paths to the self: neurocinematically informed model of empathy in motion pictures. Projections 8(2). https://doi.org/10.3167/proj.2014.080206
16. Zhu L, Wu Y (2021) Love your country: EEG evidence of actor preferences of audiences in patriotic
movies. Front Psychol 12:717025
17. Pralus A, Belfi A, Hirel C, Lévêque Y, Fornoni L, Bigand E, Jung J, Tranel D, Nighoghossian N,
Tillmann B, Caclin A (2020) Recognition of musical emotions and their perceived intensity after uni-
lateral brain damage. Cortex, S0010945220302173. https:// doi. org/ 10. 1016/j. cortex. 2020. 05. 015
18. Brookshire G, Casasanto D (2018) Approach motivation in human cerebral cortex. Philos Trans B 373.
https:// doi. org/ 10. 1098/ rstb. 2017. 0141
19. Comer C, DeVore B, Harrison P, Harrison D (2017) Cerebral laterality, emotion, and cardiopulmonary
functions: An investigation of left and right CVA patients. Acta Neuropsychologica 15:32–55. https://
doi. org/ 10. 5604/ 01. 3001. 0010. 6093
20. Davidson R (1993) Cerebral asymetry and emotion: conceptual and methodological conundrums.
Cog Emotion 7:115–138
21. Gainotti G (2019) A historical review of investigations on laterality of emotions in the human
brain. J Hist Neurosci 11:217–233. https:// doi. org/ 10. 1080/ 09647 04X. 2018. 15246 83
22. Wyczesany M, Capotosto P, Zappasodi F, Prete G (2018) Hemispheric asymmetries and emotions:
Evidence from effective connectivity. Neuropsychologia 121:98–105
23. Kortelainen J, Väyrynen E, Seppänen T (2015) High-frequency electroencephalographic activity in left temporal area is associated with pleasant emotion induced by video clips. Comput Intell Neurosci 1–14. https://doi.org/10.1155/2015/762769
24. Meletti S, Tassi L, Mai R, Fini N, Tassinari CA, Russo GL (2006) Emotions induced by intracerebral electri-
cal stimulation of the temporal lobe. Epilepsia 47(5):47–51. https:// doi. org/ 10. 1111/j. 1528- 1167. 2006. 00877
25. Barrett LF (2011) Was Darwin wrong about emotional expressions? Curr Dir Psychol Sci 20:400–
406. https:// doi. org/ 10. 1177/ 09637 21411 429125
26. Ortony A, Turner TJ (1990) What’s basic about basic emotions? Psychol Rev 97:315–331. https://
doi. org/ 10. 1037/ 0033- 295X. 97.3. 315
27. Panksepp J (2010) Affective consciousness in animals: perspectives on dimensional and primary
process emotion approaches. Proc Biol Sci 277:2905–2907. https:// doi. org/ 10. 1098/ rspb. 2010. 1017
28. Barrett LF, Mesquita B, Ochsner KN, Gross JJ (2007) The experience of emotion. Annu Rev Psy-
chol 58:373–403. https:// doi. org/ 10. 1146/ annur ev. psych. 58. 110405. 085709
29. Zheng W-L, Zhu J-Y, Lu B-L (2019) Identifying stable patterns over time for emotion recognition
from EEG. IEEE Trans Affect Comput 10(3):417–429. https:// doi. org/ 10. 1109/ taffc. 2017. 27121 43
30. Arjmand H-A, Hohagen J, Paton B, Rickard NS (2017) Emotional Responses to Music: Shifts in Frontal Brain
Asymmetry Mark Periods of Musical Change. Front Psychol 8:2044. https:// doi. org/ 10. 3389/ fpsyg. 2017. 02044
31. Pan F, Zhang L, Ou Y, Zhang X (2019) The audio-visual integration effect on music emotion: Behavio-
ral and physiological evidence. PLoS ONE 14(5):1–21. https:// doi. org/ 10. 1371/ journ al. pone. 02170 40
32. Linhartová P, Látalová A, Kóša B, Kašpárek T, Schmahl C, Paret C (2019) fMRI neurofeedback in emo-
tion regulation: A literature review. Neuroimage. https:// doi. org/ 10. 1016/j. neuro image. 2019. 03. 011
33. Raheel A, Majid M, Alnowami M, Anwar SM (2020) Physiological Sensors Based Emotion Recognition
While Experiencing Tactile Enhanced Multimedia. Sensors 20(14):4037. https:// doi. org/ 10. 3390/ s2014 4037
34. Zhang L, Saboune J, El Saddik A (2015) Development of a haptic video chat system. Multimedia
Tools Appl 74(15):5489–5512. https:// doi. org/ 10. 1007/ s11042- 014- 1865-x
35. Arafsha F, Alam KM, El Saddik A (2015) Design and development of a user centric affective haptic
jacket. Multimed Tools Appl 74(9):3035–3052. https:// doi. org/ 10. 1007/ s11042- 013- 1767-3
36. Ablart D, Velasco C, Obrist M (2017) Integrating mid-air haptics into movie experiences. In: Proceedings of the 2017 ACM International Conference on Interactive Experiences for TV, pp 77–84. https://doi.org/10.1145/3077548.3077551
37. Mazzoni A, Bryan-Kinns N (2016) Moody: haptic sensations to enhance mood in film music. In: Proceedings of the 2016 ACM Conference Companion Publication on Designing Interactive Systems. https://doi.org/10.1145/2908805.2908811
38. Zhu L, Cao Q, Cai Y (2020) Development of augmented reality serious games with a vibrotactile feedback
jacket. Virtual Reality & Intelligent Hardware 2(5):454–470. https:// doi. org/ 10. 1016/j. vrih. 2020. 05. 005
39. Piccardi ES, Begum Ali J, Jones EJH, Mason L, Charman T, … Gliga T (2021) Behavioural and
neural markers of tactile sensory processing in infants at elevated likelihood of autism spectrum
disorder and/or attention deficit hyperactivity disorder. J Neurodevelopmental Disord 13(1). https://
doi. org/ 10. 1186/ s11689- 020- 09334-1
40. Portnova GV, McGlone FP, Tankina OA, Skorokhodov IV, Shpitsberg IL, Varlamov AA (2019)
EEG correlates of tactile perception abnormalities in children with autism spectrum disorder.
Sovremennye Tekhnologii v Meditsine 11(1):169. https:// doi. org/ 10. 17691/ stm20 19. 11.1. 20
41. Zhuang N, Zeng Y, Yang K, Zhang C, Tong L, Yan B (2018) Investigating Patterns for Self-Induced
Emotion Recognition from EEG Signals. Sensors 18(3):841. https:// doi. org/ 10. 3390/ s1803 0841
42. Mulas MJ, Revuelta P, Garcia A, Ruiz B, Vergaz R, Cerdan V, Ortiz T (2020) Vibrotactile caption-
ing of musical effects in audio-visual media as an alternative for deaf and hard of hearing people:
An EEG study. IEEE Access: Pract Innovations, Open Solutions 8:190873–190881. https:// doi. org/
10. 1109/ access. 2020. 30322 29
43. Lee Y-Y, Hsieh S (2014) Classifying different emotional states by means of EEG-based functional con-
nectivity patterns. PLoS ONE 9(4):e95415. https:// doi. org/ 10. 1371/ journ al. pone. 00954 15
44. Pradhapan P, Velazquez ER, Witteveen JA, Tonoyan Y, Mihajlović V (2020) The Role of Features
Types and Personalized Assessment in Detecting Affective State Using Dry Electrode EEG. Sensors
20:6810. https:// doi. org/ 10. 3390/ s2023 6810
45. Revuelta P, Ortiz T, Lucía MJ, Ruiz B, Sánchez-Pena JM (2020) Limitations of Standard Accessible
Captioning of Sounds and Music for Deaf and Hard of Hearing People: An EEG Study. Front Integr
Neurosci 14:1. https:// doi. org/ 10. 3389/ fnint. 2020. 00001
46. Pereira ET, Gomes HM, Veloso LR, Mota MRA (2021) Empirical evidence relating EEG signal dura-
tion to emotion classification performance. IEEE Trans Affect Comput 12(1):154–164. https:// doi. org/
10. 1109/ taffc. 2018. 28541 68
47. Geethanjali B, Adalarasu K, Hemapraba A, Kumar SP, Rajasekeran R (2017) Emotion analysis using
SAM (Self-Assessment Manikin) scale. Biomedical Research-tokyo. https:// acort ar. link/ OV4Xdn
48. Bordwell D, Bordwell T (1988) Film Art: an Introduction, 2nd edn. Random House
49. Ponz A, Montant M, Liegeois-Chauvel C, Silva C, Braun M, Jacobs AM, Ziegler JC (2014) Emotion
processing in words: a test of the neural re-use hypothesis using surface and intracranial EEG. Social
Cognitive and Affective Neuroscience 9(5):619–627. https:// doi. org/ 10. 1093/ scan/ nst034
50. Xie W, McCormick SA, Westerlund A, Bowman LC, Nelson CA (2019) Neural correlates of facial
emotion processing in infancy. Dev Sci 22(3):e12758. https:// doi. org/ 10. 1111/ desc. 12758
51. Yang K, Tong L, Shu J, Zhuang N, Yan B, Zeng Y (2020) High Gamma Band EEG Closely Related
to Emotion: Evidence From Functional Network. Front Human Neurosci 14. https:// doi. org/ 10. 3389/
fnhum. 2020. 00089
52. Yang T, Di Bernardi Luft C, Sun P, Bhattacharya J, Banissy MJ (2020) Investigating age-related neural
compensation during emotion perception using electroencephalography. Brain Sci 10(2):61. https://
doi. org/ 10. 3390/ brain sci10 020061
53. Lang PJ (1985) The Cognitive Psychophysiology of Emotion: Anxiety and the Anxiety Disorders.
Lawrence Erlbaum, Hillsdale, NJ
54. Dmochowski JP, Bezdek MA, Abelson BP, Johnson JS, Schumacher EH, Parra LC (2014) Audi-
ence preferences are predicted by temporal reliability of neural processing. Nature Communications
5(July):1–9. https:// doi. org/ 10. 1038/ ncomm s5567
55. Pascual-Marqui RD, Michel CM, Lehman D (1994) Low resolution electromagnetic tomography: a
new method for localizing electrical activity of the brain. Int J Psychophysiololy 18:49–65. https:// doi.
org/ 10. 1016/ 0167- 8760(84) 90014-X
56. Evans AC, Collins DL, Mills SR, Brown ED, Kelly RL, Peters TM (1993) 3D statistical neuroanatomi-
cal models from 305 MRI volumes. Proc. IEEE- Nucl Sci Symp Med Imaging Conf 95:1813–1817
(https:// bit. ly/ 2FbDz Dw)
57. Tzourio-Mazoyer N, Landeau B, Papathanassiou D, Crivello F, Etard O, Delcroix N, Mazoyer B, Joliot
M (2002) Automated anatomical labeling of activations in SPM using a macroscopic anatomical par-
cellation of the MNI MRI single-subject brain. Neuroimage 15(1):273–289
58. Al-Naan A, Hosny M, Al-Ohali Y, Al-Wabil A (2017) Review and classification of emotion rec-
ognition based on EEG brain-computer interface system research: A systematic review. Appl Sci
7(12):1239. https:// doi. org/ 10. 3390/ app71 21239
59. Goshvarpour A, Goshvarpour A (2019) EEG spectral powers and source localization in depressing,
sad, and fun music videos focusing on gender differences. Cogn Neurodyn 13:161–173. https:// doi. org/
10. 1007/ s11571- 018- 9516-y
60. Carbonell F, Galan L, Valdes P, Worsley K, Biscay RJ, Diaz-Comas L (2004) Random field-union intersection
tests for EEG/MEG imaging. Neuroimage 22:268–276. https:// doi. org/ 10. 1016/j. neuro image. 2004. 01. 020
61. Lage-Castellanos A, Martínez-Montes E, Hernández-Cabrera JA, Galán L (2010) False discovery rate and
permutation test: an evaluation in ERP data analysis. Stat Med 29:63–74. https:// doi. org/ 10. 1002/ sim. 3784
62. Cela-Conde C, Ayala F, Munar E, Maestú F, Nadal M, Capó M, Del Río D, López-Ibor J, Ortiz T,
Mirasso C, Marty G (2009) Sex-related similarities and differences in the neural correlates of beauty.
PNAS 106(10):3847–3852. https:// doi. org/ 10. 1073/ pnas. 09003 04106
63. Hales JB, Brewer JB (2011) The timing of associative memory formation: frontal lobe and anterior
medial temporal lobe activity at associative binding predicts memory. J Neurophysiol 105(4):1454–
1463. https:// doi. org/ 10. 1152/ jn. 00902. 2010
64. Vaidya AR, Fellows LK (2019) Ventromedial frontal lobe damage affects interpretation, not explora-
tion, of emotional facial expressions. Cortex; a J Devoted Study Nerv Syst Behav 113:312–328. https://
doi. org/ 10. 1016/j. cortex. 2018. 12. 013
65. Geller JD (2020) Introduction: Psychotherapy through the lens of cinema. J Clin Psychol. https:// doi.
org/ 10. 1002/ jclp. 22995
66. Pace-Schott EF, Shepherd E, Spencer RMC, Marcello M, Tucker M, Propper RE, Stickgold R (2011)
Napping promotes inter-session habituation to emotional stimuli. Neurobiol Learn Mem 95(1):24–36.
https:// doi. org/ 10. 1016/j. nlm. 2010. 10. 006
67. Schweinberger SR, Neumann MF (2016) Repetition effects in human ERPs to faces. Cortex; a J
Devoted Study Nerv Syst Behav 80:141–153. https:// doi. org/ 10. 1016/j. cortex. 2015. 11. 001
68. Sokolov YN (1963) Higher nervous functions: the orienting reflex. Annual Review of Physiology 25:545–580. https://doi.org/10.1146/annurev.ph.25.030163.002553
69. Wood DC (1988) Habituation in Stentor produced by mechanoreceptor channel modification. J Neuro-
sci 8(7):2254–2258. https:// doi. org/ 10. 1523/ JNEUR OSCI. 08- 07- 02254. 1988
70. Rankin HA, Abrams T, Barry RJ, Bhatnagar S, Clayton DF, Colombo J, Thompson RF (2009) Habitu-
ation revisited: An updated and revised description of the behavioral characteristics of habituation.
Neurobiol Learn Mem 92(2):135–138. https:// doi. org/ 10. 1016/j. nlm. 2008. 09. 012
71. Zatsepin V (2017) Acting for the silent screen: Film actors and aspiration between the wars. Hist J
Film Radio Telev 37(4):760–762. https:// doi. org/ 10. 1080/ 01439 685. 2017. 13451 33
72. Balconi M, Lucchiari C (2006) EEG correlates (event-related desynchronization) of emotional face elabora-
tion: A temporal analysis. Neuroscience Letters 392(1–2):0–123. https:// doi. org/ 10. 1016/j. neulet. 2005. 09. 004
73. Breiter H, Etcoff N, Whalen P, Kennedy W, Rauch S, Buckner R, Srauss M, Hyman S, Rosen B (1996)
Response and Habituation of the Human Amygdala during Visual Processing of Facial Expression.
Neuron 17(5):875–887. https:// doi. org/ 10. 1016/ S0896- 6273(00) 80219-6
74. Cardini F, Costantini M, Galati G, Romani GL, Làdavas E, Serino A (2011) Viewing One’s Own
Face Being Touched Modulates Tactile Perception: An fMRI Study. J Cogn Neurosci 23(3):503–513.
https:// doi. org/ 10. 1162/ jocn. 2010. 21484
75. Cela-Conde C, Marty G, Maestú F, Ortiz T, Munar E, Fernández A, Roca M, Rosselló J, Quesne F
(2004) Activation of the prefrontal cortex in the human visual aesthetic perception. Proc Natl Acad Sci
16:6321–6325. https:// doi. org/ 10. 1073/ pnas. 04014 27101
76. Christoforou C, Papadopoulos TC, Constantinidou F, Theodorou M (2017) Your Brain on the Movies:
A Computational Approach for Predicting Box-office Performance from Viewer’s Brain Responses to
Movie Trailers. Front Neuroinform 11:72. https:// doi. org/ 10. 3389/ fninf. 2017. 00072
77. Ekman P, Cordaro D (2011) What is meant by calling emotions basic. Emot Rev 3:364–370. https://
doi. org/ 10. 1177/ 17540 73911 410740
78. Espenhahn S, Yan T, Beltrano W, Kaur S, Godfrey K, Cortese F, Bray S, Harris AD (2020) The effect
of movie-watching on electroencephalographic responses to tactile stimulation. NeuroImage. https://
doi. org/ 10. 1016/j. neuro image. 2020. 117130
79. Giménez Sarmiento Á, CerdánMartínez V (2022) Propuesta metodológica para el análisis de la forma
documental De la cuantificación a la cualificación. Comunicación Y Métodos 4(1):9–25. https:// doi.
org/ 10. 35951/ v4i1. 151
80. Harmon-Jones E (2003) Clarifying the emotive functions of asymmetrical frontal cortical activity. Psy-
chophysiology 40(6):838–848. https:// doi. org/ 10. 1111/ 1469- 8986. 00121
81. Gainotti G (2018) Emotions and the Right Hemisphere: Can New Data Clarify Old Models? Neurosci-
entist. https:// doi. org/ 10. 1177/ 10738 58418 785342
82. Grimshaw GM, Carmel D (2014) An asymmetric inhibition model of hemispheric differences in emo-
tional processing. Front Psychol 5:489
83. Kaneko T, Tomonaga M (2008) Utility of Habituation-Dishabituation Procedure for Comparative Cog-
nitive Studies of Callithrix Jacchus and Aotus spp.: Preliminary Assessments. Perceptual and Motor
Skills 106(3):830–832. https:// doi. org/ 10. 2466/ pms. 106.3. 830- 832
84. Knyazev GG, Slobodskoj-Plusnin JY, Bocharov AV (2009) Event-related delta and theta synchronization
during explicit and implicit emotion processing. Cognitive Neuroscience 164(4):0–1600
85. Khan P, Ranjan P, Kumar S (2021) AT2GRU: a human emotion recognition model with mitigated device heterogeneity. IEEE Trans Affect Comput. https://doi.org/10.1109/taffc.2021.3114123
86. Leeuwis N, Pistone D, Flick N, van Bommel T (2021) A sound prediction: EEG-based neural syn-
chrony predicts online music streams. Front Psychol 12:672980
87. Major S, Carpenter K, Beyer L, Kwak H, Dawson G, Murias M (2020) The influence of background auditory
noise on P50 and N100 suppression elicited by the Paired-Click Paradigm. J Psychophysiol 34(3):171–178
88. Masood N, Farooq H (2019) Investigating EEG Patterns for Dual-Stimuli Induced Human Fear Emo-
tional State. Sensors 19:522. https:// doi. org/ 10. 3390/ s1903 0522
89. Miller JG, Xia G, Hastings PD (2019) Resting Heart Rate Variability is Negatively Associated with
Mirror Neuron and Limbic Response to Emotional Faces. Biological Psychology 107717. https:// doi.
org/ 10. 1016/j. biops ycho. 2019. 107717
90. Ortiz Alonso T, Santos JM, Ortiz Terán L, Borrego Hernández M, Poch Broto J, de Erausquin GA
(2015) Differences in early stages of tactile ERP temporal sequence (P100) in cortical organization
during passive tactile stimulation in children with blindness and controls. PLoS ONE 10(7):e0124527.
https:// doi. org/ 10. 1371/ journ al. pone. 01245 27
91. Schrammen E, Grimshaw GM, Berlijn AM, Ocklenburg S, Peterburs J (2020) Response inhibition
to emotional faces is modulated by functional hemispheric asymmetries linked to handedness. Brain
Cogn 145(105629):105629. https:// doi. org/ 10. 1016/j. bandc. 2020. 105629
92. Smith ME, Gevins A (2004) Attention and brain activity while watching television: Components of
viewer engagement. Media Psychol 6(3):285–305. https:// doi. org/ 10. 1207/ s1532 785xm ep0603_3
93. Smith EE, Reznik SJ, Stewart JL, Allen JJB (2017) Assessing and conceptualizing frontal EEG asym-
metry: An updated primer on recording, processing, analyzing, and interpreting frontal alpha asymme-
try. Int J Psychophysiol 111:98–114. https:// doi. org/ 10. 1016/j. ijpsy cho. 2016. 11. 005
94. Tanaka K, Yasuda S, Kuriki S, Uchikawa Y (2016) The influence of visual induction of positive-neg-
ative emotions on the somatosensory cortex. IEEE Trans Electron Inf Syst 136(9):1298–1304. https://
doi. org/ 10. 1541/ ieeje iss. 136. 1298
95. Ushida T, Ikemoto T, Taniguchi S, Ishida K, Murata Y, Ueda W, Tanaka S, Ushida A, Tani T (2005)
Virtual pain stimulation of allodynia patients activates cortical representation of pain and emotions: a
functional MRI study. Brain Topogr 18(1):27–35. https:// doi. org/ 10. 1007/ s10548- 005- 7898-8
96. Zhao G, Zhang Y, Ge Y (2018) Frontal EEG Asymmetry and Middle Line Power Difference in Dis-
crete Emotions. Front Behav Neurosci 12:225–235. https:// doi. org/ 10. 3389/ fnbeh. 2018. 00225
97. Jung C-W, Lee H-E, Wi H-W, Choi N-S, Park P-W (2016) Study on the characteristics of EEG in rest-
ing state on visuo-spatial working memory performance. Journal of the Korea Academia-Industrial
cooperation Society 17(4):351–360. https:// doi. org/ 10. 5762/ kais. 2016. 17.4. 351
98. Masood N, Farooq H (2018) Multimodal paradigm for emotion recognition based on EEG signals. In: Human-Computer Interaction. Theories, Methods, and Human Issues. Springer International Publishing, pp 419–428
99. Raheel A, Anwar SM, Majid M, Khan B, Ehatisham-ul-Haq (2016) Real time text speller based on eye movement classification using wearable EEG sensors. 2016 SAI Computing Conference (SAI). https://doi.org/10.1109/SAI.2016.7555977
Publisher’s Note Springer Nature remains neutral with regard to jurisdictional claims in published maps and
institutional affiliations.
Authors and Affiliations
VíctorCerdán‑Martínez1 · ÁlvaroGarcía‑López2 · PabloRevuelta‑Sanz3 ·
TomásOrtiz1 · RicardoVergaz4
* Víctor Cerdán-Martínez
vicerdan@ucm.es
Álvaro García-López
alvaro.garcia.lopez@urjc.es
Pablo Revuelta-Sanz
revueltapablo@uniovi.es
Tomás Ortiz
tortiz@ucm.es
Ricardo Vergaz
rvergaz@ing.uc3m.es
1 Universidad Complutense de Madrid, Av Séneca 4, 28040 Madrid, Spain
2 University Rey Juan Carlos, Electronic Engineering, Madrid, Spain
3 Universidad de Oviedo, C/ González Besada, 13 - Planta Baja, 33007 Oviedo, Spain
4 Universidad Carlos III de Madrid, C/ Madrid, 126, 28903 Getafe (Madrid), Spain
Effective response inhibition requires efficient bottom-up perceptual processing and effective top-down inhibitory control. To investigate the role of hemispheric asymmetries in these processes, 49 right- and 50 left-handers completed a tachistoscopic Go/Nogo task with positive and negative emotional faces while ERPs were recorded. Frontal resting state EEG asymmetry was assessed as a marker of individual differences in prefrontal inhibitory networks. Results supported a dependency of inhibitory processing on early lateralized processes. As expected, right-handers showed a stronger N170 over the right hemisphere, and better response inhibition when faces were projected to the right hemisphere. Left-handers showed a stronger N170 over the left hemisphere, and no behavioural asymmetry. Asymmetries in response inhibition were also valence-dependent, with better inhibition of responses to negative faces when projected to the right, and better inhibition of responses to positive faces when projected to the left hemisphere. Frontal asymmetry was not related to handedness, but did modulate response inhibition depending on valence. Consistent with the asymmetric inhibition model (Grimshaw & Carmel, 2014), greater right frontal activity was associated with better response inhibition to positive than to negative faces; subjects with greater left frontal activity showed an opposite trend. These findings highlight the interplay between bottom-up and top-down processes in explaining hemispheric asymmetries in response inhibition.