The musical mind: a Brain Computer Interface approach
Raffaella Folgieri1, Claudio Lucchiari2
1Department of Economics, Management and Quantitative Methods, Università degli Studi di Milano, Italy
2Health Sciences Department, Università degli Studi di Milano, Italy
This is an e-print version. To read the final version use the following link:
http://www.ikpress.org/abstract.php?iid=614&id=42&aid=4384#.VdRFBpUVhE4
ISSN No. : 2395-3438 (Print), 2395-3446 (Online)
To cite this paper:
Folgieri R., Lucchiari C. (2015). The musical mind: a brain computer interface approach. Journal of Basic and
Applied Research International, Vol.: 11, 1.
Abstract
Commercial non-invasive BCI (Brain Computer Interface) devices are currently applied in several research
fields, thanks to the low cost of these EEG-based systems and to the portability of the equipment. The
latter feature makes BCI devices particularly suited for entertainment applications, especially due to the
possibility of detecting the mental state of the users. Emotions and entertainment are evidently strongly
related, as shown by the influence of music on human emotional states. On the one hand, BCI devices
represent a challenge in gaming motion control; on the other, they have been successfully applied in music
production and composition. In our previous work we focused on the conscious production of music notes
with the aim of developing an entertainment application prototype. In this work we trace the state of the
art of research and discuss possible applications of the preliminary results obtained.
Key Words: Brain Computer Interface; EEG; music; emotions.
INTRODUCTION
The reliability of commercial non-invasive BCI (Brain Computer Interface) devices, the portability of the
equipment and the low cost of these EEG-based systems, compared to other brain imaging techniques such
as fMRI, have determined the increasing interest in their application in different research fields [1, 2, 3].
This last feature of BCI devices in particular makes it possible to perform experiments involving virtual [1]
and real [4] situations and a relatively large number of participants, usually limited due to the cost and/or
the invasiveness of other techniques. BCIs are especially suited when it is necessary to evaluate individual
emotional or cognitive responses while limiting the confounding effects due to anxiety or discomfort. In fact,
these devices consist of a simplification of medical EEG equipment that communicates the EEG response to
stimuli over a WiFi connection, allowing people to feel relaxed and to move freely in the experimental
environment, acting as in the absence of any device.
The use of Brain Computer Interface (BCI) devices is largely promoted for gaming by several studies.
Nevertheless, despite the many efforts made to create BCI-based game interfaces, only a few
entertainment BCI applications are really effective, not because of difficulties on the computer side in signal
interpretation, but because of the difficulties individuals have in focusing on their imagination of
movements to control game characters.
Some BCIs, such as the Emotiv Epoc (Emotiv Systems Inc.) [5], easily allow myographic signals to be mapped
to game commands, but are hardly able to correctly translate pure cerebral signals into specific actions or
movements. In fact, BCIs are currently less accurate than other game interfaces and require several training
sessions before they can be used. By contrast, music entertainment applications seem to be more effective
and require a shorter training on BCI devices.
In this paper, we introduce a cognitive approach to music-elicited emotions in order to analyse the possible
role of a BCI-based paradigm both to improve our comprehension of the musical mind and to promote the
development of tools able to support emotional involvement in different virtual environments. We will
focus on games specifically designed as ludic applications, but we believe that similar tools will also be
useful in educational and neuro-restorative settings.
In section two, we introduce some basic studies on the brain correlates of music processing, so as to present
a general idea of the musical mind, also considering the related models of emotional processing.
Section three presents a short review of the implicit and explicit use of BCI in games and of the main
commercial BCI models. Section four focuses on our approach to enhancing gamers' emotional experience
through music, presenting the results of preliminary experiments performed to evaluate our approach to
detecting users' mental states. Finally, we briefly present the state of the art of our research. Section five is
devoted to our conclusions and some considerations about promising BCI applications.
THE MUSICAL MIND
In neuroscience, considerable interest has recently been devoted to the cognitive processing of music [6,
7]. Music processing, in fact, is a unique example of a complex mental process that requires the
coordination of many neural and cognitive subsystems while retaining a sort of modularity, which is well
suited to the control of experimental variables [7].
Interestingly, music processing requires an integration between perceptual and executive processes that is
difficult to find in other domains of cognition. For these reasons, musical mechanisms have often been the
subject of analysis in the study of the link between perception and action.
Bangert and Altenmüller [8] proposed a two-way (forward-inverse) model which operates a mapping
between the auditory representation of the input signal and the respective motor representation.
Neurophysiological studies that have monitored the learning process in pianists have shown that
sensorimotor co-activation emerges after a few minutes of practice and is consolidated within a few days [8].
Research on the mirror system [9] has shown that passive listening to previously learned music activates a
neural network that includes premotor areas similar to those involved in execution [10]. Studies of the
neural correlates of audio-motor integration in musicians seem to indicate the existence of a
fronto-parietal-temporal network that is active during execution as well as while listening to music.
This functional network can be established after only a short musical practice in non-experts, and seems
very durable and stable in experts.
Considering the effect of different music pieces on the brain, the well-known Mozart effect has received
much attention in neuroscience. The Mozart effect refers to a controversial scientific theory following a
famous experiment conducted by Gordon Shaw and Frances Rauscher and published in 1993 [11].
According to the authors, listening to Mozart's music, and in particular the sonata K448, led to a temporary
increase in intellectual abilities in a group of volunteers. Specifically, after listening, participants showed an
increase in IQ of 8-9 points compared to controls. According to Shaw, Mozart's music may have this effect
because of its ability to boost creative processes.
A study by Bodner et al. analyzed the Mozart effect using functional magnetic resonance imaging
(fMRI) [12]. The authors found significant differences in the change of brain blood flow in frontal, prefrontal
and occipital cortices, as well as in the cerebellum, induced by Mozart's Sonata K448, compared to that
induced by Beethoven's music and a popular composition for piano. All these areas are implicated in
spatial-temporal processing. Therefore, it can be inferred that music activates the auditory cortex, even if
some pieces have a higher impact on other brain regions involved in fine motor coordination, vision and attention.
Some studies have also documented the therapeutic efficacy of music in the treatment of epilepsy,
reducing the duration and frequency of seizures, suggesting that music (in particular, the K448 sonata) may
have important effects in reducing cortical excitation and in organizing various cortical networks [13].
Several neural structures are known to take part in emotion processing. The amygdala is probably the brain
structure most consistently associated with the recognition of fear. The frontal cortices and putamen
are activated during the recognition of happiness, while the anterior frontal areas are activated for the
recognition of sadness [14].
Patients with damage to the insula in Huntington's disease [15] and individuals identified as carriers of the
gene for this disease [16] show a marked deficit in the recognition of disgust, due to abnormal activity of
the caudate and putamen.
Both cortical and subcortical brain structures seem to be involved in emotional regulation, although some
brain structures appear to be responsible for the production and perception of specific emotions. In
particular, it has been assumed that subcortical brain structures can interfere with cognitive processes
through the prefrontal-thalamic-striatal and fronto-striatal circuits [17].
Listening to music involves several processes: cognitive, evaluative and affective. As a result of daily
exposure to music, people with no musical training are able to implicitly encode the structures and
properties of tonality and harmony that underlie most genres of Western culture, such as folk and
classical music [18].
Most neuroimaging studies that examine emotional processes have used visual stimuli to evoke emotions.
This material is composed of stimuli using both facial expressions and scenes created to evoke basic
emotions (positive or negative). However, it is evident that emotional experiences mainly rely on the
presence of stimuli combining different modalities. For example, music is often used to enhance the
emotional impact of film.
The emotional power of music over the human mind and the body has been the subject of interest in
philosophy, medicine, psychology and musicology [19]. Several theories have attempted to describe and
explain the emotional effect of music.
However, there is still a lack of experimental tests to determine the exact mechanisms of the emotional
effects induced by music, the nature of music-elicited emotions, and their relationship with other affective
processes. It has been proposed that music-elicited emotions differ from other emotions because they are
neither goal oriented nor caused by an adaptive reaction. Moreover, emotional responses to music
might reflect extra-musical associations, such as memory and attention, rather than the direct effects of
auditory processing. Another debate concerns how to correctly describe the full range of emotions inducible
by music. Some studies [20] suggest that music can elicit emotions different from those described as basic
emotions by classical theories [21] or by dimensional models that describe all experience in terms
of affective valence and arousal [22].
Neuroimaging methods have thrown new light on these issues. However, research on the neural correlates
of the perception of emotions and music is still insufficient. It has been observed that a dopamine peak
occurs in the striatum (the set of subcortical nuclei that includes the caudate), precisely coinciding with
the emotional peak induced in listeners by music.
The activation of the striatum and limbic regions while listening to pleasant music has been shown in
various studies using functional magnetic resonance imaging and this activation was also evident when
listening to unknown pieces [23].
One study examined the neural correlates of emotions induced by happy and sad music. The happy musical
stimuli were associated with greater activation in the ventral striatum and the dorsal prefrontal cortex
bilaterally, the left cingulate cortex, and the left parahippocampal gyrus. Sad musical stimuli led to
increased activation of medial structures such as the hippocampus and amygdala [24].
It has been shown that music provokes intense emotional reactions that activate brain regions involved in
reward, motivation, emotion and excitement, including the ventral striatum, midbrain, thalamus,
orbitofrontal cortex, anterior cingulate cortex and the insula [23]. All these brain structures are known to
be activated in response to other stimuli that induce euphoria, such as food, sex and drugs of abuse [25,
26].
In an effort to investigate the neural correlates of sadness, fear and joy, Baumgartner [27] noted that
auditory information interacts with visual information in several limbic and paralimbic structures, including
the amygdala and the hippocampus. Changes of activity in these structures were stronger during the
combined presentation of photographs expressing fear or sadness together with sad or scary melodies
than when the information was only visual. The combined presentation of music and photos also
aroused a strong activation of the parahippocampal gyrus and the temporal poles [27].
Changes in activity in the amygdala, hippocampal formation, parahippocampal gyrus and temporal poles
have also been found in other fMRI studies, suggesting that these structures are part of a network which
plays a leading role in emotional processing [28].
Another study analysed the neural processes of people listening to unknown music. Using fMRI, it has been
shown that the activity of the nucleus accumbens is associated with music processing too. The
involvement of the nucleus accumbens confirms that the emotional effect of music may activate the
mechanisms of expectation and anticipation of gratification, mediated by the neurotransmitter dopamine.
Known and unknown music seem to activate the same dopaminergic areas at the same level. This effect
may be due to the implicit knowledge of music obtained through the informal musical experience typical
of a certain culture [29].
In one study, Abrams showed that listening to classical music evokes a unique pattern of activation of brain
areas in spite of the differences between people [30]. Taken as a whole, the neuroscientific evidence clearly
shows the power of music to involve the human mind through the activation of a complex brain network,
able to elicit emotions as well as attentional and memory mechanisms. This is true even for people with no
musical education, and the impact on brain functioning seems to be largely the same for people who share
the same musical culture.
IMPLICIT AND EXPLICIT BCI IN GAMES: A SHORT REVIEW
Several studies are related to the use of brain activity in games to move a cursor on the screen or
guide the movements of an avatar in a virtual environment through the imagination of movements
[31, 32, 33]. With this aim, among the different BCIs proposed by commercial companies, two small-sized,
inexpensive devices are currently largely in use in the scientific and entertainment communities: the
Emotiv Epoc (Emotiv Systems Inc.) [5] and the Neurosky MindWave (NeuroSky Inc.) [34]. These two BCI
devices are currently used in several games as a command interface. For example, some games, such as
“Sniper Elite” [35] or “Call of Duty” [36], translate EEG signals collected by a BCI to determine how much
the degree of attention affects the aiming or the accuracy of the player's weapon. In the games “Metal Gear
Solid” [37], “Silent Hill” [38] or “Resident Evil” [39], the stress level can influence the behaviour of
non-player characters.
BCI-based game interfaces often adopt a two-class approach as a modality for movement control, realizing
a so-called ‘limited explicit interaction’. In some cases, games use signals registered by BCIs to modify the
environment or the level of the game, realizing, in this case, an ‘implicit’ approach. The latter represents an
interaction paradigm based not on explicit and voluntary user actions, but on the detection of the user's
state in a particular context. Some examples are given by the games “Bacteria Hunt” [40], in which alpha
brain rhythm levels are related to the controllability of the player's avatar, or “AlphaWoW”, based on the
game “World of Warcraft”, in which the user's avatar can transform into an animal following the alpha
brain rhythm activity [41].
We must recall that BCIs collect several cerebral frequency rhythms: the alpha band (7-14 Hz), related
to relaxed awareness, contemplation, etc.; the beta band (14-30 Hz), associated with active thinking,
active attention and solving concrete problems; the delta band (3-7 Hz), frontally in adults, posteriorly in
children, with high-amplitude waves, found during some continuous attention tasks [42, 43]; the theta band
(4-7 Hz), usually related to emotional processing [44]; the gamma band (30-80 Hz), generally related
to the cognitive processing of multi-sensorial signals. The two considered commercial BCIs collect all the
listed signals and also eye blinks. Furthermore, the Emotiv Epoc also registers myographic signals, allowing
the use of facial expressions and head movements (via a gyroscope) to interact with a game environment.
The Neurosky Mindwave also provides two custom measures, the “attention” and “meditation” e-sense
meter values, which indicate respectively the user's level of mental focus and level of relaxation. Given
these characteristics, the two considered devices can be efficiently used in games implementing both the
explicit interaction and the implicit approach described above, but they are also useful in experiments,
thanks to the mentioned portability and comfort for users.
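As an illustration of how the band activity listed above can be extracted from a raw EEG channel, the following is a minimal sketch that computes the mean spectral power per band via an FFT. The band boundaries are those reported above; the sampling rate and the synthetic test signal are illustrative assumptions, not a specification of either device.

```python
import numpy as np

# Frequency bands (Hz) as reported in the text; note that the delta and
# theta boundaries overlap slightly, as in the original description.
BANDS = {
    "delta": (3, 7),
    "theta": (4, 7),
    "alpha": (7, 14),
    "beta": (14, 30),
    "gamma": (30, 80),
}

def band_powers(signal, fs):
    """Mean spectral power of each EEG band for one channel.

    signal: 1-D array of EEG samples; fs: sampling rate in Hz
    (256 Hz here is an assumption for illustration).
    """
    spectrum = np.abs(np.fft.rfft(signal)) ** 2       # power spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)  # bin frequencies
    return {
        name: spectrum[(freqs >= lo) & (freqs < hi)].mean()
        for name, (lo, hi) in BANDS.items()
    }

# Example: one second of synthetic data dominated by a 10 Hz (alpha) tone.
rng = np.random.default_rng(0)
fs = 256
t = np.arange(fs) / fs
eeg = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(fs)
p = band_powers(eeg, fs)  # alpha power dominates the other bands
```

A real pipeline would of course apply artifact rejection and windowing first; the sketch only shows the band decomposition itself.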
ENHANCING EMOTIONAL EXPERIENCE THROUGH MUSIC
To make a BCI an interesting interaction tool, we must focus on enhancing the user experience, considering
that this device provides a direct detection of the user’s mental state (implicit paradigm) and at the same
time it could represent a new means of control (explicit paradigm). With the aim of testing the use of the
Emotiv Epoc in entertainment applications related to music, we performed several exploratory experiments
to select the brain channels and rhythms to be considered, reducing, so, the users’ training time. With the
perspective to use a BCI as a mean to improve the emotional experience in games or education, especially
through music, we focused our research effort in applying BCI devices in music entertainment applications
and in adapting music to user’s mental state in games. The first approach, in fact, implements the BCIs
explicit paradigm while the second focuses on the implicit use of brain signals detected by a BCI device.
We consider the music field both as a specific application in entertainment and as a necessary part of the
emotional involvement of users in games and other kinds of entertainment/educational applications. In
fact, the application of BCI devices to entertainment from a musical point of view covers mainly two
perspectives:
adaptation of game sounds and music to the users' mental state;
music entertainment applications, such as, for example, music games or music production or composition
engines.
In our experiments and current work, we focus on these two points: first, performing some preliminary
experiments with the aim of identifying how to detect users' mental states corresponding to sounds
eliciting specific emotions; later, testing the possibility of enabling people to consciously produce single
music notes; and lastly, creating game environments to test the possibility of influencing a game's music,
following or contrasting the players' mental state, to enhance their emotional experience and involve them
more in the plot.
In a first experiment [45, 46] we tested the possibility of detecting participants' mental states elicited by
selected sounds. Results showed that brain activity measured at the anterior part of the scalp distinguished
the valence of musical emotions using the Emotiv Epoc. The activity increases during the presentation of
positive-valence musical excerpts, while it decreases for negative ones. The overall frontal activation is
related to the intensity of the emotions elicited by music: it decreases for sounds related to fear and
increases for happiness or satisfaction. The results did not show any gender difference in the response.
Positive-valence musical stimuli elicited greater left frontal EEG activity and less frontal EEG power in the
alpha and beta bands than in the theta and gamma bands, while negative-valence musical stimuli elicited
greater right frontal EEG activity and less frontal EEG power in the theta and gamma bands than in the
alpha and beta bands.
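The left/right lateralization pattern just described is commonly summarized by a frontal asymmetry index over alpha power; the following is a minimal sketch of that general idea, not the authors' actual analysis pipeline. The electrode names (F3/F4), the input values and the zero threshold are illustrative assumptions.

```python
import math

def frontal_asymmetry(left_alpha_power, right_alpha_power):
    # Alpha power is inversely related to cortical activity, so a positive
    # index (relatively more alpha on the right) indicates greater
    # left-frontal activity, which the results above associate with
    # positive valence.
    return math.log(right_alpha_power) - math.log(left_alpha_power)

def valence_label(index):
    """Classify the sign of the asymmetry index (threshold assumed at 0)."""
    return "positive" if index > 0 else "negative"

# Hypothetical alpha-band powers from a left (e.g. F3) and a right
# (e.g. F4) frontal electrode:
idx = frontal_asymmetry(left_alpha_power=4.0, right_alpha_power=6.5)
label = valence_label(idx)  # greater left activity -> "positive"
```

The logarithm makes the index symmetric around zero, so swapping the two hemispheres simply flips its sign.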
Consistently with [47], the intensity/hemisphere analysis showed significant effects of intensity and
hemisphere in all the bands, but did not reveal any interaction. We observed less overall frontal EEG power
(i.e. more activity) for intense music stimuli compared to music stimuli eliciting less intense mental states.
Finally, the valence/intensity analysis showed effects of both valence and intensity. As in the
intensity/hemisphere analysis, less overall frontal EEG power (i.e. more activity) corresponds to intense
music stimuli compared to music stimuli eliciting less intense mental states.
The experiment allowed us to clearly identify four mental states by EEG analysis, which we related to the
involvement of players in games: fear, related to arousal due to stress; joy, corresponding to satisfaction;
happiness, related to arousal due to relaxation; sadness, corresponding to frustration.
The aim of a subsequent work [48] was to investigate how the brain reacts to the perception of stimuli
belonging to different sensorial areas. The focus of this experiment was on the β rhythm, since it represents
the activation state of a subject. Thus, what we measure is the capability of an audio stimulus to change the
pre-existing activation state of a subject, boosting or weakening it.
In particular, after inducing a state through a visual stimulus, we analyzed how β-wave production changes
when an audio stimulus emotionally coherent with the projected image is presented to the subject,
finishing with a new audio stimulation with an emotive character opposite to the visual one.
The aim of our study was to observe the qualitative and quantitative variation of the β rhythms in the
different stimulation sets, to define how stimulation through the superposition of coherent and opposite
stimuli can influence the brain's activation. We observed that the presence of these rhythms is related to
the engagement of the subject, and their observation could be useful to determine his or her mental state.
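The β-power variation measured in this protocol can be sketched as a simple pre/post comparison on a single channel. This is an illustration of the measurement idea, not the study's actual analysis; the sampling rate and the synthetic signals are assumptions.

```python
import numpy as np

def beta_power(signal, fs, band=(14, 30)):
    """Mean spectral power of one EEG channel in the beta band (14-30 Hz)."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs < band[1])
    return spectrum[mask].mean()

def beta_change(pre, post, fs):
    """Relative beta-power change after the audio stimulus:
    positive values mean the stimulus boosted the pre-existing activation
    state, negative values mean it weakened it."""
    before, after = beta_power(pre, fs), beta_power(post, fs)
    return (after - before) / before

# Synthetic example: a 20 Hz (beta) component that grows after the stimulus.
fs = 128
t = np.arange(fs) / fs
pre = 0.5 * np.sin(2 * np.pi * 20 * t)
post = 1.5 * np.sin(2 * np.pi * 20 * t)
delta = beta_change(pre, post, fs)  # positive: activation was boosted
```

In practice the pre/post windows would be averaged over several trials per stimulation set before comparing them.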
To complete our investigation, we are currently performing other experiments to test the possibility of
modifying the music of games or educational applications in real time, following or contrasting the players'
mental state, to enhance their emotional experience and involve them more in the activity. With the aim of
exploring this possibility of enhancing users' involvement, we used games, creating a VR environment
reproducing scenes from famous games (Call of Duty, Resident Evil and Guitar Hero) to track the four
mental states detected in our preliminary study phase (i.e. frustration, satisfaction and the level of arousal
due to stress or relaxation) from users wearing the Emotiv Epoc or the Neurosky Mindwave. Our aim is to
test the possibility of regulating game sound and music following changes in user emotion. Depending on
the passively detected user's mental state, we developed an algorithm operating on the intensity of the
sounds (increasing it with low arousal and high frustration, decreasing it in the opposite case) and on
changes of background music (for example contrasting, with fast/slow rhythms and music genre, the stress
and/or satisfaction level of the user). Currently, the experiment is in progress, and we plan to test our
algorithm on about 60 individuals, in large part students of our University Department. Preliminary results
showed promising progress in our research. The final aim is to create a standard module, easy to interface
with games and eligible for the games industry, but also for educational and therapeutic purposes.
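The adaptation rules outlined above can be sketched as a small rule table mapping the passively detected state to playback parameters. The state variables, thresholds and parameter names below are illustrative assumptions, not the authors' actual implementation.

```python
def adapt_music(arousal, frustration, contrast=True):
    """Map a passively detected mental state to music parameters.

    arousal, frustration: normalized levels in [0, 1] (hypothetical scale).
    Returns a dict of hypothetical playback parameters.
    """
    params = {"volume": 0.5, "tempo": "medium", "genre": "neutral"}
    # Sound intensity: increase with low arousal or high frustration,
    # decrease otherwise, as described in the text.
    if arousal < 0.3 or frustration > 0.7:
        params["volume"] = 0.8
    else:
        params["volume"] = 0.4
    # Background music: when the contrast strategy is selected, oppose the
    # detected state (e.g. fast rhythms against low arousal).
    if contrast:
        params["tempo"] = "fast" if arousal < 0.3 else "slow"
        params["genre"] = "upbeat" if frustration > 0.7 else "calm"
    return params

# A disengaged, frustrated player: intensity and tempo are pushed up.
out = adapt_music(arousal=0.2, frustration=0.8)
```

The alternative "following" strategy mentioned in the text would simply invert the contrast branch, matching the music to the detected state instead of opposing it.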
CONCLUSIONS
In preliminary experiments performed to test our approach, results showed that brain activity measured at
the anterior part of the scalp distinguished the valence of musical emotions using both the Emotiv Epoc and
the Neurosky Mindwave. In addition, the activity increases during the presentation of positive-valence
musical excerpts, while it decreases for negative ones. The overall frontal activation is related to the
intensity of the emotions elicited by music: it decreases for sounds related to fear and frustration and
increases for happiness or satisfaction.
We also found that both devices have a sufficient number of sensors for our objectives. However, the
Emotiv Epoc showed greater precision in detecting the features identifying the selected users' mental
states (89% vs 81%), probably due to its direct measurement of each hemisphere.
Considering these results and the obvious correspondence between music and emotions, the adaptation of
a game's sounds to the emotional and mental state of a player detected by a BCI appears to be a promising
application. In fact, BCIs can easily be used to detect whether the user is bored or inattentive, consequently
modifying the music track or sounds to re-involve the player in the game scenario. Concerning music
entertainment exclusively, consider, for example, the famous game “Guitar Hero”: thanks to a BCI device it
could be possible to modify the music genre, the rhythm or the intensity of the sounds following the
variability of the user's emotional states, introducing a new modality of game level achievement.
Furthermore, it is predictable that similar tools should also be useful for educational, therapeutic and
restorative aims. First of all, action games have already been tested as valid tools to help people improve
their visuo-spatial attention skills. These skills are particularly impaired in some learning disturbances, and
consequently boosting them may help on many occasions, for instance in helping dyslexic children improve
their reading abilities. However, action games are not designed for this aim; they are often cognitively
demanding and not suitable for everyone. Furthermore, long-term effects on reading abilities and other
neurological conditions have not been proven. We believe that dedicated and validated tools should be
designed. In particular, the use of a BCI-based tool should improve the effects of gaming through
neuro-feedback protocols. In our study we have shown that BCIs may be used to collect data in a reliable
way. So it would be easy to pass from detecting cortical signals to using them in order to explore virtual
educating environments. We also showed that music and games may interact so as to improve involvement
and engagement in playing. This is another important aspect, since emotional involvement is an important
boost to attentional mechanisms. Music, in fact, might be used to improve attention when cognitive
engagement is decreasing, as measured by the BCI device, with the aim of training both selective and
sustained attention. In this way, a player should be implicitly engaged in an educating and restorative
process tuned to their cognitive characteristics.
REFERENCES
1. Friedman, D., Leeb, R., Guger, C., Steed, A., Pfurtscheller, G., & Slater, M. Navigating virtual reality
by thought: What is it like?. Presence: Teleoperators and Virtual Environments, 16(1), 100-110,
2007.
2. Nijholt, A., & Tan, D. Brain-computer interfacing for intelligent systems.Intelligent Systems, IEEE,
23(3), 72-79, 2008.
3. Pfurtscheller, G., & Neuper, C. Motor imagery and direct brain-computer communication.
Proceedings of the IEEE, 89(7), 1123-1134, 2001.
4. Nakamura, S., Sadato, N., Oohashi, T., Nishina, E., Fuwamoto, Y., & Yonekura, Y. Analysis of music
brain interaction with simultaneous measurement of regional cerebral blood flow and
electroencephalogram beta rhythm in human subjects. Neuroscience letters, 275(3), 222-226,
1999.
5. Emotiv System Inc. web site, at http://emotiv.com
6. Münte, T.F., Altenmüller, E., & Jäncke, L. The musician's brain as a model of neuroplasticity. Nature
Reviews Neuroscience, 3(6), 473-478, 2002.
7. Zatorre R.J., Chen J.L., Penhune V.B. When the brain plays music: auditory-motor interactions in
music perception and production. Nat.Rev neurosci, 8, 547-558, 2007.
8. Bangert, M., & Altenmüller, E.O. Mapping perception to action in piano practice: a longitudinal DC-
EEG study. BMC neuroscience, 4(1), 26, 2003
9. Rizzolatti, G., & Craighero, L. The mirror-neuron system. Annu. Rev. Neurosci., 27, 169-192, 2004.
10. Lahav, A., Saltzman, E., & Schlaug, G. Action representation of sound: audiomotor recognition
network while listening to newly acquired actions. The journal of neuroscience, 27(2), 308-314,
2007.
11. Rauscher, F., Shaw, G., & Ky, K. Music and spatial task performance. Nature, 365, 611, 1993.
12. Bodner, M., Muftuler, L.T., Nalcioglu, O., & Shaw, G.L. FMRI study relevant to the Mozart effect:
brain areas involved in spatial-temporal reasoning. Neurological research, 23(7), 683-690, 2001.
13. Dastgheib, S. S., Layegh, P., Sadeghi, R., Foroughipur, M., Shoeibi, A., & Gorji, A. The effects of
Mozart’s music on interictal activity in epileptic patients: systematic review and meta-analysis of
the literature. Current neurology and neuroscience reports, 14(1), 1-11, 2014.
14. LeDoux, J. Rethinking the emotional brain. Neuron, 73(4), 653-676, 2012.
15. Kuhl D.E., Phelps M.E., Markham C.J., Mettler E.J., Riege W.J., & Winter J. Cerebral metabolism and
atrophy in Huntington's disease determined by 18 FDG and computed tomographic scan. Ann
Neurol, 12, 425-434, 1982.
16. Speedie LJ, Brake N, Folstein SE, Bowers D, & Heilman KM. Comprehension of prosody in
Huntington's disease. J Neurol Neurosurg Psychiatry, 53, 607-610, 1990.
17. Breitenstein C., & Daum I., Ackermann H. Emotional processing following cortical and subcortical
brain damage: contribution of the fronto-striatal circuitry. Behav Neurol, 11, 29-42, 1998.
18. Tillmann B., Bharucha J.J., & Bigand E. Implicit learning of tonality: a self-organizing approach.
Psychological Review, 107, 885-913, 2000.
19. Juslin P.N, & Sloboda J.A. Handbook of music and emotion: theory, research, applications. Oxford:
Oxford University Press, 2010.
20. Zentner M., Grandjean D., & Scherer K.R. Emotions evoked by the sound of music: characterization,
classification, and measurement. Emotion, 8, 494-521, 2008.
21. Ekman P. Are there basic emotions? Psychol Rev., 99, 550-3, 1992.
22. Russell, J. A. “A Circumplex Model of Affect,” J. Personality and Social Psychology, vol. 39, no. 6, pp.
1161-1178, 1980.
23. Brown S., Martinez M.J., & Parsons L.M. Passive music listening spontaneously engages limbic and
paralimbic systems. NeuroReport, 13, 2033-2037, 2004.
24. Mitterschiffthaler M.T., Fu C.H., Dalton J.A., Andrew C.M., & Williams S.C. A functional MRI study of
happy and sad affective states induced by classical music. Hum Brain Mapp, 28, 1150-62, 2007.
25. Small D.M., Zatorre R.J., Dagher A., Evans A.C., & Jones-Gotman M. Changes in brain activity related
to eating chocolate: from pleasure to aversion. Brain, 124, 1720-1733, 2001.
26. Dalgleish T. The emotional brain. Nature Reviews Neuroscience, 5, 583-589, 2004.
27. Baumgartner T., Lutz K., Schmidt C.F., & Jäncke L. The emotional power of music: how music
enhances the feeling of affective pictures. Brain Res, 1, 151-164, 2006.
28. Koelsch S. et al. Investigating emotion with music: an fMRI study. Hum. Brain Mapp, 27, 239-250,
2006.
29. Salimpoor V.N., van den Bosch I., Kovacevic N., McIntosh A.R., Dagher A., & Zatorre R.J. Interactions
between the nucleus accumbens and auditory cortices predict music reward value. Science, 340,
216-9, 2013.
30. Abrams D.A., Ryali S., Chen T., Chordia P., Khouzam A., Levitin D.J., & Menon V. Inter-subject
synchronization of brain responses during natural music listening. Eur J Neurosci., 37, 1458-69,
2013.
31. Ko, M., Bae, K., Oh, G., & Ryu, T. A study on new gameplay based on brain-computer interface, in:
Atkins, B., Kennedy, H., Krzywinska, T. (eds.) Breaking New Ground: Innovation in Games, Play, Practice
and Theory: Proc. of the 2009 Digital Games Research Association Conference, Brunel University, 2009.
32. Pineda, J.A., Silverman, D.S., Vankov, A., and Hestenes, J., Learning to Control Brain Rhythms:
Making a Brain-Computer Interface Possible, Neural Systems and Rehabilitation Engineering, Vol.
11, No. 2, IEEE Transactions on, pp. 181-184, 2003.
33. Lalor, E.C., Kelly, S.P., Finucane, C., Burke, R., Reilly, R.B., & McDarby, G. Brain Computer Interface
based on the Steady-state VEP for Immersive Gaming Control. Biomedizinische Technik, pp. 63-64,
2004.
34. NeuroSky Inc. web site, at http://neurosky.biz
35. ‘Sniper Elite’, at http://www.microids.com/en/catalogue/28/sniper-elite-berlin-1945.html
36. ‘Call of Duty’, at http://www.callofduty.com
37. ‘Metal Gear Solid’, at http://www.konami.jp/kojima_pro/english/index.html
38. ‘Silent Hill’, at http://www.konami.com/games/shh/
39. ‘Resident Evil’, at http://www.capcom.co.jp/bio5/
40. Mühl, C., Gürkök, H., Plass-Oude Bos, D., Thurlings, M. E., Scherffig, L., Duvinage, M., Elbakyan, A.
A., Kang, S., Poel, M., and Heylen, D. K. J., Bacteria Hunt: A multimodal, multiparadigm BCI game, in
Proc. of the International Summer Workshop on Multimodal Interfaces, Genoa, 2010.
41. Plass-Oude Bos, D., Reuderink, B., Laar, B., Gürkök, H., Mühl, C., Poel, M., Nijholt, A., and Heylen,
D., Brain-Computer Interfacing and Games, in Brain-Computer Interfaces, ser. Human-Computer
Interaction Series, D. S. Tan and A. Nijholt, Eds., 2010.
42. Kirmizialsan, E., Bayraktaroglu, Z., Gurvit, H., Keskin, Y., Emre, M., & Demiralp, T. Comparative
analysis of event-related potentials during Go/NoGo and CPT: Decomposition of
electrophysiological markers of response inhibition and sustained attention. Brain Research, 1104(1),
114-128, 2006.
43. Lucchiari, C., & Pravettoni, G. Feedback related brain activity in a gambling task: a temporal analysis
of EEG correlates. Scandinavian journal of psychology, 51(6), 449-454, 2010.
44. Balconi, M., & Lucchiari, C. Consciousness and arousal effects on emotional face processing as
revealed by brain oscillations. A gamma band analysis. International Journal of Psychophysiology,
67(1), 41-46, 2008.
45. Folgieri, R., & Zampolini, R. BCI promises in emotional involvement in music and games. Computers
in Entertainment, ACM, 2012.
46. Folgieri, R., & Zicchella, M. A BCI-based application in music: conscious playing of single notes by
brainwaves. Computers in Entertainment, ACM, vol. 10, 2012.
47. Schmidt, L.A., & Trainor, L.J. Frontal brain electrical activity (EEG) distinguishes valence and intensity
of musical emotions. Cognition and Emotion, 15, 487-500, 2001.
48. Folgieri, R., Bergomi, M., & Castellani, S. EEG-based brain-computer interface for emotional
involvement in games through music. In Digital Da Vinci: Computers in Music (pp. 205-236).
Springer New York, 2014.