Uncorrected Author Proof
NeuroRehabilitation xx (20xx) x–xx
IOS Press
Review Article

Extracting neurophysiological signals reflecting users' emotional and affective responses to BCI use: A systematic literature review

Giulia Liberati a,*, Stefano Federici b and Emanuele Pasqualotto a

a Université Catholique de Louvain, Institute of Neuroscience, Louvain, Belgium
b Università di Perugia, Department of Philosophy, Social & Human Sciences and Education, Perugia, Italy
BACKGROUND: Brain–computer interfaces (BCIs) allow persons with impaired mobility to communicate and interact with the
environment, supporting goal-directed thinking and cognitive function. Ideally, a BCI should be able to recognize a user’s internal
state and adapt to it in real-time, to improve interaction.
OBJECTIVE: Our aim was to examine studies investigating the recognition of affective states from neurophysiological signals,
evaluating how current achievements can be applied to improve BCIs.
METHODS: Following the PRISMA guidelines, we performed a literature search using PubMed and ProQuest databases. We
considered peer-reviewed research articles in English, focusing on the recognition of emotions from neurophysiological signals
in view of enhancing BCI use.
RESULTS: Of the 526 identified records, 30 articles comprising 32 studies were eligible for review. Their analysis shows that
the affective BCI field is developing, with a variety of combinations of neuroimaging techniques, selected neurophysiological
features, and classification algorithms currently being tested. Nevertheless, there is a gap between laboratory experiments and
their translation to everyday situations.
CONCLUSIONS: BCI developers should focus on testing emotion classification with patients in ecological settings and in
real-time, with more precise definitions of what they are investigating, and communicating results in a standardized way.
1. Introduction
Brain–computer interfaces (BCIs) are systems that measure brain signals, extract specific features from these signals, and translate these features into output signals, serving as direct communication pathways between the brain and external devices (Birbaumer, 2006a; Wolpaw, Birbaumer, McFarland, Pfurtscheller, & Vaughan, 2002). Because they do not rely on muscular activity, BCIs could represent the only means for persons with impaired mobility to communicate and interact with their environment. In this sense, they promote individuals' sense of agency and goal-directed thinking, and therefore support cognitive function that would otherwise be lost due to
*Address for correspondence: Giulia Liberati, Université Catholique de Louvain, Institute of Neuroscience, Louvain, Belgium. E-mail:
1053-8135/15/$35.00 © 2015 – IOS Press and the authors. All rights reserved
the lack of actions and environmental feedback (Birbaumer, 2006b; Kübler & Birbaumer, 2008; Liberati & Birbaumer, 2012). Hence, BCIs can be regarded in all respects not only as assistive technology (AT), but also as "cognitive prosthetics" or cognitive support technologies (CST; Chu, Brown, Harniss, Kautz, & Johnson, 2014) for individuals who have lost motor function.
Most of the past research in the BCI field has focused predominantly on the improvement of technical aspects, such as enhancing information transfer rate and signal classification accuracy (Nicolas-Alonso & Gomez-Gil, 2012). In their systematic review of electroencephalography (EEG)-based BCIs, comprising 127 research articles, Pasqualotto and collaborators (2012) reported that a user-centered perspective was rarely adopted. It is well known that a lack of consideration of the user's experience can easily lead to the abandonment of assistive technology (Federici & Borsci, 2014; Federici & Scherer, 2012; Pasqualotto, Simonetta, Federici, & Olivetti Belardinelli, 2009).
The last years have seen a growing awareness of the importance of adopting a user-centered approach in the evaluation of BCIs, e.g. assessing specific aspects of the user's experience, such as usability, motivation, satisfaction, and improvement of quality of life (Holz, Botrel, Kaufmann, & Kübler, 2015; Kübler et al., 2014; Nijboer, Birbaumer, & Kübler, 2010; Nijboer, Plass-Oude Bos, Blokland, van Wijk, & Farquhar, 2014; Pasqualotto et al., 2015; Schettini et al., 2015; Simon et al., 2014; Zickler, Halder, Kleih, Herbert, & Kübler, 2013). Although less investigated, the necessity of evaluating the user's ongoing affective and emotional states during the control of BCI systems and other technology has recently been emphasized by several researchers (Brouwer, Zander, van Erp, Korteling, & Bronkhorst, 2015; Molina, Tsoneva, & Nijholt, 2009; Zander & Jatzev, 2009). A recent focus group study (Liberati et al., 2015), which investigated the requirements of persons with amyotrophic lateral sclerosis (ALS) with respect to BCI use, highlighted how training to control a BCI can be perceived as very stressful and time-consuming, potentially leading to loss of motivation. BCI users often feel angry or frustrated when a communication system fails to convey messages adequately, and slow communication may cause significant panic, anxiety, and loss of control when the user needs to communicate something urgently. These spontaneous changes in the mental state of the user may disrupt the electrophysiological response and deteriorate BCI performance (Holz et al., 2015; Reuderink, Poel, & Nijholt, 2011).
The relevance of emotions in the interaction with the environment is not a novel concept, and has been extensively discussed by neurologist Antonio Damasio (1994). In his book "Descartes' Error: Emotion, Reason, and the Human Brain", Damasio emphasized that emotions and feelings may not be intruders in the bastion of reason at all: they may be enmeshed in its networks, for worse and for better. From his perspective, most interactions with the environment take place because the organism requires their occurrence to maintain homeostasis and survival. Emotions play a fundamental role in these interactions, allowing individuals to communicate meanings to others and acting as cognitive guidance, to ensure survival and well-being. BCIs may represent no exception concerning the utility of emotions and affectivity in enhancing cognitive experience. Emotional responses could help accomplish useful goals, such as displaying satisfaction or disappointment towards BCI systems with the aim of improving them, and providing a more thorough experience with the environment. From the perspective of supporting individuals' cognitive function, taking affective states into account promotes a more thorough environmental feedback, which is fundamental for retaining one's sense of agency (Haselager, 2013) and goal-directed thinking (Birbaumer et al., 2006).
According to Picard (2000), the distance between user and machine can be reduced by taking emotional content into account, a process known as affective computing. Consistently, Azcarraga and Suarez (2013) pointed out the importance of developing affective tutoring systems able to monitor the emotions that typically emerge when facing new tasks, such as confidence, excitement, frustration, interest, engagement, boredom, or confusion, in order to facilitate learning processes. Along the same lines, Molina and collaborators (2009) observed that emotions and affective states should not be seen as obstacles to BCI use. Indeed, awareness of how these states influence brain activity patterns could lead to the development of adaptive BCIs, able to interpret the user's intention in spite of signal deviations. As emphasized by Blankertz and colleagues (2003), the aim of BCI developers should be to minimize the user's training, by imposing the major learning load on the computer rather than on the person. In other words, the computer should recognize the user's covert states and adapt to them in real-time, in order to improve its function.
These considerations are in line with a new perspective established in recent years, according to which not only voluntary self-regulated signals can be used as BCI
input, but also signals relative to the user's state (e.g. his/her emotions) can provide important information for BCI use (Nijboer et al., 2009). To this end, the terms "passive BCI" (pBCI; Brouwer et al., 2015; Cotrina et al., 2014; Roy, Bonnet, Charbonnier, & Campagne, 2013; Zander & Jatzev, 2009; Zander & Kothe, 2011; Zander, Kothe, Jatzev, & Gaertner, 2010) and "affective BCI" (aBCI; Liberati et al., 2012; Nijboer et al., 2009; van der Heiden et al., 2014; Widge, Dougherty, & Moritz, 2014) are becoming more and more popular. By relying on spontaneous brain activity, these systems aim to assess covert aspects of the user's state without interfering with other cognitive tasks (Van Erp, Brouwer, & Zander, 2015). A further advantage of this novel approach is that the recognition of covert states could also be performed with users with cognitive impairment (e.g. dementia), who have no means to undergo extensive training and learn how to use traditional BCIs (Liberati et al., 2012a, 2012b, 2013).
1.1. Rationale

To date, there is no systematic review on the recognition of emotional and affective states from neurophysiological responses (i.e. aBCIs), and the consequent possibility of improving users' interaction with traditional BCIs and/or other AT.

By following the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) criteria (Liberati et al., 2009; Moher, Liberati, Tetzlaff, & Altman, 2009), our aim is to examine studies that have investigated the recognition of affective states from neurophysiological signals, shedding light on how current knowledge and achievements in the aBCI field can be applied to improve BCI communication and control, providing the user with a more comprehensive interaction with others and the environment. In particular, we aim to evaluate whether, at the current state of the art, the need to consider affectivity during BCI use, which has strongly emerged from recent studies with patients and possible BCI users (Holz et al., 2015; Liberati et al., 2015), can be adequately met.
2. Methods

2.1. Eligibility criteria

We considered research articles published in peer-reviewed journals in English, focusing on the recognition of affective and emotional states from neurophysiological signals acquired through non-invasive techniques, in view of enhancing BCI use. Studies simply focusing on the neural correlates of affective and emotional states, without a defined recognition or classification method, and therefore without a direct implication for BCI use, were excluded from the analysis. In the absence of a strict peer-review process, articles derived from conference proceedings were not taken into account.
2.2. Information sources and search terms

We identified articles by searching the PubMed and ProQuest electronic databases, and by scanning reference lists of pertinent review articles, editorials, and handbooks (Kim, Kim, Oh, & Kim, 2013; Mühl, Allison, Nijholt, & Chanel, 2014a, 2014b; Mühl, Heylen, & Nijholt, 2014; Van Erp, Brouwer, & Zander, 2015). The last search was run on July 10, 2015.

We used the following search terms: "affective brain computer interface(s)"; "passive brain computer interface(s)"; emotion(s) & "brain computer interface(s)"; "emotion classification". The terms could appear at any point of the articles' full text.
2.3. Study selection and data collection process

Eligibility assessment was performed independently, in a blinded standardized manner, by authors G.L. and E.P. A first screening was based on the article abstracts. A subsequent selection was performed after reading the articles in full text. Disagreements between reviewers were resolved by consensus. We developed a data extraction sheet (Table 1), which was pilot-tested on ten randomly selected included studies and refined accordingly. G.L. extracted the data from the studies and E.P. checked the extracted data. Disagreements were resolved by discussion between the authors.
2.4. Data items

Information was extracted from each study on the following aspects:

1. Affective states investigated
2. Emotion elicitation method
3. Participants
   a. Number of healthy subjects
   b. Number and category of patients
4. Data collection
   a. Data acquisition technique
   b. Device portability
5. Emotion recognition technique
   a. Feature extraction
   b. Type of classifier
6. Performance of classifier
   a. Classification accuracy
   b. Online (real-time) vs. offline classification
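The extraction sheet fields above map naturally onto a simple record structure. The sketch below is purely illustrative (the field names are ours, not from the original sheet), populated with values taken from one reviewed study (Hosseini & Naghibi-Sistani, 2011, as summarized in Table 1):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class StudyRecord:
    """One row of the data extraction sheet (illustrative field names)."""
    affective_states: list[str]      # 1. affective states investigated
    elicitation_method: str          # 2. emotion elicitation method
    n_healthy: int                   # 3a. number of healthy subjects
    n_patients: int                  # 3b. number of patients
    patient_category: Optional[str]  # 3b. patient category, if any
    acquisition_technique: str       # 4a. e.g. EEG, fMRI, fNIRS
    portable_device: bool            # 4b. device portability
    features: list[str]              # 5a. selected features
    classifier: str                  # 5b. type of classifier
    accuracy: Optional[float]        # 6a. classification accuracy (%)
    online: bool                     # 6b. online (real-time) vs. offline

# Example entry, paraphrasing one study from Table 1
record = StudyRecord(
    affective_states=["valence", "arousal"],
    elicitation_method="IAPS pictures",
    n_healthy=15, n_patients=0, patient_category=None,
    acquisition_technique="EEG", portable_device=False,
    features=["ApEn", "WE"], classifier="SVM",
    accuracy=73.25, online=False,
)
```

A structure of this kind also makes the cross-study tallies in the Results section (percentages per technique, per elicitation method, and so on) straightforward to compute.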
3. Results

3.1. Study selection and characteristics

The search of the PubMed and ProQuest databases provided a total of 746 citations. Nineteen additional records were identified by checking the references of relevant papers. After removing duplicates, 526 records remained. Of these, 482 were discarded, because their abstracts clearly indicated that they were out of scope. The full text of the remaining 44 citations was examined in more detail. Fourteen studies did not meet the inclusion criteria and were therefore excluded from further analysis. Thirty articles comprising a total of 32 studies met the inclusion criteria and were included in the systematic review. The whole selection process can be seen in the flow diagram in Fig. 1.

All 32 studies were experiments on the recognition/classification of affective states from neurophysiological features, for the development of an aBCI.
3.2. Results

3.2.1. Affective states investigated
The majority of studies (59%) discriminated affective states according to a dimensional model (Russell & Mehrabian, 1977), either based on valence (Chanel, Kierkels, Soleymani, & Pun, 2009; Hadjidimitriou & Hadjileontiadis, 2012, 2013; Hidalgo-Muñoz, López, Pereira, Santos, & Tomé, 2013; Hosseini et al., 2011; Kashihara, 2014; Lee & Hsieh, 2014; Stikic, Johnson, Tan, & Berka, 2014; Tai & Chau, 2009), on valence and arousal (Baucom, Wedell, Wang, Blitzer, & Shinkareva, 2012; Frantzidis et al., 2010a, 2010b; Heger, Herff, Putze, Mutter, & Schultz, 2014; Hosseini & Naghibi-Sistani, 2011; Jie, Cao, & Li, 2014; Moghimi, Kushki, Power, Guerguerian, & Chau, 2012; Soleymani, Pantic, & Pun, 2012; Yoon & Chung, 2013), or on valence, arousal, and dominance/control (Koelstra & Patras, 2013).
The remaining 41% of studies discriminated different emotion categories (Azcarraga & Suarez, 2013; Chanel, Rebetez, Bétrancourt, & Pun, 2011; Jenke, Peer, & Buss, 2014; Kassam, Markey, Cherkassky, Loewenstein, & Just, 2013; Lin, Yang, & Jung, 2014; Mikhail, El-Ayat, Coan, & Allen, 2013; Murugappan, Ramachandran, & Sazali, 2010; Petrantonakis & Hadjileontiadis, 2010; Sitaram et al., 2011; Yuvaraj et al., 2014), in some cases selected in order to cover the valence-arousal space (Jenke et al., 2014; Lin et al., 2014; Yuvaraj et al., 2014). The number of emotion categories classified within a single experiment varied across studies. The highest number of classified emotions was nine (Kassam et al., 2013), followed by six (Petrantonakis & Hadjileontiadis, 2010; Yuvaraj et al., 2014), five (Murugappan et al., 2010), four (Azcarraga & Suarez, 2013; Lin et al., 2014; Mikhail et al., 2013), three (Chanel et al., 2011; Sitaram et al., 2011), and two (Sitaram et al., 2011) emotional categories. The emotions that were classified most frequently were Paul Ekman's basic and universal emotions (Ekman et al., 1987), namely anger, disgust, fear, happiness, sadness, and surprise (Jenke et al., 2014; Kassam et al., 2013; Lin et al., 2014; Mikhail et al., 2013; Murugappan et al., 2010; Sitaram et al., 2011; Soleymani et al., 2012; Yuvaraj et al., 2014). Other investigated affective states were related to the performance of a specific task with different levels of difficulty (i.e. solving math problems or playing a videogame), and comprised confidence, excitement, frustration, interest/engagement, boredom, and anxiety (Azcarraga & Suarez, 2013; Chanel et al., 2011). One study also investigated pleasure and neutrality, as well as more complex affective states such as pride, shame, envy, and lust (Kassam et al., 2013), whereas another study included calmness and curiosity (Jenke et al., 2014). Specific definitions of these affective states were generally not provided.
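Studies following the dimensional model typically discretize continuous self-ratings (e.g. 1-9 scales for valence and arousal) into classes before training a classifier, either as a two-level (low/high) or three-level (low/medium/high) scheme, as in the two- and three-level designs reported above. A minimal sketch of this common preprocessing step (the thresholds and ratings are hypothetical, not taken from any reviewed study):

```python
def discretize(rating: float, levels: int = 2,
               scale: tuple[float, float] = (1.0, 9.0)) -> int:
    """Map a continuous valence or arousal self-rating onto discrete classes.

    levels=2 yields the low/high split used in two-class designs;
    levels=3 yields low/medium/high, as in three-level designs.
    """
    lo, hi = scale
    width = (hi - lo) / levels        # width of each class bin
    idx = int((rating - lo) // width)
    return min(idx, levels - 1)       # clamp the top edge of the scale

# Hypothetical self-ratings on a 1-9 scale
assert discretize(2.0) == 0               # low valence
assert discretize(8.0) == 1               # high valence
assert discretize(5.0, levels=3) == 1     # medium, three-level scheme
```

The choice of thresholds matters: ratings near the middle of the scale carry ambiguous labels, which is one reason some studies report markedly lower accuracies for three-level than for two-level classification.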
3.2.2. Emotion elicitation methods

In order to test affective state classification in a controlled context, several different emotion elicitation methods were used. A relatively high number of studies (37%) used visual stimuli to evoke different affective states (Baucom et al., 2012; Frantzidis et al., 2010a, 2010b; Hidalgo-Muñoz et al., 2013; Hosseini & Naghibi-Sistani, 2011; Hosseini et al., 2011; Jenke et al., 2014; Petrantonakis & Hadjileontiadis, 2010; Sitaram et al., 2011; Tai & Chau, 2009). Pictures drawn from the International Affective Picture System (IAPS; Lang, Bradley, & Cuthbert, 1999) were
Table 1
Data extraction sheet summarizing the articles included in the systematic review. For each included article we extracted: the affective states investigated; the method used to elicit the states; the number of participants (all healthy subjects, unless otherwise specified); the acquisition technique (an asterisk marks portable devices) and any peripheral signals; the selected features and feature extraction method; the type of classifier; whether classification was performed online (in real-time); and classification performance.

Azcarraga & Suarez (2013). Task-related states (confidence, excitement, frustration, interest); elicitation: solving algebra problems; 16 participants; EEG*; peripheral: mouse clicks and movements; features: difference between the raw value of each channel and a resting-state baseline, computed at each EEG channel; classifiers: MLP, SVM; offline; accuracy: 43% (SVM) and 65% (MLP), rising to 51% and 73% when combined with mouse data.

Baucom et al. (2012). Valence and arousal; IAPS; 13 participants; fMRI; features: voxels with the most stable responses, expressed as PSC relative to average activity; offline; accuracy above chance level for both valence and arousal.

Chanel et al. (2009). Valence; recall of past episodes; 10 participants; EEG; peripheral: GSR, BVP, chest cavity expansion; features: frequency features, pairwise MI, peripheral features; classifier: SVM; offline; accuracy: 63% using EEG features and SVM, 70% after fusion of different feature sets, up to 80% after sample rejection.

Chanel et al. (2011). Anxiety, boredom, engagement; videogame (Tetris); 14 participants; EEG; peripheral: GSR, BVP, chest cavity expansion, skin temperature; features: theta, alpha, and beta frequency features plus peripheral features; extraction: FFT; classifiers: LDA, QDA, SVM; offline; accuracy: up to 63% using LDA and fusion of different features; above chance level for all classifiers.

Frantzidis et al. (2010a). Valence and arousal; IAPS; 28 participants; EEG; features: time-domain (ERP amplitude and latency) and frequency-domain (delta, theta, alpha); classifiers: MD, SVM; offline; accuracy: 79.5% (MD) and 81.3% (SVM).

Frantzidis et al. (2010b). Valence and arousal; IAPS; 28 participants; EEG; peripheral: EDA; features: ERPs (amplitude and latency), delta and theta bands, ERD/ERS, with digital filtering; classifier: MD; offline; accuracy: 77.68%.

Hadjidimitriou & Hadjileontiadis (2012). Valence; music; 9 participants; EEG*; features: delta, theta, alpha, beta, and gamma frequency features; offline; accuracy: up to 86.52%.

Hadjidimitriou & Hadjileontiadis (2013). Valence; music; 9 participants; EEG*; features: time-frequency features (time-windowed TF analysis; ZAM); classifiers: k-NN, SVM; offline; accuracy: up to 91% for familiar and up to 87% for unfamiliar musical excerpts.

Heger et al. (2014). Valence and arousal; IAPS and IADS; 8 participants; fNIRS; features: time-domain means; classifier: LDA; offline; accuracy generally close to chance level for high vs. low valence and arousal.

Hidalgo-Muñoz et al. (2013). Valence; IAPS; 26 participants; EEG; features: spectral turbulence; extraction: FFT; classifier: SVM; offline.

Hosseini et al. (2011). Valence; photographs; 5 participants; fNIRS; features: principal components (PCA); classifier: SVM; offline; accuracy: 72.9% for positive and 68.3% for negative valence, up to 80% in some cases.

Hosseini & Naghibi-Sistani (2011). Valence and arousal; IAPS; 15 participants; EEG; features: ApEn, WE (entropy analysis); classifier: SVM; offline; accuracy: 73.25%.

Jenke et al. (2014). Anger, calmness, curiosity, happiness, sadness; IAPS; 16 participants; EEG; features: features from multiple domains, with multiple feature selection methods; classifier: QDA; offline; higher average accuracy with feature selection; HOC, HOS, and HHS features outperformed spectral power.

Jie et al. (2014). Valence and arousal; music videos; 32 participants; EEG; features: SampEn; classifier: SVM; offline; accuracy: 80.43% for valence, 79.11% for arousal.

Kashihara (2014). Valence; neutral faces paired with aversive stimuli; 22 participants; EEG; features: ERPs (amplitude and latency); classifier: SVM; offline; accuracy above chance level.

Kassam et al. (2013). Anger, disgust, envy, fear, happiness, lust, pride, sadness, shame; self-induction; 10 participants; fMRI; features: voxels with the most stable responses (voxel selection); classifier: GNB; offline; accuracy: 84%.

Koelstra & Patras (2013). Valence, arousal, and dominance; videos; 30 participants; EEG; peripheral: facial expressions; features: PSD in theta, alpha, and beta bands plus peripheral features; extraction: FT; classifier: SVM; offline; accuracy significantly above chance level, with EEG outperforming face-based classification.

Lee & Hsieh (2014). Valence; videos; 40 participants; EEG; features: frequency features (FFT); classifier: QDA; offline; accuracy significantly above chance level.

Lin et al. (2014). Joy, anger, sadness, pleasure; music; 26 participants; EEG; features: PSD in theta, alpha, and beta bands plus music features; extraction: STFT; classifier: SVM; offline; accuracy: 74-76%.

Mikhail et al. (2013). Joy, anger, fear, sadness; performance of facial expressions; 36 participants; EEG; features: theta, alpha, beta, and gamma frequency features (FFT); classifier: SVM; offline; accuracy: 51% for joy, 53% for anger, 58% for fear, 61% for sadness.

Moghimi et al. (2012). Valence and arousal; music; 10 participants; fNIRS; features: laterality features (quantification of right and left activity; mean, slope, and coefficient of variation); classifier: LDA; offline; accuracy: 71.94% for valence, 71.93% for arousal.

Murugappan et al. (2007). Anger, disgust, fear, happiness, sadness, surprise; videos; 6 participants; EEG; features: time-frequency energy features (DWT); classifiers: FCM, FKM; offline; discrimination of happiness, disgust, and fear.

Murugappan et al. (2010). Disgust, fear, happiness, neutrality, surprise; videos; 20 participants; EEG; features: delta, theta, alpha, and beta time-frequency features (WT); classifiers: k-NN, LDA; offline; accuracy: 83.26% (k-NN), 75.21% (LDA).

Petrantonakis & Hadjileontiadis (2010). Anger, disgust, fear, happiness, sadness, surprise; pictures of facial expressions; 16 participants; EEG; features: time-domain features and alpha/beta frequency features; offline; accuracy: 85.17% using a HAF filtering procedure.

Sitaram et al. (2011), Experiment 1. Disgust, happiness; IAPS/memory recall (different stimuli for training and testing); 4 participants; fMRI; features: highly activated voxels (EM); classifier: SVM; online; accuracy: 80%.

Sitaram et al. (2011), Experiment 2. Disgust, happiness; IAPS/memory recall (same stimuli for training and testing); 12 participants; fMRI; features: highly activated voxels (EM); classifier: SVM; online; accuracy: 92%.

Sitaram et al. (2011), Experiment 3. Disgust, happiness, and a third category; memory recall (same stimuli for training and testing); 4 participants; fMRI; features: highly activated voxels (EM); classifier: SVM; online; accuracy: 62%.

Soleymani et al. (2012). Valence and arousal; videos; 24 participants; EEG; peripheral: GSR, ECG, respiration, skin temperature; features: PSD in theta, alpha, and beta bands plus peripheral features; extraction: FFT, Welch's method; classifier: SVM; offline; accuracy: up to 68.5% for valence and 76.4% for arousal using a modality fusion strategy.

Stikic et al. (2014). Valence; videos/narratives; 63 participants; EEG; peripheral: EMG; features: frequency features (FFT); classifiers: LDA, QDA; offline; accuracy: 74.3% (general) and 94.5% (individualized) using LDA; 67.3% and 93.5% using QDA.

Tai & Chau (2009). Valence; IAPS; 10 participants; fNIRS; features: time-domain and other features, selected with a GA; classifiers: LDA, SVM; offline.

Yoon & Chung (2013). Valence and arousal; music videos; 32 participants; EEG; features: frequency features (FFT); classifier: perceptron-based; offline; accuracy: 70.9% (valence) and 70.1% (arousal) for two-level classes; 55.4% (valence) and 55.2% (arousal) for three-level classes.

Yuvaraj et al. (2014). Anger, disgust, fear, happiness, sadness, surprise; audio-visual stimuli; 20 healthy participants and 20 patients with Parkinson's disease; EEG*; features: PSD and HOS-based frequency features (DFT); classifiers: k-NN, SVM; offline; accuracy: 66.70% (patients) and 70.51% (healthy) using SVM; 64.26% (patients) and 67.84% (healthy) using k-NN.

List of abbreviations. ApEn: approximate entropy; BVP: blood volume pulse; DFT: discrete Fourier transform; DWT: discrete wavelet transform; ECG: electrocardiography; EDA: electrodermal activity; EEG: electroencephalography; EM: effect mapping; EMG: electromyography; ERD: event-related desynchronization; ERP: event-related potential; ERS: event-related synchronization; FCM: fuzzy C-means; FFT: fast Fourier transform; FKM: fuzzy k-means; fMRI: functional magnetic resonance imaging; fNIRS: functional near-infrared spectroscopy; FT: Fourier transform; GA: genetic algorithm; GNB: Gaussian naïve Bayes; GSR: galvanic skin response; HHS: Hilbert-Huang spectrum; HOC: higher order crossing; HOS: higher-order spectra; IADS: International Affective Digitized Sounds; IAPS: International Affective Picture System; k-NN: k-nearest neighbor; LDA: linear discriminant analysis; MD: Mahalanobis distance; MI: mutual information; MLP: multilayer perceptron; PCA: principal component analysis; PSC: percent signal change; PSD: power spectral density; QDA: quadratic discriminant analysis; RVM: relevance vector machine; SampEn: sample entropy; STFT: short-time Fourier transform; SVM: support vector machine; WE: wavelet entropy; WT: wavelet transform; ZAM: Zhao-Atlas-Marks distribution.
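Many of the pipelines summarized in Table 1 share the same shape: band-limited spectral power (e.g. theta, alpha, beta) extracted with an FFT-based method such as Welch's, then fed to a standard classifier such as an SVM. The sketch below is a generic illustration of that pipeline, not an implementation of any specific reviewed study; the sampling rate, channel count, band limits, and synthetic data are all placeholders:

```python
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC

FS = 128  # sampling rate in Hz (placeholder)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(epoch: np.ndarray) -> np.ndarray:
    """Mean PSD per frequency band, concatenated over channels.

    epoch: (n_channels, n_samples) array holding one EEG trial.
    """
    freqs, psd = welch(epoch, fs=FS, nperseg=FS)  # PSD per channel
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, mask].mean(axis=1))   # one value per channel
    return np.concatenate(feats)

# Synthetic stand-in for labelled EEG trials (e.g. low vs. high valence):
# 40 trials of 8 channels x 4 s at 128 Hz
rng = np.random.default_rng(0)
X = np.stack([band_powers(rng.standard_normal((8, 512))) for _ in range(40)])
y = np.repeat([0, 1], 20)

clf = SVC(kernel="rbf").fit(X, y)  # offline training, as in most studies
pred = clf.predict(X[:1])          # label for a single new trial
```

An online (real-time) variant, which only a minority of the reviewed studies attempted, would apply `band_powers` and `clf.predict` to each incoming epoch as it is acquired, rather than to a pre-recorded dataset.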
Fig. 1. Flow of information through the different phases of the systematic review. Records were first identified through database searching and
other sources (reference lists of pertinent articles). After removing duplicates, records were screened on the basis of their abstracts. The full-texts
of the included articles were assessed for eligibility. Thirty articles, including a total of 32 studies, were finally included in the review.
often used as emotional visual stimuli (Baucom et al., 2012; Frantzidis et al., 2010a, 2010b; Hidalgo-Muñoz et al., 2013; Hosseini & Naghibi-Sistani, 2011; Jenke et al., 2014; Sitaram et al., 2011; Tai & Chau, 2009): because they are standardized according to valence and arousal normative ratings, they are considered particularly suitable for investigating the valence and arousal dimensions of emotions. Other visual stimuli used to elicit emotions were photographs of various kinds (Hosseini et al., 2011) and facial expressions (Petrantonakis & Hadjileontiadis, 2010). The latter were used because of findings suggesting that humans have a predisposition to react emotionally to facial stimuli (Buck, 1980), and because they are considered to have limited variability across cultures (Ekman, 1999).
Few studies (12%) relied on the auditory channel, specifically using music, which is considered particularly effective for emotion elicitation (Hadjidimitriou & Hadjileontiadis, 2012, 2013; Lin et al., 2014; Moghimi et al., 2012). Thirty-four percent of studies used multimodal stimuli, combining the visual and auditory modalities (Heger et al., 2014; Jie et al., 2014; Kashihara, 2014; Koelstra & Patras, 2013; Lee & Hsieh, 2014; Murugappan et al., 2007, 2010; Soleymani et al., 2012; Stikic et al., 2014; Yoon & Chung, 2013; Yuvaraj et al., 2014), e.g. using videos (Jie et al., 2014; Koelstra & Patras, 2013; Lee & Hsieh, 2014; Murugappan
et al., 2007, 2010; Soleymani et al., 2012; Stikic et al., 2014; Yoon & Chung, 2013) or IAPS stimuli combined with stimuli drawn from the International Affective Digitized Sounds system (IADS; Heger et al., 2014; Yuvaraj et al., 2014), which are also standardized according to valence and arousal normative ratings (Bradley & Lang, 1999). In a more limited number of studies, affective states were self-induced by participants, either following instructions (Kassam et al., 2013), performing facial expressions (Mikhail et al., 2013), or recalling past episodes (Chanel et al., 2009; Sitaram et al., 2011).

Only two studies investigated the classification of affective states during the performance of cognitive and/or motor tasks in realistic settings (Azcarraga & Suarez, 2013; Chanel et al., 2011). Azcarraga and collaborators (2013) aimed to recognize emotions in students solving algebra equations. Chanel et al. (2011) investigated emotion classification in subjects playing a videogame (Tetris) at different levels of difficulty. Importantly, none of the studies investigated the recognition of affective states during the use of a standard BCI for communication or environmental control, or during the use of other AT.
3.2.3. Participants

The studies included in the review involved a total of 598 participants (note that some subjects were shared across studies, i.e. between Yoon et al., 2013 and Jie et al., 2014, and within the experiments by Sitaram et al., 2011). Only 20 participants belonged to a patient population (Parkinson's disease; Yuvaraj et al., 2014). None of the participants presented cognitive impairment. Because the age and gender of the participants were often not reported in the articles, we do not include this information in our analysis. In the majority of cases, studies were conducted with university students who volunteered to participate.
3.3.4. Data acquisition
In most studies (72%), neurophysiological data were acquired using EEG (Azcarraga et al., 2013; Chanel et al., 2009, 2011; Frantzidis et al., 2010a, 2010b; Jenke et al., 2014; Hadjidimitriou et al., 2012, 2013; Hidalgo-Muñoz et al., 2013; Hosseini et al., 2011; Jie et al., 2014; Kashihara et al., 2014; Koelstra et al., 2012; Lee et al., 2014; Lin et al., 2014; Mikhail et al., 2013; Murugappan et al., 2007, 2010; Petrantonakis & Hadjileontiadis, 2010; Soleymani et al., 2012; Stikic et al., 2014; Yoon et al., 2013; Yuvaraj et al., 2014). In several cases, the Emotiv EPOC headset, a wireless EEG system that can be set up in a shorter amount of time than other systems, was used (Azcarraga & Suarez, 2013; Hadjidimitriou & Hadjileontiadis, 2013, 2012; Yuvaraj et al., 2014). EEG was often used in combination with the acquisition of peripheral data, such as galvanic skin response (GSR; Chanel et al., 2009, 2011; Soleymani et al., 2012), skin temperature (Chanel et al., 2011; Soleymani et al., 2012), electrodermal activity (EDA; Frantzidis et al., 2010a), blood volume pulse (BVP; Chanel et al., 2009, 2011), chest cavity expansion (Chanel et al., 2009, 2011), electromyography (EMG; Stikic et al., 2014), electrocardiography (ECG; Soleymani et al., 2012), video-recorded facial expressions (Koelstra & Patras, 2013), mouse clicks and movements (Azcarraga & Suarez, 2013), and stimulus characteristics (Lin et al., 2014).
In a more limited number of studies (12%), data acquisition was performed using functional near-infrared spectroscopy (fNIRS; Heger et al., 2014; Hosseini et al., 2011; Moghimi et al., 2012; Tai & Chau, 2009). Both EEG and fNIRS rely on relatively portable devices. Few studies (16%) used functional magnetic resonance imaging (fMRI) for data acquisition (Baucom et al., 2012; Kassam et al., 2013; Sitaram et al., 2011).
3.3.5. Emotion recognition techniques
Recognizing mental states from neurophysiological signals generally requires three steps, namely data pre-processing, feature extraction, and classification (Kim et al., 2013). As pre-processing methods are relatively consistent across research groups, in this review we focused on the differences in feature extraction, based on neurophysiological and neuropsychological knowledge, and in emotion classification methods, relying on machine learning and statistical signal processing.
In the analyzed studies, the classification of different affective states was based on the extraction of features belonging either to the time domain (Azcarraga & Suarez, 2013; Frantzidis et al., 2010a, 2010b; Heger et al., 2014; Kashihara, 2014; Petrantonakis & Hadjileontiadis, 2010; Tai & Chau, 2009), to the frequency domain (Chanel et al., 2009, 2011; Hadjidimitriou & Hadjileontiadis, 2012; Hidalgo-Muñoz et al., 2013; Koelstra & Patras, 2013; Lee & Hsieh, 2014; Lin et al., 2014; Mikhail et al., 2013; Soleymani et al., 2012; Stikic et al., 2014; Yoon & Chung, 2013; Yuvaraj et al., 2014), or to the time-frequency domain (Hadjidimitriou & Hadjileontiadis, 2013; Murugappan et al., 2007, 2010). The rationale for extracting features mostly from the frequency domain is that spectral power in the various frequency bands is often associated with emotional states (Jenke et al., 2014; Kim et al., 2013), and frontal asymmetry in band power appears to differentiate valence levels (Cacioppo, 2004). In particular, the left frontal lobe appears to be involved in the experience of positive valence, and the right frontal lobe in the experience of negative valence (Crabbe, Smith, & Dishman, 2007; Davidson, 1992; Fox, 1991; Müller, Keil, Gruber, & Elbert, 1999; Schmidt & Trainor, 2001).
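As an illustrative sketch of the two feature types mentioned above, the following Python code computes band power with Welch's method and a frontal asymmetry index from two hypothetical frontal channels. This is not the pipeline of any reviewed study: the sampling rate, the channel pairing (F3/F4), the alpha band limits, and the use of random data are all assumptions for demonstration.

```python
import numpy as np
from scipy.signal import welch

FS = 256  # sampling rate in Hz; illustrative assumption

def band_power(signal, fs, band):
    """Mean spectral power of `signal` within `band` (Hz), via Welch's method."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def frontal_asymmetry(left_ch, right_ch, fs, band=(8.0, 13.0)):
    """Log-ratio asymmetry index, ln(right band power) - ln(left band power).

    Assumes an alpha band (8-13 Hz); in the literature this kind of index
    is conventionally read via the inverse relation between alpha power
    and cortical activation.
    """
    return np.log(band_power(right_ch, fs, band)) - np.log(band_power(left_ch, fs, band))

# Toy two-channel "recording" (e.g., hypothetical F3/F4 electrodes);
# random noise stands in for real EEG, so the index here is meaningless
# except as a demonstration of the computation.
rng = np.random.default_rng(0)
f3, f4 = rng.standard_normal((2, FS * 10))
print(frontal_asymmetry(f3, f4, FS))
```

In a real feature-extraction stage, one such index (or a vector of band powers per channel) would be computed per trial and fed to the classifier.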
Several different classification methods were used for emotion recognition, including linear discriminant analysis (LDA; Chanel et al., 2009, 2011; Heger et al., 2014; Moghimi, Kushki, Guerguerian, & Chau, 2012; Murugappan et al., 2010; Stikic et al., 2014; Tai & Chau, 2009), quadratic discriminant analysis (QDA; Chanel et al., 2009, 2011; Hadjidimitriou & Hadjileontiadis, 2012; Lee & Hsieh, 2014; Petrantonakis & Hadjileontiadis, 2010; Stikic et al., 2014), support vector machines (SVM; Hadjidimitriou & Hadjileontiadis, 2012; Hidalgo-Muñoz et al., 2013; Hosseini & Naghibi-Sistani, 2011; Hosseini et al., 2011; Jie et al., 2014; Kashihara, 2014; Koelstra & Patras, 2013; Lin et al., 2014; Mikhail et al., 2013; Sitaram et al., 2011; Soleymani et al., 2012), k-nearest neighbor (k-NN; Hadjidimitriou & Hadjileontiadis, 2013, 2012; Murugappan et al., 2010; Petrantonakis & Hadjileontiadis, 2010; Yuvaraj et al., 2014), and Mahalanobis distance (MD; Frantzidis et al., 2010a, 2010b; Hadjidimitriou & Hadjileontiadis, 2012; Petrantonakis & Hadjileontiadis, 2010). Whereas LDA and QDA perform dimensionality reduction from a high-dimensional feature space to a low-dimensional space in order to maximize the Fisher discriminant ratio (McLachlan, 2004), SVMs determine a decision boundary in kernel space instead of the original feature space (Vapnik, Golowich, & Smola, 1997). The k-NN algorithm assigns a new feature vector to the class most represented among its k nearest neighbors in the training set (Altman, 1992). MD, widely used in cluster analysis, is a measure of the distance between a point and a distribution (Mahalanobis, 1936). For a more in-depth overview of the different classification methods, the reader can refer to reviews on computational methods for mental state classification (Kim et al., 2013; Lemm, Blankertz, Dickhaus, & Müller, 2011; Lotte, Congedo, Lécuyer, Lamarche, & Arnaldi, 2007).
The studies included in this review often compare several classification procedures within the same experiment, using different feature extraction methods and applying different classification algorithms. The different procedures are summarized in Table 1.
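To make the comparison of classifier families concrete, the sketch below trains LDA, QDA, an RBF-kernel SVM, and k-NN on synthetic feature vectors standing in for extracted EEG features, scoring each with 5-fold cross-validated accuracy. The use of scikit-learn, the synthetic data, and all parameter values are our assumptions for illustration, not a procedure taken from any reviewed study.

```python
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import (
    LinearDiscriminantAnalysis,
    QuadraticDiscriminantAnalysis,
)
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Synthetic "feature vectors" (e.g., band powers per channel) for two
# affective classes; purely illustrative data.
X, y = make_classification(
    n_samples=200, n_features=10, n_informative=4, n_redundant=0, random_state=0
)

classifiers = {
    "LDA": LinearDiscriminantAnalysis(),
    "QDA": QuadraticDiscriminantAnalysis(),
    "SVM (RBF kernel)": SVC(kernel="rbf"),
    "k-NN (k=5)": KNeighborsClassifier(n_neighbors=5),
}

# 5-fold cross-validated accuracy per classifier family.
results = {name: cross_val_score(clf, X, y, cv=5).mean() for name, clf in classifiers.items()}
for name, acc in results.items():
    print(f"{name}: {acc:.2f}")
```

In practice, the reviewed studies differ precisely in which of these families they test and on which features, which is why within-study comparisons like this one are common.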
3.3.6. Classification performance
Although the scientific community has suggested that a classification accuracy of 70% can be considered sufficient for BCI communication and device control (Kübler, Neumann, Wilhelm, Hinterberger, & Birbaumer, 2004), no standard threshold has been established in the context of aBCIs. Moreover, it is very difficult to compare results from different studies, for several reasons. First, results are not reported in a standardized way across research articles, which sometimes indicate an average classification performance, while other times report only the highest performance. Second, the classified affective states vary in number, type, and complexity across studies. Third, it is plausible that, at the current state of the art, numerous feature selection and classification methods are tested within the same study, but only the ones leading to the best performance are chosen for presentation in the paper, eventually leading to a publication bias. Fourth, the studies rely on different and often very small datasets.
In all the selected studies, the reported classification accuracies were above chance level, suggesting that the performance could be considered acceptable to a certain extent. The real discriminative factor, however, is when the recognition of the affective state is performed. Only in very few cases (9%), and within the same lab, was the recognition of affective states performed in real time (Sitaram et al., 2011). In the rest of the studies, emotion classification was performed offline, after the end of the experiment.
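Because "above chance" depends on both the number of trials and the number of classes, the criterion can be made concrete with a one-sided binomial test. The sketch below is our illustration of this standard statistical argument, not a method used by the reviewed studies; it computes the smallest accuracy that exceeds theoretical chance at p < 0.05 for a given dataset size.

```python
from scipy.stats import binom

def chance_threshold(n_trials, n_classes, alpha=0.05):
    """Smallest accuracy significantly above chance (one-sided binomial test).

    Under the null hypothesis the classifier guesses with probability
    1/n_classes per trial; ppf gives the smallest count k whose CDF
    reaches 1 - alpha, so (k + 1) correct trials is significant.
    """
    k = binom.ppf(1 - alpha, n_trials, 1.0 / n_classes)
    return (k + 1) / n_trials

# With few trials, "above chance" demands much more than 1/n_classes:
print(chance_threshold(40, 2))   # small two-class dataset
print(chance_threshold(400, 2))  # larger two-class dataset
```

This is one reason the small datasets noted above matter: the same nominal accuracy can be significant in a large study and indistinguishable from chance in a small one.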
4. Discussion
In the present review, we collected relevant information from research articles investigating the recognition of affective states from neurophysiological signals, to evaluate the possibility of applying aBCIs in everyday contexts, particularly with individuals using traditional BCIs and other AT for cognitive support.
Overall, two different lines of research appear in the aBCI field, one following a dimensional model of emotions (Russell & Mehrabian, 1977), the other based on categories often identified as "basic" or "universal" (Ekman, 1999). The dimensional approach is based on a limited number of simple dimensions (maximum three, when dominance is included), which do not require an in-depth definition and are not culture-dependent. Indeed, in the context of traditional BCIs for communication and environmental control, the possibility of obtaining information on valence (e.g. how pleasant or unpleasant the user's interaction with the BCI or other technology is) and arousal (e.g. whether the user is nervous) could allow contingent modification and improvement of the user-machine interaction. Nevertheless, the use of a limited number of dimensions might entail an over-simplification of affective states. First of all, there is a high risk of conflating very different emotions that share valence and arousal levels (e.g. anger and fear). Second, this approach does not take into account that the same affective state may vary notably in both valence and arousal. In spite of these limitations, the possibility of using this approach to obtain feedback on the user's experience in the interaction with a BCI, even if only to evaluate his/her acceptance or appreciation of the technology, would be a remarkable and important step towards the development of user-centered adaptive BCIs.
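The conflation risk can be illustrated with a toy sketch: placing discrete emotions at rough, purely illustrative valence-arousal coordinates (not normative ratings) shows that anger and fear land almost on the same point, while emotions that differ on both dimensions sit far apart.

```python
# Illustrative (valence, arousal) coordinates on a -1..1 scale.
# These placements are assumptions for demonstration only.
emotions = {
    "joy":     ( 0.8,  0.6),
    "calm":    ( 0.6, -0.6),
    "sadness": (-0.6, -0.4),
    "anger":   (-0.7,  0.7),
    "fear":    (-0.7,  0.8),
}

def va_distance(a, b):
    """Euclidean distance between two emotions in valence-arousal space."""
    (v1, a1), (v2, a2) = emotions[a], emotions[b]
    return ((v1 - v2) ** 2 + (a1 - a2) ** 2) ** 0.5

# Anger and fear are nearly indistinguishable in this space,
# whereas joy and fear are well separated:
print(va_distance("anger", "fear"))
print(va_distance("joy", "fear"))
```

A classifier operating only on these two dimensions would therefore struggle to separate anger from fear, exactly the limitation discussed above.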
The second line of research aims at identifying more definite affective states. Ideally, the possibility of recognizing a number of different emotions would remarkably improve the interaction with a BCI, e.g. by taking into account what makes a user angry, sad, or happy. The downside of this approach is that the more categories are included in the classification, the more difficult their discrimination becomes.
Only a limited number of studies investigated more complex emotions or task-related affective states (Azcarraga & Suarez, 2013; Chanel et al., 2011). The fact that more complex emotions such as pride or shame are less investigated is not surprising, as this represents a more advanced step in affective computing, and because these affective states are strongly culture-dependent and not defined unequivocally. As observed by Brouwer and colleagues (2015), researchers investigating specific emotion categories should discuss how these states have been defined in previous studies, as well as which definition they are adhering to. Unfortunately, precise definitions of the emotions under investigation are often lacking in the considered studies.
The fact that task-related affective states, such as anxiety, frustration, boredom, or interest, are hardly investigated draws attention to the gap between the need to develop BCIs following a user-centered approach and what is currently being evaluated in laboratory settings. Finding a way to recognize task-related affective states would pave the way for considerably more usable and effective BCIs and ATs, but research in this direction appears to be still at an early stage.
In the analyzed studies, emotions were often elicited using standardized stimuli, mostly from the visual or audiovisual domain. In view of improving the BCI experience, this choice can be considered useful, as these modalities have an important role in the interaction with computer technology. Stimuli from the IAPS or from the IADS have the advantage of being well studied, and are considered independent of culture, gender, and age, so they are expected to evoke similar emotions across subjects (Bradley & Lang, 1999; Lang et al., 1999). One disadvantage of these stimuli, however, is that they can recall different memories in different individuals, meaning that what is being recognized in classification studies is not always univocal. Moreover, brain signals could be modulated by stimulation properties irrelevant to emotion (Heger et al., 2014; Kim et al., 2013). As observed by Heger and collaborators (2014), when analyzing affective reactions to emotional stimulation based on neurophysiological activity, low-level sensory processing may have an influence on the measured signals, even with highly controlled experimental setups. Mühl and colleagues (2011) also pointed out that different induction modalities result in modality-specific EEG responses. Some researchers have also argued that these paradigms are too distant from an ecological setting (Brouwer et al., 2015). For instance, there is evidence suggesting that visual processing may be quicker when the subject actively samples the environment through eye movements than when an image is presented to static eyes (Kamienkowski, Ison, Quiroga, & Sigman, 2012).
To overcome the limitations of using artificial stimuli, some researchers prefer asking subjects to self-induce given emotions, for instance by voluntarily recalling past episodes associated with these states (Chanel et al., 2009; Kassam et al., 2013; Sitaram et al., 2011). These studies still have the limitation of being circumscribed to lab settings, in which emotions are detached from current everyday situations and are not investigated in the context of a real task or activity.
Only a very limited number of studies required participants to actively perform cognitive and/or motor tasks with different levels of difficulty to elicit emotions (Azcarraga & Suarez, 2013; Chanel et al., 2011). Unlike in lab experiments, in which participants are normally instructed to sit still and perform a single task, users in real-world situations will usually be in motion and possibly multitasking (Van Erp, Lotte, & Tangermann, 2012). More studies in this direction would be helpful for better recognizing and understanding affective states emerging when interacting with traditional BCIs and other technology.
With one exception, all studies to date were performed with healthy participants with no motor or cognitive impairment. The replication of these studies with a patient population, as well as testing the emotion recognition of patients during BCI use, would therefore be advisable.
Predictably, most studies were performed using portable techniques, such as EEG and fNIRS. EEG appears to be the primary option for the development of affective recognition systems, thanks to its low cost and simplicity of use, and because it reacts with very low latency to changes of mental states (Balconi & Lucchiari, 2006; Putze & Schultz, 2014). Moreover, new-generation EEG systems such as the Emotiv EPOC headset do not require gel application and therefore need less time to be set up, while approaching the performance of conventional wet systems (Brouwer et al., 2015; Zander et al., 2011). In several cases, the acquisition of EEG data was combined with the acquisition of peripheral data such as respiration, ECG, EMG, and eye movements, paving the way to what has been defined as a "hybrid BCI" (Pfurtscheller et al., 2010). The main purpose of combining different physiological data acquisition systems is to improve classification accuracy and overcome the disadvantages of the individual techniques. Whereas this strategy potentially leads to more robust results, it should be considered that the addition of redundant or unreliable modalities may be detrimental to classification accuracy (Putze & Schultz, 2014). Moreover, in clinical settings, BCIs are often used with patients who gradually undergo a loss of mobility, so it should be considered that relying on muscular and ocular movements progressively becomes unfeasible.
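A minimal sketch of feature-level fusion, the simplest way to combine modalities as described above: synthetic matrices stand in for EEG band powers and peripheral measures (e.g., GSR, heart rate), and fused features are simply concatenated before classification. The data, effect sizes, and classifier choice are assumptions for illustration, not the method of any reviewed study.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(42)
n = 120
y = rng.integers(0, 2, n)  # two hypothetical affective classes

# Toy feature matrices; each modality carries weak class information
# (a small mean shift between classes, an arbitrary illustrative choice).
eeg = rng.standard_normal((n, 8)) + 0.4 * y[:, None]
peripheral = rng.standard_normal((n, 3)) + 0.4 * y[:, None]

def accuracy(X):
    """5-fold cross-validated accuracy of a standardized SVM on features X."""
    clf = make_pipeline(StandardScaler(), SVC())
    return cross_val_score(clf, X, y, cv=5).mean()

acc_eeg = accuracy(eeg)
acc_peripheral = accuracy(peripheral)
acc_fused = accuracy(np.hstack([eeg, peripheral]))  # feature-level fusion

print("EEG only:       ", acc_eeg)
print("Peripheral only:", acc_peripheral)
print("Fused:          ", acc_fused)
```

With informative modalities, fusion tends to help; adding a noise-only modality to this sketch would illustrate the opposite effect noted by Putze and Schultz (2014).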
More recently, fNIRS was introduced as an alternative input modality for BCIs (Strait & Scheutz, 2014). Albeit providing lower temporal resolution than EEG, it provides higher spatial resolution (Bießmann, Plis, Meinecke, Eichele, & Müller, 2011). Moreover, in comparison to EEG, fNIRS requires a shorter setup time, and is not susceptible to electrical artifacts from environmental and physiological sources (Heger et al., 2014). Compared to fMRI, fNIRS is less expensive, portable, and better tolerated by subjects (Cutini, Moro Basso, & Bisconti, 2012; Heger et al., 2014; Strait & Scheutz, 2014).
A few studies used fMRI for emotion recognition. It may be surprising that a technique such as fMRI could be used in BCI contexts, as its applicability in everyday life is not plausible due to its high cost and immobility. Nevertheless, fMRI has the advantage of measuring responses from cortical and subcortical structures associated with emotional processing and located deep in the brain (Kim et al., 2013; Morgane, Galler, & Mokler, 2005; Vytal & Hamann, 2010). Hence, from a proof-of-concept perspective, and to increase knowledge on emotion differentiation in the brain, fMRI studies should still be considered useful.
The different studies varied greatly in the kinds of selected neurophysiological features and tested classifiers. On one side, this variety indicates that there are numerous paths in the aBCI domain that can be further explored, possibly leading to new solutions and improvements. On the other side, the complexity of this young research branch and the lack of unanimity across working groups clearly indicate that the aBCI field is still at its outset.
As observed by Brouwer and colleagues (2015), several confounding factors may lead to overly optimistic results in the classification of mental states. For instance, classification accuracies can be inflated if the data used to train the model are not independent of the data used to test the trained model. To obtain an unbiased estimate of the generalization performance of the model, an additional data set, independent from both the training and the validation data, is required, as in a nested cross-validation scheme (Lemm et al., 2011). The occurrence of this confounding factor in a specific study is generally difficult to judge from the information reported in the articles. The most reliable way to recognize a satisfactory BCI would be to evaluate whether it works in real time, but almost all of the works analyzed in this review presented only offline results.
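The nested cross-validation scheme described above can be sketched with scikit-learn: an inner loop selects hyperparameters, while an outer loop estimates generalization on folds the tuned model has never seen. The SVM, the tuned C grid, and the synthetic data are our assumptions for illustration, not a pipeline from any reviewed study.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.svm import SVC

# Synthetic stand-in for extracted neurophysiological features.
X, y = make_classification(
    n_samples=150, n_features=20, n_informative=5, random_state=1
)

# Inner loop: hyperparameter selection on the training portion only.
inner = GridSearchCV(SVC(), param_grid={"C": [0.1, 1, 10]}, cv=3)

# Outer loop: each outer test fold is untouched by model selection,
# giving an unbiased estimate of generalization performance.
nested_scores = cross_val_score(inner, X, y, cv=5)

print("Nested CV accuracy: %.2f +/- %.2f" % (nested_scores.mean(), nested_scores.std()))
```

Reporting the outer-loop scores, rather than the best inner-loop score, avoids the optimistic bias the review warns about.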
The most relevant outcome of the present review is that, whereas all articles reported acceptable classification performances, the emotion classification was generally either performed offline, or the adopted methodology was not portable. These aspects pose serious limitations to the possibility of transferring BCI-based affective state recognition to real-life situations at the current stage.
It is important to underline that the offline recognition of affective states from neurophysiological signals can still be very valuable, providing researchers, software designers, and technicians with additional information that can be used for improving the technology, e.g. by measuring levels of stress and satisfaction (Brouwer et al., 2015; Mühl et al., 2014a). Compared to self-reports and behavioral measurements, neurophysiological measurements allow detecting affective states by directly assessing the structures from which they emerge (Kim et al., 2013). Moreover, differently from other channels such as facial expressions or speech, neurophysiological measurements cannot be feigned (Chanel, Kronegg, Grandjean, & Pun, 2006).
On the other hand, it is mandatory to fill the gap between laboratory experiments and everyday situations. In order to evaluate whether the different aBCI studies, at the present stage, can lead to an evident application in the field of cognitive support technologies in general, and more specifically with traditional BCIs, we decided to consider a combination of three factors, namely (i) classification performance, (ii) device portability, and (iii) the possibility of recognizing emotional states in real time. Importantly, these criteria do not represent an evaluation of the quality of each study per se. Nevertheless, they are a basic requirement for allowing the use of an aBCI in everyday life. As presented in the Results section, all studies reported an above-chance classification accuracy, so we focused on the second and third factors. We observed that none of the studies using EEG or fNIRS signals, albeit relying on portable techniques that could possibly be used in home environments, used real-time emotion classification. In fact, the only studies that reported online classification were done with fMRI (Sitaram et al., 2011), which is not portable. Overall, although the results obtained in all studies seem encouraging, none of the applied methods are at the current stage ready to be used in everyday life, with either healthy or patient populations.
4.1. Limitations of the review
In the present systematic review, we did not include conference proceedings. We believe that at least some of these studies are of high quality and undoubtedly promising, but as they did not receive strict peer review, and because important methodological details are often missing, they did not fit the scope of this work. These conference articles are numerous, also thanks to a growing community organizing meetings and workshops on aBCIs (Brouwer et al., 2015; Mühl et al., 2014a; Nijholt & Heylen, 2013), and demonstrate that the aBCI field is in constant evolution.
Despite our effort to make this review comprehensive, we cannot exclude that relevant studies may have been dismissed, mostly due to a lack of common terminology. In fact, we observed that a high number of research groups do not use the terms "affective brain-computer interface" or "emotion classification", despite extracting information about subjects' affective states from neurophysiological signals and using machine learning techniques. This suggests that some works may neither have emerged from the search, nor have been referenced in other articles in the field. This is explained by the fact that the field is highly multidisciplinary, including engineers, psychologists, and physicians, and therefore, at this stage of early development, still builds on different grounds.
4.2. Conclusions
The ability of a BCI to recognize a user's affective state is an added value, given that emotions provide useful information for the adaptation of the BCI to the user's needs.
The aBCI field is in constant development, with a high variety of combinations of neuroimaging techniques, selections of neurophysiological features, and classification algorithms currently being tested and improved. Nevertheless, the gap between what is tested in laboratory settings and the translation of the findings to everyday situations and specific tasks, including the application with patient populations, still needs to be filled.
In particular, BCI developers should focus on the following aspects: a) testing affective state recognition in more ecological settings, during the performance of realistic motor and/or cognitive tasks; b) evaluating classification accuracy in real time, as this is the only way the system can readapt to the user's internal state; c) performing assessments with patient populations, who would most benefit from enhanced BCI systems; d) providing more accurate and precise definitions of which affective state is being measured; and e) disseminating results in a more transparent and standardized way.
Conflict of interest
The authors have no competing interest to declare.
References
Altman, N. S. (1992). An introduction to kernel and nearest-neighbor nonparametric regression. The American Statistician, 46(3), 175-185.
Azcarraga, J., & Suarez, M. T. (2013). Recognizing student emotions using brainwaves and mouse behavior data. International Journal of Distance Education Technologies, 11(2), 1-15.
Balconi, M., & Lucchiari, C. (2006). EEG correlates (event-related desynchronization) of emotional face elaboration: A temporal analysis. Neuroscience Letters, 392(1-2).
Baucom, L. B., Wedell, D. H., Wang, J., Blitzer, D. N., & Shinkareva, S. V. (2012). Decoding the neural representation of affective states. NeuroImage, 59(1), 718-727.
Bießmann, F., Plis, S., Meinecke, F. C., Eichele, T., & Müller, K.-R. (2011). Analysis of multimodal neuroimaging data. IEEE Reviews in Biomedical Engineering, 4, 26-58.
Birbaumer, N., Weber, C., Neuper, C., Buch, E., Haagen, K., & Cohen, L. (2006). Physiological regulation of thinking: Brain-computer interface (BCI) research. Progress in Brain Research, 159, 369-
Bradley, M. M., & Lang, P. J. (1999). International affective digitized sounds (IADS): Stimuli, instruction manual and affective ratings. Gainesville: University of Florida.
Brouwer, A.-M., Zander, T. O., van Erp, J. B. F., Korteling, J. E., & Bronkhorst, A. W. (2015). Using neurophysiological signals that reflect cognitive or affective state: Six recommendations to avoid common pitfalls. Frontiers in Neuroscience, 9, 136.
Buck, R. (1980). Nonverbal behavior and the theory of emotion: The facial feedback hypothesis. Journal of Personality and Social Psychology, 38(5), 811-824.
Cacioppo, J. T. (2004). Feelings and emotions: Roles for electrophysiological markers. Biological Psychology, 67(1), 235-243.
Chanel, G., Kierkels, J. J., Soleymani, M., & Pun, T. (2009). Short-term emotion assessment in a recall paradigm. International Journal of Human-Computer Studies, 67(8), 607-627.
Chanel, G., Kronegg, J., Grandjean, D., & Pun, T. (2006). Multimedia content representation, classification and security (pp. 530-537).
Chanel, G., Rebetez, C., Bétrancourt, M., & Pun, T. (2011). Emotion assessment from physiological signals for adaptation of game difficulty. IEEE Transactions on Systems, Man and Cybernetics, Part A: Systems and Humans, 41(6), 1052-1063.
Chu, Y., Brown, P., Harniss, M., Kautz, H., & Johnson, K. (2014). Cognitive support technologies for people with TBI: Current usage and challenges experienced. Disability and Rehabilitation: Assistive Technology, 9(4), 279-285.
Crabbe, J. B., Smith, J. C., & Dishman, R. K. (2007). Emotional & electroencephalographic responses during affective picture viewing after exercise. Physiology & Behavior, 90(2-3), 394-404.
Cutini, S., Moro Basso, S., & Bisconti, S. (2012). Functional near infrared optical imaging in cognitive neuroscience: An introductory review. Journal of Near Infrared Spectroscopy, 20, 75-92.
Davidson, R. J. (1992). Anterior cerebral asymmetry and the nature of emotion. Brain and Cognition, 20(1), 125-151.
Ekman, P. (1999). Basic emotions. Handbook of Cognition and Emotion, 4, 5-60.
Ekman, P., Friesen, W. V., O'Sullivan, M., Chan, A., Diacoyanni-Tarlatzis, I., Heider, K., Krause, R., LeCompte, W. A., Pitcairn, T., & Ricci-Bitti, P. E. (1987). Universals and cultural differences in the judgments of facial expressions of emotion. Journal of Personality and Social Psychology, 53(4).
Federici, S., & Borsci, S. (2014). Providing assistive technology in Italy: The perceived delivery process quality as affecting abandonment. Disability and Rehabilitation: Assistive Technology.
Federici, S., & Scherer, M. (2012). Assistive technology assessment handbook. CRC Press.
Fox, N. A. (1991). If it’s not left, it’s right: Electroencephalograph 905
asymmetry and the development of emotion. American Psychol- 906
ogist,46(8), 863. 907
Frantzidis, C. A., Bratsas, C., Klados, M. A., Konstantinidis, E., 908
Lithari, C. D., Vivas, A. B., Papadelis, C. L., Kaldoudi, E., Pappas, 909
C., & Bamidis, P. D. (2010a). On the classification of emotional 910
biosignals evoked while viewing affective pictures: An integrated 911
data–mining–based approach for healthcare applications. IEEE 912
Transactions on Information Technology in Biomedicine,14(2), 913
309-318. 914
Frantzidis, C. A., Bratsas, C., Papadelis, C. L., Konstantinidis, E., 915
Pappas, C., & Bamidis, P. D. (2010b). Toward emotion aware 916
computing: An integrated approach using multichannel neu- 917
rophysiological recordings and affective visual stimuli. IEEE 918
Transactions on Information Technology in Biomedicine,14(3), 919
589-597. 920
Hadjidimitriou, S. K., & Hadjileontiadis, L. J. (2012). Toward an 921
EEG–based recognition of music liking using time–frequency 922
analysis. IEEE Transactions on Biomedical Engineering,923
59(12), 3498-3510. 924
Hadjidimitriou, S., & Hadjileontiadis, L. (2013). EEG–Based clas- 925
sification of music appraisal responses using time–frequency 926
analysis and familiarity ratings. IEEE Transactions on Affective 927
Computing,4(2), 161-172. 928
Hadjidimitriou, S. K., & Hadjileontiadis, L. J. (2012). Toward an 929
EEG–based recognition of music liking using time–frequency 930
analysis. IEEE Transactions on Biomedical Engineering,931
59(12), 3498-3510. 932
Haselager, P. (2013). Did I do that? Brain–computer interfacing and 933
the sense of agency. Minds and Machines,23(3), 405-418. 934
Heger, D., Herff, C., Putze, F., Mutter, R., & Schultz, T. (2014). Con- 935
tinuous affective states recognition using functional near infrared 936
spectroscopy. BrainComputer Interfaces,1(2), 113-125. 937
noz, A. R., L´
opez, M. M., Pereira, A. T., Santos, I. M., 938
e, A. M. (2013). Spectral turbulence measuring as feature 939
extraction method from EEG on affective computing. Biomedical 940
Signal Processing and Control,8(6), 945-950. 941
Holz, E. M., Botrel, L., Kaufmann, T., & Kübler, A. (2015). Long-term independent brain-computer interface home use improves quality of life of a patient in the locked-in state: A case study. Archives of Physical Medicine and Rehabilitation, 96(3 Suppl), S16-S26.
Hosseini, S. A., & Naghibi-Sistani, M. B. (2011). Emotion recognition method using entropy analysis of EEG signals. International Journal of Image, Graphics and Signal Processing, 3(5), 30.
Hosseini, S. M. H., Mano, Y., Rostami, M., Takahashi, M., Sugiura, M., & Kawashima, R. (2011). Decoding what one likes or dislikes from single-trial fNIRS measurements. Neuroreport, 22(6), 269-273.
Jenke, R., Peer, A., & Buss, M. (2014). Feature extraction and selection for emotion recognition from EEG. IEEE Transactions on Affective Computing, 5(3), 327-339.
Jie, X., Cao, R., & Li, L. (2014). Emotion recognition based on the sample entropy of EEG. Bio-Medical Materials and Engineering, 24(1), 1185-1192.
Kamienkowski, J. E., Ison, M. J., Quiroga, R. Q., & Sigman, M. (2012). Fixation-related potentials in visual search: A combined EEG and eye tracking study. Journal of Vision, 12(7), 4.
Kashihara, K. (2014). A brain-computer interface for potential non-verbal facial communication based on EEG signals related to specific emotions. Frontiers in Neuroscience, 8, 244.
Kassam, K. S., Markey, A. R., Cherkassky, V. L., Loewenstein, G., & Just, M. A. (2013). Identifying emotions on the basis of neural activation. PloS One, 8(6), e66032.
Kim, M.-K., Kim, M., Oh, E., & Kim, S.-P. (2013). A review on the computational methods for emotional state estimation from the human EEG. Computational and Mathematical Methods in Medicine, 2013, 573734.
Koelstra, S., & Patras, I. (2013). Fusion of facial expressions and EEG for implicit affective tagging. Image and Vision Computing, 31(2), 164-174.
Kübler, A., Holz, E. M., Riccio, A., Zickler, C., Kaufmann, T., Kleih, S. C., Staiger-Sälzer, P., Desideri, L., Hoogerwerf, E. J., & Mattia, D. (2014). The user-centered design as novel perspective for evaluating the usability of BCI-controlled applications. PloS One, 9(12), e112392.
Kübler, A., Neumann, N., Wilhelm, B., Hinterberger, T., & Birbaumer, N. (2004). Predictability of brain-computer communication. Journal of Psychophysiology, 18(2/3), 121-129.
Lang, P. J., Bradley, M. M., & Cuthbert, B. N. (1999). International affective picture system (IAPS): Instruction manual and affective ratings. The Center for Research in Psychophysiology, University of Florida.
Lee, Y.-Y., & Hsieh, S. (2014). Classifying different emotional states by means of EEG-based functional connectivity patterns. PloS One, 9(4), e95415.
Lemm, S., Blankertz, B., Dickhaus, T., & Müller, K.-R. (2011). Introduction to machine learning for brain imaging. NeuroImage, 56(2), 387-399.
Liberati, A., Altman, D. G., Tetzlaff, J., Mulrow, C., Gøtzsche, P. C., Ioannidis, J. P., Clarke, M., Devereaux, P. J., Kleijnen, J., & Moher, D. (2009). The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: Explanation and elaboration. Annals of Internal Medicine, 151(4), W-65.
Liberati, G., Dalboni da Rocha, J., Veit, R., von Arnim, C., Jenner, A., Lulé, D., Ludolph, A. C., Raffone, A., Olivetti Belardinelli, M., & Sitaram, R. (2013, September). Development of a binary fMRI-BCI for Alzheimer patients: A semantic conditioning paradigm using affective unconditioned stimuli. In Affective Computing and Intelligent Interaction, pp. 838-842.
Liberati, G., Dalboni da Rocha, J. L., van der Heiden, L., Raffone, A., Birbaumer, N., Olivetti Belardinelli, M., & Sitaram, R. (2012). Toward a brain-computer interface for Alzheimer's disease patients by combining classical conditioning and brain state classification. Journal of Alzheimer's Disease, 29, 1-10.
Liberati, G., Pizzimenti, A., Simione, L., Riccio, A., Schettini, F., Inghilleri, M., Mattia, D., & Cincotti, F. (2015). Developing brain-computer interfaces from a user-centered perspective: Assessing the needs of persons with amyotrophic lateral sclerosis, caregivers, and professionals. Applied Ergonomics, 50, 139-146.
Liberati, G., Veit, R., Dalboni da Rocha, J., Kim, S., Lulé, D., von Arnim, C., Raffone, A., Olivetti Belardinelli, M., Birbaumer, N., & Sitaram, R. (2012). Combining classical conditioning and brain-state classification for the development of a brain-computer interface (BCI) for Alzheimer's patients. Alzheimer's and Dementia, 8(4), P515.
Lin, Y.-P., Yang, Y.-H., & Jung, T.-P. (2014). Fusion of electroencephalographic dynamics and musical contents for estimating emotional responses in music listening. Frontiers in Neuroscience, 8, 94.
Lotte, F., Congedo, M., Lécuyer, A., Lamarche, F., & Arnaldi, B. (2007). A review of classification algorithms for EEG-based brain-computer interfaces. Journal of Neural Engineering, 4(2), R1-R13.
Mahalanobis, P. C. (1936). On the generalized distance in statistics. Proceedings of the National Institute of Sciences, 2, 49-55.
McLachlan, G. (2004). Discriminant analysis and statistical pattern recognition (Vol. 544). John Wiley & Sons.
Mikhail, M., El-Ayat, K., Coan, J. A., & Allen, J. J. (2013). Using minimal number of electrodes for emotion detection using brain signals produced from a new elicitation technique. International Journal of Autonomous and Adaptive Communications Systems, 6(1), 80-97.
Moghimi, S., Kushki, A., Guerguerian, A. M., & Chau, T. (2012). Characterizing emotional response to music in the prefrontal cortex using near infrared spectroscopy. Neuroscience Letters, 525(1), 7-11.
Moghimi, S., Kushki, A., Power, S., Guerguerian, A. M., & Chau, T. (2012). Automatic detection of a prefrontal cortical response to emotionally rated music using multi-channel near-infrared spectroscopy. Journal of Neural Engineering, 9(2), 026022.
Moher, D., Liberati, A., Tetzlaff, J., & Altman, D. G. (2009). Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. Annals of Internal Medicine, 151(4), 264-269.
Morgane, P. J., Galler, J. R., & Mokler, D. J. (2005). A review of systems and networks of the limbic forebrain/limbic midbrain. Progress in Neurobiology, 75(2), 143-160.
Murugappan, M., Ramachandran, N., & Sazali, Y. (2010). Classification of human emotion from EEG using discrete wavelet transform. Journal of Biomedical Science and Engineering, 3(4), 390-396.
Murugappan, M., Rizon, M., Nagarajan, R., Yaacob, S., Zunaidi, I., & Hazry, D. (2007). EEG feature extraction for classifying emotions using FCM and FKM. International Journal of Computers and Communications, 1(2), 21-25.
Mühl, C., Allison, B., Nijholt, A., & Chanel, G. (2014a). Affective brain-computer interfaces: Special issue editorial. Brain-Computer Interfaces, 1(2), 63-65.
Mühl, C., Allison, B., Nijholt, A., & Chanel, G. (2014b). A survey of affective brain computer interfaces: Principles, state-of-the-art, and challenges. Brain-Computer Interfaces, 1(2), 66-84.
Mühl, C., Brouwer, A.-M., van Wouwe, N., van den Broek, E. L., Nijboer, F., & Heylen, D. K. J. (2011, September). Modality-specific affective responses and their implications for affective BCI. Proceedings of the Fifth International Brain-Computer Interface Conference 2011, pp. 120-123.
Mühl, C., Heylen, D., & Nijholt, A. (2014). Affective brain-computer interfaces: Neuroscientific approaches to affect detection. The Oxford Handbook of Affective Computing, 217.
Müller, M. M., Keil, A., Gruber, T., & Elbert, T. (1999). Processing of affective pictures modulates right-hemispheric gamma band EEG activity. Clinical Neurophysiology, 110(11), 1913-1920.
Nicolas-Alonso, L. F., & Gomez-Gil, J. (2012). Brain computer interfaces, a review. Sensors, 12(2), 1211-1279.
Nijboer, F., Birbaumer, N., & Kübler, A. (2010). The influence of psychological state and motivation on brain-computer interface performance in patients with amyotrophic lateral sclerosis - a longitudinal study. Frontiers in Neuroscience, 4(55).
Nijboer, F., Plass-Oude Bos, D., Blokland, Y., van Wijk, R., & Farquhar, J. (2014). Design requirements and potential target users for brain-computer interfaces - recommendations from rehabilitation professionals. Brain-Computer Interfaces, 1(1), 50-61.
Nijholt, A., & Heylen, D. K. J. (2013). Editorial (to: Special issue on affective brain-computer interfaces). Brain-Computer Interfaces, 1(1), 63-65.
Pasqualotto, E., Matuz, T., Federici, S., Ruf, C. A., Bartl, M., Olivetti Belardinelli, M., Birbaumer, N., & Halder, S. (2015). Usability and workload of access technology for people with severe motor impairment: A comparison of brain-computer interfacing and eye tracking. Neurorehabilitation and Neural Repair (in press).
Pasqualotto, E., Simonetta, A., Federici, S., & Olivetti Belardinelli, M. (2009, September). Usability evaluation of BCIs. In Assistive Technology from Adapted Equipment to Inclusive Environments - AAATE 2009.
Petrantonakis, P. C., & Hadjileontiadis, L. J. (2010). Emotion recognition from brain signals using hybrid adaptive filtering and higher order crossings analysis. IEEE Transactions on Affective Computing, 1(2), 81-97.
Pfurtscheller, G., Allison, B. Z., Brunner, C., Bauernfeind, G., Solis-Escalante, T., Scherer, R., Zander, T. O., Mueller-Putz, G., Neuper, C., & Birbaumer, N. (2010). The hybrid BCI. Frontiers in Neuroscience.
Picard, R. W. (2000). Affective computing. MIT Press.
Putze, F., & Schultz, T. (2014). Adaptive cognitive technical systems. Journal of Neuroscience Methods, 234, 108-115.
Reuderink, B., Poel, M., & Nijholt, A. (2011). The impact of loss of control on movement BCIs. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 19(6), 628-637.
Russell, J. A., & Mehrabian, A. (1977). Evidence for a three-factor theory of emotions. Journal of Research in Personality, 11(3).
Schettini, F., Riccio, A., Simione, L., Liberati, G., Caruso, M., Frasca, V., Calabrese, B., Mecella, M., Pizzimenti, A., Inghilleri, M., Mattia, D., & Cincotti, F. (2015). Assistive device with conventional, alternative, and brain-computer interface inputs to enhance interaction with the environment for people with amyotrophic lateral sclerosis: A feasibility and usability study. Archives of Physical Medicine and Rehabilitation, 96(3 Suppl), S46-S53.
Schmidt, L. A., & Trainor, L. J. (2001). Frontal brain electrical activity (EEG) distinguishes valence and intensity of musical emotions. Cognition & Emotion, 15(4), 487-500.
Simon, N., Käthner, I., Ruf, C. A., Pasqualotto, E., Kübler, A., & Halder, S. (2014). An auditory multiclass brain-computer interface with natural stimuli: Usability evaluation with healthy participants and a motor impaired end user. Frontiers in Human Neuroscience, 8, 1039.
Sitaram, R., Lee, S., Ruiz, S., Rana, M., Veit, R., & Birbaumer, N. (2011). Real-time support vector classification and feedback of multiple emotional brain states. NeuroImage, 56(2), 753-765.
Soleymani, M., Pantic, M., & Pun, T. (2012). Multimodal emotion recognition in response to videos. IEEE Transactions on Affective Computing, 3(2), 211-223.
Stikic, M., Johnson, R. R., Tan, V., & Berka, C. (2014). EEG-based classification of positive and negative affective states. Brain-Computer Interfaces, 1(2), 99-112.
Strait, M., & Scheutz, M. (2014). What we can and cannot (yet) do with functional near infrared spectroscopy. Frontiers in Neuroscience, 8, 117.
Tai, K., & Chau, T. (2009). Single-trial classification of NIRS signals during emotional induction tasks: Towards a corporeal machine interface. Journal of Neuroengineering and Rehabilitation, 6, 39.
Van Erp, J. B., Brouwer, A.-M., & Zander, T. O. (2015). Introduction to using neurophysiological signals that reflect cognitive or affective state. Frontiers in Neuroscience, 9, 193.
Van Erp, J. B., Lotte, F., & Tangermann, M. (2012). Brain-computer interfaces: Beyond medical applications. Computer, (4), 26-34.
Vapnik, V., Golowich, S., & Smola, A. (1997). Support vector method for function approximation, regression estimation, and signal processing. In M. C. Mozer, M. I. Jordan, & T. Petsche (Eds.), Advances in Neural Information Processing Systems. Cambridge, MA: MIT Press.
Vytal, K., & Hamann, S. (2010). Neuroimaging support for discrete neural correlates of basic emotions: A voxel-based meta-analysis. Journal of Cognitive Neuroscience, 22(12), 2864-2885.
Yoon, H. J., & Chung, S. Y. (2013). EEG-based emotion estimation using Bayesian weighted-log-posterior function and perceptron convergence algorithm. Computers in Biology and Medicine, 43(12), 2230-2237.
Yuvaraj, R., Murugappan, M., Ibrahim, N. M., Omar, M. I., Sundaraj, K., Mohamad, K., Palaniappan, R., & Satiyan, M. (2014). Emotion classification in Parkinson's disease by higher-order spectra and power spectrum features using EEG signals: A comparative study. Journal of Integrative Neuroscience, 13(1), 89-120.
Zander, T. O., Lehne, M., Ihme, K., Jatzev, S., Correia, J., Kothe, C., Picht, B., & Nijboer, F. (2011). A dry EEG-system for scientific research and brain-computer interfaces. Frontiers in Neuroscience, 5, 53.
Zickler, C., Halder, S., Kleih, S. C., Herbert, C., & Kübler, A. (2013). Brain painting: Usability testing according to the user-centered design in end users with severe motor paralysis. Artificial Intelligence in Medicine, 59(2), 99-110.
... Many reviews have focused on emotion recognition and classification based on electrophysiological signals. For instance, Liberati et al. selected studies that aimed to identify affective states from electrophysiological signals and emphasized their application on brain-computer interfaces (BCIs) [18]. Usually, in clinical interventions, an optimal BCI records and analyzes electrophysiological activity in real time to provide feedback to the system user, including emotional processing. ...
... In general, the experimental setups of the studies were based on the technical configurations summarized in Table 4. Different methods were identified to analyze EEG activity. The first one focused on five typical frequency bands: delta (0-3 Hz), theta (4-7 Hz), alpha (8)(9)(10)(11)(12), beta (13)(14)(15)(16)(17)(18)(19)(20) and gamma (>20 Hz). Most studies concluded that lower frequencies were effectively associated with emotional processing and regulation. ...
Full-text available
The electrophysiological basis of emotion regulation (ER) has gained increased attention since efficient emotion recognition and ER allow humans to develop high emotional intelligence. However, no methodological standardization has been established yet. Therefore, this paper aims to provide a critical systematic review to identify experimental methodologies that evoke emotions and record, analyze and link electrophysiological signals with emotional experience by statistics and artificial intelligence, and lastly, define a clear application of assessing emotion processing. A total of 42 articles were selected after a search based on six scientific browsers: Web of Science, EBSCO, PubMed, Scopus, ProQuest and ScienceDirect during the first semester of 2020. Studies were included if (1) electrophysiological signals recorded on human subjects were correlated with emotional recognition and/or regulation; (2) statistical models, machine or deep learning methods based on electrophysiological signals were used to analyze data. Studies were excluded if they met one or more of the following criteria: (1) emotions were not described in terms of continuous dimensions (valence and arousal) or by discrete variables, (2) a control group or neutral state was not implemented, and (3) results were not obtained from a previous experimental paradigm that aimed to elicit emotions. There was no distinction in the selection whether the participants presented a pathological or non-pathological condition, but the condition of subjects must have been efficiently detailed for the study to be included. The risk of bias was limited by extracting and organizing information on spreadsheets and participating in discussions between the authors. However, the data size selection, such as the sample size, was not considered, leading to bias in the validity of the analysis. 
This systematic review is presented as a consulting source to accelerate the development of neuroengineering-based systems to regulate the trajectory of emotional experiences early on. Keywords: electrophysiological signals; emotional intelligence; emotion recognition; emotion regulation; methodology
... Frontal Alpha Asymmetry (FAA) is used as an index of EEG in this study. There are a variety 94 of EEG indexes, and many reports have been published on the relationship between EEG and 95 emotion (Coan & Allen, 2004, Hajcak et al., 2010, Liberati et al., 2015. Of these, FAA is an 96 important index often used in basic and applied research as an indicator representing an 97 aspect of emotion (Davidson, 2004). ...
In this research to assess emotions from biometric signals, participants are asked to evaluate the emotions they subjectively experienced in order to confirm whether the assumed emotions were actually elicited. However, the evaluation of emotions is not routinely performed in daily life, and it is possible that this evaluation may alter biological signals. In fMRI studies, evaluation has been shown to activate the amygdala, which is said to be related to emotional expression. However, electroencephalography (EEG) studies do not take into consideration the effects of such evaluations, and it is unclear how these evaluations affect emotion-related brain activity observed in EEG. We hypothesized that emotion evaluations would amplify emotions and c alter Frontal Alpha Asymmetry (FAA), which has been shown to be related to emotional pleasantness and unpleasantness. We suspect this is because in order to evaluate one's emotions, one must pay attention to one's internal state, and this self-focused attention has been found to enhance the subjective emotional experience. We measured a 29-channel EEG when presented with unpleasant and highly arousing images from the International Affective Picture System (IAPS) from 40 healthy male and female participants. The results revealed that FAA was significantly lower in the condition in which participants rated their own emotions compared to the condition in which they did not. Similar to fMRI studies, this result indicates that emotion-related brain activity is amplified on an EEG. This paper provides a cautionary note regarding the use of such evaluations in EEG emotion estimation studies.
... The systematic review method is helpful in outlining the boundaries of knowledge [53] and was applied to identify and critically analyse contributions to the research topic [54].Using this method, a researcher is guided in the systematization and sharing of the results about a specific body of literature [55,56]. Moreover, this methodology is useful when the purpose of the study is to map a research field, identify research gaps, and develop an agenda for further research [57]. ...
Full-text available
Nowadays, the world is facing numerous sustainability challenges and the modern food system is called to innovate processes or products in order to remain competitive within the market, as well as answering to strategic government guidelines for a more sustainable food supply chain. This study aims to investigate what the main research routes of a sustainable food supply chain are, explored by the international scientific panorama, with a view for providing companies with a framework of the sustainability paths that can be followed, and, to researchers, gaps and future research routes to explore. A systematic review method is adopted through bibliometric analysis and results were obtained with VOSViewer software support. Descriptive and thematic analyses allowed us to discover the bibliometric characteristics of the sample, the main specific topics and the related research routes already addressed in sustainable food supply chain, the main food supply chain models studied in association with sustainability and the effort employed by academia to investigate the three sustainability dimensions: environmental, economic and social. Concluding, the research field of sustainability in the food supply chain is focused on management issues able to generate impacts on process, systems, practices, production and quality.
... However, the cognitive load and emotional states such as motivation, frustration and distractions greatly affected the EEG recording of children [43]. In the follow-up research, we plan to employ the TL-based feedback training method for neurofeedback training in children's affective states [44] [45]. 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 The offline simulation feedback graph, where the small rectangles represented the"original" previous centers of mass. s2 and s6 belong to good-and low-performance subjects, respectively. ...
Full-text available
Objective: Feedback training is a practical approach to brain-computer interface (BCI) end-users learning to modulate their sensorimotor rhythms (SMR). BCI self-regulation learning has been shown to be influenced by subjective psychological factors, such as motivation. However, few studies have taken into account the users' self-motivation as additional guidance for the cognitive process involved in BCI learning. In this study we tested a transfer learning (TL) feedback method designed to increase self-motivation by providing information about past performance. Approach: EEG signals from the previous runs were affine transformed and displayed as points on the screen, along with the newly recorded EEG signals in the current run, giving the subjects a context for self-motivation. Subjects were asked to separate the feedback points for the current run under the display of the separability of prior training. We conducted a between-subject feedback training experiment, in which 24 healthy SMR-BCI naive subjects were trained to imagine left- and right-hand movements. The participants were provided with either TL feedback or typical cursorbar (CB) feedback (control condition), for three sessions on separate days. Main results: The behavioral results showed an increased challenge and stable mastery confidence, suggesting that subjects' motivation grew as the feedback training went on. The EEG results showed favorable overall training effects with TL feedback in terms of the class distinctiveness and EEG discriminancy. Performance was 28.5% higher in the third session than in the first. 41.7% of the subjects were learners including not only low-performance subjects, but also good-performance subjects who might be affected by the ceiling effect. Subjects were able to control BCI with TL feedback with a higher performance of 60.5% during the last session compared to CB feedback. 
Significance: The present study demonstrated that the proposed TL feedback method boosted psychological engagement through the self-motivated context, and further allowed subjects to modulate SMR effectively. The proposed TL feedback method also provided an alternative to typical CB feedback.
... Semi-nal advances are expected with regard to affective BCIs, which are designed to recognize, stimulate, and influence their users' affective states (Daly et al. 2016). Currently, however, affective BCIs cannot be applied in nonlaboratory settings or in real time (Liberati, Federici, and Pasqualotto 2015). Advances have been made recently, however, in recognizing and classifying discrete emotions (e.g., fear or surprise) (Lee and Hsieh 2014;Jahromy, Bajoulvand, and Daliri 2019;Shu et al. 2018). ...
In the present article we examine the anthropological implications of “intelligent” neurotechnologies (INTs). For this purpose, we first give an introduction to current developments of INTs by specifying their central characteristics. We then present and discuss traditional anthropological concepts such as the “homo faber,” the concept of humans as “deficient beings,” and the concept of the “cyborg,” questioning their descriptive relevance regarding current neurotechnological applications. To this end, we relate these anthropological concepts to the characteristics of INTs elaborated before. As we show, the explanatory validity of the anthropological concepts analyzed in this article vary significantly. While the concept of the homo faber, for instance, is not capable of adequately describing the anthropological implications of new INTs, the cyborg proves to be capable of grasping several aspects of today’s neurotechnologies. Nevertheless, alternative explanatory models are needed in order to capture the new characteristics of INTs in their full complexity.
... This systematic review was performed following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) 10 and was registered in the International Prospective Register of Systematic Reviews-PROSPERO CRD42019110575. ...
The objective of this systematic review is to assess the risk of postoperative bleeding in oral surgery for implant placement in individuals taking antithrombotics (i.e., anticoagulants and/or antiplatelet agents). A literature search was performed in PubMed (MEDLINE), Web of Science, Scopus, and EMBASE databases for articles published until August 2020, with no date restriction, and manually completed. We included prospective clinical studies that provided information regarding the presence of an experimental group (i.e., implant placement), a control group (patients not under treatment with antithrombotics), and a well-established protocol for evaluating bleeding. Meta-analysis determined the risk of bleeding during the placement of implants in antithrombotic-treated patients. Of the 756 potentially eligible articles, 5 were included in the analysis with 4 ranked as high and 1 as medium quality. Antithrombotic treatment comprised the following drug classes: (1) anticoagulants: vitamin K antagonists, (2) nonvitamin K antagonist oral anticoagulants, (3) low-molecular-weight heparin, and (4) antiplatelet agents (not specified). The results suggest that the risk of bleeding is not substantially higher in antithrombotic-treated patients (odds ratio = 2.19; 95% confidence interval: 0.88–5.44, p = 0.09) compared with nontreated patients. This systematic review suggests that the absolute risk is low and there is no need to discontinue or alter the dose of the antithrombotic treatment for implant placement surgery.
In this study, a hybrid brain-computer interface (BCI) system combining P300 potential and emotion patterns was proposed to improve the performance of awareness detection. Two video clips were flashed randomly to evoke the P300 potential, while a laughing or crying video clip was used to induce the corresponding emotion pattern. The subjects were asked to concentrate on the laughing or crying video clip cued by the instruction and to count the flashes of the corresponding video clip. Two layers of classification were developed. In the first layer, P300 detection and emotion recognition were performed separately using two support vector machine (SVM) classifiers. Specifically, the activation, spatial and connection patterns were fused in emotion recognition. In the second layer, the SVM scores of P300 detection and emotion recognition were fed into another SVM classifier to determine which video clip the subjects responded to. Six healthy subjects and eight patients with disorders of consciousness (DOC) were involved in the command-following experiment. The results showed that the accuracy of the hybrid BCI system was better than those of the single-modality systems. Furthermore, three patients were able to perform tasks (66%-72%) using our hybrid BCI, which indicated their residual awareness and emotion-related abilities.
Conference Paper
Affective states play an important role in human behavior and decision-making. In recent years, several affective brain-computer interface (aBCI) studies have focused on developing an emotion classifier based on elicited emotions within the user. However, it is difficult to achieve consistency in elicited emotions across populations, which can lead to dataset imbalances. The experimental design presented in this paper seeks to avoid consistency issues by asking the participant to classify the emotion portrayed in images of facial expressions, rather than their own emotions. Priming is also a common technique used in psychology studies that is known to influence emotional perception. To improve participant accuracy, we investigated matching and mis-matched word priming for the facial expression images. Electro-encephalogram (EEG) data were used to generate images fed into a classifier based on the Big Transfer model, BiT-M R101x1. The primed images resulted in higher classification accuracy overall. Further, by building different classifier models for both mis-matched primed images and matching primed images, we were able to achieve classification accuracies above 90%. We also provided the classifier with the true labels of the photographs instead of the labels generated by the participants and achieved similar results. The experimental paradigm of measuring brain activity during the emotional classification of another individual provides consistently high, balanced classification accuracies.
Full-text available
The brain is involved in the registration, evaluation, and representation of emotional events and in the subsequent planning and execution of appropriate actions. Novel interface technologies—so-called affective brain-computer interfaces (aBCI)—can use this rich neural information, occurring in response to affective stimulation, for the detection of the user’s affective state. This chapter gives an overview of the promises and challenges that arise from the possibility of neurophysiology-based affect detection, with a special focus on electrophysiological signals. After outlining the potential of aBCI relative to other sensing modalities, the reader is introduced to the neurophysiological and neurotechnological background of this interface technology. Potential application scenarios are situated in a general framework of brain-computer interfaces. Finally, the main scientific and technological challenges that have yet to be solved on the way toward reliable affective brain-computer interfaces are discussed.
Full-text available
Systematic reviews should build on a protocol that describes the rationale, hypothesis, and planned methods of the review; few reviews report whether a protocol exists. Detailed, well-described protocols can facilitate the understanding and appraisal of the review methods, as well as the detection of modifications to methods and selective reporting in completed reviews. We describe the development of a reporting guideline, the Preferred Reporting Items for Systematic reviews and Meta-Analyses for Protocols 2015 (PRISMA-P 2015). PRISMA-P consists of a 17-item checklist intended to facilitate the preparation and reporting of a robust protocol for the systematic review. Funders and those commissioning reviews might consider mandating the use of the checklist to facilitate the submission of relevant protocol information in funding applications. Similarly, peer reviewers and editors can use the guidance to gauge the completeness and transparency of a systematic review protocol submitted for publication in a journal or other medium.
In this paper we review classification algorithms used to design brain–computer interface (BCI) systems based on electroencephalography (EEG). We briefly present the commonly employed algorithms and describe their critical properties. Based on the literature, we compare them in terms of performance and provide guidelines to choose the suitable classification algorithm(s) for a specific BCI.
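To make concrete the kind of pipeline such reviews compare, here is a minimal sketch of one of the simplest classifiers applied to EEG features: a nearest-centroid rule over band-power feature vectors. Everything here is illustrative, not taken from the cited review: the feature values, the channel/band layout, and the class labels are hypothetical.

```python
import math

def train_centroids(features, labels):
    """Average the feature vectors belonging to each class label."""
    sums, counts = {}, {}
    for x, y in zip(features, labels):
        sums.setdefault(y, [0.0] * len(x))
        counts[y] = counts.get(y, 0) + 1
        sums[y] = [s + v for s, v in zip(sums[y], x)]
    return {y: [s / counts[y] for s in sums[y]] for y in sums}

def classify(centroids, x):
    """Assign x to the class with the nearest (Euclidean) centroid."""
    def dist(c):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(c, x)))
    return min(centroids, key=lambda y: dist(centroids[y]))

# Hypothetical band-power features [alpha, beta] for trials of two mental states
X = [[4.0, 1.0], [3.8, 1.2], [1.0, 3.9], [1.2, 4.1]]
y = ["rest", "rest", "task", "task"]
model = train_centroids(X, y)
print(classify(model, [3.9, 1.1]))  # a rest-like trial
```

Reviews such as the one above compare this family of approaches against linear discriminants, support vector machines, and neural networks, typically trading off accuracy against training-data requirements and robustness to the non-stationarity of EEG.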
Using recent regional brain activation/emotion models as a theoretical framework, we examined whether the pattern of regional EEG activity distinguished emotions induced by musical excerpts which were known to vary in affective valence (i.e., positive vs. negative) and intensity (i.e., intense vs. calm) in a group of undergraduates. We found that the pattern of asymmetrical frontal EEG activity distinguished valence of the musical excerpts. Subjects exhibited greater relative left frontal EEG activity to joy and happy musical excerpts and greater relative right frontal EEG activity to fear and sad musical excerpts. We also found that, although the pattern of frontal EEG asymmetry did not distinguish the intensity of the emotions, the pattern of overall frontal EEG activity did, with the amount of frontal activity decreasing from fear to joy to happy to sad excerpts. These data appear to be the first to distinguish valence and intensity of musical emotions on frontal electrocortical measures.
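The asymmetry measure underlying such findings is commonly computed as the difference of log-transformed alpha power between homologous right and left frontal electrodes (e.g., F4 and F3). The sketch below shows that computation on hypothetical power values; the electrode pair and the numbers are illustrative assumptions, not data from the study.

```python
import math

def frontal_asymmetry(alpha_left, alpha_right):
    """
    Frontal alpha asymmetry index: ln(right) - ln(left) alpha power.
    Because alpha power is inversely related to cortical activity,
    positive values indicate relatively greater LEFT frontal activity,
    the pattern associated above with positively valenced excerpts.
    """
    return math.log(alpha_right) - math.log(alpha_left)

# Hypothetical alpha-band power at a frontal pair such as F3/F4
print(frontal_asymmetry(alpha_left=2.0, alpha_right=3.0))  # positive: left-dominant activity
print(frontal_asymmetry(alpha_left=3.0, alpha_right=2.0))  # negative: right-dominant activity
```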
The central question of this Frontiers Research Topic is: What can we learn from brain and other physiological signals about an individual's cognitive and affective state and how can we use this information? This question reflects three important issues which are addressed by the 22 articles in this volume: (1) the combination of central and peripheral neurophysiological measures; (2) the diversity of cognitive and affective processes reflected by these measures; and (3) how to apply these measures in real world applications.
Estimating cognitive or affective state from neurophysiological signals and designing applications that make use of this information requires expertise in many disciplines such as neurophysiology, machine learning, experimental psychology, and human factors. This makes it difficult to perform research that is strong in all its aspects as well as to judge a study or application on its merits. On the occasion of the special topic "Using neurophysiological signals that reflect cognitive or affective state" we here summarize often occurring pitfalls and recommendations on how to avoid them, both for authors (researchers) and readers. They relate to defining the state of interest, the neurophysiological processes that are expected to be involved in the state of interest, confounding factors, inadvertently "cheating" with classification analyses, insight on what underlies successful state estimation, and finally, the added value of neurophysiological measures in the context of an application. We hope that this paper will support the community in producing high quality studies and well-validated, useful applications.
Systematic reviews and meta-analyses are essential to summarize evidence relating to efficacy and safety of health care interventions accurately and reliably. The clarity and transparency of these reports, however, is not optimal. Poor reporting of systematic reviews diminishes their value to clinicians, policy makers, and other users. Since the development of the QUOROM (QUality Of Reporting Of Meta-analysis) Statement, a reporting guideline published in 1999, there have been several conceptual, methodological, and practical advances regarding the conduct and reporting of systematic reviews and meta-analyses. Also, reviews of published systematic reviews have found that key information about these studies is often poorly reported. Realizing these issues, an international group that included experienced authors and methodologists developed PRISMA (Preferred Reporting Items for Systematic reviews and Meta-Analyses) as an evolution of the original QUOROM guideline for systematic reviews and meta-analyses of evaluations of health care interventions. The PRISMA Statement consists of a 27-item checklist and a four-phase flow diagram. The checklist includes items deemed essential for transparent reporting of a systematic review. In this Explanation and Elaboration document, we explain the meaning and rationale for each checklist item. For each item, we include an example of good reporting and, where possible, references to relevant empirical studies and methodological literature. The PRISMA Statement, this document, and the associated Web site should be helpful resources to improve reporting of systematic reviews and meta-analyses.
By focus group methodology, we examined the opinions and requirements of persons with ALS, their caregivers, and health care assistants with regard to developing a brain-computer interface (BCI) system that fulfills the user's needs. Four overarching topics emerged from this analysis: 1) lack of information on BCI and its everyday applications; 2) importance of a customizable system that supports individuals throughout the various stages of the disease; 3) relationship between affectivity and technology use; and 4) importance of individuals retaining a sense of agency. These findings should be considered when developing new assistive technology. Moreover, the BCI community should acknowledge the need to bridge experimental results and its everyday application.