Usability and Workload of Access Technology for People With Severe Motor Impairment: A Comparison of Brain-Computer Interfacing and Eye Tracking

Emanuele Pasqualotto, PhD (1), Tamara Matuz, PhD (2), Stefano Federici, PhD (3,4), Carolin A. Ruf, PhD (2), Mathias Bartl (2), Marta Olivetti Belardinelli, MA (4), Niels Birbaumer, PhD (2,5), and Sebastian Halder, PhD (6,7)

1 Université Catholique de Louvain, Louvain-la-Neuve, Belgium
2 Eberhard Karls Universität, Tübingen, Germany
3 University of Perugia, Perugia, Italy
4 Sapienza Università di Roma, Rome, Italy
5 Istituto di Ricovero e Cura a Carattere Scientifico (IRCCS), Venezia Lido, Italy
6 Universität Würzburg, Würzburg, Germany
7 National Rehabilitation Center for Persons with Disabilities, Tokorozawa, Japan

Corresponding Author: Emanuele Pasqualotto, PhD, Université Catholique de Louvain, Avenue Hippocrate 54, bte B1.54.09, 1200 Brussels, Belgium. Email: emanuele.pasqualotto@uclouvain.be

Neurorehabilitation and Neural Repair, 1-8. © The Author(s) 2015. Reprints and permissions: sagepub.com/journalsPermissions.nav. DOI: 10.1177/1545968315575611. nnr.sagepub.com

Original Research Articles

Abstract

Background. Eye trackers are widely used among people with amyotrophic lateral sclerosis, and their benefits to quality of life have been previously shown. In contrast, brain-computer interfaces (BCIs) are still a relatively novel technology, which also serves as an access technology for people with severe motor impairment. Objective. To compare a visual P300-based BCI and an eye tracker in terms of information transfer rate (ITR), usability, and cognitive workload in users with motor impairments. Methods. Each participant performed 3 spelling tasks, over 4 total sessions, using an Internet browser controlled by a spelling interface suitable for use with either the BCI or the eye tracker. At the end of each session, participants evaluated the usability and cognitive workload of the system. Results. ITR and System Usability Scale (SUS) scores were higher for the eye tracker (Wilcoxon signed-rank test: ITR T = 9, P = .016; SUS T = 12.50, P = .035). Cognitive workload was higher for the BCI (T = 4; P = .003). Conclusions. Although BCIs could be potentially useful for people with severe physical disabilities, we showed that the usability of BCIs based on the visual P300 remains inferior to eye tracking. We suggest that future research on visual BCIs should use eye tracking-based control as a comparison to evaluate performance or focus on nonvisual paradigms for persons who have lost gaze control.

Keywords
BCI, brain-computer interface, eye tracking, usability, cognitive workload, assistive technology, ALS
Introduction
Because of degenerative neuromuscular diseases or neurological disorders, such as amyotrophic lateral sclerosis (ALS), persons with severe physical disabilities can gradually lose control of speech muscles and limbs and, consequently, the ability to communicate with their voice or with conventional assistive devices.1 Self-expression is fundamental for quality of life,2 and lack of communication can result in restrictions of participation, as defined by the International Classification of Functioning, Disability and Health.
By using gaze, or pupil size, eye trackers enable users to communicate or control devices.3 These systems require only a short training period4 and result in a low workload.5 In a comparison between an eye tracker and a single-switch scanning system,6 participants with ALS reported less fatigue and faster access when using the eye tracker. The main difficulty in using eye trackers as an access technology (AT) is the so-called Midas touch problem.3 This refers to the fact that gaze direction is not always related to the focus of attention, causing users to select a command against their will.
People with ALS were involved only in a small number of studies regarding the assessment of eye trackers.2,5,7 Calvo et al5 found significant improvement in the quality of life of people with ALS after being provided with an eye tracker.
The participants in the study were able to communicate independently and rated the communication as easier, faster, and less effortful than before. Ball et al7 investigated the acceptance, training, and extended use patterns of an eye tracker in a group of 15 people with ALS. The capacity of the users for social interactions increased considerably. Indeed, the system was used not only for face-to-face communication, but also for many other functions, such as group communication (43%), phone calls (71%), e-mail (79%), and Internet access (86%).
Independent of motor inputs, a brain-computer interface (BCI) provides a direct connection between the brain and any device capable of receiving brain signals. Most BCI systems use electroencephalogram (EEG) signals and require users to intentionally control specific features of their own brain activity, such as slow cortical potentials or the sensorimotor rhythm.8,9 The P300 Speller10 has been successfully used by people with severe motor impairment.11-14 The P300 is an event-related potential, recorded using EEG, evoked by a rare, task-relevant stimulus,15 with a latency between 200 and 700 ms, and is often related to attention. P300 BCIs do not require users to learn to modulate their brain response.9
In a typical P300 paradigm, a participant is presented with a 6 × 6 alphanumeric matrix, where each row and each column flashes randomly at a fixed interval, and the user selects the desired character by focusing attention on the corresponding cell. This combined mechanism of focused attention and random flashing makes the single matrix cell a rare, task-relevant stimulus eliciting the P300 peak response. The P300 Speller has been adapted to control wheelchairs,16 to control real and virtual environments,17 to browse the Internet,18 and to paint.19
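To make the paradigm concrete, the following minimal sketch (illustrative only, not the stimulation software used in the study) simulates one stimulation sequence of a 6 × 6 speller: every row and column is intensified once in random order, and only the 2 flashes containing the attended cell are the rare, task-relevant events expected to elicit a P300.

```python
import random

ROWS, COLS = 6, 6  # 6 x 6 alphanumeric matrix

def one_sequence(target_row, target_col):
    """One full stimulation sequence: every row and column flashes once,
    in random order. Flashes containing the attended cell are marked as
    the rare, task-relevant (P300-eliciting) events."""
    stimuli = [("row", r) for r in range(ROWS)] + [("col", c) for c in range(COLS)]
    random.shuffle(stimuli)
    for kind, index in stimuli:
        is_target = (kind == "row" and index == target_row) or \
                    (kind == "col" and index == target_col)
        yield kind, index, is_target

# Example: the user attends the cell in row 2, column 3 (indices chosen arbitrarily).
for kind, index, is_target in one_sequence(2, 3):
    print(kind, index, "<- attended" if is_target else "")
```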
In a recent telephone survey of people with ALS, Huggins et al20 investigated the users' opinions and priorities on BCI design. Their study reports that the desired BCI accuracy for people with ALS is at least 90%, but accuracies reported in the literature are usually lower. Moreover, the expected spelling speed would be 15 to 19 letters per minute, whereas in the published studies, it is only about 5 letters per minute.
Through a recent literature review, we have shown that although there is a large number of studies on BCIs, most of them focus on methodological approaches, neglecting usability aspects.9,21 Most BCI studies, in fact, do not consider that users often discard assistive technology after only a few attempts and that personal factors (eg, mood, motivation, beliefs, and predispositions) should be taken into account because they can serve as both barriers and facilitators in sustaining efficient use.22,23
Although the effectiveness of BCIs in terms of character selection is assessed in most studies, the efficiency and the satisfaction from the users' perspective are not always addressed; however, they are part of a recent area of growth.12,21,24-27 Kleih et al24 and Nijboer et al12 explored the effects of motivational factors on BCI performance. Pasqualotto and colleagues21,25 compared 2 prototypes of BCIs in terms of usability and cognitive workload. Riccio et al26 investigated the influence of workload on the performance of 2 P300-based BCI applications in healthy participants. Among these studies, only the study by Zickler et al27 examined usability and cognitive workload in people with severe disabilities, although with a very limited sample. However, none of these studies compared BCIs with any other existing AT.
Our aim is to provide a full usability assessment of a P300-based BCI for controlling an Internet browser. To accomplish this aim, we compared users' performance and usability scores using a P300 BCI and an eye tracking system. More specifically, in a group of people with severe disability, we assessed and compared (1) the accuracy of correct selections during 3 Internet tasks and (2) usability indicators, such as control and cognitive workload, and users' satisfaction with both communication systems.
This study addresses the issue of whether and under which conditions a BCI (specifically, the P300-BCI-based Internet browser) could be a method of choice for users with severe physical disabilities who still have residual control over some specific muscle groups and who could, therefore, use conventional ATs. Furthermore, the study will provide insights regarding which direction future BCI research should follow in order to provide a viable alternative to conventional ATs.
Methods
We compared the user experience of controlling a BCI and an eye tracker in a within-subject design. Over 4 sessions of AT use (2 for the BCI and 2 for the eye tracker), the participants carried out 3 character selection tasks, which represent common tasks that AT users may perform when browsing the Internet. In each session, participants used one of the ATs and were asked to complete 2 questionnaires. The study was approved by the Ethics Committee of the Medical Faculty of the University of Tübingen and was performed in compliance with the Code of Ethics of the World Medical Association (Declaration of Helsinki).
Participants and Procedure
A total of 12 native German-speaking participants (4 women; mean age = 56.5 years; standard deviation = 10.07) with severe motor impairment (11 affected by ALS and 1 affected by Duchenne muscular dystrophy; see Table 1), all naive to the ATs assessed, were involved in the study. All participants had home care assistance. Measurements were performed in the participants' homes.
After obtaining informed consent, we made 4 weekly appointments, depending on the participants' availability. The order of presentation of the technologies was balanced across participants. On average, including calibration, BCI sessions lasted 4 hours and eye tracker sessions 2 hours. This difference was because of the longer time needed to prepare the EEG cap and to calibrate the BCI, as well as the fixed amount of time required by the BCI for the selection of characters. Letter selection was, in fact, constrained by the time required by the Speller to run one or more full cycles of random flashing of columns and rows, in contrast to the eye tracker, where the user could select at his or her own pace.
In the first BCI session and in the first eye tracker session, we administered the Amyotrophic Lateral Sclerosis Functional Rating Scale-Revised (ALSFRS-R).28 In each session, participants were asked to complete 3 copy-spelling tasks (as described below in the Tasks section) by using an Internet browser, which was controlled by the BCI or by the eye tracker. We used the information transfer rate (ITR) to measure performance. At the end of each session, we administered the System Usability Scale (SUS)29 and the National Aeronautics and Space Administration-Task Load Index (NASA-TLX).30
Equipment
Brain-Computer Interface. We used an IBM ThinkPad laptop to collect data, using the BCI2000 software.31 All sessions were recorded using an electrode cap (Easycap GmbH, Germany) with 16 Ag/AgCl ring electrodes (F3, Fz, F4, T7, C3, Cz, C4, T8, CP3, CP4, P3, Pz, P4, PO7, PO8, Oz), with impedance kept below 5 kΩ. The electrodes were connected to a g.USBamp amplifier (g.tec OG, Austria) with a sampling frequency of 256 Hz (bandpass: 0.1-30 Hz; notch: 48-52 Hz). Reference and ground were, respectively, on the right and left mastoid. The intensification of rows/columns of the P300 Speller (see Figure 1) lasted 62.5 ms, with an interstimulus interval of 125 ms. The number of intensification sequences varied per participant, according to their calibration, performed at each of the 2 sessions. The calibration consisted of spelling the same 2 words for everyone (Apfelkuchen and Goldfisch, German for apple pie and goldfish, respectively), 20 letters in total. We set the number of intensification sequences to the minimum number needed to reach 70% accuracy offline.
Table 1. Summary of Demographic and Clinical Status of the Participants.(a)

| ID | Sex | Age | Diagnosis | Year of Diagnosis | ALSFRS-R | Degree of Impairment | Communication | Artificial Nutrition (PEG) | Ventilation |
| 01 | F | 53 | Spinal | 2008 | 23 | Moderate | Verbal | No | No |
| 02 | M | 55 | Spinal | 2003 | 43 | Minor | Verbal | No | No |
| 03 | F | 50 | Bulbar | 2003 | 17 | Moderate | Keyboard text-to-speech | No | Noninvasive |
| 04 | M | 55 | Spinal | 1992 | 0 | LIS | Eye blink | Yes | Invasive |
| 05 | M | 66 | Spinal | 1999 | 2 | LIS | Eye blink | Yes | No |
| 06 | M | 71 | Spinal | 2005 | 11 | Major | Verbal | No | Noninvasive |
| 07 | M | 55 | Spinal | 2006 | 23 | Moderate | Verbal | No | No |
| 08 | F | 48 | Spinal | 2007 | 10 | LIS | Verbal | Yes | Noninvasive |
| 09 | M | 70 | Spinal | 2008 | 18 | Moderate | Verbal | No | Noninvasive |
| 10 | F | 54 | Spinal | 2003 | 0 | LIS | Chin joystick | Yes | Invasive |
| 11 | M | 36 | Duchenne | 1976 | 7 | LIS | Verbal | No | Noninvasive |
| 12 | M | 65 | Bulbar | 2009 | 32 | Minor | Verbal | No | No |

Abbreviations: ALSFRS-R, Amyotrophic Lateral Sclerosis Functional Rating Scale-Revised; LIS, locked-in syndrome; PEG, percutaneous endoscopic gastrostomy.
(a) Degree of impairment categories were drawn according to Kübler and Birbaumer (2008).40
Figure 1. The P300 matrix contains the complete alphabet and part of the asterisk commands needed to select a double code. Every row and column flashes randomly at a fixed interval. By focusing attention on a cell, the user can perform the selection. In this frame, the third row is intensified.
The lowest number of sequences reached was 5 and the highest was 8.
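A minimal sketch of this calibration rule follows, under the assumption that an offline scoring routine is available as a callable; the actual BCI2000 calibration pipeline is not reproduced here, and the example accuracies are made up.

```python
from typing import Callable

def pick_n_sequences(offline_accuracy: Callable[[int], float],
                     max_seq: int = 15, target: float = 0.70) -> int:
    """Return the smallest number of intensification sequences whose
    offline accuracy on the 20 calibration letters reaches the target."""
    for n_seq in range(1, max_seq + 1):
        if offline_accuracy(n_seq) >= target:
            return n_seq
    return max_seq  # fall back to the maximum if the target is never reached

# Example with made-up offline accuracies per number of sequences:
fake_acc = {1: 0.35, 2: 0.48, 3: 0.55, 4: 0.63, 5: 0.72, 6: 0.80}
print(pick_n_sequences(lambda k: fake_acc.get(k, 0.85)))  # -> 5
```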
Eye Tracker. We used a SeeTech Pro (HumanElektronik GmbH, Germany) set with a 7 × 7 grid. The SeeTech Pro is a binocular infrared system with a 32-samples/s camera. The grid locks the gaze within a cell to compensate for the limited precision. For character selection, we used a static matrix of letters modeled on the graphical appearance of the P300 Speller. Participants could select characters by gazing at the intended target and closing their eyes for 1.5 s. The visual appearance of the eye tracker interface was identical to the one used for the BCI, except for the input modality. The calibration phase, performed at both sessions, consisted of fixating 9 positions on the screen, located at the corners, the center, and the extremities of the horizontal and vertical midlines. The screen, for both the eye tracker and the BCI, was located about 50 cm from the participant.
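The selection rule can be pictured as a small state machine over the 32-samples/s gaze stream. The sketch below is illustrative only; the sample format is a hypothetical simplification, not the SeeTech Pro API.

```python
SAMPLE_RATE = 32      # samples per second
CLOSE_TIME = 1.5      # seconds of eye closure needed to confirm a selection

def detect_selections(samples):
    """`samples` yields (fixated_cell, eyes_closed) tuples at 32 Hz.
    A selection is emitted when the eyes stay closed for >= 1.5 s,
    and it refers to the cell fixated just before the closure."""
    closed = 0
    last_cell = None
    for cell, eyes_closed in samples:
        if eyes_closed:
            closed += 1
            if closed == int(CLOSE_TIME * SAMPLE_RATE):
                yield last_cell
        else:
            closed = 0
            last_cell = cell

# Example: ~1 s of fixation on "G", then 1.5 s of eye closure -> "G" is selected.
stream = [("G", False)] * 32 + [(None, True)] * 48
print(list(detect_selections(stream)))  # -> ['G']
```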
The P300 Browser. For Internet navigation, we used a newer version (see Figure 1) of the P300 Speller-based browser described by Mugler et al.18 When loading a page, the browser automatically assigns an alphabetical code to represent each hyperlink (eg, the character "A" or a combination of 2 characters such as "AB"). By means of the Speller, the user can select the code corresponding to a particular hyperlink and thus explore web pages.
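As an illustration of such a coding scheme (a sketch in the spirit of the browser described above, not its actual implementation), single letters can be assigned first and two-letter combinations used once the alphabet is exhausted:

```python
from itertools import product
import string

def link_codes(n_links):
    """Assign short alphabetical codes to the hyperlinks of a page:
    A..Z first, then two-letter combinations (AA, AB, ...)."""
    letters = string.ascii_uppercase
    codes = list(letters) + ["".join(p) for p in product(letters, repeat=2)]
    if n_links > len(codes):
        raise ValueError("too many hyperlinks for 1- and 2-letter codes")
    return codes[:n_links]

# Example: a page with 30 hyperlinks gets the codes A..Z, AA, AB, AC, AD.
print(link_codes(30))
```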
Tasks
The copy-spelling task is widely used in the BCI field.12,21,32 The general idea behind this task is to provide the user with a set of words or sentences to write with the communication interface. In each session with the BCI or the eye tracker, participants were asked to select a sequence of predefined characters in the matrix, representing common Internet tasks. Each participant performed the same tasks, with the same instructions and words to spell. However, the codes assigned by the browser to the links in the instructions were not always the same. Participants were allowed to correct errors, so trials could differ in length.

The first task consisted of checking the weather forecast using a search engine. Participants spelled the name of a German magazine in the search bar of a search engine Web site. After selecting the first result of the search, participants were asked to select the weather section. They then had time to observe the image of the forecast by selecting pause. Finally, after unpausing, they were asked to scroll the page and select the legend at the bottom. A minimum of 15 character selections was required to accomplish the task.

The second task was a search in an online encyclopedia. Starting from the encyclopedia's main page, participants performed a search for the term brain (in German, Gehirn). After checking the resulting page, participants were asked to select the section about the human brain and to scroll the page to the bottom. This task required a minimum of 11 selections.

The third task involved playing 2 songs on a Web site. Starting from the main page of a music Web site, participants performed a search for the term Jazz. The result was a list of playable songs, each 30 s in length. After selecting and listening to the sixth song on the list, they were asked to select the "Country" section and then listen to the first song. This task required a minimum of 14 selections.
Measures
Performance. We used bits per minute (or ITR) to compare the performance of the BCI and the eye tracker. Bit rate is a standard measure for communication systems and represents the amount of information communicated per unit time, depending on both speed and accuracy.33 Based on Pierce's formula,33 the number of bits conveyed by a single selection, B, is defined as

B = log2(N) + P log2(P) + (1 − P) log2[(1 − P) / (N − 1)]     (1)

where N represents the number of possible selections in the matrix and P the accuracy of character selection of the user.
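Equation (1) gives the information conveyed by a single selection; multiplying by the number of selections per minute yields bits per minute (ITR). The sketch below implements this computation; the grid size, accuracy, and selection rate in the example are illustrative values, not data from the study.

```python
import math

def bits_per_selection(n, p):
    """Eq. (1): N possible targets, P probability of a correct selection,
    with errors assumed to be spread evenly over the remaining N - 1 targets."""
    if n < 2 or p <= 0.0:
        return 0.0
    if p >= 1.0:
        return math.log2(n)
    return (math.log2(n)
            + p * math.log2(p)
            + (1 - p) * math.log2((1 - p) / (n - 1)))

def itr_bits_per_minute(n, p, selections_per_minute):
    return bits_per_selection(n, p) * selections_per_minute

# Example: a 7 x 7 grid (N = 49), 90% accuracy, 3 selections per minute.
print(round(itr_bits_per_minute(49, 0.90, 3.0), 2))  # about 13.8 bits/min
```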
Usability. We used the SUS to assess the usability of the systems. The SUS is a 10-item scale that provides a global subjective assessment of usability. It consists of 10 statements, each rated on a 5-point Likert scale that ranges from 1 (strongly disagree) to 5 (strongly agree). The SUS provides a score, ranging from 0 to 100, that can be used to compare usability. Although, as also stated by the original author of the questionnaire, usability does not exist in an absolute sense, a score of 70 has been suggested as the acceptable minimum.34
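As a reference, the standard SUS scoring procedure sums the adjusted item contributions and scales the result to 0-100; the item responses in the example are illustrative, not data from the study.

```python
def sus_score(responses):
    """Standard SUS scoring: `responses` is the list of 10 ratings (1-5,
    strongly disagree to strongly agree) in questionnaire order.
    Odd-numbered items are positively worded, even-numbered items
    negatively worded; the adjusted sum is scaled to 0-100."""
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Example: a fairly positive rating pattern lands above the suggested
# acceptability threshold of 70.
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # -> 75.0
```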
Cognitive Workload. The NASA-TLX is a multidimensional tool to evaluate subjective cognitive workload, considering its possible different sources.30 The NASA-TLX consists of 6 scales (mental demands, physical demands, temporal demands, performance, effort, and frustration level), rated in 2 stages. In the first stage, the user assigns a value to each scale. In the second stage, the user is presented with the 15 pairs obtained by combining the 6 scales and, for each pair, chooses the scale more relevant to workload. This procedure is used to assess the weight of each scale. The ratings and weights are combined to obtain the final score, ranging from 0 to 100, with 100 representing the highest workload experienced by the participant.
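A sketch of the standard weighted NASA-TLX computation follows; the ratings and pairwise choices in the example are made up and are not data from this study.

```python
from collections import Counter

SCALES = ["mental", "physical", "temporal", "performance", "effort", "frustration"]

def nasa_tlx_weighted(ratings, pairwise_choices):
    """Weighted NASA-TLX score.
    `ratings`: dict scale -> raw rating on a 0-100 scale.
    `pairwise_choices`: the 15 scales picked as more workload-relevant,
    one per pairwise comparison; each scale's weight is the number of
    times it was chosen (weights sum to 15)."""
    assert len(pairwise_choices) == 15
    weights = Counter(pairwise_choices)
    return sum(ratings[s] * weights.get(s, 0) for s in SCALES) / 15.0

# Example with made-up ratings and choices:
ratings = {"mental": 70, "physical": 20, "temporal": 55,
           "performance": 40, "effort": 65, "frustration": 50}
choices = ["mental"] * 5 + ["effort"] * 4 + ["temporal"] * 3 + \
          ["frustration"] * 2 + ["performance"] * 1  # "physical" never chosen
print(round(nasa_tlx_weighted(ratings, choices), 1))  # -> 61.0
```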
Functional Status. To investigate the relationship between functional status and performance and to monitor potential changes in the participants' functionality, we assessed functionality using the ALSFRS-R.28 This scale requires rating 12 items referring to different functions, such as speech, swallowing, handwriting, and walking, on a 5-point scale. The resulting score ranges from 0 (most severe impairment) to 48 (no impairment).
Analysis
We compared the BCI and the eye tracker using Wilcoxon signed-rank tests on the data averaged over the 2 sessions, with bits per minute and the SUS and NASA-TLX scores as the dependent variables. For each technology, we correlated the questionnaire scores and the demographic data with the performance data to investigate how person-related factors affect performance. We used stepwise linear discriminant analysis to classify the EEG signal.10 Calibration was performed with the P300 Classifier tool provided with BCI2000.
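For illustration, the comparison and correlation analyses could be reproduced along the following lines; the arrays below are placeholders rather than the study's raw data, and the original handling of ties and effect sizes is not shown.

```python
import numpy as np
from scipy.stats import wilcoxon, spearmanr

# Made-up per-participant values (12 participants), averaged over the
# 2 sessions of each device.
rng = np.random.default_rng(0)
itr_bci = rng.normal(8.7, 3.4, 12)   # bits/min with the BCI
itr_et  = rng.normal(12.9, 4.4, 12)  # bits/min with the eye tracker
alsfrs  = rng.integers(0, 48, 12)    # ALSFRS-R scores

# Paired, nonparametric comparison of the two devices.
stat, p = wilcoxon(itr_bci, itr_et)
print(f"Wilcoxon T = {stat:.1f}, P = {p:.3f}")

# Spearman rank correlation between functional status and performance.
rho, p_rho = spearmanr(alsfrs, itr_bci)
print(f"Spearman rho = {rho:.2f}, P = {p_rho:.3f}")
```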
Results
All participants were able to use both interfaces to accomplish the tasks (see Table 2). The mean ITR was 8.67 bits/min with the BCI and 12.87 bits/min with the eye tracker. The Wilcoxon signed-rank test showed that the eye tracker (median = 12.72) had a significantly higher ITR than the BCI (median = 9.04; T = 9; P = .016; effect size r = −0.68). The outcomes of the SUS and the NASA-TLX yielded similar results. The mean SUS score for the BCI was 71.15, which according to Bangor et al34 is just above the acceptability threshold (in the third quartile), whereas the score for the eye tracker was 78.54, which is just beneath the lower boundary of the fourth quartile. The direct comparison showed that the difference was significant (eye tracker median = 80; BCI median = 71.25; T = 12.50; P = .035; r = −0.60). For the NASA-TLX, the absolute values are usually not considered a viable way of describing workload, and a comparison (eg, pre/post measures, 2 different devices) is normally preferred. The results of the NASA-TLX showed that cognitive workload was higher for the BCI (median = 49.75) than for the eye tracker (median = 33.53; T = 4; P = .003; r = −0.79).
The Spearman ρ analysis showed correlations of performance, usability, and workload data with functional status and disease duration; age showed no significant correlations. The correlations revealed that the lower the functional status of the participants, the lower their communication rate, both when using the BCI (rs = 0.640; P = .001) and when using the eye tracker (rs = 0.430; P = .046; see Table 3). Finally, the results show that the longer the disease duration, the lower the usability of the BCI (rs = −0.547; P = .008) and the higher the cognitive workload of the BCI (rs = 0.544; P = .009). We did not find this relation for the eye tracker.
Discussion
We performed an exploratory study with the aim of comparing 2 access technologies designed for people with severe motor impairment. Our comparison between these interfaces has shown that the eye tracker is a faster and more accurate technology that allows users to communicate with a higher ITR than the BCI. The performance measures used in this study showed the advantages of using the eye tracker as a communication device. This advantage in performance matches the findings on the usability and cognitive workload of the 2 interfaces. Participants rated the eye tracker as a more satisfying device and considered the BCI a technology requiring more effort and more time than the eye tracker. Differences in usability may partially result from the longer time required to use the BCI, and it is known that time can affect perceived fatigue, as also pointed out in more recent findings.35 Because this time is intrinsic to the way this technology works, future work on BCIs should address the issue of the effort required by end users. Refined technology, such as dry and wireless EEG,36 and shifts to a classical conditioning paradigm37-39 could not only enhance performance but also help reduce the perceived effort of BCIs. It is interesting to note that with the BCI, disease duration plays a role in usability and workload, whereas age does not.
Table 2. Summary of the BCI and ET Interface Mean Scores (Standard Deviations) for Bits Per Minute, the SUS, and the NASA-TLX Questionnaire for Cognitive Workload.

|     | Bits Per Minute | SUS           | NASA-TLX      |
| BCI | 8.67 (3.43)     | 71.15 (11.31) | 47.64 (14.87) |
| ET  | 12.87 (4.41)    | 78.54 (13.25) | 32.72 (8.83)  |

Abbreviations: BCI, brain-computer interface; ET, eye tracking; SUS, System Usability Scale; NASA-TLX, National Aeronautics and Space Administration-Task Load Index.
Table 3. Summary of the Spearman ρ Correlations.(a)

|                  | BCI: Bits Per Minute | BCI: SUS | BCI: NASA-TLX | Eye Tracker: Bits Per Minute | Eye Tracker: SUS | Eye Tracker: NASA-TLX |
| ALSFRS-R         | 0.640**              | 0.280    | −0.202        | 0.430*                       | 0.241            | −0.207                |
| Disease duration | −0.332               | −0.547*  | 0.544**       | −0.118                       | −0.328           | 0.291                 |
| Age              | 0.239                | 0.153    | 0.140         | −0.217                       | −0.269           | 0.002                 |

Abbreviations: BCI, brain-computer interface; SUS, System Usability Scale; NASA-TLX, National Aeronautics and Space Administration-Task Load Index; ALSFRS-R, Amyotrophic Lateral Sclerosis Functional Rating Scale-Revised.
(a) *P < .05; **P < .01.
This finding, not confirmed with the eye tracker, may also be explained by the longer time required by BCIs. The relation between functional status and performance is a surprising finding. This result, found with both the BCI and the eye tracker, seems to be contrary to that of other studies,11,40-42 although differences in the performance measures and in the statistical tests should be taken into account. Consequently, we cautiously abstain from drawing strong conclusions based on this specific finding.
Because the concept of usability is context related, our findings should also be confirmed using the same methodology in contexts other than Internet use. Using different control paradigms or classification methods may optimize the BCI implementation we used, but we decided to evaluate a well-established version of the paradigm for better compatibility. Moreover, the eye tracker may profit from an interface tailored specifically for eye tracking, whereas in our study, it was used to control an interface made for BCIs. It could also be interesting to investigate the usability of invasive BCIs. As Hochberg et al43 showed, people with severe motor impairment can use intracortical neuronal ensemble activity to achieve control, although rudimentary, of neuromotor prostheses. Despite the potential benefits of a better signal, a large number of patients do not undergo the procedure, either because of restrictions on candidacy or because of personal preferences, even when informed about the advantages.8 Nevertheless, it is worth mentioning that the use of the P300 Speller in our implementation is similar to the one used in previous noninvasive studies.11-14
Although our study is based on a limited number of participants in a specific context, we showed in a direct comparison that the visual P300 might not be the first choice as a communication channel for people with severe physical disabilities. As suggested by our findings, when users can rely on eye movements, they tend to consider the eye tracker a superior technology. As the literature suggests, people with ALS can lose control of their eye movements.44 Moreover, Brunner et al45 have recently shown that in healthy individuals, the use of the visual P300 BCI may depend on gaze direction. To overcome this issue, new gaze-independent paradigms have been developed.42,46,47 However, other studies on neuro-ophthalmic abnormalities in people with ALS report retinal damage48 and loss of visual acuity49 associated with the course of the disease. Considering the implications of our findings together with the literature, we may conclude that when gaze control is retained, people with ALS will choose the eye tracker over the BCI, but when gaze and acuity are lost, neither the eye tracker nor the visual BCI will work.
A possible solution to overcome this issue could be the so-called hybrid BCI, which combines different brain features with different data acquisition techniques, or even with non-BCI systems (such as electro-oculography or electromyography). Hybrid BCIs can be used sequentially or simultaneously, and several studies have shown that they improve accuracy by exploiting the advantages offered by the combined communication devices. For example, if the user of an eye tracker has difficulties performing eye blinks for selections and using dwell time generates too many errors, a BCI can be used to control selections (brain switch).50
Exploring different P300 BCI modalities, such as tactile51,52 and auditory devices,32,53-58 may be another viable solution to overcome the issues of the visual P300. Only 1 study on the auditory P300 included clinically relevant end users.53 Moreover, whereas a single case study of a person with locked-in syndrome (LIS) reported promising results in the tactile modality,52 another case study describing the transition from LIS to complete LIS (CLIS) found no vibrotactile brain-evoked response.59 Because of these inconclusive results, further research is needed to determine the value of the tactile-haptic modality for BCIs. Even though a delayed response in the auditory areas has been reported in the literature,60 we suggest that the auditory modality should also be further explored.
Conclusions
The present study highlighted that, in certain conditions, people with severe physical disabilities may prefer eye trackers to visual BCIs because of their performance, usability, and required cognitive effort. We suggest that future research on BCIs should take these preferences into account and explore modalities other than visual ones.
Acknowledgments
We would like to thank all the participants involved in this study. We are very grateful to HumanElektronik GmbH for providing us with one of their eye trackers and to Sven Körber (SirValUse Consulting GmbH) for the German version of the System Usability Scale. This study was carried out with the precious support of Lasse Wiesinger, Anna-Antonia Pape, and Slavica Von Hartlieb during the measurements. We are also grateful to the anonymous reviewers for their suggestions, as well as those of Dr Giulia Liberati, which helped in improving the quality of the article.
Declaration of Conflicting Interests
The author(s) declared no potential conflicts of interest with respect
to the research, authorship, and/or publication of this article.
Funding
The author(s) disclosed receipt of the following financial support
for the research, authorship, and/or publication of this article: This
study was partially funded by the Inter-University Centre for
Research on Cognitive Processing in Natural and Artificial
Systems (ECONA), the Werner Reichardt Centre for Integrative
Neuroscience (CIN) pool project 2009-10, the Deutsche
Forschungsgemeinschaft (DFG), and the European ICT Program
Project FP7-288566. SH received funding as international research
fellow from the Japan Society for the Promotion of Science (JSPS)
and the Alexander von Humboldt Foundation.
References
1. Ball LJ, Beukelman DR, Pattee GL. Acceptance of augmen-
tative and alternative communication technology by persons
with amyotrophic lateral sclerosis. Augment Altern Commun.
2004;20:113-122.
2. Pannasch S, Helmert JR, Malischke S, Storch A, Velichkovsky
BM. Eye typing in application: a comparison of two systems
with ALS patients. J Eye Mov Res. 2008;2:1-8.
3. Majaranta P, Räihä KJ. Twenty years of eye typing: sys-
tems and design issues. In: Duchowski AT, Vertegaal R,
Senders JW, eds. Proceedings of the Eye Tracking Research
and Application Symposium, ETRA 2002, New Orleans,
Louisiana, USA, March 25-27, 2002. New York, NY: ACM;
2002:15-22.
4. Stampe DM, Reingold EM. Selection by looking: a novel
computer interface and its application to psychological
research. In: Findlay JM, Walker R, Kentridge RW, eds.
Eye Movement Research: Mechanisms, Processes and
Applications. Amsterdam, Netherlands: Elsevier Science
Publishers; 1995:467-478.
5. Calvo A, Chiò A, Castellina E, et al. Eye tracking impact on
quality-of-life of ALS patients. In: Miesenberger K, Klaus J,
Zagler W, Karshmer A, eds. Computers Helping People With
Special Needs: Vol 5105. Lecture Notes in Computer Science.
Heidelberg, Germany: Springer; 2008:70-77.
6. Gibbons C, Beneteau E. Functional performance using eye
control and single switch scanning by people with ALS.
Perspect Augment Altern Commun. 2010;19:64.
7. Ball LJ, Nordness A, Fager S, et al. Eye-gaze access of AAC
technology for persons with amyotrophic lateral sclerosis. J
Med Speech Lang Pathol. 2010;18:11-23.
8. Birbaumer N. Breaking the silence: brain-computer
interfaces (BCI) for communication and motor control.
Psychophysiology. 2006;43:517-532.
9. Pasqualotto E, Federici S, Olivetti Belardinelli M. Toward
functioning and usable brain computer interfaces (BCIs): a lit-
erature review. Disabil Rehabil Assist Technol. 2012;7:89-103.
10. Farwell LA, Donchin E. Talking off the top of your head:
toward a mental prosthesis utilizing event-related brain poten-
tials. Electroencephalogr Clin Neurophysiol. 1988;70:510-523.
11. Nijboer F, Sellers EW, Mellinger J, et al. A P300-based brain-
computer interface for people with amyotrophic lateral scle-
rosis. Clin Neurophysiol. 2008;119:1909-1916.
12. Nijboer F, Birbaumer N, Kübler A. The influence of psy-
chological state and motivation on brain-computer interface
performance in patients with amyotrophic lateral sclerosis: a
longitudinal study. Front Neurosci. 2010;4:pii: 55.
13. Sellers EW, Donchin E. A P300-based brain-computer
interface: initial tests by ALS patients. Clin Neurophysiol.
2006;117:538-548.
14. Piccione F, Giorgi F, Tonin P, et al. P300-based brain com-
puter interface: reliability and performance in healthy and
paralysed participants. Clin Neurophysiol. 2006;117:531-537.
15. Duncan CC, Barry RJ, Connolly JF, et al. Event-related
potentials in clinical research: guidelines for eliciting, record-
ing, and quantifying mismatch negativity, P300, and N400.
Clin Neurophysiol. 2009;120:1883-1908.
16. Iturrate I, Antelis J, Kübler A, Minguez J. Non-invasive
brain-actuated wheelchair based on a P300 neurophysiologi-
cal protocol and automated navigation. IEEE Trans Robot.
2009;25:614-627.
17. Bayliss JD. Use of the evoked potential P3 component for
control in a virtual apartment. IEEE Trans Neural Syst
Rehabil Eng. 2003;11:113-116.
18. Mugler EM, Ruf CA, Halder S, Bensch M, Kübler A. Design and implementation of a P300-based brain-computer interface for controlling an internet browser. IEEE Trans Neural Syst Rehabil Eng. 2010;18:599-609.
19. Münßinger JI, Halder S, Kleih SC, et al. Brain painting: evalu-
ation of a new brain-computer interface application with ALS
patients and healthy volunteers. Front Neurosci. 2010;4:182.
20. Huggins JE, Wren PA, Gruis KL. What would brain-computer
interface users want? Opinions and priorities of potential
users with amyotrophic lateral sclerosis. Amyotroph Lateral
Scler. 2011;12:318-324.
21. Pasqualotto E, Simonetta A, Gnisci V, Federici S, Olivetti
Belardinelli M. Toward a usability evaluation of BCIs. Int J
Bioelectromagn. 2011;13:121-122.
22. Federici S, Borsci S. The use and non-use of assistive tech-
nology in Italy: a pilot study. In: Gelderblom GJ, Soede M,
Adriaens L, Miesenberger K, eds. Everyday Technology
for Independence and Care: Vol 29. Assistive Technology
Research Series. Amsterdam, Netherlands: IOS Press;
2011:979-986.
23. Scherer MJ, Craddock G, Mackeogh T. The relationship of
personal factors and subjective well-being to the use of assis-
tive technology devices. Disabil Rehabil. 2011;33:811-817.
24. Kleih SC, Nijboer F, Halder S, Kübler A. Motivation modu-
lates the P300 amplitude during brain-computer interface use.
Clin Neurophysiol. 2010;121:1023-1031.
25. Pasqualotto E, Federici S, Simonetta A, Olivetti Belardinelli
M. Usability of brain computer interfaces. In: Gelderblom
GJ, Soede M, Adriaens L, Miesenberger K, eds. Everyday
Technology for Independence and Care: Vol 29. Assistive
Technology Research Series. Amsterdam, Netherlands: IOS
Press; 2011:481-488.
26. Riccio A, Leotta F, Bianchi L, et al. Workload measurement in
a communication application operated through a P300-based
brain-computer interface. J Neural Eng. 2011;8:025028.
27. Zickler C, Halder S, Kleih SC, Herbert C, Kübler A. Brain
painting: usability testing according to the user-centered
design in end users with severe motor paralysis. Artif Intell
Med. 2013;59:99-110.
28. Cedarbaum JM, Stambler N, Malta E, et al. The ALSFRS-R:
a revised ALS functional rating scale that incorporates assess-
ments of respiratory function. BDNF ALS Study Group
(Phase III). J Neurol Sci. 1999;169:13-21.
29. Brooke J. SUS: a “quick and dirty” usability scale. In: Jordan
PW, Thomas B, Weerdmeester BA, McClelland IL, eds.
Usability Evaluation in Industry. London, UK: Taylor &
Francis; 1996:189-194.
30. Hart SG, Staveland LE. Development of NASA-TLX (task
load index): results of empirical and theoretical research. In:
Hancock PA, Meshkati N, eds. Human Mental Workload.
Amsterdam, Netherlands: Elsevier Science; 1988:139-183.
31. Schalk G, McFarland DJ, Hinterberger T, Birbaumer N,
Wolpaw JR. BCI2000: a general-purpose brain-com-
puter interface (BCI) system. IEEE Trans Biomed Eng.
2004;51:1034-1043.
32. Käthner I, Ruf CA, Pasqualotto E, Braun C, Birbaumer N,
Halder S. A portable auditory P300 brain-computer interface
with directional cues. Clin Neurophysiol. 2013;124:327-338.
33. Pierce JR. An Introduction to Information Theory: Symbols,
Signals and Noise. New York, NY: Dover Publications; 1980.
34. Bangor A, Kortum PT, Miller JT. An empirical evaluation
of the system usability scale. Int J Hum Comput Interact.
2008;24:574-594.
35. Käthner I, Wriessnegger SC, Müller-Putz GR, Kübler A,
Halder S. Effects of mental workload and fatigue on the P300,
alpha and theta band power during operation of an ERP (P300)
brain-computer interface. Biol Psychol. 2014;102:118-129.
36. Zander TO, Lehne M, Ihme K, et al. A dry EEG-system
for scientific research and brain-computer interfaces. Front
Neurosci. 2011;5:53.
37. Furdea A, Ruf CA, Halder S, et al. A new (semantic) reflexive
brain-computer interface: in search for a suitable classifier. J
Neurosci Methods. 2012;203:233-240.
38. Liberati G, Dalboni da Rocha JL, van der Heiden L, et al.
Toward a brain-computer interface for Alzheimer’s disease
patients by combining classical conditioning and brain state
classification. J Alzheimers Dis. 2012;31:S211-S220.
39. van der Heiden L, Liberati G, Sitaram R, et al. Insula and
inferior frontal triangularis activations distinguish between
conditioned brain responses using emotional sounds for basic
BCI communication. Front Behav Neurosci. 2014;8:247.
40. Kübler A, Birbaumer N. Brain-computer interfaces and com-
munication in paralysis: extinction of goal directed think-
ing in completely paralysed patients? Clin Neurophysiol.
2008;119:2658-2666.
41. Silvoni S, Volpato C, Cavinato M, et al. P300-based brain-
computer interface communication: evaluation and follow-up
in amyotrophic lateral sclerosis. Front Neurosci. 2009;3:60.
42. Marchetti M, Piccione F, Silvoni S, Gamberini L, Priftis
K. Covert visuospatial attention orienting in a brain-com-
puter interface for amyotrophic lateral sclerosis patients.
Neurorehabil Neural Repair. 2013;27:430-438.
43. Hochberg LR, Serruya MD, Friehs GM, et al. Neuronal
ensemble control of prosthetic devices by a human with tet-
raplegia. Nature. 2006;442:164-171.
44. Sharma R, Hicks S, Berna CM, Kennard C, Talbot K, Turner
MR. Oculomotor dysfunction in amyotrophic lateral sclero-
sis: a comprehensive review. Arch Neurol. 2011;68:857-861.
45. Brunner P, Joshi S, Briskin S, Wolpaw JR, Bischof H, Schalk G. Does the “P300” speller depend on eye gaze? J Neural Eng. 2010;7:056013.
46. Lim JH, Hwang HJ, Han CH, Jung KY, Im CH. Classification
of binary intentions for individuals with impaired oculomotor
function: “eyes-closed” SSVEP-based brain-computer inter-
face (BCI). J Neural Eng. 2013;10:026021.
47. Ahani A, Wiegand K, Orhan U, et al. RSVP IconMessenger:
icon-based brain-interfaced alternative and augmentative
communication. Brain Comput Interface. 2014;1:192-
203.
48. Ringelstein M, Albrecht P, Südmeyer M, et al. Subtle retinal
pathology in amyotrophic lateral sclerosis. Ann Clin Transl
Neurol. 2014;1:290-297.
49. Moss HE, McCluskey L, Elman L, et al. Cross-sectional
evaluation of clinical neuro-ophthalmic abnormalities in
an amyotrophic lateral sclerosis population. J Neurol Sci.
2012;314:97-101.
50. Amiri S, Fazel-Rezai R, Asadpour V. A review of hybrid
brain-computer interface systems. Adv Hum Comput Interact.
2013;2013:1.
51. Brouwer A-M, van Erp JBF. A tactile p300 brain-computer
interface. Front Neurosci. 2010;4:19.
52. Kaufmann T, Holz EM, Kübler A. Comparison of tactile,
auditory and visual modality for brain-computer interface
use: a case study with a patient in the locked-in state. Front
Neurosci. 2013;7:129.
53. Kübler A, Furdea A, Halder S, Hammer EM, Nijboer F,
Kotchoubey B. A brain-computer interface controlled audi-
tory event-related potential (p300) spelling system for locked-
in patients. Ann N Y Acad Sci. 2009;1157:90-100.
54. Halder S, Rea M, Andreoni R, et al. An auditory oddball brain-
computer interface for binary choices. Clin Neurophysiol.
2010;121:516-523.
55. Schreuder M, Blankertz B, Tangermann M. A new auditory
multi-class brain-computer interface paradigm: spatial hear-
ing as an informative cue. PLoS One. 2010;5:e9813.
56. Furdea A, Halder S, Krusienski DJ, et al. An auditory odd-
ball (P300) spelling system for brain-computer interfaces.
Psychophysiology. 2009;46:617-25.
57. Hill NJ, Moinuddin A, Häuser A-K, Kienzle S, Schalk G.
Communication and control by listening: toward optimal
design of a two-class auditory streaming brain-computer
interface. Front Neurosci. 2012;6:181.
58. Simon N, Käthner I, Ruf CA, Pasqualotto E, Kübler A,
Halder S. An auditory multiclass brain-computer interface
with natural stimuli: Usability evaluation with healthy partic-
ipants and a motor impaired end user. Front Hum Neurosci.
2015;8:1039.
59. Ramos Murguialday A, Hill J, Bensch M, et al. Transition
from the locked in to the completely locked-in state: a
physiological analysis. Clin Neurophysiol. 2011;122:925-
933.
60. Lulé D, Diekmann V, Müller HP, Kassubek J, Ludolph AC,
Birbaumer N. Neuroimaging of multimodal sensory stimu-
lation in amyotrophic lateral sclerosis. J Neurol Neurosurg
Psychiatry. 2010;81:899-906.
... In the online BCI system, while classification accuracy and bit rate are crucial metrics (Wolpaw et al., 2002), the paramount goal is to establish a system that is not only comprehensive but also user-friendly. This involves enhancing the system's usability Quek et al., 2013;van de Laar et al., 2013;Zickler et al., 2013;Riccio et al., 2015;Kübler et al., 2020), user experience (van de Laar et al., 2013), and user satisfaction (Zickler et al., 2011;Rupp et al., 2012;Holz et al., , 2015Quek et al., 2013;van de Laar et al., 2013;Pasqualotto et al., 2015;Vasilyev et al., 2017;Zander et al., 2017;Kübler et al., 2020). ...
... The overall usability of BCI systems can be evaluated using the System Usability Scale (SUS) after prototype testing (Pasqualotto et al., 2015;Zander et al., 2017). The SUS contains 10 items, with a global subjective assessment of overall usability. ...
... The SUS contains 10 items, with a global subjective assessment of overall usability. Each item's score ranges from 0 to 100 points, as illustrated in Table 8, where higher scores indicate better overall usability of the BCI system, and a score of 70 has been suggested as the acceptable minimum (Brooke, 1996;Bangor et al., 2008;Pasqualotto et al., 2015). ...
Article
Full-text available
Although brain-computer interface (BCI) is considered a revolutionary advancement in human-computer interaction and has achieved significant progress, a considerable gap remains between the current technological capabilities and their practical applications. To promote the translation of BCI into practical applications, the gold standard for online evaluation for classification algorithms of BCI has been proposed in some studies. However, few studies have proposed a more comprehensive evaluation method for the entire online BCI system, and it has not yet received sufficient attention from the BCI research and development community. Therefore, the qualitative leap from analyzing and modeling for offline BCI data to the construction of online BCI systems and optimizing their performance is elaborated, and then user-centred is emphasized, and then the comprehensive evaluation methods for translating BCI into practical applications are detailed and reviewed in the article, including the evaluation of the usability (including effectiveness and efficiency of systems), the evaluation of the user satisfaction (including BCI-related aspects, etc.), and the evaluation of the usage (including the match between the system and user, etc.) of online BCI systems. Finally, the challenges faced in the evaluation of the usability and user satisfaction of online BCI systems, the efficacy of online BCI systems, and the integration of BCI and artificial intelligence (AI) and/or virtual reality (VR) and other technologies to enhance the intelligence and user experience of the system are discussed. It is expected that the evaluation methods for online BCI systems elaborated in this review will promote the translation of BCI into practical applications.
... The results of both works are discussed in Section 4.3.1. In addition, a comparison of BCI and eye tracking in eye typing studies using a spelling program with people with severe motor impairments (Pasqualotto et al., 2015). ...
... The decision of which input modality is more appropriate depends strongly on the technology used, the goal of the task, and the control algorithm. A number of comparative studies with different experimental setups were identified in this review (Dünser et al., 2015;Pasqualotto et al., 2015;Jones et al., 2018;Schäfer and Gebhard, 2019;Stalljann et al., 2020). Figure 4 shows the articles and the input modalities. ...
... Overall, VOG and IOG were rated more positively than the other modalities. Positive aspects were identified as subjective rating (low workload/cognitive load (Pasqualotto et al., 2015;Stalljann et al., 2020), comfort (Schäfer and Gebhard, 2019), and robust functionality (Schäfer and Gebhard, 2019). Pasqualotto et al. (2015) suggested its use as a communication device, similar to Schäfer and Gebhard (2019), who stated its use for discrete events, such as trigger events, based on the traceability of fast eye movements. ...
Article
Background: Assistive Robotic Arms are designed to assist physically disabled people with daily activities. Existing joysticks and head controls are not applicable for severely disabled people such as people with Locked-in Syndrome. Therefore, eye tracking control is part of ongoing research. The related literature spans many disciplines, creating a heterogeneous field that makes it difficult to gain an overview. Objectives: This work focuses on ARAs that are controlled by gaze and eye movements. By answering the research questions, this paper provides details on the design of the systems, a comparison of input modalities, methods for measuring the performance of these controls, and an outlook on research areas that gained interest in recent years. Methods: This review was conducted as outlined in the PRISMA 2020 Statement. After identifying a wide range of approaches in use the authors decided to use the PRISMA-ScR extension for a scoping review to present the results. The identification process was carried out by screening three databases. After the screening process, a snowball search was conducted. Results: 39 articles and 6 reviews were included in this article. Characteristics related to the system and study design were extracted and presented divided into three groups based on the use of eye tracking. Conclusion: This paper aims to provide an overview for researchers new to the field by offering insight into eye tracking based robot controllers. We have identified open questions that need to be answered in order to provide people with severe motor function loss with systems that are highly useable and accessible.
... Prior studies have significantly contributed to the understanding of assistive technologies' usability and workload. For example, Pasqualotto et al. [23] compared access technologies, highlighting the need for user-friendly and low-workload solutions for individuals who have severe motor impairments. Similarly, other studies [22] have demonstrated their potential in enhancing smartphone authentication and smart home control for users who have disabilities using eye gaze interaction. ...
... However, the limited scope of this study, which focused on a single user who had a disability, raises questions about the generalizability of the findings. Another study, [23], found that users who had a motor disability (but provided no information on their speech ability) gave eye gaze interactions an average SUS score of 78.54, which is higher than our result but lower than that of [22]. A broader participant base in future studies could offer more comprehensive insights into the usability of eye gaze systems. ...
Article
Full-text available
This study explores the effectiveness and user experience of different interaction methods used by individuals with dysarthria when engaging with Smart Virtual Assistants (SVAs). It focuses on three primary modalities: direct speech commands through Alexa, non-verbal voice cues via the Daria system, and eye gaze control. The objective is to assess the usability, workload, and user preferences associated with each method, catering to the varying communication capabilities of individuals with dysarthria. While Alexa and Daria facilitate voice-based interactions, eye gaze control offers an alternative for those unable to use voice commands, including users with severe dysarthria. This comparative approach aims to determine how the usability of each interaction method varies, conducted with eight participants with dysarthria. The results indicated that non-verbal voice interactions, particularly with the Daria system, were favored because of their lower workload and ease of use. The eye gaze technology, while viable, presented challenges in terms of the higher workload and usability. These findings highlight the necessity of diversifying interaction methods with SVAs to accommodate the unique needs of individuals with dysarthria.
... For instance, eye tracking technology can serve as an aid for severely impaired patients and re-enable them to participate and communicate in everyday life (e.g. Ball et al., 2010;Borgestig et al., 2016;Pasqualotto et al., 2015). As eye tracking technology is becoming increasingly widespread, it is also used in everyday applications (Hyrskykari et al., 2005;Majaranta & Bulling, 2014). ...
... First, our results can form the basis for deeper insights into sense of agency for eye movements, which in turn can be helpful in the development of eye-tracking applications. This technology plays a vital role for severely disabled patients in communication and participation in everyday life (e.g., Ball et al., 2010;Borgestig et al., 2016;Pasqualotto et al., 2015) and is being integrated into an increasing number of everyday life applications (e.g., Hyrskykari et al., 2005;Majaranta & Bulling, 2014). However, since the ability to influence the environment with eye movements is an unnatural and poorly learned phenomenon, everyday eye-tracking faces several problems with respect to its application. ...
Article
Full-text available
This study investigates the sense of agency (SoA) for saccades with implicit and explicit agency measures. In two eye tracking experiments, participants moved their eyes towards on-screen stimuli that subsequently changed color. Participants then either reproduced the temporal interval between saccade and color-change (Experiment 1) or reported the time points of these events with an auditory Libet clock (Experiment 2) to measure temporal binding effects as implicit indices of SoA. Participants were either made to believe to exert control over the color change or not (agency manipulation). Explicit ratings indicated that the manipulation of causal beliefs and hence agency was successful. However, temporal binding was only evident for caused effects, and only when a sufficiently sensitive procedure was used (auditory Libet clock). This suggests a feebler connection between temporal binding and SoA than previously proposed. The results also provide evidence for a relatively fast acquisition of sense of agency for previously never experienced types of action-effect associations. This indicates that the underlying processes of action control may be rooted in more intricate and adaptable cognitive models than previously thought. Oculomotor SoA as addressed in the present study presumably represents an important cognitive foundation of gaze-based social interaction (social sense of agency) or gaze-based human-machine interaction scenarios.
... For instance, eye tracking technology can serve as an aid for severely impaired patients and re-enable them to participate and communicate in everyday life (e.g. Ball et al., 2010;Borgestig et al., 2016;Pasqualotto et al., 2015). As eye-tracking technology is becoming increasingly widespread, it is also used in everyday applications (Hyrskykari et al., 2005;Majaranta & Bulling, 2014). ...
... First, our results can form the basis for deeper insights into sense of agency for eye movements, which in turn can be helpful in the development of eyetracking applications. This technology plays a vital role for severely disabled patients in communication and participation in everyday life (e.g., Ball et al., 2010;Borgestig et al., 2016;Pasqualotto et al., 2015) and is being integrated into an increasing number of everyday life applications (e.g., Hyrskykari et al., 2005;Majaranta & Bulling, 2014). ...
... NHMIs are characterized by low demand for physical effort (resulting in high usability and immersivity); hence, they are suitable even for individuals affected by physical disabilities [14] or lacking prior experience in XR-based HMI [7]. A useful categorization of XR-based NHMIs can be drawn between: ...
Article
Full-text available
This paper presents an innovative measurement method for assessing the information transfer performance of hands-free Human-Machine Interfaces (HMIs) based on eXtended Reality (XR) technology. The proposed method primarily involves the design and implementation of a dedicated XR environment, which serves as a testbed for data acquisition. Following this, an experimental campaign is conducted, involving multiple acquisition cycles for different individuals. Finally, the proposed method enables the extraction of two primary metrics, namely Selection Accuracy and Information Transfer Rate (ITR), indicative of the potential of the considered HMIs to transfer information. These metrics account for both intra-individual and inter-individual variability within the HMIs, thus providing a metrologically sound assessment of performance. The proposed method is validated through a practical case study. Three NHMIs are considered: Eye-Tracking, Head-Tracking, and Brain-Computer Interfaces (BCIs) based on Steady-State Visually Evoked Potentials (SSVEPs), as they allow hands-free interactions solely through visual observation. Without loss of generality, Microsoft HoloLens 2 and Unicorn Hybrid Black were used as XR and BCI platforms, respectively. The experimental findings obtained from eight healthy individuals allowed a comparative analysis of the performance of the three distinct HMIs, facilitating a better understanding of which interface might be more robust for a given application scenario. Overall, the proposed method represents a reliable performance assessment of innovative HMIs. This becomes increasingly significant considering the evolution of wearable HMIs and the current lack of comprehensive strategies for their characterization.
... Second, having patients continuously collect data themselves can circumvent the inconvenience of patients and caregivers traveling long distances for research and clinical appointments [11,12]. Finally, as late-stage ALS patients rely on eye-tracking technology to communicate with their caregivers [13], it is doubly important to understand the eye-movement abnormalities that could affect this vital means of communication. ...
Introduction: Amyotrophic lateral sclerosis (ALS) can affect various eye movements, making eye tracking a potential means of disease monitoring. In this study, we evaluated the feasibility of ALS patients self-recording their eye movements using the "EyePhone," a smartphone eye-tracking application. Methods: We prospectively enrolled ten participants and provided them with an iPhone equipped with the EyePhone app and a PowerPoint presentation with step-by-step recording instructions. The goal was for the participants to record their eye movements (saccades and smooth pursuit) without the help of the study team. Afterward, a trained physician administered the same tests using video-oculography (VOG) goggles and asked the participants to complete a questionnaire regarding their self-recording experience. Results: All participants successfully completed the self-recording process without assistance from the study team. Questionnaire data indicated that participants viewed self-recording with EyePhone favorably, considering it easy and comfortable. Moreover, 70% indicated that they preferred self-recording to being recorded with VOG goggles. Conclusion: With proper instruction, ALS patients can effectively use the EyePhone to record their eye movements, potentially even in a home environment. These results demonstrate the potential of smartphone eye tracking as a viable, self-administered tool for monitoring disease progression in ALS, reducing the need for frequent clinic visits.
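Self-recorded gaze traces of the kind collected here are typically segmented offline into saccades and fixations. The sketch below shows a generic velocity-threshold saccade detector; it is not the EyePhone's actual algorithm, and the sampling rate and 30 deg/s threshold are illustrative assumptions:

```python
import numpy as np

def detect_saccades(gaze_deg: np.ndarray, fs: float, vel_threshold: float = 30.0) -> np.ndarray:
    """Flag saccade samples in a 1-D horizontal gaze trace.

    gaze_deg: gaze position in degrees of visual angle, one sample per frame.
    fs: sampling rate in Hz.
    vel_threshold: angular velocity cutoff in deg/s (common choices: 30-100).
    Returns a boolean array marking samples whose velocity exceeds the cutoff.
    """
    velocity = np.abs(np.gradient(gaze_deg)) * fs  # deg/s
    return velocity > vel_threshold

# Example with synthetic data: a 10-degree rightward saccade at t = 1 s, sampled at 60 Hz.
fs = 60.0
t = np.arange(0, 2, 1 / fs)
gaze = np.where(t < 1.0, 0.0, 10.0) + np.random.normal(0, 0.1, t.size)
is_saccade = detect_saccades(gaze, fs)
print("samples flagged as saccade:", int(is_saccade.sum()))
```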
... Blink-To-Live follows an indirect eye-contact tracking approach, namely a computer-vision-based eye-tracking approach. The comparison considered results reported in studies evaluating different eye-tracking approaches for ALS patient communication 15,19,35,57,58 . The Blink-To-Live system does not rely on special hardware devices or sensors to initiate the patient's communication. ...
Eye-based communication languages such as Blink-To-Speak play a key role in expressing the needs and emotions of patients with motor neuron disorders. Most existing eye-based tracking systems are complex and not affordable in low-income countries. Blink-To-Live is an eye-tracking system based on a modified Blink-To-Speak language and computer vision for patients with speech impairments. A mobile phone camera tracks the patient's eyes by sending real-time video frames to computer vision modules for facial landmark detection, eye identification, and tracking. The Blink-To-Live eye-based communication language defines four key alphabets: Left, Right, Up, and Blink. These eye gestures encode more than 60 daily-life commands, each expressed by a sequence of three eye-movement states. Once the eye-gesture-encoded sentences are generated, the translation module displays the phrases in the patient's native speech on the phone screen, and the synthesized voice can be heard. A prototype of the Blink-To-Live system was evaluated with unimpaired participants of different demographic characteristics. Unlike other sensor-based eye-tracking systems, Blink-To-Live is simple, flexible, and cost-efficient, with no dependency on specific software or hardware requirements. The software and its source are available from the GitHub repository (https://github.com/ZW01f/Blink-To-Live).
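With four gesture states and sequences of length three, there are 4³ = 64 possible codes, consistent with the "more than 60" commands mentioned above. A minimal sketch of the lookup step is shown below; the dictionary entries are invented for illustration and are not the actual Blink-To-Speak mappings:

```python
# Hypothetical excerpt of a Blink-To-Speak-style dictionary: sequences of three
# eye gestures (Left, Right, Up, Blink) map to daily-life phrases. The real
# system defines 60+ such commands; these entries are made up for illustration.
COMMANDS = {
    ("Left", "Left", "Blink"): "I am thirsty",
    ("Right", "Up", "Blink"): "Please call the doctor",
    ("Up", "Up", "Left"): "I want to change position",
}

def decode(gesture_sequence) -> str:
    """Translate a three-gesture sequence into its phrase, if defined."""
    return COMMANDS.get(tuple(gesture_sequence), "<unknown command>")

print(decode(["Left", "Left", "Blink"]))    # -> "I am thirsty"
print(decode(["Blink", "Blink", "Blink"]))  # -> "<unknown command>"
```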
... In addition, one study reported that conventional control interfaces such as mouse and buttons scored a SUS of 84 on average with elderly participants [38]. Given that most BCI studies focus on methodological aspects while neglecting usability [39], enhancing the usability of the proposed HA system is an interesting research topic to pursue in future work. The usability of the proposed system may be enhanced by i) employing control functions preferred by participants, identified via a pre-experimental survey, or ii) adding functions such as communication, as in [38]. ...
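Since SUS scores recur throughout this literature (the snippet above cites a score of 84; the following abstract reports a score above 70), a minimal sketch of the standard SUS scoring rule may be useful; the ten item responses below are invented for illustration:

```python
def sus_score(responses) -> float:
    """System Usability Scale score (0-100) from the ten 1-5 item responses.

    Odd-numbered items are positively worded (contribution = response - 1),
    even-numbered items are negatively worded (contribution = 5 - response);
    the summed contributions are scaled by 2.5.
    """
    assert len(responses) == 10
    contributions = [
        (r - 1) if (i % 2 == 0) else (5 - r)  # i is 0-based, so even index = odd-numbered item
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5

# Example: a fairly positive questionnaire -> score of 80.0.
print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 4, 2]))
```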
Over the past decades, brain-computer interfaces (BCIs) have been developed to provide individuals with an alternative communication channel to the external environment. Although the primary target users of BCI technologies include the disabled and the elderly, most newly developed BCI applications have been tested with young, healthy people. In the present study, we developed an online home appliance control system using a steady-state visual evoked potential (SSVEP)-based BCI, with visual stimulation presented in an augmented reality (AR) environment, and an electrooculogram (EOG)-based eye tracker. The performance and usability of the system were evaluated with individuals aged over 65. The participants turned on the AR-based home automation system using an eye-blink-based switch and selected the devices to control with one of three different methods, depending on their preference. In the online experiment, all 13 participants successfully completed the designated tasks to control five home appliances using the proposed system, and the System Usability Scale score exceeded 70. Furthermore, the BCI performance of the proposed online home appliance control system surpassed the best results of previously reported BCI systems for the elderly.
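The abstract does not state which SSVEP decoder was used, but canonical correlation analysis (CCA) against sinusoidal reference signals is one of the most common choices for such systems. The sketch below is therefore only a generic illustration under that assumption; the sampling rate, channel count, and flicker frequencies are invented:

```python
import numpy as np
from sklearn.cross_decomposition import CCA

def ssvep_classify(eeg: np.ndarray, fs: float, freqs=(8.0, 10.0, 12.0), n_harmonics=2) -> int:
    """Pick the flicker frequency whose sinusoidal references correlate best with the EEG.

    eeg: (n_samples, n_channels) segment recorded while the user fixates one stimulus.
    Returns the index of the winning frequency in `freqs`.
    """
    t = np.arange(eeg.shape[0]) / fs
    scores = []
    for f in freqs:
        # Sin/cos references at the fundamental and its harmonics.
        ref = np.column_stack(
            [fn(2 * np.pi * f * h * t) for h in range(1, n_harmonics + 1) for fn in (np.sin, np.cos)]
        )
        x_c, y_c = CCA(n_components=1).fit_transform(eeg, ref)
        scores.append(abs(np.corrcoef(x_c[:, 0], y_c[:, 0])[0, 1]))
    return int(np.argmax(scores))

# Example with synthetic 2-channel EEG containing a 10 Hz component plus noise.
fs, dur = 250.0, 2.0
t = np.arange(int(fs * dur)) / fs
eeg = np.column_stack([np.sin(2 * np.pi * 10 * t), np.cos(2 * np.pi * 10 * t)]) + 0.5 * np.random.randn(t.size, 2)
print("detected frequency index:", ssvep_classify(eeg, fs))  # expected: 1 (10 Hz)
```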
The purpose of this study was to describe a group of individuals with amyotrophic lateral sclerosis training and using the Eye-gaze Response Interface Computer Aid (ERICA) with Type & Talk or LifeMate 1.1 communication software. Fifteen people with ALS participated in the study, and all but one successfully used the ERICA as his or her primary communication device. The sole participant who discontinued use experienced the onset of impaired eyelid control during training. Results indicate that the ERICA was used to support a number of different communication functions, such as face-to-face interaction (100%), group communication (43%), phone calls (71%), e-mail (79%), and Internet access (86%). In an effort to optimize eye-gaze tracking to support communication, a number of environmental, positioning, and calibration adjustments are reported.
Brain-computer interfaces (BCIs) can serve as muscle-independent communication aids. Persons who are unable to control their eye muscles (e.g., in the completely locked-in state) or have severe visual impairments for other reasons need BCI systems that do not rely on the visual modality. For this reason, BCIs that employ auditory stimuli have been suggested. In this study, a multiclass BCI spelling system was implemented that uses animal voices with directional cues to code the rows and columns of a letter matrix. To reveal possible training effects with the system, 11 healthy participants performed spelling tasks on 2 consecutive days. In a second step, the system was tested by a participant with amyotrophic lateral sclerosis (ALS) in two sessions. In the first session, healthy participants spelled with an average accuracy of 76% (3.29 bits/min), which increased to 90% (4.23 bits/min) on the second day. Spelling accuracy for the participant with ALS was 20% in the first and 47% in the second session. The results indicate a strong training effect for both the healthy participants and the participant with ALS. While healthy participants reached high accuracies in both sessions, the accuracies of the participant with ALS were not sufficient for satisfactory communication in either session. More training sessions might be needed to improve spelling accuracy. The study demonstrated the feasibility of the auditory BCI with healthy users and stresses the importance of training with auditory multiclass BCIs, especially for potential end-users with disease.
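The row/column coding described above means that each spelled letter is addressed by two successive classifications, one for the attended row cue and one for the attended column cue. A minimal sketch of that final lookup step is given below; the 5x5 letter layout is an illustrative assumption, not the matrix used in the study:

```python
# Illustrative 5x5 letter matrix for a row/column auditory speller.
# In the study, each row and each column is announced by a distinct animal
# voice with a directional cue; a selection is the letter at the intersection
# of the classified row and column.
MATRIX = [
    list("ABCDE"),
    list("FGHIJ"),
    list("KLMNO"),
    list("PQRST"),
    list("UVWXY"),
]

def select_letter(row_cue: int, col_cue: int) -> str:
    """Return the letter addressed by the classified row and column cues (0-based)."""
    return MATRIX[row_cue][col_cue]

# Example: the classifier decides the user attended row 1 and column 2 -> "H".
print(select_letter(1, 2))
```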
One of the principal application areas for brain-computer interface (BCI) technology is augmentative and alternative communication (AAC), typically used by people with severe speech and physical disabilities (SSPI). Existing word- and phrase-based AAC solutions that employ electroencephalography (EEG)-based BCIs are sometimes supplemented by icons. Icon-based BCI systems that use binary signaling methods, such as P300 detection, combine hierarchical layouts with some form of scanning. The rapid serial visual presentation (RSVP) IconMessenger combines P300 signal detection with the icon-based semantic message construction system of iconCHAT. Language models are incorporated in the inference engine, and some modifications were made to facilitate the use of RSVP, such as icon semantic role order selection and the tight fusion of language evidence and EEG evidence. The results of a study conducted with 10 healthy participants suggest that the system has potential as an AAC system in real-time typing applications. The significant findings of this study are the ability to construct messages with reduced physical movement demands, owing to RSVP, and increased message construction speed and accuracy, owing to the incorporation of an icon-based language model in the inference process.
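The "tight fusion of language evidence and EEG evidence" mentioned above can be thought of, in its simplest form, as multiplying a language-model prior over candidate icons by a per-icon EEG likelihood and renormalizing. The sketch below illustrates only that generic idea; it is not the paper's actual inference engine, and the numbers are invented:

```python
import numpy as np

def fuse_evidence(lm_prior: np.ndarray, eeg_likelihood: np.ndarray) -> np.ndarray:
    """Combine language-model priors with EEG likelihoods into a posterior over icons.

    lm_prior: P(icon | sentence context) from the language model, sums to 1.
    eeg_likelihood: P(EEG response | icon was attended), one value per icon.
    Returns the normalized posterior used to pick (or keep presenting) icons.
    """
    posterior = lm_prior * eeg_likelihood
    return posterior / posterior.sum()

# Example with three candidate icons: the language model favors icon 0,
# but the P300 evidence points strongly to icon 2 and overturns the prior.
lm_prior = np.array([0.6, 0.3, 0.1])
eeg_likelihood = np.array([0.05, 0.10, 0.90])
print(fuse_evidence(lm_prior, eeg_likelihood))  # -> [0.2, 0.2, 0.6]
```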
In order to enable communication through a brain-computer interface (BCI), it is necessary to discriminate between distinct brain responses. As a first step, we probed the possibility of discriminating between affirmative ("yes") and negative ("no") responses using a semantic classical conditioning paradigm within an fMRI setting. Subjects were presented with congruent and incongruent word pairs as conditioned stimuli (CS), eliciting affirmative and negative responses, respectively. Incongruent word pairs were associated with an unpleasant unconditioned stimulus (a scream, US1) and congruent word pairs with a pleasant unconditioned stimulus (baby laughter, US2), in order to elicit emotional conditioned responses (CR). The aim was to discriminate between affirmative and negative responses, enabled by their association with the positive and negative affective stimuli. In the late acquisition phase, when the US were no longer present, there was a strong, significant differential activation for incongruent and congruent word pairs in a cluster comprising the left insula and the inferior frontal triangularis. This association was not found in the habituation phase. These results suggest that the difference between affirmative and negative brain responses was established as an effect of conditioning, allowing further investigation of the possibility of using this paradigm for a binary-choice BCI.
Amyotrophic lateral sclerosis (ALS) is characterized by neuro-ophthalmological abnormalities beyond disturbed oculomotor control such as decreased visual acuity and disturbed visual evoked potentials. Here we report retinal alterations in a cohort of 24 patients with clinically definite (n = 20) or probable (n = 4) ALS as compared to matched controls. High-resolution spectral domain optical coherence tomography with retinal segmentation revealed a subtle reduction in the macular thickness and the retinal nerve fiber layer (RNFL) as well as a marked thinning of the inner nuclear layer (INL). Our data indicate an unprecedented retinal damage pattern and suggest neurodegeneration beyond the motor system in this disease.
The increasing number of research activities and the different types of studies in brain-computer interface (BCI) systems show the potential of this young research area. Research teams have studied features of different data acquisition techniques, brain activity patterns, feature extraction techniques, methods of classification, and many other aspects of a BCI system. However, conventional BCIs have not become fully practical, owing to limited accuracy, reliability, information transfer rate, and user acceptability. A new approach to creating a more reliable BCI that takes advantage of each system is to combine two or more BCI systems with different brain activity patterns or different input signal sources. This type of BCI, called a hybrid BCI, may reduce the disadvantages of each conventional BCI system. In addition, hybrid BCIs may enable more applications and possibly increase accuracy and the information transfer rate. However, the types of BCIs and their combinations should be considered carefully. In this paper, after introducing several types of BCIs and their combinations, we review and discuss hybrid BCIs, different possibilities for combining them, and their advantages and disadvantages.
The study aimed at revealing electrophysiological indicators of mental workload and fatigue during prolonged usage of a P300 brain-computer interface (BCI). Mental workload was experimentally manipulated with dichotic listening tasks. Medium and high workload conditions alternated. Behavioral measures confirmed that the manipulation of mental workload was successful. Reduced P300 amplitude was found for the high workload condition. Along with lower performance and an increase in the subjective level of fatigue, an increase of power in the alpha band was found for the last as compared to the first run of both conditions. The study confirms that a combination of signals derived from the time and frequency domain of the electroencephalogram is promising for the online detection of workload and fatigue. It also demonstrates that satisfactory accuracies can be achieved by healthy participants with the P300 speller, despite constant distraction and when pursuing the task for a long time.
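The fatigue marker reported above, increased alpha-band power over the course of the session, is commonly estimated with Welch's method on windowed EEG. The sketch below shows that generic computation with scipy; it is not the study's analysis pipeline, and the band limits, window length, and synthetic data are illustrative assumptions:

```python
import numpy as np
from scipy.signal import welch

def relative_alpha_power(eeg: np.ndarray, fs: float, band=(8.0, 12.0)) -> float:
    """Relative alpha-band power of a single EEG channel.

    eeg: 1-D signal; fs: sampling rate in Hz.
    Returns alpha-band power divided by total 1-40 Hz power, a common
    fatigue/workload indicator (thresholds vary between studies).
    """
    freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))  # 2-second windows
    total_mask = (freqs >= 1) & (freqs <= 40)
    alpha_mask = (freqs >= band[0]) & (freqs <= band[1])
    total = np.trapz(psd[total_mask], freqs[total_mask])
    alpha = np.trapz(psd[alpha_mask], freqs[alpha_mask])
    return float(alpha / total)

# Example: synthetic 60-second signal with a strong 10 Hz rhythm plus broadband noise.
fs = 250.0
t = np.arange(0, 60, 1 / fs)
eeg = 20 * np.sin(2 * np.pi * 10 * t) + 5 * np.random.randn(t.size)
print(round(relative_alpha_power(eeg, fs), 2))
```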
For many years the reestablishment of communication for people with severe motor paralysis has been a focus of brain-computer interface (BCI) research. Recently, applications for entertainment have also been developed. Brain Painting allows the user creative expression through painting pictures. The second, revised prototype of the BCI Brain Painting application was evaluated in its target function - free painting - and compared to the P300 spelling application by four end users with severe disabilities. Following the International Organization for Standardization (ISO), usability was evaluated in terms of effectiveness (accuracy), efficiency (information transfer rate, ITR), the utility metric, subjective workload (National Aeronautics and Space Administration Task Load Index, NASA TLX), and user satisfaction (Quebec User Evaluation of Satisfaction with assistive Technology, QUEST 2.0, and Assistive Technology Device Predisposition Assessment, ATD PA, Device Form). The results revealed high performance levels (M ≥ 80% accuracy) in the free painting and copy painting conditions, ITRs (4.47-6.65 bits/min) comparable to other P300 applications, and only low to moderate workload levels (5-49 of 100), demonstrating that the complex task of free painting neither impaired performance nor imposed insurmountable workload. Users were satisfied with the BCI Brain Painting application. The main obstacles to use in daily life were system operability and the EEG cap, particularly the need for extensive support during adjustment. The P300 Brain Painting application can be operated with high effectiveness and efficiency. End users with severe motor paralysis would like to use the application in daily life. User-friendliness, specifically ease of use, is a mandatory necessity when bringing BCI to end users. Early and active involvement of users and iterative user-centered evaluation enable developers to work toward this goal.
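The NASA TLX workload values quoted above (5-49 of 100) follow the instrument's 0-100 scale. As a purely illustrative sketch of how the standard weighted TLX score is computed from six subscale ratings and the 15 pairwise-comparison weights (the ratings and weights below are invented and do not come from the study):

```python
# Standard weighted NASA TLX: six subscale ratings (0-100) are weighted by how
# often each dimension was chosen in the 15 pairwise comparisons, then averaged.
RATINGS = {  # 0 (low) .. 100 (high demand); invented values for illustration
    "mental": 55, "physical": 10, "temporal": 40,
    "performance": 25, "effort": 50, "frustration": 30,
}
WEIGHTS = {  # tallies from the 15 pairwise comparisons; they must sum to 15
    "mental": 5, "physical": 0, "temporal": 2,
    "performance": 3, "effort": 4, "frustration": 1,
}

def weighted_tlx(ratings: dict, weights: dict) -> float:
    """Overall workload score on the 0-100 scale."""
    assert sum(weights.values()) == 15, "pairwise weights must total 15"
    return sum(ratings[k] * weights[k] for k in ratings) / 15.0

print(round(weighted_tlx(RATINGS, WEIGHTS), 1))  # -> 44.0 for these example values
```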