
Brain-Computer Interface for a Prosthetic Hand Using Local Machine Control and Haptic Feedback

Authors: Ander Ramos Murguialday, Vikram Aggarwal, Aniruddha Chatterjee, Yoonju Cho, Robert Rasmussen, Brandon O'Rourke, Soumyadipta Acharya, Nitish V. Thakor

Abstract

A brain-computer interface (BCI) uses electrophysiological measures of brain function to enable individuals to communicate with the external world, bypassing normal neuromuscular pathways. While it has been suggested that this control can be applied for neuroprostheses, few studies have demonstrated practical BCI control of a prosthetic device. In this paper, an electroencephalogram (EEG)-based motor imagery BCI is presented to control movement of a prosthetic hand. The hand was instrumented with force and angle sensors to provide haptic feedback and local machine control. Using this system, subjects demonstrated the ability to control the prosthetic's grasping force with accuracy comparable to an EMG-based control scheme. Further work is necessary to improve the integration of BCI control strategies with prostheses.
Proceedings of the 2007 IEEE 10th International Conference on Rehabilitation Robotics, June 12-15, Noordwijk, The Netherlands
I. INTRODUCTION
Recent advances in the development of Brain-Computer Interfaces (BCIs) have enabled practical solutions to help individuals overcome severe neuromuscular disabilities. By modulating changes in their electroencephalographic (EEG) activity, BCI users have demonstrated two-dimensional cursor control and the ability to type out messages on a virtual keyboard [1-3]. It has been suggested that an EEG-based BCI may also eventually lead to neural control of advanced prostheses [4-5].

The current staple for prosthetic hand control is to use myoelectric control signals to enable simple grasping movements. However, this method is limited to patients who have sufficient independent control points to interface with electromyogram (EMG) electrodes. Users with a high level of amputation, such as a shoulder disarticulation, lack many muscle-based control points and must use unwieldy cable- and switch-operated systems to control their prosthesis [6-7].

BCI control strategies may eventually offer a more natural way to operate prostheses than conventional myoelectric and cable-driven mechanisms. While previous literature has focused on more invasive cortical strategies for neuroprosthetic control [8-10], there has been little in the way of demonstrating practical control of an advanced upper-limb prosthesis using a noninvasive BCI platform. Moreover, the effect of incorporating haptic feedback with a cortical control strategy to close the control loop has not been studied in depth.

Manuscript received February 10, 2007. This work was supported by the Revolutionizing Prosthetics 2009 program and funded by the Defense Advanced Research Projects Agency (DARPA). Vikram Aggarwal, Aniruddha Chatterjee, Yoonju Cho, Robert Rasmussen, Brandon O'Rourke, Soumyadipta Acharya, and Nitish V. Thakor are with the Department of Biomedical Engineering at The Johns Hopkins University, Baltimore, USA. Ander Ramos Murguialday is with the Department of Biomedical Engineering at the Technical University of Munich, Munich, Germany, and with the Fatronik Foundation, San Sebastian, Spain.
A noninvasive BCI platform is described that can be used to control a prosthetic hand. The EEG-based BCI is based on the modulation of mu (8-12 Hz) rhythm activity via motor imagery tasks, which is a well-documented BCI control strategy [11-13]. Actual or imagined motor movements result in an event-related desynchronization (ERD) of spectral power at those frequencies over the sensorimotor cortex, and subjects can learn to modulate their mu band power to produce a reliable 1-D control signal. The platform is designed to distinguish between three states: relaxation, and two separable desynchronizations that are operant-conditioned from a starting baseline of right hand versus left hand motor imagery. Furthermore, feedback from force and position sensors on the prosthetic hand allows the user to achieve a stable grip without applying unnecessary force or further cognitive input; this is dubbed local machine control. Visual and vibrotactile haptic feedback provide the user with closed-loop control and improved perception and force modulation. Vibrotactile feedback is a simple and safe mechanism commonly used in noninvasive haptic feedback systems [14], and previous prosthetics research has examined how such feedback systems can convey the intensity of grasping force [15-16]. Since any advanced neuroprosthetic control will inevitably require communicating different haptic inputs to the user, the integration of haptic biofeedback into BCI applications deserves to be investigated.

Figure 1 shows the overall experimental platform, which demonstrates how an EEG-based BCI coupled with local machine control and various modalities of haptic feedback can achieve prosthetic control. It is also shown how a context-sensitive control strategy can help overcome the low-bandwidth and high-latency limitations of conventional BCI platforms in order to perform more complex tasks.
II. METHODS

A. Experimental Setup

Subjects used a three-state EEG-based BCI to control a prosthetic hand. Upon hearing auditory cues (overlaid white noise), the subject would use the corresponding motor imagery task to open or close the prosthetic hand. In addition to direct visual feedback of the hand, two methods of haptic feedback were supported for the BCI: 1) visual feedback of force as a vertical bar on a laptop screen 1 m from the subject, and 2) vibrotactile feedback proportional to the force through a voice coil motor placed on the subject's arm.

Fig. 1. Experimental Setup. Subject uses either EMG or EEG control to enable grasping of objects using an Otto Bock prosthetic hand. Local machine control and different haptic feedback modalities help the subject achieve closed-loop control.
B. Data Acquisition

EEG and EMG data were acquired using a Compumedics (El Paso, TX, USA) Neuroscan SynAmps2 64-channel amplifier. A QuickCap 64-channel EEG cap (modified 10-20 system) from the same company was used for EEG data acquisition, referenced between Cz and CPz and grounded anterior to Fz. Two bipolar Ag/AgCl electrodes from Myotronics-Noromed (Tukwila, WA, USA) were used for EMG data acquisition and placed on antagonistic muscle pairs: one close to the external epicondyle on the extensor digitorum (extension), and the other on the flexor carpi radialis (flexion). A single Cleartrace LT reference electrode from ConMed Corporation (Utica, NY, USA) was placed over the olecranon.

The SynAmps2 amplifier and signal processing modules were connected through a client-server architecture, with the amplifier acting as the server and the signal processing module running on a stand-alone client PC. Data were sampled at 250 Hz and transmitted over TCP/IP to the client PC for storage and real-time signal processing using a custom BCI platform.
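The amplifier-to-client link is described only as TCP/IP streaming at 250 Hz; the sketch below shows the general shape of such a client in Python. The host, port, and frame layout (interleaved little-endian 32-bit samples) are assumptions for illustration, not the actual Neuroscan protocol.

    import socket
    import numpy as np

    N_CHANNELS = 64       # SynAmps2 channel count
    FRAME_SAMPLES = 10    # samples per frame -- assumed packet layout

    def stream_frames(host="192.168.0.2", port=4000):
        """Yield (FRAME_SAMPLES, N_CHANNELS) sample arrays from the amplifier server."""
        frame_bytes = N_CHANNELS * FRAME_SAMPLES * 4   # 32-bit samples, assumed
        with socket.create_connection((host, port)) as sock:
            buf = b""
            while True:
                chunk = sock.recv(4096)
                if not chunk:
                    return                  # server closed the connection
                buf += chunk
                while len(buf) >= frame_bytes:
                    raw, buf = buf[:frame_bytes], buf[frame_bytes:]
                    yield np.frombuffer(raw, dtype="<i4").reshape(FRAME_SAMPLES, N_CHANNELS)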
C. Prosthetic Hand

A digital myoelectric prosthetic hand from Otto Bock (Duderstadt, Germany) was used for the experimental setup. The standard myoelectric connections were bypassed and the hand was driven directly by EMG and EEG control signals through a buffered motor driver. A FlexiForce sensor from Tekscan, Inc. (South Boston, MA, USA) was used to provide cutaneous force feedback once contact with an object was achieved. The sensor was adhesively attached to the index digit of the hand. A Hall-effect sensor was placed at the thumb bridge to provide proprioceptive feedback of the hand's angular position. Force and angle data were acquired using a USB-6251 DAQ card from National Instruments (Austin, TX, USA).
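As a concrete illustration, the force and angle channels could be read through the USB-6251 with the nidaqmx Python package as sketched below. The device name and channel wiring (ai0 for the FlexiForce sensor, ai1 for the Hall-effect sensor) and the sampling rate are assumptions, not values from the paper.

    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    def read_hand_sensors(samples=100, rate=1000.0):
        """Read force (ai0) and angle (ai1) voltages from the DAQ; channel map assumed."""
        with nidaqmx.Task() as task:
            task.ai_channels.add_ai_voltage_chan("Dev1/ai0")   # FlexiForce on index digit
            task.ai_channels.add_ai_voltage_chan("Dev1/ai1")   # Hall-effect at thumb bridge
            task.timing.cfg_samp_clk_timing(rate, sample_mode=AcquisitionType.CONTINUOUS)
            force_v, angle_v = task.read(number_of_samples_per_channel=samples)
            return force_v, angle_v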
D. EEG Signal Processing

The EEG activity from right and left hand motor imagery was focused at electrodes C3 and C4, respectively, which broadly overlie the primary motor cortex (MI) hand area [17]. The data were spatially filtered by re-referencing the electrodes using a large Laplacian filter [18]. The spatially filtered EEG activity from each electrode was modeled as an autoregressive (AR) process [19] over a sliding temporal window of duration 2 s, shifting every 250 ms. The power spectral density of the AR model for each electrode was computed to calculate the peak mu-band power. Fig. 2 shows sample raw EEG timecourse and power spectrum data.
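To make the feature extraction concrete, the sketch below estimates the peak mu-band power for one electrode's 2 s window (shifted every 250 ms in the online system). It is a minimal sketch only: the paper cites the Burg algorithm for AR estimation [19], whereas this version uses the simpler Yule-Walker equations, and the model order and frequency grid are assumptions.

    import numpy as np

    def ar_psd_peak(window, fs=250.0, band=(7.0, 14.0), order=16):
        """Peak mu-band power of one spatially filtered 2 s EEG window."""
        x = np.asarray(window, dtype=float)
        x = x - x.mean()
        n = len(x)
        # Biased autocorrelation at lags 0..order
        r = np.correlate(x, x, mode="full")[n - 1 : n + order] / n
        R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
        a = np.linalg.solve(R, r[1 : order + 1])     # AR coefficients (Yule-Walker)
        sigma2 = r[0] - a @ r[1 : order + 1]         # driving-noise variance
        # AR PSD: sigma^2 / |1 - sum_k a_k e^{-j 2 pi f k / fs}|^2 (relative units)
        freqs = np.linspace(band[0], band[1], 64)
        k = np.arange(1, order + 1)
        denom = np.abs(1.0 - np.exp(-2j * np.pi * np.outer(freqs, k) / fs) @ a) ** 2
        return (sigma2 / denom).max()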
A hierarchical linear classification scheme was used to create the BCI control signal in two stages. A gating classifier G was designed to distinguish between motor imagery and relaxation,

    G(t_k) = \begin{cases} 1 & \text{if } w_{1G} P_{C3}(t_k) + w_{2G} P_{C4}(t_k) + B_G < T_G \\ 0 & \text{otherwise} \end{cases}    (1)

where P_{C3}(t_k) and P_{C4}(t_k) are the peak mu-band power at discrete times t_k, and w_{1G}, w_{2G}, B_G, and T_G are the weights, bias, and threshold, respectively, determined during optimization for each subject. A second movement classifier M was then designed to distinguish between right hand and left hand motor imagery tasks,

    M(t_k) = \begin{cases} +1 & \text{if } w_{1M} P_{C3}(t_k) + w_{2M} P_{C4}(t_k) + B_M < T_M \\ -1 & \text{otherwise} \end{cases}    (2)

where w_{1M}, w_{2M}, B_M, and T_M are the corresponding weights, bias, and threshold, likewise determined during optimization for each subject. The final output F(t_k) was the product of the two classifiers,

    F(t_k) = G(t_k) \times M(t_k)    (3)

where +1 is right hand movement and mapped to incremental closing of the prosthetic hand, -1 is left hand movement and mapped to incremental opening of the prosthetic hand, and 0 is relaxation and mapped to no movement. A classifier decision was made every 250 ms.
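A minimal sketch of the two-stage decision rule in Eqs. (1)-(3) follows; the parameter names and dictionary layout are illustrative, and the per-subject weights, biases, and thresholds would come from the optimization step described above.

    def hierarchical_output(p_c3, p_c4, p):
        """Two-stage linear classifier of Eqs. (1)-(3).

        p_c3, p_c4: peak mu-band power at C3 and C4 for the current 250 ms step.
        p: per-subject parameters (weights w1g..w2m, biases bg/bm, thresholds tg/tm).
        """
        # Eq. (1): gating classifier -- motor imagery (1) vs. relaxation (0)
        g = 1 if p["w1g"] * p_c3 + p["w2g"] * p_c4 + p["bg"] < p["tg"] else 0
        # Eq. (2): movement classifier -- right hand (+1) vs. left hand (-1)
        m = 1 if p["w1m"] * p_c3 + p["w2m"] * p_c4 + p["bm"] < p["tm"] else -1
        # Eq. (3): product -- +1 close, -1 open, 0 no movement
        return g * m

At each 250 ms step the outputs +1, -1, and 0 would drive incremental closing, incremental opening, and no movement of the hand, respectively.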
E. EMG Signal Processing

The EMG signal underwent a simple preprocessing of full-wave rectification. A linear classifier, based on the standard deviation of the signal, was used to distinguish between each contraction type (flexion and extension). The threshold was set manually for each subject depending on the relative difference of their antagonistic muscle activity. Flexion was mapped to closing of the hand, extension was mapped to opening of the hand, and lack of either muscle activation was mapped to no movement. A classifier decision was made every 250 ms.
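A minimal sketch of this EMG decision rule is shown below. The per-channel windowing and the behavior when both muscles exceed threshold (treated as no movement here) are assumptions; the thresholds are the manually set per-subject values described above.

    import numpy as np

    def classify_emg(flexor, extensor, t_flex, t_ext):
        """Full-wave rectify each EMG window, then threshold its standard deviation."""
        flexing = np.abs(flexor).std() > t_flex      # manually set per-subject threshold
        extending = np.abs(extensor).std() > t_ext
        if flexing and not extending:
            return +1    # flexion -> closing of the hand
        if extending and not flexing:
            return -1    # extension -> opening of the hand
        return 0         # neither (or both) active -> no movement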
Fig. 2. EEG Data. A) Raw EEG data for channels C3 and C4 during a single trial. C3 and C4 correspond to EEG-electrode positions over the left and right motor regions of the brain, respectively. B) Time-frequency plots for EEG activity from C3 and C4 averaged across all trials for the left hand motor imagery task without haptic feedback. The dashed vertical line indicates the time of cue presentation, and the two solid horizontal lines indicate the range of frequencies (7-14 Hz) used to capture the mu band power. The color bar represents spectral power in dB, with blue regions indicating a decrease in mu band power (desynchronization). Event-related desynchronization (ERD) is observed in the EEG from C4 since the right (contralateral) hemisphere responds to left hand motor imagery.
F. Local Machine Control and Haptic Feedback

Intelligent grasping of objects was achieved using both force feedback from the FlexiForce sensor and angular feedback from the Hall-effect sensor. Once initial contact was made, the motor driver remained on for only a short period of time afterwards to achieve a stable grip. This level of control was known as local machine control.
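A sketch of this post-contact behavior follows. Here read_force, motor_close, and motor_stop are hypothetical driver callbacks, and the contact threshold and post-contact drive time are assumed values, not parameters reported in the paper.

    import time

    CONTACT_N = 0.5       # contact-detection force threshold (assumed units)
    POST_CONTACT_S = 0.3  # brief motor-on period after contact (assumed)

    def close_with_local_control(read_force, motor_close, motor_stop, step=0.01):
        """Drive the hand closed; after first contact, keep driving only briefly."""
        while read_force() < CONTACT_N:            # free closing under cortical command
            motor_close()
            time.sleep(step)
        t0 = time.time()
        while time.time() - t0 < POST_CONTACT_S:   # short post-contact drive
            motor_close()                          # settles into a stable grip
            time.sleep(step)
        motor_stop()   # hold without further cognitive input from the user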
Visual haptic feedback of force was provided via a bar graph display running in LabView 8.0 (National Instruments). Vibrotactile feedback was provided via a commercial C2 tactor from Engineering Acoustics, Inc. (Winter Park, FL, USA). The vibratory stimulus waveform was a pulse-width modulated series of discrete pulses: shorter, more rapid pulses led to an increase in stimulus intensity, and longer, less rapid pulses led to a decrease in stimulus intensity. A 200 Hz carrier frequency was used in order to stimulate high-frequency Pacinian skin mechanoreceptors [20]. The tactor was placed on the upper arm region because this area has higher spatial discrimination than the torso or back, and because this location has been used in previous studies testing haptic feedback with prostheses [21-23].

Traditional BCI platforms rely on visual feedback in the form of a moving point or bar [24] to relay biofeedback of the subject's neural activity. However, since the goal of this study was to determine feasibility for a prosthetic application, subjects used direct visual feedback of the prosthetic hand opening and closing as biofeedback of their EEG activity. After grasping an object, the vibrotactile and visual haptic feedback provided further information on grasping force, which allowed the user to maintain awareness of the prosthesis.
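The following sketch illustrates how measured grip force might be mapped to the pulse-width modulated drive described above: higher force yields shorter, more rapid pulses on the 200 Hz carrier. The force range, output sample rate, and pulse timing constants are assumptions for illustration only.

    import numpy as np

    FS = 8000           # output sample rate (assumed)
    CARRIER_HZ = 200    # carrier chosen to target Pacinian mechanoreceptors

    def tactor_burst(force, f_max=10.0, duration=0.5):
        """Return one PWM burst: higher force -> shorter, more rapid pulses."""
        level = np.clip(force / f_max, 0.0, 1.0)
        period = 0.40 - 0.30 * level     # s between pulse onsets (assumed range)
        width = 0.6 * period             # pulse-on fraction of each period
        t = np.arange(int(duration * FS)) / FS
        gate = (t % period) < width      # pulse-width modulation envelope
        return gate * np.sin(2 * np.pi * CARRIER_HZ * t)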
G. Study Design

As shown in Fig. 3, each experiment trial consisted of two distinct stages, whereby the context of the situation changed. In the first stage, following a variable Rest period of 3-8 s, the subject was presented with an auditory Ready cue. After a Ready period of 1 s, the subject was presented with an auditory cue to either Open the prosthesis fully or to Close the prosthesis to grip an object securely. 75% of the cues were Close cues, while the remaining 25% were Open cues. The trial ended successfully if the subject a) opened the hand fully and maintained this position for one second, or b) grasped the object and maintained the hold for one second without slipping or crushing. A hold required the subject to maintain the force between minimum and maximum force thresholds that were empirically determined for the object. The trial ended with failure if the subject completed the wrong task or did not successfully complete the task within the allotted 15 s timeout period. If the subject failed the trial or successfully completed the Open task, a new trial was started, beginning with the initial Ready period. If the subject successfully completed the Close task, the trial progressed to the second stage. Fig. 4 illustrates the sequence of cues for each trial.

In the second stage, the prosthesis was already grasping an object and the user was presented with an auditory Ready cue. Following the Ready period of 1 s, the subject was presented with an auditory cue to either Release the object, Crush the object, or continue to Hold the object. The cues were presented with equal probability. The trial ended successfully if the subject a) released the object, b) maintained a hold on the object for 15 s within the previously defined force range, or c) crushed the object and maintained the crush for one second. As with the hold, a crush required the subject to maintain the force between different minimum and maximum force thresholds that were empirically determined for the object. The trial ended with failure if the subject completed the wrong task or did not successfully complete the task within the allotted 15 s timeout period. In both cases of success and failure, a new trial started, beginning with the initial Rest period of the first stage.
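The two-stage protocol can be summarized as a small state machine, sketched below. Only the cue logic is encoded; run_stage is a hypothetical callback standing in for the control loop and success criteria (force window, 1 s hold, 15 s timeout) described above.

    import random
    import time

    def run_trial(run_stage, timeout=15.0):
        """One trial of the two-stage protocol; run_stage(cue, timeout) -> bool."""
        time.sleep(random.uniform(3.0, 8.0))                   # variable 3-8 s Rest period
        cue1 = "Close" if random.random() < 0.75 else "Open"   # 75% Close, 25% Open cues
        if not run_stage(cue1, timeout):                       # wrong task or timeout
            return "failure"
        if cue1 == "Open":
            return "success"                                   # Open ends the trial
        cue2 = random.choice(["Release", "Hold", "Crush"])     # stage 2, equal probability
        return "success" if run_stage(cue2, timeout) else "failure"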
Fig. 4. Timing diagram for each trial of the first stage. Each trial starts with a variable 3-8 s rest period, followed by an auditory Ready cue. After 1 s, the grasping command cue is given. The maximum length of each trial is 15 s.

The effectiveness of this platform was systematically tested to investigate how it compared with conventional myoelectric prosthetic control, as well as to assess how changes in the haptic feedback affected user performance. Experiment A used an EMG control system with only direct visual feedback of the hand. Experiment B used an EEG control system with direct visual feedback of the hand, as well as local machine control for intelligent grasping. Experiments C and D also used an EEG control strategy with local machine control, but the user was now additionally presented with visual and vibrotactile haptic feedback of force, respectively.
III. RESULTS

Accuracy data from a single subject are presented in Table I, separated by stage, cue type, and experiment type. This subject was an experienced user with over 30 hours of previous BCI training.

Table I presents the accuracy results for stage 1 trials (Open and Close cues). The results indicate that the user had comparable accuracy for the EEG-based system with local machine control (87%) as with the EMG-based system (80%). Local machine control allowed the subject to focus on the task without further cognitive input once the object was touched, thus making him more confident of activating the hand without the risk of crossing the force threshold and crushing the object. These advantages allowed the subject to demonstrate a fast and secure grasp using only cortical control. Furthermore, the subject had similar accuracy for the EEG trials with haptic feedback (average 78%) as with EEG trials without haptic feedback (87%). This was expected, since haptic feedback plays no assistive role when the object is not being grasped.

Table II presents the accuracy results for stage 2 trials (Release, Hold, and Crush cues). Once again, the results indicate that the user had comparable accuracy for the EEG-based system with local machine control (45%) as with the EMG-based system (50%). The results also show significant variation in accuracy depending on the modality of haptic feedback, with the subject performing better using visual haptic feedback (67%) than with vibrotactile haptic feedback (35%). Although the intention of both haptic feedback channels was to provide more accurate force information than could be obtained visually, the nature of the feedback appeared to affect the subject's ability to concentrate and maintain the desired motor imagery task. This was especially evident in the Hold phase, as the vibratory stimulus appeared to impede the subject's ability to relax and maintain grip force. The subject likely performed better with visual haptic feedback because this modality is more precise and did not interfere with the subject's motor imagery.

TABLE I
RESULTS FOR STAGE 1

                     Open    Close   Total
EMG (A)              100%    73%     80%
EEG (B)              50%     100%    87%
EEG & Visual (C)     72%     78%     76%
EEG & Vibro (D)      43%     95%     82%

Accuracy results from an experienced BCI subject for stage 1 trials, separated by cue type and testing mode.

TABLE II
RESULTS FOR STAGE 2

                     Release  Hold    Crush   Total
EMG (A)              100%     100%    25%     50%
EEG (B)              50%      50%     33%     45%
EEG & Visual (C)     100%     50%     57%     67%
EEG & Vibro (D)      67%      13%     33%     35%

Accuracy results from an experienced BCI subject for stage 2 trials, separated by cue type and testing mode.
IV. DISCUSSION

The integrated BCI platform presented here demonstrates how a high-level cortical control signal can be coupled with local machine control and context-sensitive functionality to enable a user to perform complex grasping actions with a prosthetic hand. Furthermore, haptic feedback channels are utilized to implement biofeedback and improve user awareness of object interaction. The development of effective closed-loop control systems tailored to the low-bandwidth nature of BCI signals could provide more effective and intuitive control of prosthetic devices. Although the subject began the experiment by performing motor imagery tasks, he eventually learned how to change his brain activation to achieve good control of the prosthetic hand without the need to perform the previous motor imagery tasks.
Further experiments will need to be completed with multiple subjects to determine whether the trends visible in this subject's accuracies are statistically significant. Latency and trajectory data will also be examined to look for significant differences between the testing modes. Lastly, the choice of haptic feedback will need to be systematically evaluated in the context of the BCI paradigm employed.

Although the information transfer bit-rate of an EEG-based BCI is lower than that of direct recordings of single-unit or multi-unit activity from neuron populations [25], the noninvasive approach provides certain advantages for systems design and development. For example, BCI systems can be employed on multiple healthy subjects to test new algorithms, control strategies, etc., with minimal safety concerns, whereas more invasive Brain-Computer Interfaces have had limited human subject experience, unknown long-term effects from chronic implantation, and ethical considerations [26-27].
ACKNOWLEDGMENT

The authors thank Dongwon Lee for his hard work and dedication towards development of the BCI platform. The authors would also like to thank Mihaela Ungureanu for her assistance with the EEG signal processing.
REFERENCES

[1] L. J. Trejo, R. Rosipal, and B. Matthews, "Brain-computer interfaces for 1-D and 2-D cursor control: designs using volitional control of the EEG spectrum or steady-state visual evoked potentials," IEEE Trans Neural Syst Rehabil Eng, vol. 14, pp. 225-229, 2006.
[2] T. M. Vaughan, D. J. McFarland, G. Schalk, W. A. Sarnacki, D. J. Krusienski, E. W. Sellers, and J. R. Wolpaw, "The Wadsworth BCI Research and Development Program: at home with BCI," IEEE Trans Neural Syst Rehabil Eng, vol. 14, pp. 229-233, 2006.
[3] T. M. Vaughan and J. R. Wolpaw, "The Third International Meeting on Brain-Computer Interface Technology: making a difference," IEEE Trans Neural Syst Rehabil Eng, vol. 14, pp. 126-127, 2006.
[4] A. Abbott, "Neuroprosthetics: in search of the sixth sense," Nature, vol. 442, pp. 125-127, 2006.
[5] A. B. Schwartz, X. T. Cui, D. J. Weber, and D. W. Moran, "Brain-controlled interfaces: movement restoration with neural prosthetics," Neuron, vol. 52, pp. 205-220, 2006.
[6] V. S. Nelson, K. M. Flood, P. R. Bryant, M. E. Huang, P. F. Pasquina, and T. L. Roberts, "Limb deficiency and prosthetic management. 1. Decision making in prosthetic prescription and management," Arch Phys Med Rehabil, vol. 87, pp. S3-9, 2006.
[7] R. A. Roeschlein and E. Domholdt, "Factors related to successful upper extremity prosthetic use," Prosthet Orthot Int, vol. 13, pp. 14-18, 1989.
[8] D. M. Taylor, S. I. Tillery, and A. B. Schwartz, "Direct cortical control of 3D neuroprosthetic devices," Science, vol. 296, pp. 1829-1832, 2002.
[9] L. R. Hochberg, M. D. Serruya, G. M. Friehs, J. A. Mukand, M. Saleh, A. H. Caplan, A. Branner, D. Chen, R. D. Penn, and J. P. Donoghue, "Neuronal ensemble control of prosthetic devices by a human with tetraplegia," Nature, vol. 442, pp. 164-171, 2006.
[10] J. M. Carmena, M. A. Lebedev, R. E. Crist, J. E. O'Doherty, D. M. Santucci, D. F. Dimitrov, P. G. Patil, C. S. Henriquez, and M. A. Nicolelis, "Learning to control a brain-machine interface for reaching and grasping by primates," PLoS Biol, vol. 1, pp. E42, 2003.
[11] J. R. Wolpaw and D. J. McFarland, "Multichannel EEG-based brain-computer communication," Electroencephalogr Clin Neurophysiol, vol. 90, pp. 444-449, 1994.
[12] J. A. Pineda, D. S. Silverman, A. Vankov, and J. Hestenes, "Learning to control brain rhythms: making a brain-computer interface possible," IEEE Trans Neural Syst Rehabil Eng, vol. 11, pp. 181-184, 2003.
[13] J. R. Wolpaw and D. J. McFarland, "Control of a two-dimensional movement signal by a noninvasive brain-computer interface in humans," Proc Natl Acad Sci USA, vol. 101, pp. 17849-17854, 2004.
[14] K. A. Kaczmarek, J. G. Webster, P. Bach-y-Rita, and W. J. Tompkins, "Electrotactile and vibrotactile displays for sensory substitution systems," IEEE Trans Biomed Eng, vol. 38, pp. 1-16, 1991.
[15] G. F. Shannon, "Sensory feedback for artificial limbs," Med Prog Technol, vol. 6, pp. 73-79, 1979.
[16] G. F. Shannon, "A comparison of alternative means of providing sensory feedback on upper limb prostheses," Med Biol Eng, vol. 14, pp. 289-294, 1976.
[17] C. Guger, H. Ramoser, and G. Pfurtscheller, "Real-time EEG analysis with subject-specific spatial patterns for a brain-computer interface (BCI)," IEEE Trans Rehabil Eng, vol. 8, pp. 447-456, 2000.
[18] H. Ramoser, J. Muller-Gerking, and G. Pfurtscheller, "Optimal spatial filtering of single trial EEG during imagined hand movement," IEEE Trans Rehabil Eng, vol. 8, pp. 441-446, 2000.
[19] R. Bos, S. de Waele, and P. M. T. Broersen, "Autoregressive spectral estimation by application of the Burg algorithm to irregularly sampled data," IEEE Trans Instrum Meas, vol. 51, pp. 1289, 2002.
[20] V. B. Mountcastle, R. H. LaMotte, and G. Carli, "Detection thresholds for stimuli in humans and monkeys: comparison with threshold events in mechanoreceptive afferent nerve fibers innervating the monkey hand," J Neurophysiol, vol. 35, pp. 122-136, 1972.
[21] S. G. Meek, S. C. Jacobsen, and P. P. Goulding, "Extended physiologic taction: design and evaluation of a proportional force feedback system," J Rehabil Res Dev, vol. 26, pp. 53-62, 1989.
[22] P. E. Patterson and J. A. Katz, "Design and evaluation of a sensory feedback system that provides grasping pressure in a myoelectric hand," J Rehabil Res Dev, vol. 29, pp. 1-8, 1992.
[23] R. W. Cholewiak and A. A. Collins, "Vibrotactile localization on the arm: effects of place, space, and age," Percept Psychophys, vol. 65, pp. 1058-1077, 2003.
[24] B. Blankertz, K.-R. Müller, G. Curio, T. M. Vaughan, G. Schalk, J. R. Wolpaw, A. Schlögl, C. Neuper, G. Pfurtscheller, T. Hinterberger, M. Schröder, and N. Birbaumer, "The BCI Competition 2003: progress and perspectives in detection and discrimination of EEG single trials," IEEE Trans Biomed Eng, 2004.
[25] G. Santhanam, S. I. Ryu, B. M. Yu, A. Afshar, and K. V. Shenoy, "A high-performance brain-computer interface," Nature, October 2006.
[26] S. Rodota and R. Capurro, "Ethical aspects of ICT implants in the human body," European Group on Ethics in Science and New Technologies to the European Commission, Opinion No. 20, March 2005.
[27] E. C. Leuthardt, G. Schalk, D. Moran, and J. G. Ojemann, "The emerging world of motor neuroprosthetics: a neurosurgical perspective," Neurosurgery, vol. 59, pp. 1-14, July 2006.