Brain-Operated Assistive Devices: the ASPICE Project *

F. Cincotti 1,#, F. Aloise 1,2, F. Babiloni 1,3, M. G. Marciani 1,4, D. Morelli 1, S. Paolucci 1, G. Oriolo 5, A. Cherubini 5, S. Bruscino 5, F. Sciarra 6, F. Mangiola 6, A. Melpignano 7, F. Davide 7, D. Mattia 1

1 Fondazione Santa Lucia IRCCS, Roma, Italy; 2 Dip. di Elettronica, Informatica e Sistemistica, Univ. della Calabria, Rende (CS), Italy; 3 Dip. di Fisiologia Umana, Univ. “La Sapienza”, Roma, Italy; 4 Dip. di Neuroscienze, Univ. “Tor Vergata”, Roma, Italy; 5 Dip. di Informatica e Sistemistica, Univ. “La Sapienza”, Roma, Italy; 6 Unione Italiana Lotta alla Distrofia Muscolare, Sezione del Lazio, Roma, Italy; 7 Telecom Italia Learning Services, Roma, Italy
Abstract – The ASPICE project aims at the development of a system that allows neuromotor-disabled persons to improve or recover their mobility (directly or by emulation) and communication within the surrounding environment. The system pivots around a software controller running on a personal computer, which offers the user an interface for communication through input devices matched to the individual’s residual abilities. The system uses the user’s input to control domotic devices – such as remotely controlled lights, TV sets, etc. – and a Sony AIBO robot.
At this time, the system is under clinical validation, which will provide assessment through patients’ feedback as well as guidelines for customized system installation.
Index Terms – Technologies for Independent Life,
Brain-Computer Interfaces, Robotic Navigation,
Ambient Intelligence, Severe Motor Impairment.
I. INTRODUCTION
The ultimate objective of medical care or treatment is recovery from disease or, alternatively, improvement of the clinical symptomatology proper to the disease. In the field of rehabilitation, the main goal is the reduction of the disability provoked by a pathological condition, that is, the achievement of the maximum independence attainable within a given clinical frame, by means of orthoses and by the management, through different types of aids, of the social disadvantage related to the disability.
Recently, the development of electronic devices capable of improving communication and the management of the home environment has opened new avenues for patients affected by severe movement disorders with preserved cognitive functions. These devices, however, still require some residual motor ability, which may prevent patients with certain pathological conditions from using them.
The knowledge exists today to deliver a cutting-edge technological and scientific result in a way that lets the largest possible part of the population benefit from it.

* This work is partially supported by the Italian Telethon Foundation Grant GUP03562 to the institutions: Fondazione Santa Lucia, Dip. di Informatica e Sistemistica - Università La Sapienza, Telecom Italia Learning Services.
# Corresponding author: Febo Cincotti, Fondazione Santa Lucia IRCCS, via Ardeatina 306, I-00179 Roma, Italy. f.cincotti@hsantalucia.it
The project offered the opportunity to integrate into a
prototype the technologies described in the three following
sections, in order to prove that an application in everyday life is possible, with particular attention to people who suffer from diseases that affect their mobility.
A. Brain-Computer Interfaces
“Brain–computer interfaces (BCI’s) give their users
communication and control channels that do not depend on
the brain’s normal output channels of peripheral nerves
and muscles.”
[1]. This is the most widely accepted definition of the so-called BCI’s. In other terms, a BCI can detect the activation patterns of the brain; whenever the user induces a voluntary modification of these patterns, the interface detects it and translates it into an action associated with the user’s will.
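As a concrete illustration of this translation principle, the sketch below maps a voluntary decrease of mu-band (8–12 Hz) power, the kind of pattern exploited by several of the systems reviewed next, onto a discrete command. It is a minimal sketch, not the pipeline of any specific system; the band limits and threshold are assumptions for illustration.

```python
import numpy as np

def mu_band_power(eeg, fs, lo=8.0, hi=12.0):
    # Power of an EEG segment in the mu band (8-12 Hz), via the FFT.
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    return spectrum[(freqs >= lo) & (freqs <= hi)].sum()

def translate(eeg, fs, threshold):
    # A voluntary drop in mu power (event-related desynchronization)
    # is interpreted as the user's "select" intention; otherwise idle.
    # The threshold would be calibrated per user during training.
    return "select" if mu_band_power(eeg, fs) < threshold else "idle"
```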
Recent experiments have shown the possibility of using brain electrical activity to directly control the movement of robots or prosthetic devices in real time [2]-[5]. Several groups have gathered experience with BCI technology. In the USA, the group of Wolpaw has developed a BCI based on variations of EEG rhythmic activity, captured by means of an electrode cap, to control the movement of a cursor on a computer screen [6]-[8]. In Europe, the BCI system developed by the group of Pfurtscheller (Graz University) relies on an array with a high number of electrode leads located over the scalp primary motor areas; they demonstrated that the variation of background EEG activity underlying motor imagery can control a prosthetic device for limited hand movements in a tetraplegic subject [9]-[10]. A different approach has been proposed by Birbaumer et al. (Tübingen University), utilizing a BCI based on slow cortical potentials that can open a communication channel with “locked-in” patients [11]-[13]. Finally, Millán’s group (IDIAP Research Center, Switzerland) has pioneered the use of BCI’s to manipulate robots: a pocket-sized wheeled robot, a stand-in for a smart wheelchair, is directed to navigate its way through the rooms of a model house [14].
Many other research groups, more than can be reviewed here, have contributed to the advancement of the field. However, as emerges from this concise review, control tasks based on human EEG have so far addressed simple applications, such as moving a computer cursor [6], opening a hand orthosis [10], or driving a miniaturized wheeled robot [14]. Beyond these pioneering approaches, it is conceivable to extend the communication between the disabled person and the external environment towards mobility and interaction. In particular, the recognition of mental activity will be put forward to guide devices (e.g., an electronic wheelchair) or to interact naturally with common devices in the external world (telephone, switches, etc.). This latter application of BCI technology has not been explored yet, and it represents the ultimate objective of this project.
B. Robotics
The possibility of taking advantage of robotic technologies in the present research project stems from the fact that, in the last decades, the morphology of robots has undergone a remarkable mutation: from the fixed-base industrial manipulator, it has evolved into a variety of mechanical structures characterized by the fact that the robot is capable of locomotion, either on wheels or on legs. This ability has largely increased the domain of application of robots, once limited to the traditional factory environment, to encompass a number of different situations, including material and goods transportation, assistance to hospital patients and disabled people, automatic surveillance, space exploration and many others [15].
Navigation systems for sensor-based robot motion have
made impressive advancements in recent years. Along the
way, it has been necessary to address a number of
theoretical and technological issues, such as:
- perception: the robot should be able to acquire a reliable local map of the surroundings using its sensory system [16];
- self-localization: to execute a given task, a precise estimate of the robot location in a world coordinate frame must be maintained [17];
- obstacle avoidance: unexpected or moving obstacles should be avoided by appropriate reactive manoeuvres [18];
- motion planning: on the basis of an environment map (local or global), the robot must plan movements leading to the destination, safely and efficiently [19];
- motion control: the accurate execution of movements (essential whenever the robot entrusts its own integrity to the sensory system) requires the development of effective feedback controllers whose performance must be robust with respect to perturbations [20].
While solutions are available for all the above problems, there is a gap to be filled concerning the application envisaged in the present project. In fact, the limited set of low-rate, high-level commands received from the user through the BCI must be integrated by an intelligence layer so as to guarantee safe and efficient task execution.
C. Assistive Technology
Assistive technology device means any item, piece of
equipment, or product system, whether acquired
commercially off-the-shelf, modified, or customized, that
is used to increase, maintain, or improve the functional
capabilities of an individual with a disability. Assistive
devices can enhance the ability of an individual to perform
everyday life activities, including interaction with the
home environment.
Most homes today have appliances that allow for some
degree of remote control - TV and hi-fi sets, air
conditioning, alarm, etc. Domotics integrates and extends
this ability throughout the house. A house with a domotics
system probably will have at least one computer that will
allow the homeowner to control different applications in
various parts of the house remotely. A house that is
equipped with a domotic system likely will have the ability
to call the police or fire department by itself, unlike normal
alarm systems. Domotic systems are often able to automatically gather data from several sensors and perform such actions as adjusting lights, pulling back curtains, and lifting window blinds without physical interaction. Also, the user can remotely open, unlock, or lock doors and gates, control the indoor temperature, and set lights to go off and doors to lock, all with the touch of a button. With a domotic system, you can even have your PC screen or TV set act as a home monitoring system, so that if someone is at the front door, you can see who it is without going there. If you want to check what is going on in another room, you can do so, provided that a compatible video camera is installed.
Though potentially useful for the disabled, those
systems are not always designed to include the needs of
this part of the population.
II. OVERVIEW OF THE ASPICE PROJECT
The ASPICE project (Assistive System for Patient’s
Increase of Communication, ambient control and mobility
in absence of muscular Effort) has received in 2004 a
renewable two-year funding grant from the Italian medical research charity foundation Telethon.

Fig. 1 Outline of the architecture of the ASPICE project. The figure shows that the system interfaces the user to the surrounding environment. Modularity is assured by the use of a core unit that takes input from one of the possible input devices (standard input devices or a Brain-Computer Interface) and sends commands to one or more of the possible actuators (domotic appliances, mobilization devices). Feedback is provided to keep the user informed about the status of the system.

The project
involves three partners, namely the Clinical
Neurophysiopathology Laboratory at the Fondazione Santa
Lucia IRCCS, the Robotics Laboratory of the Dipartimento di Informatica e Sistemistica of the University of
Rome “La Sapienza” and Telecom Italia Learning Services
S.p.A.
The project is aimed at the development of a
technological aid which allows neuromotor disabled
persons to improve or recover their mobility and
communication within the surrounding environment. This
aim is particularly addressed towards those stages of the disease in which the residual muscular strength, if present, may not be adequate for the utilization of conventional aids, and towards those conditions in which practical obstacles or safety concerns could prevent a displacement from bed.
The reduction of the patients’ independence involves a consequent increase in the caregivers’ workload. Nowadays,
Information Technology offers the chance to develop
devices which, if correctly integrated, allow relief from the
described limitations. The aid is being developed by
integrating the expertise of the partners of the project. The
key elements of the system are:
1) interfaces for easy access to a computer: mouse, joystick, eye tracker, voice recognition, up to the utilization of signals collected directly but non-invasively from the Central Nervous System (BCI);
2) controllers of intelligent motion devices which
can follow complex paths, based on a small set of
commands (Robotics);
3) information transmission and domotics,
establishing an information flow between patient and
controlled appliances, minimizing structural modifications
of the house (Domotics).
The ASPICE architecture, with its input and output
devices, is outlined in Fig. 1.
III. ACHIEVEMENTS OF THE PROJECT
At this stage of the project, a prototype of the system has been implemented and is available at the Fondazione Santa Lucia for validation with patients.
In fact, a three-room space in the hospital has been
furnished like a common house, and the actuators of the
system have been installed afterwards. Care has been taken
to make an installation that would be easily replicable in
most houses. The place has been provided with a portable
computer to run the core program, and several aids (input
devices) are available to cope with as many categories of
users as possible.
A. Input Devices
The system input devices are customized to the users’ residual motor abilities. In fact, users can utilize the aids they are already familiar with, which have been interfaced to provide low-level input to a more sophisticated assistive device. On the other hand, the variety of input devices provides robustness against the worsening of the patients’ abilities, which is a typical consequence of degenerative diseases.
The software implementation of this modular approach benefited from the use of the ICon package [21].
ICon is an editor designed to select a set of input devices and connect them to actions in a graphical interactive application. ICon allows physically challenged users to connect alternative input devices and/or configure their interaction techniques according to their needs. With the help of a technician for the configuration, it allows users to configure the ASPICE application to use their favorite input devices and interaction techniques.
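The routing idea behind such a configuration can be illustrated with the sketch below. It is a hypothetical table, not ICon’s actual API: logical input events are bound to application actions, so a device can be swapped without touching the application itself.

```python
# Hypothetical routing table in the spirit of ICon: logical input
# events are bound to application actions. All names are illustrative.
ACTIONS = {
    ("joystick", "up"): "focus_previous",
    ("joystick", "down"): "focus_next",
    ("button", "pressed"): "select",
    ("speech", "lights on"): "lights_on",
}

def route(device, event):
    # Return the application action bound to this device event, if any.
    return ACTIONS.get((device, event))

print(route("button", "pressed"))  # select
```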
A non-exhaustive list of input devices that have been successfully included follows:
- Keyboard and mouse. Of course these are the most widely used input devices, and all users who are able to use them are encouraged to do so.
Fig. 2 Examples of input devices that have been interfaced to the ASPICE system. From the top left, clockwise: a keyboard-mouse pair, a special joystick, a “leaf” button (can be operated with the neck), three pushbuttons (usually operated with the hand). In the background, the Graphical User Interface of the ICon package, which seamlessly connects several types of inputs to an ICon-aware module of ASPICE.
- Joystick, rollerball. If the loss of muscular strength makes the use of keyboard and mouse uncomfortable for the patient, or if spastic movements impede their correct use, devices like joysticks and rollerballs may prove more appropriate.
- Touchpad. For those patients whose strength is too low to allow lifting the arms, the touchpad (like the ones available on many laptop computers) allows control of the cursor with the movement of one finger.
- Head Tracker. If the motor disability severely impairs the limbs but the neck muscles are preserved, the user can use this device to control the cursor with movements of the head.
- Microphone. Some users have well-preserved speech, which may prove to be the most reliable way to send commands. A speech recognition engine translates phrases into commands for the core module.
- Joypads and Buttons. Joypads can be used to move the cursor in a selected direction, much like a joystick. Buttons can be used to send a very simple command – like “select” – or small sets of commands – like “move forward” and “select”, if the patient can use two buttons. This kind of input must be used in conjunction with an appropriate feedback strategy, such as a self-scanning set of icons. Buttons come in the most diverse fashions and can be actuated with the most disparate strategies (hand, elbow, chin, neck, teeth, puff), according to what is most effective for the individual user.
Fig. 3 Appearance of the feedback screen. The Feedback application has been instructed to divide the window into three panels. In the top panel, the available selections (commands) appear as icons. In the bottom right panel, a feedback stimulus by the BCI (matching the one the subject has been training with) is provided; the user uses his learnt modulation of brain activity to move the cursor at the center to hit either the left or the right bar – in order to focus the previous or following icon in the top panel – or to hit the top bar – to select the current icon. In the bottom left panel, the Feedback module displays the video stream from the video camera that was chosen beforehand in the operation.
When the user is not able to use any of the above mentioned devices, or when a degenerative disease makes it likely that in the future he/she will no longer be able to, the support team proposes that he/she start training in the use of a BCI.
The BCI training is performed in a laboratory
environment, through the use of the BCI2000 software
package
[22] and clinical EEG equipment. When the user
shows that he/she is able to select reliably one out of two
or four possible items in this simplified environment, the
BCI is connected to the Aspice system, and the user is
asked to utilize the button interface (see Feedback section
below) using the BCI control that he/she best masters.
B. Core Operation
The system core receives the logical signals from the
input devices and converts them into commands that can
be used to drive the output devices.
Its operation is organized as a hierarchical structure of
possible actions, whose relationship can be static or
dynamic. In the static configuration, it behaves as a
“cascaded menu” choice system. An XML initialization file, containing information about the structure, is loaded upon login of the user, and is used to feed the Feedback module only with the options available at the moment (i.e. the current menu). It is the duty of the Feedback module to choose the most appropriate representation of the available choices (e.g. a text menu, a set of icons, a topographical representation of the commands, etc.).
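A minimal sketch of how such a cascaded-menu initialization file could look and be traversed is given below. The element and attribute names are illustrative assumptions, not the actual ASPICE schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical initialization file: element and attribute names are
# illustrative, not the actual ASPICE schema.
MENU_XML = """
<menu name="root">
  <menu name="TV">
    <item name="power" command="tv_power"/>
    <item name="volume up" command="tv_volume_up"/>
  </menu>
  <menu name="Lights">
    <item name="bedroom on" command="light_bedroom_on"/>
  </menu>
</menu>
"""

def current_options(node):
    # The Feedback module is fed only the options available at the
    # moment, i.e. the direct children of the current menu node.
    return [child.get("name") for child in node]

root = ET.fromstring(MENU_XML)
print(current_options(root))                           # ['TV', 'Lights']
print(current_options(root.find("menu[@name='TV']")))  # ['power', 'volume up']
```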
In the dynamic configuration, an intelligent agent tries to learn from use which choice the user will most probably make. This prediction can be made on the basis of (i) frequent sequences (after the user has turned the TV on, he most probably wishes to set the volume level); (ii) time of day (at twilight, the “light on” command has a higher probability); (iii) environmental information (if the temperature is high, the user will probably wish to turn the fan on); (iv) external events (if someone rings the door bell, the next command will almost surely be “open the door”).
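A toy version of criterion (i), a frequency-based next-command predictor, could look like the following sketch; the class and method names are hypothetical, and the time-of-day or sensor context of criteria (ii)-(iv) could be folded in as extra keys.

```python
from collections import Counter, defaultdict

class CommandPredictor:
    # Toy frequency-based predictor in the spirit of criterion (i):
    # rank the next command given the previous one.
    def __init__(self):
        self.transitions = defaultdict(Counter)

    def observe(self, previous, chosen):
        # Record that `chosen` followed `previous` once more.
        self.transitions[previous][chosen] += 1

    def most_probable(self, previous):
        # Return the historically most frequent follower, if any.
        counts = self.transitions[previous]
        return counts.most_common(1)[0][0] if counts else None

predictor = CommandPredictor()
predictor.observe("tv_on", "tv_volume")
predictor.observe("tv_on", "tv_volume")
predictor.observe("tv_on", "tv_channel")
print(predictor.most_probable("tv_on"))  # tv_volume
```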
Whenever the user selects an action that, rather than changing the internal context of the core (i.e. selecting a non-leaf item of the cascaded menu), instructs the system to undertake a physical action, the Control Unit fulfils the user’s demand by sending the appropriate control signals to the output appliances. Drivers are used to offer a homogeneous interface from the Control Unit’s point of view. In some cases, pre-existing drivers are utilized for the devices, whereas in other cases (e.g. the robotic platform) the driver has been designed ad hoc for the specific system.
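The driver abstraction could be sketched as follows. The interface and the X10 example are illustrative assumptions, not the project’s actual driver code.

```python
from abc import ABC, abstractmethod

class Driver(ABC):
    # Homogeneous interface seen by the Control Unit; each concrete
    # driver hides its transport (X10, infrared, robot protocol, ...).
    @abstractmethod
    def execute(self, command):
        ...

class X10LampDriver(Driver):
    def __init__(self, house_code, unit):
        self.address = f"{house_code}{unit}"

    def execute(self, command):
        # Placeholder: a real driver would talk to the powerline modem.
        print(f"X10 {command} -> module {self.address}")

X10LampDriver("A", 1).execute("on")
```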
C. Feedback
The user can select commands and monitor the system’s behavior through a Graphic Interface. As for all other modules, inter-module communication is transported via TCP/IP sockets; this is most important in the case of the Feedback module, because it means that this module can run on a different computer from the one that is running the Control Unit. While, as mentioned, this is true for all modules, the Feedback module can benefit from it the most, since a lighter, low-power computer such as a palmtop PC or even a smart phone can be used to give the subject the feedback he/she needs, while being of minimum burden for the user.
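A minimal sketch of such a socket-based Feedback client is shown below; the host, port, and newline-delimited JSON payload are assumptions for illustration, not the project’s actual wire format.

```python
import json
import socket

def feedback_loop(host="127.0.0.1", port=5000):
    # Connect to the control unit and display whatever options it
    # pushes; one JSON object per line is assumed as the wire format.
    with socket.create_connection((host, port)) as sock:
        stream = sock.makefile("r")
        for line in stream:
            message = json.loads(line)
            print("options to display:", message["options"])
```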
Figure 3 shows a possible appearance of the feedback screen. In this case, choices are pictured as button-shaped icons. This is possibly the simplest interface, and surely the most practical to operate with a reduced set of available input signals. In fact, in principle a user only needs to master a single signal, associated with the “select” action, leaving to the system the duty of moving the focus over each icon for a pre-determined interval of time (auto-scan mode).
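The auto-scan logic can be summarized by the following sketch, assuming a polled “select” signal and a fixed dwell time; both the callback and the timing are hypothetical parameters.

```python
import time

def auto_scan(icons, select_pressed, dwell=1.5):
    # Cycle the focus over the icons; a single "select" signal picks
    # the currently focused one. select_pressed() stands for any
    # boolean callback fed by the user's one available input signal.
    while True:
        for icon in icons:
            print("focused:", icon)
            deadline = time.monotonic() + dwell
            while time.monotonic() < deadline:
                if select_pressed():
                    return icon
                time.sleep(0.02)  # poll the input at ~50 Hz
```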
D. Actuators
The ASPICE system allows the user to remotely operate electric devices (e.g. TV, fan, lights) as well as to monitor the environment with remotely controlled video cameras. Moreover, a robotic platform can be controlled from the ASPICE Control Unit in order to accomplish a few simple tasks.
While input and feedback signals are carried over a wireless (Wi-Fi or Bluetooth) link, so that the mobility of the patient is minimally affected, most of the actuation commands (with the obvious exception of the robot) are carried via a powerline-based control system, namely X10.
X10 is a communications protocol that uses the house’s electrical wiring as the medium for remote control of electrical systems. It was chosen because many components are available at a reasonable price, which is an uncommon circumstance for assistive devices. We have tested the benefits and the limits of this technology, and we are exploring other possibilities to cope with situations in which X10 is not advisable.
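By way of illustration, an X10 module can be switched from a script; the sketch below shells out to the open-source heyu command-line tool, assuming a configured powerline interface such as a CM11A. This is one possible approach, not necessarily the driver used in ASPICE.

```python
import subprocess

def x10_switch(state, address):
    # Shell out to the heyu tool ("heyu on a1" turns on the module at
    # house code A, unit 1). Assumes heyu is installed and configured
    # with a powerline interface; ASPICE's own driver may differ.
    subprocess.run(["heyu", state, address], check=True)

x10_switch("on", "a1")
x10_switch("off", "a1")
```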
A non-exhaustive list of electric and electronic devices that have been successfully interfaced to the system (through either a general-purpose or a custom driver) is reported below.
- Ceiling neon lights. An X10 module has been installed in parallel with the wall switch. This required the intervention of a technician for a few minutes.
- Bulbs. Installation of X10 devices that are interposed between a bulb and its socket requires no technician.
- TV set and HiFi set. The ASPICE system is able to control any infrared remotely controlled (IRRC) appliance, thanks to its IRRC emulator. A radio-wave repeater allows control of appliances that are not in the same room as the Central Unit computer.
- Motorized mattress. Low-voltage motors were made controllable by interposing X10 modules between the tethered remote control and the motors. This required about two hours of work by a technician.
- Buzzer. An X10 device that emits an alert sound can be
placed in any room of the house (simply by plugging it
into a mains socket) and operated at will.
- Door opener. A radio controlled device that turns a key
in the keyhole has been interfaced with the system,
allowing the user to open any door. Its installation
requires no cabling or modification to the door.
- Network Cam. A motorized Wi-Fi network cam allows the user to watch a video stream covering a wide set of views, an ability that is usually among the first lost by a movement-impaired individual. Its installation is trivial, requiring only a nearby power source, and it can be configured in minutes.
Moreover, the Robotics Laboratory of the University of Rome “La Sapienza” has developed a robot navigation system based on a small set of commands, which has been interfaced with the ASPICE system. In fact, as previously mentioned, the system should cope with a variety of disabilities depending on the patients’ conditions. Therefore, three possible navigation modes have been designed for robot control. First, the autonomous mode, which uses high-level commands (e.g. “walk to the window”) to drive the robot, should be used by severely impaired patients who are unable to send frequent commands. Alternatively, depending on their residual abilities, patients can use a continuous directional joystick mode with basic obstacle avoidance, or a single-step directional joystick mode to control the robot.
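These three modes can be thought of as a simple dispatch on the active mode, as in the sketch below; the method names are hypothetical, not the actual AIBO driver API.

```python
class RobotStub:
    # Stand-in for the actual AIBO driver; the method names are
    # hypothetical, not the real ASPICE robot API.
    def navigate_to(self, place):
        print("autonomous goal:", place)

    def set_velocity(self, direction):
        print("continuous motion:", direction)

    def step(self, direction):
        print("single step:", direction)

def handle_command(mode, command, robot):
    # Route a user command according to the active navigation mode.
    if mode == "autonomous":
        robot.navigate_to(command)   # high-level goal, e.g. "window"
    elif mode == "continuous":
        robot.set_velocity(command)  # direction held, obstacle avoidance on
    elif mode == "single_step":
        robot.step(command)          # one discrete motion per selection

handle_command("autonomous", "window", RobotStub())
```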
IV. THE CLINICAL EXPERIMENTATION
The clinical experimentation has been carried out with the participation of subjects suffering from Spinal Muscular Atrophy type II (SMA II), Duchenne Muscular Dystrophy (DMD) and Amyotrophic Lateral Sclerosis (ALS). These neuromuscular diseases cause a severe global motor impairment which drastically reduces the subject’s autonomy and makes continuous assistance essential. The clinical experimentation has been taking place at the Santa Lucia Foundation, where patients have been admitted for a neurorehabilitation program; the whole clinical experimental procedure was approved by the local ethics committee.
Accordingly, all patients have been informed of the nature of the developed device and of the modalities of the clinical experimentation, and they gave their voluntary written informed consent. To this aim, we produced a booklet containing the project overview, and the clinicians performing patient inclusion have been trained with all the required technical information. Once the patients had decided to participate in the study, the first step of the clinical procedure consisted of an interview and physical examination performed by the clinicians, wherein several levels of the variables of interest (and possible combinations) were addressed as follows:
- the degree of motor impairment and of reliance on caregivers for everyday activities, as assessed by a current standardized scale (the Barthel Index (BI) for the ability to perform daily activities);
- the use of and familiarity with transducers and aids (sip/puff, switches, speech recognition, joysticks) that can be used as input to the ASPICE system;
- the ability to speak or communicate comprehensibly to an unfamiliar person (professional personnel have been devoted to monitoring language ability);
- the level of computer literacy, measured by the number of hours per week spent in front of a computer and by whether the patient works or used to work with a computer.
The second step consisted of the training with the
ASPICE system. The training sessions have been
integrated with the rehabilitation program as Occupational
Therapy. For a period of time ranging from 3 to 4 weeks,
the patient and (when required) her/his caregivers have
been practising with the ASPICE system, installed into a
house-like environment. During the whole period, though more intensively at its beginning, patients have had the assistance of an engineer and a therapist in their interaction with the system. This experience has been distilled into a manual of the system, for the benefit of users and installers, and into teaching guidelines that are available for less experienced trainers.
Data on two main aspects have been collected during
the experimentation:
- increase of the user’s independence
- reduction of caregivers’ workload.
The index of daily activities performed (i) at the beginning of the training and (ii) once the user masters the system has been recorded and analyzed to evaluate the degree of effectiveness (ability to produce the desired result) of the system. In this regard, the individual degree of motor impairment and the types of ASPICE inputs used have been taken into account in the system evaluation. Finally, a subjective index of satisfaction has been derived from a questionnaire administered both to the patients and to the caregivers.
V. CONCLUSIONS
The quality of life of an individual suffering from severe motor impairments is heavily affected by his or her complete dependence upon caregivers. An assistive device, even the most advanced, cannot substitute – at the current state of the art – for the assistance provided by a human. Nevertheless, it can help relieve the caregiver from a continuous presence in the patient’s room, since the patient can perform some simple activities alone and, most importantly, because the caregiver’s attention can be recalled by some form of alarm.
This means that, in a clinical environment, the cost of assistance can be reduced, since the same number of paramedics or assistants can care for a higher number of patients (in non-emergency conditions). In a home environment, the lives of family members can be less severely affected by the presence of the impaired relative.
Most importantly, the patient perceives that he or she no longer has to rely on the caregiver for each and every action. On one side this increases the patient’s sense of independence; on the other it grants a sense of privacy that is almost absent when another human has to provide care. For both reasons, the quality of life of the patient is considerably improved.
The ASPICE project innovates the concept of the assistive technology device, bringing it to a system level: the user is no longer given many devices to perform separate activities; rather, the system provides unified (though flexible) access to all controllable appliances. Moreover, we succeeded in including as many commercially available components in the system as possible, so that the affordability and availability of the components themselves are maximized.
The usefulness of the BCI-based interface has recently been investigated in other studies [23]. The improvement in quality of life brought by such an interface is expected to be relevant mainly for those patients who are not able to perform any voluntarily controlled movement. Advances in the BCI field are expected to increase the performance of this communication channel, thus making it effective for a broader population of individuals.
REFERENCES
[1] J.R. Wolpaw, N. Birbaumer, D.J. McFarland, G. Pfurtscheller, and T.M. Vaughan. “Brain–computer interfaces for communication and control”. Clin. Neurophysiol., 113:767–791, March 2002.
[2] J.K. Chapin, K.A. Moxon, R.S. Markowitz and M.A. Nicolelis. “Real-time control of a robot arm using simultaneously recorded neurons in the motor cortex”. Nat Neurosci, 2:664–670, 1999.
[3] J. Wessberg, C.R. Stambaugh, J.D. Kralik, P.D. Beck, M. Laubach, J.K. Chapin, J. Kim, J. Biggs, M.A. Srinivasan and M.A. Nicolelis. “Real-time prediction of hand trajectory by ensemble of cortical neurons in primates”. Nature, 408:361–365, 2000.
[4] G. Pfurtscheller and C. Neuper. “Motor imagery and direct brain-computer communication”. Proc IEEE, 89:1123–1134, 2001.
[5] M.D. Serruya, N.G. Hatsopoulos, L. Paninski, M.R. Fellows and J.P. Donoghue. “Brain-machine interface: Instant neural control of a movement signal”. Nature, 416:141–142, 2002.
[6] J.R. Wolpaw, D.J. McFarland, G.W. Neat, C.A. Forneris. “An EEG-
based brain–computer interface for cursor control”. Electroenceph
clin Neurophysiol;78:252–259, 1991.
[7] J.R. Wolpaw, N. Birbaumer, W.J. Heetderks, D.J. McFarland, P.H. Peckham, G. Schalk, E. Donchin, L.A. Quatrano, C.J. Robinson and T.M. Vaughan. “Brain–computer interface technology: a review of the first international meeting”. IEEE Trans Rehabil Eng, 8:161–163, 2000.
[8] J.R. Wolpaw, D.J. McFarland and T.M. Vaughan. “Brain–computer interface research at the Wadsworth Center”. IEEE Trans Rehabil Eng, 8:222–225, 2000.
[9] G. Pfurtscheller, D. Flotzinger, W. Mohl and M. Peltoranta. “Prediction of the side of hand movements from single-trial multi-channel EEG data using neural networks”. Electroencephalogr Clin Neurophysiol, 82(4):313–315, Apr. 1992.
[10] G. Pfurtscheller and C. Neuper. “Motor imagery and direct brain-
computer communication”. Proceedings of the IEEE, 89: 1123–1134,
2001.
[11] N. Birbaumer, T. Elbert, A.G.M. Canavan and B. Rockstroh. “Slow potentials of the cerebral cortex and behavior”. Physiol Rev, 70:1–41, 1990.
[12] N. Birbaumer, N. Ghanayim, T. Hinterberger, I. Iversen, B. Kotchoubey, A. Kübler, J. Perelmouter, E. Taub and H. Flor. “A spelling device for the paralyzed”. Nature, 398:297–298, 1999.
[13] N. Birbaumer, A. Kübler, N. Ghanayim, T. Hinterberger, J. Perelmouter, J. Kaiser, I. Iversen, B. Kotchoubey, N. Neumann and H. Flor. “The thought translation device (TTD) for completely paralyzed patients”. IEEE Trans Rehabil Eng, 8:190–192, 2000.
[14] J. del R. Millán, F. Renkens, J. Mouriño, and W. Gerstner. Non-
invasive brain-actuated control of a mobile robot by human EEG.
IEEE Trans. on Biomedical Engineering, 51:1026–1033, 2004.
[15] A. Zelinsky. “Field and Service Robotics”, Springer Verlag, 1997.
[16] G. Oriolo, G. Ulivi and M. Vendittelli. "Real-time map building and
navigation for autonomous robots in unknown environments". IEEE
Transactions on Systems, Man, and Cybernetics, vol. 28, no. 3, pp.
316-333, 1998.
[17] G. Dissanayake, P. Newman, S. Clark, H. Durrant-Whyte and M. Csorba. “A solution to the simultaneous localization and map building problem”. IEEE Transactions on Robotics and Automation, vol. 17, no. 3, pp. 229–241, June 2001.
[18] J.C. Latombe. “Robot Motion Planning”, Kluwer Academic
Publishers, Boston, 1991.
[19] J.P. Laumond. “Robot Motion Planning and Control”. Lecture Notes in Control and Information Sciences 229, Springer, 1998.
[20] G. Oriolo, A. De Luca and M. Vendittelli. "WMR control via
dynamic feedback linearization: Design, implementation and
experimental validation". IEEE Transactions on Control Systems
Technology, vol. 10, no. 6, pp. 835-852, 2002.
[21] P. Dragicevic and J.-D. Fekete. “Input Device Selection and Interaction Configuration with ICON”, Proceedings of IHM-HCI 2001, A. Blandford, J. Vanderdonckt and P. Gray (Eds.): People and Computers XV - Interaction without Frontiers, Lille, France, Springer Verlag, pp. 543-448.
[22] G. Schalk, D. McFarland, T. Hinterberger, N. Birbaumer and J. Wolpaw. “BCI2000: A general-purpose brain-computer interface (BCI) system”. IEEE Trans Biomed Eng, vol. 51, pp. 1034–1043, 2004.
[23] A. Kübler, F. Nijboer, J. Mellinger, T.M. Vaughan, H. Pawelzik, G. Schalk, D.J. McFarland, N. Birbaumer and J.R. Wolpaw. “Patients with ALS can use sensorimotor rhythms to operate a brain-computer interface”. Neurology, vol. 64, pp. 1775–1777, May 2005.