
A User-friendly Wearable Single-channel EOG-based Human-Computer Interface for Cursor Control

Authors: A. M. S. Ang, Z. G. Zhang, Y. S. Hung, J. N. F. Mak

Abstract: This paper presents a novel wearable
single-channel electrooculography (EOG) based
human-computer interface (HCI) with a simple system design
and robust performance. In the proposed system, EOG
signals are generated from double eye blinks, collected by a
commercial wearable device (the NeuroSky MindWave
headset), and then converted into a sequence of commands
that can control cursor navigations and actions. The
EOG-based cursor control system was tested on 8 subjects,
and the average accuracy is about 84% for indoor uses.
Compared with other EOG-based HCI systems, this system is
highly user-friendly and does not require any training.
Therefore, this system has the potential to provide an
easy-to-use and inexpensive assistive technique for locked-in
patients who have lost most voluntary muscular control but
retain adequate eye function.
Keywords: electrooculography, human-computer
interface, assistive devices, cursor control, augmentative and
alternative communication
I. INTRODUCTION
In recent years, numerous human-computer interface
(HCI) systems have been developed as assistive
technologies for improving the quality of life of people with
neuromuscular disabilities [6]. Examples of these assistive
technologies include specially designed joysticks,
infrared oculography, tongue-computer interfaces and
brain-computer interfaces [7-8]. Generally, all these HCI
systems serve as the bridge between the human and the
computer by translating or decoding the signals generated
from physiological processes into control commands.
Particularly, because the eyes and related facial muscles are
rarely affected by neuromuscular mobility impairments,
many HCI systems are developed by translating
electrooculography (EOG) signals generated by intended
actions of these intact organs to control commands. A
majority of existing EOG-HCI systems [1-4] rely on multiple
wet electrodes, because these can achieve a high
signal-to-noise ratio (SNR) of EOG and can provide more
discriminative information for recognizing more types of
eye activities. As a result, the characteristic structure of EOG
in the temporal or spatial domain can be extracted more
accurately with multiple wet electrodes, so these systems are
better able to classify different types of eye movements,
such as looking in different directions, resulting in a higher
system performance score.
Although HCI systems based on multi-channel EOG
normally perform well, they are not considered
user-friendly or practical. The overall quality of an
Research supported by a Hong Kong RGC GRF Grant (HKU
785913M)
A. M. S. Ang, Z. G. Zhang and Y. S. Hung are with the Department of
Electrical and Electronic Engineering, the University of Hong Kong,
Pokfulam Road, Hong Kong (email:angms@hku.hk;
zgzhang@eee.hku.hk; yshung@eee.hku.hk).
J.N.F. Mak is with NeuroSky Hong Kong, Science Park, Sha Tin, Hong
Kong, (e-mail: jmak@neurosky.com ).
assistive technology can be evaluated by two types of
factors: ergonomic factors and system performance
factors. System performance factors, such as the accuracy of
classifying signals encoded by different intentions, the false
alarm rate, the execution speed, and the information transfer
rate, are often regarded as the most important factors when
designing an HCI system, particularly in laboratory settings.
However, for practical applications for end users, the
ergonomic factors such as comfort, portability, cost
effectiveness, and training time are crucial, because
an HCI system ultimately has to be used by an end user for a
long period of time and in an easy-to-use manner.
Ergonomic factors receive less consideration in many existing
EOG-based HCI systems developed in the laboratory, and
therefore, these systems cannot provide a practical assistive
technique for users. For example, current HCI systems
based on EOG activities usually involve bulky hardware,
such as wiring and amplifiers, which is not user-friendly
for disabled people because of the higher setup and
preparation cost. Another important ergonomic factor is
the time users must spend on training. Most
EOG-based HCI systems require users to perform intensive
practice in a training session, or to memorize certain kinds of
eye movements for specific control commands, which is not
user-friendly and may not be achievable for patients with
cognitive impairments.
In this paper, we develop a novel EOG-based HCI
system which is aimed at maximizing both ergonomic
factors (usability, less training time, etc.) and system
performance factors (accuracy, information transfer rate,
etc.). The proposed system only uses EOG produced by one
type of eye activity, the double blink (DB), to encode users'
intentions. In the proposed system, EOG activities are
captured by a single-channel commercial headset, the
NeuroSky MindWave Mobile Headset (NeuroSky, CA,
US). The headset is basically a single-channel sensor with a
dry electrode made of stainless steel. The sensor is attached
to the forehead of the user to continuously collect electrical
signals generated by the brain and muscles. The
system then sends the collected signals wirelessly through a
Bluetooth communication protocol to a computer for
processing. The processing blocks consist of filtering,
activity detection and classification. The recording will first
be filtered to remove noise and any undesired components,
and then the continuous recording will go through an
activity detector to extract a short data segment containing
EOG activities. The extracted data segment will be analyzed
to yield a set of discriminative features to be fed into a
classifier. Finally the classification output is used to control
the mouse cursor for multiple applications in a computer.
The control of the cursor is based on the following
switching control scheme: (1) the mouse cursor will keep
moving either vertically (from top to bottom) or horizontally
(from left to right) on the screen at a constant speed if no
double blink is detected; (2) when the first double blink (i.e.,
a control command) is detected from the EOG, the cursor will
switch its moving direction (from vertical to horizontal,
or from horizontal to vertical); (3) when the second
double blink occurs, a click will be performed at
the current cursor location, and the mouse cursor will be
reset to the top-left corner of the screen and start moving again.
As compared with existing EOG-based HCI systems,
the advantages of the new system are threefold. First, it is
based on a commercial device, NeuroSky MindWave
headset, so that it has a high mobility and can be potentially
used in more scenarios. Second, the system only uses one
eye action, double blinks, to encode users’ intentions so that
the control strategy is simple and user friendly. Third,
because of the high inter-subject consistency of
double-blink EOG signals, the system is capable of
performing cross-subject decoding, which means that no
training is required for new users.
Unlike its multi-channel counterparts, the proposed
system has only a single electrode to collect information;
therefore, the proposed system is more sensitive to noise
and has a lower SNR. To handle this problem,
the system adopts advanced signal processing
methods (such as wavelet filtering and a support vector
machine) to boost the system performance.
The rest of the paper is organized as follows. Section
II describes the system architecture, data analysis methods and
implementation of the new EOG-based system.
The experimental results and discussion are given
in Section III. A summary is given in Section
IV.
II. METHODS
A. System Architecture
Fig. 1 shows the structural diagram of the proposed
system. EOG signals generated from the user are collected
by the NeuroSky MindWave Mobile headset at a
sampling rate of 512 Hz. The raw digital signals are packed
and then transmitted to a computer through the Bluetooth
communication protocol. Data packet parsing is then
performed in the computer to obtain the raw numerical
values of the signals. These raw signals then go through
pre-processing blocks, which include de-trending and
wavelet de-noising to remove noise and undesired
components. The filtered signals then pass through an
activity detector, which extracts the signal segments that contain
double blinks (or other major eye activities) and stores each
extracted segment in a 1-second buffer.
Subsequently, feature extraction is performed for each
buffer and the extracted features are fed into the SVM
classifier. All functional blocks are cascaded; once an
activity of no interest is detected, the system immediately
discards that segment to save computational resources.
Fig. 1. The cascade architecture of the whole system.
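As a rough illustration of the cascade in Fig. 1, the sketch below wires the stages together as injected callables and discards a chunk as soon as any stage rejects it. All function and variable names are hypothetical and not taken from the paper; this is a sketch of the data flow, not the authors' implementation.

```python
# Illustrative skeleton of the cascaded pipeline in Fig. 1.
# Each processing stage is passed in as a callable so the cascade itself stays simple.
import numpy as np
from scipy.signal import detrend

def process_chunk(raw, denoise, detect, featurize, classify):
    """Run one chunk of raw EOG samples through the cascade.

    Returns "DOUBLE_BLINK" when a DB is recognized, otherwise None.
    """
    x = denoise(detrend(np.asarray(raw, dtype=float)))  # de-trend + wavelet de-noise
    segment = detect(x)        # 1-s segment around a candidate activity, or None
    if segment is None:
        return None            # no activity: discard immediately
    label = classify(featurize(segment))
    return "DOUBLE_BLINK" if label == 1 else None  # non-DB activity is also discarded
```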
B. EOG Signals
EOG signals are the electrical activity generated by the
movements of the eyeballs or the eyelid muscles. As shown
in our previous study, various kinds of EOG signals, such as
looking in different directions and blinking in
different ways, can be used to indicate the user's intentions
[5]. However, because the system proposed in this paper is
primarily focused on practical aspects and is aimed at
providing an alternative communication pathway for
disabled people, only both-eye double blinks (DB) are
used as control signals in this paper.
Fig. 2. The raw double blink EOGs collected from 6
subjects. Signals are aligned by peaks.
C. Filtering and Real-time Detection
Although the EOGs look highly consistent, filtering is
necessary for noise removal and enhancement of the
reliability of the detector and the extracted features. Wavelet
filtering is used instead of traditional bandpass filtering to
remove noise from raw recordings because of its better
de-noising ability and smaller phase distortion.
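The paper does not specify the mother wavelet or the thresholding rule; the sketch below shows one common choice (soft thresholding of the detail coefficients with the universal threshold) using the PyWavelets package, as an assumption rather than the authors' exact filter.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_denoise(x, wavelet="db4", level=5):
    """Wavelet de-noising by soft thresholding of the detail coefficients.

    The mother wavelet, decomposition level and the universal threshold
    (estimated from the finest-scale coefficients) are common defaults,
    not values specified in the paper.
    """
    x = np.asarray(x, dtype=float)
    coeffs = pywt.wavedec(x, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # robust noise estimate
    thr = sigma * np.sqrt(2.0 * np.log(len(x)))           # universal threshold
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[:len(x)]
```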
Activity detection is used in the system to isolate DB
signals from continuous recordings. Because there is a great
difference in terms of magnitude between the DB signals
and the background recording, a magnitude-based detection
method is used. When the signal magnitude exceeds a
certain threshold, a 1-second segment of that instant will be
extracted out and stored into a buffer. To handle the
non-stationarity of DB signals, the threshold value (denoted here
by \theta) is dynamically updated by adjusting the previous threshold
value with the current estimated magnitude of the background
recording. That is, the new threshold is calculated as

\theta_{\text{new}} = \theta_{\text{previous}} \times \frac{\text{new noise magnitude}}{\text{previous noise magnitude}}.    (1)

Experimental data show that the dynamic threshold for
the DB signal is 0.0498 mV ± 0.0072 mV (mean ± SD).
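A minimal sketch of the magnitude-based detector with the dynamic threshold update of (1) is given below. The centering of the extracted 1-second segment and the mean-absolute-value noise estimate are assumptions; the paper does not give these details.

```python
import numpy as np

FS = 512          # MindWave sampling rate (Hz)
SEG_LEN = FS      # 1-second buffer

def detect_activity(chunk, threshold, prev_noise_mag):
    """Magnitude-based activity detection with the dynamic threshold of Eq. (1).

    Returns (segment_or_None, updated_threshold, updated_noise_magnitude).
    prev_noise_mag must be initialized to a positive value.
    """
    chunk = np.asarray(chunk, dtype=float)
    crossings = np.flatnonzero(np.abs(chunk) > threshold)
    if crossings.size:
        start = max(crossings[0] - SEG_LEN // 2, 0)   # center the 1-s segment (assumption)
        return chunk[start:start + SEG_LEN], threshold, prev_noise_mag
    # No activity in this chunk: re-estimate the background magnitude and
    # rescale the threshold proportionally, as in Eq. (1).
    noise_mag = np.mean(np.abs(chunk))
    new_threshold = threshold * noise_mag / prev_noise_mag
    return None, new_threshold, noise_mag
```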
D. Feature Extraction and Classification
From the 1-s data segments we extract features that can
differentiate the DB signals of interest from signals generated by
other eye movements, such as single blinks. We extracted
numerous features (either in time domain or in frequency
domain) from data segments and performed feature
selection by comparing the inter-cluster distances of each
feature. Finally, the following three features are selected for
classifying DB activity and non-DB activity: L1-norm,
kurtosis and entropy. The L1-norm measures the magnitude of
the signal by summing the absolute values of all the
samples in the signal vector. Kurtosis measures the
peakedness of the signal, and entropy measures the amount
of information in the signal.
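The sketch below computes the three selected features for one segment. The paper does not state which entropy estimator was used, so a histogram-based Shannon entropy is assumed here.

```python
import numpy as np
from scipy.stats import kurtosis

def extract_features(segment, n_bins=32):
    """Return [L1-norm, kurtosis, entropy] of a 1-s EOG segment."""
    seg = np.asarray(segment, dtype=float)
    l1 = np.sum(np.abs(seg))                      # L1-norm: overall signal magnitude
    kurt = kurtosis(seg)                          # "peakedness" of the amplitude distribution
    hist, _ = np.histogram(seg, bins=n_bins)      # histogram-based Shannon entropy (assumption)
    p = hist / hist.sum()
    p = p[p > 0]
    entropy = -np.sum(p * np.log2(p))
    return np.array([l1, kurt, entropy])
```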
After feature extraction, the three features are fed into the
classifier, which is a kernelized support vector
machine (SVM). In training the SVM, both regularization
and cross-validation are performed. The classification
output will be translated to digital commands for
controlling the mouse cursor.
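One possible training routine for the kernelized SVM with regularization and cross-validation is sketched below using scikit-learn. The RBF kernel and the hyperparameter grid are assumptions; the paper only states that regularization and cross-validation were performed.

```python
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def train_db_classifier(X, y):
    """Kernelized SVM for DB vs. non-DB classification.

    Regularization (C) and kernel width (gamma) are selected by 5-fold
    cross-validation; the RBF kernel and the grid values are illustrative.
    X has shape (n_segments, 3) with [L1-norm, kurtosis, entropy];
    y is 1 for DB and 0 for non-DB.
    """
    pipe = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    grid = {"svc__C": [0.1, 1, 10, 100], "svc__gamma": ["scale", 0.01, 0.1, 1]}
    search = GridSearchCV(pipe, grid, cv=5)
    search.fit(X, y)
    return search.best_estimator_
```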
E. Cursor Control
DB signals are used as a switch in the system to control
the following two actions: (1) the switching of the moving
direction of the mouse cursor and (2) left-clicking. Suppose the
mouse cursor has coordinates (X, Y) on the screen, with the
initial condition X = Y = 1 (the top-left corner of the
screen). When the system starts running, the mouse cursor
will start moving horizontally (i.e., Y is kept constant while X
keeps increasing with a step size of Δx). When a DB activity
is detected, the cursor will stop at the current location and
start moving vertically (i.e., X is kept constant and Y keeps
increasing with a step size of Δy). When the second DB
activity is detected, the mouse will left-click at that location
(X, Y), and then its coordinates will be reset to X = Y = 1. The
whole process repeats so that complex functions, such as
text input, can be realized. The cursor control paradigm is
summarized in Table I.
TABLE I. FLOWCHART OF CURSOR CONTROL USING DOUBLE BLINKS
The default values of XMax and YMax are the screen dimensions,
while Δx and Δy are the step sizes along the horizontal and
vertical directions, respectively. These values affect the
cursor moving speed and thus can be tuned to meet
different people's needs. In this paper the screen resolution
is 1920×1080.
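The switching control scheme of Table I can be summarized by the following state-machine sketch. The screen size and the step sizes dx, dy are example values only, and the actual mouse move/click calls are left to the caller; this is an illustration of the logic, not the authors' code.

```python
class CursorController:
    """Switching control scheme of Section II.E.

    The cursor scans horizontally; a first double blink switches it to
    vertical scanning; a second double blink clicks at the current position
    and resets the cursor to the top-left corner. Screen size and the
    step sizes dx, dy below are example values only.
    """

    def __init__(self, x_max=1920, y_max=1080, dx=5, dy=5):
        self.x_max, self.y_max, self.dx, self.dy = x_max, y_max, dx, dy
        self.reset()

    def reset(self):
        self.x, self.y = 1, 1                 # top-left corner
        self.moving_horizontally = True

    def step(self, double_blink):
        """Advance one tick; returns ('click', x, y) or ('move', x, y)."""
        if double_blink:
            if self.moving_horizontally:
                self.moving_horizontally = False        # first DB: switch direction
            else:
                x, y = self.x, self.y                    # second DB: click and reset
                self.reset()
                return ("click", x, y)
        if self.moving_horizontally:
            self.x = min(self.x + self.dx, self.x_max)
        else:
            self.y = min(self.y + self.dy, self.y_max)
        return ("move", self.x, self.y)
```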
F. Experiment Setup
Eight subjects (aged 18-40; all males) with proper eye
conditions participated in the experiment. Data were
recorded in two scenarios: indoor (in a quiet laboratory)
and outdoor (in a canteen with different sources of noise to
simulate real-life situations). We collected indoor data from
6 subjects and outdoor data from 2 subjects. Each subject was
seated in a chair, and the headset sensor was attached to the
forehead (sampling frequency: 512 Hz). A computer
was placed about 50-100 cm in front of the subject. The
experiment consisted of two sessions: a calibration session
and a testing session. In the calibration session, we collected
a few (fewer than 5) DB signals from each subject; these
signals were used to check whether the sensor had detached
from the subject's forehead and to calibrate the threshold
as in (1). No training was performed after the calibration. The
system was then tested using the virtual on-screen keyboard. All
subjects gave their written informed consent, and the local
ethics committee approved the experimental procedures.
G. Performance evaluation
To evaluate the system performance, the accuracy, the
information transfer rate and the processing time are
used. The accuracy is the ratio of correctly classified trials
to the total number of signal trials. The information transfer
rate (ITR) measures the number of bits transferred per
minute. It is calculated as
\text{ITR} = \frac{60}{T}\left[\log_2 N + P\log_2 P + (1-P)\log_2\frac{1-P}{N-1}\right],    (2)
where T is the time interval between two consecutive
commands, N is the number of commands, and P is the
classification accuracy. In the proposed system, N = 2 and
T = 2.0179 s ± 0.59 s. Note that the ITR is a function of the time
interval between commands as well as the accuracy, so a higher
accuracy does not always imply a higher ITR.
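Eq. (2) can be transcribed directly as a small helper, as sketched below. The values in the usage comment are purely illustrative and are not taken from Table II.

```python
import math

def itr_bits_per_min(p, n, t):
    """Information transfer rate of Eq. (2).

    p: classification accuracy (0 < p < 1), n: number of commands (>= 2),
    t: seconds per command.
    """
    if n < 2 or not 0.0 < p < 1.0:
        raise ValueError("need n >= 2 and 0 < p < 1")
    bits_per_trial = (math.log2(n) + p * math.log2(p)
                      + (1 - p) * math.log2((1 - p) / (n - 1)))
    return 60.0 / t * bits_per_trial

# Example with purely illustrative values (not from Table II):
# itr_bits_per_min(0.90, 2, 2.0) ≈ 15.9 bits/min
```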
Finally, the processing time required for the user to input a
short English phrase is also evaluated. To test cursor control
for text input, the Windows On-Screen Keyboard is
used, as shown in Fig. 3. When the cursor moves over the
corresponding button, the user performs a DB to "press" the
button. The English phrase "hello world", which contains 11
characters (space included), is used to simulate daily word
input on the computer.
Fig. 3. The Windows On-Screen keyboard and the input
text.
III. RESULTS
On average, the accuracy and ITR for indoor and
outdoor testing are (84.41%, 45.47 bits/min) and (71.50%,
41.39 bits/min) respectively. The following tables show the
results for each subject in indoor and outdoor environment.
It is important to note that the SVM classifier was trained on
Subject 1 and applied to all subjects. In other words, the
performance of Subjects 2-6 listed in the following tables
reflects cross-subject prediction, and these subjects did not
undergo any training phase before they used the EOG-HCI
system.
TABLE II-A. INDOOR SYSTEM PERFORMANCE OF THE EOG-HCI SYSTEM

Subject              | 1     | 2     | 3     | 4     | 5     | 6
Accuracy (%)         | 95.12 | 80.00 | 88.10 | 85.29 | 88.00 | 70.00
ITR (bit/min)        | 54.60 | 48.15 | 50.68 | 39.78 | 38.37 | 38.70
t_hello-world (min)  | 2.1   | 3.3   | 3.1   | 3.5   | 4.0   | 3.8
TABLE II-B. OUTDOOR SYSTEM PERFORMANCE OF THE EOG-HCI SYSTEM

Subject              | 7     | 8
Accuracy (%)         | 69.00 | 74.00
ITR (bit/min)        | 40.00 | 42.77
t_hello-world (min)  | 5.5   | 5.2
The times spent (t_hello-world) typing "hello world" differ
considerably across subjects, for the following reason. The
computational complexity of the proposed system is moderately
high: because of the advanced signal processing algorithms used
in this system, there is a small time delay (0.1~0.5 seconds).
This delay makes it possible for users to click at the
wrong location on the screen. Therefore, the time spent
typing "hello world" also includes the time spent
pressing "backspace" to delete wrongly inputted
characters. To address this lag, the following
improvements can be made: (1) smaller values of Δx and Δy
should be selected, (2) a larger cursor (set via the
operating system settings) should be used, and (3) the computer
screen should be larger so that the buttons of the
virtual on-screen keyboard are larger and easier
to press. It is also suggested that the movable range of the
cursor position be constrained; for example, the cursor
position could be initialized to the top-left corner of the
virtual on-screen keyboard instead of
position (0, 0).
In addition, we conducted a survey on the user-friendliness of
the system. All subjects reported no discomfort or pain during
the experiment, and all agreed that the system is very easy
to use.
The following table compares the proposed system with
other established systems in the literature. It can be seen
that the new system has an acceptably high accuracy, while
its usability is higher than that of existing counterparts
(mainly because of the NeuroSky MindWave headset).
TABLE III. PERFORMANCES OF DIFFERENT SYSTEMS

System        | [1] | [2]    | [3]   | [4] | This paper
Accuracy (%)  | 95  | 82-100 | 78-97 | 86  | 72-84
#channels     | 3   | 16     | 5     | 8   | 1
IV. CONCLUSION
This paper presented a computer-access solution that allows
people who can only move their eyes to control a
computer by translating eye blinks into a sequence of mouse
cursor control commands. With only a single channel, the
system has a higher degree of usability while still achieving
an acceptable accuracy rate. The system can be used not only
in the laboratory but also in outdoor environments. It
requires only the double-blink action, which is natural for
those with proper eye function and causes no discomfort. Thus,
both ergonomic factors and system
performance factors are maximized. The system has the
potential to find many applications in daily computer usage,
such as cursor control, text processing and web browsing. In
this paper, only DB signals are utilized in the system. In the
future, the proposed system can be extended into a
more powerful system by utilizing other non-DB activities,
to achieve a higher information transfer rate.
REFERENCES
[1]. E. English, A. Hung, E. Kesten, D. Latulipe, and Z. P. Jin,
“EyePhone: A mobile EOG-based human-computer
interface for assistive healthcare,” in Proc. IEEE Conf.
EMBS NER, 2013.
[2]. S. L. Wu, L. D. Liao, S. W. Lu, W. L. Jiang, S. A. Chen, and
C. T. Lin, “Controlling a human-computer interface system
with a novel classification method that uses
electrooculography signals,” IEEE Trans. Biomed. Eng., vol.
60, no. 8, pp. 2133-2141, Aug. 2013.
[3]. Y. Nam, B. Koo, A. Cichocki, and S. Choi, “GOM-Face:
GKP, EOG, and EMG-based multimodal interface with
application to humanoid robot control,” IEEE Trans. Biomed.
Eng., vol. 61, no. 2, pp. 453-462, Feb. 2014.
[4]. T. Yagi, Y. Kuno, K. Koga, and T. Mukai, “Drifting and
blinking compensation in electro-oculography (EOG)
eye-gaze interface,” in Proc. IEEE Conf. SMC, 2006.
[5]. J. F. Wu, A. M. S. Ang, K. M. Tsui, H. C. Wu, Y. S. Hung, Y.
Hu, J. N. F. Mak, S. C. Chan, and Z. G. Zhang, “Efficient
implementation and design of a new single-channel
electrooculography-based human-machine interface
system," IEEE Trans. Circuits and Systems II, in press.
[6]. A. D. N. Edwards, Ed., Extraordinary Human-Computer
Interaction: Interfaces for Users with Disabilities, vol. 7.
CUP Archive, 1995.
[7]. L. N. S. Andreasen Struijk, "An inductive tongue computer
interface for control of computers and assistive devices,"
IEEE Trans. Biomed. Eng., vol. 53, no. 12, pp.
2594-2597, 2006.
[8]. J. R. Wolpaw, D. J. McFarland, G. W. Neat, and C. A.
Forneris, "An EEG-based brain-computer interface for
cursor control," Electroencephalogr. Clin.
Neurophysiol., vol. 78, no. 3, pp. 252-259, 1991.