Powered wheelchair control with a
multimodal interface using
eye-tracking and soft-switch
1. AIM
•To develop a multimodal interface that combines an eye-tracker and a soft-switch, enabling people with mobility impairments to control a wheelchair in a convenient way.
2. INTRODUCTION
•Human-computer interaction (HCI) research aims to provide novel means of communication between computers and people. Disabilities vary widely across people who have difficulty moving a wheelchair.
•To increase the accessibility of wheelchair control, an HCI system should include several modalities, allowing different types of users to control a wheelchair in a convenient way.
•This work proposes a multimodal interface for people with mobility impairments, such as those who cannot move their arms and legs, to control a powered wheelchair.
•The multimodal interface combines an eye-tracker and a soft-switch through which the wheelchair can be controlled.
•The interface is used to select the desired command among nine possible commands (eight directions and stop).
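As an illustration of how the nine-command set described above might drive a differential-drive wheelchair, the sketch below maps each command to a pair of wheel velocities. The command names and the speed value are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical mapping of the nine interface commands (eight directions
# plus stop) to (left, right) wheel velocities of a differential-drive
# chair. Names and speeds are assumptions for illustration only.

SPEED = 0.5  # assumed cruising speed in m/s

COMMAND_VELOCITIES = {
    "forward":        ( SPEED,        SPEED),
    "backward":       (-SPEED,       -SPEED),
    "left":           (-SPEED,        SPEED),  # turn in place
    "right":          ( SPEED,       -SPEED),
    "forward_left":   ( SPEED * 0.5,  SPEED),  # gentle arc to the left
    "forward_right":  ( SPEED,        SPEED * 0.5),
    "backward_left":  (-SPEED * 0.5, -SPEED),
    "backward_right": (-SPEED,       -SPEED * 0.5),
    "stop":           (0.0, 0.0),
}

def wheel_velocities(command: str) -> tuple[float, float]:
    """Return (left, right) wheel velocities; unknown commands stop the chair."""
    return COMMAND_VELOCITIES.get(command, (0.0, 0.0))
```

Treating any unrecognised command as "stop" is a deliberate fail-safe choice for a mobility device.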
3. EXPERIMENTAL PROTOCOL
•Six consenting healthy male subjects participated in the study.
•Three input conditions were compared to measure performance on a predefined trajectory:
 - A touch-pad (TP)
 - An eye-tracker (ET)
 - An eye-tracker combined with a soft-switch (ET_SS)
•For each subject, the lap completion time and the number of commands issued were recorded.
5. RESULTS
•Average lap completion time (in seconds) for each input modality. Error bars show the standard deviation.
•Average number of commands used to complete the lap for each input modality. Error bars show the standard deviation.
•The average lap completion time for the eye-tracker with soft-switch was lower than for the eye-tracker alone.
•To combine the two modalities simultaneously, users employ their gaze to point at the desired command and the soft-switch to select it [1], [2], [3].
6. CONCLUSION
•A powered wheelchair may be controlled hands-free through gaze, thanks to the
proposed multimodal interface, and an additional soft-switch may improve the
usability of the system.
•A significant improvement in performance was observed for the multimodal system using a total of nine possible commands (eight directions and stop).
•The combination of various modalities is largely dependent on the graphical user
interface.
•This multimodal system can also reduce the false positives of an eye-tracker-only selection paradigm by utilising soft-switch commands for confirmation.
REFERENCES
1. Y. K. Meena, H. Cecotti, K. Wong-Lin, and G. Prasad, “Towards increasing the number of commands in a hybrid brain-computer interface with combination of gaze and motor imagery,” in 37th Annual Int. Conf. of the IEEE Engineering in Medicine and Biology Society, 2015.
2. Y. K. Meena, H. Cecotti, K. Wong-Lin, and G. Prasad, “Simultaneous gaze and motor imagery hybrid BCI increases single-trial detection performance: a compatible-incompatible study,” in 9th IEEE-EMBS International Summer School on Biomedical Signal Processing, 2015.
3. D. O'Doherty, Y. K. Meena, H. Raza, H. Cecotti, and G. Prasad, “Exploring gaze-motor imagery hybrid brain-computer interface design,” in IEEE Int. Conf. on Bioinformatics and Biomedicine, 2014, pp. 335–339.
Yogesh Kumar Meena, Hubert Cecotti, KongFatt Wong-Lin, Girijesh Prasad
School of Computing and Intelligent Systems
Ulster University, Derry BT48 7JL, Northern Ireland, UK
4. MULTIMODAL INTERFACE
[Figure: Block diagram of the proposed system. Touch-pad, eye-tracker, and soft-switch inputs feed a PC GUI; an Arduino microcontroller with sensors drives the chair motion.]
[Figure: Experimental setup for the control interface.]
[Figure: Experimental setup for the test trajectory.]
[Figure: Graphical user interface (GUI).]
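As a rough sketch of how the PC GUI in the block diagram might pass the selected command to the Arduino microcontroller over a serial link, the following one-byte encoding could be used. The byte values and helper names are assumptions, not the authors' actual protocol.

```python
# Hypothetical one-byte serial protocol between the PC GUI and the
# Arduino: each of the nine commands is assigned a byte code. Values
# are illustrative assumptions.

COMMAND_CODES = {
    "stop": 0, "forward": 1, "forward_right": 2, "right": 3,
    "backward_right": 4, "backward": 5, "backward_left": 6,
    "left": 7, "forward_left": 8,
}
CODE_COMMANDS = {code: name for name, code in COMMAND_CODES.items()}

def encode(command: str) -> bytes:
    """Encode a command name as a single byte for the serial link."""
    return bytes([COMMAND_CODES[command]])

def decode(payload: bytes) -> str:
    """Decode a received byte; unknown bytes fall back to 'stop' for safety."""
    return CODE_COMMANDS.get(payload[0], "stop")
```

On the microcontroller side, the matching firmware would read one byte per loop iteration and set the motor outputs accordingly, defaulting to stop on any unrecognised value.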
[Figure: Bar chart of average lap completion time (in seconds) for the three input modalities: touchpad, eye-tracker, and eye-tracker + soft-switch.]
[Figure: GUI with nine command buttons (Forward, Forward left, Forward right, Left, Stop, Right, Backward, Backward left, Backward right) and control buttons (Connect, Disconnect, Start TP, Start ET, Start ET_SS).]
ACKNOWLEDGMENT
•Y.K.M. is supported by the Government of India (Education- 11016152013). H.C., K.W.-L., and G.P. are supported by the Northern
Ireland Functional Brain Mapping Facility (1303/101154803), funded by InvestNI, University of Ulster, Magee Campus, Northland
Road, BT48 7JL, Northern Ireland, United Kingdom.
[Figure: Bar chart of the average number of commands used to complete the lap for the three input modalities: touchpad, eye-tracker, and eye-tracker + soft-switch.]