Learning to use electronic travel aids for the visually impaired in
virtual reality
Fabiana Sofia Riccia, Alain Boldinib, John-Ross Rizzoa,b,c,d, and Maurizio Porfiria,b,e
aDepartment of Biomedical Engineering, New York University Tandon School of Engineering,
Six MetroTech Center, Brooklyn, NY 11201, USA
bDepartment of Mechanical and Aerospace Engineering, New York University Tandon School
of Engineering, Six MetroTech Center, Brooklyn, NY 11201, USA
cDepartment of Rehabilitation Medicine, New York University Langone Health, 240 East 38th
Street, NY 10016, USA
dDepartment of Neurology, New York University Langone Health, 240 East 38th Street, NY
10016, USA
eCenter for Urban Science and Progress, New York University Tandon School of Engineering,
370 Jay Street, Brooklyn, NY 11201, USA
ABSTRACT
Vision loss ranks among the top causes of disability in the United States. Visual impairment (VI) remains an
urgent public health priority, causing loss of independence in almost every aspect of daily life. Mobility of the
visually impaired requires skill, effort, and training. In this vein, orientation and mobility (O&M) training is
currently offered to prepare the visually impaired for independent travel in any familiar or unfamiliar environment,
as well as to use primary and secondary mobility aids, such as white canes and electronic travel aids (ETAs),
respectively. Even though O&M training is an established practice, it is not free of risks, as trainees may be
exposed to potential harm. To address this issue, we developed a virtual reality (VR) environment for training
with an ETA previously developed by our team, which we presented at this meeting last year. The ETA
comprises a belt, serving as a haptic feedback wearable device that senses the surroundings through computer
vision and sends vibro-tactile feedback to the user’s abdomen about the position of nearby obstacles. The belt is
interfaced with VR, enabling trainees to immerse in a computer-simulated, safe, three-dimensional environment.
Our integrated system brings the visually impaired into virtual scenarios where they can perform specific tasks,
either simply to measure their performance or to train certain skills for rehabilitation purposes. The ultimate goal
of our effort is to establish a training and evaluation framework that could mitigate the immobility consequences
of visual impairment, improve quality of life, enhance social participation, and reduce societal costs, while creating
an engaging and appealing experience for the visually impaired.
Keywords: Assistive technology, electronic travel aid, mobility training, virtual reality, visual impairment
1. INTRODUCTION
“Visual impairment” (VI) is a condition of reduced visual acuity and peripheral field deficit that cannot be
remedied by corrective lenses or surgery.1 Globally, an estimated 43.3 million people are blind, 295 million
people have moderate and severe vision impairment, and 258 million have mild vision impairment.2 By 2050,
such estimates are predicted to increase due to the rising life expectancy across our global population, along
with poor access to health care in low- and middle-income countries.2
VI is a considerable public health concern that negatively affects quality of life and restricts equitable access
to and achievement in education and the workplace.3,4 VI is not only a sensory disability, but also a significant
deterrent to mobility; difficulties with mobility and access to public spaces are the most significant impediments
Further author information: (Send correspondence to M.P.)
M.P.: E-mail: mporfiri@nyu.edu, Telephone: +1 646 997 3681
Nano-, Bio-, Info-Tech Sensors, and Wearable Systems 2022, edited by Jaehwan Kim,
Kyo D. Song, Ilkwon Oh, Maurizio Porfiri, Proc. of SPIE Vol. 12045, 1204504
© 2022 SPIE · 0277-786X · doi: 10.1117/12.2612097
Proc. of SPIE Vol. 12045 1204504-1
among people with VI.5 Previous studies have examined the mobility consequences of VI, reporting that
people with VI had slower walking speeds, more falls, and more mobility difficulties than individuals without
VI.6–9 In addition, many aspects of daily life are affected by struggles with independent travel, including social
inclusion, employment prospects, and overall quality of life.9,10 Social and physical exclusion, in turn, reinforce
experiences of disablement in people with VI. Therefore, it is critical to examine and address the issues that
inhibit mobility of people with VI.
Today, technological advancements are reshaping the everyday mobility of people with VI and their inter-
actions with different types of environments. In particular, electronic travel aids (ETAs) are technologically
advanced mobility solutions developed to overcome some of the limitations of traditional orientation and mobility
(O&M) aids, such as white canes.11 While traditional aids are an excellent solution for near-ground obstacles,
hazards below ground level or above knee level often remain undetected.12 On the contrary, ETAs
are equipped with measurement systems that can provide basic information about the presence of objects or
provide complex spatial information, such as the shape of individual objects and spatial relationships between
them.13 ETAs use different feedback approaches such as vibration or auditory output, thus enabling people with
VI to travel safely and securely.14 Although further research is needed to identify optimal solutions for a wide
range of users, through high-performing ETAs, people with VI can be more independent, experience fewer
accidents, and improve their mobility, consequently enhancing their quality of life.
The adoption of current ETAs faces major barriers, such as the need for sizeable testing campaigns to objectively
evaluate their performance, high production costs, and the low availability of training programs with mobility specialists.15,16
Widespread use of better and more consistent measures of mobility performance would streamline and minimize
experimental sessions, which can prove frustrating, discouraging, and even dangerous for people with VI.17,18
In fact, people with VI could be exposed to major hazards, such as falls and injuries, especially when testing
initial prototypes that are yet to be optimized. The development and dissemination of standardized assessment
measures and methods could in turn improve training strategies and encourage the use of ETAs.18
Recently, augmented and virtual reality (AR/VR) applications have been used to support development,
testing, and training with ETAs as well as to supplement O&M training.19 By exploring virtual environments,
people with VI can practice their skills while receiving multisensory feedback similar to real-life navigation.
VR environments can be used to supplement traditional O&M training by allowing people with VI to build
cognitive maps of unfamiliar routes. Additionally, simulated VI through VR allows for modeling different and
even rare forms of VI, at different stages of severity, in healthy subjects. Virtual environments integrated within
O&M training programs with ETAs can be valuable to support independent mobility by enabling “walking” the
virtual route and familiarization with both the surroundings and the device.20 Also, the integration of virtual
simulation can be effective in learning routes and in reducing the frustration, fear, and risks of navigating
public paths.21
In this work, we propose a new evaluation platform for ETAs based on VR. As a representative ETA, we
consider a haptic feedback wearable device that was previously designed by our group.22 Our system leverages
ten piezo-based actuators, mounted on a belt, to provide the user with vibrotactile feedback when obstacles
are in close proximity. As the actuators are distributed on a 2×5 grid, our ETA allows a complex and yet intuitive
representation of the environment through a registered world grid (each frame is divided into an analogous 2×5
grid), whereby the presence of an obstacle in one direction is mapped to a vibration of the actuator in the same
direction, within an egocentric reference frame. We integrate this haptic feedback device with a VR platform,
built upon our previous efforts in Ref. 23. The platform allows users to perform an obstacle avoidance task
while experiencing a simulated VI. We study how our ETA, interfaced with the VR platform, enhances the
performance of a user with simulated VI in an obstacle avoidance task.
2. MATERIALS AND METHODS
2.1 Virtual reality environment
The VR platform was built using Unity, a cross-platform engine developed by Unity Technologies (San Francisco,
California, United States), for the VR headset Oculus Rift CV1 and Touch Controllers.
In VR, we simulated age-related macular degeneration (AMD), diabetic retinopathy (DR), and glaucoma
since they represent the fastest-growing and leading causes of VI among people of working age in recent times
(Fig. 1). The initial symptoms of AMD often consist of distorted vision and visual loss in the center of the
visual field, which progresses slowly, leading to a complete loss of central vision.24 Patients with AMD often have
increasing difficulty reading and recognizing objects and faces; spatial orientation, however, is preserved through
the intact functioning of peripheral vision.24 DR damages retinal blood vessels, resulting in macular edema.25 The
edema causes patients to experience blurred vision, floaters, distortions, and, in advanced stages, loss of visual acuity.25
Glaucoma is a “silent” blinding disease caused by an increase in intraocular pressure.26 Glaucoma patients with
moderate-to-advanced visual field loss report symptoms such as tunnel vision and blurriness, need more light,
and describe their blind spots as blurred, missing, or grey.26 AMD symptoms were simulated by means of a culling mask
provided with gray spots to obscure the center of the visual field, a Gaussian blur shader, and a distortion
shader. DR symptoms were recreated by means of a culling mask provided with dark spots dispersed throughout
the visual field and a Gaussian blur shader. The symptoms of glaucoma were obtained by means of a culling
mask that compromises the peripheral visual field and a Gaussian blur shader.
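As an illustration of how such symptom simulation can be composed, the sketch below applies a Gaussian blur and a central gray scotoma to a grayscale frame, mimicking the AMD condition. It is written in Python with NumPy for readability rather than as a Unity shader, and the function names, kernel size, and scotoma radius are our own illustrative choices, not the parameters of the actual platform.

```python
import numpy as np

def gaussian_kernel(size=9, sigma=2.0):
    """Normalized 1-D Gaussian kernel for separable blurring."""
    x = np.arange(size) - size // 2
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def blur(img, sigma=2.0):
    """Separable Gaussian blur applied along both image axes."""
    k = gaussian_kernel(sigma=sigma)
    out = np.apply_along_axis(lambda row: np.convolve(row, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda col: np.convolve(col, k, mode="same"), 0, out)

def simulate_amd(img, scotoma_radius=0.3, gray=0.5):
    """Blur the frame, then obscure the center of the visual field
    (central scotoma) with a uniform gray spot."""
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    # Normalized radial distance from the center of the frame.
    r = np.hypot((yy - h / 2) / (h / 2), (xx - w / 2) / (w / 2))
    out = blur(img.astype(float))
    out[r < scotoma_radius] = gray  # gray spot over central vision
    return out
```

The DR and glaucoma masks would differ only in where the opaque regions are placed: scattered dark spots for DR, and a mask compromising the periphery (r above a threshold) for glaucoma.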
To detect obstacles in the virtual environment, we utilized the Raycast function, which projects different
groups of invisible rays from the user into the scene. Specifically, two rays are used to detect obstacles
to the right and left of the player, while four groups of eight rays are used to detect obstacles in front of
the player. All the rays were projected so as to divide the visual field into a grid analogous to the one
formed by the actuators on the ETA (2×5). The actuator corresponding to the right and left rays, and to each group
of the frontal rays, vibrates when at least one of its rays hits an obstacle in the virtual environment. The
range of action of the ETA is determined by the length of the rays, which can be easily increased or decreased
through the Raycast function in VR.
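The grouping logic above can be sketched as follows. This is an illustration in Python; the actual system uses Unity's Raycast in C#, and the actuator indexing chosen here is an assumption for the example.

```python
# Hypothetical sketch of the ray-to-actuator mapping described above:
# two lateral rays and four groups of eight frontal rays, each driving
# one actuator on the belt.

def actuator_activations(side_hits, frontal_hits):
    """
    side_hits: (left_hit, right_hit) booleans for the two lateral rays.
    frontal_hits: list of 4 groups, each a list of 8 booleans
                  (True if that ray hit an obstacle).
    Returns the set of active actuator indices, with the assumed layout:
      0 = left, 1..4 = frontal groups (left to right), 5 = right.
    """
    active = set()
    if side_hits[0]:
        active.add(0)
    if side_hits[1]:
        active.add(5)
    for i, group in enumerate(frontal_hits):
        # An actuator vibrates when at least one ray in its group hits.
        if any(group):
            active.add(1 + i)
    return active
```

For example, a hit on the left lateral ray plus one hit in the second frontal group activates actuators 0 and 2, preserving the egocentric direction of each obstacle.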
Figure 1. Examples of the simulations of the three visual impairments in the VR platform: (a) severe age-related macular
degeneration, (b) severe diabetic retinopathy, and (c) severe glaucoma.
2.2 ETA description
As an ETA for assisting people with VI during navigation and obstacle avoidance, we considered our previously
developed system, which consists of a haptic feedback wearable device interfaced with a computer vision platform.27,28
This platform comprises an algorithm that allows for obstacle detection by means of a stereo camera.
The transmission of environmental information collected by the computer vision platform to the user is made
possible by the haptic feedback wearable device. Specifically, the wearable device is composed of a belt with ten
integrated piezo-based discrete actuators, which can provide vibrotactile stimulation on users’ abdomen. The
actuators notify the user about the proximity of an obstacle and its position defined with respect to an egocentric
reference frame.27,28
2.3 ETA-VR interface
In our experiment, the haptic feedback device was directly interfaced with the VR environment through an
Arduino Mega 2560, which controlled custom-printed astable multivibrator circuits using op-amps to drive the
high-voltage amplifiers.22 These amplifiers, in turn, controlled each of the ten actuators on the belt independently.
A potentiometer was used to regulate the frequency of vibration in the multivibrator circuit, based on the distance
of the object from the user in VR.
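As an illustrative sketch of this distance-to-frequency mapping (in Python; the frequency bounds and range of action below are assumed values for the example, not the actual circuit parameters):

```python
# Assumed parameters for illustration: vibration frequency bounds (Hz)
# and the ETA's range of action (m), set by the ray length in VR.
F_MIN, F_MAX = 2.0, 25.0
D_MIN, D_MAX = 0.3, 3.0

def vibration_frequency(distance_m):
    """Map obstacle distance in VR to a vibration frequency: the closer
    the obstacle, the faster the vibration (linear ramp)."""
    if distance_m >= D_MAX:
        return 0.0  # obstacle out of range: actuator stays silent
    d = max(distance_m, D_MIN)
    # Linear interpolation: F_MAX at D_MIN, F_MIN at D_MAX.
    t = (D_MAX - d) / (D_MAX - D_MIN)
    return F_MIN + t * (F_MAX - F_MIN)
```

In hardware, this role is played by the potentiometer setting the multivibrator's oscillation rate; the linear ramp is simply one plausible choice of mapping.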
3. EXPERIMENT
We devised an experiment to evaluate whether our ETA can relay information about the
surrounding environment to the user. More specifically, we examined the ability of our ETA to improve the
mobility performance of healthy subjects performing an obstacle avoidance task in a virtual environment while
experiencing a simulated VI. In this preliminary study, we tested our ETA with only one person (the first author).
The experiment comprises two conditions, each with three trials. The fixed factor distinguishing the two
conditions is whether the participant performed the task with or without the aid of the ETA. Each trial consisted
of performing the obstacle avoidance task while experiencing one of the three simulated VIs.
During the obstacle avoidance task, the participant was asked to cross a rectangular park from one side to the
other in the least time possible, avoiding collisions with obstacles in the path, such as benches,
fences, and trash bins (Fig. 2). While performing the task, the participant utilized the Oculus Touch controllers
to navigate the environment. In the condition with the ETA, the belt provided vibrotactile feedback on the
abdomen to help the user avoid obstacles before hitting them, thus anticipating potential collisions. In both
conditions, the right controller vibrated only after a collision had occurred, to mimic sensory feedback from
touch and assist the participant in correcting their path.
Figure 2. Example of the obstacle avoidance task implemented in the virtual environment.
For each trial, we acquired two metrics through the VR system: the time to complete the obstacle avoidance
task and the number of collisions with obstacles. By analyzing these two metrics, we evaluated the feasibility of
the ETA to convey environmental information to the participant, thus improving their mobility performance
for each of the tested simulated VIs.
4. RESULTS
The results of the preliminary study are presented in Tab. 1, reporting the time to complete the task and the
number of collisions recorded in each trial, for each of the two conditions tested.
Condition                  Time to completion (s)  Number of collisions
AMD severe w/ belt         91                      4
AMD severe w/o belt        187                     43
DR severe w/ belt          120                     1
DR severe w/o belt         165                     40
Glaucoma severe w/ belt    104                     3
Glaucoma severe w/o belt   122                     27
Table 1. Results of the preliminary experiments for each simulated VI under the two conditions tested.
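The relative improvements afforded by the belt follow directly from these values; a short Python sketch computing them from Tab. 1 (the data structure and function are ours, the numbers are those reported above):

```python
# Percent reduction in completion time and collisions when wearing the
# belt, computed from the values reported in Tab. 1.
RESULTS = {                        # condition: (time in s, collisions)
    "AMD":      {"with": (91, 4),  "without": (187, 43)},
    "DR":       {"with": (120, 1), "without": (165, 40)},
    "Glaucoma": {"with": (104, 3), "without": (122, 27)},
}

def improvement(vi):
    """Return (time reduction %, collision reduction %) for one VI."""
    (t_w, c_w) = RESULTS[vi]["with"]
    (t_wo, c_wo) = RESULTS[vi]["without"]
    return (100 * (t_wo - t_w) / t_wo, 100 * (c_wo - c_w) / c_wo)

for vi in RESULTS:
    dt, dc = improvement(vi)
    print(f"{vi}: time -{dt:.0f}%, collisions -{dc:.0f}%")
```

The belt reduced completion time by roughly 15–51% and collisions by roughly 89–98% across the three simulated VIs.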
The preliminary results show that our ETA may have a very positive impact on both metrics considered.
The experimental subject had no prior experience with the wearable device. For each of the three simulated
VIs, the task completion time decreases when she wears the ETA. This outcome confirms the ability
of our ETA to effectively relay environmental information, which explains the experimental subject's ability to
respond steadily to the haptic stimuli without needing to decrease her navigation speed.
Results on collisions reveal that performing the task with the aid of the ETA leads to fewer collisions.
Thus, our ETA can be considered a feasible platform to deliver feedback that may aid in the detection and
avoidance of obstacles. The considerable difference in the number of collisions can be attributed to the ETA's
reliability and promptness in conveying information about the immediate surroundings to the user.
5. CONCLUSIONS
In this paper, we highlighted the potential of virtual reality (VR) in simulating visual impairment and testing
electronic travel aids (ETAs) for the visually impaired. As an initial investigation, we implemented the three
most widespread forms of visual impairment (that is, age-related macular degeneration, diabetic retinopathy, and
glaucoma) in a VR platform and employed them to evaluate the effectiveness of an ETA in conveying environmental
information to an end user. The interface between the VR platform and the ETA was used to perform
tests in a highly controllable environment. The long-term goal of the project is to establish new paradigms for
training and performance assessments of ETAs based on AR/VR technologies.
The preliminary analyses presented in this paper require a follow-up study that addresses some of their critical
weaknesses, including the limited number of participants and the lack of a statistically grounded approach.
Finally, we foresee further developments and improvements of the system, such that it could be employed for
education within outreach activities with the general public and for training visually impaired subjects in the
use and best practices of ETAs.
ACKNOWLEDGMENTS
This research was supported by the National Science Foundation under Grant Nos. ECCS-1928614, CNS-
1952180, and CBET-2037878, by the National Eye Institute and Fogarty International Center under Grant No.
R21EY033689, as well as by the U.S. Department of Defense under Grant No. VR200130. New York University
(NYU) and John-Ross Rizzo (JRR) have financial interests in related intellectual property. NYU owns a patent
licensed to Tactile Navigation Tools. NYU and JRR are equity holders and advisors of said company.
REFERENCES
[1] Naipal, S. and Rampersad, N., “A review of visual impairment,” African Vision and Eye Health 77 (01
2018).
[2] Bourne, R., Steinmetz, J. D., Flaxman, S., Briant, P. S., Taylor, H. R., Resnikoff, S., Casson, R. J., Abdoli,
A., Abu-Gharbieh, E., Afshin, A., and et al., “Trends in prevalence of blindness and distance and near
vision impairment over 30 years: an analysis for the global burden of disease study,” The Lancet Global
Health 9(2), e130–e143 (2021).
[3] Park, S. J., Ahn, S., and Park, K. H., “Burden of Visual Impairment and Chronic Diseases,” JAMA
Ophthalmology 134, 778–784 (07 2016).
[4] Haegele, J. A. and Zhu, X., “Physical activity, self-efficacy and health-related quality of life among adults
with visual impairments,” Disability and Rehabilitation 43(4), 530–536 (2021). PMID: 31230474.
[5] Wong, S., “Traveling with blindness: A qualitative space-time approach to understanding visual impairment
and urban mobility,” Health & Place 49, 85–92 (2018).
[6] Broman, A. T., West, S. K., Munoz, B., Bandeen-Roche, K., Rubin, G. S., and Turano, K. A., “Divided
visual attention as a predictor of bumping while walking: the salisbury eye evaluation,” Investigative oph-
thalmology & visual science 45(9), 2955–2960 (2004).
[7] Patel, I., Turano, K. A., Broman, A. T., Bandeen-Roche, K., Munoz, B., and West, S. K., “Measures
of visual function and percentage of preferred walking speed in older adults: the salisbury eye evaluation
project,” Investigative ophthalmology & visual science 47(1), 65–71 (2006).
[8] Freeman, E. E., Munoz, B., Rubin, G., and West, S. K., “Visual field loss increases the risk of falls in
older adults: the salisbury eye evaluation,” Investigative ophthalmology & visual science 48(10), 4445–4450
(2007).
[9] Miyata, K., Yoshikawa, T., Harano, A., Ueda, T., and Ogata, N., “Effects of visual impairment on mobility
functions in elderly: Results of fujiwara-kyo eye study,” Plos one 16(1), e0244997 (2021).
[10] Lubin, A. and Deka, D., “Role of public transportation as job access mode: Lessons from survey of people
with disabilities in new jersey,” Transportation research record 2277(1), 90–97 (2012).
[11] Farcy, R., Leroux, R., Jucha, A., Damaschini, R., Grégoire, C., and Zogaghi, A., “Electronic travel aids and
electronic orientation aids for blind people: technical, rehabilitation and everyday life points of view,” in
[Proceedings of the Conference and Workshop on Assistive Technology for Vision and Hearing Impairment],
Citeseer (2006).
[12] Romlay, M. R. M., Toha, S. F., Ibrahim, A. M., and Venkat, I., “Methodologies and evaluation of electronic
travel aids for the visually impaired people: a review,” Bulletin of Electrical Engineering and Informat-
ics 10(3), 1747–1758 (2021).
[13] Ball, E. M., [Electronic Travel Aids: An Assessment], 289–321, Springer London, London (2008).
[14] Röijezon, U., Prellwitz, M., Ahlmark, D. I., van Deventer, J., Nikolakopoulos, G., and Hyyppä, K., “A
haptic navigation aid for individuals with visual impairments: Indoor and outdoor feasibility evaluations of
the lasernavigator,” Journal of Visual Impairment & Blindness 113(2), 194–201 (2019).
[15] Petrie, H., Johnson, V., Strothotte, T., Raab, A., Fritz, S., and Michel, R., “Mobic: Designing a travel aid
for blind and elderly people,” The Journal of Navigation 49(1), 45–52 (1996).
[16] Gitlin, L., Mount, J., Lucas, W., Weirich, L., and Gramberg, L., “The physical costs and psychosocial
benefits of travel aids for persons who are visually impaired or blind,” Journal of Visual Impairment &
Blindness 91(4), 347–359 (1997).
[17] Mount, J., Gitlin, L. N., and Howard, P. D., “Musculoskeletal consequences of travel aid use among visually
impaired adults: directions for future research and training,” Technology and Disability 6(3), 159–167 (1997).
[18] Council, N. R. et al., “Summary and highlights,” in [Electronic Travel Aids: New Directions for Research],
National Academies Press (US) (1986).
[19] Zhang, L., Wu, K., Yang, B., Tang, H., and Zhu, Z., “Exploring virtual environments by visually impaired
using a mixed reality cane without visual feedback,” in [2020 IEEE International Symposium on Mixed and
Augmented Reality Adjunct (ISMAR-Adjunct)], 51–56 (2020).
[20] Basch, M.-E. and Bogdanov, I., “Work directions and new results in electronic travel aids for blind and
visually impaired people,” (2011).
[21] Pissaloux, E., Velazquez, R., and Maingreaud, F., “On 3d world perception: towards a definition of a
cognitive map based electronic travel aid,” in [The 26th Annual International Conference of the IEEE
Engineering in Medicine and Biology Society], 1, 107–109, IEEE (2004).
[22] Boldini, A., Garcia, A. L., Sorrentino, M., Beheshti, M., Ogedegbe, O., Fang, Y., Porfiri, M., and Rizzo,
J.-R., “An inconspicuous, integrated electronic travel aid for visual impairment,” ASME Letters in Dynamic
Systems and Control 1(4), 041004 (2021).
[23] Boldini, A., Ma, X., Rizzo, J.-R., and Porfiri, M., “A virtual reality interface to test wearable electronic
travel aids for the visually impaired,” in [Nano-, Bio-, Info-Tech Sensors and Wearable Systems], 11590,
115900Q, International Society for Optics and Photonics (2021).
[24] Stahl, A., “The diagnosis and treatment of age-related macular degeneration,” Deutsches Ärzteblatt International 117(29-30), 513 (2020).
[25] Khan, Z., Khan, F. G., Khan, A., Rehman, Z. U., Shah, S., Qummar, S., Ali, F., and Pack, S., “Diabetic
retinopathy detection using vgg-nin a deep learning architecture,” IEEE Access 9, 61408–61416 (2021).
[26] Gagrani, M., Ndulue, J., Anderson, D., Kedar, S., Gulati, V., Shepherd, J., High, R., Smith, L., Fowler,
Z., Khazanchi, D., Nawrot, M., and Ghate, D., “What do patients with glaucoma see: a novel ipad app to
improve glaucoma patient awareness of visual field loss,” British Journal of Ophthalmology (2020).
[27] Phamduy, P., Rizzo, J.-R., Hudson, T. E., Torre, M., Levon, K., and Porfiri, M., “Communicating through
touch: Macro fiber composites for tactile stimulation on the abdomen,” IEEE transactions on haptics 11(2),
174–184 (2017).
[28] Boldini, A., Rizzo, J.-R., and Porfiri, M., “A piezoelectric-based advanced wearable: obstacle avoidance
for the visually impaired built into a backpack,” in [Nano-, Bio-, Info-Tech Sensors, and 3D Systems IV],
11378, 1137806, International Society for Optics and Photonics (2020).