Figure 4 - available via license: CC BY
A VA-ST Smart Specs captured image.

Source publication
Article
Full-text available
Over the last few decades, the development of navigation devices capable of guiding the blind through indoor and/or outdoor scenarios has remained a challenge. In this context, this paper aims to provide an updated, holistic view of this research so as to enable developers to exploit the different aspects of its multidisciplinary nature....

Contexts in source publication

Context 1
... remarkable application of this technology for VI people's guidance lies in the user interface. One solution, proposed by Stephen L. Hicks et al. of Oxford University, exploited residual vision by enhancing 3D perception with simplified images that emphasize depth (Figure 4) [64]. They recently attempted to bring their Smart Specs glasses [65] to market, with funding from the start-up VA-ST. ...

Similar publications

Article
Full-text available
Cloud-based technologies are driving positive changes in the way organizations communicate. Running a global business demands travel and availability for meetings. However, with travel costs high, an alternative solution to this problem is required. This paper presents a new approach that enhances current 2D...

Citations

... Considering such numbers, it is clear there is a strong need for assistive navigation systems not only to help guide visually impaired pedestrians while walking on sidewalks, but also to offer additional useful information about the environment [4,5]. ...
Preprint
Full-text available
The World Health Organization (WHO) estimated in 2012 that 285 million people globally have low vision or are blind. A similar figure, an estimated 253 million people, was reported by the World Blind Union (WBU) in 2015. Most visually impaired or blind persons rely on white canes or service dogs to improve their daily mobility. However, the use of white canes limits the detection distance of objects to the length of the cane, or approximately 1.5 m. On the other hand, the number of guide dogs is limited, and training a service animal is very expensive. Therefore, there is a real need for systems that increase the mobility of the visually impaired while remaining affordable, comfortable, and easy to use. This article presents a complete assistive navigation system, called SENSATION: Sidewalk Environment Detection System for Assistive Navigation, which is inexpensive, easy to wear, and does not interfere with the additional use of a white cane. It is a standalone system that does not depend on cloud computing; it runs deep learning models on the device for immediate detection of the environment, along with image segmentation and an algorithm to correct drifting while walking.
... Unlike other applications present in virtual markets, applications focused on autonomy for visually impaired people evidently do not provide a complete service to their users, despite having more dynamic interfaces. Among their main problems is the architecture of the base information used to deploy geographic information, since their programming relies on APIs of the Google service [55][56][57][58]. Moreover, the base cartography does not constitute an exact representation of pedestrian viability; because it is a global service, its information reflects the demand of the majority of users [59][60][61]. ...
Article
Full-text available
The global navigation satellite systems (GNSS) have become important in conjunction with the advancement of technology, improving the accuracy of positioning and navigation on mobile devices. In the current project, a mobile application for navigation using the networked transport of RTCM via internet protocol (NTRIP) was developed, focused on the autonomous mobility of people with visual disabilities. This was achieved through a web viewer that stores the base cartography in a geodatabase (GDB). Such information is integrated into the application interface in JavaScript within the Android Studio platform, with a personalized design. This incorporates a screen reader for the selection, navigation and direction of destinations, in addition to an early warning system for obstacles. Additionally, a differential position correction was implemented using the BKG Ntrip Client (BNC) software for the adjustment of coordinates with the precise point positioning (PPP) method, through streams in RTCM format with the casters EPEC3, IGS03 and BCEP00BKG0. The evaluation of the application was performed using the National Standard for Spatial Data Accuracy (NSSDA), establishing 30 control points. These were obtained through the fast-static method, in order to compare the horizontal accuracy of observations in static and navigation modes between high-end and mid-range mobile devices.
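The NSSDA evaluation mentioned in the abstract can be sketched as follows. Under the NSSDA, horizontal accuracy at the 95% confidence level is reported as 1.7308 × RMSE_r (assuming equal x and y error components); the check-point coordinates below are hypothetical, not the paper's data.

```python
import math

def nssda_horizontal_accuracy(observed, reference):
    """NSSDA 95% horizontal accuracy: 1.7308 * RMSE_r,
    where RMSE_r = sqrt(sum(dx^2 + dy^2) / n) over n check points."""
    n = len(observed)
    sq_sum = sum((ox - rx) ** 2 + (oy - ry) ** 2
                 for (ox, oy), (rx, ry) in zip(observed, reference))
    rmse_r = math.sqrt(sq_sum / n)
    return 1.7308 * rmse_r

# Hypothetical check points: app fixes vs. surveyed reference (metres).
obs = [(100.30, 200.10), (150.20, 250.40), (120.00, 210.30)]
ref = [(100.00, 200.00), (150.00, 250.00), (120.20, 210.00)]
print(f"Accuracy_r (95%): {nssda_horizontal_accuracy(obs, ref):.3f} m")
```

In practice the 30 control points described above would replace the three hypothetical pairs, with one run per device class and positioning mode.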
... Since the first electronic travel aids (ETAs) emerged around 70 years ago, the development of navigation devices to guide VIB people through indoor and/or outdoor environments has remained a challenge and a key concern for researchers [Real and Araujo 2019]. ...
... So far, many reviews have been carried out on assistive technologies developed for VIB navigation. Several papers have reviewed walking assistance systems [Fernandes et al. 2019; Islam et al. 2019; Manjari et al. 2020; Real and Araujo 2019; Romlay et al. 2021; Zhang et al. 2021b] and provided a detailed classification of developed approaches. Islam M. et al. [Islam et al. 2019] categorized walking assistants into three groups: sensor-based, computer vision-based, and smartphone-based. ...
Preprint
Full-text available
In recent decades, several assistive technologies for visually impaired and blind (VIB) people have been developed to improve their ability to navigate independently and safely. At the same time, simultaneous localization and mapping (SLAM) techniques have become sufficiently robust and efficient to be adopted in the development of assistive technologies. In this paper, we first report the results of an anonymous survey conducted with VIB people to understand their experience and needs; we focus on digital assistive technologies that help them with indoor and outdoor navigation. Then, we present a literature review of assistive technologies based on SLAM. We discuss proposed approaches and indicate their pros and cons. We conclude by presenting future opportunities and challenges in this domain.
... Some users depend on vibratory feedback and some on auditory feedback, and there can be situations in which the two solutions must complement each other depending on the environment. Therefore, offering multiple feedback modes makes a system more flexible across varying environments. A multimode system can be configured according to user preference; it would benefit the user despite the added implementation complexity [56,57]. c) Reliability of personal and private information: Proper management of users' personal and private information must be considered when designing devices for visually challenged and blind people [58,59]. Furthermore, navigation devices designed for blind and visually challenged users should offer a customized settings mode, so that settings can be applied according to the user's preferences. ...
Article
Indoor navigation in unfamiliar surroundings has been a tough task for visually challenged persons. Electronic travel aids (ETAs) embedded with features such as obstacle detection and recognition of the desired destination, whether a staircase, a lift or a particular room, can help visually challenged persons navigate. GPS technology is of limited use for indoor navigation due to its very low indoor accuracy. This paper presents a comprehensive review of recent advances in indoor navigation technologies and aids that facilitate the movement of visually challenged persons. Based on the studied research, recommendations for future work in this direction are enumerated.
... Tasks in product and software development largely overlap with those involved in enhancing communication or entertainment services for sighted individuals. Regarding tactile information, blind people make use of such developments especially during navigation, as an extension of or replacement for audio feedback signals [29][30][31][32][33]. Persons with visual impairment generally use a cane to explore their surroundings and sense objects. ...
Article
Full-text available
Assistive technology uses multi-modal feedback devices, focusing on the visual, auditory, and haptic modalities. Tactile devices provide additional information via touch sense. Perception accuracy of vibrations depends on the spectral and temporal attributes of the signal, as well as on the body parts they are attached to. The widespread use of AR/VR devices, wearables, and gaming interfaces requires information about the usability of feedback devices. This paper presents results of an experiment using an 8-channel tactile feedback system with vibrators placed on the wrists, arms, ankles, and forehead. Different vibration patterns were designed and presented using sinusoidal frequency bursts on 2, 4, and 8 channels. In total, 27 subjects reported their sensation formally and informally on questionnaires. Results indicate that 2 and 4 channels could be used simultaneously with high accuracy, and the transducers’ optimal placement (best sensitivity) is on the wrists, followed by the ankles. Arm and head positions were inferior and generally inadequate for signal presentation. For optimal performance, signal length should exceed 500 ms. Furthermore, the amplitude level and temporal pattern of the presented signals have to be used for carrying information rather than the frequency of the vibration.
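The sinusoidal frequency bursts used in the experiment above can be sketched as follows; the 250 Hz carrier, 44.1 kHz sample rate, and 10 ms on/off ramps are illustrative assumptions, not the paper's exact stimulus parameters.

```python
import math

def sine_burst(freq_hz, duration_s, amplitude=1.0, sample_rate=44100, ramp_s=0.01):
    """Generate one sinusoidal vibration burst with short linear on/off
    ramps to avoid tactile 'clicks' at the burst edges."""
    n = int(duration_s * sample_rate)
    ramp_n = int(ramp_s * sample_rate)
    samples = []
    for i in range(n):
        s = amplitude * math.sin(2 * math.pi * freq_hz * i / sample_rate)
        if i < ramp_n:                  # fade in
            s *= i / ramp_n
        elif i >= n - ramp_n:           # fade out
            s *= (n - 1 - i) / ramp_n
        samples.append(s)
    return samples

# Per the results above, bursts should exceed 500 ms for reliable perception.
burst = sine_burst(freq_hz=250, duration_s=0.6)
print(len(burst))  # 26460 samples at 44.1 kHz
```

Since the paper finds that amplitude level and temporal pattern carry information better than frequency, a pattern would be built by concatenating bursts of varying amplitude and inter-burst gaps rather than varying `freq_hz`.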
... Systems capable of GPS positioning, obstacle detection and object detection have been designed. In these studies, ultrasonic sensors were generally used for obstacle detection [5]. In another study, a GPS-assisted system was designed to enable visually impaired individuals to move about, using ultrasonic sensors and vibration motors. ...
Article
Full-text available
It is very difficult for visually impaired individuals to avoid obstacles, to notice or recognize obstacles at a distance, and to notice and follow the special paths made for them. They cope with these situations by touch or with the help of a walking stick. Because of these safety problems, it is difficult for visually impaired individuals to move freely, which affects them negatively both socially and in terms of health. To address these problems, a support system for visually impaired individuals is proposed. The vision support system includes an embedded system with a camera and an audio warning system so that the visually impaired individual can identify objects ahead, and a circuit with an ultrasonic sensor for early detection of obstacles so that precautions can be taken. Object recognition is realized with convolutional neural networks: the Faster R-CNN model was used, together with a model we created that can recognize 25 kinds of products. With the help of the dataset we created and the network trained on it, the visually impaired individual can identify some market products. In addition, auxiliary elements were added to the walking sticks they use: a camera system that enables the visually impaired individual to notice the guide lines laid out for them in the environment, and a tracking circuit placed at the tip of the cane so that these lines can be followed easily. Each subsystem was designed separately so that warnings reach the visually impaired person quickly, without delay; in this way, we have tried to reduce the error rate caused by processing load. The system is designed to be wearable, easy to use and low-cost, so as to be accessible to everyone.
... In a world that is heavily based on visual perception, the visually impaired (VI) face immense problems in everyday life and especially in the work environment. In recent decades, there has been continual progress in research for improving the mobility of the VI [2]. Examples range from sonar-based solutions like Russell's PathSounder [3] and the Mowat Sensor [4], to employing computer vision and deep neural networks for simultaneous localization and mapping (SLAM), navigation and obstacle avoidance for the VI [5]- [7]. ...
... Given the fact that user experience and cognitive load play a vital role in the acceptance of a new assistive technology [2], in the current study, Unity 3D game engine's capabilities are used to enhance the system's ease of use. After path planning and finding the optimal route to the target location, a guiding game object (GGO) will appear; this virtual object will emit spatial sound, defined as three-dimensional feedback (A3DF), to act as a guiding signal for the VI, helping them find the way to the target location. ...
... where $T^{\mathrm{cam}}_{\mathrm{calib\,obj}}$ is the transformation matrix that transforms the initial pose and position of the calibration object, $P_{\mathrm{calib\,obj}}$, to the main camera, $P_{\mathrm{cam}}$, in the scene, and $P^{\mathrm{new}}_{\mathrm{calib\,obj}}$ is the new pose of the calibration object in the scene after calibration. Additionally, for the EM we have (2), where similar transformations as above transform the EM, $P_{\mathrm{env\,mesh}}$, to its calibrated pose and position, $P^{\mathrm{new}}_{\mathrm{env\,mesh}}$, given the transformation matrix $T^{\mathrm{env\,mesh}}_{\mathrm{calib\,obj}}$. 3) Path planning and A3DF: After calibration, the GGO can be used to help the user navigate to a given target location. ...
Conference Paper
With the Augmented Reality (AR) technology available today, it is quite feasible to accommodate the needs of the visually impaired (VI) via AR. In this paper, a framework is introduced to help the VI navigate and explore unfamiliar indoor environments. In contrast to commonly used AR applications focused on visual augmentation, the proposed framework employs auditory three-dimensional feedback (A3DF) for guiding the VI. Concretely, the current framework reads the pose of the user and helps the VI reach a target location via A3DF. The A3DF is implemented with the Unity game engine to provide the optimal user experience. After acquiring the environment mesh (EM), the optimal path from the user's location to the target location is calculated, while avoiding obstacles using Unity's navigation system. Moreover, the user is provided with semantic information about the unknown environment whilst exploring via auditory information. This framework is implemented on Microsoft HoloLens 2 and tested in an office environment with different locations of interest. Additionally, this framework potentially accelerates the learning curve, since the user can be trained in Unity's simulation environment. Lastly, given the different design parameters of the framework, the proposed method can be fine-tuned to fit the specific needs of the individual VI.
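The calibration step quoted in the citation contexts above amounts to left-multiplying a pose by a rigid-body transformation matrix. A minimal sketch using 4×4 homogeneous matrices follows; the transform values are entirely hypothetical, and the names merely mirror the paper's notation.

```python
import numpy as np

def apply_transform(T, P):
    """Apply a 4x4 homogeneous transform T to a 4x4 homogeneous pose P,
    yielding the calibrated pose: P_new = T @ P."""
    return T @ P

# Hypothetical transform from the calibration-object frame to the camera
# frame: a 90-degree yaw plus a translation of (1.0, 0.5, 0.0) metres.
theta = np.pi / 2
T_cam_calib = np.array([
    [np.cos(theta), -np.sin(theta), 0, 1.0],
    [np.sin(theta),  np.cos(theta), 0, 0.5],
    [0,              0,             1, 0.0],
    [0,              0,             0, 1.0],
])
P_calib_obj = np.eye(4)  # initial pose of the calibration object
P_new = apply_transform(T_cam_calib, P_calib_obj)
print(np.round(P_new, 3))
```

The same operation, with the environment mesh's pose in place of the calibration object's, would register the EM into the calibrated scene frame.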
... Considering the growing prevalence of smartphones in general and in the pBLV population [12] [14] [7], mobile applications can be a potential solution to address the localization and navigation needs of pBLV [25]. All Aboard [10], developed by Massachusetts Eye and Ear, utilizes computer vision to detect bus stop signs in the vicinity of the user and guide the user to the precise location of the bus stop by estimating the distance to the sign. ...
Preprint
Full-text available
People with blindness and low vision (pBLV) experience significant challenges when locating final destinations or targeting specific objects in unfamiliar environments. Furthermore, beyond initially locating and orienting oneself to a target object, approaching the final target from one's present position is often frustrating and challenging, especially when one drifts away from the initially planned path to avoid obstacles. In this paper, we develop a novel wearable navigation solution to provide real-time guidance for a user to approach a target object of interest efficiently and effectively in unfamiliar environments. Our system contains two key visual computing functions: initial target object localization in 3D and continuous estimation of the user's trajectory, both based on the 2D video captured by a low-cost monocular camera mounted in front of the user's chest. These functions enable the system to suggest an initial navigation path, continuously update the path as the user moves, and offer timely recommendations for correcting the user's path. Our experiments demonstrate that our system is able to operate with an error of less than 0.5 meters both outdoors and indoors. The system is entirely vision-based and needs no other sensors for navigation, and the computation can be run on the Jetson processor in the wearable system to facilitate real-time navigation assistance.
... Researchers have worked on various prototypes of guidance systems for visually impaired people. Santiago et al. [21] surveyed the development of systems using traditional technologies such as sonar, infrared radiation, and inertial sensing. The authors conclude that smartphones and wearable devices can be valuable as assistive devices for individuals with visual impairments. ...
Article
Full-text available
Guidance systems for visually impaired persons have become a popular topic in recent years. Existing guidance systems on the market typically utilize auxiliary tools and methods such as GPS, UWB, or a simple white cane that exploits the user’s single tactile or auditory sense. These guidance methodologies can be inadequate in a complex indoor environment. This paper proposes a multi-sensory guidance system for the visually impaired that can provide tactile and auditory advice using ORB-SLAM and YOLO techniques. Based on an RGB-D camera, the local obstacle avoidance system is realized at the tactile level through point cloud filtering that can inform the user via a vibrating motor. Our proposed method can generate a dense navigation map to implement global obstacle avoidance and path planning for the user through the coordinate transformation. Real-time target detection and a voice-prompt system based on YOLO are also incorporated at the auditory level. We implemented the proposed system as a smart cane. Experiments are performed using four different test scenarios. Experimental results demonstrate that the impediments in the walking path can be reliably located and classified in real-time. Our proposed system can function as a capable auxiliary to help visually impaired people navigate securely by integrating YOLO with ORB-SLAM.
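The point-cloud filtering step at the tactile level described above (deciding from an RGB-D depth frame whether something is close enough to trigger the vibrating motor) can be sketched as a simple range filter; the depth format, 1 m threshold, and minimum pixel count are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

def obstacle_in_range(depth_m, max_range=1.0, min_pixels=500):
    """Return True if enough valid depth pixels fall within max_range metres.
    Zeros are treated as invalid (no depth reading), mimicking common
    RGB-D sensor output; the thresholds here are illustrative."""
    valid = depth_m > 0
    near = valid & (depth_m < max_range)
    return int(near.sum()) >= min_pixels

# Synthetic 480x640 depth frame: mostly 3 m away, with a 40x40 patch at 0.6 m.
depth = np.full((480, 640), 3.0)
depth[200:240, 300:340] = 0.6      # 1600 near pixels -> obstacle
print(obstacle_in_range(depth))    # True
```

The pixel-count floor acts as a crude noise filter, so a few spurious depth readings do not fire the motor; a real system like the one above would filter the 3D point cloud (e.g. voxel or statistical outlier filtering) before thresholding.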
... Some more projects are focusing on user requirement analysis of navigational systems like Miao et al. [36] and Asakawa et al. [37] who report on a user requirement analysis for blind navigation in public buildings and museums, respectively. Moreover, Real and Araujo [38] reviewed the foundations and requirements of existing applications regarding the provision of environmental information. In their work, they focus on user-centered design for indoor and outdoor positioning taking into account their past experiences, technological state of the art, and user-related needs and constraints. ...
Article
Full-text available
The autonomy, independence, productivity and, in general, quality of life of people with visual impairments often rely significantly on their ability to use new assistive technologies. In particular, their ability to navigate by foot, use means of transport and visit indoor spaces may be greatly enhanced by the use of assistive navigation systems. In this paper, a detailed analysis of user needs and requirements is presented concerning the design and development of assisting navigation systems for blind and visually impaired people (BVIs). To this end, the elicited user needs and requirements from interviews with the BVIs are processed and classified into seven main categories. Interestingly, one of the categories represents the requirements of the BVIs to be trained on the use of the mobile apps that would be included in an assistive navigation system. The need of the BVIs to be confident in their ability to safely use the apps revealed the requirement that training versions of the apps should be available. These versions would be able to simulate real-world conditions during the training process. The requirements elicitation and classification reported in this paper aim to offer insight into the design, development, deployment and distribution of assistive navigation systems for the BVIs.