Article

Study on portable haptic guide device with omnidirectional driving gear

Abstract

In the past, white canes, guide dogs, and guide helpers have assisted visually impaired people when walking outdoors. However, these assistance methods have various limitations that prevent extended and suitable use. Our laboratory therefore proposes a route guidance device that conveys commands to the end user by applying a sense of force to the thumb. The guide device is driven in a plane with two degrees of freedom by an XY stage with an omnidirectional driving gear. The current prototype still presents challenges for practical implementation as a smooth, safe, and reliable route guidance system. In this study, we conducted experiments with blindfolded sighted participants to optimize the combination of the update position of the intermediate target point and the presentation cycle.
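
As a rough illustration of the guidance scheme described above, the sketch below updates a hypothetical intermediate target point along a planned route and computes the direction in which the XY stage would displace the thumb each presentation cycle. All names and parameter values are assumptions for illustration, not the authors' implementation.

```python
import math

LOOKAHEAD_M = 1.0           # assumed distance ahead on the route for the target
PRESENTATION_CYCLE_S = 0.5  # assumed refresh period of the force cue

def intermediate_target(route, user_pos):
    """Pick the first route point at least LOOKAHEAD_M away from the user."""
    for point in route:
        if math.dist(user_pos, point) >= LOOKAHEAD_M:
            return point
    return route[-1]  # end of route: keep pointing at the goal

def stage_command(user_pos, user_heading_rad, target):
    """Direction (rad, relative to user heading) to displace the XY stage."""
    dx, dy = target[0] - user_pos[0], target[1] - user_pos[1]
    # Normalize the relative bearing into [-pi, pi)
    return (math.atan2(dy, dx) - user_heading_rad + math.pi) % (2 * math.pi) - math.pi
```
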

References

Conference Paper
Full-text available
Human perceptual properties have been applied to the design of multisensory display technologies. This paper overviews the sensory-illusion-based approach we have used to create a force display that elicits an illusory continuous force sensation by presenting asymmetric vibrations, and a self-motion display based on a cross-modal effect between visual and tactile motion.
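
The asymmetric-vibration idea can be made concrete with a zero-mean waveform whose two phases differ in amplitude and duration, so the net impulse is zero while the perceived force is directional. A minimal sketch follows; all parameters (frequency, duty split, amplitude ratio) are arbitrary illustrations, not values from the paper.

```python
import numpy as np

def asymmetric_wave(freq_hz=40, fs=8000, cycles=10, duty=0.25):
    """Zero-mean drive signal: short strong phase, long weak opposite phase."""
    period = int(fs / freq_hz)
    strong = int(period * duty)
    cycle = np.concatenate([
        np.full(strong, 1.0),                          # short, strong phase
        np.full(period - strong, -duty / (1 - duty)),  # long, weak counter-phase
    ])
    return np.tile(cycle, cycles)  # mean is zero, asymmetry is not
```
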
Conference Paper
Full-text available
Navigation robots have the potential to overcome some of the limitations of traditional navigation aids for blind people, especially in unfamiliar environments. In this paper, we present the design of CaBot (Carry-on roBot), an autonomous suitcase-shaped navigation robot that is able to guide blind users to a destination while avoiding obstacles on their path. We conducted a user study where ten blind users evaluated specific functionalities of CaBot, such as a vibro-tactile handle to convey directional feedback; experimented to find their comfortable walking speed; and performed navigation tasks to provide feedback about their overall experience. We found that CaBot's performance highly exceeded users' expectations, who often compared it to navigating with a guide dog or sighted guide. Users' high confidence, sense of safety, and trust in CaBot pose autonomous navigation robots as a promising solution to increase the mobility and independence of blind people, in particular in unfamiliar environments.
Conference Paper
Full-text available
Turn-by-turn navigation is a useful paradigm for assisting people with visual impairments during mobility as it reduces the cognitive load of having to simultaneously sense, localize and plan. To realize such a system, it is necessary to be able to automatically localize the user with sufficient accuracy, provide timely and efficient instructions and have the ability to easily deploy the system to new spaces. We propose a smartphone-based system that provides turn-by-turn navigation assistance based on accurate real-time localization over large spaces. In addition to basic navigation capabilities, our system also informs the user about nearby points-of-interest (POI) and accessibility issues (e.g., stairs ahead). After deploying the system on a university campus across several indoor and outdoor areas, we evaluated it with six blind subjects and showed that our system is capable of guiding visually impaired users in complex and unfamiliar environments.
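
The instruction-generation step of a turn-by-turn assistant lends itself to a small illustration: given the user's heading and the bearing to the next waypoint, emit a coarse directional instruction. The thresholds below are assumptions, not the system's actual values.

```python
def turn_instruction(user_heading_deg, bearing_to_waypoint_deg):
    """Map the heading error to a coarse spoken-style instruction."""
    # Normalize the difference into [-180, 180)
    diff = (bearing_to_waypoint_deg - user_heading_deg + 540) % 360 - 180
    if abs(diff) < 20:        # assumed "close enough" corridor
        return "continue straight"
    side = "right" if diff > 0 else "left"
    return f"turn {'slightly ' if abs(diff) < 60 else ''}{side}"
```
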
Article
Full-text available
The present study investigated whether a tactile flow created by a matrix of vibrators in a seat pan, presented simultaneously with an optical flow in peripheral vision, enhances the perceived forward velocity of self-motion. A brief tactile motion stimulus consisted of four successive rows of vibration, and the interstimulus onset interval between the tactile rows was varied to change the velocity of the tactile motion. The results show that the forward velocity of self-motion is significantly overestimated for rapid tactile flows and underestimated for slow ones, compared with optical flow alone or non-motion vibrotactile stimulation conditions. In addition, the effect of a temporal tactile rhythm without a change in stimulus location was smaller than that of spatiotemporal tactile motion with an interstimulus onset interval that elicits a clear sensation of tactile apparent motion. These findings suggest that spatiotemporal tactile motion is effective in inducing a change in the perceived forward velocity of self-motion.
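
The timing manipulation is easy to make concrete: for a target apparent tactile velocity, the stimulus onset asynchrony (SOA) between rows is the row spacing divided by the velocity. The row spacing below is an assumed value, not the paper's hardware spec.

```python
ROW_SPACING_M = 0.03  # assumed distance between vibrator rows on the seat pan

def row_onsets(velocity_m_s, n_rows=4):
    """Onset time (s) of each row for a given apparent tactile velocity."""
    soa = ROW_SPACING_M / velocity_m_s
    return [i * soa for i in range(n_rows)]

# e.g. row_onsets(0.3) fires a row every 100 ms: a slow tactile flow
```
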
Conference Paper
Full-text available
Pedestrian navigation systems require users to perceive, interpret, and react to navigation information. This can tax cognition as navigation information competes with information from the real world. We propose actuated navigation, a new kind of pedestrian navigation in which the user does not need to attend to the navigation task at all. An actuation signal is directly sent to the human motor system to influence walking direction. To achieve this goal we stimulate the sartorius muscle using electrical muscle stimulation. The rotation occurs during the swing phase of the leg and can easily be counteracted. The user therefore stays in control. We discuss the properties of actuated navigation and present a lab study on identifying basic parameters of the technique as well as an outdoor study in a park. The results show that our approach changes a user's walking direction by about 16°/m on average and that the system can successfully steer users in a park with crowded areas, distractions, obstacles, and uneven ground.
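
The reported steering gain of about 16°/m supports a quick back-of-envelope calculation of how far stimulation must remain active to produce a given turn. The proportional sketch below is our reading of that number, not the authors' controller.

```python
STEERING_GAIN_DEG_PER_M = 16.0  # average gain reported in the study

def stimulation_distance_m(desired_turn_deg):
    """Metres of stimulated walking needed for a desired heading change."""
    return abs(desired_turn_deg) / STEERING_GAIN_DEG_PER_M

# A 90-degree street corner needs roughly 90 / 16 = 5.6 m of stimulated walking.
```
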
Conference Paper
Full-text available
Flatland was a large-scale immersive theatre production completed in March 2015 that made use of a novel shape-changing haptic navigation device, the ‘Animotus’. Copies of this device were given to each audience member in order to guide them through a 112 m² dark space to large tactile structures accompanied by audio narration from the production’s plot. The Animotus was designed to provide unobtrusive navigation feedback over extended periods of time, via modification of its natural cube shape to simultaneously indicate proximity and heading information to navigational targets. Prepared by an interdisciplinary team of blind and sighted specialists, Flatland is part performance, part in-the-wild user study. Such an environment presents a unique opportunity for testing new forms of technology and theatre concepts with large numbers of participants (94 in this case). The artistic aims of the project were to use sensory-substitution-facilitated exploration to investigate comparable cultural experiences for blind and sighted attendees. Technical goals were to experiment with novel haptic navigational concepts, which may be applied to various other scenarios, including typical outdoor pedestrian navigation. This short paper outlines the project aims, haptic technology design motivation and initial evaluation of resulting audience navigational ability and qualitative reactions to the Animotus.
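
One plausible reading of the Animotus encoding is a two-channel mapping from heading and proximity to a twist angle and an extension of the cube's top half. The sketch below is a hypothetical mapping with assumed actuator ranges and sign conventions, not the device's actual firmware.

```python
def animotus_pose(heading_deg, distance_m, max_distance_m=20.0):
    """Map navigation state to (rotation_deg, extension) of the top half."""
    rotation_deg = max(-90.0, min(90.0, heading_deg))  # twist toward the target
    # Extension grows as the target nears: 0.0 = far away, 1.0 = arrived
    extension = 1.0 - min(distance_m, max_distance_m) / max_distance_m
    return rotation_deg, extension
```
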
Article
Full-text available
Perception involves motor control of sensory organs. However, the dynamics underlying emergence of perception from motor-sensory interactions are not yet known. Two extreme possibilities are as follows: (1) motor and sensory signals interact within an open-loop scheme in which motor signals determine sensory sampling but are not affected by sensory processing and (2) motor and sensory signals are affected by each other within a closed-loop scheme. We studied the scheme of motor-sensory interactions in humans using a novel object localization task that enabled monitoring the relevant overt motor and sensory variables. We found that motor variables were dynamically controlled within each perceptual trial, such that they gradually converged to steady values. Training on this task resulted in improvement in perceptual acuity, which was achieved solely by changes in motor variables, without any change in the acuity of sensory readout. The within-trial dynamics is captured by a hierarchical closed-loop model in which lower loops actively maintain constant sensory coding, and higher loops maintain constant sensory update flow. These findings demonstrate interchangeability of motor and sensory variables in perception, motor convergence during perception, and a consistent hierarchical closed-loop perceptual model.
Article
Full-text available
An approach is proposed to shed light on the mechanisms underlying human perception of environmental sound that intrudes in everyday living. Most research on exposure-effect relationships aims at relating overall effects to overall exposure indicators in an epidemiological fashion, without including available knowledge on the possible underlying mechanisms. Here, it is proposed to start from available knowledge on audition and perception to construct a computational framework for the effect of environmental sound on individuals. Obviously, at the individual level additional mechanisms (inter-sensory, attentional, cognitive, emotional) play a role in the perception of environmental sound. As a first step, current knowledge is made explicit by building a model mimicking some aspects of human auditory perception. This model is grounded in the hypothesis that long-term perception of environmental sound is determined primarily by short notice-events. The applicability of the notice-event model is illustrated by simulating a synthetic population exposed to typical Flemish environmental noise. From these simulation results, it is demonstrated that the notice-event model is able to mimic the differences between the annoyance caused by road traffic noise exposure and railway traffic noise exposure that are also observed empirically in other studies and thus could provide an explanation for these differences.
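
The notice-event hypothesis can be caricatured in a few lines: flag moments when the instantaneous level stands out from the background, then treat the event count as a proxy for long-term annoyance. The threshold and aggregation below are illustrative assumptions, not the paper's model.

```python
import random

def notice_events(levels_db, background_db, threshold_db=10.0):
    """Indices where the level exceeds the background by the threshold."""
    return [i for i, level in enumerate(levels_db)
            if level - background_db > threshold_db]

# Synthetic hour of 1-second sound levels around a 55 dB background
levels = [random.gauss(55, 6) for _ in range(3600)]
events = notice_events(levels, background_db=55)
print(f"{len(events)} notice events in the hour")
```
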
Conference Paper
Full-text available
In this paper, we propose a computer-vision based registration method for augmented reality based on template matching. Computer-vision tracking methods for augmented reality applications typically use special fiducial markers such as squares or circles. Our new method uses the black square fiducial of ARToolKit to obtain the initial tracking condition, but does not use it in subsequent iterative tracking phases. Several natural feature points are extracted from the tracked object by offline image analysis. While tracking the object, some of these feature points are selected for template matching, and the object pose and position are calculated. Even when the initial tracking square is no longer in view, our method produces robust tracking at real-time frame rates.
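
The per-frame template-matching step that such a method builds on can be sketched with standard OpenCV calls. This is generic template matching, not the authors' full registration pipeline; the score threshold is an assumed value.

```python
import cv2

def match_feature(frame_gray, template_gray, min_score=0.8):
    """Locate a stored feature patch in the current frame; None if unreliable."""
    result = cv2.matchTemplate(frame_gray, template_gray, cv2.TM_CCOEFF_NORMED)
    _, score, _, top_left = cv2.minMaxLoc(result)  # best match and its location
    if score < min_score:
        return None  # feature not reliably visible in this frame
    h, w = template_gray.shape
    return (top_left[0] + w // 2, top_left[1] + h // 2)  # patch centre (x, y)
```
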
Article
We developed a robot that can guide visually impaired people and elderly people as they walk around in large hospitals. In relation to this, a previous report described the structure of a guidance robot and compared its use with that of a white cane. It was shown that with the robot, participants could move more easily, safely, and confidently than with a white cane. However, to solve the problems encountered with the previous robot, a new guidance robot was fabricated. This paper describes the structure of the new robot and the results of its demonstration trial at the Kanagawa Rehabilitation Center. The robot navigates to a destination set via the touch panel, while its velocity depends on the force exerted by the user on the robot. The questionnaires answered by the participants were evaluated using the system usability scale, which showed that the robot falls in the “Acceptable” range and that its usability is high.
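
The force-proportional velocity behaviour reads like an admittance-style control law: the harder the user pushes the handle, the faster the robot moves, up to a safe cap. A minimal sketch follows, with gain, deadband, and speed cap as assumed values rather than the robot's actual parameters.

```python
MAX_SPEED_M_S = 1.2    # assumed safe walking-pace ceiling
GAIN_M_S_PER_N = 0.05  # assumed velocity gained per newton of handle force
DEADBAND_N = 2.0       # ignore sensor noise and light touches

def commanded_speed(handle_force_n):
    """Admittance-style mapping from handle force to forward speed."""
    if handle_force_n < DEADBAND_N:
        return 0.0
    return min(MAX_SPEED_M_S, GAIN_M_S_PER_N * (handle_force_n - DEADBAND_N))
```
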
Conference Paper
Mobile audio augmented reality systems (MAARS) provide a new and engaging modality to present information or to create playful experiences. Using special filters, spatial audio rendering creates the impression that the sound of a virtual source emanates from a certain position in physical space. So far, most implementations of such systems rely on head tracking to create a realistic effect, which requires additional hardware. Recent results indicate that the built-in sensors of a smartphone can be used as a source of orientation measurements, reducing deployment to a simple app download. AudioScope presents an alternative interaction technique to create such an experience, using the metaphor of pointing a directional microphone at the environment. In an experiment with 20 users, we compared the time to locate a proximate audio source and the perceived presence in the virtual environment. Results show that there is no significant difference between head-orientation measurement and AudioScope regarding accuracy and perceived presence. This means that MAARS, such as audio guides for museums, do not require special hardware but can run on visitors' smartphones with standard headphones.
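
The directional-microphone metaphor suggests a gain that peaks when the phone points straight at the virtual source and falls off as the pointing error grows. The cardioid-like pattern below is an assumption for illustration, not AudioScope's actual rendering.

```python
import math

def source_gain(phone_azimuth_rad, source_azimuth_rad):
    """Cardioid-style pickup: 1.0 on-axis, fading to 0.0 directly behind."""
    error = source_azimuth_rad - phone_azimuth_rad
    return 0.5 * (1.0 + math.cos(error))
```
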
Conference Paper
This paper introduces a new mechanism to induce a virtual force based on human illusory sensations. An asymmetric signal is applied to a tactile actuator consisting of an electromagnetic coil, a metal weight, and a spring, such that the user feels that the device is being pulled (or pushed) in a particular direction, although it is not supported by any mechanical connection to other objects or the ground. The proposed tactile device is smaller (35.0 mm × 5.0 mm × 7.5 mm) and lighter (5.2 g) than any previous force-feedback devices, which have to be connected to the ground with mechanical links. This small form factor allows the device to be implemented in several novel interactive applications, such as a pedestrian navigation system that includes a finger-mounted tactile device or an (untethered) input device that features virtual force. Our experimental results indicate that this illusory sensation actually exists and the proposed device can switch the virtual force direction within a short period. We combined this new technology with visible light transmission via a digital micromirror device (DMD) projector and developed a position guiding input device with force perception.
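
Because the illusory pull follows the drive asymmetry, negating the drive signal should mirror the acceleration profile and reverse the perceived direction. The sketch below assumes this polarity convention; the paper states only that the direction can be switched within a short period.

```python
import numpy as np

def directional_drive(base_wave: np.ndarray, pull_positive: bool) -> np.ndarray:
    """Select pull direction by flipping the sign of an asymmetric drive wave."""
    # Negation mirrors the acceleration profile, which is assumed here to
    # reverse the illusory force direction for this coil-weight-spring actuator.
    return base_wave if pull_positive else -base_wave
```
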
Article
Fillingim and Maixner (Fillingim, R.B. and Maixner, W., Pain Forum, 4(4) (1995) 209–221) recently reviewed the body of literature examining possible sex differences in responses to experimentally induced noxious stimulation. Using a ‘box score’ methodology, they concluded that the literature supports sex differences in response to noxious stimuli, with females displaying greater sensitivity. However, Berkley (Berkley, K.J., Pain Forum, 4(4) (1995) 225–227) suggested that the failure of a number of studies to reach statistical significance indicates the effect may be small and of little practical significance. This study used meta-analytic methodology to provide quantitative evidence on the magnitude of these sex differences in response to experimentally induced pain. We found the effect size to range from large to moderate, depending on whether threshold or tolerance was measured and which method of stimulus administration was used. The values for pressure pain and electrical stimulation, for both threshold and tolerance measures, were the largest. For studies employing a threshold measure, the effect for thermal pain was smaller and more variable. The failures to reject the null hypothesis in a number of these studies appear to have been a function of lack of power from an insufficient number of subjects. Given the estimated effect size of 0.55 for threshold or 0.57 for tolerance, 41 subjects per group are necessary to provide adequate power (0.70) to test for this difference. Of the 34 studies reviewed by Fillingim and Maixner, only seven were conducted with groups of this magnitude. These results compel us to caution authors to obtain adequate sample sizes, and we hope that this meta-analytic review can aid in the determination of sample size for future studies.
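
The stated sample size can be reproduced with the standard two-sample power formula under a normal approximation. A two-tailed alpha of 0.05 is an assumption, but it is consistent with the reported n = 41 per group.

```python
from math import ceil
from statistics import NormalDist

def n_per_group(d, power=0.70, alpha=0.05):
    """Two-sample sample size per group for effect size d (normal approximation)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # two-tailed critical value
    z_b = NormalDist().inv_cdf(power)          # power quantile
    return ceil(2 * ((z_a + z_b) / d) ** 2)

print(n_per_group(0.55))  # -> 41 subjects per group, matching the abstract
```
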
Conference Paper
In this study, we propose a method of providing haptic navigation cues that the user does not need to consciously attend to. People view a map when they visit an unfamiliar place, but keeping their eyes on the map diverts their attention and leaves little mental margin. When we are relaxed and not worried about getting lost, we can discover the intrinsic beauty of unfamiliar surroundings. Pedestrians naturally gain a sense of security through touch, for example by following a support such as a wall or a handrail along the street. We therefore propose a haptic navigation system that frees the eyes from constantly consulting a map, in order to enhance the experience of a daily walk or sightseeing.
Conference Paper
This paper describes how we used the ARToolKit to perform three-degree-of-freedom tracking of the hands, in world coordinates, which is used to interact with a mobile outdoor augmented reality computer. Since ARToolKit generates matrices in camera coordinates, if errors occur during the calibration process, it is difficult to extract real-world coordinates. We discuss the problem of making ARToolKit generate world coordinates and the solutions we developed to meet the requirements of our tracking system.
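
In matrix terms, the camera-to-world problem described here is a composition of homogeneous transforms through a reference marker whose world pose is known. The sketch below is generic linear algebra over 4x4 poses, not the authors' calibration solution.

```python
import numpy as np

def world_pose(T_world_ref, T_cam_ref, T_cam_hand):
    """World pose of a hand marker given a reference marker with known world pose.

    T_cam_ref and T_cam_hand are the 4x4 marker poses ARToolKit reports in
    camera coordinates; T_world_ref is the reference marker's known world pose.
    """
    T_ref_cam = np.linalg.inv(T_cam_ref)         # camera pose in the ref frame
    return T_world_ref @ T_ref_cam @ T_cam_hand  # hand -> cam -> ref -> world
```
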
Amemiya, T. Perceptual illusions for multisensory displays. Proceedings of the 22nd International Display Workshops (IDW '15).

Chuo Seiki Co., Ltd. Ballway stage: ALD-H220-C2P. 2020(6).

Kato, H., Tachibana, K., and Billinghurst, M. A registration method based on texture tracking using ARToolKit. 2003 IEEE International Augmented Reality Toolkit Workshop.