Article

Do You Feel What I Hear?


Abstract

In this research we implemented three different methods for presenting scientific graphs to blind and visually impaired people. Each rendering method employed audition, kinesthetics, or a combination of the two modalities. To allow for distance learning, we used low-cost portable devices for the output graph rendering. The three modes of representation were then compared by three separate groups of blind and visually impaired computer users, each consisting of four participants. Results reveal that the combination of the audio and kinesthetic modalities can be a promising medium for representing common scientific graphs to visually challenged people.
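
As a rough illustration of the audio rendering idea described in the abstract (the study's actual mapping and parameters are not given here), a minimal sonification sketch might map the sampled y-values of a graph to tone frequencies and play them left to right. The function name sonify, the frequency range, the segment length, and the choice of a parabola are illustrative assumptions, not the study's implementation.

# Hypothetical sketch: sonify a sampled graph by mapping y-values to pitch.
# Frequency range, segment length, and sample rate are illustrative assumptions.
import math
import struct
import wave

def sonify(ys, out_path="graph.wav", rate=44100, seg=0.05,
           f_lo=220.0, f_hi=880.0):
    """Render one short tone per sample, with pitch proportional to y."""
    y_min, y_max = min(ys), max(ys)
    span = (y_max - y_min) or 1.0
    frames = []
    for y in ys:
        freq = f_lo + (f_hi - f_lo) * (y - y_min) / span
        for n in range(int(rate * seg)):
            frames.append(int(32767 * 0.5 * math.sin(2 * math.pi * freq * n / rate)))
    with wave.open(out_path, "w") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(rate)
        w.writeframes(b"".join(struct.pack("<h", f) for f in frames))

# Example: a parabolic curve, one of the graph shapes mentioned below.
sonify([(x / 10.0) ** 2 for x in range(-20, 21)])
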


... New techniques supporting cross-modal coordination and audio-kinesthetic perceptual integration in the absence of visual information have been developed to support intuitive non-visual interaction with scientific data. For example, Roth et al. [16] designed three methods to provide congenitally blind and visually impaired users with non-visual representations of three different scientific graphs (a linear function, a parabolic curve, and a periodic sine wave), making use of two different modalities. The first rendering technique employed auditory feedback cues (non-speech sounds) to convey information about the graph shape. ...
Article
Full-text available
Blind and visually impaired students need special educational and developmental tools to allow them to interact with graphic entities on PDA and desktop platforms. In previous research, stylus movements over the hidden graph were sonified with three directional-predictive sound (DPS) signals, taking into account exploration behavior and the concept of the capture radius. The results indicated that the scanpaths were 24–40% shorter in length and task completion times decreased by 20–25%. The goal of the study presented in this paper was to measure and compare the subjective performance recorded with directional-predictive vibrations (DPV) against the subjective performance achieved when the hidden graphic images were explored with DPS. The study also aimed to find out which kind of feedback cues would require less cognitive effort to interpret. A prototype vibro-tactile pen with an embedded vibration motor was used to produce DPV instead of sounds. The performance of eight blindfolded subjects was investigated in terms of the number of feedback cues used and the time spent to complete non-visual inspection of the hidden graphs. There was a statistically significant difference between the average number of DPS signals and vibrations, and in the task completion time taken by the players to discover the features of hidden graphs explored with different capture radii. The experimental findings confirmed the beneficial use of DPS signals, compared with DPV patterns, in tasks where cross-modal coordination should benefit the user in the absence of visual information.
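
A minimal sketch of the capture-radius idea described above (the radius value, the three-cue scheme, and the function names are illustrative assumptions, not the study's implementation): the pen position is compared with the nearest point of the hidden graph, and a directional cue is selected only when the pen drifts outside the capture radius.

# Illustrative sketch of capture-radius checking and directional cue selection.
import math

CAPTURE_RADIUS = 15.0  # pixels; hypothetical value

def nearest_point(pen, graph_points):
    """Return the graph point closest to the pen position."""
    return min(graph_points, key=lambda p: math.dist(pen, p))

def directional_cue(pen, graph_points):
    """Return None inside the capture radius, otherwise a coarse direction."""
    target = nearest_point(pen, graph_points)
    if math.dist(pen, target) <= CAPTURE_RADIUS:
        return None                      # on track: no feedback needed
    dx, dy = target[0] - pen[0], target[1] - pen[1]
    if abs(dx) > abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

# Hidden graph: y = x (a straight line), sampled every 5 px.
line = [(x, x) for x in range(0, 200, 5)]
print(directional_cue((50, 90), line))   # pen is off the line -> prints a direction
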
Article
Full-text available
Current methods available to represent graphical information to individuals who are blind or visually impaired are too expensive and/or cumbersome to be of practical use. Therefore, there is a need for an affordable display device capable of rendering graphical information through stimulation of working sensory systems. To further facilitate individuals, the device must be portable, to enable its use in many different settings, and highly affordable, as most individuals who are blind are also unemployed. In this paper a dynamic haptic display device is described that is both affordable (<$25 US) and portable (<1 kg). The device uses a photo-interrupter to detect contrasts in light reflectivity for an image and vibrating solenoid motors to provide tactile feedback. The device is worn like a glove, so the tactile feedback combines with the body's kinesthetic sense of position of the hand to convey a haptic image. Preliminary tests show that a single-finger model of the device has on average a 50% object identification accuracy, which is higher than the accuracy for raised-line drawings. The device can be expanded for use with multiple fingers, while still remaining affordable (<$50 US).
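
The sensor-to-motor logic described above reduces, conceptually, to a threshold test: a reflectivity reading below some cutoff (dark ink under the fingertip) triggers vibration. The sketch below simulates that logic in software; the threshold value and the stand-in read/drive functions are assumptions, since a real device would talk to the photo-interrupter and solenoid driver directly.

# Conceptual sketch of threshold-based tactile feedback; values are illustrative.
DARK_THRESHOLD = 0.4  # normalized reflectivity; hypothetical value

def read_reflectivity(image_row, x):
    """Stand-in for sampling the photo-interrupter over pixel x of a scanned row."""
    return image_row[x]

def drive_motor(on):
    """Stand-in for switching the vibrating solenoid motor."""
    print("vibrate" if on else "still")

# A scanned row: 1.0 = white paper, 0.0 = black line.
row = [1.0, 1.0, 0.1, 0.05, 1.0, 1.0]
for x in range(len(row)):
    drive_motor(read_reflectivity(row, x) < DARK_THRESHOLD)
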
Article
This study was performed to determine intra-aneurysm sac pressure of abdominal aortic aneurysm after endovascular aneurysm repair in patients considered successfully treated with aneurysm shrinkage and absence of endovascular leakage. In 10 patients with median aneurysm shrinkage of 12 mm (range, 7 to 22 mm) and median follow-up of 19 months (range, 14-43 months), a percutaneous translumbar intra-aneurysm pressure measurement was made with a 0.014-inch guide wire-mounted pressure sensor and compared with intra-aortic pressure. Median intra-aneurysm systolic/diastolic/mean pressure was 19/18/19 mm Hg (range, 17-35/13-33/17-31) compared with median intra-aortic pressure of 135/75/99 mm Hg (range, 126-199/60-95/84-129). Mean intra-aneurysm pressure was 20% of mean intra-aortic pressure (range, 13%-33%). Pulsatility was negligible. Successful endovascular aneurysm repair of abdominal aortic aneurysm results in considerable pressure reduction in the aneurysm sac. The ability to monitor intra-aneurysm pressure provides hemodynamic information within the sac, which can be used in conjunction with imaging to determine whether a secondary intervention is warranted.
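
A quick back-of-the-envelope check of the reported ratio (a reader's calculation, not from the paper), using the median mean pressures:

sac_mean = 19     # median mean intra-aneurysm pressure, mm Hg
aortic_mean = 99  # median mean intra-aortic pressure, mm Hg
print(sac_mean / aortic_mean)  # about 0.19, consistent with the reported ~20%
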
Conference Paper
Full-text available
Existing drawing tools for blind users give inadequate contextual feedback on the state of the drawing, leaving blind users unable to comprehend and successfully produce graphical information. We have investigated a tactile method of drawing used by blind users that mimics drawing with a pencil and paper. Our study revealed a set of properties that must be incorporated into drawing tools for blind users, including giving feedback for relocating important points, determining angles, and communicating the overall structure of the drawing. We describe a grid-based model that provides these properties in a primitive-based 2D graphics environment, and we introduce its use in drawing and other graphical interactions. KEYWORDS: Non-visual drawing tools, GUIs for blind users, contextual inquiry, feedback, grid
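
A minimal sketch of grid-based feedback of the kind argued for above: snap a pen position to a coarse grid cell and report the angle of the segment between two snapped points. The cell size and the message format are assumptions for illustration, not the paper's model.

# Illustrative grid-snapping and angle feedback; values are hypothetical.
import math

CELL = 20  # grid cell size in pixels

def snap(p):
    """Snap a point to the centre of its grid cell."""
    return (round(p[0] / CELL) * CELL, round(p[1] / CELL) * CELL)

def angle_between(a, b):
    """Angle of the segment a->b in degrees, suitable for spoken feedback."""
    return math.degrees(math.atan2(b[1] - a[1], b[0] - a[0]))

start = snap((23, 41))     # -> (20, 40)
end = snap((98, 119))      # -> (100, 120)
print(f"segment from cell {start} to cell {end}, "
      f"angle {angle_between(start, end):.0f} degrees")
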
Article
Full-text available
Data visualization is a technique used to explore real or simulated data by representing it in a form more suitable for comprehension. This form is usually visual since vision provides a means to perceive large quantities of spatial information quickly. However, people who are blind or visually impaired must rely on other senses to accomplish this perception. Haptic interface technology makes digital information tangible, which can provide an additional medium for data exploration and analysis. Unfortunately, the amount of information that can be perceived through a haptic interface is considerably less than that which can be perceived through vision, so a haptic environment must be enhanced to aid the comprehension of the display. This enhancement includes speech output and the addition of object properties such as friction and texture. Textures are generated which can be modified according to a characteristic or property of the object to which they are applied. For example, textures can be used as an analog to color in graphical displays to highlight variations in data. Taking all of these factors into account, methods for representing various forms of data are presented here with the goal of providing a haptic visualization system without the need for a visual component. The data forms considered include one-, two-, and three-dimensional (1-D, 2-D, and 3-D) data which can be rendered using points, lines, surfaces, or vector fields similar to traditional graphical displays. The end result is a system for the haptic display of these common data sets which is accessible for people with visual impairments.
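
The "texture as an analog to color" idea can be sketched as a simple mapping from a scalar data value to texture parameters; the ranges, the function texture_params, and the sinusoidal texture model below are assumptions for illustration, not the paper's renderer.

# Illustrative mapping of a data value to haptic texture parameters.
import math

def texture_params(value, v_min, v_max,
                   amp_range=(0.1, 1.0), freq_range=(5.0, 50.0)):
    """Map a data value to (amplitude, bumps per unit length) of a texture."""
    t = (value - v_min) / (v_max - v_min)
    amp = amp_range[0] + t * (amp_range[1] - amp_range[0])
    freq = freq_range[0] + t * (freq_range[1] - freq_range[0])
    return amp, freq

def texture_height(x, amp, freq):
    """Height offset a haptic renderer would add at surface position x."""
    return amp * math.sin(2 * math.pi * freq * x)

amp, freq = texture_params(0.75, 0.0, 1.0)
print(amp, freq, texture_height(0.01, amp, freq))
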
Article
A system for the creation of computer-generated sound patterns of two-dimensional line graphs is described. The objective of the system is to provide the blind with a means of understanding line graphs in the holistic manner used by those with sight. A continuously varying pitch is used to represent motion in the x direction. To test the feasibility of using sound to represent graphs, a prototype system was developed and human factors experiments were performed. Fourteen subjects were used to compare the tactile-graph methods normally used by the blind with these new sound graphs. It was discovered that mathematical concepts such as symmetry, monotonicity, and the slopes of lines could be determined quickly using sound. Even better performance may be expected with additional training. The flexibility, speed, cost-effectiveness, and greater measure of independence provided to the blind or sight-impaired by these methods were demonstrated.
Article
By applying multidimensional scaling procedures and other quantitative analyses to perceptual dissimilarity judgments, we compared the perceptual structure of visual line graphs depicting simulated time series data with that of auditory displays (musical graphs) presenting the same data. Highly similar and meaningful perceptual structures were demonstrated for both auditory and visual modalities, showing that important data characteristics (function slope, shape, and level) were perceptually salient in either presentation mode. Auditory graphics may be a highly useful alternative to traditional visual graphics for a variety of data presentation applications.
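
As a hedged sketch of the kind of analysis described above, multidimensional scaling can be applied to a matrix of pairwise dissimilarity judgments to recover a low-dimensional perceptual configuration. The toy matrix below is fabricated purely for illustration, and scikit-learn's MDS is one common implementation, not necessarily the procedure used in the study.

# Illustrative MDS on a toy dissimilarity matrix (not the study's data).
import numpy as np
from sklearn.manifold import MDS

# Symmetric dissimilarity judgments for four displays (0 = identical).
d = np.array([[0.0, 1.0, 4.0, 5.0],
              [1.0, 0.0, 3.5, 4.5],
              [4.0, 3.5, 0.0, 1.2],
              [5.0, 4.5, 1.2, 0.0]])

mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(d)   # 2-D perceptual configuration
print(coords)
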
Conference Paper
In order to enhance operator performance and understanding within remote environments, most research and development of telepresence systems has been directed towards improving the fidelity of the link between operator and environment. Although higher-fidelity interfaces are important to the advancement of telepresence systems, this paper describes the beneficial effects of corrupting the link between operator and remote environment by introducing abstract perceptual information, called virtual fixtures, into the interface.
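
One way to picture a virtual fixture in software, as a rough analogue of the idea above, is a guide that attenuates commanded motion away from a desired path, much as a ruler constrains a pencil. The blending scheme, the stiffness value, and the function apply_fixture below are assumptions for illustration, not the paper's implementation.

# Conceptual 2-D "guide line" fixture; values and blending are hypothetical.
def apply_fixture(cmd, guide_dir, stiffness=0.8):
    """Blend a commanded 2-D motion with its projection onto a unit guide direction."""
    dot = cmd[0] * guide_dir[0] + cmd[1] * guide_dir[1]
    along = (dot * guide_dir[0], dot * guide_dir[1])   # component along the guide
    return (stiffness * along[0] + (1 - stiffness) * cmd[0],
            stiffness * along[1] + (1 - stiffness) * cmd[1])

# Guide the operator along the x axis; off-axis motion is attenuated.
print(apply_fixture((1.0, 1.0), (1.0, 0.0)))   # -> (1.0, 0.2)
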
Article
This paper describes a method of presenting structured audio messages, earcons, in parallel so that they take less time to play and can better keep pace with interactions in a human-computer interface. The two component parts of a compound earcon are played in parallel so that the time taken is only that of a single part. An experiment was conducted to test the recall and recognition of parallel compound earcons as compared to serial compound earcons. Results showed that there are no differences in the rates of recognition between the two groups. Non-musicians are also shown to be equal in performance to musicians. Some extensions to the earcon creation guidelines of Brewster, Wright and Edwards are put forward based upon research into auditory stream segregation. Parallel earcons are shown to be an effective means of increasing the presentation rates of audio messages without compromising recognition rates. (C) 1995 Academic Press Limited
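
The core of the parallel-compound idea can be sketched by mixing two short motifs into one buffer instead of concatenating them, so playback takes the time of a single motif. The frequencies, note lengths, and helper names below are illustrative assumptions, not the earcons used in the study.

# Minimal sketch: serial vs. parallel presentation of two tone motifs.
import math

RATE = 22050

def motif(freqs, note_len=0.15):
    """Synthesize a simple earcon motif as a list of float samples."""
    out = []
    for f in freqs:
        out += [0.4 * math.sin(2 * math.pi * f * n / RATE)
                for n in range(int(RATE * note_len))]
    return out

def mix(a, b):
    """Play two motifs in parallel by summing them sample by sample."""
    n = max(len(a), len(b))
    a = a + [0.0] * (n - len(a))
    b = b + [0.0] * (n - len(b))
    return [x + y for x, y in zip(a, b)]

serial = motif([440, 550, 660]) + motif([330, 415, 495])       # one after the other
parallel = mix(motif([440, 550, 660]), motif([330, 415, 495]))  # both at once
print(len(serial), len(parallel))   # the parallel version takes half the samples
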
Article
This paper describes "From Dots to Shapes" (FDTS), an auditory platform composed by three classic games ("Simon", "Point Connecting" and "concentration game") for blind and visually impaired pupils. Each game was adapted to work on a concept of the Euclidean geometry (e.g. ) The tool, , is based on sonic and haptic interaction, and therefore could be used by special educators as a help for teaching basic planar geometry.
Sjöström, C. (1999). The IT Potential of Haptics – Touch Access for People with Disabilities. Licentiate thesis, Center for Rehabilitation Engineering Research. Available WWW: http://www.certec.lth.se/doc/touchaccess/

Cutkosky, M.R., & Howe, R.D. (1990). Human Grasp Choice and Robotic Grasp Analysis. In S.T. Venkataraman & T. Iberall (Eds.), Dextrous Robot Hands. New York: Springer-Verlag.