Conference Paper

Adapting haptic game devices for non-visual graph rendering.

... • Multimodal exploration of graphs for statistical data [72,73,30,200]. ...
Article
Abstract: In this research we implemented three different methods for presenting scientific graphs to blind and visually impaired people. Each rendering method employed audition, kinesthetics, or a combination of those two modalities. To allow for distance learning, we used low-cost portable devices for the output graph rendering. The three modes of representation were then compared by three separate groups of blind and visually impaired computer users, each consisting of four participants. Results reveal that the combination of both audio and kinesthetic modalities can be a promising representation medium for common scientific graphs for visually challenged people.
Article
Personal computers, palm top computers, media players and cell phones provide instant access to information from around the world. There are a wide variety of options available to make that information available to people with visual disabilities, so many that choosing one for use in any given context can often feel daunting to someone new to the field of accessibility. This paper reviews tools and techniques for the presentation of textual, graphic, mathematic and web documents through audio and haptic modalities to people with visual disabilities.
Article
This study implemented three different methods for presenting scientific graphs to visually impaired people: audition, kinesthetics, or a combination of the two. The results indicate that the combination of both audio and kinesthetic modalities can be a promising representation medium of common scientific graphs for people who are visually impaired.
Conference Paper
This paper reports on the design of an audio-haptic tool that enables blind computer users to explore a picture through the hearing and feeling modalities. The tool is divided into two entities: a description tool and an exploration tool. The description tool allows moderators (sighted persons) to describe a scene. The scene is first segmented, manually or automatically, into a set of objects (car, tree, house, etc.). For every object, the moderator can define a behavior that corresponds either to an auditory rendering (i.e., using speech or non-speech sound) or to a kinesthetic rendering. The blind person then uses the exploration tool to obtain the audio-haptic rendering of the segmented image previously defined by the moderator. Depending on the nature of the feedback defined (audio, kinesthetic), the blind user interacts with a graphic tablet and/or a force-feedback device.
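The object-behavior mapping described above could be sketched as a small data model; the names and structure below are assumptions for illustration, not the paper's actual implementation. Each segmented scene object carries a behavior that dispatches to either an auditory or a kinesthetic rendering:

```python
# Hypothetical sketch (names are assumptions, not from the paper): each
# segmented object in the described scene is assigned one behavior,
# either an auditory rendering (speech / non-speech sound) or a
# kinesthetic (force-feedback) rendering.
from dataclasses import dataclass
from typing import Literal


@dataclass
class Behavior:
    modality: Literal["auditory", "kinesthetic"]
    payload: str  # e.g. a sound file name or a force-profile identifier


@dataclass
class SceneObject:
    name: str
    behavior: Behavior


def render(obj: SceneObject) -> str:
    # Dispatch to the appropriate output channel based on modality.
    if obj.behavior.modality == "auditory":
        return f"play sound for {obj.name}: {obj.behavior.payload}"
    return f"send force profile for {obj.name}: {obj.behavior.payload}"


# A moderator-described scene, explored object by object:
scene = [
    SceneObject("car", Behavior("auditory", "engine.wav")),
    SceneObject("house", Behavior("kinesthetic", "rough-texture")),
]
for obj in scene:
    print(render(obj))
```

In an actual exploration tool, the dispatch would drive a sound player and a force-feedback device rather than returning strings; the sketch only illustrates the per-object modality assignment the abstract describes.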