February 2001
This paper reports on the design of an audio-haptic tool that enables blind computer users to explore a picture through the auditory and haptic modalities. The tool is divided into two entities: a description tool and an exploration tool. The description tool allows a moderator (a sighted person) to describe a scene. To this end, the scene is first segmented, manually or automatically, into a set of objects (car, tree, house, etc.). For every object, the moderator can define a behavior that corresponds either to an auditory rendering (using speech or non-speech sound) or to a kinesthetic rendering. The blind person then uses the exploration tool to obtain the audio-haptic rendering of the segmented image previously defined by the moderator. Depending on the nature of the feedback defined (audio or kinesthetic), the blind user interacts with a graphic tablet and/or a force-feedback device.
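To make the two-tool architecture concrete, the following is a minimal, hypothetical sketch of how a moderator-described scene object and its rendering behavior might be modeled and dispatched during exploration. All class and function names (SceneObject, Behavior, render, etc.) are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical data model for the description/exploration tools described above.
# Names and structure are assumptions for illustration only.
from dataclasses import dataclass
from enum import Enum, auto


class BehaviorKind(Enum):
    SPEECH = auto()       # auditory rendering using speech
    NON_SPEECH = auto()   # auditory rendering using non-speech sound
    KINESTHETIC = auto()  # haptic rendering via a force-feedback device


@dataclass
class Behavior:
    kind: BehaviorKind
    resource: str  # e.g. a sound file or a force-effect identifier (assumed)


@dataclass
class SceneObject:
    name: str                       # e.g. "car", "tree", "house"
    region: list[tuple[int, int]]   # pixel region from manual/automatic segmentation
    behavior: Behavior              # rendering chosen by the moderator


def play_audio(resource: str) -> None:
    print(f"[audio] playing {resource}")


def apply_force_effect(resource: str) -> None:
    print(f"[haptic] applying force effect {resource}")


def render(obj: SceneObject) -> None:
    """Dispatch an object's behavior to the appropriate output channel
    when the user's pointer enters its region during exploration."""
    if obj.behavior.kind in (BehaviorKind.SPEECH, BehaviorKind.NON_SPEECH):
        play_audio(obj.behavior.resource)          # graphic-tablet path
    else:
        apply_force_effect(obj.behavior.resource)  # force-feedback path


if __name__ == "__main__":
    car = SceneObject(
        name="car",
        region=[(10, 10), (10, 11), (11, 10)],
        behavior=Behavior(BehaviorKind.SPEECH, "car_description.wav"),
    )
    render(car)
```

In this sketch, the description tool would produce a collection of SceneObject records, and the exploration tool would call render whenever the blind user's pointer enters an object's region; the actual paper's pipeline may differ in detail.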