Conference Paper

Touchable Video and Tactile Audio.

DOI: 10.1109/ISM.2009.79 Conference: ISM 2009, 11th IEEE International Symposium on Multimedia, San Diego, California, USA, December 14-16, 2009
Source: DBLP

ABSTRACT We propose a Haptic Audio Visual System (HAVS) consisting of touchable video and tactile audio. Touchable video generates tactile feedback according to the user's real-time interactions with the video. Tactile audio provides tactile experiences through the automatic generation of vibro-tactile stimuli using a sound detection algorithm. The video screen is divided into a grid, and each cell carries customizable tactile information corresponding to the visual content. At the same time, the sound stream is analyzed, and customizable vibration responses are synchronized to specific sound effects or audio "signatures". An important advantage of HAVS is its ability to augment and enhance the immersiveness and interactivity of users in 3D virtual environments. Applications of HAVS in commercial, military, educational, and medical contexts, including home shopping, online gaming, interactive broadcasting, teaching/learning, and teleoperator settings, are discussed.
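The grid-based lookup described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not code from the paper: the grid size, the screen resolution, and the per-cell payload (here, a vibration intensity in [0, 1]) are all assumptions.

```python
# Hypothetical sketch of mapping a touch on the video screen to a grid
# cell and looking up that cell's customizable tactile information.

def make_grid(rows, cols, width, height):
    """Return a function mapping an (x, y) touch in pixels to a (row, col) cell."""
    cell_w, cell_h = width / cols, height / rows

    def locate(x, y):
        col = min(int(x // cell_w), cols - 1)
        row = min(int(y // cell_h), rows - 1)
        return row, col

    return locate

# Each cell carries tactile information for the visual content it covers;
# the intensities below are placeholder values, not data from the paper.
tactile_map = {(0, 0): 0.2, (1, 2): 0.9}

locate = make_grid(rows=4, cols=6, width=640, height=480)
cell = locate(250, 150)                 # touch at pixel (250, 150)
intensity = tactile_map.get(cell, 0.0)  # 0.0 where no tactile info is assigned
```

The same table-lookup structure would let authors customize tactile responses per region of the video, as the abstract describes; the audio-signature side would add a parallel lookup keyed by detected sound events.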

  • ABSTRACT: This paper presents the design and testing of an image contour display system based on a vibrotactile array. The tactile image display system is attached to the user's back. It produces a non-visual image and permits subjects to determine the position, size, and shape of visible objects through vibration stimuli. The system comprises three parts: 1) a USB camera; 2) 48 (6×8) vibrating motors; 3) an ARM microcontroller system. An image is captured with the camera, and its 2D contour is extracted and transformed into vibrotactile stimuli using a "contour following" (time-spatial dynamic coding) pattern. With this system, subjects could identify the shape of an object without special training, while fewer vibrotactile actuators are needed. Preliminary experiments were carried out, and the results demonstrated that the prototype was satisfactory and efficient for the visually impaired as a seeing aid and for environment perception.
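The time-spatial "contour following" idea above can be sketched as replaying an ordered 2D contour as a sequence of single-motor activations on the 6×8 array. Only the array dimensions come from the abstract; the image size, scaling, and duplicate-dropping are assumptions for illustration.

```python
# Hypothetical sketch: map ordered contour points to a time-ordered
# sequence of motor indices on a 6x8 vibrotactile array.

ROWS, COLS = 6, 8  # the 48-motor array from the abstract

def contour_to_sequence(contour, img_w, img_h):
    """Scale (x, y) contour points onto the motor grid and return the
    time-ordered motor indices, dropping consecutive duplicates."""
    seq = []
    for x, y in contour:
        r = min(int(y * ROWS / img_h), ROWS - 1)
        c = min(int(x * COLS / img_w), COLS - 1)
        motor = r * COLS + c
        if not seq or seq[-1] != motor:
            seq.append(motor)
    return seq

# A square traced clockwise in an assumed 320x240 image:
square = [(80, 60), (240, 60), (240, 180), (80, 180)]
print(contour_to_sequence(square, 320, 240))  # → [10, 14, 38, 34]
```

Driving one motor at a time along the contour is what lets a coarse 48-actuator array convey shape: the temporal order carries the information the spatial resolution lacks.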
  • ABSTRACT: Visual and haptic rendering pipelines exist concurrently and compete for computing resources, while the refresh rate of haptic rendering is two orders of magnitude higher than that of visual rendering (1000 Hz vs. 30-50 Hz). However, in many cases 3D visual rendering can be replaced by merely displaying 2D images, releasing resources to image-driven haptic rendering algorithms. These algorithms provide haptic texture rendering in the vicinity of a touch point, but usually require additional information to be augmented with the image to support haptic perception of the geometry of the shapes displayed. We propose a framework for making images tangible that allows haptic perception of three features: scene geometry, texture, and physical properties. The haptic geometry rendering technique uses depth information, which can be acquired in a multitude of ways, to provide haptic interaction with images and videos in real time. The presented method neither performs 3D reconstruction nor requires polygonal models. It is based on direct force calculation and allows smooth haptic interaction even at object boundaries. We also propose dynamic mapping of the haptic workspace in real time to enable sensation of fine surface details. Alternatively, one of the existing shading-based haptic texture rendering methods can be combined with the proposed haptic geometry rendering algorithm to provide believable interaction. Haptic perception of physical properties is achieved by automatic segmentation of an image into haptic regions and interactive assignment of physical properties to them.
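A direct force calculation from depth, as the abstract describes, could look roughly like the sketch below. This is an illustrative assumption, not the paper's algorithm: central finite differences stand in for whatever gradient estimate the real method uses, and the stiffness constant `k` is made up.

```python
# Hypothetical sketch: derive a lateral haptic force at a touch point
# directly from a depth map, with no 3D reconstruction or polygonal model.
# The force follows the negative depth gradient, pushing the haptic probe
# away from rising surfaces.

def haptic_force(depth, x, y, k=1.0):
    """Return (fx, fy) from central-difference depth gradients at (x, y),
    clamped at the image borders."""
    h, w = len(depth), len(depth[0])
    x0, x1 = max(x - 1, 0), min(x + 1, w - 1)
    y0, y1 = max(y - 1, 0), min(y + 1, h - 1)
    gx = (depth[y][x1] - depth[y][x0]) / (x1 - x0)
    gy = (depth[y1][x] - depth[y0][x]) / (y1 - y0)
    return -k * gx, -k * gy

# Toy 3x3 depth map with a single bump in the center:
depth = [
    [0.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0],
]
fx, fy = haptic_force(depth, 0, 1)  # approaching the bump from the left
```

Because the force comes straight from per-pixel depth values, it stays well defined at object boundaries, which is consistent with the smooth boundary behavior the abstract claims for direct force calculation.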