Article

Seeing with the Brain.

Department of Orthopedics and Rehabilitation, University of Wisconsin–Madison, Madison, Wisconsin, United States
International Journal of Human-Computer Interaction (Impact Factor: 0.72). 04/2003; 15(2):285-295. DOI: 10.1207/S15327590IJHC1502_6
Source: DBLP

ABSTRACT We see with the brain, not the eyes (Bach-y-Rita, 1972); images that pass through our pupils go no further than the retina. From there, image information travels to the rest of the brain by means of coded pulse trains, and the brain, being highly plastic, can learn to interpret them in visual terms. Perceptual levels of the brain interpret the spatially encoded neural activity, modified and augmented by nonsynaptic and other brain plasticity mechanisms (Bach-y-Rita, 1972, 1995, 1999, in press). However, the cognitive value of that information is not merely a process of image analysis. Perception of the image relies on memory, learning, contextual interpretation (e.g., we perceive the intent of the driver in the slight lateral movements of a car in front of us on the highway), and cultural and other social factors that are probably exclusively human characteristics and that provide "qualia" (Bach-y-Rita, 1996b). This is the basis for our tactile vision substitution system (TVSS) studies, which, starting in 1963, have demonstrated that visual information and the subjective qualities of seeing can be obtained tactually using sensory substitution systems. The descriptions of studies with this system have been taken
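The core mechanism behind the TVSS described above is a mapping from a camera image to a coarse array of tactile stimulators whose drive levels track local image brightness. The following is a minimal sketch of that mapping in Python; the 20 x 20 grid size, the block-averaging scheme, and the drive-level range are illustrative assumptions, not parameters from the article.

    import numpy as np

    def frame_to_tactile(frame, rows=20, cols=20, max_drive=255):
        """Block-average a grayscale frame onto a rows x cols stimulator array."""
        h, w = frame.shape
        bh, bw = h // rows, w // cols
        # Crop so the frame divides evenly into blocks, then average each block.
        cropped = frame[:bh * rows, :bw * cols]
        blocks = cropped.reshape(rows, bh, cols, bw).mean(axis=(1, 3))
        # Map block brightness to stimulator drive levels in 0..max_drive.
        span = np.ptp(blocks) + 1e-9
        drive = (blocks - blocks.min()) / span * max_drive
        return drive.astype(np.uint8)

    # Synthetic 200 x 200 "camera frame": a bright square on a dark background.
    frame = np.zeros((200, 200))
    frame[60:140, 60:140] = 1.0
    tactile = frame_to_tactile(frame)
    print(tactile.shape)                   # (20, 20) stimulator grid
    print(tactile[10, 10], tactile[0, 0])  # bright center vs. dark corner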

  • Source
    • "In the research reported here we have picked up on the work presented in [1] [2] [3] [4] [5] in order to investigate the learning potential of humans with regard to shape representations on the tongue using electro-tactile arrays. However, in line with [11] "
    ABSTRACT: This paper details an investigation into sensory substitution by means of direct electrical stimulation of the tongue for the purpose of information input to the human brain. In particular, a device has been constructed and a series of trials have been performed in order to demonstrate the efficacy and performance of an electro-tactile array mounted on the tongue surface for the purpose of sensory augmentation. Tests have shown that, using a low-resolution array, a computer-human feedback loop can be successfully employed by humans to complete tasks such as object tracking, surface shape identification, and shape recognition with no training or prior experience with the device. Comparisons of this technique with visual alternatives show that the tongue-based tactile array can match such methods in convenience and accuracy when performing simple tasks.
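    As a rough illustration of the kind of low-resolution encoding this abstract describes, the sketch below maps a normalized target position onto a small electrode grid, the sort of pattern a user might follow in an object-tracking task. The 10 x 10 grid, the single active electrode, and the coordinate convention are assumptions made for illustration; they are not taken from the paper.

        import numpy as np

        ROWS, COLS = 10, 10   # assumed electrode grid resolution

        def render_target(x, y):
            """Map a normalized target position (x, y in [0, 1]) to an
            activation pattern over the electrode grid (1 = stimulate)."""
            pattern = np.zeros((ROWS, COLS), dtype=np.uint8)
            r = min(int(y * ROWS), ROWS - 1)
            c = min(int(x * COLS), COLS - 1)
            pattern[r, c] = 1
            return pattern

        # Target slightly right of center and near the top of the field.
        for row in render_target(x=0.7, y=0.2):
            print("".join("#" if e else "." for e in row))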
  • Source
    • "In order to increase the sensing range, Electronic Travel Aids (ETAs) combine various sensors to scan the environment and provide different types of feedback [2][3]. Various approaches are used to represent the spatial information to the user, such as vibrations as used in the Ultracane [4], 3D auditory representations [5][6], transforming the 3D image into a bas-relief surface that can be tactily explored [7], etc. "
    ABSTRACT: This paper proposes an instrumented handle with multimodal augmented haptic feedback, which can be integrated into a conventional white cane to extend the haptic exploration range of visually impaired users. The information extracted from the environment through a hybrid range sensor is conveyed to the user in an intuitive manner over two haptic feedback systems. The first renders impulses that imitate the impact of a real cane with a distant obstacle. In combination with the range sensors, this system allows the user to "touch" and explore remote objects, thus compensating for the limited range of the conventional white cane without altering its intuitive usage. The impulses are generated by storing kinetic energy in a spinning inertia wheel, which is released by abruptly braking the wheel. Furthermore, a vibrotactile interface integrated into the ergonomic handle conveys the distance to obstacles to the user. Three vibrating motors located along the index finger and hand are activated in different spatiotemporal patterns to induce a sense of distance through apparent movement. The realized augmented white cane not only increases the safety of the user by detecting obstacles from a greater distance and alerting the user to those located at head level, but also allows the user to build extended spatial mental models through the increased sensing range, thereby enabling anticipated decision making and thus more natural navigation.
    3rd IEEE RAS and EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob); 10/2010
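    One way to read the distance coding described in the abstract above is as a mapping from measured obstacle distance to a timed sequence of pulses across the three motors, with faster sweeps for nearer obstacles. The sketch below is a hedged approximation of that idea; the distance range, onset intervals, and pulse durations are invented for illustration and are not the paper's actual parameters.

        def distance_to_pattern(distance_m, max_range_m=3.0):
            """Return a list of (motor_index, onset_s, duration_s) pulses.

            Closer obstacles produce a faster sweep across the three motors,
            which is intended to read as movement toward the hand.
            """
            if distance_m > max_range_m:   # beyond assumed sensing range: no feedback
                return []
            # Sweep interval shrinks as the obstacle gets closer (assumed mapping).
            interval = 0.05 + 0.10 * (distance_m / max_range_m)
            duration = 0.08
            # Motor 0 is assumed nearest the fingertip, motor 2 nearest the wrist.
            return [(motor, round(motor * interval, 3), duration) for motor in range(3)]

        for d in (2.5, 1.0, 0.3):
            print(f"{d:.1f} m -> {distance_to_pattern(d)}")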
  • Source
    • "We have developed an electrotactile stimulation (ETS) system for the human tongue that can be used to present either information for a variety of sensory substitution applications, e.g. vision, or head orientation in balance control [1] [2] [3] [4] [5], or for neuromodulation and rehabilitation after neural injury [6] [7] [8]. "
    ABSTRACT: We have developed a novel, tongue-based electrotactile brain-machine interface. Variability of the tactile sensation intensity across the stimulated area, however, limits the amount of reliable information transmission. We conducted an experiment to characterize local sensitivity across the region stimulated by the array. From these data we constructed an isointensity algorithm to compensate for the variability in electrotactile sensation levels across the stimulated area of the tongue.
    Conference Proceedings of the ... Annual International Conference of the IEEE Engineering in Medicine and Biology Society; 01/2009; 2009:559-62. DOI: 10.1109/IEMBS.2009.5334556
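    The compensation idea can be pictured as turning a measured local-sensitivity map into per-electrode gain factors, so that a uniform commanded intensity feels roughly uniform across the array. The sketch below shows one such inverse-sensitivity scaling under assumed example values; it is not the paper's actual isointensity algorithm, whose details are not given here.

        import numpy as np

        def isointensity_gains(sensitivity):
            """Gain per electrode, relative to the most sensitive site.

            Less sensitive regions receive a proportionally larger drive amplitude.
            """
            sensitivity = np.asarray(sensitivity, dtype=float)
            return sensitivity.max() / sensitivity

        def compensate(commanded, sensitivity, max_drive=1.0):
            """Scale a commanded intensity map by the per-electrode gains."""
            drive = commanded * isointensity_gains(sensitivity)
            return np.clip(drive, 0.0, max_drive)

        # Assumed relative sensitivities for a 3x3 patch (tip most sensitive).
        sens = [[1.0, 0.9, 0.8],
                [0.8, 0.7, 0.6],
                [0.6, 0.5, 0.4]]
        commanded = np.full((3, 3), 0.3)   # uniform target intensity
        print(compensate(commanded, sens))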