Article

Seeing with the Brain.

Department of Orthopedics and Rehabilitation, University of Wisconsin–Madison, Madison, Wisconsin, United States
International Journal of Human-Computer Interaction (Impact Factor: 0.72). 04/2003; 15(2):285-295. DOI: 10.1207/S15327590IJHC1502_6
Source: DBLP

ABSTRACT We see with the brain, not the eyes (Bach-y-Rita, 1972); images that pass through our pupils go no further than the retina. From there, image information travels to the rest of the brain by means of coded pulse trains, and the brain, being highly plastic, can learn to interpret them in visual terms. Perceptual levels of the brain interpret the spatially encoded neural activity, modified and augmented by nonsynaptic and other brain plasticity mechanisms (Bach-y-Rita, 1972, 1995, 1999, in press). However, the cognitive value of that information is not merely a process of image analysis. Perception of the image relies on memory, learning, contextual interpretation (e.g., we perceive the intent of the driver in the slight lateral movements of a car in front of us on the highway), and cultural and other social factors that are probably exclusively human characteristics that provide "qualia" (Bach-y-Rita, 1996b). This is the basis for our tactile vision substitution system (TVSS) studies that, starting in 1963, have demonstrated that visual information and the subjective qualities of seeing can be obtained tactually using sensory substitution systems. The description of studies with this system has been taken
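The TVSS principle described in the abstract, a camera image re-encoded as a spatial pattern of tactile stimulation, can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the actual TVSS hardware or software: the function name `image_to_tactors`, the 20 × 20 grid size, and the average-pooling scheme are all assumptions made for the sketch.

```python
import numpy as np

def image_to_tactors(img, grid=(20, 20)):
    """Downsample a grayscale image to a coarse grid of tactor drive levels,
    analogous to mapping camera pixels onto a tactile stimulator array."""
    h, w = img.shape
    gh, gw = grid
    # Average-pool blocks of pixels into one drive level per tactor
    pooled = img[:h - h % gh, :w - w % gw].astype(float)
    pooled = pooled.reshape(gh, h // gh, gw, w // gw).mean(axis=(1, 3))
    return pooled / 255.0  # normalized drive levels in [0, 1]

# A synthetic 100x100 frame: dark left half, bright right half
img = np.zeros((100, 100), dtype=np.uint8)
img[:, 50:] = 255
drive = image_to_tactors(img)  # shape (20, 20)
```

Each tactor's drive level would then modulate the vibration or current delivered at that point on the skin; the brain, as the abstract argues, learns to interpret the resulting spatial pattern in visual terms.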


Available from: Mitchell Tyler, Apr 21, 2014
    ABSTRACT: Many applications use electrostimulation of the human skin to provide tactile sensation. The effects of electrotactile stimulation were studied on a 6 × 6 matrix of tactile electrodes placed on the anterior part of the tongue. The liminal threshold was investigated with continuous and discontinuous waveforms and with two- and four-electrode patterns. The results suggest that, to save energy and improve yield, it would probably be better to use discontinuous stimulation with two-electrode patterns.
    ABSTRACT: We present a new method of sensing the 3D visual environment and controlling objects within it. The human hand is well suited to interrogate and manipulate objects by physical contact; however, the hand is limited to surfaces within its reach. Extending the hand's innate ability, we mount miniature cameras on individual fingertips, permitting rapid sweeping through the 3D visual environment at greater distances. The information gleaned from each fingertip camera is fed back to that finger by a small vibrator, so the sense of touch remains related to each finger's individual interaction with the environment. The muscles of the fingers, wrist, and arm can potentially provide motor control as fine as the eyes' (and we have ten fingers instead of only two eyes), although the sensory resolution and total bandwidth is likely to be less. Metaphorically speaking, we have given eyesight to the fingers, and thus we call the resulting capability "Fingersight." In addition to sensing the environment, we can remotely control certain identified targets with subsequent motion of the fingers. We can move or rotate objects on a computer screen or recognize inanimate objects such as a light switch, controlling environmental parameters with a flick of the finger.
    Haptic, Audio and Visual Environments and Games, 2007. HAVE 2007. IEEE International Workshop on; 11/2007
    ABSTRACT: In this paper we propose an assistive technology (AT) device for individuals with visual impairments. This device would extract motion information from images acquired from a capture device and present the information through a tactile interface. The algorithm utilized provides information about the amount of motion present in the field of view and the centroid of the motion. Depending upon the number of stimulators and the location of the active stimulators, the direction and size of the objects could be determined. Currently, a simulation of the operation is performed using a computer to process images captured from an inexpensive camera. The primary goal of this design is to provide an AT device that is practical, cost-effective, and useful for individuals with visual impairments.
    Bioengineering Conference, 2006. Proceedings of the IEEE 32nd Annual Northeast; 05/2006
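    The motion-extraction step described in the abstract above (amount of motion plus its centroid, mapped onto a small grid of stimulators) can be sketched with simple frame differencing. This is a hypothetical illustration, not the authors' published algorithm; the function names `motion_centroid` and `to_stimulator`, the difference threshold, and the 4 × 4 stimulator grid are assumptions made for the sketch.

    ```python
    import numpy as np

    def motion_centroid(prev, curr, thresh=30):
        """Frame-difference motion: returns (amount, centroid) or (0, None)."""
        diff = np.abs(curr.astype(int) - prev.astype(int)) > thresh
        amount = int(diff.sum())          # number of changed pixels
        if amount == 0:
            return 0, None
        ys, xs = np.nonzero(diff)
        return amount, (ys.mean(), xs.mean())

    def to_stimulator(centroid, shape, grid=(4, 4)):
        """Map an image-plane centroid onto a coarse stimulator grid cell."""
        y, x = centroid
        r = min(int(y / shape[0] * grid[0]), grid[0] - 1)
        c = min(int(x / shape[1] * grid[1]), grid[1] - 1)
        return r, c

    # Two synthetic 64x64 frames: a bright patch appears in the lower left
    prev = np.zeros((64, 64), dtype=np.uint8)
    curr = prev.copy()
    curr[40:48, 8:16] = 255
    amt, cen = motion_centroid(prev, curr)
    cell = to_stimulator(cen, curr.shape)
    ```

    The motion amount could drive stimulation intensity and the grid cell could select which stimulator is active, giving the direction and rough size cues the abstract describes.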