Charles M. Higgins's research while affiliated with The University of Arizona and other places

Publications (34)

Article
Full-text available
We have developed a neural network model capable of performing visual binding inspired by neuronal circuitry in the optic glomeruli of flies: a brain area that lies just downstream of the optic lobes where early visual processing is performed. This visual binding model is able to detect objects in dynamic image sequences and bind together their res...
Article
Full-text available
Visual binding is the process of associating the responses of visual interneurons in different visual submodalities all of which are responding to the same object in the visual field. Recently identified neuropils in the insect brain termed optic glomeruli reside just downstream of the optic lobes and have an internal organization that could suppor...
Article
Full-text available
When imitating biological sensors, we do not yet understand the early processing of the input well enough to reproduce it artificially. Building hybrid systems with both artificial and real biological components is a promising solution. For example, when a dragonfly is used as a living sensor, the early processing of visual information is performed fully...
Article
Collision avoidance models derived from the study of insect brains do not perform universally well in practical collision scenarios, although the insects themselves may perform well in similar situations. In this article, we present a detailed simulation analysis of two well-known collision avoidance models and illustrate their limitations. In doin...
Article
Motion-sensitive neurons in the visual systems of many species, including humans, exhibit a depression of motion responses immediately after being exposed to rapidly moving images. This motion adaptation has been extensively studied in flies, but a neuronal mechanism that explains the most prominent component of adaptation, which occurs regardless...
Article
Insect navigational behaviors including obstacle avoidance, grazing landings, and visual odometry are dependent on the ability to estimate flight speed based only on visual cues. In honeybees, this visual estimate of speed is largely independent of both the direction of motion and the spatial frequency content of the image. Electrophysiological rec...
Article
Insects use visual estimates of flight speed for a variety of behaviors, including visual navigation, odometry, grazing landings and flight speed control, but the neuronal mechanisms underlying speed detection remain unknown. Although many models and theories have been proposed for how the brain extracts the angular speed of the retinal image, term...
Article
A sensor was designed to compute speed at the image focal plane for robotic navigation. It employs an array of parallel sensing and computing blocks, and outputs a signal that varies linearly with image speed.
Article
We present a novel analog VLSI implementation of visual motion computation based on the lateral inhibition and positive feedback mechanisms that are inherent in the hysteretic winner-take-all circuit. By use of an input-dependent bias current and threshold mechanism, the circuit resets itself to prepare for another motion computation. This implemen...
Article
The objective of this study is to improve the quality of life for the visually impaired by restoring their ability to self-navigate. In this paper we describe a compact, wearable device that converts visual information into a tactile signal. This device, constructed entirely from commercially available parts, enables the user to perceive distant ob...
Article
The Hassenstein-Reichardt (HR) correlation model is commonly used to model elementary motion detection in the fly. Recently, a neuronally-based computational model was proposed which, unlike the HR model, is based on identified neurons. The response of both models increases as the square of contrast, although the response of insect neurons saturates a...
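The core of the HR correlation scheme can be illustrated in a few lines. The sketch below is a discrete-time toy version, not the paper's neuronal model: the first-order low-pass delay stage, time constant, and time step are all assumed for illustration.

```python
import numpy as np

def hr_emd(left, right, tau=1.0, dt=0.1):
    """Toy discrete-time Hassenstein-Reichardt elementary motion
    detector. left, right: luminance over time at two adjacent
    photoreceptors. A first-order low-pass filter serves as the delay
    stage; the two mirror-symmetric half-detectors are subtracted
    (opponency). tau and dt are assumed values."""
    def lowpass(x):
        y = np.zeros_like(x, dtype=float)
        a = dt / (tau + dt)
        for i in range(1, len(x)):
            y[i] = y[i - 1] + a * (x[i] - y[i - 1])
        return y
    # Delayed left correlated with right, minus delayed right with left.
    return lowpass(left) * right - left * lowpass(right)
```

Because the output is a product of two signals that are each linear in the input, doubling the stimulus contrast quadruples the response: the quadratic dependence that the abstract contrasts with the saturating responses of real insect neurons.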
Article
Using a neuronally-based computational model of the fly's visual elementary motion detection (EMD) system, the effects of picrotoxin, a GABA receptor antagonist, were modeled to investigate the role of various GABAergic cells in direction selectivity. By comparing the results of our simulation of an anatomically correct model to previously published e...
Article
The computation of local visual motion can be accomplished very efficiently in the focal plane with custom very large-scale integration (VLSI) hardware. Algorithms based on measurement of the spatial and temporal frequency content of the visual motion signal, since they incorporate no thresholding operation, allow highly sensitive responses to low...
Article
Visual motion information provides a variety of clues that enable biological organisms from insects to primates to efficiently navigate in unstructured environments. We present modular mixed-signal very large-scale integration (VLSI) implementations of the three most prominent biological models of visual motion detection. A novel feature of these d...
Article
Taking inspiration from the visual system of the fly, we describe and characterize a monolithic analog very large-scale integration sensor, which produces control signals appropriate for the guidance of an autonomous robot to visually track a small moving target. This sensor is specifically designed to allow such tracking even from a moving imaging...
Article
Flies have the capability to visually track small moving targets, even across cluttered backgrounds. Previous computational models, based on figure detection (FD) cells identified in the fly, have suggested how this may be accomplished at a neuronal level based on information about relative motion between the target and the background. We experimen...
Article
Behavioral experiments suggest that insects make use of the apparent image speed on their compound eyes to navigate through obstacles, control flight speed, land smoothly, and measure the distance they have flown. However, the vast majority of electrophysiological recordings from motion-sensitive insect neurons show responses which are tuned in spa...
Article
Full-text available
Based on comparative anatomical studies and electrophysiological experiments, we have identified a conserved subset of neurons in the lamina, medulla, and lobula of dipterous insects that are involved in retinotopic visual motion direction selectivity. Working from the photoreceptors inward, this neuronal subset includes lamina amacrine (alpha) cel...
Conference Paper
Tracking of a target in a cluttered environment requires extensive computational architecture. However, even a small housefly is adept at pursuing its prey. Biomimetic algorithms suggest a way of looking at this problem. In the lobula plate of a fly's brain, a neural circuit is hypothesized based on a tangential cell called the figure detection (FD...
Article
We introduce a biologically inspired computational architecture for small-field detection and wide-field spatial integration of visual motion based on the general organizing principles of visual motion processing common to organisms from insects to primates. This highly parallel architecture begins with two-dimensional (2-D) image transduction and...
Conference Paper
Biological motion sensors found in the retinas of species ranging from flies to primates are tuned to specific spatio-temporal frequencies to determine the local motion vectors in their visual field and perform complex motion computations. In this study, we present a novel implementation of a silicon retina based on the Adelson-Bergen spatio-temporal...
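The Adelson-Bergen spatio-temporal energy scheme referenced above can be sketched in software before being committed to silicon. The version below is a minimal toy model, not the silicon-retina circuit: the Gabor spatial filters, gamma-shaped temporal filters, and all parameter values are assumptions. It combines quadrature filter pairs into a direction-signed opponent motion energy.

```python
import numpy as np

def opponent_motion_energy(stim, sigma=2.0, f=0.1, tau_f=2.0, tau_s=4.0):
    """Toy sketch of the Adelson-Bergen spatio-temporal energy model
    (filter shapes and parameters are assumed, not from the chip).
    stim is indexed [time, space]; returns the opponent motion energy."""
    T, X = stim.shape
    x = np.arange(X) - X // 2
    t = np.arange(T, dtype=float)
    # Quadrature pair of spatial filters (even and odd Gabors).
    env = np.exp(-x**2 / (2 * sigma**2))
    s_even = env * np.cos(2 * np.pi * f * x)
    s_odd = env * np.sin(2 * np.pi * f * x)
    # Two causal temporal filters with different latencies.
    t_fast = (t / tau_f) * np.exp(-t / tau_f)
    t_slow = (t / tau_s) * np.exp(-t / tau_s)
    def respond(sp, tm):
        # Filter each frame in space, then filter each column in time.
        spat = np.array([np.convolve(row, sp, mode='same') for row in stim])
        return np.array([np.convolve(spat[:, i], tm)[:T]
                         for i in range(X)]).T
    a, b = respond(s_even, t_fast), respond(s_odd, t_slow)
    c, d = respond(s_odd, t_fast), respond(s_even, t_slow)
    # Energies of the two oriented quadrature pairs, subtracted to
    # give a direction-signed opponent output.
    return (a + b)**2 + (c - d)**2 - (a - b)**2 - (c + d)**2
```

For a drifting sinusoidal grating, the sign of the mean output flips when the direction of motion reverses.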
Article
Engineers have a lot to gain from studying biology. The study of biological neural systems alone provides numerous examples of computational systems that are far more complex than any man-made system and perform real-time sensory and motor tasks in a manner that humbles the most advanced artificial systems. Despite the evolutionary genesis of these...
Article
The extent of pixel-parallel focal plane image processing is limited by pixel area and imager fill factor. In this paper, we describe a novel multi-chip neuromorphic VLSI visual motion processing system which combines analog circuitry with an asynchronous digital interchip communications protocol to allow more complex pixel-parallel motion processi...
Article
We present two compact CMOS integrated circuits for computing the 2D local direction of motion of an image focused directly onto the chip. These circuits incorporate onboard photoreceptors and focal plane motion processing. With fully functional 14×13 and 12×13 implementations consuming less than 50 μW per pixel, we conclude that practic...
Article
We present two compact CMOS integrated circuits for computing the two-dimensional (2-D) local direction of motion of an image focused directly onto the chip. These circuits incorporate onboard photoreceptors and focal plane motion processing. With fully functional 14×13 and 12×13 implementations consuming less than 50 μW per pixel, we conclude that...
Article
Optical flow fields are a primary source of information about the visual scene in technical and biological systems. In a step towards a system for real time scene analysis we have developed two new algorithms for the parallel computation of the direction of motion field in 2-D. We have successfully implemented these algorithms in analog VLSI hard...
Conference Paper
We describe a multi-chip CMOS VLSI visual motion processing system which combines analog circuitry with an asynchronous digital interchip communications protocol to allow more complex motion processing than is possible with all the circuitry in the focal plane. The two basic VLSI building blocks are a sender chip which incorporates a 2D imager arra...
Article
We describe a multi-chip CMOS VLSI visual motion processing system which combines analog circuitry with an asynchronous digital interchip communications protocol to allow more complex motion processing than is possible with all the circuitry in the focal plane. The two basic VLSI building blocks are a sender chip which incorporates a 2D imager arra...
Article
A robust, integrative algorithm is presented for computing the position of the focus of expansion or axis of rotation (the singular point) in optical flow fields such as those generated by self-motion. Measurements are shown of a fully parallel CMOS analog VLSI motion sensor array which computes the direction of local motion (sign of optical flow)...
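The idea of locating the singular point from sign-only flow can be illustrated with a toy 1-D version: under pure expansion, horizontal flow is negative left of the focus of expansion and positive to its right, so integrating the signs localizes the crossing even when a few measurements are corrupted. This sketch illustrates the principle only; it is not the authors' CMOS implementation.

```python
import numpy as np

def singular_point_1d(signs):
    """Toy estimate of the focus-of-expansion position from sign-only
    optical flow along one axis. signs[i] is -1 or +1: the sign of the
    horizontal flow in column i. Under expansion the signs are -1 left
    of the FOE and +1 to its right, so the running sum of the signs is
    minimised at the last column before the crossing. Integrating makes
    the estimate tolerant of occasional sign errors."""
    return int(np.argmin(np.cumsum(signs)))
```

A 2-D singular point can be found by applying the same integration independently along rows and columns.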
Conference Paper
Optical flow fields are a primary source of information about the visual scene in technical and biological systems. In a step towards a system for real time scene analysis we have developed two new algorithms for the parallel computation of the direction of motion field in 2-D. We have successfully implemented these algorithms in analog VLSI hardwa...
Article
A family of analog CMOS velocity sensors is described which measures the velocity of a moving edge by computing its time of travel between adjacent pixels. These sensors are compact, largely invariant to illumination over a wide range, sensitive to edges with very low contrast, and responsive to several orders of magnitude of velocity. Two successf...
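The time-of-travel principle behind these sensors is simple to state in code. The sketch below assumes a hypothetical pixel pitch and digital edge timestamps; the actual sensors perform this computation with compact analog circuitry.

```python
def edge_velocity(t_left, t_right, pixel_pitch_um=10.0):
    """Toy time-of-travel velocity estimate: an edge seen at the left
    pixel at time t_left and at the adjacent right pixel at t_right
    has travelled one pixel pitch in (t_right - t_left) seconds. The
    pitch value is an assumed placeholder; times are in seconds. The
    sign of the result encodes direction."""
    dt = t_right - t_left
    if dt == 0:
        raise ValueError("simultaneous detections: velocity undefined")
    return pixel_pitch_um / dt  # micrometres per second
```

Because only edge timing enters the computation, the estimate is largely independent of absolute illumination, which is one reason such circuits can remain invariant over a wide illumination range.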
Article
Insects are supremely successful biological autonomous systems that, despite their diminutive size, have survived in the real world and adapted to changing physical environments for more than 400 million years. Insects, which were highly successful long before the existence of mankind, are today the most species-rich and diverse of all animal taxa...

Citations

... Research on the usefulness of sensory substitution systems predominantly focuses on assisting the visually impaired with wayfinding and spatial recognition. Apparatuses using vibrotactile stimulation applied through matrices onto the back, abdomen, forehead and hand have been used, and results indicate participants can effectively avoid obstructions and navigate surroundings (Johnson & Higgins, 2006; Segond, Weiss & Sampaio, 2005; Zelek, Bromley, Asmar & Thompson, 2003). One significant limitation of these findings was identified: the technology was cumbersome and lacked portability. ...
... Particular operations (spatial/temporal filtering, motion and colour detection) are processed in series or in parallel in the different layers. This ordered processing of information is a requisite for later operations that will translate into behaviours (reviewed in Strausfeld et al., 2006). ...
... As neuromorphic front end sensors for temporal contrast detection improved, Higgins [76] and Indiveri [71] both adopted multi-chip approaches to motion-estimation, relying on a stand-alone specialized front end sensor for temporal contrast detection, and a separate chip for the FS token algorithm implementation. The multi-chip approach carries the disadvantage of requiring additional power for off-chip communication between the front-end sensor and the motion computation chip. ...
... They proposed an Elementary Motion Unit (EMU) which correlates illumination levels recorded at different instants and positions of the image. To implement EMUs only photo-pixel, delays and multipliers are required, and multipliers can even be substituted with other forms of comparator simpler to design (Gottardi and Yang, 1993;Harrison, 2005;Pant and Higgins, 2007). ...
... The membership of, and interactions within, these smaller groups may be dependent on a metric distance, for example, the zonal model of shoaling fish [7], or a topological distance, as in the model of flocking starlings [8]. Models of perception have also been developed to investigate the use of optic flow in birds [9] and bats [10] and even visual tracking to aid collision avoidance in insects [11]. ...
... Since local direction of motion can be used for many of the same tasks as local velocity, the high pixel resolutions possible with these sensors may make them more practically useful than present lower-resolution velocity sensors. This work has been briefly described in [2]. ...
... A similar work was done in [30], yet without any information on its real-time performance. On the other hand, some VLSI implementations based on analog application-specific chips were proposed [28], [31]- [33]. In these systems, multiple chips were required to compute the motion energy, and a general-purpose computer was used for velocity integration. ...
... The spatially separate pooling of front-end signals in Fig. 1E would correspond to different compartments within the dendritic arborization of the tangential cell, which is plausible given the massive extent of these arborizations. This interpretation is consistent with previous modeling work on the emergence of directional selectivity across the medulla–lobula plate projection (Melano and Higgins, 2005). Specifically, Melano and Higgins placed the relevant nonlinearity between T5 and subsequent pooling by the tangential cell [see their Fig. 1b, ...
... Similarly to [106], the integrated response can be used as a visual odometer over time. Moreover, there are also studies on the temporal adaptation of EMDs [38,89], the contrast sensitivity of EMDs [178], and an EMD-based algorithm for global motion estimation [140], as well as a non-directional HR detector model for simulating insect speed-dependent behaviour [96], and so on. ...
... "Center-surround" orientation selectivity has been mathematically modeled in numerous ways, including Gabor wavelets (Adelson and Bergen 1985) and by using a difference-of-Gaussians (DoG) function (Rodieck 1965). The leading model of orientation selectivity in insects supports a direct neuronal implementation of the DoG model (Rivera-Alvidrez et al. 2011). This model, based on both electrophysiological and neuroanatomical evidence, makes use of spatial spreading of photoreceptor inputs by two distinct types of amacrine cells that results in two Gaussian-blurred versions of the input image. ...
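The DoG function mentioned above is straightforward to construct: a narrow excitatory Gaussian centre minus a broader inhibitory surround. A minimal sketch follows, in which the kernel size and widths are assumed illustrative values, not those of the Rivera-Alvidrez et al. (2011) model.

```python
import numpy as np

def dog_kernel(size=21, sigma_c=1.0, sigma_s=3.0):
    """Toy 1-D difference-of-Gaussians kernel: a narrow excitatory
    centre minus a broad inhibitory surround. Each Gaussian is
    normalised to unit area, so the kernel sums to roughly zero and
    gives no response to uniform illumination. All parameter values
    are assumed for illustration."""
    x = np.arange(size) - size // 2
    gauss = lambda s: np.exp(-x**2 / (2 * s**2)) / (s * np.sqrt(2 * np.pi))
    return gauss(sigma_c) - gauss(sigma_s)
```

Convolving an image with this kernel suppresses uniform regions while emphasising edges and oriented contrast, which is the centre-surround behaviour the modeling work exploits.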