Oliver W Layton
Doctor of Philosophy
Colby College · Computer Science
About
49 Publications
3,123 Reads
341 Citations
Additional affiliations
September 2009 - August 2013
Publications (49)
Accurate self-motion estimation is critical for various navigational tasks in mobile robotics. Optic flow provides a means to estimate self-motion using a camera sensor and is particularly valuable in GPS- and radio-denied environments. The present study investigates the influence of different activation functions—ReLU, leaky ReLU, GELU, and Mish—o...
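As a point of reference for the activation functions named above, here is a minimal PyTorch sketch showing how the four nonlinearities can be swapped into an otherwise identical model; the tiny architecture, layer sizes, and random flow data are placeholders for illustration, not the network from the study.

    import torch
    import torch.nn as nn

    # The four activation functions compared in the abstract above.
    # torch.nn provides each as a drop-in module (Mish requires PyTorch >= 1.9).
    activations = {
        "ReLU": nn.ReLU(),
        "LeakyReLU": nn.LeakyReLU(negative_slope=0.01),
        "GELU": nn.GELU(),
        "Mish": nn.Mish(),
    }

    def make_model(act: nn.Module) -> nn.Sequential:
        # Placeholder architecture: a tiny conv net mapping a 2-channel optic
        # flow field (horizontal/vertical components) to a 2-D heading estimate.
        return nn.Sequential(
            nn.Conv2d(2, 16, kernel_size=3, padding=1), act,
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(16, 2),
        )

    flow = torch.randn(8, 2, 64, 64)  # batch of synthetic flow fields
    for name, act in activations.items():
        heading = make_model(act)(flow)
        print(name, heading.shape)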
Accuracy-optimized convolutional neural networks (CNNs) have emerged as highly effective models at predicting neural responses in brain areas along the primate ventral stream, but it is largely unknown whether they effectively model neurons in the complementary primate dorsal stream. We explored how well CNNs model the optic flow tuning properties...
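A standard way to ask how well CNN features predict neural responses is a linear encoding model with a cross-validated ridge readout; the sketch below illustrates that recipe on synthetic data, with NumPy arrays and scikit-learn's RidgeCV standing in for the actual stimuli, network activations, and dorsal-stream recordings used in the study.

    import numpy as np
    from sklearn.linear_model import RidgeCV
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Placeholder data: rows are stimuli (e.g. optic flow movies), columns are
    # CNN unit activations and recorded neuron responses, respectively.
    n_stimuli, n_features, n_neurons = 200, 512, 40
    cnn_features = rng.normal(size=(n_stimuli, n_features))
    neural_responses = (cnn_features @ rng.normal(size=(n_features, n_neurons))
                        + 0.5 * rng.normal(size=(n_stimuli, n_neurons)))

    X_tr, X_te, y_tr, y_te = train_test_split(
        cnn_features, neural_responses, test_size=0.25, random_state=0)

    # One cross-validated ridge readout per neuron; held-out R^2 summarizes
    # how well this feature set accounts for each neuron's tuning.
    model = RidgeCV(alphas=np.logspace(-3, 3, 13)).fit(X_tr, y_tr)
    print("mean held-out R^2:", model.score(X_te, y_te))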
Humans are capable of accurately judging their heading from optic flow during straight forward self-motion. Despite the global coherence in the optic flow field, however, visual clutter and other naturalistic conditions create constant flux on the eye. This presents a problem that must be overcome to accurately perceive heading from optic flow-the...
Human observers are capable of perceiving the motion of moving objects relative to the stationary world, even while undergoing self-motion. Perceiving world-relative object motion is complicated because the local optical motion of objects is influenced by both observer and object motion, and reflects object motion in observer coordinates. It has be...
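One way to make the problem concrete is the flow-parsing idea: subtract the optic flow attributable to self-motion from the retinal motion at the object's location to recover world-relative object motion. The toy NumPy sketch below illustrates that subtraction under a deliberately simplified pure-translation flow model; the positions, speeds, and depth are made up for illustration.

    import numpy as np

    def translational_flow(points, heading, speed=1.0, depth=5.0):
        # Radial flow for pure observer translation: vectors point away from
        # the focus of expansion located at `heading` (image coordinates).
        return speed / depth * (points - heading)

    # Toy scene: observer heads toward (0, 0); an object sits at (2, 1)
    # and also moves through the world, adding its own image motion.
    obj_pos = np.array([2.0, 1.0])
    obj_world_motion = np.array([0.3, 0.0])          # rightward drift
    heading = np.array([0.0, 0.0])

    retinal_motion = translational_flow(obj_pos, heading) + obj_world_motion

    # "Flow parsing": subtract the self-motion component predicted from the
    # global flow field to recover motion in world-relative coordinates.
    estimated_self_component = translational_flow(obj_pos, heading)
    recovered_object_motion = retinal_motion - estimated_self_component
    print(recovered_object_motion)   # approximately [0.3, 0.0]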
Self-motion along linear paths without eye movements creates optic flow that radiates from the direction of travel (heading). Optic flow-sensitive neurons in primate brain area MSTd have been linked to linear heading perception, but the neural basis of more general curvilinear self-motion perception is unknown. The optic flow in this case is more c...
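For reference, the image motion produced by combined translation and rotation follows the standard instantaneous motion field equations (in the style of Longuet-Higgins and Prazdny); the NumPy sketch below generates flow for a straight path and for a curvilinear path modeled as forward translation plus yaw rotation, with arbitrary speeds, depths, and focal length chosen only for illustration.

    import numpy as np

    def motion_field(x, y, Z, T, omega, f=1.0):
        # Image velocity (u, v) at image points (x, y) with depth Z, for
        # observer translation T = (Tx, Ty, Tz) and rotation omega = (wx, wy, wz),
        # pinhole camera with focal length f. One common sign convention of the
        # Longuet-Higgins/Prazdny motion field equations.
        Tx, Ty, Tz = T
        wx, wy, wz = omega
        u = (x * Tz - f * Tx) / Z + wx * x * y / f - wy * (f + x**2 / f) + wz * y
        v = (y * Tz - f * Ty) / Z + wx * (f + y**2 / f) - wy * x * y / f - wz * x
        return u, v

    # Sample image grid and a constant depth map (illustrative only).
    x, y = np.meshgrid(np.linspace(-1, 1, 9), np.linspace(-1, 1, 9))
    Z = np.full_like(x, 5.0)

    # Straight path: pure forward translation -> radial flow from the heading.
    u_lin, v_lin = motion_field(x, y, Z, T=(0.0, 0.0, 1.0), omega=(0.0, 0.0, 0.0))

    # Curvilinear (circular) path: forward translation plus yaw rotation,
    # which shifts and distorts the radial pattern.
    u_cur, v_cur = motion_field(x, y, Z, T=(0.0, 0.0, 1.0), omega=(0.0, 0.1, 0.0))
    print(u_cur.shape, v_cur.shape)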
Convolutional neural networks (CNNs) have made significant advances over the past decade with visual recognition, matching or exceeding human performance on certain tasks. Visual recognition is subserved by the ventral stream of the visual system, which, remarkably, CNNs also effectively model. Inspired by this connection, we investigated the exten...
Optic flow provides rich information about world-relative self-motion and is used by many animals to guide movement. For example, self-motion along linear, straight paths without eye movements generates optic flow that radiates from a singularity that specifies the direction of travel (heading). Many neural models of optic flow processing contain...
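A common ingredient in such models is a bank of MSTd-like templates, each preferring radial flow centered on a candidate heading, with the best-matching template signaling the heading estimate. The NumPy sketch below illustrates that template-matching idea; the flow generator, grid resolution, and similarity measure are simplifications, not the specific model described here.

    import numpy as np

    def radial_flow(xs, ys, foe):
        # Unit-less radial flow field expanding from a focus of expansion.
        u, v = xs - foe[0], ys - foe[1]
        return np.stack([u, v])

    xs, ys = np.meshgrid(np.linspace(-1, 1, 21), np.linspace(-1, 1, 21))

    # Bank of heading templates: one radial pattern per candidate heading.
    candidates = [(cx, cy) for cx in np.linspace(-0.5, 0.5, 11)
                           for cy in np.linspace(-0.5, 0.5, 11)]
    templates = [radial_flow(xs, ys, c) for c in candidates]

    # Observed flow for a "true" heading slightly up and to the right.
    observed = radial_flow(xs, ys, (0.2, 0.1))

    # Each template's activation is its normalized match (cosine similarity)
    # with the input; the best-matching template's preferred heading wins.
    scores = [np.sum(t * observed) / (np.linalg.norm(t) * np.linalg.norm(observed) + 1e-9)
              for t in templates]
    print("estimated heading:", candidates[int(np.argmax(scores))])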
This paper introduces a self-tuning mechanism for capturing rapid adaptation to changing visual stimuli by a population of neurons. Building upon the principles of efficient sensory encoding, we show how neural tuning curve parameters can be continually updated to optimally encode a time-varying distribution of recently detected stimulus values. We...
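To illustrate the general idea of a population whose tuning tracks recent stimulus statistics, the sketch below re-places Gaussian tuning curve centers at quantiles of a sliding window of recent stimuli, a simple efficient-coding heuristic; it is not the specific update rule proposed in this paper.

    import numpy as np
    from collections import deque

    class AdaptivePopulation:
        # Toy population of Gaussian tuning curves whose centers track the
        # recent stimulus distribution via a quantile-based heuristic.

        def __init__(self, n_neurons=8, window=200, width=0.5):
            self.recent = deque(maxlen=window)
            self.n = n_neurons
            self.width = width
            self.centers = np.linspace(-1.0, 1.0, n_neurons)

        def update(self, stimulus):
            self.recent.append(stimulus)
            # Re-place centers at evenly spaced quantiles of the recent
            # stimuli, allocating resolution where stimuli are common.
            q = np.linspace(0.05, 0.95, self.n)
            self.centers = np.quantile(np.asarray(self.recent), q)

        def respond(self, stimulus):
            return np.exp(-0.5 * ((stimulus - self.centers) / self.width) ** 2)

    pop = AdaptivePopulation()
    rng = np.random.default_rng(1)
    for s in rng.normal(loc=3.0, scale=0.3, size=500):  # stimulus statistics shift
        pop.update(s)
    print("adapted centers:", np.round(pop.centers, 2))
    print("responses to 3.0:", np.round(pop.respond(3.0), 2))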
Most algorithms for steering, obstacle avoidance, and moving object detection rely on accurate self-motion estimation, a problem animals solve in real time as they navigate through diverse environments. One biological solution leverages optic flow, the changing pattern of motion experienced on the eye during self-motion. Here I present ARTFLOW, a b...
Self-motion produces characteristic patterns of optic flow on the eye of the mobile observer. Movement along linear, straight paths without eye movements yields motion that radiates from the direction of travel (heading). The observer experiences more complex motion patterns while moving along more general curvilinear (e.g. circular) paths, the app...
Affordance-based control and current-future control offer competing theoretical accounts of the visual control of locomotion. The aim of this study was to test predictions derived from these accounts about the necessity of self-motion (Experiment 1) and target-ground contact (Experiment 2) in perceiving whether a moving target can be intercepted be...
Many everyday interactions with moving objects benefit from an accurate perception of their movement. Self-motion, however, complicates object motion perception because it generates a global pattern of motion on the observer’s retina and radically influences an object’s retinal motion. There is strong evidence that the brain compensates by suppress...
Walking and other forms of self-motion create global motion patterns across our eyes. With the resulting stream of visual signals, how do we perceive ourselves as moving through a stable world? While the neural mechanisms are largely unknown, human studies (e.g. Warren & Rushton, 2009) provide strong evidence that the visual system is capable of pa...
Cortical area MSTd contains cells sensitive to the radial expansion and contraction motion patterns experienced during forward and backward self-motion. We investigated the open question of whether populations of MSTd cells tuned to expansion and contraction interact through recurrent connectivity, which may play important roles in postural control...
To avoid collisions, predation, and other life-threatening encounters, humans must perceive the motion of objects that move independently during self-motion. The local optical motion of such objects on the retina depends not only on their movement through the world, but also on the observer's self-motion (observer-relative reference frame). Yet, th...
An important but neglected aspect of tasks that involve interception of moving targets on foot is knowing when to stop pursuing a target that is moving too fast to catch. Whether the target is a prey animal in the wild or an opponent on the playing field, chasing an uncatchable target is not only futile but a waste of energy. The aim of this study...
When a moving object cuts in front of a moving observer at a 90° angle, the observer correctly perceives that the object is traveling along a perpendicular path just as if viewing the moving object from a stationary vantage point. Although the observer's own (self-)motion affects the object's pattern of motion on the retina, the visual...
Human heading perception based on optic flow is not only accurate, it is also remarkably robust and stable. These qualities are especially apparent when observers move through environments containing other moving objects, which introduce optic flow that is inconsistent with observer self-motion and therefore uninformative about heading direction. M...
The focus of expansion (FoE) specifies the heading direction of an observer during self-motion, and experiments show that humans can accurately perceive their heading from optic flow. However, when the environment contains an independently moving object, heading judgments may be biased. When objects approach the observer in depth, the heading bias...
Many forms of locomotion rely on the ability to accurately perceive one's direction of locomotion (i.e., heading) based on optic flow. Although accurate in rigid environments, heading judgments may be biased when independently moving objects are present. The aim of this study was to systematically investigate the conditions in which moving objects...
Several studies have revealed that human heading perception based on optic flow is biased when independently moving objects (IMOs) cross or approach the observer's future path. However, these biases are surprisingly weak (~2°) and perceived heading does not seem to abruptly shift at the moment that a moving object crosses the observer's future path...
Camouflaged animals that have very similar textures to their surroundings are difficult to detect when stationary. However, when an animal moves, humans readily see a figure at a different depth than the background. How do humans perceive a figure breaking camouflage, even though the texture of the figure and its background may be statistically ide...
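A toy demonstration of why motion is such a powerful cue here: with frame differencing on synthetic images whose figure and background share identical texture statistics, the moving figure pops out even though no single frame contains figure information. This illustrates the cue itself, not the mechanism proposed in the work.

    import numpy as np

    rng = np.random.default_rng(0)

    # Figure and background share identical texture statistics (i.i.d. noise),
    # so a single static frame carries no information about the figure.
    background = rng.random((64, 64))
    figure_texture = rng.random((16, 16))

    def render(frame_idx):
        img = background.copy()
        x = 10 + 2 * frame_idx          # the figure drifts rightward
        img[24:40, x:x + 16] = figure_texture
        return img

    frame0, frame1 = render(0), render(1)

    # Temporal differencing: the moving figure region lights up even though
    # figure and ground are statistically identical within any one frame.
    motion_energy = np.abs(frame1 - frame0)
    print("mean |difference| inside figure region:",
          motion_energy[24:40, 10:28].mean().round(3))
    print("mean |difference| far from figure:",
          motion_energy[:, 40:].mean().round(3))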
Determining whether a region belongs to the interior or exterior of a shape (figure-ground segregation) is a core competency of the primate brain, yet the underlying mechanisms are not well understood. Many models assume that figure-ground segregation occurs by assembling progressively more complex representations through feedforward connections, w...
Self-motion, steering, and obstacle avoidance during navigation in the real world require humans to travel along curved paths. Many perceptual models have been proposed that focus on heading, which specifies the direction of travel along straight paths, but not on path curvature, which humans accurately perceive and is critical to everyday locomoti...
The spatio-temporal displacement of luminance patterns in a 2D image is called optic flow. Present biologically-inspired approaches to navigation that use optic flow largely focus on the problem of extracting the instantaneous direction of travel (heading) of a mobile agent. Computational models have demonstrated success in estimating heading in hi...
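For concreteness, dense optic flow between two frames can be computed with an off-the-shelf routine; the sketch below uses OpenCV's Farneback method (assuming the opencv-python package) on a synthetic image pair in which the second frame is the first shifted two pixels to the right.

    import cv2
    import numpy as np

    rng = np.random.default_rng(0)

    # Two synthetic grayscale frames: the second is the first shifted two
    # pixels to the right, a stand-in for consecutive camera images.
    frame0 = (rng.random((128, 128)) * 255).astype(np.uint8)
    frame0 = cv2.GaussianBlur(frame0, (7, 7), 2)
    frame1 = np.roll(frame0, shift=2, axis=1)

    # Dense optic flow: per-pixel (dx, dy) displacement between the frames.
    # Positional arguments: pyr_scale, levels, winsize, iterations,
    # poly_n, poly_sigma, flags.
    flow = cv2.calcOpticalFlowFarneback(frame0, frame1, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)

    print("median horizontal displacement:", np.median(flow[..., 0]).round(2))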
Humans are capable of rapidly determining whether regions in a visual scene appear as figures in the foreground or as background, yet how figure-ground segregation occurs in the primate visual system is unknown. Figures in the environment are perceived to own their borders, and recent neurophysiology has demonstrated that certain cells in primate v...
Navigation in a static environment along straight paths without eye movements produces radial optic flow fields. A singularity called the focus of expansion (FoE) specifies the direction of travel (heading) of the observer. Cells in primate dorsal medial superior temporal area (MSTd) respond to radial fields and are therefore thought to be heading-...
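For pure translation, every flow vector lies on a line through its image point that passes through the FoE, so the FoE can be recovered by intersecting those lines in a least-squares sense; the NumPy sketch below illustrates this on a synthetic radial flow field with a small amount of noise.

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic radial flow: points sampled in the image, vectors expanding
    # from a "true" focus of expansion (FoE), plus a little noise.
    true_foe = np.array([0.3, -0.2])
    pts = rng.uniform(-1, 1, size=(200, 2))
    flow = (pts - true_foe) + 0.02 * rng.normal(size=pts.shape)

    # Each flow vector defines a line through its image point along its
    # direction; for radial flow all such lines intersect at the FoE.
    # Writing the perpendicular constraint n . foe = n . p for every vector
    # gives an overdetermined linear system solved by least squares.
    normals = np.stack([-flow[:, 1], flow[:, 0]], axis=1)
    b = np.sum(normals * pts, axis=1)
    foe, *_ = np.linalg.lstsq(normals, b, rcond=None)
    print("estimated FoE:", foe.round(3), "true FoE:", true_foe)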
Humans accurately judge their direction of heading when translating in a rigid environment, unless independently moving objects (IMOs) cross the observer's focus of expansion (FoE). Studies show that an IMO on a laterally moving path that maintains a fixed distance with respect to the observer (non-approaching; C. S. Royden & E. C. Hildreth, 1996)...
Is it possible for humans to navigate in the natural environment wherein the path taken between various destinations is 'optimal' in some way? In the domain of optimization this challenge is traditionally framed as the "Traveling Salesman Problem" (TSP). What strategies and ecological considerations are plausible for human navigation? When given a...
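For context, a classic baseline for such route choices is the greedy nearest-neighbor heuristic, which is fast and intuitive but generally suboptimal; a short sketch on made-up destination coordinates:

    import math

    # Hypothetical destination coordinates; any point set would do.
    destinations = [(0, 0), (4, 1), (1, 5), (6, 4), (2, 2), (5, 7)]

    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    def nearest_neighbor_tour(points, start=0):
        # Greedy heuristic: always walk to the closest unvisited destination.
        unvisited = set(range(len(points))) - {start}
        tour = [start]
        while unvisited:
            nxt = min(unvisited, key=lambda j: dist(points[tour[-1]], points[j]))
            tour.append(nxt)
            unvisited.remove(nxt)
        return tour

    tour = nearest_neighbor_tour(destinations)
    length = sum(dist(destinations[a], destinations[b]) for a, b in zip(tour, tour[1:]))
    print("tour order:", tour, "open-tour length:", round(length, 2))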