David Whitaker

University of Bradford, Bradford, England, United Kingdom

Publications (80) · 205.97 Total Impact Points

  • Procedia - Social and Behavioral Sciences 03/2014; 126:152-153. DOI:10.1016/j.sbspro.2014.02.350
  • ABSTRACT: Perceived time is inherently malleable. For example, adaptation to relatively long or short sensory events leads to a repulsive aftereffect such that subsequent events appear to be contracted or expanded (duration adaptation). Perceived visual duration can also be distorted via concurrent presentation of discrepant auditory durations (multisensory integration). The neural loci of both distortions remain unknown. In the current study we use a psychophysical approach to establish their relative positioning within the sensory processing hierarchy. We show that audiovisual integration induces marked distortions of perceived visual duration. We proceed to use these distorted durations as visual adapting stimuli yet find subsequent visual duration aftereffects to be consistent with physical rather than perceived visual duration. Conversely, the concurrent presentation of adapted auditory durations with nonadapted visual durations results in multisensory integration patterns consistent with perceived, rather than physical, auditory duration. These results demonstrate that recent sensory history modifies human duration perception prior to the combination of temporal information across sensory modalities and provide support for adaptation mechanisms mediated by duration-selective neurons situated in early areas of the visual and auditory nervous system (Aubie, Sayegh, & Faure, 2012; Duysens, Schaafsma, & Orban, 1996; Leary, Edwards, & Rose, 2008).
    Journal of Vision 12/2013; 13(14). DOI:10.1167/13.14.4 · 2.73 Impact Factor
  • ABSTRACT: In order to interact with our environment, the human brain constructs maps of visual space. The orderly mapping of external space across the retinal surface, termed retinotopy, is maintained at subsequent levels of visual cortical processing and underpins our capacity to make precise and reliable judgments about the relative location of objects around us. While these maps, at least in the visual system, support high precision judgments about the relative location of objects, they are prone to significant perceptual distortion. Here, we ask observers to estimate the separation of two visual stimuli--a spatial interval discrimination task. We show that large stimulus sizes require much greater separation in order to be perceived as having the same separation as small stimulus sizes. The relationship is linear, task independent, and unrelated to the perceived position of object edges. We also show that this type of spatial distortion is not restricted to the object itself but can also be revealed by changing the spatial scale of the background, while object size remains constant. These results indicate that fundamental spatial properties, such as retinal image size or the scale at which an object is analyzed, exert a marked influence on spatial coding.
    Journal of Vision 04/2012; 12(4):8. DOI:10.1167/12.4.8 · 2.73 Impact Factor
  • ABSTRACT: This study was conducted to investigate whether neural compensation for induced defocus can alter visual resolution in other areas of the human retina beyond the fovea. In certain circumstances, the blur adaptation response may be influenced by refractive status. The effect of blur adaptation on the central 10° of the retina was investigated in 20 normally sighted observers (10 emmetropes and 10 myopes; median age, 21 years). Visual acuity (VA) was measured at the fovea and at five locations of the parafoveal nasal visual field (2°, 4°, 6°, 8°, and 10°) with best corrected distance vision. Myopic defocus of 1 D was then introduced, and the same measurements were repeated immediately and again after a 30-minute adaptation period. VA declined with increasing eccentricity in the clear, blurred, and blur-adapted viewing conditions. The rate of decline was quantified by the parameter E2, which represents the amount of eccentricity dependence of the acuity task. Foveal and parafoveal VA decreased with the introduction of optical defocus and improved significantly after a period of blur adaptation. The consistent value of E2 in each condition indicated that these changes in VA were not eccentricity dependent. Changes in VA under blurred and blur-adapted conditions were of similar magnitudes in myopic and emmetropic observers. Neural adaptation to blur improves VA under defocused conditions in the parafovea as well as the fovea, indicating that the underlying compensatory mechanism acts across a range of spatial scales and independently of retinal eccentricity. Foveal and parafoveal blur adaptation does not vary with refractive error.
    Investigative ophthalmology & visual science 03/2012; 53(3):1145-50. DOI:10.1167/iovs.11-8477 · 3.66 Impact Factor
  • ABSTRACT: Our sensory systems face a daily barrage of auditory and visual signals whose arrival times form a wide range of audiovisual asynchronies. These temporal relationships constitute an important metric for the nervous system when surmising which signals originate from common external events. Internal consistency is known to be aided by sensory adaptation: repeated exposure to consistent asynchrony brings perceived arrival times closer to simultaneity. However, given the diverse nature of our audiovisual environment, functionally useful adaptation would need to be constrained to signals that were generated together. In the current study, we investigate the role of two potential constraining factors: spatial and contextual correspondence. By employing an experimental design that allows independent control of both factors, we show that observers are able to simultaneously adapt to two opposing temporal relationships, provided they are segregated in space. No such recalibration was observed when spatial segregation was replaced by contextual stimulus features (in this case, pitch and spatial frequency). These effects provide support for dedicated asynchrony mechanisms that interact with spatially selective mechanisms early in visual and auditory sensory pathways.
    Experimental Brain Research 02/2012; 218(3):477-85. DOI:10.1007/s00221-012-3038-3 · 2.17 Impact Factor
  • ABSTRACT: Previous studies have demonstrated that the retention of information in short-term visual perceptual memory can be disrupted by the presentation of masking stimuli during interstimulus intervals (ISIs) in delayed discrimination tasks (S. Magnussen & M. W. Greenlee, 1999). We have exploited this effect in order to determine to what extent short-term perceptual memory is selective for stimulus color. We employed a delayed hue discrimination paradigm to measure the fidelity with which color information was retained in short-term memory. The task required 5 color normal observers to discriminate between spatially non-overlapping colored reference and test stimuli that were temporally separated by an ISI of 5 s. The points of subjective equality (PSEs) on the resultant psychometric matching functions provided an index of performance. Measurements were made in the presence and absence of mask stimuli presented during the ISI, which varied in hue around the equiluminant plane in DKL color space. For all reference stimuli, we found a consistent mask-induced, hue-dependent shift in PSE compared to the "no mask" conditions. These shifts were found to be tuned in color space, only occurring for a range of mask hues that fell within bandwidths of 29-37 deg. Outside this range, masking stimuli had little or no effect on measured PSEs. The results demonstrate that memory masking for color exhibits selectivity similar to that which has already been demonstrated for other visual attributes. The relatively narrow tuning of these interference effects suggests that short-term perceptual memory for color is based on higher order, non-linear color coding.
    Journal of Vision 01/2012; 12(1):26. DOI:10.1167/12.1.26 · 2.73 Impact Factor
  • ABSTRACT: The task of deciding how long sensory events seem to last is one that the human nervous system appears to perform rapidly and, for sub-second intervals, seemingly without conscious effort. That these estimates can be performed within and between multiple sensory and motor domains suggests that time perception forms one of the core, fundamental processes of our perception of the world around us. Given this significance, the current paucity in our understanding of how this process operates is surprising. One candidate mechanism for duration perception posits that duration may be mediated via a system of duration-selective 'channels', which are differentially activated depending on the match between afferent duration information and the channels' 'preferred' duration. However, this model awaits experimental validation. In the current study, we use the technique of sensory adaptation, and we present data that are well described by banks of duration channels that are limited in their bandwidth, sensory-specific, and appear to operate at a relatively early stage of visual and auditory sensory processing. Our results suggest that many of the computational principles the nervous system applies to coding visual spatial and auditory spectral information are common to its processing of temporal extent.
    Proceedings of the Royal Society B: Biological Sciences 08/2011; 279(1729):690-8. DOI:10.1098/rspb.2011.1131 · 5.29 Impact Factor
  • ABSTRACT: In a world of sensory overload, it is becoming increasingly important to provide environments that enable us to recover our sense of well-being. Such restorative (‘tranquil’) environments need to comprise sufficient sensory stimulation to keep us engaged, whilst at the same time providing opportunity for reflection and relaxation. One essential aspect in safeguarding existing, or developing new, ‘tranquil space’ is understanding the optimum relationship between the soundscape and the visual composition of a location. This research represents a first step in understanding the effects of audio-visual interaction on the perception of tranquillity and identifies how the interpretation of acoustic information is an integral part of this process. By using uni- and bimodal auditory-visual stimuli in a two-stage experimental strategy, it has been possible to measure the key components of the tranquillity construct. The findings of this work should be of particular interest to those charged with landscape management, such as National Park Authorities, Regional Councils, and other agencies concerned with providing and maintaining public amenity.
    Journal of Environmental Psychology 12/2010; DOI:10.1016/j.jenvp.2010.03.006 · 2.40 Impact Factor
  • ABSTRACT: Restorative environments which enable individuals to recover a sense of well-being are becoming increasingly important. These environments are characterized by an enhanced level of tranquility. Therefore, measuring the optimum relationship between the landscape and soundscape characteristics and the tranquility rating of these spaces is essential for their design and maintenance. In order to understand the key factors which affect the tranquility construct, a large volume of audio and visual data has been collected across a representative range of landscapes in the UK. These data have been analyzed objectively by studying the temporal and spectral characteristics of the recorded sounds and the proportion of the natural and contextual features present in the video clips. The tranquility rating of these landscapes has been obtained from subjective experiments on 44 subjects to whom uni- and bimodal stimuli have been presented in a separate experiment. The results of these experiments make it possible to objectively measure the key components of the tranquility construct and determine their relative importance in the design of a restorative space. On this basis, new relations between the tranquility rating, sound pressure level, loudness characteristics, and the visual quality of the scene have been derived.
    The Journal of the Acoustical Society of America 10/2010; 128(4):2371. DOI:10.1121/1.3508424 · 1.56 Impact Factor
  • ABSTRACT: The relative timing of auditory and visual stimuli is a critical cue for determining whether sensory signals relate to a common source and for making inferences about causality. However, the way in which the brain represents temporal relationships remains poorly understood. Recent studies indicate that our perception of multisensory timing is flexible--adaptation to a regular inter-modal delay alters the point at which subsequent stimuli are judged to be simultaneous. Here, we measure the effect of audio-visual asynchrony adaptation on the perception of a wide range of sub-second temporal relationships. We find distinctive patterns of induced biases that are inconsistent with previous explanations based on changes in perceptual latency. Instead, our results can be well accounted for by a neural population coding model in which: (i) relative audio-visual timing is represented by the distributed activity across a relatively small number of neurons tuned to different delays; (ii) the algorithm for reading out this population code is efficient, but subject to biases owing to under-sampling; and (iii) the effect of adaptation is to modify neuronal response gain. These results suggest that multisensory timing information is represented by a dedicated population code and that shifts in perceived simultaneity following asynchrony adaptation arise from analogous neural processes to well-known perceptual after-effects.
    Proceedings of the Royal Society B: Biological Sciences 10/2010; 278(1710):1314-22. DOI:10.1098/rspb.2010.1737 · 5.29 Impact Factor
  • Graeme J Kennedy, David Whitaker
    ABSTRACT: Precortical vision is mediated by three opponent mechanisms that combine receptoral outputs to form a luminance channel (L + M) and two chromatic channels, red-green (L/M) and blue-yellow (S/L + M). Here we ask the extent to which these basic color opponent mechanisms interact in the phenomenon of crowding, where nearby targets interfere with the processing of a central test target. The task was to identify the orientation of a Gabor patch while an annular plaid surrounded the patch. The radius of the annulus was varied in order to produce different separations of the test and flanker. The chromatic content of the Gabor and the annulus could be varied independently along the (L + M), (L/M), and (S/L + M) cardinal axes. For all targets, when the target and flanker shared the same chromaticity, performance decreased with decreasing separation of the target and annulus, i.e., a typical crowding effect was seen. When the test and flanker isolated different chromatic mechanisms, very little crowding was observed, even at the minimum separation of test target and annulus. In addition to this, intermediate chromaticities were found to produce intermediate levels of crowding. Finally, crowding effects using "half-wave rectified" stimuli suggest a locus for crowding effects beyond the level of color opponent mechanisms.
    Journal of Vision 06/2010; 10(6):15. DOI:10.1167/10.6.15 · 2.73 Impact Factor
  • ABSTRACT: Evidence suggests that human time perception is likely to reflect an ensemble of recent temporal experience. For example, prolonged exposure to consistent temporal patterns can adaptively realign the perception of event order, both within and between sensory modalities (e.g. Fujisaki et al., 2004 Nat. Neurosci., 7, 773-778). In addition, the observation that 'a watched pot never boils' serves to illustrate the fact that dynamic shifts in our attentional state can also produce marked distortions in our temporal estimates. In the current study we provide evidence for a hitherto unknown link between adaptation, temporal perception and our attentional state. We show that our ability to use recent sensory history as a perceptual baseline for ongoing temporal judgments is subject to striking top-down modulation via shifts in the observer's selective attention. Specifically, attending to the temporal structure of asynchronous auditory and visual adapting stimuli generates a substantial increase in the temporal recalibration induced by these stimuli. We propose a conceptual framework accounting for our findings whereby attention modulates the perceived salience of temporal patterns. This heightened salience allows the formation of audiovisual perceptual 'objects', defined solely by their temporal structure. Repeated exposure to these objects induces high-level pattern adaptation effects, akin to those found in visual and auditory domains (e.g. Leopold & Bondar (2005) Fitting the Mind to the World: Adaptation and Aftereffects in High-Level Vision. Oxford University Press, Oxford, 189-211; Schweinberger et al. (2008) Curr. Biol., 18, 684-688).
    European Journal of Neuroscience 05/2010; 31(10):1755-62. DOI:10.1111/j.1460-9568.2010.07194.x · 3.67 Impact Factor
  • P. McGraw, D. Whitaker, D. Levi
    Journal of Vision 05/2010; 8(6):431-431. DOI:10.1167/8.6.431 · 2.73 Impact Factor
  • James Heron, James V M Hanson, David Whitaker
    ABSTRACT: Our motor actions normally generate sensory events, but how do we know which events were self-generated and which have external causes? Here we use temporal adaptation to investigate the processing stage and generality of our sensorimotor timing estimates. Adaptation to artificially induced delays between action and event can produce a startling percept--upon removal of the delay it feels as if the sensory event precedes its causative action. This temporal recalibration of action and event occurs in a quantitatively similar manner across the sensory modalities. Critically, it is robust to the replacement of one sense during the adaptation phase with another sense during the test judgment. Our findings suggest a high-level, supramodal recalibration mechanism. The effects are well described by a simple model which attempts to preserve the expected synchrony between action and event, but only when causality indicates it is reasonable to do so. We further demonstrate that this model successfully characterises related adaptation data from outside the sensorimotor domain.
    PLoS ONE 11/2009; 4(11):e7681. DOI:10.1371/journal.pone.0007681 · 3.53 Impact Factor
  • James V M Hanson, David Whitaker, James Heron
    ABSTRACT: Differences in transduction and transmission latencies of visual, auditory and tactile events cause corresponding differences in simple reaction time. As reaction time is usually measured in unimodal blocks, it is unclear whether such latency differences also apply when observers monitor multiple sensory channels. We investigate this by comparing reaction time when attention is focused on a single modality, and when attention is divided between multiple modalities. Results show that tactile reaction time is unaffected by dividing attention, whereas visual and auditory reaction times are significantly and asymmetrically increased. These findings show that tactile information is processed preferentially by the nervous system under conditions of divided attention, and suggest that tactile events may be processed preattentively.
    Neuroreport 10/2009; 20(15):1392-6. DOI:10.1097/WNR.0b013e3283319e25 · 1.64 Impact Factor
  • ABSTRACT: Tripping is a common factor in falls, and a typical safety strategy to avoid tripping on steps or stairs is to increase foot clearance over the step edge. In the present study we asked whether the perceived height of a step could be increased using a visual illusion and whether this would lead to the adoption of a safer stepping strategy, in terms of greater foot clearance over the step edge. The study also addressed the controversial question of whether motor actions are dissociated from visual perception. Twenty-one young, healthy subjects perceived the step to be higher in a configuration of the horizontal-vertical illusion compared to a reverse configuration (p = 0.01). During a simple stepping task, maximum toe elevation changed by an amount corresponding to the size of the visual illusion (p < 0.001). Linear regression analyses showed highly significant associations between perceived step height and maximum toe elevation for all conditions. The perceived height of a step can be manipulated using a simple visual illusion, leading to the adoption of a safer stepping strategy in terms of greater foot clearance over a step edge. In addition, the strong link found between perception of a visual illusion and visuomotor action provides additional support to the view that the original, controversial proposal by Goodale and Milner (1992) of two separate and distinct visual streams for perception and visuomotor action should be re-evaluated.
    PLoS ONE 02/2009; 4(2):e4577. DOI:10.1371/journal.pone.0004577 · 3.53 Impact Factor
  • Dennis M Levi, David Whitaker, Allison Provost
    ABSTRACT: In normal vision, detecting a kink (a change in orientation) in a line is scale invariant: it depends solely on the length/width ratio of the line (D. Whitaker, D. M. Levi, & G. J. Kennedy, 2008). Here we measure detection of a change in the orientation of lines of different length and blur and show that strabismic amblyopia is qualitatively different from normal foveal vision, in that: 1) stimulus blur has little effect on performance in the amblyopic eye, and 2) integration of orientation information follows a different rule. In normal foveal vision, performance improves in proportion to the square root of the ratio of line length to blur (L:B). In strabismic amblyopia improvement is proportional to line length. Our results are consistent with a substantial degree of internal neural blur in first-order cortical filters. This internal blur results in a loss of scale invariance in the amblyopic visual system. Peripheral vision also shows much less effect of stimulus blur and a failure of scale invariance, similar to the central vision of strabismic amblyopes. Our results suggest that both peripheral vision and strabismic amblyopia share a common bottleneck in having a truncated range of spatial mechanisms--a range that becomes more restricted with increasing eccentricity and depth of amblyopia.
    Journal of Vision 01/2009; 9(1):22.1-11. DOI:10.1167/9.1.22 · 2.73 Impact Factor
  • James V M Hanson, James Heron, David Whitaker
    ABSTRACT: When formulating an estimate of event time, the human sensory system has been shown to possess a degree of perceptual flexibility. Specifically, the perceived relative timing of auditory and visual stimuli is, to some extent, a product of recent experience. It has been suggested that this form of sensory recalibration may be peculiar to the audiovisual domain. Here we investigate how adaptation to sensory asynchrony influences the perceived temporal order of audiovisual, audiotactile and visuotactile stimulus pairs. Our data show that a brief period of repeated exposure to asynchrony in any of these sensory pairings results in marked changes in subsequent temporal order judgments: the point of perceived simultaneity shifts toward the level of adaptation asynchrony. We find that the size and nature of this shift is very similar in all three pairings and that sensitivity to asynchrony is unaffected by the adaptation process. In light of these findings we suggest that a single supramodal mechanism may be responsible for the observed recalibration of multisensory perceived time.
    Experimental Brain Research 03/2008; 185(2):347-52. DOI:10.1007/s00221-008-1282-3 · 2.17 Impact Factor
  • David Whitaker, Dennis M Levi, Graeme J Kennedy
    ABSTRACT: Human vision is vital in determining our interaction with the outside world. In this study we characterize our ability to judge changes in the direction of motion of objects, a common task which can allow us either to intercept moving objects, or else avoid them if they pose a threat. Observers were presented with objects which moved across a computer monitor on a linear path until the midline, at which point they changed their direction of motion, and observers were required to judge the direction of change. In keeping with the variety of objects we encounter in the real world, we varied characteristics of the moving stimuli such as velocity, extent of motion path and the object size. Furthermore, we compared performance for moving objects with the ability of observers to detect a deviation in a line which formed the static trace of the motion path, since it has been suggested that a form of static memory trace may form the basis for these types of judgment. The static line judgments were well described by a 'scale invariant' model in which any two stimuli which possess the same two-dimensional geometry (length/width) result in the same level of performance. Performance for the moving objects was entirely different. Irrespective of the path length, object size or velocity of motion, path deviation thresholds depended simply upon the duration of the motion path in seconds. Human vision has long been known to integrate information across space in order to solve spatial tasks such as judgment of orientation or position. Here we demonstrate an intriguing mechanism which integrates direction information across time in order to optimize the judgment of path deviation for moving objects.
    PLoS ONE 02/2008; 3(4):e1930. DOI:10.1371/journal.pone.0001930 · 3.53 Impact Factor
  • James V. M. Hanson, James Heron, David Whitaker
    ABSTRACT: Purpose: When formulating an estimate of event time, the human sensory system has been shown to possess a degree of perceptual flexibility. Specifically, the perceived relative timing of auditory and visual stimuli is, to some extent, a product of recent experience (Fujisaki et al., 2004). It has been suggested that this form of sensory recalibration may be peculiar to the audiovisual domain (Miyazaki et al., 2006). Here we investigate how adaptation to sensory asynchrony influences the perceived temporal order of audiovisual, audiotactile and visuotactile stimulus pairs. Methods: Observers (the authors and naïve observer CV) made temporal order judgments (TOJ) in the audiovisual, audiotactile and visuotactile stimulus pairings. The data were collected both with and without adaptation to asynchrony; in the adaptation conditions, observers underwent a period of exposure to asynchronous stimulus pairs prior to collecting data. From the resultant psychometric functions, the Point of Subjective Simultaneity (PSS; the physical temporal offset between two stimuli required for perceived simultaneity) was obtained for all observers and each condition. Results: Our data show that a brief period of repeated exposure to asynchrony in any of these sensory pairings results in marked changes in subsequent temporal order judgments: the point of perceived simultaneity shifts toward the level of adaptation asynchrony. We find that the size and nature of this shift is very similar in all three pairings and that sensitivity to asynchrony is unaffected by the adaptation process. Conclusions: Our results represent the first convincing demonstrations of asynchrony adaptation involving the tactile modality, and show that perceived timing in all three stimulus pairings tested is markedly influenced by recent experience. The fact that sensitivity to asynchrony is unaffected by the adaptation process suggests a genuine recalibration of perceived time. In light of these findings we suggest that a single supramodal mechanism may be responsible for the observed recalibration of multisensory perceived time.
    Ophthalmic and Physiological Optics 01/2008; 28(1):96-97. DOI:10.1111/j.1475-1313.2007.00530_3.x · 2.66 Impact Factor

Publication Stats

2k Citations
205.97 Total Impact Points

Institutions

  • 1996–2011
    • University of Bradford
      • Department of Optometry and Vision Science
      Bradford, England, United Kingdom
  • 2005
    • New England College of Optometry
      Boston, Massachusetts, United States
  • 1997
    • University of Wales
      Cardiff, Wales, United Kingdom
  • 1992–1995
    • University of Waterloo
      Waterloo, Ontario, Canada
  • 1989–1994
    • Aston University
      Birmingham, England, United Kingdom
  • 1990
    • University of Birmingham
      Birmingham, England, United Kingdom