Jennifer M Groh

Duke University Medical Center | DUMC · Department of Neurobiology

Ph.D.

About

71 Publications · 4,594 Reads · 2,602 Citations
Citations since 2016: 24 Research Items · 982 Citations
[Chart: citations per year, 2016–2022, 0–150 scale]
Additional affiliations
July 2006 - present
Duke University Medical Center
Position
  • Professor (Full)
June 2006 - present
Duke University
July 1997 - May 2006
Dartmouth College

Publications

Publications (71)
Article
Full-text available
How we distinguish multiple simultaneous stimuli is uncertain, particularly given that such stimuli sometimes recruit largely overlapping populations of neurons. One commonly proposed hypothesis is that the sharpness of tuning curves might change to limit the number of stimuli driving any given neuron when multiple stimuli are present. To test this...
Article
Stimulus locations are detected differently by different sensory systems, but ultimately they yield similar percepts and behavioral responses. How the brain transcends initial differences to compute similar codes is unclear. We quantitatively compared the reference frames of two sensory modalities, vision and audition, across three interconnected b...
Article
Conventional analysis of neuroscience data involves computing average neural activity over a group of trials and/or a period of time. This approach may be particularly problematic when assessing the response patterns of neurons to more than one simultaneously presented stimulus. In such cases, the brain must represent each individual component of th...
Article
The environment is sampled by multiple senses, which are woven together to produce a unified perceptual state. However, optimally unifying such signals requires assigning particular signals to the same or different underlying objects or events. Many prior studies in animals have assumed fusion of cross-modal information, whereas recent human experi...
Preprint
Full-text available
Eye movements alter the relationship between the visual and auditory spatial scenes. Signals related to eye movements affect the brain's auditory pathways from the ear through auditory cortex and beyond, but how these signals might contribute to computing the locations of sounds with respect to the visual scene is poorly understood. Here, we evalua...
Preprint
How we distinguish multiple simultaneous stimuli is uncertain, particularly given that such stimuli sometimes recruit largely overlapping populations of neurons. One commonly proposed hypothesis is that tuning curves might change to limit the number of stimuli driving any given neuron when multiple stimuli are present. To test this hypothesis, we r...
Article
We recently reported the existence of fluctuations in neural signals that may permit neurons to code multiple simultaneous stimuli sequentially across time [1]. This required deploying a novel statistical approach to permit investigation of neural activity at the scale of individual trials. Here we present tests using synthetic data to assess the s...
Preprint
Conventional analysis of neuroscience data involves computing average neural activity over a group of trials and/or a period of time. This approach may be particularly problematic when assessing the response patterns of neurons to more than one simultaneously presented stimulus. In such cases, the brain must represent each individual component of t...
Preprint
Full-text available
The environment is sampled by multiple senses, which are woven together to produce a unified perceptual state. However, optimally unifying such signals requires assigning particular signals to the same or different underlying objects or events. Many prior studies (especially in animals) have assumed fusion of cross-modal information, whereas recent...
Preprint
Sensory receptive fields are large enough that they can contain more than one perceptible stimulus. How, then, can the brain encode information about each of the stimuli that may be present at a given moment? We recently showed that when more than one stimulus is present, single neurons can fluctuate between coding one vs. the other(s) across some...
Article
Full-text available
Visual calibration of auditory space requires re-alignment of representations differing in (1) format (auditory hemispheric channels vs visual maps) and (2) reference frames (head-centered vs eye-centered). Here, a ventriloquism paradigm from Kopčo, Lin, Shinn-Cunningham, and Groh [J. Neurosci. 29, 13809–13814 (2009)] was used to examine these proc...
Preprint
Full-text available
We recently reported the existence of fluctuations in neural signals that may permit neurons to code multiple simultaneous stimuli sequentially across time. This required deploying a novel statistical approach to permit investigation of neural activity at the scale of individual trials. Here we present tests using synthetic data to assess the sensi...
Preprint
Stimulus locations are detected differently by different sensory systems, but ultimately they yield similar percepts and behavioral responses. How the brain transcends initial differences to compute similar codes is unclear. We quantitatively compared the reference frames of two sensory modalities, vision and audition, across three interconnected b...
Preprint
Full-text available
Visual calibration of auditory space requires re-alignment of representations differing in (1) format (auditory hemispheric channels vs. visual maps) and (2) reference frames (head-centered vs. eye-centered). Here, a ventriloquism paradigm from Kopčo et al. (J. Neurosci. 29, 13809–13814) was used to examine these processes in humans and monkeys for v...
Chapter
This chapter reviews the literature on how auditory signals are transformed into a coordinate system that facilitates interactions with the visual system. Sound location is deduced from cues that depend on the position of the sound with respect to the head, but visual location is deduced from the pattern of light illuminating the retina, yielding a...
Article
Full-text available
How the brain preserves information about multiple simultaneous items is poorly understood. We report that single neurons can represent multiple stimuli by interleaving signals across time. We record single units in an auditory region, the inferior colliculus, while monkeys localize 1 or 2 simultaneous sounds. During dual-sound trials, we find that...
Preprint
How the brain preserves information about multiple simultaneous items is poorly understood. We report that single neurons can represent multiple different stimuli by interleaving different signals across time. We record single units in an auditory region, the inferior colliculus, while monkeys localize 1 or 2 simultaneous sounds. During dual-sound...
Article
Full-text available
Interactions between sensory pathways such as the visual and auditory systems are known to occur in the brain, but where they first occur is uncertain. Here, we show a novel multimodal interaction evident at the eardrum. Ear canal microphone measurements in humans (n = 19 ears in 16 subjects) and monkeys (n = 5 ears in 3 subjects) performing a sacc...
Article
Full-text available
Interactions between sensory pathways such as the visual and auditory systems are known to occur in the brain, but where they first occur is uncertain. Here, we show a multimodal interaction evident at the eardrum. Ear canal microphone measurements in humans (n = 19 ears in 16 subjects) and monkeys (n = 5 ears in three subjects) performing a saccad...
Article
We accurately perceive the visual scene despite moving our eyes ~3 times per second, an ability that requires incorporation of eye position and retinal information. In this study, we assessed how this neural computation unfolds across three interconnected structures: frontal eye fields (FEF), intraparietal cortex (LIP/MIP), and the superior collicu...
Preprint
Interactions between sensory pathways such as the visual and auditory systems are known to occur in the brain, but where they first occur is uncertain. Here we show a novel multimodal interaction evident at the eardrum. Ear canal microphone measurements in humans (n = 19 ears in 16 subjects) and monkeys (n = 5 ears in 3 subjects) performing a sacca...
Article
Saccadic eye movements can be elicited by more than one type of sensory stimulus. This implies substantial transformations of signals originating in different sense organs as they reach a common motor output pathway. In this study, we compared the prevalence and magnitude of auditory- and visually-evoked activity in a structure implicated in oculom...
Article
Full-text available
Maps are a mainstay of visual, somatosensory, and motor coding in many species. However, auditory maps of space have not been reported in the primate brain. Instead, recent studies have suggested that sound location may be encoded via broadly responsive neurons whose firing rates vary roughly proportionately with sound azimuth. Within frontal space...
Article
Full-text available
A general problem in learning is how the brain determines what lesson to learn (and what lessons not to learn). For example, sound localization is a behavior that is partially learned with the aid of vision. This process requires correctly matching a visual location to that of a sound. This is an intrinsically circular problem when sound location i...
Article
The nature of disturbance in body experience in anorexia nervosa (AN) remains poorly operationalized despite its prognostic significance. We examined the relationship of subjective reports of sensitivity to and behavioral avoidance of sensory experience (e.g., to touch, motion) to body image disturbance and temperament in adult women currently diag...
Article
Full-text available
The inferior colliculus (IC) is a major processing center situated mid-way along both the ascending and descending auditory pathways of the brain stem. Although it is fundamentally an auditory area, the IC also receives anatomical input from non-auditory sources. Neurophysiological studies corroborate that non-auditory stimuli can modulate auditory...
Article
Full-text available
The inferior colliculus (IC) is an essential stop early in the ascending auditory pathway. Though normally thought of as a predominantly auditory structure, recent work has uncovered a variety of non-auditory influences on firing rate in the IC. Here, we map the location within the IC of neurons that respond to the onset of a fixation-guiding visua...
Article
Visual and auditory spatial signals initially arise in different reference frames. It has been postulated that auditory signals are translated from a head-centered to an eye-centered frame of reference compatible with the visual spatial maps, but, to date, only various forms of hybrid reference frames for sound have been identified. Here, we show t...
Article
Full-text available
The inferior colliculus (IC) is thought to have two main subdivisions, a central region that forms an important stop on the ascending auditory pathway and a surrounding shell region that may play a more modulatory role. In this study, we investigated whether eye position affects activity in both the central and shell regions. Accordingly, we mapped...
Article
Full-text available
We investigated the functional architecture of the inferior colliculus (IC) in rhesus monkeys. We systematically mapped multiunit responses to tonal stimuli and noise in the IC and surrounding tissue of six rhesus macaques, collecting data at evenly placed locations and recording nonresponsive locations to define boundaries. The results show a mode...
Article
Full-text available
The motor layers of the superior colliculus (SC) are thought to specify saccade amplitude and direction, independent of initial eye position. However, recent evidence suggests that eye position can modulate the level of activity of SC motor neurons. In this study, we tested whether initial eye position has an effect on microstimulation-evoked sacca...
Article
Full-text available
We evaluated to what extent the influence of eye position in the auditory pathway of primates can be described as a gain field. We compared single unit activity in the inferior colliculus (IC), core auditory cortex (A1) and the caudomedial belt (CM) region of auditory cortex (AC) in primates, and found stronger evidence for gain field-like interact...
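As an illustrative sketch only (not the study's fitted model; all tuning and gain parameters below are hypothetical), a gain field in which eye position multiplicatively scales a neuron's sound-location tuning, without shifting its preferred location, can be written as:

```python
import math

def gain_field_response(sound_az, eye_az,
                        pref_az=20.0, width=30.0,
                        base_rate=40.0, gain_slope=0.01):
    """Hypothetical gain-field neuron: Gaussian tuning for sound
    azimuth (degrees), multiplicatively scaled by eye position.
    Parameter values are illustrative, not fit to data."""
    tuning = math.exp(-0.5 * ((sound_az - pref_az) / width) ** 2)
    gain = 1.0 + gain_slope * eye_az  # linear eye-position gain
    return base_rate * tuning * max(gain, 0.0)

# Eye position scales the response amplitude but the preferred
# sound location (pref_az) is unchanged:
r_left = gain_field_response(sound_az=20.0, eye_az=-20.0)
r_right = gain_field_response(sound_az=20.0, eye_az=20.0)
```

The key signature of a gain field, as opposed to a reference-frame shift, is that changing eye position rescales the whole tuning curve rather than translating it along the sound-location axis.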
Article
Full-text available
When you hear a salient sound, it is natural to look at it to find out what is happening. Orienting the eyes to look at sounds is essential to our ability to identify and understand the events occurring in our environment. This behavior involves both sensorimotor and multisensory integration: a sound elicits a movement of the visual sense organ, th...
Article
Full-text available
Seeing the image of a newscaster on a television set causes us to think that the sound coming from the loudspeaker is actually coming from the screen. How images capture sounds is mysterious because the brain uses different methods for determining the locations of visual versus auditory stimuli. The retina senses the locations of visual objects wit...
Article
Full-text available
The reference frame used by intraparietal cortex neurons to encode locations is controversial. Many previous studies have suggested eye-centered coding, whereas we have reported that visual and auditory signals employ a hybrid reference frame (i.e., a combination of head- and eye-centered information) (Mullette-Gillman et al. 2005). One possible ex...
Article
We use both vision and audition when localizing objects and events in our environment. However, these sensory systems receive spatial information in different coordinate systems: sounds are localized using inter-aural and spectral cues, yielding a head-centered representation of space, whereas the visual system uses an eye-centered representation o...
Article
Full-text available
Is sound location represented in the auditory cortex of humans and monkeys? Human neuroimaging experiments have had only mixed success at demonstrating sound location sensitivity in primary auditory cortex. This is in apparent conflict with studies in monkeys and other animals, in which single-unit recording studies have found stronger evidence for...
Article
Full-text available
The inferior colliculus (IC) is normally thought of as a predominantly auditory structure because of its early position in the ascending auditory pathway just before the auditory thalamus. Here, we show that a majority of IC neurons (64% of 180 neurons) in awake monkeys carry visual- and/or saccade-related signals in addition to their auditory resp...
Article
Full-text available
[Figure: fixation points are two LEDs at −11° and 11° (left and right), 10° lower than the other LEDs; eye movements tracked with an eye tracker, head fixed by chin rest.] Objective: In our everyday life, we combine spatial information from the auditory (head-centered reference frame) and visual (eye-centered...
Article
Full-text available
How the brain responds to sequences of sounds is a question of great relevance to a variety of auditory perceptual phenomena. We investigated how long the responses of neurons in the primary auditory cortex of awake monkeys are influenced by the previous sound. We found that responses to the second sound of a two-sound sequence were generally atten...
Article
Objects and events can often be detected by more than one sensory system. Interactions between sensory systems can offer numerous benefits for the accuracy and completeness of the perception. Recent studies involving visual-auditory interactions have highlighted the perceptual advantages of combining information from these two modalities and have s...
Article
Full-text available
Neural activity in the inferior colliculus (IC) likely plays an integral role in the processing of various auditory parameters, such as sound location and frequency. However, little is known about the extent to which IC neural activity may be influenced by the context in which sounds are presented. In this study, we examined neural activity of IC n...
Article
Full-text available
We studied the representation of eye-position information in the primate inferior colliculus (IC). Monkeys fixated visual stimuli at one of eight or nine locations along the horizontal meridian between -24 and 24 degrees while sounds were presented from loudspeakers at locations within that same range. Approximately 40% of our sample of 153 neurons...
Article
Multisensory integration of spatial signals requires not only that stimulus locations be encoded in the same spatial reference frame, but also that stimulus locations be encoded in the same representational format. Previous studies have addressed the issue of spatial reference frame, but representational format, particularly for sound location, has...
Article
Full-text available
The integration of visual and auditory events is thought to require a joint representation of visual and auditory space in a common reference frame. We investigated the coding of visual and auditory space in the lateral and medial intraparietal areas (LIP, MIP) as a candidate for such a representation. We recorded the activity of 275 neurons in LIP...
Article
Full-text available
Auditory spatial information arises in a head-centered coordinate frame, whereas the saccade command signals generated by the superior colliculus (SC) are thought to specify target locations in an eye-centered frame. However, auditory activity in the SC appears to be neither head- nor eye-centered but in a reference frame that is intermediate betwe...
Article
We investigated the format of the code for sound location in the inferior colliculi of three awake monkeys (Macaca mulatta). We found that roughly half of our sample of 99 neurons was sensitive to the free-field locations of broadband noise presented in the frontal hemisphere. Such neurons nearly always responded monotonically as a function of soun...
Article
Neurons in primary auditory cortex are known to be sensitive to the locations of sounds in space, but the reference frame for this spatial sensitivity has not been investigated. Conventional wisdom holds that the auditory and visual pathways employ different reference frames, with the auditory pathway using a head-centered reference frame and the v...
Chapter
Knowing where things are in space is essential to our existence. The visual, auditory, and cutaneous senses all contribute to the perception of stimulus location, but they acquire spatial information in radically different ways. Positional information is literally built into the neural wiring for vision and touch: stimuli at different positions in...
Article
Full-text available
The nervous system uses two basic types of formats for encoding information. The parameters of many sensory (and some premotor) signals are represented by the pattern of activity among an array of neurons each of which is optimally responsive to a different parameter value. This type of code is commonly referred to as a place code. Motor commands,...
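The two coding formats described above can be sketched as follows (a minimal illustration with hypothetical tuning parameters, not a model from the paper): a place code spreads a value across an array of tuned units, while a rate code carries it in a single monotonically varying signal.

```python
import math

def place_code(stimulus, prefs=range(-90, 91, 15), width=20.0):
    """Place code: an array of units, each firing most when the
    stimulus matches its preferred value (degrees azimuth)."""
    return [math.exp(-0.5 * ((stimulus - p) / width) ** 2) for p in prefs]

def rate_code(stimulus, lo=-90.0, hi=90.0):
    """Rate code: a single signal varying monotonically with the
    stimulus, here normalized to the range [0, 1]."""
    return (stimulus - lo) / (hi - lo)

def decode_place(activity, prefs=range(-90, 91, 15)):
    """Population-vector readout: activity-weighted mean of the
    units' preferred values, recovering a rate-like scalar."""
    return sum(a * p for a, p in zip(activity, prefs)) / sum(activity)
```

A place code identifies a value by *which* units are active; a rate code by *how much* one signal is active, which is why converting between them requires an explicit readout like `decode_place`.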
Article
Determining the precise moment a visual stimulus appears is difficult because visual response latencies vary. This temporal uncertainty could cause localization errors to brief visual targets presented before and during eye movements if the oculomotor system cannot determine the position of the eye at the time the stimulus appeared. We investigated...
Article
We examined the frame of reference of auditory responses in the inferior colliculus in monkeys fixating visual stimuli at different locations. Eye position modulated the level of auditory responses in 33% of the neurons we encountered, but it did not appear to shift their spatial tuning. The effect of eye position on auditory responses was substant...
Article
To track a moving object, its motion must first be distinguished from that of the background. The center-surround properties of neurons in the middle temporal visual area (MT) may be important for signaling the relative motion between object and background. To test this, we microstimulated within MT and measured the effects on monkeys' eye movement...
Article
Monkeys trained to distinguish touch stimuli that 'flutter' with different frequencies can similarly distinguish electrical stimulation of the somatosensory cortex according to its frequency; the implication is that the electrically-evoked patterns of cortical activity cause flutter sensations similar to those induced by touch.
Article
Full-text available
To generate behavioral responses based on sensory input, motor areas of the brain must interpret, or "read out," signals from sensory maps. Our experiments tested several algorithms for how the motor systems for smooth pursuit and saccadic eye movements might extract a usable signal of target velocity from the distributed representation of velocity...
Article
Pronounced effects of attention have been demonstrated in a region of visual cortex previously thought to be devoid of such influences; identifying the features critical for eliciting these effects should teach us a great deal about the neural underpinnings of visual attention.
Article
Purpose. Tracking a moving target that appears in the periphery involves two kinds of eye movements, saccades and smooth pursuit. Both kinds of movements require a signal of target velocity. The saccadic system must use a target velocity signal to compensate for the motion of the target, and the pursuit system must use such a signal to match eye ve...
Article
1. We compared the properties of saccades to somatosensory and visual targets. This comparison provides insight into the translation of sensory signals coding target location in different sensory coordinate frameworks into motor commands of a common format. Vibrotactile stimuli were delivered to the hands, which were fixed in position and concealed...
Article
1. We examined cells with saccade-related activity in the superior colliculus (SC) of monkeys performing saccades to both somatosensory and visual targets. Our goals were 1) to determine whether signals from these separate sensory systems have converged onto a common motor pathway by the level of the SC; 2) to determine the frame of reference of so...
Article
1. We recorded from cells with sensory responses to somatosensory stimuli in the superior colliculus (SC) of awake monkeys. Our goal was to determine the frame of reference of collicular somatosensory signals by seeing whether the positions of the eyes influenced the responses of cells to a given tactile stimulus. Somatosensory targets consisted of...
Article
Full-text available
Two models for transforming auditory signals from head-centered to eye-centered coordinates are presented. The vector subtraction model subtracts a rate-coded eye position signal from a topographically weighted auditory target position signal to produce a rate-code of target location with respect to the eye. The rate-code is converted into a place-...
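The vector-subtraction step described above can be sketched in minimal arithmetic form (a simplified illustration; the paper's network implementation with topographic weighting is not reproduced here): subtract the rate-coded eye position from the head-centered target location, then convert the resulting rate code into a place code.

```python
import math

def head_to_eye_centered(target_head_az, eye_az):
    """Vector subtraction: target location with respect to the eye
    equals head-centered target azimuth minus eye position."""
    return target_head_az - eye_az

def rate_to_place(value, prefs=range(-90, 91, 15), width=20.0):
    """Convert the rate-coded result into a place code: an array of
    units, each most active when `value` matches its preference.
    Tuning parameters here are hypothetical."""
    return [math.exp(-0.5 * ((value - p) / width) ** 2) for p in prefs]

# A sound 24° right of the head, with the eyes 10° right, lies
# 14° right of the line of sight:
eye_centered = head_to_eye_centered(target_head_az=24.0, eye_az=10.0)
activity = rate_to_place(eye_centered)  # peaks at the 15° unit
```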