Article

Dopamine: Burning the Candle at Both Ends

Authors:
Patrick J. Mineault and Christopher C. Pack

Abstract

Dopamine neurons are well known for signaling reward-prediction errors. In this issue, Matsumoto and Takada (2013) show that some dopamine neurons also signal salient events during progression through a visual search task requiring working memory and sustained attention.


... Here we have re-examined the behavioral effects of a dopaminergic neuron (DAN) lesion induced by 6-hydroxydopamine (6-OHDA) in neonatal mice, a purported model of ADHD, from a bottom-up perspective. In line with current views emphasizing the role of DA signals in orienting behavior towards the discovery of information (Bromberg-Martin and Hikosaka, 2009; Dolan and Dayan, 2013; Pearson and Platt, 2013), we focused on the exploration and exploitation of information-rich novel environments rather than on conditions of instrumental learning favoring model-free cognitive control (Dolan and Dayan, 2013). Since information is rewarding by itself (Bromberg-Martin and Hikosaka, 2009), we speculate that developing without DANs leads to the acquisition of an imbalanced exploration-exploitation strategy, with enduring consequences on behavior. ...
... Moreover, recent studies have shown that information is rewarding for animals and that DANs encode advance information about rewards as if it were a primary reward (Bromberg-Martin and Hikosaka, 2009). Thus, exploitation and search seem to share a common drive, the salience of primary (e.g., food) and abstract (e.g., information) rewards (Pearson and Platt, 2013). Accordingly, DAN-lesioned mice seem to have a general deficiency in processing salience, as they fail both to exploit and to forage for information in novel settings. ...
Article
Findings showing that neonatal lesions of the forebrain dopaminergic system in rodents lead to juvenile locomotor hyperactivity and learning deficits have been taken as evidence of face validity for attention deficit hyperactivity disorder. But the core cognitive and physiological intermediate phenotypes underlying this rodent syndrome remain unknown. Here we show that early postnatal dopaminergic lesions cause long-lasting deficits in exploitation of shelter, social and nutritional resources, and an imbalanced exploratory behavior, where non-directed local exploration is exacerbated while sophisticated search behaviors involving sequences of goal-directed actions are degraded. Importantly, some behavioral deficits do not diminish after adolescence but instead worsen or mutate, particularly those related to the exploration of wide and spatially complex environments. In vivo electrophysiological recordings and morphological reconstructions of striatal medium spiny neurons reveal corticostriatal alterations associated with the behavioral phenotype. More specifically, an attenuation of corticostriatal functional connectivity affecting medial prefrontal inputs more markedly than cingulate and motor inputs is accompanied by a contraction of the dendritic arbor of striatal projection neurons in this animal model. Thus, dopaminergic neurons are essential during postnatal development for the functional and structural maturation of corticostriatal connections. From a bottom-up viewpoint, our findings suggest that neuropsychiatric conditions presumably linked to developmental alterations of the dopaminergic system should be evaluated for deficits in foraging decision making, alterations in the recruitment of corticostriatal circuits during foraging tasks, and structural disorganization of the frontostriatal connections. Neuropsychopharmacology accepted article preview online, 15 April 2015. doi:10.1038/npp.2015.104.
... Advanced data analyses from these complex tasks reveal inclusion of model-based temporal predictions into the dopamine error response (382) and compatibility with TD models (144). Variations in stimulus coherence (389) and visual search performance (349) result in graded reward probabilities that induce reward prediction errors that are coded by dopamine neurons, although the error coding may not be easily apparent (426). Thus, in all these high-order cognitive tasks, the phasic dopamine response tracks only, and reliably, the reward prediction error. ...
Article
Rewards are crucial objects that induce learning, approach behavior, choices, and emotions. Whereas emotions are difficult to investigate in animals, the learning function is mediated by neuronal reward prediction error signals which implement basic constructs of reinforcement learning theory. These signals are found in dopamine neurons, which emit a global reward signal to striatum and frontal cortex, and in specific neurons in striatum, amygdala, and frontal cortex projecting to select neuronal populations. The approach and choice functions involve subjective value, which is objectively assessed by behavioral choices eliciting internal, subjective reward preferences. Utility is the formal mathematical characterization of subjective value and a prime decision variable in economic choice theory. It is coded as utility prediction error by phasic dopamine responses. Utility can incorporate various influences, including risk, delay, effort, and social interaction. Appropriate for formal decision mechanisms, rewards are coded as object value, action value, difference value, and chosen value by specific neurons. Although all reward, reinforcement, and decision variables are theoretical constructs, their neuronal signals constitute measurable physical implementations and as such confirm the validity of these concepts. The neuronal reward signals provide guidance for behavior while constraining the free will to act. Copyright © 2015 the American Physiological Society.
Article
In this issue of Neuron, Li et al. (2013) show that transgenically eliminating thalamocortical neurotransmission disrupts the formation of barrel columns in the somatosensory cortex and cortical lamination, providing evidence for the importance of extrinsic activity-dependent factors in cortical development.
Article
Full-text available
Although cells in many brain regions respond to reward, the cortical-basal ganglia circuit is at the heart of the reward system. The key structures in this network are the anterior cingulate cortex, the orbital prefrontal cortex, the ventral striatum, the ventral pallidum, and the midbrain dopamine neurons. In addition, other structures, including the dorsal prefrontal cortex, amygdala, hippocampus, thalamus, and lateral habenular nucleus, and specific brainstem structures such as the pedunculopontine nucleus, and the raphe nucleus, are key components in regulating the reward circuit. Connectivity between these areas forms a complex neural network that mediates different aspects of reward processing. Advances in neuroimaging techniques allow better spatial and temporal resolution. These studies now demonstrate that human functional and structural imaging results map increasingly close to primate anatomy.
Article
Full-text available
Increase of extracellular dopamine in primate prefrontal cortex during a working memory task. J. Neurophysiol. 78: 2795–2798, 1997. The dopamine innervation of the prefrontal cortex is importantly involved in cognitive processes, such as those tested in working memory tasks. However, there have been no studies directly investigating prefrontal dopamine levels in relation to cognitive processes. We measured frontal extracellular dopamine concentration using in vivo microdialysis in monkeys performing a delayed alternation task, a typical working memory paradigm, and a sensory-guided control task. We observed a significant increase in dopamine level in the delayed alternation task as compared both with the sensory-guided control task and the basal resting level. The increase was seen in the dorsolateral prefrontal but not in the arcuate or orbitofrontal areas. The increase appeared to reflect the working memory component of the task and was observed mainly in the lip areas of the principal sulcus. Although there was no significant difference in dopamine level between the delayed alternation and sensory-guided control tasks in the premotor area, significant increases in dopamine concentration were observed during both tasks as compared with the basal resting level, indicating the importance of premotor dopamine for the motor response itself.
Article
Full-text available
While it has previously been assumed that mesolimbic dopamine neurons carry a reward signal, recent data from single-unit, microdialysis and voltammetry studies suggest that these neurons respond to a large category of salient and arousing events, including appetitive, aversive, high intensity, and novel stimuli. Elevations in dopamine release within mesolimbic, mesocortical and nigrostriatal target sites coincide with arousal, and the increase in dopamine activity within target sites modulates a number of behavioral functions. However, because dopamine neurons respond to a category of salient events that extend beyond that of reward stimuli, dopamine levels are not likely to code for the reward value of encountered events. The paper (i) examines evidence showing that dopamine neurons respond to salient and arousing change in environmental conditions, regardless of the motivational valence of that change, and (ii) asks how this might shape our thinking about the role of dopamine systems in goal-directed behavior.
Article
Full-text available
An influential concept in contemporary computational neuroscience is the reward prediction error hypothesis of phasic dopaminergic function. It maintains that midbrain dopaminergic neurons signal the occurrence of unpredicted reward, which is used in appetitive learning to reinforce existing actions that most often lead to reward. However, the availability of limited afferent sensory processing and the precise timing of dopaminergic signals suggest that they might instead have a central role in identifying which aspects of context and behavioural output are crucial in causing unpredicted events.
Article
Dopamine is essential to cognitive functions. However, despite abundant studies demonstrating that dopamine neuron activity is related to reinforcement and motivation, little is known about what signals dopamine neurons convey to promote cognitive processing. We therefore examined dopamine neuron activity in monkeys performing a delayed matching-to-sample task that required working memory and visual search. We found that dopamine neurons responded to task events associated with cognitive operations. A subset of dopamine neurons were activated by visual stimuli if the monkey had to store the stimuli in working memory. These neurons were located dorsolaterally in the substantia nigra pars compacta, whereas ventromedial dopamine neurons, some in the ventral tegmental area, represented reward prediction signals. Furthermore, dopamine neurons monitored visual search performance, becoming active when the monkey made an internal judgment that the search was successfully completed. Our findings suggest an anatomical gradient of dopamine signals along the dorsolateral-ventromedial axis of the ventral midbrain.
Article
Whereas reward (appetitiveness) and aversiveness (punishment) have been distinguished as two discrete dimensions within psychology and behavior, physiological and computational models of their neural representation have treated them as opposite sides of a single continuous dimension of “value.” Here, I show that although dopamine neurons of the primate ventral midbrain are activated by evidence for reward and suppressed by evidence against reward, they are insensitive to aversiveness. This indicates that reward and aversiveness are represented independently as two dimensions, even by neurons that are closely related to motor function. Because theory and experiment support the existence of opponent neural representations for value, the present results imply four types of value-sensitive neurons corresponding to reward-ON (dopamine), reward-OFF, aversive-ON, and aversive-OFF.
Article
The capacity to predict future events permits a creature to detect, model, and manipulate the causal structure of its interactions with its environment. Behavioral experiments suggest that learning is driven by changes in the expectations about future salient events such as rewards and punishments. Physiological work has recently complemented these studies by identifying dopaminergic neurons in the primate whose fluctuating output apparently signals changes or errors in the predictions of future salient and rewarding events. Taken together, these findings can be understood through quantitative theories of adaptive optimizing control.
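The prediction-error signal described above has a standard formalization in temporal-difference (TD) learning, where the error is the difference between received and predicted reward. As a minimal sketch (the learning rate, discount factor, and two-state task below are illustrative assumptions, not details from the paper), a cue that reliably precedes reward gradually absorbs the prediction, and the error at reward delivery shrinks toward zero:

```python
# Minimal TD(0) sketch of a reward-prediction error (delta).
# alpha (learning rate), gamma (discount), and the two-state cue->reward
# task are illustrative choices, not taken from the source article.

def td_update(value, reward, next_value, alpha=0.1, gamma=0.9):
    """One TD(0) step: delta = r + gamma * V(s') - V(s)."""
    delta = reward + gamma * next_value - value
    return value + alpha * delta, delta

# State 0: predictive cue. State 1: reward delivery (terminal afterwards).
values = [0.0, 0.0]
for trial in range(200):
    values[0], delta_cue = td_update(values[0], 0.0, values[1])  # cue step
    values[1], delta_rew = td_update(values[1], 1.0, 0.0)        # reward step

# After learning: the cue carries the (discounted) prediction, and the
# error at reward time has vanished -- the hallmark dopamine-like profile.
print(round(values[0], 2), round(values[1], 2), round(delta_rew, 3))
```

This reproduces the qualitative result the abstract describes: fluctuating output that signals changes or errors in predictions, not reward per se.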
Article
What roles do mesolimbic and neostriatal dopamine systems play in reward? Do they mediate the hedonic impact of rewarding stimuli? Do they mediate hedonic reward learning and associative prediction? Our review of the literature, together with results of a new study of residual reward capacity after dopamine depletion, indicates the answer to both questions is `no'. Rather, dopamine systems may mediate the incentive salience of rewards, modulating their motivational value in a manner separable from hedonia and reward learning. In a study of the consequences of dopamine loss, rats were depleted of dopamine in the nucleus accumbens and neostriatum by up to 99% using 6-hydroxydopamine. In a series of experiments, we applied the `taste reactivity' measure of affective reactions (gapes, etc.) to assess the capacity of dopamine-depleted rats for: 1) normal affect (hedonic and aversive reactions), 2) modulation of hedonic affect by associative learning (taste aversion conditioning), and 3) hedonic enhancement of affect by non-dopaminergic pharmacological manipulation of palatability (benzodiazepine administration). We found normal hedonic reaction patterns to sucrose vs. quinine, normal learning of new hedonic stimulus values (a change in palatability based on predictive relations), and normal pharmacological hedonic enhancement of palatability. We discuss these results in the context of hypotheses and data concerning the role of dopamine in reward. We review neurochemical, electrophysiological, and other behavioral evidence. We conclude that dopamine systems are not needed either to mediate the hedonic pleasure of reinforcers or to mediate predictive associations involved in hedonic reward learning. We conclude instead that dopamine may be more important to incentive salience attributions to the neural representations of reward-related stimuli. Incentive salience, we suggest, is a distinct component of motivation and reward. 
In other words, dopamine systems are necessary for `wanting' incentives, but not for `liking' them or for learning new `likes' and `dislikes'.
Article
The prefrontal cortex is thought to modulate sensory signals in posterior cortices during top-down attention, but little is known about the underlying neural circuitry. Experimental and clinical evidence indicate that prefrontal dopamine has an important role in cognitive functions, acting predominantly through D1 receptors. Here we show that dopamine D1 receptors mediate prefrontal control of signals in the visual cortex of macaques (Macaca mulatta). We pharmacologically altered D1-receptor-mediated activity in the frontal eye field of the prefrontal cortex and measured the effect on the responses of neurons in area V4 of the visual cortex. This manipulation was sufficient to enhance the magnitude, the orientation selectivity and the reliability of V4 visual responses to an extent comparable with the known effects of top-down attention. The enhancement of V4 signals was restricted to neurons with response fields overlapping the part of visual space affected by the D1 receptor manipulation. Altering either D1- or D2-receptor-mediated frontal eye field activity increased saccadic target selection but the D2 receptor manipulation did not enhance V4 signals. Our results identify a role for D1 receptors in mediating the control of visual cortical signals by the prefrontal cortex and suggest how processing in sensory areas could be altered in mental disorders involving prefrontal dopamine.
Article
Midbrain dopamine neurons are well known for their strong responses to rewards and their critical role in positive motivation. It has become increasingly clear, however, that dopamine neurons also transmit signals related to salient but nonrewarding experiences such as aversive and alerting events. Here we review recent advances in understanding the reward and nonreward functions of dopamine. Based on these data, we propose that dopamine neurons come in multiple types that are connected with distinct brain networks and have distinct roles in motivational control. Some dopamine neurons encode motivational value, supporting brain networks for seeking, evaluation, and value learning. Others encode motivational salience, supporting brain networks for orienting, cognition, and general motivation. Both types of dopamine neurons are augmented by an alerting signal involved in rapid detection of potentially important sensory cues. We hypothesize that these dopaminergic pathways for value, salience, and alerting cooperate to support adaptive behavior.
Article
Midbrain dopamine neurons are activated by reward or sensory stimuli predicting reward. These excitatory responses increase as the reward value increases. This response property has led to a hypothesis that dopamine neurons encode value-related signals and are inhibited by aversive events. Here we show that this is true only for a subset of dopamine neurons. We recorded the activity of dopamine neurons in monkeys (Macaca mulatta) during a Pavlovian procedure with appetitive and aversive outcomes (liquid rewards and airpuffs directed at the face, respectively). We found that some dopamine neurons were excited by reward-predicting stimuli and inhibited by airpuff-predicting stimuli, as the value hypothesis predicts. However, a greater number of dopamine neurons were excited by both of these stimuli, inconsistent with the hypothesis. Some dopamine neurons were also excited by both rewards and airpuffs themselves, especially when they were unpredictable. Neurons excited by the airpuff-predicting stimuli were located more dorsolaterally in the substantia nigra pars compacta, whereas neurons inhibited by the stimuli were located more ventromedially, some in the ventral tegmental area. A similar anatomical difference was observed for their responses to actual airpuffs. These findings suggest that different groups of dopamine neurons convey motivational signals in distinct manners.
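The two response profiles reported above can be summarized in a toy contrast (the function names and the +1/-1 valence coding are illustrative assumptions, not the authors' analysis): value-coding neurons sign their response by motivational valence, while salience-coding neurons respond to magnitude regardless of valence.

```python
# Toy contrast of the two dopamine response profiles described above.
# The +1/-1 valence convention and function names are illustrative only.

def value_coding_response(valence):
    """Ventromedial-type: excited by reward cues (+), inhibited by aversive (-)."""
    return valence

def salience_coding_response(valence):
    """Dorsolateral-type: excited by both rewarding and aversive cues."""
    return abs(valence)

reward_cue, airpuff_cue = 1.0, -1.0
print(value_coding_response(airpuff_cue))     # suppressed (negative response)
print(salience_coding_response(airpuff_cue))  # activated (positive response)
```

The point of the sketch is simply that a population excited by both outcome types is inconsistent with a single signed "value" axis, which is the abstract's central observation.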
Article
The prefrontal cortex is involved in the cognitive process of working memory. Local injections of SCH23390 and SCH39166, selective antagonists of the D1 dopamine receptor, into the prefrontal cortex of rhesus monkeys induced errors and increased latency in performance on an oculomotor task that required memory-guided saccades. The deficit was dose-dependent and sensitive to the duration of the delay period. These D1 antagonists had no effect on performance in a control task requiring visually guided saccades, indicating that sensory and motor functions were unaltered. Thus, D1 dopamine receptors play a selective role in the mnemonic, predictive functions of the primate prefrontal cortex.
Article
Dopamine has been implicated in the cognitive process of working memory but the cellular basis of its action has yet to be revealed. By combining iontophoretic analysis of dopamine receptors with single-cell recording during behaviour, we found that D1 antagonists can selectively potentiate the 'memory fields' of prefrontal neurons which subserve working memory. The precision shown for D1 receptor modulation of mnemonic processing indicates a direct gating of selective excitatory synaptic inputs to prefrontal neurons during cognition.
Article
1. To examine the role of dopamine receptors in the prefrontal cortex (PFC) on working memory, we injected dopamine antagonists (SCH23390, SCH39166, haloperidol, sulpiride, and raclopride) locally into the dorsolateral PFC in two monkeys trained to perform an oculomotor delayed-response (ODR) task. In the ODR task, monkeys fixate a central spot on a cathode ray tube (CRT) monitor while a visual cue is briefly (300 ms) presented in one of several peripheral locations in the visual field. After a delay of 1.5-6 s, the fixation spot is turned off, instructing the monkey to move its eyes to the target location that had been indicated by the visuospatial cue before the delay. Each monkey also performed a control task in which the cue remained on during the delay period. In this task the monkey's response was sensory rather than memory guided. 2. Local intracerebral injection of the selective dopamine antagonists SCH23390 (10-80 micrograms) and SCH39166 (1-5 micrograms) and/or the nonselective dopamine antagonist haloperidol (10-100 micrograms) induced deficits in ODR task performance at a total of 22 sites in the dorsolateral PFC. The deficit was characterized by a decrease in the accuracy of the memory-guided saccade as well as an increase in the latency of the response. The deficit usually appeared within 1-3 min after the injection, reached a peak at 20-40 min, and recovered at 60-90 min. 3. Performance change was restricted to a few specific target locations, which varied with the injection site and were most often contralateral to the injection site. 4. The degree of impairment in the ODR task occasioned by the injection of the dopamine antagonists was sensitive to the duration of delay; longer delays were associated with larger decreases in the accuracy and delayed onset of the memory-guided saccade. 5. The deficit was dose dependent; higher doses induced larger errors and increases in the onset of the memory-guided saccade. 6. Dopamine antagonists did not affect performance on the control task, which required the same eye movements but was sensory guided. Thus, in the same experimental session in which ODR performance was impaired, the accuracy and the latency of the sensory-guided saccades were normal for every target location. (ABSTRACT TRUNCATED AT 400 WORDS)
Article
Two young adult monkeys (Macaca mulatta) were trained to perform a delayed-response task that required the monkeys to remember a cued spatial position (left or right) over a delay interval and then to make a response to the cued position. Local injection of the alpha 2-adrenergic antagonist yohimbine (10 micrograms in 2 microliters saline) into the dorsolateral prefrontal cortex (Walker's area 46 and area 9) impaired performance of the delayed-response task, but had no effect on performance if there was no delay between the cue and choice signals. The main error after injection of yohimbine was that the monkeys responded to the uncued position at a higher rate. Local injection of the alpha 1-adrenergic antagonist prazosin (10 micrograms in 2 microliters saline) or the beta-adrenergic antagonist propranolol (10 micrograms in 2 microliters saline) into the same cortical areas had no significant effect on task performance. The present study suggests that prefrontal alpha 2-adrenoceptors play an important role in spatial working memory in young adult monkeys.
Article
Recent neurophysiological studies reveal that neurons in certain brain structures carry specific signals about past and future rewards. Dopamine neurons display a short-latency, phasic reward signal indicating the difference between actual and predicted rewards. The signal is useful for enhancing neuronal processing and learning behavioral reactions. It is distinctly different from dopamine's tonic enabling of numerous behavioral processes. Neurons in the striatum, frontal cortex, and amygdala also process reward information but provide more differentiated information for identifying and anticipating rewards and organizing goal-directed behavior. The different reward signals have complementary functions, and the optimal use of rewards in voluntary behavior would benefit from interactions between the signals. Addictive psychostimulant drugs may exert their action by amplifying the dopamine reward signal.
References

Berridge, K.C., and Robinson, T.E. (1998). Brain Res. Brain Res. Rev. 28, 309–369.
Bromberg-Martin, E.S., Matsumoto, M., and Hikosaka, O. (2010). Neuron 68, 815–834.
Fiorillo, C.D. (2013). Science 341, 546–549.
Haber, S.N., and Knutson, B. (2010). Neuropsychopharmacology 35, 4–26.
Li, B.-M., and Mei, Z.-T. (1994). Behav. Neural Biol. 62, 134–139.
Matsumoto, M., and Hikosaka, O. (2009). Nature 459, 837–841.
Matsumoto, M., and Takada, M. (2013). Neuron 79, this issue, 1011–1024.
Mineault, P.J., and Pack, C.C. (2013). Neuron 79, this issue.
Noudoost, B., and Moore, T. (2011). Nature 474, 372–375.
Redgrave, P., and Gurney, K. (2006). Nat. Rev. Neurosci. 7, 967–975.
Sawaguchi, T., and Goldman-Rakic, P.S. (1991). Science 251, 947–950.
Schultz, W., Dayan, P., and Montague, P.R. (1997). Science 275, 1593–1599.