Article

Abstract

We thank Corlett for his thought-provoking response to our recent article. Corlett shares our concerns about inconsistencies in theories of perceptual prediction and highlights reminiscent debates in learning theory. He also proposes why perceptual prediction mechanisms may operate differently in the domain of action relative to other domains of sensory cognition. Here, we highlight how we share the conviction that dialogue across disciplines will inform both models of perception and learning, but clarify that important distinctions between the explananda mean the theoretical puzzles are not reducible to each other. We also question whether action prediction mechanisms do indeed operate differently.


... However, while these accounts have been influential for the last two decades, evidence that attenuation results from predictive mechanisms is sparse, especially considering the number of non-predictive mechanisms known to influence perception during action (Press et al., 2020a, 2020b; Press & Cook, 2015; Seki & Fetz, 2012). For this reason, in a recent study Kilteni et al. (2019) aimed to determine whether the attenuating influence of action on tactile perception in fact reflected the operation of predictive mechanisms. ...
... perception according to event repetition (see Press et al., 2020a, 2020b, for further discussion). ...
Preprint
Full-text available
It is widely believed that predicted tactile action outcomes are perceptually attenuated. The present experiments determined whether predictive mechanisms always generate attenuation, or instead can enhance perception, as typically observed in sensory cognition domains outside of action. We manipulated probabilistic expectations in a paradigm often used to demonstrate tactile attenuation. Participants produced actions and subsequently rated the intensity of forces on a passive finger. Experiment 1 confirmed previous findings that action outcomes are perceived less intensely than passive stimulation, but demonstrated more intense perception when active finger stimulation was removed. Experiments 2 and 3 manipulated prediction explicitly and found that expected touch during action is perceived more intensely than unexpected touch. Computational modelling suggested that expectations increase the gain afforded to expected tactile signals. These findings challenge a central tenet of prominent motor control theories and demonstrate that sensorimotor predictions do not exhibit a qualitatively distinct influence on tactile perception.

Statement of Relevance: Perception of expected action outcomes is thought to be attenuated. Such a mechanism may be adaptive because surprising inputs are more useful, e.g., signalling the need to take new courses of action, and is thought to explain why we cannot tickle ourselves and unusual aspects of action and awareness in clinical populations. However, theories outside of action purport that predicted events are perceptually facilitated, allowing us to generate largely accurate representations of our noisy sensory world. We do not know whether action predictions really alter perception differently from other predictions because different manipulations have been performed. Here we perform similar manipulations and demonstrate that action predictions can enhance, rather than attenuate, touch. We thereby demonstrate that action predictions may not have a qualitatively distinct influence on perception, such that we must re-examine theories concerning how predictions influence perception across domains and clinical theories based upon their assumptions.
... In sum, the current findings demonstrate a role for the hippocampus in both acquiring and exploiting predictive associations, bridging the fields of learning and perception. These fields have separately made progress in investigating the roles of prediction, novelty and uncertainty1,52, but have until now largely remained segregated literatures, despite great promise to inform one another68,94. Ultimately, weighting predictions and errors according to their reliability is crucial to optimally perceive and engage with our environment, and the current findings suggest that the hippocampus plays a crucial role in this process. ...
Article
Full-text available
We constantly exploit the statistical regularities in our environment to help guide our perception. The hippocampus has been suggested to play a pivotal role both in learning environmental statistics and in exploiting them to generate perceptual predictions. However, it is unclear how the hippocampus balances encoding new predictive associations with the retrieval of existing ones. Here, we present the results of two high-resolution human fMRI studies (N = 24 for both experiments) directly investigating this. Participants were exposed to auditory cues that predicted the identity of an upcoming visual shape (with 75% validity). Using multivoxel decoding analysis, we find that the hippocampus initially preferentially represents unexpected shapes (i.e., those that violate the cue regularities), but later switches to representing the cue-predicted shape regardless of which was actually presented. These findings demonstrate that the hippocampus is involved in both acquiring and exploiting predictive associations, and is dominated by either errors or predictions depending on whether learning is ongoing or complete. Successfully exploiting the regularities in our environment requires balancing the encoding of new information with the retrieval of stored associations. Here, the authors show that the hippocampus switches from representing novel information (errors) to representing predictions as learning proceeds.
... Finally, a recent theoretical proposal suggests that action prediction should increase the perceived intensity of expected effects (e.g., the self-generated test tap), whereas secondary processes increase the intensity of subsequent events that are surprising (Press et al., 2020a). Accordingly, the attenuation of self-generated touch reflects generalized sensory suppression effects during movement rather than motor predictions, and/or it is a postdictive process that occurs after the presentation of the stimulus (Press et al., 2020b). Therefore, one could argue that forward models are engaged during action observation, but we do not observe sensory attenuation because the attenuation is not related to the action prediction. ...
Article
Full-text available
The discovery of mirror neurons in the macaque brain in the 1990s triggered investigations on putative human mirror neurons and their potential functionality. The leading proposed function has been action understanding: accordingly, we understand the actions of others by ‘simulating’ them in our own motor system through a direct matching of the visual information to our own motor programs. Furthermore, it has been proposed that this simulation involves the prediction of the sensory consequences of the observed action, similar to the prediction of the sensory consequences of our executed actions. Here, we tested this proposal by quantifying somatosensory attenuation behaviorally during action observation. Somatosensory attenuation manifests during voluntary action and refers to the perception of self‐generated touches as less intense than identical externally generated touches because the self‐generated touches are predicted from the motor command. Therefore, we reasoned that if an observer simulates the observed action and, thus, he/she predicts its somatosensory consequences, then he/she should attenuate tactile stimuli simultaneously delivered to his/her corresponding body part. In three separate experiments, we found a systematic attenuation of touches during executed self‐touch actions, but we found no evidence for attenuation when such actions were observed. Failure to observe somatosensory attenuation during observation of self‐touch is not compatible with the hypothesis that the putative human mirror neuron system automatically predicts the sensory consequences of the observed action. In contrast, our findings emphasize a sharp distinction between the motor representations of self and others.
... Moreover, previous results indicated that action predictions and sensory visual-auditory predictions lead to comparable effects, indicating that the two information types feed into a common generative model, despite presumably having different brain sources59. This would confirm that action predictions are not qualitatively different from sensory predictions20,21 and that they are likely to stem from more general cognitive processes, rather than from specific information coming from the motor system19. Nevertheless, while most studies focus on identity predictions8, additional decisions regarding when and whether to act, which in turn modulate the sensory processing of action effects as well60,61, might still make action-related predictions functionally distinctive from other types of sensory predictions. ...
Article
Full-text available
Our brains continuously build and update predictive models of the world, sources of prediction being drawn for example from sensory regularities and/or our own actions. Yet, recent results in the auditory system indicate that stochastic regularities may not be easily encoded when a rare medium pitch deviant is presented between frequent high and low pitch standard sounds in random order, as reflected in the lack of sensory prediction error event-related potentials [i.e., mismatch negativity (MMN)]. We wanted to test the implication of the predictive coding theory that predictions based on higher-order generative models—here, based on action intention, are fed top-down in the hierarchy to sensory levels. Participants produced random sequences of high and low pitch sounds by button presses in two conditions: In a “specific” condition, one button produced high and the other low pitch sounds; in an “unspecific” condition, both buttons randomly produced high or low-pitch sounds. Rare medium pitch deviants elicited larger MMN and N2 responses in the “specific” compared to the “unspecific” condition, despite equal sound probabilities. These results thus demonstrate that action-effect predictions can boost stochastic regularity-based predictions and engage higher-order deviance detection processes, extending previous notions on the role of action predictions at sensory levels.
... Third, several of the fusion tests in auditory cortex were trending but not significant. Summation of opposing processes (Press, Kok, & Yon, 2020a, 2020b; Yon & Press, 2017) may have reduced the apparent strength of these fusion effects: neural signatures of an early fusion process may have been counteracted by later, surprise-related enhancement of conflicting cue information. This hypothesis could be explicitly tested using methods with high temporal resolution, such as magnetoencephalography. Fusion effects may also have been dampened by unlearning of the predictive cues from the preponderance of incongruent trials during scanning. ...
Article
Humans perceive expected stimuli faster and more accurately. However, the mechanism behind the integration of expectations with sensory information during perception remains unclear. We investigated the hypothesis that such integration depends on "fusion"-the weighted averaging of different cues informative about stimulus identity. We first trained participants to map a range of tones onto faces spanning a male-female continuum via associative learning. These two features served as expectation and sensory cues to sex, respectively. We then tested specific predictions about the consequences of fusion by manipulating the congruence of these cues in psychophysical and fMRI experiments. Behavioral judgments and patterns of neural activity in auditory association regions revealed fusion of sensory and expectation cues, providing evidence for a precise computational account of how expectations influence perception.
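The "fusion" computation described in this abstract, weighted averaging of an expectation cue and a sensory cue, can be illustrated with standard inverse-variance (reliability) weighting of two Gaussian cues. This is a minimal sketch of the general principle, not the authors' implementation; the function name and parameters are ours.

```python
def fuse_cues(mu_sensory, var_sensory, mu_expected, var_expected):
    """Inverse-variance weighted average of a sensory cue and an expectation cue.

    Each cue is summarised as a Gaussian (mean, variance); the more reliable
    (lower-variance) cue receives the larger weight, and the fused estimate
    is more precise than either cue alone.
    """
    precision_s = 1.0 / var_sensory
    precision_e = 1.0 / var_expected
    w_sensory = precision_s / (precision_s + precision_e)
    mu_fused = w_sensory * mu_sensory + (1.0 - w_sensory) * mu_expected
    var_fused = 1.0 / (precision_s + precision_e)
    return mu_fused, var_fused
```

With equally reliable cues the fused percept falls midway between them; making one cue more reliable pulls the estimate toward it, which is the signature behavioral prediction of a fusion account.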
... Most notably, many of the studies compare perception of "predicted" self-generated events with perception of "unpredicted" sensory events generated by external sources while participants themselves remain passive, or where the sensory and motor events overlap less due to temporal misalignment (Blakemore, Frith, & Wolpert, 1999). This comparison perhaps confounds the operation of expectation mechanisms with that of other processes (see Press, Kok, & Yon, 2020a). For example, if conceptualizing action as an additional task, classic working memory models would hypothesize reduced sensory processing when events are presented in combination with action (Baddeley, 1996). ...
Article
Full-text available
We predict how our actions will influence the world around us. Prevailing models in the action control literature propose that we use these predictions to suppress or “cancel” perception of expected action outcomes, to highlight more informative surprising events. However, contrasting normative Bayesian models in sensory cognition suggest that we are more, not less, likely to perceive what we expect—given that what we expect is more likely to occur. Here we adjudicated between these models by investigating how expectations influence perceptual decisions about action outcomes in a signal detection paradigm. Across three experiments, participants performed one of two manual actions that were sometimes accompanied by brief presentation of expected or unexpected visual outcomes. Contrary to dominant cancellation models but consistent with Bayesian accounts, we found that observers were biased to report the presence of expected action outcomes. There were no effects of expectation on sensitivity. Computational modeling revealed that the action-induced bias reflected a sensory bias in how evidence was accumulated rather than a baseline shift in decision circuits. Expectation effects remained in Experiments 2 and 3 when orthogonal cues indicated which finger was more likely to be probed (i.e. task-relevant). These biases toward perceiving expected action outcomes are suggestive of a mechanism that would enable generation of largely veridical representations of our actions and their consequences in an inherently uncertain sensory world.
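The abstract's distinction between a bias to report expected outcomes and an unchanged sensitivity maps onto the criterion (c) and sensitivity (d′) of equal-variance signal detection theory. The following is a generic textbook sketch of those measures, not the authors' analysis code.

```python
from statistics import NormalDist

def sdt_measures(hit_rate, false_alarm_rate):
    """Equal-variance SDT: sensitivity d' and criterion c from hit and false-alarm rates."""
    z = NormalDist().inv_cdf  # probit (z) transform
    d_prime = z(hit_rate) - z(false_alarm_rate)
    # c < 0 indicates a liberal bias toward reporting "present"
    criterion = -0.5 * (z(hit_rate) + z(false_alarm_rate))
    return d_prime, criterion
```

An expectation effect of the kind reported, more "present" responses for expected action outcomes without a change in discrimination, would appear as a more negative c on expected trials at matched d′.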
... Altered proprioception could have produced a noisier signal, leading participants to consider it unreliable. A further possible cause of reduced reliability of proprioception might be the movement of the hand, which, through several mechanisms such as spinal gating (Press et al., 2020; Seki & Fetz, 2012), may have attenuated the sensory signal. In parallel with, or as an alternative to, an enhanced weight of vision, a reduced weight of proprioception during Bayesian multisensory integration could explain the presented results. ...
Article
Over a lifetime of experience, the representation of the body is built upon congruent integration of multiple elements constituting the sensorimotor loop. To investigate its robustness against the rupture of congruency between senses and with the motor command, we selectively manipulated in healthy subjects the bindings between sight, proprioception, and efferent motor command. Two experiments based on the Moving Hand Illusion were designed, employing the Tendon Vibration Illusion to modulate proprioception and generate illusory altered feedback of movement. In Experiment A, visuomotor congruency was modulated by introducing a delay between complex multifingered movements performed by a robotic hand and the real movement of each participant's hand. In the presence of the motor command, visuomotor congruency enhanced ownership, agency, and skin conductance, while proprioceptive-motor congruency was not effective, confirming the prevalence of vision over proprioception. In Experiment B, the impact of visuo-proprioceptive congruency was tested in the absence of the motor command because the robotic hand moved autonomously. Intersensory congruency compensated for the absence of the motor command only for ownership. Skin conductance in Experiment B and Proprioceptive Drift in both experiments did not change. Results suggest that ownership and agency are independently processed, and the presence of the efferent component modulates the salience of sensory feedback. The brain seems to require the integration of at least two streams of congruent information. Bodily awareness can be generated from sensory information alone, but to feel in charge of the body, the senses must be double-checked against the prediction generated from the efference copy, which is treated as an additional sensory modality.
... Most notably, many of the studies compare perception of 'predicted' self-generated events with perception of 'unpredicted' sensory events generated by external sources while participants themselves remain passive, or where the sensory and motor events overlap less due to temporal misalignment (Blakemore, Frith & Wolpert, 1999). This comparison perhaps confounds the operation of expectation mechanisms with that of other processes (see Press, Kok & Yon, 2020a). For example, if conceptualising action as an additional task, classic working memory models would hypothesise reduced sensory processing when events are presented in combination with action (Baddeley, 1996). ...
Preprint
Full-text available
We predict how our actions will influence the world around us. Prevailing models of action control propose that we use these predictions to suppress or ‘cancel’ perception of expected action outcomes. However, contrasting normative Bayesian models in sensory cognition suggest that top-down predictions bias observers toward perceiving what they expect. Here we adjudicated between these models by investigating how expectations influence perceptual decisions about briefly presented action outcomes. Contrary to dominant cancellation models, we found that observers’ perceptual decisions are biased toward the presence of outcomes congruent with their actions. Computational modelling revealed this action-induced bias reflected a bias in how sensory evidence was accumulated, rather than a baseline shift in decision circuits. In combination, these results reveal a gain control mechanism that can explain how we generate largely veridical representations of our actions and their consequences in an inherently uncertain sensory world.
Article
It is widely believed that predicted tactile action outcomes are perceptually attenuated. The present experiments determined whether predictive mechanisms necessarily generate attenuation or, instead, can enhance perception—as typically observed in sensory cognition domains outside of action. We manipulated probabilistic expectations in a paradigm often used to demonstrate tactile attenuation. Adult participants produced actions and subsequently rated the intensity of forces on a static finger. Experiment 1 confirmed previous findings that action outcomes are perceived less intensely than passive stimulation but demonstrated more intense perception when active finger stimulation was removed. Experiments 2 and 3 manipulated prediction explicitly and found that expected touch during action is perceived more intensely than unexpected touch. Computational modeling suggested that expectations increase the gain afforded to expected tactile signals. These findings challenge a central tenet of prominent motor control theories and demonstrate that sensorimotor predictions do not exhibit a qualitatively distinct influence on tactile perception.
Chapter
Somatosensory processing is about the development of bodily experience via a variety of sensations, including tactile sensations and sensations of position and motion. This chapter discusses recent neuroimaging findings regarding the brain mechanisms of oral somatosensory processing. It outlines the brain mechanisms associated with gustation. Especially, the chapter highlights the role of affective–motivational processing of food, an issue highly relevant to gustatory and oral functions. Following somatosensation and gustation, it focuses on how the cognitive–affective functions shape our experience of oral conditions. Specifically, the chapter outlines the current understanding of perception, attention, motivation and emotion from the perspective of cognitive neuroscience and highlights the association between oral sensorimotor functions and these cognitive–affective functions. The chapter also focuses on the issues of multisensory integration, and specifically, focuses on the current knowledge of multisensory integration related to oral functions and summarizes the relevant brain mechanisms.
Article
Full-text available
We build models of the world around us to guide perception and learning in the face of uncertainty. New evidence reveals a neurocomputational mechanism that links predictive processes across cognitive domains.
Article
Full-text available
The ability to represent and respond to uncertainty is fundamental to human cognition and decision-making. Noradrenaline (NA) is hypothesized to play a key role in coordinating the sensory, learning, and physiological states necessary to adapt to a changing world, but direct evidence for this is lacking in humans. Here, we tested the effects of attenuating noradrenergic neurotransmission on learning under uncertainty. We probed the effects of the β-adrenergic receptor antagonist propranolol (40 mg) using a between-subjects, double-blind, placebo-controlled design. Participants performed a probabilistic associative learning task, and we employed a hierarchical learning model to formally quantify prediction errors about cue-outcome contingencies and changes in these associations over time (volatility). Both unexpectedness and noise slowed down reaction times, but propranolol augmented the interaction between these main effects such that behavior was influenced more by prior expectations when uncertainty was high. Computationally, this was driven by a reduction in learning rates, with people slower to update their beliefs in the face of new information. Attenuating the global effects of NA also eliminated the phasic effects of prediction error and volatility on pupil size, consistent with slower belief updating. Finally, estimates of environmental volatility were predicted by baseline cardiac measures in all participants. Our results demonstrate that NA underpins behavioral and computational responses to uncertainty. These findings have important implications for understanding the impact of uncertainty on human biology and cognition.
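The reduced learning rates reported under propranolol can be illustrated with a simple delta-rule learner, a deliberately reduced stand-in for the hierarchical model used in the study; function names and parameter values here are illustrative only.

```python
def update_belief(belief, outcome, learning_rate):
    """Delta rule: shift the belief toward the outcome by learning_rate * prediction error."""
    prediction_error = outcome - belief
    return belief + learning_rate * prediction_error

def run_trials(outcomes, learning_rate, belief=0.5):
    """Track the belief about P(outcome = 1) across a sequence of binary outcomes."""
    trajectory = []
    for outcome in outcomes:
        belief = update_belief(belief, outcome, learning_rate)
        trajectory.append(belief)
    return trajectory
```

After a change in cue-outcome contingency, a lower learning rate leaves beliefs closer to the prior for longer, which is the computational signature of being slower to update beliefs in the face of new information.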
Article
Four experiments are reported that investigate the relationship between action-outcome learning and the ability to ignore distractors. Each participant performed 600 acquisition trials, followed by 200 test trials. In the acquisition phase, participants were presented with a fixed action-outcome contingency (e.g., Key #1 ➔ green distractors), while that contingency was reversed in the test phase. In Experiments 1-3, a distractor feature depended on the participants' action. In Experiment 1, actions determined the color of the distractors; in Experiment 2, they determined the target-distractor distance; in Experiment 3, they determined target-distractor compatibility. Results suggest that with the relatively simple features (color and distance), exposure to action-outcome contingencies changed distractor cost, whereas with the complex or relational feature (target-distractor compatibility), exposure to the contingencies did not affect distractor cost. In Experiment 4, the same pattern of results was found (effect of contingency learning on distractor cost) with perceptual sequence learning, using visual cues ("X" vs. "O") instead of actions. Thus, although the mechanism of associative learning may not be unique to actions, such learning plays a role in the allocation of attention to task-irrelevant events.
Article
Full-text available
Learning to anticipate future states of the world based on statistical regularities in the environment is a key component of perception and is vital for the survival of many organisms. Such statistical learning and prediction are crucial for acquiring language and music appreciation. Importantly, learned expectations can be implicitly derived from exposure to sensory input, without requiring explicit information regarding contingencies in the environment. Whereas many previous studies of statistical learning have demonstrated larger neuronal responses to unexpected versus expected stimuli, the neuronal bases of the expectations themselves remain poorly understood. Here we examined behavioral and neuronal signatures of learned expectancy via human scalp-recorded event-related brain potentials (ERPs). Participants were instructed to listen to a series of sounds and press a response button as quickly as possible upon hearing a target noise burst, which was either reliably or unreliably preceded by one of three pure tones in low-, mid-, and high-frequency ranges. Participants were not informed about the statistical contingencies between the preceding tone 'cues' and the target. Over the course of a stimulus block, participants responded more rapidly to reliably cued targets. This behavioral index of learned expectancy was paralleled by a negative ERP deflection, designated as a neuronal contingency response (CR), which occurred immediately prior to the onset of the target. The amplitude and latency of the CR were systematically modulated by the strength of the predictive relationship between the cue and the target. Re-averaging ERPs with respect to the latency of behavioral responses revealed no consistent relationship between the CR and the motor response, suggesting that the CR represents a neuronal signature of learned expectancy or anticipatory attention. 
Our results demonstrate that statistical regularities in an auditory input stream can be implicitly learned and exploited to influence behavior. Furthermore, we uncover a potential 'prediction signal' that reflects this fundamental learning process.
Article
Full-text available
In sensorimotor integration, the brain needs to decide how its predictions should accommodate novel evidence by ‘gating’ sensory data depending on the current context. Here, we examined the oscillatory correlates of this process by recording magnetoencephalography (MEG) data during a new task requiring action under intersensory conflict. We used virtual reality to decouple visual (virtual) and proprioceptive (real) hand postures during a task in which the phase of grasping movements tracked a target (in either modality). Thus, we rendered visual information either task-relevant or a (to-be-ignored) distractor. Under visuo-proprioceptive incongruence, occipital beta power decreased (relative to congruence) when vision was task-relevant but increased when it had to be ignored. Dynamic causal modelling (DCM) revealed that this interaction was best explained by diametrical, task-dependent changes in visual gain. These novel results suggest a crucial role for beta oscillations in the contextual gating (i.e., gain or precision control) of visual vs proprioceptive action feedback, depending on concurrent behavioral demands.
Article
Full-text available
Self-generated touch feels less intense and less ticklish than identical externally generated touch. This somatosensory attenuation occurs because the brain predicts the tactile consequences of our self-generated movements. To produce attenuation, the tactile predictions need to be time-locked to the movement, but how the brain maintains this temporal tuning remains unknown. Using a bimanual self-touch paradigm, we demonstrate that people can rapidly unlearn to attenuate touch immediately after their movement and learn to attenuate delayed touch instead, after repeated exposure to a systematic delay between the movement and the resulting touch. The magnitudes of the unlearning and learning effects are correlated and dependent on the number of trials that participants have been exposed to. We further show that delayed touches feel less ticklish and non-delayed touches more ticklish after exposure to the systematic delay. These findings demonstrate that the attenuation of self-generated touch is adaptive.
Article
Full-text available
Hallucinations, perceptions in the absence of objectively identifiable stimuli, illustrate the constructive nature of perception. Here, we highlight the role of prior beliefs as a critical elicitor of hallucinations. Recent empirical work from independent laboratories shows strong, overly precise priors can engender hallucinations in healthy subjects and that individuals who hallucinate in the real world are more susceptible to these laboratory phenomena. We consider these observations in light of work demonstrating apparently weak, or imprecise, priors in psychosis. Appreciating the interactions within and between hierarchies of inference can reconcile this apparent disconnect. Data from neural networks, human behavior, and neuroimaging support this contention. This work underlines the continuum from normal to aberrant perception, encouraging a more empathic approach to clinical hallucinations.
Article
Full-text available
Models of action control suggest that predicted action outcomes are “cancelled” from perception, allowing agents to devote resources to more behaviorally relevant unexpected events. These models are supported by a range of findings demonstrating that expected consequences of action are perceived less intensely than unexpected events. A key assumption of these models is that the prediction is subtracted from the sensory input. This early subtraction allows preferential processing of unexpected events from the outset of movement, thereby promoting rapid initiation of corrective actions and updating of predictive models. We tested this assumption in three psychophysical experiments. Participants rated the intensity (brightness) of observed finger movements congruent or incongruent with their own movements at different timepoints after action. Across Experiments 1 and 2, evidence of cancellation—whereby congruent events appeared less bright than incongruent events—was only found 200 ms after action, whereas an opposite effect of brighter congruent percepts was observed in earlier time ranges (50 ms after action). Experiment 3 demonstrated that this interaction was not a result of response bias. These findings suggest that “cancellation” may not be the rapid process assumed in the literature, and that perception of predicted action outcomes is initially “facilitated.” We speculate that the representation of our environment may in fact be optimized via two opposing processes: The primary process facilitates perception of events consistent with predictions and thereby helps us to perceive what is more likely, but a later process aids the perception of any detected events generating prediction errors to assist model updating.
Article
Full-text available
Active inference provides a simple and neurobiologically plausible account of how action and perception are coupled in producing (Bayes) optimal behaviour. This can be seen most easily as minimising prediction error: we can either change our predictions to explain sensory input through perception, or actively change sensory input to fulfil our predictions. In active inference, action is mediated by classical reflex arcs that minimise proprioceptive prediction error created by descending proprioceptive predictions. However, this creates a conflict between action and perception, in that self-generated movements require predictions to override the sensory evidence that one is not actually moving. Ignoring sensory evidence means that externally generated sensations will not be perceived; conversely, attending to (proprioceptive and somatosensory) sensations enables the detection of externally generated events but precludes the generation of actions. This conflict can be resolved by attenuating the precision of sensory evidence during movement or, equivalently, attending away from the consequences of self-made acts. We propose that this Bayes optimal withdrawal of precise sensory evidence during movement is the cause of psychophysical sensory attenuation. Furthermore, it explains the force-matching illusion and reproduces empirical results almost exactly. Finally, if attenuation is removed, the force-matching illusion disappears and false (delusional) inferences about agency emerge. This is important, given the negative correlation between sensory attenuation and delusional beliefs in normal subjects, and the reduction in the magnitude of the illusion in schizophrenia. Active inference therefore links the neuromodulatory optimisation of precision to sensory attenuation and illusory phenomena during the attribution of agency in normal subjects. It also provides a functional account of deficits in syndromes characterised by false inference and impaired movement, like schizophrenia and Parkinsonism, syndromes that implicate abnormal modulatory neurotransmission.
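The core move of the abstract above, that prediction error can be minimised either by revising beliefs (perception) or by changing the world (action), can be illustrated with a toy scalar model. Everything here (the variable names, the gradient-style updates, the single `precision` weight) is an illustrative simplification, not the paper's implementation:

```python
# Toy sketch of prediction-error minimisation in active inference.
# All names and update rules are illustrative simplifications.

def simulate(mode, precision=1.0, steps=50, lr=0.2):
    """Minimise precision-weighted prediction error by changing either
    the prediction (perception) or the sensory input (action)."""
    mu, s = 0.0, 1.0                   # prediction and sensed state disagree
    for _ in range(steps):
        err = precision * (s - mu)     # precision-weighted prediction error
        if mode == "perception":
            mu += lr * err             # revise the belief towards the evidence
        else:
            s -= lr * err              # act on the world to fulfil the prediction
    return mu, s

mu_p, _ = simulate("perception")       # belief moves towards the input
_, s_a = simulate("action")            # input moves towards the belief
```

Lowering `precision` slows belief updating in the same way for both routes, which is the toy analogue of the proposed attenuation: during movement, withdrawn sensory precision stops contradictory evidence from overwriting the prediction that drives the act.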
Article
Full-text available
All bodily movements stimulate peripheral receptors that activate neurons in the brain and spinal cord through afferent feedback. How these reafferent signals are processed within the CNS during movement is a key question in motor control. We investigated cutaneous sensory-evoked potentials in the spinal cord, primary somatosensory and motor cortex, and premotor cortex in monkeys performing an instructed delay task. Afferent inputs from cutaneous receptors were suppressed at several levels in a task-dependent manner. We found two types of suppression. First, suppression during active limb movement was observed in the spinal cord and all three cortical areas. This suppression was induced by both bottom-up and top-down gating mechanisms. Second, during preparation for upcoming movement, evoked responses were suppressed exclusively in the motor cortical areas and the magnitude of suppression was correlated with the reaction time of the subsequent movement. This suppression could be induced by a top-down gating mechanism to facilitate the preparation and execution of upcoming movement.
Article
From the noisy information bombarding our senses, our brains must construct percepts that are veridical (reflecting the true state of the world) and informative (conveying what we did not already know). Influential theories suggest that both challenges are met through mechanisms that use expectations about the likely state of the world to shape perception. However, current models explaining how expectations render perception either veridical or informative are mutually incompatible. While the former propose that perceptual experiences are dominated by events we expect, the latter propose that perception of expected events is suppressed. To solve this paradox, we propose a two-process model in which probabilistic knowledge initially biases perception towards what is likely and subsequently upweights events that are particularly surprising.
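The two-process proposal amounts to a time-dependent gain on sensory evidence: an early bias towards what is probable, and a later boost for whatever generated large prediction error. A minimal sketch of that idea, with the window boundary, gain magnitudes, and function name all chosen purely for illustration:

```python
# Hypothetical two-process perceptual gain: early bias towards likely
# events, later upweighting of surprising ones. All parameters are
# illustrative, not estimates from the model.

def perceptual_gain(p_expected, is_expected, t_ms):
    """Return a multiplicative gain on sensory evidence at time t_ms
    for an event that was expected with probability p_expected."""
    if t_ms < 150:
        # early window: sharpen perception of the likely event
        return 1.0 + (0.5 if is_expected else -0.5) * p_expected
    # later window: boost events in proportion to their surprise
    surprise = 1.0 - (p_expected if is_expected else 1.0 - p_expected)
    return 1.0 + surprise
```

With a strong expectation (`p_expected = 0.8`), the expected event wins the gain early while the surprising event wins it late, qualitatively matching the early facilitation and later "cancellation" pattern described in the first abstract above.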
Article
The human ability to anticipate the consequences that result from action is an essential building block for cognitive, emotional, and social functioning. A dominant view is that this faculty is based on motor predictions, in which a forward model uses a copy of the motor command to predict imminent sensory action-consequences. Although this account was originally conceived to explain the processing of action-outcomes that are tightly coupled to bodily movements, it has been increasingly extrapolated to effects beyond the body. Here, we critically evaluate this generalization and argue that, although there is ample evidence for the role of predictions in the processing of environment-related action-outcomes, there is hitherto little reason to assume that these predictions result from motor-based forward models.
Article
Uncertainty in various forms plagues our interactions with the environment. In a Bayesian statistical framework, optimal inference and prediction, based on unreliable observations in changing contexts, require the representation and manipulation of different forms of uncertainty. We propose that the neuromodulators acetylcholine and norepinephrine play a major role in the brain's implementation of these uncertainty computations. Acetylcholine signals expected uncertainty, coming from known unreliability of predictive cues within a context. Norepinephrine signals unexpected uncertainty, as when unsignaled context switches produce strongly unexpected observations. These uncertainty signals interact to enable optimal inference and learning in noisy and changeable environments. This formulation is consistent with a wealth of physiological, pharmacological, and behavioral data implicating acetylcholine and norepinephrine in specific aspects of a range of cognitive processes. Moreover, the model suggests a class of attentional cueing tasks that involve both neuromodulators and shows how their interactions may be part-antagonistic, part-synergistic.
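The distinction drawn above between expected uncertainty (known cue unreliability) and unexpected uncertainty (surprise signalling a context switch) can be made concrete with a toy cue-learning loop. The surprise measure, learning rate, and variable names below are illustrative choices, not the model's actual equations:

```python
import math

# Toy illustration of expected vs unexpected uncertainty in cue learning.
# "ACh-like" and "NE-like" labels follow the abstract's proposal;
# the update rule itself is an illustrative simplification.

def step(validity, matched, lr=0.1):
    """One observation: report expected and unexpected uncertainty for
    this trial, then update the estimated cue validity."""
    expected = 1.0 - validity                      # known unreliability (ACh-like)
    p = validity if matched else 1.0 - validity    # probability of what occurred
    surprise = -math.log(max(p, 1e-9))
    baseline = -math.log(max(validity, 1e-9))      # surprise of a typical valid trial
    unexpected = max(surprise - baseline, 0.0)     # surprise beyond expectation (NE-like)
    validity += lr * ((1.0 if matched else 0.0) - validity)
    return validity, expected, unexpected
```

For a reliable cue (`validity = 0.9`), a valid trial produces zero unexpected uncertainty, while a sudden invalid trial produces a large spike; if invalid trials persist, the learned validity falls, expected uncertainty rises, and further mismatches become progressively less "unexpected".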
Redgrave, P. and Gurney, K. (2006) The short-latency dopamine signal: a role in discovering novel actions? Nat. Rev. Neurosci. 7, 967–975
Yon, D. et al. (2018) Action sharpens sensory representations of expected outcomes. Nat. Commun. 9, 4288