
Patric Bach
- PhD
- Professor at University of Aberdeen
About
94 Publications
16,186 Reads
1,471 Citations
Introduction
I am Professor of Psychology at the University of Aberdeen and head of the Action Prediction Lab (https://www.actionprediction.org). I investigate how people control their own behavior and understand that of others.
Current institution
University of Aberdeen
Additional affiliations
May 2004 - August 2009
September 2009 - present
January 2005 - September 2009
Publications (94)
Social perception relies on the ability to understand the higher-order goals that drive other people’s behaviour. Under predictive coding views, this ability relies on a Bayesian-like hypothesis-testing mechanism, which translates prior higher-order information about another agent’s goals into perceptual predictions of the actions with which these...
For efficient human–robot interaction, human operators need to be able to efficiently represent the robot’s movements in space and predict its next steps. However, according to frameworks of Bayesian multisensory integration, features outside the motion itself—like the sounds a robot makes while it moves—should affect how otherwise identical motion...
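(Illustrative note, not drawn from the abstract itself: under a standard Gaussian cue-combination account, the multisensory estimate of the robot's motion would be a precision-weighted average of the visual and auditory cues; the symbols below are placeholders for this sketch.)

\[ \hat{x} = \frac{\pi_v \mu_v + \pi_a \mu_a}{\pi_v + \pi_a}, \qquad \pi_i = \frac{1}{\sigma_i^2} \]

On this reading, a reliable consequential sound (small \( \sigma_a \)) should pull the perceived motion toward the auditory estimate even when the visual motion itself is identical.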
Overt and imagined action seem inextricably linked. Both have similar timing, activate shared brain circuits, and motor imagery influences overt action and vice versa. Motor imagery is, therefore, often assumed to recruit the same motor processes that govern action execution, and which allow one to play through or simulate actions offline. Here, we...
Visual perspective taking (VPT) is a core process of social cognition, providing humans with insights into what the environment looks like from another’s point of view [1, 2, 3, 4]. While VPT is often described as a quasi-perceptual phenomenon [5, 6], evidence for this proposal has been lacking. Here, we provide direct evidence that another’s persp...
Recent proposals argue that our understanding of our dynamic environment emerges from a predictive process that integrates prior expectations with the sensory evidence, implementing a process of Bayesian-like hypothesis testing and revision. In two preregistered EEG/ERP experiments, we investigated the neuronal underpinnings of this integration. Pa...
We present a conceptual framework for training Vision-Language Models (VLMs) to perform Visual Perspective Taking (VPT), a core capability for embodied cognition essential for Human-Robot Interaction (HRI). As a first step toward this goal, we introduce a synthetic dataset, generated in NVIDIA Omniverse, that enables supervised learning for spatial...
Facial expressions vary in ambiguity, making emotion perception challenging. ‘Precision-weighting’ theory posits that interpretation of ambiguous facial expressions would rely more strongly on prior expectations to help resolve uncertainty, yet evidence for this is lacking. To investigate this we presented emotional sentences that set expectations...
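(A minimal sketch of the precision-weighting idea the abstract appeals to, with illustrative symbols not taken from the paper: the estimate of the expressed emotion weights the prior expectation and the sensory evidence by their respective precisions.)

\[ \hat{e} = \frac{\pi_{\text{prior}}\,\mu_{\text{prior}} + \pi_{\text{face}}\,\mu_{\text{face}}}{\pi_{\text{prior}} + \pi_{\text{face}}} \]

As the facial expression becomes more ambiguous, \( \pi_{\text{face}} \) falls, so the estimate shifts toward the prior set by the preceding emotional sentence.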
Visual Perspective Taking (VPT), the ability to spontaneously represent how another sees the world, underpins human social interaction, from joint action to predicting others' future actions and mentalizing about their goals and mental states. Due to highly customisable, repeatable behaviours, robots provide an ideal platform to investigate cognit...
Visual Perspective Taking (VPT) underpins human social interaction, from joint action to predicting others' future actions and mentalizing about their goals and affective/mental states. Substantial progress has been made in developing artificial VPT capabilities in robots. However, as conventional VPT tasks rely on the (non-situated, disembodied) p...
The experience of pain, like other interoceptive processes, has recently been conceptualized in light of predictive coding models and the free energy minimization framework. In these views, the brain integrates sensory, proprioceptive, and interoceptive signals to generate probabilistic inferences about upcoming events, which heavily shape both the...
Recent research suggests that expectations play an important role in emotion perception within a predictive processing framework. Prior expectations are proposed to help us resolve ambiguity and uncertainty in particular. We investigated for the first time whether facial expression intensity influences the degree of reliance on prior expectations t...
Recent approaches conceptualize mental imagery as a simulatory mode of perceptual experience, which relies on the voluntary engagement of the same top-down prediction processes that shape our perception of the external world. If so, then imagery should induce similar predictive biases as those that are known to govern the perceptual representation...
For efficient human-robot interaction, human operators need to be able to efficiently represent the robot’s movements in space and predict its next steps. However, according to frameworks of Bayesian multisensory integration, features outside the motion itself – like the consequential sounds a robot makes while it moves – should affect how otherwis...
Humans take a teleological stance when observing others' actions, interpreting them as intentional and goal directed. In predictive processing accounts of social perception, this teleological stance would be mediated by a perceptual prediction of an ideal energy-efficient reference trajectory with which a rational actor would achieve their goals wi...
The perception of the internal milieu is thought to reflect beliefs and prior knowledge about the expected state of the body, rather than only actual interoceptive states. This study investigated whether heartbeat perception could be illusorily distorted towards prior subjective beliefs, such that threat expectations suffice to induce a false perce...
Visual perspective taking may rely on the ability to mentally rotate one’s own body into that of another. Here we test whether participants’ ability to make active body movements plays a causal role in visual perspective taking. We utilized our recent task that measures whether participants spontaneously represent another’s visual perspective in a...
Overt and imagined action seem inextricably linked. Both follow similar timings, activate shared brain circuits, and motor imagery influences overt action and vice versa. Motor imagery is therefore often assumed to rely on the motor processes governing action execution itself, which allow one to play through or simulate actions offline. Here...
Social difficulties in autism spectrum disorder (ASD) may originate from a reduced top-down modulation of sensory information that prevents the spontaneous attribution of intentions to observed behaviour. However, although people with autism are able to explicitly reason about others’ mental states, the effect of abstract intention information on p...
Using an established paradigm, we tested whether people derive motoric predictions about an actor's forthcoming actions from prior knowledge about them and the context in which they are seen. In two experiments, participants identified famous tennis and soccer players using either hand or foot responses. Athletes were shown either carrying out or n...
Predictive processing accounts of social perception argue that action observation is a predictive process, in which inferences about others' goals are tested against the perceptual input, inducing a subtle perceptual confirmation bias that distorts observed action kinematics toward the inferred goals. Here we test whether such biases are induced ev...
Social difficulties in Autism Spectrum Disorder (ASD) may originate from an impaired top-down modulation of sensory information that prevents the spontaneous attribution of intentions to observed behaviour. However, although autistic people remain able to explicitly reason about others’ mental states, the effect of abstract intention information on...
We adapted an established paradigm (Bach & Tipper, 2007; Tipper & Bach, 2010) to test whether people derive motoric predictions about an actor’s forthcoming actions from both prior knowledge about them, and the context in which they are seen. In two experiments, participants identified famous tennis and soccer players with either hand or foot respo...
Humans interpret others’ behaviour as intentional and expect them to take the most energy-efficient path to achieve their goals. Recent studies show that these expectations of efficient action take the form of a prediction of an ideal “reference” trajectory, against which observed actions are evaluated, distorting their perceptual representation to...
Recent predictive processing models argue that action understanding is a predictive process, in which goal inferences are constantly tested by comparing predictions of forthcoming behaviour against the actual perceptual input. In a recent series of studies, we showed that these predictions can be visible as a subtle shift in perceptual action judgm...
Primates interpret conspecific behaviour as goal-directed and expect others to achieve goals by the most efficient means possible. While this teleological stance is prominent in evolutionary and developmental theories of social cognition, little is known about the underlying mechanisms. In predictive models of social cognition, a perceptual predict...
It feels intuitive that our actions are intentional, but there is considerable debate about whether (and how) humans control their motor behavior. Recent ideomotor theories of action argue that action intentions are fundamentally perceptual, that actions are not only controlled by anticipating—imagining—their intended perceptual consequences, but a...
Humans interpret others’ behaviour as intentional and goal-directed, expecting others to take the most energy-efficient path to achieve their goals. Recent studies have shown that these expectations of efficient action provide a perceptual prediction of an ideal efficient trajectory, against which the observed action is evaluated, resulting in a di...
Observing someone else perform an action can lead to false memories of self-performance – the observation inflation effect. One explanation is that action simulation via mirror neuron activation during action observation is responsible for observation inflation by enriching memories of observed actions with motor representations. In three experimen...
Action observation is central to human social interaction. It allows people to derive what mental states drive others' behaviour and coordinate (and compete) effectively with them. Although previous accounts have conceptualised this ability in terms of bottom-up (motoric or conceptual) matching processes, more recent evidence suggests that such mec...
The perception of an action is shifted farther along the observed trajectory if the observer has prior knowledge of the actor’s intention. This intention-action prediction effect is explained by predictive perception models, wherein sensory input is interpreted in light of expectancies. This study altered the precision of the prediction by varying...
Prior research conceptualised action understanding primarily as a kinematic matching of observed actions to own motor representations but has ignored the role of object information. The current study utilized fMRI to (a) identify regions uniquely involved in encoding the goal of others’ actions, and (b) test whether these goal understanding proc...
Stimuli list.
A list of all the video stimuli used in the experiment, along with the condition (and repetition) each clip was used for in relation to the participant’s task.
(DOCX)
Parametric Region of Interest analysis.
Combined Region of Interest analysis: Beta and p values for all main contrasts (across regions) correlated with ratings of the Apparentness of goal and Sensorimotor experience. Apparentness of action goal: Beta and p values for all Regions of Interest within each main contrast correlated with ratings of the a...
Predictions allow humans to manage uncertainties within social interactions. Here, we investigate how explicit and implicit person models (how different people behave in different situations) shape these predictions. In a novel action identification task, participants judged whether actors interacted with or withdrew from objects. In two experiments,...
A Word document explaining the column headings in the raw data.
(DOCX)
Action observation is often conceptualized in a bottom-up manner, where sensory information activates conceptual (or motor) representations. In contrast, here we show that expectations about an actor's goal have a top-down predictive effect on action perception, biasing it toward these goals. In 3 experiments, participants observed hands reach for...
We investigated whether top-down expectations about an actor's intentions affect action perception in a representational momentum (RM) paradigm. Participants heard an actor declare an intention to either take or leave an object and then saw him either reach for or withdraw from it, such that action and intention were either congruent or incongruent...
Seeing a face gaze at an object elicits rapid attention shifts toward the same object. We tested whether gaze cueing is predictive: do people shift their attention toward objects others are merely expected to look at? Participants categorized objects while a face either looked at this object, at another object, or straight ahead. Unbeknownst to par...
Action understanding lies at the heart of social interaction. Prior research has often conceptualized this capacity in terms of a motoric matching of observed actions to an action in one’s motor repertoire, but has ignored the role of object information. In this manuscript, we set out an alternative conception of intention understanding, which plac...
Previous studies have shown that viewing others in pain activates cortical somatosensory processing areas and facilitates the detection of tactile targets. It has been suggested that such shared representations have evolved to enable us to better understand the actions and intentions of others. If this is the case, the effects of observing others i...
It is still controversial whether mental practice-the internal rehearsal of movements to improve later performance-relies on processes engaged during physical motor performance and, if so, which processes these are. We report data from 5 experiments, in which participants mentally practiced complex rhythms with either feet or hands while using the...
Sensorimotor regions of the brain have been implicated in simulation processes such as action understanding and empathy, but their functional role in these processes remains unspecified. We used functional magnetic resonance imaging (fMRI) to demonstrate that postcentral sensorimotor cortex integrates action and object information to derive the sen...
Embodied views of language hold that linguistic meaning is derived from the interaction experience of the listener/speaker and the sensory, motor, and internal states that go along with it. This article reviews three kinds of evidence that support such views. (1) Linguistic descriptions draw upon processes and brain regions that support the event's...
Merely viewing the faces of famous athletes affects the observers' motor system, suggesting that action-based information is a core feature of person representations, even when no specific action is visible (Bach & Tipper, 2006). Unexpectedly, these person-based motor priming effects were inhibitory. Foot responses were slower when identifying foot...
An important question for the study of social interactions is how the motor actions of others are represented. Research has demonstrated that simply watching someone perform an action activates a similar motor representation in oneself. Key issues include (1) the automaticity of such processes, and (2) the role object affordances play in establishi...
Observing other people’s actions activates a network of brain regions that is also activated during the execution of these actions. Here, we used functional magnetic resonance imaging to test whether these “mirror” regions in frontal and parietal cortices primarily encode the spatiomotor aspects or the functional goal-related aspects of observed to...
Across cultures, speakers produce iconic gestures, which add – through the movement of the speakers’ hands – a pictorial dimension to the speakers’ message. These gestures capture not only the motor content but also the visuospatial content of the message. Here, we provide first evidence for a direct link between the representation of perceptual in...
Observing the actions of another person activates similar action representations in the observer. A consequence of this perception-action matching process is that producing actions one simultaneously observes will be easier than producing different actions. For example, when observing another person kick a ball, a foot response to identify a stimul...
Observing other people's actions activates a network of brain regions that is also activated during the execution of these actions. Here, we used functional magnetic resonance imaging to test whether these "mirror" regions in frontal and parietal cortices primarily encode the spatiomotor aspects or the functional goal-related aspects of observed to...
The understanding of actions of tool use depends on the motor act that is performed and on the function of the objects involved in the action. We used event-related potentials (ERPs) to investigate the processes that derive both kinds of information in a task in which inserting actions had to be judged. The actions were presented as two consecutive...
The attribution of personal traits to other persons depends on the actions the observer performs at the same time (Bach & Tipper, 2007). Here, we show that the effect reflects a misattribution of appraisals of the observers' own actions to the actions of others. We exploited spatial compatibility effects to manipulate how fluently-how fast and how...
Humans use the same representations to code self-produced and observed actions. Neurophysiological evidence for this view comes from the discovery of the so-called mirror neurons in premotor cortex of the macaque monkey. These neurons respond when the monkey performs a particular action but also when it observes the same behavior in another individ...
When an observed action (e.g., kicking) is compatible to a to be produced action (e.g., a foot-key response as compared to a finger-key response), then the self-produced action is more fluent, that is, it is more accurate and faster. A series of experiments explore the notion that vision-action compatibility effects can influence personal-trait jud...
Observing an action activates the same representations as does the actual performance of the action. Here we show for the first time that the action system can also be activated in the complete absence of action perception. When the participants had to identify the faces of famous athletes, the responses were influenced by their similarity to the m...
A perceived action can be understood only when information about the action carried out and the objects used are taken into account. It was investigated how spatial and functional information contributes to establishing these relations. Participants observed static frames showing a hand wielding an instrument and a potential target object of the ac...
Meaningful and meaningless hand postures were presented to subjects who had to carry out a semantic discrimination task while electrical brain responses were recorded. Both meaningful and control sets of hand postures were matched as closely as possible. The ERPs elicited by meaningless hand postures showed an anteriorly distributed N300 and a cent...
Hand signs with symbolic meaning can often be utilized more successfully than words to communicate an intention; however, the underlying brain mechanisms are undefined. The present study using magnetoencephalography (MEG) demonstrates that the primary visual, mirror neuron, social recognition and object recognition systems are involved in hand sign...
The basic idea of the present study is that it is useful to conceptualize the processes involved in action comprehension in a similar manner as the processes involved in sentence comprehension. One important question then is whether order and meaning of action sequences are processed sequentially or in parallel (analogous to syntactic and semantic...