Article

Abstract

Several labels, such as neuroticism, negative emotionality, and dispositional negativity, indicate a broad dimension of vulnerability to psychopathology. However, largely separate, often disorder-specific research lines have developed that focus on different cognitive and affective characteristics associated with this dimension, such as perseverative cognition (worry, rumination), reduced autobiographical memory specificity, compromised fear learning, and enhanced somatic-symptom reporting. In this article, we present a theoretical perspective within a predictive-processing framework in which we trace these phenotypically different characteristics back to a common underlying "better-safe-than-sorry" processing strategy. This strategy implies information processing that tends to be low in sensory-perceptual detail, which allows threat-related categorical priors to dominate conscious experience and leads to chronic uncertainty/surprise because the error-reduction process stagnates. This common information-processing strategy has beneficial effects in the short term but important costs in the long term. From this perspective, we suggest that the phenomenally distinct cognitive and affective psychopathological characteristics mentioned above represent the same basic processing heuristic of the brain and differ only in the particular type of information involved (e.g., in working memory, in autobiographical memory, in the external and internal world). Clinical implications of this view are discussed.
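The core mechanism of this account, categorical priors dominating experience when sensory-perceptual detail is low, can be illustrated with standard precision-weighted Bayesian integration from the predictive-processing literature. The sketch below is not taken from the article; the scale and all numbers are hypothetical and serve only to show how the balance between prior precision and sensory precision determines whether the posterior stays close to a threat-related prior or tracks the evidence.

```python
# Illustrative sketch (not from the paper): precision-weighted integration of a
# Gaussian prior and Gaussian evidence, as in standard predictive-processing accounts.
# All quantities are hypothetical; they only show how a high-precision threat prior
# dominates when sensory-perceptual detail (evidence precision) is low.

def posterior(prior_mean, prior_precision, obs_mean, obs_precision):
    """Combine a Gaussian prior and Gaussian evidence; returns (mean, precision)."""
    post_precision = prior_precision + obs_precision
    post_mean = (prior_precision * prior_mean + obs_precision * obs_mean) / post_precision
    return post_mean, post_precision

# Hypothetical "threat intensity" scale: the prior expects threat (1.0), evidence says mild (0.2).
threat_prior = (1.0, 4.0)   # categorical threat-related prior held with high precision
weak_evidence = (0.2, 0.5)  # low sensory-perceptual detail -> low evidence precision
rich_evidence = (0.2, 8.0)  # detailed sensory input -> high evidence precision

print(posterior(*threat_prior, *weak_evidence))  # posterior stays near the prior (~0.91)
print(posterior(*threat_prior, *rich_evidence))  # posterior tracks the evidence (~0.47)
```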




... Concerning somatic symptom perception, liberal decisions were translated in terms of a "better safe than sorry" coping strategy, whereas conservative responses were related to a "wait and see" approach (17). Such a "better safe than sorry" strategy has been described as a potential general vulnerability factor for psychopathology, including somatization and medically unexplained symptoms (18). Research on decision strategies concerning bodily symptoms in PIA is sparse up to now. ...
... The predictive coding framework assumes that perception of symptoms is optimized by predicting consequences depending on context (18). In ambiguous situations, more extreme decision strategies can optimize decision making (26), such that a more liberal response strategy comes along with higher sensitivity (17). ...
... These predictions are shaped by experiences: predictions about somatic signals become increasingly precise (i.e., associated with a high level of confidence) after continuous experience with somatic signals (6). In somatic symptom disorder, symptom-related predictions become particularly powerful when sensitivity to sensory-perceptual evidence is low (18). In our study, sensitivity increased less in individuals with somatic symptom disorder when a visual cue was given. ...
Article
Objective: Symptom perception in pathological illness anxiety (PIA) might be biased so that somatic signals are overreported. In the somatic signal detection task (SSDT), performance in detecting weak tactile stimuli gives information on overreporting or underreporting of stimuli. This task has not yet been applied in PIA. Methods: Participants with PIA (n = 44) and healthy controls (n = 40) underwent two versions of the SSDT in randomized order. In the original version, tactile and auxiliary light-emitting diode (LED) stimuli were each presented in half of the trials. In the adapted version, illness or neutral words were presented alongside tactile stimuli. Participants also conducted a heartbeat mental tracking task. Results: We found significantly higher sensitivity and a more liberal response bias in LED versus no-LED trials, but no significant differences between word types. An interaction effect showed a more pronounced increase of sensitivity from no-LED to LED trials in participants with PIA when compared with the adapted SSDT and control group (F(1,76) = 5.34, p = .024, η² = 0.066). Heartbeat perception scores did not differ between groups (BF₀₁ of 3.63). Conclusions: The increase in sensitivity from no-LED to LED trials in participants with PIA suggests stronger multisensory integration. Low sensitivity in the adapted SSDT indicates that attentional resources were exhausted by processing word stimuli. Word effects on response bias might have carried over to the original SSDT when the word version was presented first, compromising group effects regarding bias. Trial registration: The study was preregistered on OSF (https://osf.io/sna5v/).
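For readers unfamiliar with the signal-detection measures reported above, the following sketch shows how sensitivity (d') and response criterion (c) are conventionally computed from hit and false-alarm rates; negative criterion values correspond to the more liberal ("yes"-prone) bias described for LED trials. The rates used here are invented for illustration and are not data from this study.

```python
# Hedged illustration of standard equal-variance signal-detection measures:
# sensitivity (d') and criterion (c) from hit and false-alarm rates.
# The example rates are hypothetical, not data from the cited study.
from scipy.stats import norm

def sdt_measures(hit_rate, fa_rate):
    """Return (d_prime, criterion_c) for given hit and false-alarm rates."""
    z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
    d_prime = z_hit - z_fa             # ability to discriminate signal from noise
    criterion = -0.5 * (z_hit + z_fa)  # negative values = liberal ("yes"-prone) bias
    return d_prime, criterion

print(sdt_measures(0.70, 0.30))  # hypothetical no-LED trials: d' ~ 1.05, c ~ 0.00
print(sdt_measures(0.85, 0.40))  # hypothetical LED trials: higher d', liberal c < 0
```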
... Differences in adaptive behaviors and learning of threat and safety during fear conditioning are strongly associated with inter-individual differences in biological factors and personality traits. One of the most influential traits is the higher-order construct of "dispositional negativity" or "negative emotionality" (Barlow et al., 2014; Caspi et al., 2005; Markon et al., 2005; Van Den Berg et al., 2014; Van den Bergh et al., 2021; Widiger and Oltmanns, 2017), which describes the tendency to experience and express elevated and enduring levels of negative affect (Barlow et al., 2014; Carleton, 2016a, 2016b). Importantly, integrating existing animal and human behavioral and (neuro-)biological data, dispositional negativity was hypothesized to result from increased reactivity to uncertain stressors. ...
... Thus, low dispositional negativity might be associated with high adaptivity to changing states of uncertainty. High uncertainty during initial extinction learning might have provoked a brief short-term switch to a "better-safe-than-sorry" processing strategy (Van den Bergh et al., 2021), resulting in an adaptive increase of responding to the now uncertain safety cue. After confirmatory trials of no CS-/US pairing, the safety cue was processed as a signal for safety again. ...
... Instead, they continued to respond according to learned threat associations, which were probably strongly established during the instructed fear acquisition training. High dispositional negativity has been suggested previously to result from a stagnating error-reduction process, which results from a generalized but non-adaptive better-safe-than-sorry strategy (Van den Bergh et al., 2021). In our study, this might have resulted in a failure to reassess risk as environmental conditions and uncertainty changed. ...
Article
It is hypothesized that the ability to discriminate between threat and safety is impaired in individuals with high dispositional negativity, resulting in maladaptive behavior. A large body of research has investigated differential learning during fear conditioning and extinction protocols depending on individual differences in intolerance of uncertainty (IU) and trait anxiety (TA), two closely related dimensions of dispositional negativity, with heterogeneous results. These might be due to varying degrees of induced threat/safety uncertainty. Here, we compared two groups with high vs. low IU/TA during periods of low (instructed fear acquisition) and high levels of uncertainty (delayed non-instructed extinction training and reinstatement). Dependent variables comprised subjective (US expectancy, valence, arousal), psychophysiological (skin conductance response, SCR, and startle blink), and neural (fMRI BOLD) measures of threat responding. During fear acquisition, we found strong threat/safety discrimination in both groups. During early extinction (high uncertainty), the low IU/TA group showed an increased physiological response to the safety signal, resulting in a lack of CS discrimination. In contrast, the high IU/TA group showed strong initial threat/safety discrimination in physiology, lacking discriminative learning on startle, and reduced neural activation in regions linked to threat/safety processing throughout extinction training, indicating sustained but non-adaptive and rigid responding. Similar neural patterns were found after the reinstatement test. Taken together, we provide evidence that high dispositional negativity, as indicated here by IU and TA, is associated with greater responding to threat cues during the beginning of delayed extinction and thus with altered learning patterns under changing environments.
... This corresponds well to results pointing to a transdiagnostic increase in ERN amplitudes across anxiety disorders and OCD. Moreover, it is in line with results suggesting that variation in psychopathology can be related to a limited number of dimensions, suggesting that negative emotionality represents a core vulnerability factor for psychopathology (Caspi et al., 2014; Kotov et al., 2017; Van den Bergh et al., 2021). Our results suggest that individuals high on an underlying anxious-misery trait are likely to show an increased neural response to errors, which may indicate increased alertness to potential harm and negative outcomes (Proudfit et al., 2013), irrespective of the specific phenotypic appearance of the anxiety-proneness. ...
... The specific clinical outcome and trajectory to symptoms could then be shaped by additional genetic and environmental factors contributing to increased risk or resilience. This interpretation can be complemented by a current conceptualization that describes dispositional negativity as a common vulnerability factor for internalizing psychopathology using a predictive-coding framework (Van den Bergh et al., 2021). Specifically, the authors argue that individuals with higher negative emotionality are characterized by an information-processing style that follows a better-safe-than-sorry rationale and leads to oversimplified input, allowing greater speed in categorizing input as threat at the expense of detail and, in the long run, also resulting in poor updating of prior beliefs (Van den Bergh et al., 2021). ...
... This interpretation can be complemented by a current conceptualization that describes dispositional negativity as a common vulnerability factor for internalizing psychopathology using a predictive-coding framework (Van den Bergh et al., 2021). Specifically, the authors argue that individuals with higher negative emotionality are characterized by an information-processing style that follows a better-safe-than-sorry rationale and leads to oversimplified input, allowing greater speed in categorizing input as threat at the expense of detail and, in the long run, also resulting in poor updating of prior beliefs (Van den Bergh et al., 2021). This framework can account for a variety of different symptoms and neurocognitive alterations, including increased neural error signals as a neuronal equivalent of a low-threshold alarm system (Proudfit et al., 2013), as a common vulnerability factor for internalizing psychopathology (Pasion & Barbosa, 2019). ...
Article
Full-text available
The error-related negativity (ERN), a neural response to errors, has been associated with several forms of psychopathology and assumed to represent a neural risk marker for obsessive-compulsive disorder (OCD) and anxiety disorders. Yet, it is still unknown which specific symptoms or traits best explain ERN variation. This study investigated performance-monitoring in participants (N = 100) recruited across a spectrum of obsessive-compulsive characteristics (n = 26 patients with OCD; n = 74 healthy participants including n = 24 with low, n = 24 with medium, and n = 26 with high OC-characteristics). Several compulsivity- and anxiety-associated characteristics were assessed and submitted to exploratory principal axis factor analysis. Associations of raw measures and derived factors with ERN and correct-related negativity (CRN) were examined. Patients with OCD showed increased ERN amplitudes compared to healthy participants. The ERN was associated with a variety of traits related to anxiety and negative affect. Factor analysis results revealed a most prominent association of the ERN with a composite measure of anxiety and neuroticism, whereas the CRN was specifically associated with compulsivity. Results support differential associations for the ERN and CRN and demonstrate that a dimensional recruitment approach and use of composite measures can improve our understanding of characteristics underlying variation in neural performance monitoring.
... Adaptation, survival, and reproduction are at the heart of evolutionary theory, and interdependent brain networks have evolved to increase adaptation to be able to survive and reproduce. Further, emerging findings suggest that the brain uses interoceptive and exteroceptive information to predict future conditions and needs to enable optimal adaptation to continuously changing internal and external environments (11-20). Based on a better understanding of how the brain works, we propose replacing "triune brain" with a term that better captures current understanding of brain function: the adaptive brain. ...
... To increase the power of prediction and subsequent adaptivity, the brain works to minimize prediction errors; that is, minimizing the difference between predicted outcomes and actual incoming interoceptive and exteroceptive information. The more the brain can minimize prediction error and accurately predict outcomes for different courses of action, the better it will be at anticipating and adequately responding to challenge and threat efficiently and rapidly, thus increasing adaptation and survival (20,29,30). Cutting across previously accepted boundaries of the triune brain, Barrett and Simmons (29) propose a neuroarchitecturally distinct brain interoceptive system consisting of visceromotor cortex in the medial and anterior cingulate cortex, the posterior ventromedial prefrontal cortex, the posterior orbitofrontal cortex, and the anterior insular cortex (31). ...
... Indeed, whether the adaptive brain's allocation of resources favors activity of the DMN, attention networks, or a combination in any given situation reflects its function as a predictor of both the internal and external environment, enabling selection of strategies to maintain homeostasis or to initiate allostasis when needed. Accumulating evidence suggests that the brain uses Bayesian statistical principles to predict environmental states and outcomes based on previous information that the brain has received (20,30). Structures involved in the brain's Bayesian-like prediction are also implicated in integrating, or differentially prioritizing, brain networks and their adaptive strategies. ...
Article
Full-text available
Theory impacts how research is conducted. A popular theory used to conceptualize brain functioning is the triune brain theory. The triune brain theory is an evolutionary theory of brain development that emphasizes three key brain regions consisting of the brainstem, the limbic system, and the cortex that function relatively independently in coping with stress via fight or flight, emotion, and cognition, respectively. However, modern neuroscience research demonstrates that the triune brain theory does not accurately explain how the brain functions in everyday life or during the stress response. Specifically, emotion and cognition are interdependent and work together, the limbic system is not a purely emotional center nor are there purely emotional circuits in the brain, and the cortex is not a purely cognitive center nor are there purely cognitive circuits in the brain. We propose a new evolutionarily based model, the adaptive brain, that is founded on adaptive prediction resulting from interdependent brain networks using interoception and exteroception to balance current needs, and the interconnections among homeostasis, allostasis, emotion, cognition, and strong social bonds in accomplishing adaptive goals.
... To our knowledge, this study is the first to link heightened SNS responses to the threat and safety cues during extinction learning with internalizing, externalizing, and general psychopathology (p-factor) in youth. This pattern of increased reactivity following fear learning, to both learned threat and safety cues, is consistent with the recently proposed "better safe than sorry" information processing strategy (Bergh et al., 2020). ...
... This pattern of increased reactivity following fear learning, to both learned threat and safety cues, is consistent with the recently proposed "better safe than sorry" information processing strategy (Bergh et al., 2020). This strategy in the context of aversive learning is thought to reflect oversimplified perception of stimulus features (e.g., recognizing the bell's shape but not its color) that may lead one to expect that situations containing features of learned threat cues (e.g., the bell's shape) are dangerous, even if these situations also contain features that designate safety (e.g., the bell's color; Bergh et al., 2020). ...
... This strategy in the context of aversive learning is thought to reflect oversimplified perception of stimulus features (e.g., recognizing the bell's shape but not its color) that may lead one to expect that situations containing features of learned threat cues (e.g., the bell's shape) are dangerous, even if these situations also contain features that designate safety (e.g., the bell's color; Bergh et al., 2020). This "better safe than sorry" information processing strategy may result in chronic deviations of expectations from reality that lead to greater levels of psychopathology (Bergh et al., 2020). In situations indicative of safety, failure to differentiate between threat and safety cues may result in contextually inappropriate fear responses that contribute to increased risk for multiple forms of psychopathology in youth, ranging from anxiety and internalizing problems to reactive aggression. ...
Article
Childhood exposure to violence is strongly associated with psychopathology. High resting respiratory sinus arrhythmia (RSA) is associated with lower levels of psychopathology in children exposed to violence. High RSA may help to protect against psychopathology by facilitating fear extinction learning, allowing more flexible autonomic responses to learned threat and safety cues. In this study, 165 youth (79 female, aged 9–17; 86 exposed to violence) completed assessments of violence exposure, RSA, and psychopathology, and a fear extinction learning task; 134 participants returned and completed psychopathology assessments 2 years later. Resting RSA moderated the longitudinal association of violence exposure with post-traumatic stress disorder (PTSD) symptoms and externalizing psychopathology, such that the association was weaker among youths with higher RSA. Higher skin conductance responses (SCR) during extinction learning to the threat cue (CS+) was associated with higher internalizing symptoms at follow-up and greater SCR to the safety cue (CS–) was associated with higher PTSD, internalizing, and externalizing symptoms, as well as the p-factor, controlling for baseline symptoms. Findings suggest that higher RSA may protect against emergence of psychopathology among children exposed to violence. Moreover, difficulty extinguishing learned threat responses and elevated autonomic responses to safety cues may be associated with risk for future psychopathology.
Preprint
Childhood exposure to violence is strongly associated with psychopathology. High resting respiratory sinus arrhythmia (RSA) has been found to protect against psychopathology in children who have experienced adversity. High RSA may protect against psychopathology by facilitating fear extinction learning, allowing individuals to more appropriately regulate autonomic responses to learned threat and safety cues. In the present study, 165 youth (79 female, ages 9-17; 86 exposed to violence) completed assessments of violence exposure, RSA, psychopathology, and a fear extinction learning task; 134 participants returned and completed psychopathology assessments two years later. Resting RSA moderated the association of violence exposure with PTSD symptoms and transdiagnostic psychopathology (p-factor) at follow-up, such that the association was weaker among youth with higher RSA. Higher skin conductance responses (SCR) when viewing the stimulus associated with an aversive noise (CS+) as well as when viewing the stimulus that was unassociated with the aversive noise (CS-) during extinction learning predicted higher internalizing symptoms and p-factor at follow-up. These findings suggest that higher RSA may protect against the onset of psychopathology among children exposed to violence. Moreover, our findings suggest that in addition to difficulty extinguishing learned threat responses, elevated autonomic responses to safety cues may contribute to psychopathology.
... For example, after a stress induction, participants exhibited increased habitual stimulus-response behaviour compared to goal-directed action-outcome behaviour 21. Moreover, theoretical models have posited a common underlying "better-safe-than-sorry" processing strategy in threat-related psychopathology 22. As an example, one study found that participants scoring high on neuroticism show greater avoidance behaviour for ambiguous stimuli that could potentially predict a shock 23. ...
... The first option is that, regardless of the target frequency manipulation (Rare, Frequent), the overall criterion would be lower, meaning participants would have an increased tendency to indicate the target is present for the threat-of-shock compared to the safe condition (i.e., a liberal response bias). This would be in line with the better-safe-than-sorry heuristic 22. The second option is that under threat-of-shock (compared to safe), participants would rely more on the prior expectation and their decision criterion would follow the target frequency manipulation. ...
Article
Full-text available
Threatening situations ask for rapid and accurate perceptual decisions to optimize coping. Theoretical models have stated that psychophysiological states, such as bradycardia during threat-anticipatory freezing, may facilitate perception. However, it’s unclear if this occurs via enhanced bottom-up sensory processing or by relying more on prior expectations. To test this, 52 (26 female) participants completed a visual target-detection paradigm under threat-of-shock (15% reinforcement rate) with a manipulation of prior expectations. Participants judged the presence of a backward-masked grating (target presence rate 50%) after systematically manipulating their decision criterion with a rare (20%) or frequent (80%) target presence rate procedure. Threat-of-shock induced stronger heart rate deceleration compared to safe, indicative of threat-anticipatory freezing. Importantly, threat-of-shock enhanced perceptual sensitivity but we did not find evidence of an altered influence of the effect of prior expectations on current decisions. Correct target detection (hits) was furthermore accompanied by an increase in the magnitude of this heart rate deceleration compared to a missed target. While this was independent of threat-of-shock manipulation, only under threat-of-shock this increase was accompanied by more hits and increased sensitivity. Together, these findings suggest that under acute threat participants may rely more on bottom-up sensory processing versus prior expectations in perceptual decision-making. Critically, bradycardia may underlie such enhanced perceptual sensitivity.
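The second option described in the excerpt above, a criterion that follows the target-frequency manipulation, corresponds to the ideal-observer prediction of equal-variance signal detection theory. As a hedged illustration (the d' value is arbitrary and not taken from the study), the optimal criterion for a given target base rate is c_opt = ln[(1 − p_signal)/p_signal] / d':

```python
import math

# Hedged sketch of how an ideal observer's criterion would shift with the target
# base rate manipulated in the study above (rare = 20%, frequent = 80% targets).
# The d' value is assumed; the formula is the standard equal-variance SDT result.

def optimal_criterion(p_signal, d_prime):
    """Criterion that maximizes accuracy given the signal base rate and d'."""
    return math.log((1 - p_signal) / p_signal) / d_prime

d_prime = 1.5  # assumed discriminability
print(optimal_criterion(0.20, d_prime))  # rare targets -> conservative criterion (~ +0.92)
print(optimal_criterion(0.80, d_prime))  # frequent targets -> liberal criterion (~ -0.92)
```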
... This "clinging" to the previous schema is considered by Van den Bergh et al. (2020) in the context of a predictive processing mechanism. The authors develop a perspective according to which many affective and cognitive features (e.g., worry, rumination, impaired anxiety learning, reduced specificity of autobiographical memory, somatic symptoms) of a wide variety of diagnoses can be traced back to a common underlying processing strategy. ...
... In the longer term, this gradually leads to more and more free energy, i.e., PEs, and ultimately to chronic uncertainty or chronic stress (Figure 1, left). In this respect, a more adaptive internal model can hardly be developed (Van den Bergh et al., 2020). ...
Article
Full-text available
We apply the Free Energy Principle (FEP) to cognitive behavioral therapy (CBT). FEP describes the basic functioning of the brain as a predictive organ and states that any self-organizing system that is in equilibrium with its environment must minimize its free energy. Based on an internal model of the world and the self, predictions—so-called priors—are created, which are matched with the information input. The sum of prediction errors corresponds to the Free Energy, which must be minimized. Internal models can be identified with the cognitive-affective schemas of the individual, which have become dysfunctional in patients. The role of CBT in this picture is to help the patient update her/his priors. They have evolved in learning history and no longer provide adaptive predictions. We discuss the process of updating in terms of the exploration-exploitation dilemma. This consists of the extent to which one relies on what one already has, i.e., whether one continues to maintain and "exploit" one's previous priors ("better safe than sorry") or whether one explores new data that lead to an update of priors. Questioning previous priors triggers stress, which is associated with increases in Free Energy in the short term. The role of the therapeutic relationship is to buffer this increase in Free Energy, thereby increasing the level of perceived safety. The therapeutic relationship is represented in a dual model of affective alliance and goal attainment alliance and is aligned with FEP. Both forms of alliance support exploration and updating of priors. All aspects are illustrated with the help of a clinical case example.
... Given that the tendency to adopt a "better-safe-than-sorry" processing strategy has been proposed as a risk factor for mental and somatic health (Van den Bergh et al., 2021), the present results are also of clinical relevance. Several strategies (e.g., physical exercise, slow breathing, psychotherapeutic interventions) have been identified that are capable of increasing resting vmHRV (Vanderhasselt & Ottaviani, 2022), and Bornemann and colleagues (2016) have already reported that the ability to voluntarily upregulate vmHRV by the use of biofeedback predicts individual differences in altruistic prosocial behavior. ...
Article
Full-text available
Phylogenetic theories suggest resting vagally mediated heart rate variability (vmHRV) as a biomarker for adaptive behavior in social encounters. Until now, no study has examined whether vmHRV can predict individual differences in inferring personality traits and intentions from facial appearance. To test this hypothesis, resting vmHRV was recorded in 83 healthy individuals before they rated a series of faces based on their first impression of trustworthiness, dominance, typicality, familiarity, caring, and attractiveness. We found an association between individual differences in vmHRV and social attributions from facial appearance. Specifically, higher levels of vmHRV predicted higher scores on ratings of caring and trustworthiness, suggesting that strangers' faces are more likely to be perceived as safer. The present results suggest that higher levels of vmHRV (compared with lower levels of vmHRV) are associated with the tendency to minimize social evaluative threat and maximize affiliative social cues at a first glance of others' faces.
... It is assumed that the diversity of symptoms in patients with common psychiatric diagnoses reflects a shared, universal vulnerability factor for mental disorders¹, which correlates directly with symptom severity and increases the risk of comorbidity [106]. This factor, located at the top of the hierarchical model, is anxiety (neuroticism, negative emotionality), that is, the tendency to constantly anticipate and exaggerate negative consequences in the future [51]. ...
Article
Individuals with high-functioning autism have difficulties in decision-making in the face of incomplete or ambiguous information, particularly in the context of social interaction. Tasks demanding an immediate response or deviation from the usual behavior make them feel excessive anxiety, which restricts their social and professional activity. Attempts to camouflage their conservatism to others are one of the risk factors for comorbid depression. Therefore, they avoid new and non-routine situations, thus restricting their own social activity and professional development. On the other hand, insisting on sameness and clarity may give individuals with autism an advantage in long-lasting monotonous tasks. The aim of this review is to consider these symptoms from the perspective of predictive coding. A range of experimental studies has shown that most of the subjects with autism have difficulty in predicting outcomes based on the cumulative history of interacting with the environment, as well as updating expectations as new evidence becomes available. These peculiarities of the analysis and pragmatic weighting of information may underlie the trait intolerance of uncertainty and novelty avoidance of most people with autism.
... From our standpoint, the implications of IU as an absence of safety are several and have guided our collective thinking over the last seven years (and especially during the pandemic) about the nature of uncertainty, the effects of uncertainty, individual differences in one's relationship with uncertainty, and about how safety may be re-established alongside developing tolerance of uncertainty. Extrapolating from our reading of Brosschot and colleagues (Brosschot et al., 2016;Van den Bergh, Brosschot, Critchley, Thayer, & Ottaviani, 2021), further influenced by the work of others, and now informed by the somatic error theory of intolerance of uncertainty proposed here, we consider some clinical implications. Other theoretical frameworks may lead to similar conclusions, but the following are proposed from the current uncertainty perspective. ...
Article
Intolerance of uncertainty (IU) has gained widespread interest as a construct of broad interest from both transdiagnostic and trans-situational perspectives. We have approached this article inspired by the curiosity, clinical observation, consideration of different theoretical perspectives, speculation, optimism and indeed fun that can be seen in S. J. Rachman's work. We address some of what we know about IU before considering one way of conceptualizing IU from the standpoint of a felt sense or embodied experience. In the first part, we start with Woody and Rachman's (1994) observations of people with GAD. Second, we consider some key findings from the literature. Third, we consider two important perspectives on uncertainty, namely, Brosschot et al.’s (2016, 2018) influential Generalized Unsafety Theory of Stress and uncertainty as an emotion. In the second part, backing our clinical hunch about the importance of the felt sense of uncertainty, we consider IU from the perspective of interoception and the somatic error theory of anxiety (Khalsa & Feinstein, 2018). We propose the somatic error theory of intolerance of uncertainty, which places the experience of uncertainty at the heart of our understanding of intolerance of uncertainty. This is followed by predictions, unresolved questions, and potential clinical implications. Finally, we revisit Woody and Rachman's (1994) suggestions for treatment as internalizing “a sense of safety in a range of circumstances (p. 750)” and update this from the perspective of the felt sense of uncertainty. We finish by suggesting that uncertainty can be tolerated, perhaps accepted, and even embraced.
... Because the linearity between interoceptive sensations and their cognitive representations is questioned, for example, in patients with asthma, where the perceived severity of asthma symptoms has been shown to vary from patient to patient and to be strongly influenced by contextual information (Janssens et al., 2009), altered interoceptive accuracy could lead to distorted expectations (so-called prediction errors) regarding internal signals and thus play a key role in the development of false representations of the body, subsequently contributing to the development, severity, or maintenance of mental or, as some authors suggest, physical illness (Barrett & Simmons, 2015; Edwards et al., 2012; Van den Bergh et al., 2017; Van den Bergh et al., 2021). However, evidence on the relationship between interoceptive accuracy and psychopathology is rather inconsistent, as most studies find reduced interoceptive accuracy (typically measured by performance ratings, especially heartbeat-counting tasks) in most clinical disorders (e.g., depression, eating disorders, autism spectrum disorders, and functional somatic syndromes), whereas in anxiety and obsessive-compulsive disorders, the relationship often points in the opposite direction (see Brewer et al., 2021; Wolters et al., 2022, for an overview). ...
Article
Full-text available
Background: Only recently has interoception been discussed as a common risk factor for psychopathology. Recent approaches distinguish between the ability to accurately perceive (interoceptive accuracy) and the propensity to attend (interoceptive attention) to internal signals. Objective: To examine the latent structure of self-reported interoceptive accuracy and attention and their relationships to psychopathology. Methods: We used a confirmatory factor analysis to clarify the latent structure of interoceptive accuracy and attention. Structural equation modeling was utilized to determine relationships between both abilities with internalizing and somatoform symptomatology according to the HiTOP model (Kotov et al., 2017). Data from N = 619 persons from the German general population were analyzed. Results: Interoceptive attention showed significant positive relationships with all psychopathological traits (r = .221 to r = .377), whereas interoceptive accuracy was negatively associated with internalizing symptomatology (r = -.106). Conclusion: The present findings indicate that personal beliefs about interoceptive abilities have different influences on psychopathological developments.
... Expecting no harm in a dangerous situation ("missing a positive alarm") can be life-threatening, whereas expecting harm in a safe situation ("false alarm") leads to unnecessary avoidance behavior and hypervigilance. False predictions about an unlikely but, in the end, fatal outcome can be more decision-relevant than false predictions about a more likely positive outcome ("better-safe-than-sorry"; Van den Bergh et al., 2021). Lower costs for engaging in avoidance behavior foster its use, whereas higher costs for alternative behaviors reduce the likelihood of action. ...
Article
Full-text available
Expectations are a central maintaining mechanism in mental disorders and most psychological treatments aim to directly or indirectly modify clinically relevant expectations. Therefore, it is crucial to examine why patients with mental disorders maintain dysfunctional expectations, even in light of disconfirming evidence, and how expectation-violating situations should be created in treatment settings to optimize treatment outcome and reduce the risk of treatment failures. The different psychological subdisciplines offer various approaches for understanding the underlying mechanisms of expectation development, persistence, and change. Here, we convey recommendations on how to improve psychological treatments by considering these different perspectives. Based on our expectation violation model, we argue that the outcome of expectation violation depends on several characteristics: features of the expectation-violating situation; the dynamics between the magnitude of expectation violation and cognitive immunization processes; dealing with uncertainties during and after expectation change; controlled and automatic attention processes; and the costs of expectation changes. Personality factors further add to predict outcomes and may offer a basis for personalized treatment planning. We conclude with a list of recommendations derived from basic psychology that could contribute to improved treatment outcome and to reduced risks of treatment failures.
... Another possibility is that it represents a continuation of the learned response of functional avoidance of specific memories. Williams et al. (2006) proposed that people may avoid retrieving negatively-valenced specific memories as a means of regulating their emotion, and that over time this tendency may become overcompensatory and generalise to the tendency to avoid all specific memories, in the vein of "better safe than sorry" (van den Bergh et al., 2021). Therefore, although depressive symptoms may remit, the tendency to avoid specific memories may persist. ...
Article
Full-text available
Difficulty in accessing specific memories, referred to as reduced memory specificity or overgeneral memory, has been established as a marker of clinical depression. However, it is not clear if this deficit persists following the remission of depressive episodes. The current study involved a systematic review and meta‐analysis of empirical studies with the aim of establishing whether remitted depression was associated with retrieving fewer specific and more overgeneral autobiographical memories. Seventeen studies were identified as eligible. The results indicated that people with remitted depression recalled fewer specific memories (k = 15; g = ‐0.314, 95% CI[‐0.543; ‐0.085], z = ‐2.69, p = .007) and more categoric memories (k = 9; g = 0.254, 95% CI[0.007; 0.501], z = 2.02, p = .043) compared to people who had never been depressed. Given these deficits have elsewhere been shown to be prognostic of future depressive symptoms, these findings suggest that reduced memory specificity/overgeneral memory persists following remission and may be a risk factor for future episodes of depression in those that are in remission. The findings are discussed in terms of how this knowledge might influence clinical understanding of relapse prevention and maintenance of remission in those with a history of depression.
... CM severity and specific types of abuse (emotional, physical, sexual) and neglect (emotional/physical) have been associated with a range of outcomes, such as fewer and less intense positive emotions, and more frequent and intense negative emotions (Infurna et al., 2015;Lavi et al., 2019;Turiano et al., 2017). The differences in frequency and arousal of emotions that are seen in CM survivors are in keeping with a "better safe than sorry" or conservative behavioral strategy (Nesse, 2019;Van den Bergh et al., 2021), in which frequent negative emotions indicate alarm, whereas positive emotions signal exploration and downregulation of sympathetic arousal, as elaborated upon in the method section. In terms of mental health, CM predicts both internalizing and externalizing symptoms in later life (Spinhoven et al., 2016;Waxman et al., 2014). ...
Preprint
Full-text available
Childhood maltreatment (CM) is experienced by ~40% of all children at major personal and societal costs. Studies show adverse consequences of CM on emotional functioning and regulation. This article focuses on the differential imprint of emotional, physical, and sexual abuse and/or neglect experiences during childhood on emotional functioning later in life. To study this, we calculated how intense, variable, unstable, inert, and diverse the daily emotions of 290 Dutch adults (aged 19-73) were, measured three times daily over 30 days (90 measurements per person, for five emotion dynamic indices). Participants described abuse/neglect retrospectively using the Childhood Trauma Questionnaire (CTQ). In our structural equation model (SEM), only physical abuse was unrelated to all five emotion dynamic indices. Abuse and neglect showed specific patterns: e.g., emotional abuse, sexual abuse, and physical neglect were associated mostly with negative emotions, and emotional neglect predominantly with positive emotion dynamics. CM types were associated differentially with low- versus high-arousal emotion dynamics (i.e., sexual abuse was associated with increased and emotional neglect with reduced emotion dynamics). Dissecting CM effects on adult emotion dynamics may inform theories on the ontogenesis and functioning of emotions, theories on abuse and neglect and the prevention of their developmental sequelae, and help to identify and understand well-adjusted and (dys-)functional emotional development.
... Among other important considerations for including physiological parameters, computational modelling is increasingly used to emulate, and predict, cognitive/psychological processes. Physiological measures are favourable for modelling such processes, which can contribute to the refinement of established cognitive models of interpretation bias (Eguchi et al., 2017;Van den Bergh et al., 2021). ...
Article
An important, yet under-explored area of interpretation bias research concerns the examination of potential physiological correlates and sequelae of this bias. Developing a better understanding of the physiological processes that underpin interpretation biases will extend current theoretical frameworks underlying interpretation bias, as well as optimising the efficacy of cognitive bias modification for interpretation (CBM-I) interventions aimed at improving symptoms of emotional disorders. To this end, systematic searches were conducted across the Web of Science, PsycInfo and Pubmed databases to identify physiological markers of interpretation bias. In addition, grey literature database searches were conducted to complement peer-reviewed research and to counter publication bias. From a combined initial total of 898 records, 15 studies were included in qualitative synthesis (1 of which obtained from the grey literature). Eligible studies were assessed using a quality assessment tool adapted from the Quality Checklist for Healthcare Intervention Studies. The searches revealed seven psychophysiological markers of interpretation bias, namely event-related potentials, heart rate and heart rate variability, respiratory sinus arrhythmia, skin conductance response, pupillometry, and electromyography. The respective theoretical and practical implications of the research are discussed, followed by recommendations for future research.
... One recent idea is that the experience of symptoms involves both a sensory-perceptual component and an affective-motivational component. In individuals high on N/NE, the perceptual detail input would be poor, whereas the affective component would be strong, making these individuals vulnerable to symptom reports not related to physiological dysfunction ( Van den Bergh et al., 2021). A similar theory is that N/NE is related to a negative interpretation bias in the form of negative disambiguation, i.e., interpretation of ambiguous signals in a catastrophic way. ...
Article
Full-text available
Are personality traits related to symptom overreporting and/or symptom underreporting? With this question in mind, we evaluated studies from 1979 to 2020 (k = 55), in which personality traits were linked to scores on stand-alone validity tests, including symptom validity tests (SVTs) and measures of socially desirable responding (SDR) and/or supernormality. As to symptom overreporting (k = 14), associations with depression, alexithymia, apathy, dissociation, and fantasy proneness varied widely from weak to strong (rs .27 to .79). For underreporting (k = 41), inconsistent links (rs −.43 to .63) were found with narcissism, whereas alexithymia and dissociation were often associated with lower SDR tendencies, although effect sizes were small. Taken together, the extant literature mainly consists of cross-sectional studies on single traits and contexts, mostly offering weak correlations that do not necessarily reflect causation. What this field lacks is an overarching theory relating traits to symptom reporting. Longitudinal studies involving a broad range of traits, samples, and incentives would be informative. Until such studies have been done, traits are best viewed as modest concomitants of symptom distortion.
... 5), making it less likely that theory-derived hypotheses tested in current biopsychological studies of NART reflect our current state of knowledge of the human brain. First attempts to include recent neuroscientific concepts in understanding personality traits more generally [47] and NART more specifically [47,48] have been made, but specific, testable neurobiological mechanisms have not been formulated, an issue that is observed across neurobiological personality research [49]. Moreover, a theory should provide a hypothesis about how distal different biopsychological variables are from personality traits, whereas most existing theories only allow binary (i. ...
Article
Fifty years of neurobiological research on negative affect related traits (NART) has, if anything, only found relatively few and small replicable effects. Here, we outline ten issues potentially contributing to a conceptual chasm between NART and neurobiological measures relating to (A) measurements of individual differences in neurobiological variables, (B) conceptualizations of trait variables, and (C) potential relationships between trait and neurobiological variables. We believe that these issues may transfer to neurobiological research on other traits and that addressing them in future work may contribute to replicable results and valid interpretations. The issues raised also suggest the need for a comprehensive neurobiological theory of NART and for well-designed theory-based studies that may narrow the chasm between traits and neurobiological markers.
... In current fear-avoidance models, pain-related fear and avoidance behavior are put forward as key determinants in the transition from acute to chronic pain 7,15,29,30. Generalization of fear and avoidance behavior becomes a maladaptive learning process when extending to safe situations or safe movements following a better safe than sorry strategy 27,28. Avoidance behavior can generalize to other movements if it is expected that similar movements will have the same protective effect in that context 11. ...
Article
Full-text available
When pain persists beyond healing time and becomes a "false alarm" of bodily threat, protective strategies, such as avoidance, are no longer adaptive. More specifically, generalization of avoidance based on conceptual knowledge may contribute to chronic pain disability. Using an operant robotic-arm avoidance paradigm, healthy participants (N=50) could perform more effortful movements in the threat context (e.g., pictures of outdoor scenes) to avoid painful stimuli, whereas no pain occurred in the safe context (e.g., pictures of indoor scenes). Next, we investigated avoidance generalization to conceptually related contexts (i.e., novel outdoor/indoor scenes). As expected, participants avoided more when presented with novel contexts conceptually related to the threat context than in novel exemplars of the safe context. Yet, exemplars belonging to one category (outdoor/indoor scenes) were not interchangeable; there was a generalization decrement. Post hoc analyses revealed that contingency-aware participants (n=27), but not non-aware participants (n=23), showed the avoidance generalization effect and also generalized their differential pain-expectancy and pain-related fear more to novel background scenes conceptually related to the original threat context. In contrast, the fear-potentiated startle response was not modulated by context. Perspective: This article provides evidence for contextual modulation of avoidance behavior and its generalization to novel exemplars of the learned categories based on conceptual relatedness. Our findings suggest that category-based generalization is a plausible mechanism explaining why patients display avoidance behavior in novel situations that were never directly associated with pain.
... Crucially, different factors are bound to interact with one another and bring about a complex constellation of vulnerability profiles. People may be differently equipped to deal with environmental stressors because of individual differences, such as personality traits (for example, neuroticism) or affective makeup (for example, a tendency toward anxiety or rumination) (Van den Bergh et al. 2020; Nolen-Hoeksema, Wisco, and Lyubomirsky 2008). Equally, environmental circumstances may increase one's exposure to stressors and therefore affect individual-level factors, to the point that some events-for example, unemployment or childbirth-have come to be known as "turning points" because of their impact on mental health (Turner and Lloyd 1999). ...
Article
Full-text available
The notions of at-risk and subthreshold conditions are increasingly discussed in psychiatry to describe mild, brief, or otherwise atypical syndromes that fail to meet the criteria for clinical relevance. However, the concept of vulnerability is still underexplored in philosophy of psychiatry. This article discusses psychiatric vulnerability to clarify some conceptual issues about the various factors contributing to vulnerability, the notions of risk and protection, and the idea that there are multiple ways of crossing the threshold to clinical relevance. My goal is to lay the groundwork for a finer-grained discussion on psychiatric vulnerability that reflects the complex nature of mental conditions and illustrates the kind of thinking needed in clinical practice.
... This association with risk estimation complements previous knowledge, specifically studies linking reduced ERN amplitudes or reduced activity of the anterior cingulate cortex, i.e., the main generator of the ERN, to heightened risk taking (57,58) and increased anterior cingulate cortex activity to risk aversion during decision making (58). Furthermore, an association between ERN and risk perception corresponds with findings that conceptualize an elevated ERN as a low-threshold alarm signal in line with a better safe than sorry logic (20,59,60) and suggest relationships to harm avoidance (14) and threat sensitivity (30). Moreover, alterations in the ERN, and less consistently the CRN, have been observed across disorders such as OCD and generalized anxiety disorder [e.g., (32)], known to be characterized by cognitive biases such as overestimation of threat and risk aversion (61-63). ...
Article
Full-text available
Background The COVID-19 pandemic is a major life stressor posing serious threats not only to physical but also to mental health. To better understand mechanisms of vulnerability and identify individuals at risk for psychopathological symptoms in response to stressors is critical for prevention and intervention. The error-related negativity (ERN) has been discussed as a neural risk marker for psychopathology and the present study examined its predictive validity for perceived risk, stress, and psychopathological symptoms during the COVID-19 pandemic. Methods One hundred thirteen individuals who had participated as healthy control participants in previous EEG studies (2014-2019) completed a follow-up online survey during the first COVID-19 wave in Germany. Associations of pre-pandemic ERN and correct-response negativity (CRN) with perceived risk regarding COVID-19 infection, stress, and internalizing symptoms during the pandemic were examined using mediation models. Results Pre-pandemic ERN and CRN were associated with increased perceived risk regarding a COVID-19 infection. Via this perceived risk, ERN and CRN were associated with increased stress during the pandemic. Further, risk perception and stress mediated indirect effects of ERN and CRN on internalizing psychopathology, including anxiety, depression, and obsessive-compulsive symptoms, while controlling for the effects of pre-pandemic symptom-levels. Conclusions In summary, heightened pre-pandemic performance monitoring showed indirect associations with increases in psychopathological symptoms during the first COVID-19 wave, via effects on perceived COVID-19 risk and stress. These results further strengthen the notion of performance monitoring ERPs as transdiagnostic neural risk markers and highlight the relevance of stress as a catalyst for symptom development.
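To make the mediation logic summarized above more concrete, the sketch below illustrates a simple product-of-coefficients mediation in Python, reduced to a single mediator (perceived risk). The variable names, simulated data, and effect sizes are hypothetical illustrations, not the study's actual analysis pipeline or results, which used serial mediators and controlled for pre-pandemic symptom levels.

```python
# Minimal product-of-coefficients mediation sketch (hypothetical variables):
# X = pre-pandemic ERN amplitude, M = perceived COVID-19 risk, Y = pandemic stress.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 113
ern = rng.normal(size=n)                    # simulated predictor
risk = 0.4 * ern + rng.normal(size=n)       # simulated mediator
stress = 0.5 * risk + rng.normal(size=n)    # simulated outcome
df = pd.DataFrame({"ern": ern, "risk": risk, "stress": stress})

a = smf.ols("risk ~ ern", df).fit().params["ern"]            # path a: X -> M
b = smf.ols("stress ~ risk + ern", df).fit().params["risk"]  # path b: M -> Y, controlling for X
print("indirect effect a*b:", round(a * b, 3))
```

In this simplified reading, the indirect effect a*b captures how much of the ERN-stress association runs through perceived risk.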
... 5,11-13,93,103 In addition, accumulating evidence suggests that people with chronic pain overgeneralize expectancy of pain outcomes and pain-related fear to technically safe stimuli 45,46,66,67 and that it takes longer for them to update these expectancies and fear when presented with disconfirming information compared with healthy, pain-free controls 68,104. Obviously, protective responses (pain-related fear and avoidance) are adaptive when they prevent further bodily harm, but when they spread to a myriad of safe activities or situations, they may paradoxically compromise daily functioning, leading to disability. ...
... These suggestions, however, are still speculative, and need to be rigorously evaluated in the field of pain before their clinical application. Despite the parallels that can be drawn between pain and anxiety disorders as regards the mechanisms that give rise to and maintain disability (e.g., compromised learning 42,78,112), each condition is also characterized by unique elements. Although in anxiety disorders the fear-inducing beliefs may often be exaggerated (e.g., fear of crowded places in agoraphobia), pain is essentially a biologically hardwired signal of threat, conveying urgency and yielding protective actions even when it does not necessarily correspond to bodily harm, and/or when it becomes chronic 16. ...
Article
Full-text available
Exposure in vivo is a theory-driven and widely used treatment to tackle functional disability in people with chronic primary pain. Exposure is quite effective; yet, in line with exposure outcomes for anxiety disorders, a number of patients may not profit from it, or relapse. In this focus article, we critically reflect on the current exposure protocols in chronic primary pain, and provide recommendations on how to optimize them. We propose several adaptations that are expected to strengthen inhibitory learning and/or retrieval of the extinction memory, thus likely decreasing relapse. We summarise the limited, but emerging experimental data in the pain domain, and draw parallels with experimental evidence in the anxiety literature. Our reflections and suggestions pertain to the use of the fear hierarchy, reassurance, positive psychology interventions, exposure with a range of stimuli and within different contexts, and the use of safety behaviours during treatment, as well as associating the fear-inducing stimuli with novel outcomes. In addition, we reflect on the importance of specifically tackling (the return of) pain-related avoidance behaviour with techniques such as disentangling fear from avoidance and reinforcing approach behaviours. Finally, we discuss challenges in the clinical application of exposure to improve functioning in chronic primary pain and possible avenues for future research. Perspectives: Inspired by recent advances in learning theory and its applications on the treatment of anxiety disorders, we reflect on the delivery of exposure treatment for chronic primary pain and propose strategies to improve its long-term outcomes.
... Concerning interoceptive signals, a person with a liberal strategy would affirm the perception of a bodily signal faster or categorize a percept as a symptom rather than as a sensation, whereas a person with a more conservative strategy would rather "wait and see" (Macmillan & Creelman, 2005; Petersen et al., 2015). The importance of considering these strategies can be illustrated with the following example: when confronted with somatic symptoms, a liberal strategy would increase the likelihood of detecting malignant disease at the cost of time and ease of mind (this is sometimes labeled a "better-safe-than-sorry strategy," e.g., Van den Bergh et al., 2020). The costs and benefits of a conservative decision strategy are the polar opposite (i.e., a lower likelihood of detecting malignant disease, but greater ease of mind and less time invested). ...
Article
Interoception is essential for the maintenance of physical and mental health. Paradigms assessing cardioceptive accuracy do not separate sensitivity from bias or are very demanding. We present the piloting (study 1; N = 60) and psychometric evaluation and validation (study 2; N = 84) of a novel task for the assessment of cardiac interoceptive perception following the principles of signal detection theory. By disentangling sensitivity and response bias, we demonstrate that the previously used interoceptive accuracy score of the heartbeat mental tracking task represents an amalgam of sensitivity and response bias. The new task demonstrated adequate test-retest reliabilities for sensitivity (d') and response bias (c). Sensitivity was inversely related (β = -.36) to somatic symptom distress after statistically controlling for response bias. The novel cardiovascular signal detection task is easy to implement, feasible, and promising in terms of unraveling the role of (cardiac) interoceptive perception in psychopathology. (PsycInfo Database Record (c) 2021 APA, all rights reserved).
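For readers less familiar with the signal detection quantities referred to in this entry and in the surrounding citation contexts, the sketch below shows the standard equal-variance Gaussian formulas for sensitivity (d') and response bias (c). The hit and false-alarm rates are made-up illustrations of a liberal versus a conservative responder, not data from the task described above.

```python
# Standard equal-variance Gaussian SDT: d' = z(H) - z(FA), c = -(z(H) + z(FA)) / 2.
# A liberal observer (readily reports a signal) has c < 0; a conservative one has c > 0.
from scipy.stats import norm

def dprime_and_c(hit_rate, fa_rate):
    z_h, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
    return z_h - z_fa, -(z_h + z_fa) / 2

# Two hypothetical observers with matched sensitivity but opposite response bias:
print(dprime_and_c(0.85, 0.40))  # liberal: d' ~ 1.29, c ~ -0.39
print(dprime_and_c(0.60, 0.15))  # conservative: d' ~ 1.29, c ~ +0.39
```

Disentangling d' from c in this way is what allows a liberal, "better-safe-than-sorry" reporting style to be separated from genuine differences in interoceptive sensitivity.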
... In daily life, most prolonged physiological activity is due not to stressful events themselves but to perseverative cognition about them. Since we are capable of worrying and ruminating about anything at any time, and since worry is governed by the principle of "better safe than sorry" [78], episodes of worry/rumination and the accompanying physiological stress responses can be frequent and long-lasting and can become an obstacle to recovery [79]. Worry/rumination can also interfere with subsequent nocturnal sleep and sleep quality [80], which is particularly problematic since sleep is important for recovery from daily strain, for preventing exhaustion [81], and for recovery from burnout [82]. ...
Article
Full-text available
Burnout is common in many countries and is associated with several other problems, such as depression, anxiety, insomnia, and memory deficits, and prospectively it predicts long-term sick-leave, cardiovascular disease, and death. Clinical burnout or its residual symptoms often last several years, and a common assumption is that recovery takes a long time by nature, despite full-time sick-leave and the absence of work stress. The literature suggests models that hypothetically explain the development, but not the maintenance, of the syndrome. Based on cognitive and behavioral principles, stress research, and stress theories, this paper describes a theoretical model explaining how clinical burnout can develop and be maintained. While the development of clinical burnout is mainly explained by prolonged stress reactions and disturbed recovery processes due to work-related stressors, maintenance of the syndrome is particularly explained by prolonged stress reactions and disturbed recovery processes due to the new context of experiencing burnout and being on sick-leave. Worry about acquired memory deficits, passivity and excessive sleep, shame, fear of stress reactions, and the perception of not being safe are examples of responses that can contribute to the maintenance. The model has important implications for research and for how to intervene in clinical burnout. For example, it can offer support to professional care providers and patients in terms of focusing on, identifying, and changing current contextual factors and behaviors that maintain the individual's clinical burnout symptoms and thereby facilitate burnout recovery. Regarding research, the model provides a highly important reason for researchers to study contextual factors and behaviors that contribute to the maintenance of clinical burnout, which has been neglected in research.
... In daily life, most prolonged physiological activity is due not to stressful events themselves but to perseverative cognition about them. Since we are capable of worrying and ruminating about anything at any time, and since worry is governed by the principle of "better safe than sorry" [96], episodes of worry/rumination and the accompanying physiological stress responses can be frequent and long-lasting and can become an obstacle to recovery [97]. Worry/rumination in terms of "persistent thoughts about work" predicts disturbed sleep [41], and daily worry predicts low heart rate variability (indicating stress activation) during subsequent nocturnal sleep [98], which is particularly problematic since good sleep is crucial for recovery from daily strain [41,98] and burnout [46,99]. ...
Preprint
Full-text available
Burnout is common in many countries and is associated with several other problems, such as depression, anxiety, insomnia and memory deficits, and prospectively it predicts long-term sick-leave, cardiovascular disease and death. Clinical burnout or its residual symptoms often last several years, and a common assumption is that recovery takes a long time by nature despite full-time sick-leave and absence of work stress. The literature suggests models that hypothetically explain the development, but not the maintenance, of the syndrome. Based on cognitive and behavioral principles and stress theory, this paper describes a theoretical model explaining how clinical burnout can develop and be maintained. While the development of clinical burnout is mainly explained by prolonged stress reactions and disturbed recovery processes due to work-related stressors, maintenance of the syndrome is particularly explained by prolonged stress reactions and disturbed recovery processes due to the new context of experiencing burnout and being on sick-leave. Worry about acquired memory deficits, passivity and excessive sleep, shame, fear of stress reactions, and the perception of not being safe are examples of responses that can contribute to the maintenance. The model has important implications for research and for how to intervene in clinical burnout.
Article
Distortions in the recollection of autobiographical memories are a transdiagnostic feature of multiple mental health difficulties including mood, anxiety, stressor-related, eating and psychotic disorders. These distortions can be categorized into three broad domains: relatively increased accessibility, affective impact and degree of detail for memories of negatively valenced events with corollary reductions along these dimensions for positive memories; unwanted and distressing intrusive memories of salient past events such as traumas; and a marked relative difficulty in the voluntary retrieval of specific, emotive autobiographical episodes in favour of general themes aggregated across multiple episodes or across extended autobiographical periods. In this Review, we summarize basic science investigations that have carefully mapped the nature of these recollective distortions transdiagnostically across a range of syndromes, and elucidate their causal roles in the onset, maintenance and recovery from disorder. The amenability of these distortions to improvement through cognitive training has led to the translation of this basic science into a number of exciting memory-based interventions that target distortions to generate downstream improvements in clinical symptoms. We review and evaluate these interventions. Finally, we offer a theoretical framework that integrates the basic and clinical research across these three domains and suggest key future research directions. Distortions of autobiographical memory recollection characterize a variety of mental health disorders. In this Review, Dalgleish and Hitchcock summarize key basic research findings in three domains of autobiographical memory distortion, and describe how these have been leveraged in pre-clinical and clinical interventions.
Article
Although fear learning mechanisms are implicated in the development, maintenance, exacerbation, and reduction of genital pain, systematic research on how fear of genital pain emerges, spreads, persists, and reemerges after treatment is lacking. This paper provides an overview of the literature on pain-related fear, integrates the ideas on learning and sexual arousal responding, and specifies the pathways through which compromised learning may contribute to the development and persistence of genital pain. In order to refine theories of genital pain and optimize treatments, we need to adopt a biopsychosocial framework to pain-related fear learning and uncover potential moderators that shape individual trajectories. This involves examining the role of physiological processes, subjective experiences, as well as partner and relational cues in fear acquisition, excessive generalization and impaired safety learning, extinction of fear, counterconditioning, and return of fear. Recent methodological advances in fear conditioning and sex research are promising to enable more symptom-specific and ecologically valid experimental paradigms.
Article
Full-text available
Active inference is a leading theory in neuroscience that provides a simple and neurobiologically plausible account of how action and perception are coupled in producing (Bayes) optimal behavior, and it has recently been used to explain a variety of psychopathological conditions. In parallel, morphogenesis has been described as the behavior of a (non-neural) cellular collective intelligence solving problems in anatomical morphospace. In this article, we establish a link between the domains of cell biology and neuroscience by analyzing disorders of morphogenesis as disorders of (active) inference. The aim of this article is threefold. We want to: (i) reveal a connection between disorders of morphogenesis and disorders of active inference as apparent in psychopathological conditions; (ii) show how disorders of morphogenesis can be simulated using active inference; (iii) suggest that active inference can shed light on developmental defects or aberrant morphogenetic processes, seen as disorders of information processing, perhaps suggesting novel intervention and repair strategies. We present four simulations illustrating the application of these ideas to cellular behavior during morphogenesis. Three of the simulations show that the same forms of aberrant active inference (e.g., deficits of sensory attenuation and low sensory precision) that have been used to explain psychopathological conditions (e.g., schizophrenia and autism) also produce familiar disorders of development and morphogenesis when implemented at the level of the collective behavior of a group of cells. The fourth simulation involves two cells with excessively high precision, in which we show that reducing concentration signaling and sensitivity to the signals of other cells remedies the developmental defect. Finally, we present the results of an experimental test of one of the model's predictions in early Xenopus laevis embryos: thioridazine (a dopamine antagonist that may reduce sensory precision in biological systems) induced developmental (anatomical) defects as predicted. The use of conceptual and empirical tools from neuroscience to understand the morphogenetic behavior of pre-neural agents offers the possibility of new approaches in regenerative medicine and evolutionary developmental biology.
Article
Avoidance behavior is a core symptom of anxiety disorders that may hinder adaptation. Anxiety disorders are heterogeneous, and previous research suggests decomposing anxiety into two dimensions: anxious apprehension and anxious arousal. How these two dimensions are associated with avoidance of and exposure to threatening stimuli, as well as their accompanying neural processes, is barely understood. We examined threat processing using event-related potentials (N1, LPP) from 134 individuals, considering the influence of anxiety dimensions. During a two-phase picture-viewing task, participants watched neutral and threatening pictures, which they were instructed to either avoid or attend to during repeated presentations. Results showed that threatening pictures, compared with neutral pictures, were associated with increased attention allocation (N1) and in-depth processing (LPP), modulated by task instructions (lower during avoidance). Further, increased anxious apprehension was associated with heightened automatic attention (increased N1), followed by reduced LPP amplitudes for threatening pictures, suggesting reduced in-depth processing. During re-exposure, threatening pictures were associated with increased in-depth processing, with no difference between previously avoided and maintained pictures. Together, these results illustrate that avoidance and high anxious apprehension seem to lead to similar neural changes in the processing of aversive images that may conflict with long-term adaptation.
Article
Pain-related fear and avoidance crucially contribute to pain chronification. People with chronic pain may adopt costly avoidance strategies above and beyond what is necessary, aligning with experimental findings of excessive fear generalization to safe movements in these populations. Furthermore, recent evidence suggests that, when avoidance is costly, it can dissociate from fear. Here, we investigated whether concurrently measured pain-related fear and costly avoidance generalization correspond in one task. We also explored whether healthy participants avoid excessively despite associated costs, and whether avoidance would decrease as a function of dissimilarity from a pain-associated movement. In a robotic arm-reaching task, participants could avoid a low-cost, pain-associated movement trajectory (T+) by choosing a high-cost, non-painful movement trajectory (T-) at opposite ends of a movement plane. Subsequently, in the absence of pain, we introduced three movement trajectories (G1-3) between T+ and T-, and one movement trajectory on the side of T- opposite to T+ (G4), linearly increasing in costs from T+ to G4. Avoidance was operationalized as maximal deviation from T+, and as trajectory choice. Fear learning was measured using self-reported pain-expectancy, pain-related fear, and startle eye-blink EMG. Self-reports generalized, both decreasing with increasing distance from T+. In contrast, all generalization trajectories were chosen equally, suggesting that avoidance costs and previous pain balanced each other out. No effects emerged in the EMG. These results add to a growing body of literature showing that (pain-related) avoidance, especially when costly, can dissociate from fear, calling for a better understanding of the factors motivating, and mitigating, disabling avoidance.
Article
Pain-related avoidance of movements that are actually safe (i.e., overprotective behavior) plays a key role in chronic pain disability. Avoidance is reinforced through operant learning: after learning that a certain movement elicits pain, movements that prevent pain are more likely to be performed. Proprioceptive accuracy importantly contributes to motor learning and memory. Interestingly, reduced accuracy has been documented in various chronic pain conditions, prompting the question whether this relates to avoidance becoming excessive. Using robotic arm-reaching movements, we tested the hypothesis that poor proprioceptive accuracy is associated with excessive pain-related avoidance in pain-free participants. Participants first performed a task to assess proprioceptive accuracy, followed by an operant avoidance training during which a pain stimulus was presented when they performed one movement trajectory, but not when they performed another trajectory. During a test phase, movements were no longer restricted to two trajectories, but participants were instructed to avoid pain. Unbeknownst to the participants, the pain stimulus was never presented during this phase. Results supported our hypothesis. Furthermore, exploratory analyses indicated a reduction in proprioceptive accuracy after avoidance learning, which was associated with excessive avoidance and higher trait fear of pain. Perspective: This study is the first to show that poorer proprioceptive accuracy is associated with excessive pain-related avoidance. This finding is especially relevant for chronic pain conditions, as reduced accuracy has been documented in these populations, and points toward the need for research on training accuracy to tackle excessive avoidance.
Article
Childhood maltreatment is a potent interpersonal trauma associated with dysregulation of emotional processes relevant to the development of psychopathology. The current study identified prospective links between patterns of maltreatment exposures and dimensions of emotion regulation in emerging adulthood. Participants included 427 individuals (48% Male; 75.9% Black, 10.8% White, 7.5% Hispanic, 6% Other) assessed at two waves. At Wave 1, children (10-12 years) from families eligible for public assistance with and without involvement with Child Protective Services took part in a research summer camp. Patterns of child maltreatment subtype and chronicity (based on coded CPS record data) were used to predict Wave 2 (age 18-24 years) profiles of emotion regulation based on self-report, and affective processing assessed via the Affective Go/No-Go task. Results identified associations between task-based affective processing and self-reported emotion regulation profiles. Further, chronic, multi-subtype childhood maltreatment exposure predicted difficulties with aggregated emotion dysregulation. Exposure to neglect with and without other maltreatment subtypes predicted lower sensitivity to affective words. Nuanced results distinguish multiple patterns of emotion regulation in a sample of emerging adults with high exposure to trauma and socioeconomic stress and suggest that maltreatment disrupts emotional development, resulting in difficulties identifying emotions and coping with emotional distress.
Article
Objective: Persistent somatic symptoms cause strong impairment in persons with somatic symptom disorder (SSD) and depressive disorders (DD). Specific negative psychological factors (NPF), such as catastrophizing, negative affectivity, and behavioral avoidance, are assumed to contribute to this impairment and may maintain symptoms via dysregulations of biological stress systems. We examined the associations between NPF and somatic symptoms in the daily life of women with SSD or DD and investigated the mediating role of psychobiological stress responses. Methods: Twenty-nine women with SSD and 29 women with DD participated in an ecological momentary assessment study. Over 14 days, intensity of and impairment by somatic symptoms, NPF, and stress-related biological measures (cortisol, alpha-amylase) were assessed five times per day using an electronic device and saliva samples. Multilevel models were conducted. Results: The greater the number of NPF, the higher the concurrent and time-lagged intensity of and impairment by somatic symptoms in both groups (12.0 to 38.6% of variance explained; Chi2(12) p < .001 for all models). NPF were associated with higher cortisol levels in women with DD and with lower levels in women with SSD (interaction NPF*group: B = -0.04, p = .042 for concurrent, B = -0.06, p = .019 for time-lagged). In women with SSD, lower cortisol levels were associated with higher intensity at the next measurement time point (group*cortisol: B = -1.71, p = .020). No mediation effects were found. Conclusions: NPF may be considered as transdiagnostic factors in the development and treatment of impairing somatic symptoms. Our findings will allow the development of new treatment strategies which use ecological momentary intervention approaches focusing on NPF.
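For readers unfamiliar with the multilevel models mentioned in this abstract, the sketch below shows a generic mixed-effects regression of momentary symptom intensity on momentary negative psychological factors (NPF) with a random intercept per participant. The simulated data, column names, and coefficients are hypothetical and only stand in for the study's actual ecological momentary assessment data and model specification.

```python
# Generic multilevel sketch for EMA-style data: repeated prompts nested in participants,
# momentary symptom intensity regressed on momentary NPF (all values simulated).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_persons, n_prompts = 30, 20
pid = np.repeat(np.arange(n_persons), n_prompts)
npf = rng.normal(size=n_persons * n_prompts)                  # momentary negative psychological factors
baseline = rng.normal(scale=0.8, size=n_persons)[pid]         # person-specific symptom baseline
intensity = baseline + 0.4 * npf + rng.normal(size=npf.size)  # momentary symptom intensity
ema = pd.DataFrame({"participant": pid, "npf": npf, "intensity": intensity})

model = smf.mixedlm("intensity ~ npf", data=ema, groups=ema["participant"]).fit()
print(model.summary())  # fixed effect of npf is ~0.4 by construction
```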
Article
Full-text available
We review the predictive processing theory’s take on goals and affect, to shed new light on mental distress and how it develops into psychopathology such as in affective and motivational disorders. This analysis recovers many of the classical factors known to be important in those disorders, like uncertainty and control, but integrates them in a mechanistic model of adaptive and maladaptive cognition and behavior. We derive implications for treatment that have so far remained underexposed in existing predictive processing accounts of mental disorder, specifically with regard to the model-dependent construction of value, the importance of model validation (evidence), and the introduction and learning of new, adaptive beliefs that relieve suffering.
Article
Full-text available
Excessive generalization of fear and avoidance is a hallmark symptom of chronic pain disability, yet research focusing on the mechanisms underlying generalization of avoidance specifically is scarce. Two experiments investigated the boundary conditions of costly pain-related avoidance generalization in healthy participants who learned to avoid pain by performing increasingly effortful (in terms of deviation and force) arm-movements using a robot-arm (acquisition). During generalization, novel but similar arm-movements, without pain, were tested. Experiment 1 (N=64) aimed to facilitate generalization to these movements by reducing visual contextual changes between acquisition and generalization, whereas Experiment 2 (N=70) aimed to prevent extinction by increasing pain uncertainty. Both experiments showed generalization of pain-expectancies and pain-related fear. However, Experiment 2 was the first and only one to also demonstrate generalization of avoidance, i.e., choosing the novel effortful arm-movements in the absence of pain. These results suggest that uncertainty about the occurrence of pain may delay recovery, due to reduced disconfirmation of threat beliefs when exploring, resulting in persistent avoidance. Perspective: This article demonstrates generalization of instrumentally acquired costly pain-related avoidance in healthy people under conditions of uncertainty. The results suggest that targeting pain-related uncertainty may be a useful tool for clinicians adopting a psychological approach to treating excessive pain-related avoidance in chronic pain.
Article
Full-text available
[Background] Bodily symptoms are highly prevalent in psychopathology, and in some specific disorders, such as somatic symptom disorder, they are a central feature. In general, the mechanisms underlying these symptoms are poorly understood. However, even in well-known physical diseases there seems to be a variable relationship between physiological dysfunction and self-reported symptoms, challenging traditional assumptions of a biomedical disease model. [Method] Recently, a new, predictive processing conceptualization of how the brain works has been used to understand this variable relationship. According to this predictive processing view, the experience of a symptom results from an integration of interoceptive sensations with the brain's predictions about these sensations. [Results] In the present paper, we introduce the predictive processing perspective on perception (predictive coding) and action (active inference) and apply it to asthma in order to understand when and why asthma symptoms are sometimes strongly, moderately, or weakly related to physiological disease parameters. [Conclusion] Our predictive processing view of symptom perception contributes to understanding under which conditions misperceptions and maladaptive action selection may arise.
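The integration of predictions with interoceptive input described in this abstract can be illustrated with a minimal precision-weighted (Gaussian) fusion. The numbers below are arbitrary toy values and the code is a textbook Bayesian-fusion sketch rather than the authors' model; it shows how, when the sensory input is imprecise (low in detail), the resulting percept is pulled toward the prior prediction, which is the mechanism the "better-safe-than-sorry" account appeals to when threat-related priors dominate symptom experience.

```python
# Precision-weighted fusion of a prior prediction and a sensory sample:
# posterior mean = (pi_prior * mu_prior + pi_sens * x) / (pi_prior + pi_sens),
# where precision pi = 1 / variance. All values are arbitrary illustrations.
def fuse(mu_prior, var_prior, x, var_sens):
    pi_p, pi_s = 1.0 / var_prior, 1.0 / var_sens
    return (pi_p * mu_prior + pi_s * x) / (pi_p + pi_s)

mu_prior, x = 7.0, 2.0  # predicted vs. actually signaled symptom intensity (toy units)
print(fuse(mu_prior, 1.0, x, var_sens=1.0))  # precise input: percept near the midpoint (4.5)
print(fuse(mu_prior, 1.0, x, var_sens=9.0))  # imprecise input: percept dominated by the prior (6.5)
```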
Article
Full-text available
Introduction: Recent advances in research on stress and, respectively, on disorders of perception, learning, and behaviour speak to a promising synthesis of current insights from (i) neurobiology, cognitive neuroscience and psychology of stress and post-traumatic stress disorder (PTSD), and (ii) computational psychiatry approaches to pathophysiology (e.g. of schizophrenia and autism). Methods: Specifically, we apply this synthesis to PTSD. The framework of active inference offers an embodied and embedded lens through which to understand neuronal mechanisms, structures, and processes of cognitive function and dysfunction. In turn, this offers an explanatory model of how healthy mental functioning can go awry due to psychopathological conditions that impair inference about our environment and our bodies. In this context, auditory phenomena—known to be especially relevant to studies of PTSD and schizophrenia—and traditional models of auditory function can be viewed from an evolutionary perspective based on active inference. Results: We assess and contextualise a range of evidence on audition, stress, psychosis, and PTSD, and bring some existing partial models of PTSD into multilevel alignment. Conclusions: The novel perspective on PTSD we present aims to serve as a basis for new experimental designs and therapeutic interventions that integrate fundamentally biological, cognitive, behavioural, and environmental factors.
Chapter
Full-text available
When extreme, anxiety can become debilitating. Anxiety disorders, which often first emerge early in development, are common and challenging to treat, yet the underlying mechanisms have only recently begun to come into focus. Here, we review new insights into the nature and biological bases of dispositional negativity, a fundamental dimension of childhood temperament and adult personality and a prominent risk factor for the development of pediatric and adult anxiety disorders. Converging lines of epidemiological, neurobiological, and mechanistic evidence suggest that dispositional negativity increases the likelihood of psychopathology via specific neurocognitive mechanisms, including attentional biases to threat and deficits in executive control. Collectively, these observations provide an integrative translational framework for understanding the development and maintenance of anxiety disorders in adults and youth and set the stage for developing improved intervention strategies.
Article
Full-text available
To infer the causes of its sensations, the brain must call on a generative (predictive) model. This necessitates passing local messages between populations of neurons to update beliefs about hidden variables in the world beyond its sensory samples. It also entails inferences about how we will act. Active inference is a principled framework that frames perception and action as approximate Bayesian inference. This has been successful in accounting for a wide range of physiological and behavioral phenomena. Recently, a process theory has emerged that attempts to relate inferences to their neurobiological substrates. In this paper, we review and develop the anatomical aspects of this process theory. We argue that the form of the generative models required for inference constrains the way in which brain regions connect to one another. Specifically, neuronal populations representing beliefs about a variable must receive input from populations representing the Markov blanket of that variable. We illustrate this idea in four different domains: perception, planning, attention, and movement. In doing so, we attempt to show how appealing to generative models enables us to account for anatomical brain architectures. Ultimately, committing to an anatomical theory of inference ensures we can form empirical hypotheses that can be tested using neuroimaging, neuropsychological, and electrophysiological experiments.
Article
Full-text available
Understanding the neural correlates of the neurotic brain is important because neuroticism is a risk factor for the development of psychopathology. We examined the correlation between brain structural networks and neuroticism based on NEO Five-Factor Inventory (NEO-FFI) scores. Fifty-one healthy participants (female, n = 18; male, n = 33; mean age, 38.5 ± 11.7 years) underwent the NEO-FFI test and magnetic resonance imaging (MRI), including diffusion tensor imaging and 3D T1WI. Using MRI data, for each participant, we constructed whole-brain interregional connectivity matrices by deterministic tractography and calculated the graph theoretical network measures, including the characteristic path length, global clustering coefficient, small-worldness, and betweenness centrality (BET) in 83 brain regions from the Desikan-Killiany atlas with subcortical segmentation using FreeSurfer. In relation to the BET, neuroticism score had a negative correlation in the left isthmus cingulate cortex, left superior parietal, left superior temporal, right caudal middle frontal, and right entorhinal cortices, and a positive correlation in the bilateral frontal pole, left caudal anterior cingulate cortex, and left fusiform gyrus. No other measurements showed significant correlations. Our results imply that the brain regions related to neuroticism exist in various regions, and that the neuroticism trait is likely formed as a result of interactions among these regions. This work was supported by a Grant-in-Aid for Scientific Research on Innovative Areas (Comprehensive Brain Science Network) from the Ministry of Education, Science, Sports and Culture of Japan.
Article
Full-text available
There has been a recent growth in investigations into the neural mechanisms underlying the problems recalling specific autobiographical events that are a core feature of emotional disorders. In this review we provide the first synthesis of this literature, taking into account brain as well as cognitive mechanisms. We suggest that these problems are driven by idiosyncratic activation in areas of the brain associated with assigning salience and self-relevance to emotional memories. Other areas associated with inhibiting distraction and constructing vivid memory representations are also important. Each of these mechanisms may work independently or in concert with one another. Importantly, this interaction between mechanisms may differ between diagnostic and demographic groups such that similar problems in specificity may be characterised by different mechanisms. Given this challenge, neuroimaging may prove useful in identifying patient-specific biomarkers for interventions.
Article
Full-text available
Serotonin has widespread, but computationally obscure, modulatory effects on learning and cognition. Here, we studied the impact of optogenetic stimulation of dorsal raphe serotonin neurons in mice performing a non-stationary, reward-driven decision-making task. Animals showed two distinct choice strategies. Choices after short inter-trial-intervals (ITIs) depended only on the last trial outcome and followed a win-stay-lose-switch pattern. In contrast, choices after long ITIs reflected outcome history over multiple trials, as described by reinforcement learning models. We found that optogenetic stimulation during a trial significantly boosted the rate of learning that occurred due to the outcome of that trial, but these effects were only exhibited on choices after long ITIs. This suggests that serotonin neurons modulate reinforcement learning rates, and that this influence is masked by alternate, unaffected, decision mechanisms. These results provide insight into the role of serotonin in treating psychiatric disorders, particularly its modulation of neural plasticity and learning.
Article
Full-text available
Understanding the neural substrates of depression is crucial for diagnosis and treatment. Here, we review recent studies of functional and effective connectivity in depression, in terms of functional integration in the brain. Findings from these studies, including our own, point to the involvement of at least four networks in patients with depression. Elevated connectivity of a ventral limbic affective network appears to be associated with excessive negative mood (dysphoria) in the patients; decreased connectivity of a frontal‐striatal reward network has been suggested to account for loss of interest, motivation, and pleasure (anhedonia); enhanced default mode network connectivity seems to be associated with depressive rumination; and diminished connectivity of a dorsal cognitive control network is thought to underlie cognitive deficits especially ineffective top‐down control of negative thoughts and emotions in depressed patients. Moreover, the restoration of connectivity of these networks—and corresponding symptom improvement—following antidepressant treatment (including medication, psychotherapy, and brain stimulation techniques) serves as evidence for the crucial role of these networks in the pathophysiology of depression.
Article
Full-text available
How do we navigate a deeply structured world? Why are you reading this sentence first - and did you actually look at the fifth word? This review offers some answers by appealing to active inference based on deep temporal models. It builds on previous formulations of active inference to simulate behavioural and electrophysiological responses under hierarchical generative models of state transitions. Inverting these models corresponds to sequential inference, such that the state at any hierarchical level entails a sequence of transitions in the level below. The deep temporal aspect of these models means that evidence is accumulated over nested time scales, enabling inferences about narratives (i.e., temporal scenes). We illustrate this behaviour with Bayesian belief updating - and neuronal process theories - to simulate the epistemic foraging seen in reading. These simulations reproduce perisaccadic delay period activity and local field potentials seen empirically. Finally, we exploit the deep structure of these models to simulate responses to local (e.g., font type) and global (e.g., semantic) violations; reproducing mismatch negativity and P300 responses respectively.
Article
Full-text available
Prolonged physiological stress responses form an important risk factor for disease. According to neurobiological and evolution-theoretical insights the stress response is a default response that is always “on” but inhibited by the prefrontal cortex when safety is perceived. Based on these insights the Generalized Unsafety Theory of Stress (GUTS) states that prolonged stress responses are due to generalized and largely unconsciously perceived unsafety rather than stressors. This novel perspective necessitates a reconstruction of current stress theory, which we address in this paper. We discuss a variety of very common situations without stressors but with prolonged stress responses, that are not, or not likely to be caused by stressors, including loneliness, low social status, adult life after prenatal or early life adversity, lack of a natural environment, and less fit bodily states such as obesity or fatigue. We argue that in these situations the default stress response may be chronically disinhibited due to unconsciously perceived generalized unsafety. Also, in chronic stress situations such as work stress, the prolonged stress response may be mainly caused by perceived unsafety in stressor-free contexts. Thus, GUTS identifies and explains far more stress-related physiological activity that is responsible for disease and mortality than current stress theories.
Article
Full-text available
Psychological accounts of symptom perception put forward that symptom experiences consist of sensory-perceptual and affective-motivational components. This division is also suggested by psychometric studies investigating the latent structure of symptom reporting. To corroborate the view that the general and symptom-specific factors of a bifactor model represent affective and sensory components, respectively, we fitted bifactor models using confirmatory factor analytic approaches to the Patient Health Questionnaire-15 and the Checklist for Symptoms in Daily Life completed by 1053 undergraduate students. Additionally, we explored the association of latent factors with negative affectivity (NA). For both questionnaires, a bifactor model with one general and several symptom-specific factors provided the best fit to the data. NA yielded large associations with the general factor, but smaller ones with somatic symptom-specific factors in both questionnaires. The observed latent structure supports a distinction between sensory-perceptual and affective-motivational components, and the association between NA and the general factor confirms the affective tone of the latter.
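To make the bifactor structure described here concrete, the sketch below generates toy data in which every symptom item loads on one general factor plus one group-specific factor. The loadings, item counts, and symptom groups are arbitrary assumptions, and fitting an actual bifactor CFA (as in the study) would additionally require an SEM package.

```python
# Toy bifactor data generator: item = 0.6 * G + 0.4 * S_group + noise.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n = 1000
G = rng.normal(size=n)  # general (affective) factor
specific = {g: rng.normal(size=n) for g in ("cardio", "gastro", "pain")}  # hypothetical symptom groups

items = {}
for group, S in specific.items():
    for i in range(4):  # four items per symptom group
        items[f"{group}_{i}"] = 0.6 * G + 0.4 * S + rng.normal(scale=0.7, size=n)

# Items within a group correlate through both G and their specific factor;
# items from different groups correlate only through G.
print(pd.DataFrame(items).corr().round(2))
```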
Article
Full-text available
This article outlines how a core concept from theories of homeostasis and cybernetics, the inference-control loop, may be used to guide differential diagnosis in computational psychiatry and computational psychosomatics. In particular we discuss (i) how conceptualizing perception and action as inference-control loops yields a joint computational perspective on brain-world and brain-body interactions and (ii) how the concrete formulation of this loop as a hierarchical Bayesian model points to key computational quantities that inform a taxonomy of potential disease mechanisms. We consider the utility of this perspective for differential diagnosis in concrete clinical applications.
Article
Full-text available
The reliability and validity of traditional taxonomies are limited by arbitrary boundaries between psychopathology and normality, often unclear boundaries between disorders, frequent disorder co-occurrence, heterogeneity within disorders, and diagnostic instability. These taxonomies went beyond evidence available on the structure of psychopathology and were shaped by a variety of other considerations, which may explain the aforementioned shortcomings. The Hierarchical Taxonomy Of Psychopathology (HiTOP) model has emerged as a research effort to address these problems. It constructs psychopathological syndromes and their components/subtypes based on the observed covariation of symptoms, grouping related symptoms together and thus reducing heterogeneity. It also combines co-occurring syndromes into spectra, thereby mapping out comorbidity. Moreover, it characterizes these phenomena dimensionally, which addresses boundary problems and diagnostic instability. Here, we review the development of the HiTOP and the relevant evidence. The new classification already covers most forms of psychopathology. Dimensional measures have been developed to assess many of the identified components, syndromes, and spectra. Several domains of this model are ready for clinical and research applications. The HiTOP promises to improve research and clinical practice by addressing the aforementioned shortcomings of traditional nosologies. It also provides an effective way to summarize and convey information on risk factors, etiology, pathophysiology, phenomenology, illness course, and treatment response. This can greatly improve the utility of the diagnosis of mental disorders. The new classification remains a work in progress. However, it is developing rapidly and is poised to advance mental health research and care significantly as the relevant science matures.
Article
Full-text available
Categorization is a fundamental ability for efficient behavioral control. It allows organisms to remember the correct responses to categorical cues rather than to every stimulus encountered (hence reducing computational cost and complexity), and to generalize appropriate responses to novel stimuli dependent on category assignment. Assuming the brain performs Bayesian inference, based on a generative model of the external world and future goals, we propose a computational model of categorization in which important properties emerge. These properties comprise the ability to infer latent causes of sensory experience, a hierarchical organization of latent causes, and an explicit inclusion of context and action representations. Crucially, these aspects derive from considering the environmental statistics that are relevant to achieving goals, and from the fundamental Bayesian principle that any generative model should be preferred over alternative models based on an accuracy-complexity trade-off. Our account is a step toward elucidating computational principles of categorization and its role within the Bayesian brain hypothesis.
Chapter
Full-text available
Although affective value is fundamental in explanations of behavior, it is still a somewhat alien concept in cognitive science. It implies a normativity or directionality that mere information processing models cannot seem to provide. In this paper, we trace how affective value can emerge from information processing in the brain, as described by predictive processing. We explain the grounding of predictive processing in homeostasis and articulate the implications this has for the concept of reward and motivation. However, at first sight, this new conceptualization creates a strong tension with conventional ideas on reward and affective experience. We propose this tension can be resolved by realizing that valence, a core component of all emotions, might be the reflection of a specific aspect of predictive information processing, namely the dynamics in prediction errors across time and the expectations we, in turn, form about these dynamics. Specifically, positive affect seems to be caused by positive rates of prediction error reduction, while negative affect is induced by a shift from a state with lower prediction errors to one with higher prediction errors (i.e., a negative rate of error reduction). We also consider how intense emotional episodes might be related to unexpected changes in prediction errors, suggesting that we also build (meta)predictions on error reduction rates. Hence, in this account, emotions appear as the continuous non-conceptual feedback on evolving (increasing or decreasing) uncertainties relative to our predictions. The upshot of this view is that the various emotions, from "basic" ones to non-typical ones such as humor, curiosity, and aesthetic affects, can be shown to follow a single underlying logic. Our analysis takes several cues from existing emotion theories but deviates from them in revealing ways. The account on offer does not just specify the interactions between emotion and cognition; rather, it entails a deep integration of the two.
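The chapter's central claim, that valence tracks the rate of change of prediction error over time, can be expressed as a simple discrete-time toy. The trajectory below is arbitrary, and this is an illustrative reading of the idea rather than the chapter's formal model.

```python
# Toy reading of "valence ~ negative rate of change of prediction error":
# falling error yields positive affect, rising error yields negative affect.
import numpy as np

prediction_error = np.array([5.0, 4.0, 3.0, 3.5, 5.0, 4.0])  # arbitrary error trajectory over time
valence = -np.diff(prediction_error)                          # positive while error falls, negative while it rises
print(valence)                                                # [ 1.   1.  -0.5 -1.5  1. ]
```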
Article
Full-text available
We propose a taxonomy of psychopathology based on patterns of shared causal influences identified in a review of multivariate behavior genetic studies that distinguish genetic and environmental influences that are either common to multiple dimensions of psychopathology or unique to each dimension. At the phenotypic level, first-order dimensions are defined by correlations among symptoms; correlations among first-order dimensions similarly define higher-order domains (e.g., internalizing or externalizing psychopathology). We hypothesize that the robust phenotypic correlations among first-order dimensions reflect a hierarchy of increasingly specific etiologic influences. Some nonspecific etiologic factors increase risk for all first-order dimensions of psychopathology to varying degrees through a general factor of psychopathology. Other nonspecific etiologic factors increase risk only for all first-order dimensions within a more specific higher-order domain. Furthermore, each first-order dimension has its own unique causal influences. Genetic and environmental influences common to family members tend to be nonspecific, whereas environmental influences unique to each individual are more dimension-specific. We posit that these causal influences on psychopathology are moderated by sex and developmental processes. This causal taxonomy also provides a novel framework for understanding the heterogeneity of each first-order dimension: Different persons exhibiting similar symptoms may be influenced by different combinations of etiologic influences from each of the 3 levels of the etiologic hierarchy. Furthermore, we relate the proposed causal taxonomy to transdimensional psychobiological processes, which also impact the heterogeneity of each psychopathology dimension. This causal taxonomy implies the need for changes in strategies for studying the etiology, psychobiology, prevention, and treatment of psychopathology.
Article
Full-text available
We review evidence for training programmes that manipulate autobiographical processing in order to treat mood, anxiety, and stress-related disorders, using the GRADE criteria to judge evidence quality. We also position the current status of this research within the UK Medical Research Council's (2000, 2008) framework for the development of novel interventions. A literature search according to PRISMA guidelines identified 15 studies that compared an autobiographical episodic memory-based training (AET) programme to a control condition, in samples with a clinician-derived diagnosis. Identified AET programmes included Memory Specificity Training (Raes, Williams, & Hermans, 2009), concreteness training (Watkins, Baeyens, & Read, 2009), Competitive Memory Training (Korrelboom, van der Weele, Gjaltema, & Hoogstraten, 2009), imagery-based training of future autobiographical episodes (Blackwell & Holmes, 2010), and life review/reminiscence therapy (Arean et al., 1993). Cohen's d was calculated for between-group differences in symptom change from pre- to post-intervention and to follow-up. We also completed meta-analyses for programmes evaluated across multiple studies, and for the overall effect of AET as a treatment approach. Results demonstrated promising evidence for AET in the treatment of depression (d = 0.32); however, effect sizes varied substantially (from −0.18 to 1.91) across the different training protocols. Currently, research on AET for the treatment of anxiety and stress-related disorders is not yet at a stage where firm conclusions regarding efficacy can be drawn, as only a very small number of studies met the inclusion criteria. AET offers a potential avenue through which low-intensity treatment for affective disturbance might be offered.
Article
Full-text available
This paper outlines a hierarchical Bayesian framework for interoception, homeostatic/allostatic control, and meta-cognition that connects fatigue and depression to the experience of chronic dyshomeostasis. Specifically, viewing interoception as the inversion of a generative model of viscerosensory inputs allows for a formal definition of dyshomeostasis (as chronically enhanced surprise about bodily signals, or, equivalently, low evidence for the brain's model of bodily states) and allostasis (as a change in prior beliefs or predictions which define setpoints for homeostatic reflex arcs). Critically, we propose that the performance of interoceptive-allostatic circuitry is monitored by a metacognitive layer that updates beliefs about the brain's capacity to successfully regulate bodily states (allostatic self-efficacy). In this framework, fatigue and depression can be understood as sequential responses to the interoceptive experience of dyshomeostasis and the ensuing metacognitive diagnosis of low allostatic self-efficacy. While fatigue might represent an early response with adaptive value (cf. sickness behavior), the experience of chronic dyshomeostasis may trigger a generalized belief of low self-efficacy and lack of control (cf. learned helplessness), resulting in depression. This perspective implies alternative pathophysiological mechanisms that are reflected by differential abnormalities in the effective connectivity of circuits for interoception and allostasis. We discuss suitably extended models of effective connectivity that could distinguish these connectivity patterns in individual patients and may help inform differential diagnosis of fatigue and depression in the future.
Article
Full-text available
When extreme, anxiety can become debilitating. Anxiety disorders, which often first emerge early in development, are common and challenging to treat, yet the neurocognitive mechanisms that confer increased risk have only recently begun to come into focus. Here we review recent work highlighting the importance of neural circuits centered on the amygdala. We begin by describing dispositional negativity, a core dimension of childhood temperament and adult personality and an important risk factor for the development of anxiety disorders and other kinds of stress-sensitive psychopathology. Converging lines of epidemiological, neurophysiological, and mechanistic evidence indicate that the amygdala supports stable individual differences in dispositional negativity across the lifespan and contributes to the etiology of anxiety disorders in adults and youth. Hyper-vigilance and attentional biases to threat are prominent features of the anxious phenotype and there is growing evidence that they contribute to the development of psychopathology. Anatomical studies show that the amygdala is a hub, poised to govern attention to threat via projections to sensory cortex and ascending neuromodulatory systems. Imaging and lesion studies demonstrate that the amygdala plays a key role in selecting and prioritizing the processing of threat-related cues. Collectively, these observations provide a neurobiologically-grounded framework for understanding the development and maintenance of anxiety disorders in adults and youth and set the stage for developing improved intervention strategies.
Article
We advance a novel computational model that formally characterizes the ways we perceive or misperceive bodily symptoms in the context of panic attacks. The computational model is grounded within the formal framework of Active Inference, which considers top-down prediction and attention dynamics as key to perceptual inference and action selection. In a series of simulations, we use the computational model to reproduce key facets of adaptive and maladaptive symptom perception: the ways we infer our bodily state by integrating prior information and somatic afferents; the ways we decide whether or not to attend to somatic channels; the ways we use the symptom inference to make decisions about taking or not taking a medicine; and the ways all the above processes can go awry, determining symptom misperception and ensuing maladaptive behaviors, such as hypervigilance or excessive medicine use. While existing theoretical treatments of psychopathological conditions focus on prediction-based perception (predictive coding), our computational model goes beyond them in at least two ways. First, it includes action and attention selection dynamics that are disregarded in previous conceptualizations but are crucial to fully understand the phenomenology of bodily symptom perception and misperception. Second, it is a fully implemented model that generates specific (and personalized) quantitative predictions, thus going beyond previous qualitative frameworks.
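The core inference step this kind of model formalizes can be illustrated with a toy precision-weighted fusion of a prior belief and a somatic signal. All names and numbers below are illustrative assumptions, not the paper's implementation, which additionally models attention and action selection:

```python
def posterior_symptom_estimate(prior_mean, prior_precision, obs, obs_precision):
    """Posterior mean and precision for a Gaussian prior combined with a Gaussian observation."""
    post_precision = prior_precision + obs_precision
    post_mean = (prior_precision * prior_mean + obs_precision * obs) / post_precision
    return post_mean, post_precision

# Mild somatic afferent (obs = 2) under a strong "my heart is racing" prior (mean = 8):
print(posterior_symptom_estimate(prior_mean=8.0, prior_precision=4.0,
                                 obs=2.0, obs_precision=1.0))
# -> (6.8, 5.0): the percept is pulled toward the prior, i.e., the sensation is
#    inferred as more intense than the afferent input alone would warrant.
```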
Article
Interoceptive fears and biased interoception are important characteristics of somatic symptom disorders. Categorization of interoceptive sensations impacts perception of their intensity and unpleasantness. In this study we investigated whether making interoceptive categories threat-relevant further biases interoception of individual sensations compared to safe categories. Either a category containing low- or high-intensity stimuli was made threat-relevant by instructing (and occasionally experiencing) that interoceptive sensations could be followed by an unpredictable electrocutaneous stimulus. We replicated that categorization had a profound impact on perceived interoceptive sensations, with stimuli within categories being perceived as more similar than equidistant stimuli at the category border. We found some evidence for the impact of threat on perceived characteristics of stimuli (with the direction of these effects depending on whether interoceptive stimuli of low or high intensity were threat-relevant), but not for altered categorical choice behaviour. These results imply that the perception of respiratory stimuli is influenced strongly by top-down processes such as categorization, and suggest that interoceptive processing may flexibly adapt to contextual factors such as threat in healthy individuals. However, inflexible responding to repeated and/or severe threat to the internal body may compromise accurate interoception and may result in interoceptive illusions contributing to medically unexplained symptoms and syndromes.
Article
In both child and adult psychiatry, empirical evidence has now accrued to suggest that a single dimension is able to measure a person's liability to mental disorder, comorbidity among disorders, persistence of disorders over time, and severity of symptoms. This single dimension of general psychopathology has been termed "p," because it conceptually parallels a dimension already familiar to behavioral scientists and clinicians: the "g" factor of general intelligence. As the g dimension reflects low to high mental ability, the p dimension represents low to high psychopathology severity, with thought disorder at the extreme. The dimension of p unites all disorders. It influences present/absent status on hundreds of psychiatric symptoms, which modern nosological systems typically aggregate into dozens of distinct diagnoses, which in turn aggregate into three overarching domains, namely, the externalizing, internalizing, and psychotic experience domains, which finally aggregate into one dimension of psychopathology from low to high: p. Studies show that the higher a person scores on p, the worse that person fares on measures of family history of psychiatric illness, brain function, childhood developmental history, and adult life impairment. A dimension of p may help account for ubiquitous nonspecificity in psychiatry: multiple disorders share the same risk factors and biomarkers and often respond to the same therapies. Here, the authors summarize the history of the unidimensional idea, review modern research into p, demystify statistical models, articulate some implications of p for prevention and clinical practice, and outline a transdiagnostic research agenda.
Article
Through advances in both basic and clinical scientific research, Pavlovian fear conditioning and extinction have become an exemplary translational model for understanding and treating anxiety disorders. Discoveries in associative and neurobiological mechanisms underlying extinction have informed techniques for optimizing exposure therapy that enhance the formation of inhibitory associations and their consolidation and retrieval over time and context. Strategies that enhance formation include maximizing prediction-error correction by violating expectancies, deepened extinction, occasional reinforced extinction, attentional control and removal of safety signals/behaviours. Strategies that enhance consolidation include pharmacological agonists of NMDA (i.e., d-cycloserine) and mental rehearsal. Strategies that enhance retrieval include multiple contexts, retrieval cues, and pharmacological blockade of contextual encoding. Stimulus variability and positive affect are posited to influence the formation and the retrieval of inhibitory associations. Inhibitory regulation through affect labelling is considered a complement to extinction. More investigation of elements central to extinction itself, such as extinction generalization, of interactions with other learning processes, such as instrumental avoidance and reward learning, and of interactions with other clinically relevant cognitive–emotional processes, such as self-efficacy, threat appraisal and emotion regulation, will add translational value. Moreover, framing fear extinction and related processes within a developmental context will increase their clinical relevance. This article is part of a discussion meeting issue 'Of mice and mental health: facilitating dialogue between basic and clinical neuroscientists'.
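The phrase "maximizing prediction-error correction by violating expectancies" can be grounded in the generic delta-rule form of associative learning. The formulation below is given for orientation only and is an assumption on our part, not the specific model used in this article:

```latex
% Delta-rule (Rescorla-Wagner) update of associative strength V for cue i:
\Delta V_i = \alpha_i \,\beta \Big(\lambda - \sum_{j \in \text{trial}} V_j\Big)
% During extinction the outcome is omitted (\lambda = 0), so the larger the summed
% expectancy \sum_j V_j at the moment of omission, the larger the (negative)
% prediction error and, on this account, the stronger the inhibitory learning.
```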
Article
Impaired selective fear learning has been advanced as a core mechanism involved in excessive spreading of protective responses such as pain-related fear and avoidance leading to disability in chronic pain conditions. Using the litmus test for selective learning effects, the blocking procedure, we tested the hypothesis that fibromyalgia patients show less selective threat learning than healthy controls. We introduce a novel selective learning task based around a clinical diary scenario. On a trial-by-trial basis, participants rated whether they expected that certain situations (A, B, Z, X) in the diary of a fictive fibromyalgia patient would trigger pain in that patient. The procedure did not involve any experimental pain induction, since the verbal outcomes "pain" or "no pain" were used. During the elemental acquisition phase, one situation was followed by "pain" (A+, e.g., "Kim slept badly, and reports pain"), whereas another situation was followed by "no pain" (Z-, e.g., "Kim was stressed, and reports no pain"). During the compound acquisition phase, another situation (X), referred to as the blocked stimulus, was presented in compound with a previously pain-eliciting situation and also paired with "pain" (AX+, e.g., "Kim slept badly" and "Kim has vacuumed", and reports pain). Simultaneously, a novel situation was introduced and also followed by "pain" (B+). Within-group comparisons showed blocking (i.e., a significant difference between B and X) in the healthy controls, but not in the fibromyalgia patients. This study is the first to directly assess differences in selective learning between fibromyalgia patients and healthy controls using a blocking procedure.
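Why a pretrained cue "blocks" learning about an added cue falls out of the same delta rule sketched above. The toy simulation below, with assumed parameters and cue labels mirroring the A/X/B/Z design, is illustrative only and is not the analysis used in the study:

```python
def rescorla_wagner(trials, alpha=0.3, lam=1.0):
    """Toy Rescorla-Wagner simulation; `trials` is a list of (cues, outcome) pairs."""
    V = {}
    for cues, outcome in trials:
        prediction = sum(V.get(c, 0.0) for c in cues)
        error = (lam if outcome else 0.0) - prediction
        for c in cues:
            V[c] = V.get(c, 0.0) + alpha * error
    return V

# Elemental phase: A+ and Z-; compound phase: AX+ and B+
trials = [(("A",), True), (("Z",), False)] * 10 + [(("A", "X"), True), (("B",), True)] * 10
V = rescorla_wagner(trials)
print({cue: round(v, 2) for cue, v in V.items()})
# A ends near asymptote, the novel control cue B acquires substantial strength,
# and X stays low: A already predicts the outcome, leaving little error for X (blocking).
```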
Article
The role of serotonin in human brain function remains elusive due, at least in part, to our inability to rapidly measure the local concentration of this neurotransmitter. We used fast-scan cyclic voltammetry to infer serotonergic signaling from the striatum of fourteen human patients with Parkinson's disease. Here we report these novel measurements and show that they correlate with outcomes and decisions in a sequential investment game. We find that serotonergic concentrations transiently increase as a whole following negative reward prediction errors, while reversing when counterfactual losses predominate. This provides initial evidence that the serotonergic system acts as an opponent to dopamine signaling, as anticipated by theoretical models. Serotonin transients on one trial were also associated with actions on the next trial in a manner that correlated with decreased exposure to poor outcomes. Thus, the fluctuations observed for serotonin appear to correlate with the inhibition of over-reactions and promote persistence of ongoing strategies in the face of short-term environmental changes. Together these findings elucidate a role for serotonin in the striatum, suggesting it encodes a protective action strategy that mitigates risk and modulates choice selection, particularly following negative environmental events.
Article
Objective Clinical assessment and diagnostic processes heavily rely on memory-based symptom reports. The current study investigated memory for symptoms and the peak-end effect for dyspnea in patients with medically unexplained symptoms and healthy participants. Methods Female patients with medically unexplained dyspnea (MUD) (n = 22) and matched healthy controls (n = 22) participated in two dyspnea induction trials (short, long). Dyspnea ratings were collected: (1) continuously during symptom induction (concurrent with respiratory measures), (2) immediately after the experiment, and (3) after 2 weeks. Symptoms, negative affect, and anxiety were assessed at baseline and after every trial. The mediating role of state anxiety in symptom reporting was assessed. The peak-end effect was tested with forced-choice questions measuring relative preference for the trials. Results Compared to controls, dyspnea induction resulted in higher levels of symptoms, anxiety, concurrent dyspnea ratings, and minute ventilation in the patient group. In both groups, immediate retrospective ratings were higher than averaged concurrent ratings. No further increase in dyspnea ratings was observed at 2-week recall. Retrospective dyspnea ratings were mediated by both state anxiety and concurrent dyspnea ratings. Patients did not show a peak-end effect, whereas controls did. Conclusion The findings show that patients' experience of a dyspneic episode is subject to immediate memory bias, but does not change over a longer time period. The results also highlight the importance of affective state during symptom experience for both symptom perception and memory.
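The peak-end comparison tested here can be expressed in a few lines. The sketch below uses made-up rating series to contrast a peak-end summary with the simple average that concurrent ratings would give; it is an illustration of the heuristic, not the study's scoring procedure:

```python
import numpy as np

def peak_end(ratings):
    """Kahneman-style peak-end summary: mean of the maximum and the final rating."""
    ratings = np.asarray(ratings, dtype=float)
    return (ratings.max() + ratings[-1]) / 2.0

# Hypothetical continuous dyspnea ratings (0-10) for a short and a long trial.
short_trial = [2, 4, 7, 6]            # same peak, ends while still intense
long_trial = [2, 4, 7, 6, 5, 3, 2]    # same peak, tapers off before the end

for name, ratings in [("short", short_trial), ("long", long_trial)]:
    print(name, "mean:", round(float(np.mean(ratings)), 2),
          "peak-end:", round(peak_end(ratings), 2))
# A peak-end rule predicts the short trial is remembered as worse, even though
# the long trial contains at least as much total discomfort.
```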
Article
Cognitive control can be activated by stimulus-stimulus (S-S) and stimulus-response (S-R) conflicts. However, whether cognitive control is domain-general or domain-specific remains unclear. To deepen the understanding of the functional organization of cognitive control networks, we conducted activation likelihood estimation (ALE) from 111 neuroimaging studies to examine brain activation in conflict-related tasks. We observed that fronto-parietal and cingulo-opercular networks were commonly engaged by S-S and S-R conflicts, showing a domain-general pattern. In addition, S-S conflicts specifically activated distinct brain regions to a greater degree. These regions were implicated in the processing of the semantic-relevant attribute, including the inferior frontal cortex (IFC), superior parietal cortex (SPC), superior occipital cortex (SOC), and right anterior cingulate cortex (ACC). By contrast, S-R conflicts specifically activated the left thalamus (TH), middle frontal cortex (MFC), and right SPC, which were associated with detecting response conflict and orienting spatial attention. These findings suggest that conflict detection and resolution involve a combination of domain-general and domain-specific cognitive control mechanisms.
Article
Objective: Induction of negative affective states can enhance bodily symptoms in high habitual symptom reporters among healthy persons, and in patients with irritable bowel syndrome. The aim of this study was to replicate this effect in patients with fibromyalgia and chronic fatigue syndrome and to investigate the role of moderators, focusing on alexithymia, negative affectivity (NA) and absorption. Methods: Patients with fibromyalgia and/or chronic fatigue syndrome (N=81) and healthy controls (HC, N=41) viewed series of neutral, positive and negative affective pictures. After every picture series, participants filled out a somatic symptom checklist and rated emotions experienced during the picture series on valence, arousal and perceived control. Results: Patients reported more somatic symptoms after viewing negative pictures (least square mean (LSM) = 19.40, standard error (SE) = 0.50) compared to neutral (LSM = 17.59, SE = 0.42; p < 0.001) or positive (LSM = 17.04, SE = 0.41; p < 0.001) pictures, while somatic symptom ratings of HC after viewing negative picture series (LSM = 12.07, SE = 0.71) did not differ from ratings after viewing neutral (LSM = 11.07, SE = 0.59; p = 0.065) or positive (LSM = 11.10, SE = 0.58; p = 0.93) pictures. NA did not moderate the symptom-enhancing effect of negative affective pictures, whereas the alexithymia factor 'difficulty identifying feelings (DIF)' and absorption did (p = 0.016 and p = 0.006, respectively). Conclusion: Negative affective states elicit elevated somatic symptom reports in patients suffering from fibromyalgia and/or chronic fatigue syndrome. This symptom-enhancing effect is greater in patients with greater difficulty identifying feelings and higher absorption scores. The results are discussed in a predictive coding framework of symptom perception.
Article
Why do only some individuals develop pathological anxiety following adverse events? Fear acquisition, extinction and return of fear paradigms serve as experimental learning models for the development, treatment and relapse of anxiety. Individual differences in experimental performance were, however, mostly regarded as 'noise' by researchers interested in basic associative learning principles. Our work presents, for the first time, a comprehensive literature overview and methodological discussion of inter-individual differences in fear acquisition, extinction and return of fear. We tell a story from noise that steadily develops into a meaningful tune and converges to a model of mechanisms contributing to individual risk/resilience with respect to fear and anxiety-related behavior. Furthermore, in light of the present 'replicability crisis', we identify methodological pitfalls and provide suggestions for study design and analyses tailored to individual difference research in fear conditioning. Ultimately, synergistic transdisciplinary and collaborative efforts hold promise not only to improve our mechanistic understanding but also to contribute to the development of specifically tailored ('individualized') intervention and targeted prevention programs in the future.
Article
Neuroticism has long been associated with psychopathology and there is increasing evidence that this trait represents a shared vulnerability responsible for the development and maintenance of a range of common mental disorders. Given that neuroticism may be more malleable than previously thought, targeting this trait in treatment, rather than its specific manifestations (e.g., anxiety, mood, and personality disorders), may represent a more efficient and cost-effective approach to psychological treatment. The goals of the current manuscript are to (a) review the role of neuroticism in the development of common mental disorders, (b) describe the evidence of its malleability, and (c) review interventions that have been explicitly developed to target this trait in treatment. Implications for shifting the focus of psychological treatment to underlying vulnerabilities, such as neuroticism, rather than on the manifest symptoms of mental health conditions, are also discussed.
Article
Introduction: Neuroticism is a complex personality trait encompassing diverse aspects. Notably, high levels of neuroticism are related to the onset of psychiatric conditions, including anxiety and mood disorders. Personality traits are stable individual features; therefore, they can be expected to be associated with stable neurobiological features, including brain resting-state (RS) activity as measured by fMRI. Several metrics have been used to describe RS properties, yielding rather inconsistent results. This inconsistency could be due to the fact that different metrics portray different RS signal properties and that these properties may be differently affected by neuroticism. To explore the distinct effects of neuroticism, we assessed several distinct metrics portraying different RS properties within the same population. Method: Neuroticism was measured in 31 healthy subjects using the Zuckerman-Kuhlman Personality Questionnaire; RS was acquired by high-resolution fMRI. Using linear regression, we examined the modulatory effects of neuroticism on RS activity, as quantified by the amplitude of low-frequency fluctuations (ALFF, fALFF), regional homogeneity (REHO), Hurst exponent (HE), global connectivity (GC) and amygdalae functional connectivity. Results: Neuroticism modulated the different metrics across a wide network of brain regions, including emotional regulatory, default mode and visual networks. Except for some similarities in key brain regions for emotional expression and regulation, neuroticism affected different metrics in different ways. Discussion: Metrics more related to the measurement of regional intrinsic brain activity (fALFF, ALFF and REHO), or that provide a parsimonious index of integrated and segregated brain activity (HE), were more broadly modulated in regions related to emotions and their regulation. Metrics related to connectivity were modulated across a wider network of areas. Overall, these results show that neuroticism affects distinct aspects of brain resting-state activity. More generally, these findings indicate that a multiparametric approach may be required to obtain a more detailed characterization of the neural underpinnings of a given psychological trait.
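For readers unfamiliar with these resting-state metrics, ALFF and fALFF can be approximated from a single voxel time series as below. This is a simplified, assumed computation with conventional band limits (0.01 to 0.08 Hz) and synthetic data, not the study's exact pipeline:

```python
import numpy as np

def alff_falff(timeseries, tr, low=0.01, high=0.08):
    """Approximate ALFF and fALFF for one voxel time series.

    ALFF: summed spectral amplitude in the low-frequency band.
    fALFF: that sum divided by the amplitude summed over the whole spectrum.
    """
    ts = np.asarray(timeseries, dtype=float)
    ts = ts - ts.mean()
    freqs = np.fft.rfftfreq(len(ts), d=tr)
    amplitude = np.abs(np.fft.rfft(ts))
    band = (freqs >= low) & (freqs <= high)
    alff = amplitude[band].sum()
    falff = alff / amplitude.sum()
    return alff, falff

# Synthetic 200-volume series (TR = 2 s): a slow 0.03 Hz oscillation plus noise.
rng = np.random.default_rng(0)
t = np.arange(200) * 2.0
ts = np.sin(2 * np.pi * 0.03 * t) + 0.5 * rng.standard_normal(200)
print(alff_falff(ts, tr=2.0))
```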
Article
Biological phenomena arise through interactions between an organism's intrinsic dynamics and stochastic forces – random fluctuations due to external inputs, thermal energy, or other exogenous influences. Dynamic processes in the brain derive from neurophysiology and anatomical connectivity; stochastic effects arise through sensory fluctuations, brainstem discharges, and random microscopic states such as thermal noise. The dynamic evolution of systems composed of both dynamic and random effects can be studied with stochastic dynamic models (SDMs). This paper, Part I of a two-part series, offers a primer of SDMs and their application to large-scale neural systems in health and disease. The companion paper, Part II, reviews the application of SDMs to brain disorders. SDMs generate a distribution of dynamic states, which (we argue) represent ideal candidates for modeling how the brain represents states of the world. When augmented with variational methods for model inversion, SDMs represent a powerful means of inferring neuronal dynamics from functional neuroimaging data in health and disease. Together with deeper theoretical considerations, this work suggests that SDMs will play a unique and influential role in computational psychiatry, unifying empirical observations with models of perception and behavior.
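As a deliberately minimal, assumed example of the deterministic-drift-plus-noise structure described above (not any specific neural model from the paper), the snippet below integrates a one-dimensional Ornstein-Uhlenbeck process with the Euler-Maruyama scheme:

```python
import numpy as np

def euler_maruyama_ou(theta=1.0, mu=0.0, sigma=0.5, x0=2.0, dt=0.01, n_steps=1000, seed=0):
    """Simulate dx = theta*(mu - x) dt + sigma dW with the Euler-Maruyama scheme."""
    rng = np.random.default_rng(seed)
    x = np.empty(n_steps + 1)
    x[0] = x0
    for i in range(n_steps):
        drift = theta * (mu - x[i])                           # intrinsic (deterministic) dynamics
        noise = sigma * np.sqrt(dt) * rng.standard_normal()   # stochastic forcing
        x[i + 1] = x[i] + drift * dt + noise
    return x

trajectory = euler_maruyama_ou()
print(trajectory[:5], trajectory[-1])  # relaxes from x0 toward mu, with fluctuations
```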
Article
The relationship between the conscious experience of physical symptoms and indicators of objective physiological dysfunction is highly variable and depends on characteristics of the person, the context and their interaction. This relationship often breaks down entirely in the case of “medically unexplained” or functional somatic symptoms, violating the basic assumption in medicine that physical symptoms have physiological causes. In this paper, we describe the prevailing theoretical approach to this problem and review the evidence pertaining to it. We then use the framework of predictive coding to propose a new and more comprehensive model of the body-symptom relationship that integrates existing concepts within a unifying framework that addresses many of the shortcomings of current theory. We describe the conditions under which a close correspondence between the experience of symptoms and objective physiology might be expected, and when they are likely to diverge. We conclude by exploring some theoretical and clinical implications of this new account.
Article
The current meta-analysis investigated the extent to which personality traits changed as a result of intervention, with the primary focus on clinical interventions. We identified 207 studies that had tracked changes in measures of personality traits during interventions, including true experiments and pre-post change designs. Interventions were associated with marked changes in personality trait measures over an average time of 24 weeks (e.g., d = .37). Additional analyses showed that the increases replicated across experimental and nonexperimental designs, for nonclinical interventions, and persisted in longitudinal follow-ups of samples beyond the course of intervention. Emotional stability was the primary trait domain showing changes as a result of therapy, followed by extraversion. The type of therapy employed was not strongly associated with the amount of change in personality traits. Patients presenting with anxiety disorders changed the most, and patients being treated for substance use changed the least. The relevance of the results for theory and social policy is discussed.