In praise of clinical judgment: Meehl's forgotten legacy

Adelphi University, Garden City, New York, United States
Journal of Clinical Psychology (Impact Factor: 2.12). 10/2005; 61(10):1257-76. DOI: 10.1002/jclp.20181
Source: PubMed


Although Paul E. Meehl demonstrated the limits of informal aggregation of data and prognostication by presumed experts, he remained convinced that clinical experience confers expertise of some kind. The authors explore this forgotten side of Meehl's legacy by reconsidering the validity of clinical judgment in its natural context, everyday clinical work. Three domains central to clinical practice are examined: diagnosis, interpretation of meaning, and intervention. It is argued that a more sanguine picture of clinical expertise emerges when the focus shifts from prediction at high levels of inference to (a) judgments at a moderate level of inference, (b) contexts for which clinical training and experience are likely to confer expertise, and (c) conditions that optimize the expression of that expertise (e.g., use of instruments designed for expert observers). The authors conclude by examining domains in which clinical judgment could prove useful in knowledge generation (e.g., hypothesis generation, identification of falsifying instances, item development).

Available from: Joel Weinberger, Jul 16, 2014
  • Source
    • "…influences whether they will express their sensitivity to these relationships (Westen and Weinberger 2005). Despite their differences on our conditional probability task, experienced and inexperienced staff performed similarly when they were asked to assess behavior change using scales from popular behavior checklists that do not emphasize situation-behavior contingencies. "
    ABSTRACT: This research examined how people’s ability to detect behavior change in simulated child targets is affected by their clinical experience and the assessment method they use. When using summary assessment methods that are widely employed in research and clinical practice, both inexperienced and experienced clinical staff detected changes in the overall frequency of targets’ aggressive behavior, but were not uniquely influenced by changes in targets’ reactions to social events. When using contextualized assessment methods that focused on conditional reactions, experienced staff showed greater sensitivity than novices to context-specific changes in targets’ aggressive and prosocial reactions to aversive events. Experienced staff also showed greater sensitivity to context-specific changes in their overall impressions of change, but only for aggression. The findings show how clinically experienced judges become more attuned to if…then… contingencies in children’s social behavior, and how summary assessment methods may hamper the detection of change processes.
    Full-text · Article · Sep 2014 · Journal of Psychopathology and Behavioral Assessment
  • Source
    • "Surveys indicate that therapists will rely on their own clinical experience or their peers' experience to make clinical decisions rather than use research (Morrow-Bradley & Elliott, 1986; Stewart & Chambless, 2007; Stewart et al., 2012). This argument has also been made in the research literature; for example, Drew Westen and Joel Weinberger have argued that clinical experience should be used in psychotherapy because research does not tend to measure all the variables required to judge which treatment is likely to help patients, and that "Truth does not reveal itself without interpretation" (Westen & Weinberger, 2005). However, clinical experience has been shown to be subject to biases that research is not (Dawes, Faust, & Meehl, 1989; Grove et al., 2000), and basing decision-making purely on clinical experience is likely to contribute to 'therapist drift' away from protocols, which may negatively impact therapy and make it difficult to maintain treatment integrity (Brosan, Reynolds, & Moore, 2007; Stobie, Taylor, Quigley, Ewing, & Salkovskis, 2007; Waller, 2009). "
    ABSTRACT: A large body of research has identified that many therapists do not use research to inform their practice, but few studies investigate the reasons behind this. Aims: The current study seeks to understand what sources therapists use to inform their practice and why they are chosen. Method: Thirty-three interviews with psychological therapists in the UK were undertaken. These were transcribed and analysed using Interpretative Phenomenological Analysis. Results: Two superordinate themes emerged: the first focused on the nature of evidence, and the second described why certain sources were used to make clinical decisions. When discussing evidence, participants felt that research studies, specifically Randomized Controlled Trials (RCTs), used unrepresentative samples. Therapists felt that research other than RCTs, particularly qualitative research, was important. Therapist-specific factors were felt to be as important as, or more important than, the technique used to treat patients. When discussing the sources they used, therapists preferred to use their clinical experience or their patients’ experience to make clinical decisions. Theoretical or practical information was preferred to empirical research. The presentation of information was felt to be important in encouraging the implementation of research, and therapists also felt tools such as outcome measures and manuals were too rigid to be useful. Finally, patients’ choice of treatment was felt to be important in treatment decisions. Conclusions: The views of therapists were heterogeneous, but this study highlighted some of the barriers to closing the gap between science and practice. This knowledge can be used to increase the translation of science into practice.
    Full-text · Article · Dec 2013 · Behavioural and Cognitive Psychotherapy
  • Source
    • "Although much less research is available on clinical expertise than on psychological interventions, an important foundation is emerging (Goodheart, 2006; Skovholt & Jennings, 2004; Westen & Weinberger, 2004). For example, research on case formulation and diagnosis suggests that clinical inferences, diagnostic judgments, and formulations can be reliable and valid when structured in ways that maximize clinical expertise (Eells, Lombart, Kendjelic, Turner, & Lucas, 2005; Persons, 1991; Westen & Weinberger, 2005). "
    ABSTRACT: Awareness of the need to monitor fidelity in delivery of evidence-based programs is increasing. We examined adherence reported by service providers who had delivered multi-session group or individual family-based Triple P-Positive Parenting Program interventions. Service providers completed session-specific checklists from the Triple P manual to indicate whether or not they had delivered the prescribed activities in their most recent session. We focus on service providers who reported less than 100 % adherence (n = 93) to explore patterns of adherence across sessions, delivery formats, and service providers’ experience. We also coded session activities into processes, therapeutic interventions, homework, and exercises to explore whether service providers’ adherence varied by type of session component. Adherence to therapeutic interventions, homework, and exercises was significantly lower in individual format compared to group format, perhaps reflecting the tailoring of interventions to the needs of the family. Adherence to processes and exercises was significantly higher among service providers with more years of experience. Results have implications for research, training, and supervision and will inform the delivery of evidence-based services for children and families in community settings.
    Full-text · Article · Jan 2013 · Journal of Child and Family Studies