Article

... Furthermore, besides pure calculation in Bayesian reasoning situations, it is also important that medical students learn how to extract and assess evidence from scientific articles (Keller et al. 2017). In addition, using frequency trees to explain test results to patients might be a promising tool for doctor-patient communication and should be tested. ...
Article
Full-text available
Abstract: When physicians are asked to determine the positive predictive value of a medical test from the a priori probability of a disease, the test's sensitivity, and its false positive rate (Bayesian reasoning), misjudgments with serious consequences are common. In daily clinical practice, however, it is not only important that doctors have a tool with which they can judge correctly; the speed of these judgments is also a crucial factor. In this study, we analyzed accuracy and efficiency in medical Bayesian inferences. In an empirical study we varied information format (probabilities vs. natural frequencies) and visualization (text only vs. tree only) across four contexts. 111 medical students participated by working on four Bayesian tasks involving common medical problems. The correctness of their answers was coded and the time spent on each task was recorded. The median time for a correct Bayesian inference was fastest in the version with a frequency tree (2:55 min) compared to the version with a probability tree (5:47 min) and to the text-only versions based on natural frequencies (4:13 min) or probabilities (9:59 min). The diagnostic efficiency score (calculated as the median time divided by the percentage of correct inferences) was also best in the version with a frequency tree (4:53 min). Frequency trees allow more accurate and faster judgments. Improving correctness and efficiency in Bayesian tasks might help to decrease overdiagnosis in daily clinical practice, which causes costs on the one hand and might endanger patient safety on the other.
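The Bayesian inference described in this abstract can be sketched in a few lines of code. The sketch below is only illustrative: the prevalence, sensitivity, and false positive rate are hypothetical example values, not figures from the study, and the function names are ours. It shows the same positive predictive value computed once from probabilities (Bayes' theorem) and once from counts, as in a natural frequency tree.

```python
# Minimal sketch of the Bayesian task, with illustrative numbers
# (not taken from the study): prevalence 1%, sensitivity 80%,
# false positive rate 9.6%.

def ppv_from_probabilities(prevalence, sensitivity, false_positive_rate):
    """Positive predictive value via Bayes' theorem on probabilities."""
    true_pos = prevalence * sensitivity
    false_pos = (1 - prevalence) * false_positive_rate
    return true_pos / (true_pos + false_pos)

def ppv_from_natural_frequencies(population, prevalence, sensitivity, false_positive_rate):
    """The same computation expressed as counts in a frequency tree."""
    diseased = round(population * prevalence)          # e.g. 100 of 10,000
    healthy = population - diseased                    # e.g. 9,900 of 10,000
    true_pos = round(diseased * sensitivity)           # e.g. 80 test positive
    false_pos = round(healthy * false_positive_rate)   # e.g. 950 test positive
    return true_pos / (true_pos + false_pos)

print(ppv_from_probabilities(0.01, 0.80, 0.096))                # ~0.078
print(ppv_from_natural_frequencies(10_000, 0.01, 0.80, 0.096))  # ~0.078
```

Both routes give a positive predictive value of roughly 8%; the frequency version simply makes the underlying counts explicit, which is the point of the frequency tree format tested in the study.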
... As simple as fact boxes and icon arrays appear at first glance, it is often not trivial, even when good data are available, to identify the necessary medical evidence in the literature and to extract the relevant numbers on the benefits and harms of an intervention. The authors of this article showed, however, that a 90-minute training in the basic principles of evidence-based medicine is sufficient for physicians to fill fact boxes with data from randomized controlled trials and to critically reflect on and use them in communication with patients [9]. A largely unsolved problem is how the uncertainties to which all data are subject (e.g. ...
Article
The prerequisite for an informed treatment decision is an accurate understanding of the respective risks and side effects. Many patients, however, have difficulties dealing with statistics. Numerical risk comprehension can be improved with a complexity-reduced presentation of risk information, for example by means of fact boxes. The basic principle is a comparative presentation of frequencies (1 in 1,000) instead of percentages (0.1%) for different risk groups or for intervention vs. control groups. Beyond individual numeracy, risk perception is also influenced by psychosocial variables. In particular, disease-specific anxieties, motives, anecdotal information, and the patient's prior experiences can lead to a perception of risk that deviates from the "objective" risks. Knowledge of these influences and their effects can support physicians in communicating risks and in shared decision-making in daily clinical practice.
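The fact-box principle described here, comparative natural frequencies ("x of 1,000") instead of percentages, can be illustrated with a short sketch. The outcome labels and rates below are hypothetical and are not taken from the article; they only show the format conversion.

```python
# Illustrative sketch of the fact-box principle: report comparative
# natural frequencies ("n of 1,000") instead of percentages.
# Outcome labels and rates are hypothetical examples.

def as_frequency(rate, reference=1000):
    """Convert a probability (e.g. 0.001, i.e. 0.1%) into 'n of 1,000' wording."""
    return f"{round(rate * reference)} of {reference:,}"

fact_box = {
    "death from disease": {"intervention": 0.003, "control": 0.004},
    "side effect":        {"intervention": 0.010, "control": 0.001},
}

for outcome, groups in fact_box.items():
    row = ", ".join(f"{group}: {as_frequency(rate)}" for group, rate in groups.items())
    print(f"{outcome} -> {row}")
```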
Many doctors, patients, journalists, and politicians alike do not understand what health statistics mean or draw wrong conclusions without noticing. Collective statistical illiteracy refers to the widespread inability to understand the meaning of numbers. For instance, many citizens are unaware that higher survival rates with cancer screening do not imply longer life, or that the statement that mammography screening reduces the risk of dying from breast cancer by 25% in fact means that 1 less woman out of 1,000 will die of the disease. We provide evidence that statistical illiteracy (a) is common to patients, journalists, and physicians; (b) is created by nontransparent framing of information that is sometimes an unintentional result of lack of understanding but can also be a result of intentional efforts to manipulate or persuade people; and (c) can have serious consequences for health. The causes of statistical illiteracy should not be attributed to cognitive biases alone, but to the emotional nature of the doctor–patient relationship and conflicts of interest in the healthcare system. The classic doctor–patient relation is based on (the physician's) paternalism and (the patient's) trust in authority, which make statistical literacy seem unnecessary; so does the traditional combination of determinism (physicians who seek causes, not chances) and the illusion of certainty (patients who seek certainty when there is none). We show that information pamphlets, Web sites, leaflets distributed to doctors by the pharmaceutical industry, and even medical journals often report evidence in nontransparent forms that suggest big benefits of featured interventions and small harms. Without understanding the numbers involved, the public is susceptible to political and commercial manipulation of their anxieties and hopes, which undermines the goals of informed consent and shared decision making. What can be done? We discuss the importance of teaching statistical thinking and transparent representations in primary and secondary education as well as in medical school. Yet this requires familiarizing children early on with the concept of probability and teaching statistical literacy as the art of solving real-world problems rather than applying formulas to toy problems about coins and dice. A major precondition for statistical literacy is transparent risk communication. We recommend using frequency statements instead of single-event probabilities, absolute risks instead of relative risks, mortality rates instead of survival rates, and natural frequencies instead of conditional probabilities. Psychological research on transparent visual and numerical forms of risk communication, as well as training of physicians in their use, is called for. Statistical literacy is a necessary precondition for an educated citizenship in a technological democracy. Understanding risks and asking critical questions can also shape the emotional climate in a society so that hopes and anxieties are no longer as easily manipulated from outside and citizens can develop a better-informed and more relaxed attitude toward their health.
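The mammography example in this abstract (a "25% risk reduction" that corresponds to 1 fewer death per 1,000 women) can be made explicit with a small calculation. The underlying rates below (4 vs. 3 deaths per 1,000 women) are chosen so that they reproduce the abstract's figures; they are illustrative, not a quotation of the original data.

```python
# Relative vs. absolute framing of the same screening effect.
# Rates chosen to reproduce the abstract's figures: 25% relative
# reduction, 1 fewer death per 1,000 women.

control_deaths_per_1000 = 4
screening_deaths_per_1000 = 3

relative_risk_reduction = (control_deaths_per_1000 - screening_deaths_per_1000) / control_deaths_per_1000
absolute_risk_reduction_per_1000 = control_deaths_per_1000 - screening_deaths_per_1000

print(f"relative risk reduction: {relative_risk_reduction:.0%}")          # 25%
print(f"absolute risk reduction: {absolute_risk_reduction_per_1000} in 1,000")  # 1 in 1,000
```

The same difference of one death per 1,000 women sounds far larger when framed as "25% fewer deaths", which is exactly the nontransparent framing the abstract warns against.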
John Wiley &amp; Sons Ltd and The Association for the Study of Medical Education. Medical Education 2017, Really Good Stuff.